Totally agree with you. Fortunately, it's not necessary to deploy a local AI farm just to summarize articles.
I shared this one in another comment: https://cocktailpeanut.github.io/dalai/
With LLaMA and Alpaca you can run a "good enough" ChatGPT-like model locally, without feeding the OpenAI beast.