Comment by mrfumier

1 month ago

The comparison with calculators overlooks several key developments.

LLMs are becoming steadily more efficient. Through techniques such as distillation, quantization, and optimized architectures, capable models can already run offline on personal computers and even smartphones. This trend reduces reliance on constant access to centralized providers and enables fully local, self-contained use.
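To make that concrete: a 4-bit quantized 7B model in GGUF format typically fits in a few gigabytes of RAM and runs entirely on a laptop CPU. Here is a minimal sketch using the llama-cpp-python bindings; the model filename is hypothetical, and any quantized GGUF checkpoint downloaded in advance would do:

```python
# Minimal sketch: offline inference with a quantized model via llama-cpp-python.
# Assumes `pip install llama-cpp-python` and a GGUF file downloaded beforehand.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/mistral-7b-instruct-q4_k_m.gguf",  # hypothetical local file
    n_ctx=2048,   # modest context window to fit consumer RAM
    n_threads=4,  # CPU-only; no GPU or network access required
)

output = llm(
    "Q: Why can quantized models run on ordinary laptops? A:",
    max_tokens=64,
    stop=["Q:"],
)
print(output["choices"][0]["text"])
```

Once the weights are on disk, nothing in this loop touches the network; the centralized provider drops out of the picture entirely.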

Rather than avoiding LLMs, the rational response is to build local, portable, and open alternatives in parallel. The natural trajectory points toward smaller, more efficient, locally executable models, mirroring the path calculators themselves once followed from room-sized machines to cheap pocket devices.