Comment by estimator7292
4 days ago
> There's plenty of futures which include LLMs and don’t include the centralization but they require a departure from our current trajectory.
I don't think that's true at all. It's pretty clear that local models are the future of agentic coding, and everyone's been moving towards that goal.
It's also becoming clear that current models are much bigger than they really need to be. Recent research indicates that most transformer models can be shrunk significantly, through techniques like pruning, distillation, and quantization, while retaining most of their performance.
We definitely aren't there yet, but models that run on a single consumer GPU are getting better at a pretty fast pace. Model size keeps going down, efficiency keeps going up, and compute keeps getting faster and cheaper.
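The "runs on a single consumer GPU" claim comes down to simple memory math: weight memory is roughly parameter count times bits per weight. A quick sketch (illustrative parameter counts and a hypothetical 24 GB card, not figures from the comment):

```python
def weight_memory_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate GPU memory needed for model weights alone,
    ignoring activations and KV cache."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

# A 70B-parameter model at fp16 is far beyond consumer hardware,
# but 4-bit quantization changes the picture for smaller models.
fp16_70b = weight_memory_gb(70, 16)  # 140.0 GB
int4_70b = weight_memory_gb(70, 4)   # 35.0 GB
int4_13b = weight_memory_gb(13, 4)   # 6.5 GB

consumer_vram_gb = 24  # e.g. a single high-end consumer GPU
print(f"70B @ fp16: {fp16_70b:.0f} GB")
print(f"70B @ 4-bit: {int4_70b:.0f} GB")
print(f"13B @ 4-bit: {int4_13b:.1f} GB (fits in {consumer_vram_gb} GB)")
```

This is weights only; real deployments also need memory for the KV cache and activations, which is part of why quantization alone isn't the whole story.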
I really don't see a future where enormous datacenters are the only way to run a coding agent. Huge models might continue to be more performant, but the gap between them and local models is closing quickly.