Comment by zozbot234
8 hours ago
The consumer models are quite good already; the main bottleneck for local inference is hardware. Even then, you can run tiny models on almost anything; things only get harder as you scale up to more knowledgeable models and a larger context window.