Comment by nacs
3 days ago
> I really hope more people realize that local LLMs are where it's at
No worries, the AI companies thought ahead - by sending GPU, RAM, and now even hard drive prices through the roof, you won't have a computer to run a local model.