Comment by gf000
10 hours ago
Why not just run a local LLM for practically free? You can even trivially parallelize it across multiple instances.
I suspect many NLP problems can be solved easily even by smaller LLM models.
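The "trivially parallelize" part might look something like the sketch below. It assumes a local server such as Ollama or llama.cpp is running; `query_local_llm` here is a hypothetical stub standing in for the actual HTTP call (e.g. `POST http://localhost:11434/api/generate` with Ollama), so the fan-out logic can be shown self-contained.

```python
from concurrent.futures import ThreadPoolExecutor

def query_local_llm(prompt: str) -> str:
    # Hypothetical stand-in for an HTTP request to a locally hosted model,
    # e.g. an Ollama or llama.cpp server on localhost. Replace the body
    # with a real client call in practice.
    return f"answer to: {prompt}"

def classify_batch(prompts, workers=4):
    # Fan a batch of NLP prompts out over several concurrent requests;
    # with multiple server instances, each worker can hit its own one.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(query_local_llm, prompts))

results = classify_batch(["Is this spam?", "Tag the sentiment."])
```

Since each prompt is independent, throughput scales roughly with the number of instances you can keep busy, which is what makes the per-query cost of a local model close to free.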