Comment by thaw13579
21 hours ago
Performance and cost are trade-offs, though. You could just as well say that LLMs are dead in the water on every axis except performance.
It does seem likely that LLM inference will soon be cheap enough to displace traditional NLP entirely, but we're not quite there yet.