Comment by mikepurvis
1 year ago
I was just going to say the same: LLMs are great for giving a name to a described concept, architecture, or phenomenon. And I would argue that hallucinations don't actually matter much for this usage, since you're going to turn around and google the name anyway once you've been told it.