Comment by lnx01
2 days ago
LLMs are so good at telling me about things I know little to nothing about, but when I ask about things I have expert knowledge on they consistently fail, hallucinate, and confidently lie...
Feels like https://en.wikipedia.org/wiki/Gell-Mann_amnesia_effect
It was a very clear sarcastic reference to it.
But it's still not completely right. LLMs are actually great at telling you about things you know little about. You just have to take names, ideas, and references from them, not facts.
(And that makes agentic coding almost useless, by the way.)
I’ve found that they vary a huge amount based on the subject matter. In my case, I have noticed the opposite of what you observed. They know a lot about the web space (which I’ve been in for around 25 years), but are pretty bad (though not useless) at esoteric languages such as Hare.
Obviously, because the training material for such esoteric languages is scarce. (That's why they're esoteric!) So, almost by definition, LLMs will never be good at esoteric languages.
I think you end up asking it basic questions about stuff you know little about, but much more complex/difficult questions about stuff you're already an expert in.
You just don't know enough to identify the bullshit when you aren't an expert in that domain.
That’s the joke