Comment by TerrifiedMouse
2 years ago
I get the feeling that LLMs will only tell you they don’t know if “I don’t know” is one of the responses in their training data. If they actually don’t know, i.e. there is no trained response, that’s when they start hallucinating.