Comment by Nition
6 months ago
The good option would be for the LLM to say it doesn't know. It's the making up answers that's the problem.