Comment by KalMann
2 months ago
> > Models don't have access to "reality"
>
> This is an explanation of why models "hallucinate", not a criticism of the provided definition of hallucination.
That's a poor definition, then. It calls a model's output a "hallucination" whenever it doesn't match a reference point the model can't possibly have accurate information about. How is that a "hallucination" in any meaningful sense?