
Comment by antonvs

7 days ago

Hallucinations still occur regularly in all models. It’s certainly not a solved problem. If you’re not seeing them, either the kinds of queries you’re running don’t tend to elicit hallucinations, or you’re incorrectly accepting the hallucinated answers as real.

The example in the OP is a common one: ask a model how to do something with a tool, and if there’s no easy way to perform that operation, it will commonly make up a plausible-sounding answer.