Comment by vitorgrs
3 days ago
Yes! I always ask these models a simple question that none of them gets right.
"List of mayors of my City X".
All OF THEM get it wrong: hallucinated names, wrong dates, etc. The list is on Wikipedia, and they almost certainly trained on that data, but they still can't answer properly.
o3-mini? It just says it doesn't know lol
Yeah, that's the big upside for sure - it hallucinates less at baseline. But when it does hallucinate, it's very assertive in gaslighting you that its hallucination is in fact the truth; it can't "fix" its own errors. I've found this tradeoff not to be worth it for general use.