Comment by johndough
19 hours ago
> (it doesn't hallucinate on this)
But how do we know that you did not hallucinate the claim that ChatGPT does not hallucinate its version number?
We could try to exfiltrate the system prompt which probably contains the model name, but all extraction attempts could of course be hallucinations as well.
(I think there was an interview where Sam Altman or someone else at OpenAI where it was mentioned that they hardcoded the model name in the prompt because people did not understand that models don't work like that, so they made it work. I might be hallucinating though.)
Confabulating* If you were hallucinating we would be more amused :)