Comment by burtonator
1 year ago
I'm honestly confused as to why it is doing this and why it thinks I'm right when I tell it that it is incorrect.
I've tried asking it for factual information, and when challenged it concedes that it was incorrect, but it will definitely hallucinate on questions like the above.
You'd think the reasoning would nail that; most of the chain-of-thought systems I've worked on would have caught this by asking the model whether the resulting answer was correct.
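The self-check step described above can be sketched roughly like this. This is a minimal illustration, not any vendor's API: `ask_model` is a hypothetical stand-in for a real LLM call, stubbed with canned answers so the sketch runs on its own.

```python
def ask_model(prompt: str) -> str:
    # Hypothetical stand-in for a real LLM API call; canned answers
    # make the sketch runnable without any external service.
    canned = {
        "Q: capital of France?": "Paris",
        "Is this answer correct? Q: capital of France? A: Paris": "yes",
    }
    return canned.get(prompt, "unsure")


def answer_with_verification(question: str, max_retries: int = 2) -> str:
    """Answer, then ask the model to verify its own answer before returning.

    This mirrors the chain-of-thought self-check described above: if the
    model says its own answer is wrong, retry instead of returning it.
    """
    answer = ask_model(question)
    for _ in range(max_retries):
        check = ask_model(f"Is this answer correct? {question} A: {answer}")
        if check.lower().startswith("yes"):
            return answer
        answer = ask_model(question)  # self-check failed: ask again
    return answer


print(answer_with_verification("Q: capital of France?"))
```

Of course, the failure mode the comment describes is exactly that the verifier is the same model that hallucinated in the first place, so this loop only helps when the model can recognize its own errors.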