
Comment by fooker

8 hours ago

Yeah that stuff generated embarrassingly wrong scientific 'facts' and citations.

That kind of hallucination is somewhat acceptable in something marketed as a chatbot, less so in an assistant meant to help with scientific knowledge and research.

I thought it was weird at the time how much hate Galactica got for its hallucinations compared to those of competing models. I get your point, and it partially explains things, but it's not a fully satisfying explanation.