Comment by HarHarVeryFunny
11 hours ago
I hadn't heard that, but he was heavily involved in a cancelled project called Galactica that was an LLM for scientific knowledge.
Yeah, that stuff generated embarrassingly wrong scientific 'facts' and citations.
That kind of hallucination is somewhat acceptable in something marketed as a chatbot, but less so in an assistant meant to help you with scientific knowledge and research.
I thought it was weird at the time how much hate Galactica got for its hallucinations compared to the hallucinations of competing models. I get your point, and it partially explains things, but it's not a fully satisfying explanation.
I guess another aspect is that being too early is not too different from being wrong.