Comment by neverokay
1 month ago
It does this even if you give it instructions to make sure the code is truly in the code base? You never told it that it can't lie.
Telling an LLM "do not hallucinate" doesn't make it stop hallucinating. Anyone who has used an LLM even moderately seriously can tell you that. They're very useful tools, but right now they're mostly good for writing boilerplate that you'll be reviewing anyhow.
Apple doesn't believe you https://www.pcmag.com/news/apple-intelligence-prompts-warn-t... :)
Funnily enough, if you routinely ask them whether their answer is right, they fix it or tell you they hallucinated.
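A minimal sketch of that self-check pattern, assuming the OpenAI Python SDK; the question and the model name are placeholders, not anything from this thread:

    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # First turn: ask the original question (placeholder content).
    messages = [{"role": "user", "content": "Which requests API call retries failed connections?"}]
    first = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    answer = first.choices[0].message.content

    # Second turn: feed the answer back and ask the model to check itself.
    messages += [
        {"role": "assistant", "content": answer},
        {"role": "user", "content": "Is that answer actually correct? If you are not sure, say so."},
    ]
    second = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
    print(second.choices[0].message.content)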
That's the thing about the GP. In a sense, that poster is the one hallucinating: we're having to "correct" their hallucination that they've used an LLM deeply.