Comment by idiotsecant

1 month ago

Telling an LLM 'do not hallucinate' doesn't make it stop hallucinating. Anyone who has used an LLM even moderately seriously can tell you that. They're very useful tools, but right now they're mostly good for writing boilerplate that you'll be reviewing anyhow.

Funnily enough, if you routinely ask them whether their answer is right, they'll often fix it or admit they hallucinated.