Comment by neverokay

1 month ago

It does this even if you give it instructions to make sure the code is truly in the codebase? You never told it that it can't lie.

Telling an LLM 'do not hallucinate' doesn't make it stop hallucinating. Anyone who has used an LLM even moderately seriously can tell you that. They're very useful tools, but right now they're mostly good for writing boilerplate that you'll be reviewing anyhow.