Comment by liveoneggs

14 hours ago

But can't it literally hallucinate raw data at any point in the run?

All LLMs have this risk, but somehow nobody seems to care, or they think they can order the LLM to stop with a better prompt.

  • If it were as simple as telling the LLM not to hallucinate, every system prompt would just say "don't hallucinate" and we wouldn't have hallucinations.