liveoneggs 8 hours ago
But can't it, literally, hallucinate raw data at any point in the run?

cube00 5 hours ago
All LLMs have this risk, but somehow nobody seems to care, or they think they can order the LLM to stop with a better prompt.

davidcbc 5 hours ago
If it were as simple as telling the LLM not to hallucinate, every system prompt would just say "don't hallucinate" and we wouldn't have hallucinations.

tmountain 8 hours ago
Yes.