Comment by liveoneggs 7 months ago: But can't it, literally, hallucinate raw data at any point in the run?
cube00 7 months ago: All LLMs have this risk, but somehow nobody seems to care, or they think they can order the LLM to stop with a better prompt.
sumeno 7 months ago: If it were as simple as telling the LLM not to hallucinate, every system prompt would just say "don't hallucinate" and we wouldn't have hallucinations.
samrus 7 months ago: They did mention that it would make up fake transactions to balance the books.
tmountain 7 months ago: Yes.
LgLasagnaModel 7 months ago: Luddite!!!