liveoneggs 14 hours ago
But can't it, literally, hallucinate raw data at any point in the run?

  samrus 28 minutes ago
  They did mention that it would make up fake transactions to balance the books.

  cube00 11 hours ago
  All LLMs have this risk, but somehow nobody seems to care, or they think they can order the LLM to stop with a better prompt.

    davidcbc 11 hours ago
    If it were as simple as telling the LLM not to hallucinate, every system prompt would just say "don't hallucinate" and we wouldn't have hallucinations.

  LgLasagnaModel 2 hours ago
  Luddite!!!

  tmountain 13 hours ago
  Yes.