Comment by cube00
17 hours ago
All LLMs have this risk, but somehow nobody seems to care, or they think they can order the LLM to stop with a better prompt.
If it were as simple as telling the LLM not to hallucinate, every system prompt would just say "don't hallucinate" and we wouldn't have hallucinations.