Comment by sebazzz, 7 months ago:

I suppose with an LLM you could never know whether it is hallucinating a supposed system prompt.