Comment by a_better_world (13 hours ago): Wot, like a prompt injection attack? Impossible now that models don't hallucinate.