Comment by a_better_world, 14 hours ago:

wot, like a prompt injection attack? Impossible now that models don't hallucinate.