Comment by anon373839
1 year ago
> Silent prompt injections
That’s the crux of what’s so off-putting about this whole thing. If Google or OpenAI told you that your query would be prepended with XYZ instructions, you could calibrate your expectations accordingly. But they don’t want you to know they’re doing that.
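(The "silent prepend" being described can be sketched in a few lines. Everything below is illustrative: the variable names and the hidden instruction text are hypothetical, not any vendor's actual prompt.)

```python
# Hypothetical illustration of a silent prompt injection: the provider
# wraps the user's query in instructions the user never sees.

HIDDEN_INSTRUCTIONS = (
    # Assumption: placeholder text standing in for the real "XYZ instructions".
    "Follow these content and tone guidelines when answering."
)

def build_prompt(user_query: str) -> str:
    """Return the prompt the model actually receives, not what the user typed."""
    return f"{HIDDEN_INSTRUCTIONS}\n\nUser: {user_query}"

prompt = build_prompt("What is the capital of France?")
# The user typed only the question, but the model sees the hidden preamble first.
print(prompt.startswith(HIDDEN_INSTRUCTIONS))
```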