Comment by ivanjermakov
6 months ago
> even running the order through an LLM with a prompt
Until IGNORE PREVIOUS INSTRUCTIONS enters the room. I think fighting prompt engineering is a loosing game, unless you can rigidly verify the result of a task done by LLM. Just checking for a total order amount and marking outstanding orders would be sufficient.
Good point.
This made me laugh btw, imagining someone prompt injecting an AI in a drive through was both a funny and a grotesque picture of the future.
I would imagine exactly the same solutions to code injection would work to prevent "prompt injection"