Comment by zahlman
16 days ago
If you could influence the LLM's actions so easily, what would stop it from equally being influenced by prompt injection from the data being processed?
What you need is more fine-grained control over the harness.