Comment by alexhans
9 days ago
I thought of adding a similar LLM to an AI evals teaching site and telling users to interact through it, but was concerned about inducing users into a prompt-injection-friendly pattern.