Comment by alexhans
8 days ago
I thought of adding a similar LLM to an AI-evals teaching site and telling users to interact through it, but was concerned about inducing users into a prompt-injection-friendly pattern.