Comment by spacecadet
2 hours ago
Yaaawn. Our team tried this last year; we had a fine-tuned model singing prompt injection attacks. Prompt injection research is dead, people. Refusal is NOT the problem... Secure your systems, don't just focus on models. Hallucinations are a feature, not a bug, etc. etc. etc. Can you hear me in the back yet?