Comment by abhisek
10 hours ago
Ok! So all the novel jailbreaks and "how I hacked your AI" posts just make the LLM say supposedly harmful stuff that is a Google search away anyway. I thought we were past the chatbot phase of LLMs and doing something more meaningful.