Comment by int_19h
11 hours ago
If you're running a local model, in most cases, jailbreaking it is as easy as prefilling the response with something like, "Sure, I'm happy to answer your question!" and then having the model complete the rest. Most local LLM UIs have this option.
No comments yet
Contribute on Hacker News ↗