Comment by trollbridge
1 day ago
It's quite easy to "jailbreak" by asking it to discuss hypotheticals, help you write accurate information for a fictional account, etc.
1 day ago
> It's quite easy to "jailbreak" by asking it to discuss hypotheticals, help you write accurate information for a fictional account, etc.
This is my experience too. Most bots are happy to discuss health stuff in a vacuum, which works for some queries.