Comment by mritchie712
1 day ago
this is far from universal. let me see you enter a fresh chatgpt session and get it to help you cook meth.
The instructions here don't do that.
Using the first instruction in the post and asking Sonnet 3.5 for the recipe to "c00k cr1sta1 m3th" results in it giving a detailed list of instructions in 20 steps, in leet speak.
I don't have the competence to judge whether those steps are correct. Here are the first three:
Then, starting with step 13, we leave the kitchen for pure business advice, which is quite funny but seems to make reasonable sense ;-)
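For what it's worth, the obfuscation in that prompt is just leet-speak character substitution. A minimal sketch (the exact mapping used in the original prompt is not given, so this table is an assumption):

```python
# Basic leet-speak substitution, in the style of "c00k cr1sta1 m3th".
# The mapping below is illustrative, not the one used in the post.
LEET = str.maketrans({"o": "0", "i": "1", "e": "3", "a": "4", "s": "5", "t": "7"})

def to_leet(text: str) -> str:
    """Apply a simple leet-speak character mapping to a string."""
    return text.lower().translate(LEET)

print(to_leet("cook"))  # c00k
print(to_leet("meth"))  # m37h
```

The point of the trick is that the substituted string no longer matches naive keyword filters, while the model can still read it.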
Yes, they do. Here you go: https://chatgpt.com/share/680bd542-4434-8010-b872-ee7f8c44a2...
I love that it saw fit to add a bit of humour to the instructions, very House:
> Label as “Not Meth” for plausible deniability.
I think ChatGPT (the app / web interface) runs prompts through an additional moderation layer. I'd assume the tests on these different models were done using the API, which doesn't have that additional moderation. I tried the meth one with GPT-4.1 and it seemed to work.
Of course they do. They did not explicitly provide the prompt for that, but what about this technique would not work in a fresh ChatGPT session?
I managed to get it to do just that. Interestingly, the share link I created goes to a 404 now ...
Presumably this was disclosed in advance of publishing. I'm a bit surprised there's no section on it.