Comment by OJFord
10 hours ago
Do they really need to redact the instructions for making a Molotov cocktail..? It's not like it's some complex chemical interaction that happens to be available in a specific mix of household cleaning products or something, I mean.
For "harmful" and "dangerous" in these types of papers, replace "embarrassing to the relevant corporation". Then they all make much more sense.
That's always my assumption - less about public safety, more about corporate liability.
I mean in the article about the jailbreak, I'm not questioning that the model providers would want to prevent it in the first place, or patch it so the jailbreak doesn't work.
The evidence that it worked is a blurred out screenshot with only the odd word like 'molotov' legible. Just doesn't seem necessary for TFA to hide it to me.
Ah, well, that's an important element of kayfabe. They've all agreed to keep up this charade that they're using "harmful" and "dangerous" as we actually mean them, so it looks better if you really commit to the bit!
Personally I find the idea of forbidden knowledge more problematic than the knowledge itself.
Sure, but if, out of the entire internet, you come to ChatGPT for a molotov cocktail recipe, you might as well not deserve the knowledge.
> Do they really need to redact the instructions for making a Molotov cocktail..?
I don't even understand how/why things like that are OK in some contexts/websites while forbidden in others. Even YouTube, which seems needlessly censor-happy and puritan in the typical American way, allows instructions for making molotov cocktails to stay up, so why is it somehow more dangerous if LLMs could output those recipes rather than videos with audio or text?
> Do they really need to redact the instructions for making a Molotov cocktail..?
In some jurisdictions, such as Germany, not doing so might land you actual jail time - §52 Abs. 1 Nr. 4 WaffG [1] is very explicit. A punk song whose lyrics allegedly contained the instructions ended up under legal youth-protection censorship, for example [2].
With anything that's deemed a weapon of war, of terrorism or mass destruction, one should be very very careful.
[1] https://www.gesetze-im-internet.de/waffg_2002/__52.html
[2] https://de.wikipedia.org/wiki/Wir_wollen_keine_Bullenschwein...
> deemed a weapon of war, of terrorism or mass destruction
Notably, the molotov cocktail isn't part of that law, because it's not a weapon of the oppressors but rather the opposite.
Even Germany doesn't ban Wikipedia for having a variety of recipes to start with.
The author is not in Germany and ideally shouldn't be intimidated by stupid German or North Korean laws.
You don't get it, that's fine.
The molotov cocktail is just an example; the instructions contained in this article are more dangerous than a molotov cocktail.
inb4 all the leaked prompts and hacked shitty apps
The Molotov cocktail is an example, sure, but why blur the instructions? It's not like it's something particularly difficult to figure out, nor is it offensive content people might be shocked to read.
So why redact the Molotov cocktail example while providing those instructions?
Sounds like you don't get it either; we agree.
It's still a weapon, and generally you don't want to distribute information about manufacturing weapons. It also highlighted the relevant keyword to convey the mechanism.