Comment by selfhoster11

1 day ago

Yes, and a table saw can take your hand. As can a whole variety of power tools. That does not render them illegal to sell to adults.

It does render them illegal to sell without studying their safety.

An interesting comparison.

Table saws sold all over the world are inspected and certified by trusted third parties to ensure they operate safely. They are illegal to sell without the approval seal.

Moreover, table saws sold in the United States & EU (at least) have at least 3 safety features (riving knife, blade guard, anti-kickback device) designed to prevent personal injury while operating the machine. They are illegal to sell without these features.

Then of course there are additional devices like SawStop, but that is not mandatory yet as far as I'm aware. It should be in a few years, though.

LLMs have none of those certification labels or safety features, so I'm not sure what your point was exactly?

  • They are somewhat self-regulated, as they can cause permanent damage to the company that releases them, and they are meant for general consumers without any training, unlike table saws, which are meant for trained people.

    An example is the first Microsoft bot, which started to go extreme right-wing once people realized how to push it in that direction. Grok had a similar issue recently.

    Google had racial issues with its image generation (and earlier with image detection). Again something that people don't forget.

    Also, an OpenAI 4o release was encouraging stupid things when people asked stupid questions, and they recently had to roll it back.

    Of course I'm not saying that that's the real reason (somehow they never say performance is the problem when they don't release something), but safety matters with consumer products.

    • > They are somewhat self-regulated, as they can cause permanent damage to the company that releases them

      And then you proceed to give a number of examples of that not happening. Most people already forgot those.

  • An LLM is not gonna chop off your limb. You can’t use it to attack someone.

    • An LLM is gonna convince you to treat your wound with quack medicine instead of seeing a doctor, which will eventually result in the limb being amputated to save you from gangrene.

      You absolutely can use an LLM to attack someone. Your sentence is very weird, as it reads as a denial of things that have been happening for months and are ramping up. Examples abound: generating scam letters, finding security flaws in a codebase, extracting personal information from publicly-available-yet-not-previously-known locations, generating attack software customized for particular targets, generating untraceable hit offers and then posting them on anonymized Internet services on your behalf, etc. etc.