
Comment by khafra

19 hours ago

I guess today's kids don't know this, but "Unsafe at Any Speed" was the title of a 1965 book that spurred the creation of the Department of Transportation and changed the automotive industry.

The point is that, if you're designing and selling a product which a large minority of people are going to use in a way that harms themselves and others, pointing at the users and calling them irresponsible doesn't actually help anybody. The people designing and selling the products actually need to make them safer. And if they're not going to do that voluntarily (they're not), we need the government to create insurance requirements, safety bonds, and whatever other incentive gradients are required to make the producers build safe products.

I caught the reference. To the extent it applies at all, I obviously think it reinforces my point. But badly engineered cars, members of an existing category that we know can be built tolerably well, are a very strained analogy to brand-new software deployed by people who understand how it works, and who therefore understand the risks they are taking.

And actually, the deployer has far more control over the havoc the software can cause than the creator does. They choose what credentials to give it, whether and how closely to monitor it, what other guardrails to put in place, and so on. If the operator of the bot discussed in the OP had intervened soon after it went off the rails, we wouldn't be here.

So sure, I would also tell the makers of this software to knock it off. Don't put out products that are the network equivalent of a chainsaw on a Roomba, no matter how many cool TikToks it creates. But when I'm talking to people running claws or whatever, they no longer have the excuse of ignorance. So the advice is still: do not run the program.