Comment by andrewflnr

18 hours ago

I caught the reference. To the extent it applies at all, I obviously think it reinforces my point. But badly engineered cars, members of an existing category that we know can be done tolerably well, are a very strained analogy to brand-new software deployed by people who understand how it works, and therefore the risks they are taking.

And actually, the deployer has a lot more control over the havoc the software can cause than the creator. They choose what credentials to give it, whether and how closely to monitor it, any other guardrails, etc. If the operator of the bot discussed in the OP had intervened soon after it went off the rails, we wouldn't be here.

So sure, I would also tell the makers of this software to knock it off. Don't put out products that are the network equivalent of a chainsaw strapped to a Roomba, no matter how many cool TikToks it generates. But when I'm talking to people running claws or whatever, they no longer have the excuse of ignorance. So the advice is still: do not run the program.