Comment by paxys

4 days ago

> We have prior art that says humans don't just launch all the nukes just because the computers or procedures say to.

This relies on processes being in place to ensure that a human will always make the final decision. What about when that gets taken away?

I find it hard to imagine that the people in a position to kill those processes could ever be that zealously in love with AI, but recent events have given me a tiny bit of doubt.

  • I mean, in the cases where higher command ordered a launch, lower command refused, and everything turned out okay, higher command might agree it was good things worked out this time, but it also looks to them like a flaw in the system that needs to be automated away. A computer that will launch all the nukes when ordered must look very appealing compared to humans who might save humanity.