
Comment by staticman2

5 days ago

Terminator descends from a tradition of science-fiction Cold War parables. Even in Terminator 2 there's a line suggesting the movie isn't really about robots:

John: We're not gonna make it, are we? People, I mean.

Terminator: It's in your nature to destroy yourselves.

It seems odd to worry about computers shooting the ozone when there are plenty of real existential threats loaded into missiles aimed at you right now.

I'm not in any way discounting the danger represented by those missiles. In fact, I think AI only makes it more likely that they might someday be launched. But I will say that, in my experience, the error condition that causes a system to fail is usually the one that didn't seem likely to happen, because the more obvious failure modes were taken seriously from the beginning. Is it so unusual to consider more than one risk at a time?