
Comment by DonsDiscountGas

4 days ago

Check out "the precipice" by Tony Ord. Biological warfare and global warming are unlikely to lead to total human extinction (though both present large risks of massive harm).

Part of the argument is that we've had nuclear weapons for decades without an apocalypse, so the annual risk of extinction from that source probably can't be much larger than 1%. We've never created AI, so its risk might be substantially larger. Not a rock-solid argument, obviously, but we're dealing with a lot of unknowns.

A better argument is that most of those other risks are not neglected; plenty of smart people are already working against nuclear war. Whereas, up until a few years ago, very few people considered AI a real threat, so the marginal benefit of an additional person working on it should be bigger.