Comment by ACCount37
1 day ago
Global warming just isn't harmful enough to pose a credible extinction risk.
The damage is too limited and happens far too slowly. Even the unlikely upper-end projections aren't enough to upend human civilization - the harms at the top end are "worse than WW2", but WW2 sure didn't end humankind. At the same time, the ever-worsening economics of fossil fuel power put a bound on climate change even in a "no climate policy" world - and we're currently outperforming that no-policy baseline.
It's like the COVID of global natural disasters: harmful enough to be worth taking measures against, but just harmless enough that you could do absolutely nothing and get away with it.
The upper bound on AI risks is: total extinction of humankind.
I fucking wish that climate change was the biggest threat as far as eye can see.
AI is less of an extinction risk than climate change. Especially what passes for AI these days. Climate change is going to displace billions of people. That alone is going to be chaos, but food and water shortages are going to be a problem too.
AI is only a threat if we suddenly reach sci-fi levels where AI concludes that the earth is better off without humanity on it, and there's zero indication that we're anywhere near that today.
The good news is that our society will likely collapse, or require resources pulled away from power- and water-hungry AI datacenters, well before we see any actual I in AI.
Climate change could lead to a massive world war over arable land and potable water. It could also make wildfires more common and more damaging, and it will make cyclonic storms stronger. This may not be extinction-level, but it could be a major pressure on population numbers.