
Comment by NoGravitas

4 days ago

But that's the great thing about Longtermism. As long as a catastrophe isn't going to lead to human extinction or otherwise specifically prevent the Singularity, it's not an X-Risk you need to be concerned about. So AI alignment is an X-Risk we need to work on, but global warming isn't, so we can keep burning as many fossil fuels as we want. In fact, we need to burn more of them in order to produce the Singularity. The misery of a few billion present/near-future people doesn't matter compared to the happiness of sextillions of future post-humans.