Comment by Mali-

6 days ago

This is a letter signed by the most lauded AI researchers on Earth, along with CEOs from the biggest AI companies and many other very credible professors of computer science and engineering:

"Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war." https://www.safe.ai/work/statement-on-ai-risk

Laughing it off as the same as the Second Coming CANNOT work, unless you think yourself cleverer and more capable of estimating the risk than all of these experts in the field.

Especially since many of them have incentives that should prevent them from penning such a letter.

Troubling that these eminent leaders do not cite climate change among the societal-scale risks, when it is a bigger and more certain societal-scale risk than a pandemic.

Would be a shame to have energy consumption by datacenters regulated, am I right?

  • Maybe global warming should be up there.

    Perhaps they were trying to avoid any possible misunderstanding/misconstrual (there are misinformed people who don't believe in global warming).

    In terms of avoiding all nitpicking, I think everyone who's not criminally insane believes in pandemics and nuclear bombs.