Comment by throwawaylolllm
9 days ago
It's my belief (and I'm far from the only person who thinks this) that many AI optimists are motivated by an essentially religious belief that you could call Singularitarianism. So "wishful thinking" would be one answer. This document would then be the rough equivalent of a Christian fundamentalist outlining, on the basis of tangentially related news stories, how the Second Coming will come to pass in the next few years.
Crackpot millenarians have always been a thing. This crop of them is just particularly lame and hellbent on boiling the oceans to get their eschatological outcome.
This is a letter signed by the most lauded AI researchers on Earth, along with CEOs from the biggest AI companies and many other very credible professors of computer science and engineering:
"Mitigating the risk of extinction from AI should be a global priority alongside other societal-scale risks such as pandemics and nuclear war." https://www.safe.ai/work/statement-on-ai-risk
Laughing it off as the same as the Second Coming CANNOT work. Unless you think yourself cleverer and more capable of estimating the risk than all of these experts in the field.
Especially since many of them have incentives that should prevent them from penning such a letter.
Troubling that these eminent leaders do not cite climate change among societal-scale risks, a bigger and more certain societal-scale risk than a pandemic.
Would be a shame to have energy consumption by datacenters regulated, am I right?
Maybe global warming should be up there.
Perhaps they were trying to avoid any possible misunderstanding/misconstrual (there are misinformed people who don't believe in global warming).
In terms of avoiding all nitpicking, I think everyone who's not criminally insane believes in pandemics and nuclear bombs.
Spot on, see the 2017 article "God in the machine: my strange journey into transhumanism" about that dynamic:
https://www.theguardian.com/technology/2017/apr/18/god-in-th...
Eh, not sure if the second coming is a great analogy. That wholly depends on the whims of a fictional entity performing some unlikely actions.
Instead think of them predicting a crusade in the next few years. When the group saying the crusade is coming is spending billions of dollars trying to make just that occur, you no longer have the ability to say it's not going to happen. You are now forced to examine the risks of their actions.
Reminds me of Fallout's Children of Atom (the "Church of the Children of Atom").
Maybe we'll see "Church of the Children of Altman" /s
It seems that without a framework of ethics/morality (insert XYZ religion), we humans find one to grasp onto, be it a cult, a set of not-so-fleshed-out ideas/philosophies, etc.
People who say they aren't religious per se seem to have some set of beliefs that amounts to a religion. It just depends who or what you look towards for those beliefs, many of which seem to be haphazard.
The people I disagree with the most at least often have some awareness of which ideas/beliefs are unifying their structure of reality, whereas others are simply not aware.
A small minority of people can rely on schools of philosophical thought, and 'try on' or play with different ideas, while having the self-reflection to see when they transgress from ABC philosophy or when the philosophy doesn't match their identity to some degree.