Comment by argomo
19 hours ago
Not everybody thinks it's nonsensical. Here's a different take:
If Anyone Builds It, Everyone Dies https://en.wikipedia.org/wiki/If_Anyone_Builds_It,_Everyone_...
Yudkowsky is a clown; the local crackhead on your street is probably more accurate and less insane than he is.
By his account, we should have had an AI-induced apocalypse a few times over by now.
Being an insane clown (posse optional) with less accuracy than the town crackhead doesn't seem to be a barrier to success in tech anymore.
Certainly makes you qualified to be CEO or Spokesperson.
Yes, nonsensical people like EY don’t think it’s nonsensical.
Researchers at top AI labs don't consider EY a kook, even if they don't necessarily agree with him. EY's concepts and terminology appear in Anthropic safety papers, and Geoffrey Hinton takes him quite seriously and mentions him in interviews.
Anthropic is the AI doomer / safetyism lab, and Hinton is one of the patron saints of 'rationalist' AI doomerism.
AI doomerism is psychologically attractive to "people with autistic cognitive traits, including dichotomous (black-and-white) thinking, intolerance of uncertainty, and a tendency toward catastrophizing". They are Pascal's-mugging themselves, to ironically borrow one of their own terms. It's fundamentally a cognitive distortion.
14 replies →
Just because some researchers are infected with the idiocy EY propagates doesn't mean it's legit.
Maybe they should pay more attention to real problems, like the sycophancy of current LLMs inducing psychosis in people, and worry less about theoretical AGI.
3 replies →
And people working on the metaverse endlessly referenced Ready Player One despite it being ludicrous fiction.
Yudkowsky is obviously read a lot by some people working in AI. That doesn't make his ideas prescient.
1 reply →
Researchers at top AI labs also have an incentive to say whatever it takes to get their lab funded, reason be damned.
EY = Eliezer Yudkowsky
Appreciate that you made an account just for this. I was well aware of Yudkowsky, but even so I couldn't parse the "EY" initialism.
Thank you. Like most of the world, I'd assume "EY" refers to Ernst & Young, the multinational Big Four firm at ey.com, which I'm sure has opinions on AI, though nowhere near enough to count as expertise.
That book was written by him, so I figured the acronym was obvious. My bad!
OK, but that's a metaphor for the free market, not literal speculation about a machine.
Edit: I was mistaken, and people clearly do take this seriously now. Oh dear.