Comment by AIPedant
4 days ago
I think I found the problem!
The rationalist community was drawn together by AI researcher Eliezer Yudkowsky’s blog post series The Sequences, a set of essays about how to think more rationally
I actually don't mind Yudkowsky as an individual - I think he is almost always wrong and undeservedly arrogant, but mostly sincere. Yet treating him as an AI researcher and serious philosopher (as opposed to a sci-fi essayist and self-help writer) is the kind of slippery foundation that less scrupulous people can build cults from. (See also Maharishi Mahesh Yogi and related trends - often it is just a bit of spiritual goofiness, as with David Lynch; sometimes you get a Charles Manson.)
How has he fared in the fields of philosophy and AI research in terms of peer review? Is there some kind of roundup or survey about this?
EY and MIRI as a whole have largely failed to produce anything that even reaches the point of being peer-reviewable. He does not have any formal education and is uninterested in learning how to navigate academia.
I see. But should this really preclude review, "peer" or not? Philosophers talk about ideas from non-academic thinkers all the time, after all.
Don't forget the biggest sci-fi guy turned cult leader of all: L. Ron Hubbard.
I don't think Yudkowsky is at all like L. Ron Hubbard. Hubbard was insane and pure evil. Yudkowsky seems like a decent and basically reasonable guy; he's just kind of a blowhard and he's wrong about the science.
L. Ron Hubbard is more like the Zizians.
I don't have a horse in this race, but could you provide a few examples of where he was wrong?