
Comment by skrebbel

4 days ago

This article is beautifully written, and it's full of proper original research. I'm sad that most comments so far are knee-jerk "lol rationalists" type responses. I haven't seen any comment yet that isn't already addressed in much more colour and nuance in the article itself.

The contrarian dynamic strikes again! https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que...

(I'm referring to how this comment, objecting to the other comments as unduly negative, has been upvoted to the top of the thread.)

(p.s. this is not a criticism!)

  • I think that since it's not possible to reply to multiple comments at the same time, people will naturally open a new top-level comment the moment there's a clearly identifiable groupthink emerging. Quoting one of your earlier comments about this:

    > This happens so frequently that I think it must be a product of something hard-wired in the medium *[I mean the medium of the internet forum]

    I would say it's only hard-wired in the medium of tree-style comment sections. If HN worked more like linear forums with multi-quote/replies, it might be possible to have multiple back-and-forths of subgroup consensus like this.

  • Hahaha yeah, true. If I had been commenting earlier I might’ve written “lol rationalists”.

> I haven't seen any comment yet that isn't already addressed in much more colour and nuance in the article itself.

I once called rationalism infantile, impotent liberal escapism; perhaps that's the novel take you are looking for.

Essentially, my view is that the fundamental problem with the rationalists and the effective altruist movement is that they talk about profound social and political issues with any and all politics completely removed from them. It is liberal depoliticisation[1] driven to its ultimate conclusion. That is why they are ineffective and wrong about everything, but it is also why they are popular among the tech elites who are giving millions to associated groups like MIRI[2]. They aren't going away; they are politically useful and convenient to very powerful people.

[1] https://en.wikipedia.org/wiki/Post-politics

[2] https://intelligence.org/transparency/

Asterisk is basically a "rationalist magazine" and the author is a well-known rationalist blogger, so it's no surprise that this is essentially the only fair look into this phenomenon, as opposed to the typical outside view that rationalism itself is a cult and Eliezer Yudkowsky is a cult leader, both of which I consider absurd notions.

  • The view from the inside, written by a person who is waist deep into the movement, is the only fair look into the phenomenon?

    • In theory, there should be a middle way between "waist deep into the movement" and "my research consists of collecting rumors on the internet, and then calling one or two people to give me a quote".

      In practice, I don't remember reading an article on the rationality community written from such position. Most articles are based on other articles, which are based on yet other articles... ultimately based on someone's opinion posted on their blogs. (Plus the police reports about the Zizians.)

      I think it would be really nice for a change if e.g. some journalist infiltrated the rationality community under a fake identity, joined one of their meetups or workshops, talked to a few people there, and then heroically exposed to the public all the nefarious plans... or the lack thereof. Shouldn't be that hard, I think. New people are coming all the time, no one does a background check on them. Yet for some mysterious reason, this never happens.

      Notice how this article describes more bad things in the community than a typical outsider-written article. Three specific rationalist cults were named! The difference is not insider vs outsider, but having specific information vs vibes-based reporting.


    • Okay, true, that was a silly statement for me to make. It's just a look that's different from the typical media treatment of the rationalist community, and it is, as far as I know, the first inside view of this cult-spawning phenomenon from a media outlet or publication.

      The story from the outside is usually reduced to something like "rationalism is a wacky cult", with the recent ones tacking on "and some of its members include this Ziz gang who murdered many people". Like the NYT article a week ago.

  • > the typical outside view that rationalism itself is a cult and Eliezer Yudkowsky is a cult leader, both of which I consider absurd notions

    Cults are a whole biome of personalities. The prophet does not need to be the same person as the leader. They sometimes are and things can be very ugly in those cases, but they often aren’t. After all, there are Christian cults today even though Jesus and his supporting cast have been dead for approaching 2k years.

    Yudkowsky seems relatively benign as far as prophets go, though who knows what goes on in private (I’m sure some people on here do, but the collective We do not). I would guess that the failure mode for him would be a David Miscavige type who slowly accumulates power while Yudkowsky remains a figurehead. This could be a girlfriend or someone who runs one of the charitable organizations (controlling the purse strings when everyone depends on the organization for their next meal is a time-honored technique). I’m looking forward to the documentaries that get made in 20 years or so.

I think it's perfectly fine to read these articles, think "definitely a cult" and ignore whether they believe in spaceships, or demons, or AGI.

The key takeaway from the article is that if you have a group leader who cuts you off from other people, that's a red flag – not really a novel, or unique, or situational insight.

  • That's a side point of the article, acknowledged as an old idea. The central points of this article are actually quite a bit more interesting than that. He even summarized his conclusions concisely at the end, so I don't know what your excuse is for trivializing it.

  • The other key takeaway, that people with trauma are more attracted to organizations that purport to be able to fix them, and are thus over-represented in those organizations (vs. the general population), is also important.

    Because if you're going to set up a hierarchical (explicitly or implicitly) isolated organization with a bunch of strangers, it's good to start by asking "How much do I trust these strangers?"

  • > The key takeaway from the article is that if you have a group leader who cuts you off from other people, that's a red flag

    Even better: a social group with a lot of invented lingo is a red flag that you can see before you get isolated from your loved ones.

    • By this token, most scientists would be considered cultists: normal people don't have "specific tensile strength" or "Jacobian" or "Hermitian operator", etc., in their vocabulary. "Must be some cult"?

      Edit: it seems most people don't understand what I'm pointing out.

      Having terminology is not the red flag.

      Having intricate terminology without a domain is the red flag.

      In science or mathematics, there are enormous amounts of jargon, terms, definitions, concepts, but they are always situated in some domain of study.

      The "rationalists" (better call them pseudorationalists) invent their own concepts without actual corresponding domain, just life. It's like kids re-inventing their generation specific words each generation to denote things they like or dislike, etc.


  • > The key takeaway from the article is that if you have a group leader who cuts you off from other people, that's a red flag – not really a novel, or unique, or situational insight

    Well, yes and no. The reason I find this insight so interesting is that these groups were formed, almost definitionally, for the purpose of avoiding such "obvious" mistakes. The name of the group is literally the "Rationalists"!

    I find that funny and ironic, and it says something important about this philosophy: it implies that the rest of society wasn't so "irrational" after all.

    As a more extreme and silly example, imagine there was a group called "Cults suck, and we are not a cult!" that was created for the very purpose of fighting cults and yet, ironically, became a cult in and of itself. That would be insightful and funny.