Comment by jmoggr
4 days ago
I think the comments here have been overly harsh. I have friends in the community and have visited the LessWrong "campus" several times. They seemed very welcoming and sincere, and they were kind and patient even when I was basically asserting that several of their beliefs were dumb (in a hopefully somewhat respectful manner).
As for the AI doomerism, many in the community have more immediate and practical concerns about AI; however, the most extreme voices are often the most prominent. I also know that there has been internal disagreement about the kind of messaging they should use to raise concerns.
I think rationalists get plenty of things wrong, but I suspect that many people would benefit from understanding their perspective and reasoning.
> They seemed very welcoming, sincere, and were kind and patient even when I was basically asserting that several of their beliefs were dumb
I don't think LessWrong is a cult (though certainly some of their offshoots are), but it's worth pointing out that this is very characteristic of cult recruiting.
For cultists, recruiting cult fodder is of overriding psychological importance. They are sincere, yes, but the consequences are not what you and I would expect from sincere people. Devotion is not always advantageous.
Do insincerity, cruelty, unfriendliness, and impatience make a community less likely to be a cult?
> They seemed very welcoming, sincere, and were kind and patient even when I was basically asserting that several of their beliefs were dumb
I mean, I'm not sure what that proves. A cult that is reflexively hostile to unbelievers won't be a very effective cult, as that would make recruitment almost impossible.