Comment by Viliam1234
3 days ago
LessWrong was originally the personal blog of Eliezer Yudkowsky. It was an inspiration for what later became the "rationality community". These days, LessWrong is a community blog. The original articles were published as a book, freely available at: https://www.readthesequences.com/ If you read it, you can see what the community was originally about, but it is long.
Some frequent topics debated on LessWrong are AI safety, human rationality, and effective altruism. But it has no strict boundaries; some people even post about their hobbies or family life. Debating politics is discouraged, but not banned. The website is mostly moderated by its users, who vote on articles and comments. The voting is relatively strict and can be scary for many newcomers. (Maybe it is not strategic to say this, but most comments on Hacker News would probably be downvoted on LessWrong for insufficient quality.)
Members of the community, the readers of the website, are all over the planet. (Just what you would expect from readers of an internet forum.) But in some cities there are enough of them to organize an offline meetup once in a while. And in a very few cities there are so many of them that they are practically a permanent offline community; most notably in the Bay Area.
I don't live in the Bay Area. To describe how the community functions in my part of the world: we meet about once a month, sometimes less frequently, and we discuss various nerdy stuff. (Apologies if this is insufficiently impressive. From my perspective, the quality of those discussions is much higher than anything I have seen elsewhere, but I guess there is no way to provide this experience second-hand.) There is a spirit of self-improvement; we encourage each other to think logically and to try to improve our lives.
Oh, and how does the bad part connect to all this?
Unfortunately, although the community is about trying to think better, for some reason it also seems very attractive to people who are looking for someone to tell them how to think. (I mean, we do tell them how to think, but in a very abstract way: check the evidence, remember your cognitive biases, et cetera.) They are perfect material for a cult.
The rationality community itself is not a cult. There is too much disagreement and criticism of our own celebrities for that! There is also no formal membership; anyone is free to come and go. Sometimes a wannabe cult leader joins the community, takes a few vulnerable people aside, and starts a small cult. In two of the three examples in the article, it was a group of about five people -- when you have hundreds of members in a city, you won't notice when five of them start attending your meetups less frequently and then disappear completely. And one day... you read about them in the newspapers.
> How do we go from “let’s rationally analyze how we think and get rid of bias” to creating a crypto, or being hype focused on AI, or summoning demons? Why did they raise this idea of matching confrontation always with escalation?
Rationality and AI have always been the focus of the community. Buying crypto was considered common sense back when Bitcoin was cheap, but I haven't heard anyone talking about crypto in the rationality community recently.
On the other hand, believing in demons, and the idea that you should always escalate... those are specific ideas of the leaders of the small cults, definitely not shared by the rest of the community.
Notice how the first thing the wannabe cult leaders do is isolate their followers even from the rest of the rationality community. They are quite aware that what they are doing would be considered wrong by the rest of the community.
The question is, how can the community prevent this? If your meetings are open to everyone, how can you prevent one newcomer from quietly contacting a few other newcomers, meeting them in private, and brainwashing them? I don't have a good answer to that.