Comment by BrenBarn
4 days ago
This is a very interesting article. It's surprising though to see it not use the term "certainty" at all. (It only uses "certain" in a couple of instances like "a certain X" and one use of "certainly" for generic emphasis.)
Most of what the article says makes sense, but it seems to sidestep the issue that a major feature distinguishing the "good" rationalists from the "bad" is that the bad ones are willing to take very extreme actions in support of their beliefs. This is not coincidentally something that distinguishes good believers in various religions or philosophies from bad believers (e.g., people who say God told them to kill people). This is also lurking in the background of discussion of those who "muddled through" or "did the best they could". The difference is not so much in the beliefs as in the willingness to act on them, and that willingness is in turn largely driven by certainty.
I think it's plausible there is a special dimension to rationalism that may exacerbate this, namely a tendency of rationalists to feel especially "proud" of their beliefs because of their meta-belief that they derived their beliefs rationally. Just like an amateur painter may give themselves extra brownie points because no one taught them how to paint, my impression of rationalists is that they sometimes give themselves an extra pat on the back for "pulling themselves up by their bootstraps" in the sense of not relying on faith or similar "crutches" to determine the best course of action. This can paradoxically increase their certainty in their beliefs when actually it's often a warning that those beliefs may be inadequately tested against reality.
I always find it a bit odd that people who profess to be rationalists can propose or perform various extreme acts, because it seems to me that one of the strongest and most useful rational beliefs is that your knowledge is incomplete and your beliefs are almost surely not as well-grounded as you think they are. (Certainly no less an exponent of reason than Socrates was well aware of this.) This on its own seems sufficient to me to override some of the most absurd "rationalist" conclusions (like that you should at all costs become rich or fix Brent Dill's depression). It's all the more so when you combine it with some pretty common-sense forecasts of what might happen if you're wrong. (As in, if you devote your life to curing Brent Dill's depression on the theory that he will then save the world, and he turns out to be just an ordinary guy or worse, you wasted your life curing one person's depression when you yourself could have done more good with your own abilities, just by volunteering at a soup kitchen or something.) It's never made sense to me that self-described rationalists could seriously consider some of these possible courses of action in this light.
Sort of related is the claim at the end that rationalists "want to do things differently from the society around them". It's unclear why this would be a rational desire. It might be rational in a sense to say you want to avoid being influenced by the society around you, but that's different from affirmatively wanting to differ from it. This again suggests a sort of "psychological greed" to reach a level of certainty that allows you to confidently, radically diverge from society, rather than accepting that you may never reach a level of certainty that allows you to make such deviations on a truly rational basis.
It's also interesting to me that the article focuses a lot not on rationalist belief per se, but on the logistics and practices of rationalist communities. This in itself seems like a warning that the rationality of rationalism is not all it's cracked up to be. It's sort of like, you can try to think as logically as possible, but if you hit yourself in the head with a hammer every day you're likely going to make mistakes anyway. And some of the "high demand" practices mentioned seem like slightly less severe psychological versions of that.
> it seems to sidestep the issue that a major feature distinguishing the "good" rationalists from the "bad" is that the bad ones are willing to take very extreme actions in support of their beliefs.
What is a "very extreme action"? Killing someone? In our culture, yes. What about donating half of your salary to charity? I think many people would consider that quite extreme, too. Maybe even more difficult to understand than the murder... I mean, prisons are full of murderers; they are not so exceptional.
The difference is that the bad ones are willing to take abusive actions.
> It's also interesting to me that the article focuses a lot not on rationalist belief per se, but on the logistics and practices of rationalist communities.
That's what happens when you read about the rationality community from someone who is actually familiar with it. If you want to determine whether a group is dysfunctional (e.g., a cult), the actual practices are much more important than the stated beliefs. You could have two communities with the same or very similar beliefs, yet one of them nice and the other one abusive.
> What about donating half of your salary to charity? I think many people would consider that quite extreme, too.
Maybe, but there are also degrees of extremity in terms of stuff like how broadly you donate (there's a difference between donating a huge amount to one charity vs. spreading it across 10). Also, I don't think the mere fact of donating half your salary would itself necessarily be seen as extreme; it would depend on the person's total wealth. It seems not unusual for wealthy individuals who get certain jobs to donate (or refuse) their entire salary (like Arnold Schwarzenegger declining his salary as CA governor).
Ultimately though I don't agree that this is anywhere close to as extreme as cold-blooded murder.
> I mean, prisons are full of murderers; they are not so exceptional.
I have a hunch that a large proportion of murderers in prisons are not comparable to rationalist murderers. There's a difference between just killing someone and killing someone due to your belief that it is the rational and correct thing to do. A lot of murders are crimes of passion or occur in the commission of other crimes. I could see an intermediate case where someone says "We're going to rob this bank and if the guard gives us any trouble we'll just shoot him", which is perhaps comparable to "always escalate conflict", but I don't think most murders even reach that level of premeditation.
> The difference is that the bad ones are willing to take abusive actions.
I'm not so sure that abusiveness is the difference, rather than that they are willing to take extreme actions, and the extreme actions they wind up taking (for whatever reason) happen to be abusive. It's sort of like firing a gun into a crowd: your willingness to do so is important whether or not you actually hit anyone. Similarly, a willingness to go well outside the bounds of accepted behavior is worrisome even if you don't happen to harm anyone by doing so. I could certainly imagine that many rationalists do indeed formulate belief systems that exclude certain kinds of extreme behavior while allowing others. I'm just saying, if I found out that someone was spending all their days doing any spookily extreme thing (e.g., 8 hours a day building a scale model of Hoover Dam one grain of sand at a time), I would feel a lot less safe around them.
> > It's also interesting to me that the article focuses a lot not on rationalist belief per se, but on the logistics and practices of rationalist communities.
> That's what happens when you read about the rationality community from someone who is actually familiar with it. If you want to determine whether a group is dysfunctional (i.e. a cult), the actual practices are much more important than the stated beliefs.
Sure. My point is just that, insofar as this is true, it means what the article is saying is more about cults in general and less about anything specific to rationalism.
> My point is just that, insofar as this is true, it means what the article is saying is more about cults in general and less about anything specific to rationalism.
The lesson I took from it was how tiny cults can appear inside a larger group which itself is not a cult. Not specific to rationalism, I agree.
There are a few thousand rationalists, and the sizes of the cults mentioned in the article were about 20, 5, and 5, I think.