Comment by nathan_compton

4 days ago

Thinking too hard about anything will drive you insane, but I think the real issue here is that rationalists simply overestimate both the power of rational thought and their own ability to do it. If you think about the kind of people who tend to make that mistake, you can see how you end up with a lot of crazy groups.

I guess I'm a radical skeptic, secular humanist, utilitarianish sort of guy, but I'm not dumb enough to think that throwing around the words "Bayesian prior" and "posterior distribution" makes actually figuring out how something works, or predicting the outcome of an intervention, easy or certain. I've had a lot of life at this point and gotten to some level of mastery at a few things, and my main conclusion is that most of the time it's just hard to know stuff, and that the single most common cognitive mistake people make is too much certainty.

I'm lucky enough to work in a pretty rational place (small "r"). We're normally data-limited. Being "more rational" would mean taking or finding more of the right data, talking to the right people, and reading the right stuff. Not just thinking harder and harder about what we already know.

There's a point where more passive thinking stops adding value and starts subtracting sanity. It's pretty easy to get to that point. We've all done it.

  • > We're normally data-limited.

    This is a common sentiment, but it's probably not entirely true. A great example is cosmology. Yes, more data would make some work easier, but astrophysicists and cosmologists have shown that you can gather and combine existing data and look at it in novel ways to produce unexpected results, such as placing bounds that include or exclude various theories.

    I think a philosophy that encourages more analysis, rather than sitting back on our laurels with the excuse that we need more data, is a good thing, as long as it's done transparently and honestly.

    • This depends on what you are trying to figure out.

      If you are talking about cosmology? Yeah, you can look at existing data in new ways, because you probably have enough data to do that safely.

      If you are looking at human psychology? Looking at existing data in new ways is essentially p-hacking (see the sketch at the end of this comment). And you probably won’t ever have enough data to define a “universal theory of the human mind”.

      1 reply →
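
      To make the p-hacking point concrete, here is a toy simulation (my own illustration, numbers invented): treat twenty "fresh angles" on a null dataset as twenty independent tests and count how often pure noise yields at least one "significant" result.

        import random

        random.seed(0)
        TRIALS, LOOKS, N = 1000, 20, 30
        false_alarms = 0

        for _ in range(TRIALS):
            # Pure noise: there is no real effect to find.
            slices = [[random.gauss(0, 1) for _ in range(N)] for _ in range(LOOKS)]
            # Each "new way of looking at the data" is a crude z-test of a
            # slice's mean against zero.
            for s in slices:
                z = (sum(s) / N) * N ** 0.5  # se of the mean is 1/sqrt(N)
                if abs(z) > 1.96:            # nominal p < 0.05
                    false_alarms += 1
                    break

        print(f"spurious 'finding' in {false_alarms / TRIALS:.0%} of trials")
        # Roughly 1 - 0.95**20, i.e. about 64% of trials, from noise alone.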

    • I suspect you didn't read some parts of my comment. I didn't say everyone in the world is always data-limited, I said we normally are where I work. I didn't recommend "sitting back on our laurels." I made very specific recommendations.

      The qualifier "normally" already covers "not entirely true". Of course it's not entirely true; it's mostly true for us now. (In fact, twenty years ago we used more numerical models than we do now, because we were facing more unsolved problems where the solution was pretty well knowable just by doing more complicated calculations, without taking more data. Back then, when people started taking lots of data, it was often a total waste of time. But right now, most of those problems seem to be solved. We're facing different problems that seem much harder to model, so we rely more on data. This stage won't be permanent either.)

      It's not a sentiment, it's a reality that we have to deal with.

      2 replies →

I don't disagree, but to steelman the case for (neo)rationalism: one of its fundamental contributions is the claim that Bayes' theorem is extraordinarily important as a guide to reality, perhaps on the same level as the second law of thermodynamics, and that it is dramatically undervalued by society at large. I think that is all basically correct.

(I call it neorationalism because it is philosophically unrelated to the more traditional rationalism of Spinoza and Descartes.)
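
To make the steelman concrete, here is a minimal sketch (my own toy example, not anything from the rationalist canon) of what "Bayes' theorem as a guide to reality" amounts to in practice: iterated belief updating, with credence in two hypotheses about a coin revised after each flip.

    # Two hypotheses about a coin: fair (P(heads) = 0.5) vs. biased (P(heads) = 0.8).
    priors = {"fair": 0.5, "biased": 0.5}
    p_heads = {"fair": 0.5, "biased": 0.8}

    for flip in "HHTHHHHT":  # the observed evidence
        # Bayes' theorem: posterior is proportional to likelihood * prior,
        # then normalize so the credences sum to one.
        unnormalized = {
            h: (p_heads[h] if flip == "H" else 1 - p_heads[h]) * priors[h]
            for h in priors
        }
        total = sum(unnormalized.values())
        priors = {h: p / total for h, p in unnormalized.items()}

    print(priors)  # ~{'fair': 0.27, 'biased': 0.73}: six heads in eight
                   # flips shifts credence toward the biased hypothesis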

  • I don't understand what "Bayes' theorem is a good way to process new data" (something that is not at all a contribution of neorationalism) has to do with "human beings are capable of consciously using this process to arrive at better mental models of the world." I think the rationalist community has a term, "motte and bailey", that would apply here.

  • Applying Bayes' theorem in unconventional ways is not remotely novel to "rationalism" (except maybe in their strange, busted, hand-wavy circle-jerk "thought experiments"). This had been the domain of statistical mechanics long before Yudkowsky and the other cult leaders could probably even mouth "update your priors".

    • I don't know; most of science still runs on frequentist statistics. Juries convict all the time on evidence that would never withstand a Bayesian analysis. The prosecutor's fallacy is real (toy numbers below).

      1 reply →
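
      As a toy illustration of the prosecutor's fallacy (numbers invented for the example): a forensic match with a 1-in-10,000 false-positive rate sounds damning, but Bayes' theorem says otherwise once the size of the suspect pool is taken into account.

        # One true perpetrator in a pool of a million people; the test always
        # matches the guilty party and matches an innocent one time in 10,000.
        p_guilty = 1 / 1_000_000
        p_match_given_guilty = 1.0
        p_match_given_innocent = 1 / 10_000

        # Bayes: P(guilty | match) = P(match | guilty) * P(guilty) / P(match)
        p_match = (p_match_given_guilty * p_guilty
                   + p_match_given_innocent * (1 - p_guilty))
        posterior = p_match_given_guilty * p_guilty / p_match

        print(f"P(match | innocent) = {p_match_given_innocent:.4%}")  # 0.0100%
        print(f"P(guilty | match)   = {posterior:.2%}")               # ~0.99%
        # The fallacy is reading the first number as if it were the second.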

Even the real progenitors of a lot of this sort of thought, like E.T. Jaynes, espoused significantly more skepticism than I've ever seen from a "rationalist". I'd even wager that if you asked most rationalists who E.T. Jaynes was (assuming they weren't well versed in statistical mechanics), they'd have no idea why his work was important to applying "Bayesianism".

  • It would surprise me if most rationalists didn't know who Jaynes was. I first heard of him via rationalists. The Sequences talk about him in adulatory tones. I think Yudkowsky would acknowledge him as one of his greatest influences.

People find academic philosophy impenetrable and pretentious, but it has two major advantages over rationalist cargo cults.

The first is diffusion of power. Social media is powered by charisma, and while personality-based cults are certainly nothing new, the internet makes it far easier to form one. Contrast that with academic philosophy: people can have their own little fiefdoms, and there is certainly abuse of power, but it is rarely concentrated in the way you see within rationalist communities.

The second (and more idealistic) advantage is that the discipline of Philosophy is rooted in the Platonic/Socratic notion that "I know that I know nothing." People in academic philosophy are, on the whole, happy to provide a gloss on a gloss on some important thinker, or some kind of incremental improvement on somebody else's theory. This makes it extremely boring, and yet not nearly as susceptible to delusions of grandeur. True skepticism has to start with questioning one's self, but everybody seems to skip that part and go right to questioning everybody else.

Rationalists have basically reinvented academic philosophy from the ground up with none of the rigor, self-discipline, or joy. They mostly seem to dedicate their time to providing post-hoc justifications for the most banal unquestioned assumptions of their subset of contemporary society.

  • > Rationalists have basically reinvented academic philosophy from the ground up with none of the rigor, self-discipline, or joy.

    Taking academic philosophy seriously, at least as an historical phenomenon, would require being educated in the humanities, which is unpopular and low-status among Rationalists.

  • > True skepticism has to start with questioning one's self, but everybody seems to skip that part and go right to questioning everybody else.

    Nuh-uh! Eliezer Yudkowsky wrote that his mother made this mistake, so he's made sure to say things in the right order for the reader not to make this mistake. Therefore, true Rationalists™ are immune to this mistake. https://www.readthesequences.com/Knowing-About-Biases-Can-Hu...

  • > the discipline of Philosophy is rooted in the Platonic/Socratic notion that "I know that I know nothing."

    I can see how that applies to Socrates, but I wouldn't guess it also applies to e.g. Hegel.

The second-most common cognitive mistake we make has to be the failure to validate what we think we know -- is it actually true? The crux of being right isn't reasoning; it's avoiding dumb blunders based on falsehoods, both honest and dishonest. In today's political and media climate, I'd say dishonest falsehoods are a far greater cause of being wrong than irrationality.