Comment by aharrison
10 years ago
That is an excellent essay and it scares the crap out of me, because I see it play out every day in both directions and it often does feel exactly like epistemic learned helplessness.
There are so many biases that affect individuals (hindsight, hyperbolic discounting, confirmation, etc.) that to get correct answers, you have to look at and trust other accounts. You have to do studies, you have to trust those studies, and you have to track actual hard evidence.
But large enough systems become increasingly opaque, and it can be hard to test your web of trust empirically. This reduces huge organizations to cargo cult behavior, or worse, fundamentally unsound behavior because "it worked for me."
I don't know what to do about this besides what CFAR is trying, which is to get people better at weighing evidence and changing their damned minds. You have to constantly be asking yourself: "what would the world look like if this were true? What would the world look like if it weren't?" It is hard work, but I think it is critical to our continued improvement as a civilization and a species.
I don't know if the right answer is to be generally more open to changing your mind.
There are far too many things to be worried about: the danger of AI, religion, the environment, other political views, the results of scientific papers in dozens of fields... epistemic learned helplessness is a defence against wasting literally all of your time taking every argument seriously.
At the end, the author notes that we should be glad for the specialists who are well-versed enough in certain subjects to actually be able to evaluate and experiment on outlandish theories and unusual study results.
Maybe we need to emphasize more specialization? Or is it working alright already?
The other solution is to not really care all that much about most things.
There's some new physics result about how black holes interact with the quantum foam? I can't build anything practical with it, so beyond "that's interesting" it doesn't really matter to me.
Some food is suddenly found to cause cancer in rats? Unless it's fairly new, the effect on humans can't be all that strong or we'd have noticed by now. Something else will probably kill me first (like whatever the medical term for "old age" is these days).
Someone's claiming that some particular aspect of modern diet causes obesity? It would have to either make you feel lethargic (lower calories out) or make you hungrier (raise calories in). Both of which are directly observable and easily correlated to the contents of recent meals, without having to argue over mechanisms and confounders and personalized gut bacteria and such.
There's a new higher estimate for the percent of scientific results that are fake or poorly done or just statistical noise? This does help for knowing how not-seriously to take everything, but beyond "that's too damn high" the exact numbers aren't all that relevant (unless you're testing a fix).
Someone wants money to research AI risk? Are they looking at how to build safe AI, or how to prevent anyone either carelessly or deliberately building unsafe AI? Only the second is worth looking into further.
>Someone's claiming that some particular aspect of modern diet causes obesity? It would have to either make you feel lethargic (lower calories out) or make you hungrier (raise calories in). Both of which are directly observable and easily correlated to the contents of recent meals, without having to argue over mechanisms and confounders and personalized gut bacteria and such.
This is extremely facile. Take trans fats, for instance: they scar arteries and cause systemic inflammation without making you lethargic or hungrier, so the harm wouldn't show up in either of those observable channels.