Comment by gwd
2 years ago
There's another factor in this, which makes it hard to change:
For the people in child welfare organizations, for social workers, for doctors, for police, for judges to change their mind about current and future decisions requires them to change their mind about past decisions. The necessary implication is that many of the people they have persecuted in the past were, in fact, innocent. It requires them to admit that they personally have likely caused untold suffering to parents, caretakers, and children.
This is hard for anyone; but if you've lived your life trying to be the hero, feeling good about swooping in and rescuing children from the clutches of evil villains, how can you face the fact that you are the evil villain in so many children's stories?
You might call this the Paradox of Judgment: If you don't say that something is that bad, then lots of people don't think it's a big deal and don't do anything about it. But if you do say that something is really bad, then there develop all these pathologies of denialism around it.
This is spot on. This psychological barrier is probably the number one obstacle to a wider recognition of the existence and extent of this problem.
People like me who challenge the science behind the diagnoses of SBS face an absolutely unprecedented and unreasonable pushback, like I've never seen in any other area. Basically everyone who has worked on this side has faced threats, insults, personal attacks, cancellations, boycotts, and so on. The "cognitive bias" you mention (does it have a name? perhaps cognitive dissonance?) is a likely reason for this amount of antagonism.
Closest name I can think of is "commitment and consistency". People tend to behave as they have behaved in the past, doing so is both a cognitive shortcut and a source of positive emotion. We go to great lengths to maintain consistency (see also: confirmation bias), and being consistent even in the face of conflicting evidence feels better than being inconsistent but right.
From Cialdini: "Once we have made a choice or taken a stand, we will encounter personal and interpersonal pressures to behave consistently with that commitment. Those pressures will cause us to respond in ways that justify our earlier decision."
In my head -- and I'm no doctor, believe you me! -- I call this "emotional inertia", which is a phrase I've borrowed from one of the many doctors who has treated me for depression.
It certainly would be a form of cognitive dissonance, but that's much more general; I experienced cognitive dissonance hearing the word "nicht" pronounced by a native German speaker yesterday evening, because it wasn't at all like what I expected it to sound like.
"Confirmation bias", where you tend to see what you expect to see, is narrower; but still I think doesn't capture what we're talking about. We're specifically talking about resistance to accepting the idea because accepting it would mean reclassifying actions you yourself had taken from "very good" to "very bad". It's kind of weird that it doesn't have a name -- I'm convinced it plays a pretty big part of human behavior, much more than is commonly acknowledged.
> We're specifically talking about resistance to accepting the idea because accepting it would mean reclassifying actions you yourself had taken from "very good" to "very bad".
That's exactly it. I'd love to discover scientific literature about this phenomenon, and I'd be surprised if it doesn't already have a name and an extensive literature. But if it really doesn't, I think there are research careers in psychology to be made here...
Edit: ChatGPT found "belief perseverance" [1] but, again, that's not exactly what we're talking about, which also relates to a personal sense of morality and "being one of the good guys".
[1] https://en.wikipedia.org/wiki/Belief_perseverance
8 replies →
It's the sunk cost fallacy.
This is why Max Planck (the German physicist) quipped that science advances one funeral at a time.
I think the related term, the escalation of commitment, suits this better.
It's hard to understand something if your salary depends on it being false.
It's much, much harder again to understand something if it makes your life's work ignoble.
It's hard to understand something if your salary depends on your misunderstanding.
It's even harder to understand something if your self-conception as honorable depends on your misunderstanding.
Interestingly enough, there's no bigger offender than the psychiatric and mental health community. There's a very sophisticated system for shutting down criticism and lashing out at patients who have civil rights concerns.
They do a lot of mental gymnastics trying to run from the idea that their main function is to imprison people and take away their rights, often without due process.
The medical industry is rife with abuse. They routinely kill people out of spite, torture dying people and their families, and want to be shielded from any criticism... so fuck all the patients and look for reasons they're "not righteous", etc., so you can dismiss them.
It's quite interesting (and disturbing) to see how much of the culture revolves around deflecting blame and victim blaming.
Your comment reminds me of the Rosenhan Experiment[1]. "The first part involved the use of healthy associates or "pseudopatients" (three women and six men, including Rosenhan himself) who briefly feigned auditory hallucinations in an attempt to gain admission to 12 psychiatric hospitals in five states in the United States. All were admitted and diagnosed with psychiatric disorders. ... The second part of his study involved a hospital administration challenging Rosenhan to send pseudopatients to its facility, whose staff asserted that they would be able to detect the pseudopatients. Rosenhan agreed, and in the following weeks 41 out of 193 new patients were identified as potential pseudopatients, with 19 of these receiving suspicion from at least one psychiatrist and one other staff member. Rosenhan sent no pseudopatients to the hospital."
1: https://en.wikipedia.org/wiki/Rosenhan_experiment
In fairness, psychiatry is totally different today than in 1973. The obvious change is that a huge reduction in the number of inpatient beds, combined with increasing demand, has created huge pressure to admit only the most obviously unwell patients and discharge them as quickly as possible. Most psychiatric inpatient stays are just a few days - just enough to get a patient through a crisis, revise their medication and (hopefully, but not always) arrange for appropriate outpatient care and support. The downtown of most US cities is a testament to the fact that, in 2023, under-treatment of severe mental illness is a far greater concern than over-treatment.
On an ontological level, psychiatry made a huge leap forward in 1980 with the publication of the DSM-III. One of its core goals was to address the concerns raised by the Rosenhan experiment by making diagnostic criteria more robust and reliable. While there are still many controversies and shortcomings - most prominently regarding the over-diagnosis of less severe conditions - we now have a suite of reliable, validated diagnostic instruments for most serious conditions. For the most part, we aren't diagnosing or treating patients based on the gut instinct of an individual practitioner; we're using objective criteria with proven inter-rater reliability, guided by the over-arching principle that, regardless of symptomatology, no-one is mentally ill unless a) they're experiencing distress and/or b) they're causing significant harm to others. There are many shortcomings in how psychiatric medicine is practised today, but the era of locking people up just because they behave strangely is definitively over.
My favorite control experiment! Tell somebody you did something and ask them what the results were, but don't actually do the something.
"Honey, I doubled the salt in the pasta this time, how does it taste?" "Oh, it's really salty". "Ha-HA! I didn't actually add ANY salt!"
(do not actually do this to people you like or who like you)
There's also the normalization of seeing and hearing awful things. After a while of being exposed to the wretches of humanity you begin to see the signals for the wretches everywhere.
As the warrior poet Maslow put it, "if the only tool you ever have is a hammer, you tend to see every problem as a nail."
I actually strongly suspect that this is a major issue with cops. Even the most well-meaning new hire is likely to become jaded and paranoid after years of interacting primarily with criminals. They are probably more likely to assume the worst of a given stranger, even in contexts where there is no reason to suspect that stranger.
Totally. Child abuse pediatricians, forensic pediatric pathologists, etc. are exposed on a daily basis to the very worst things imaginable in the world (autopsies of babies beaten to death and so on), and yet they need to keep a calm and rational stance by analyzing facts objectively. This is hard, and they don't always succeed. Some are led to see the worst in everyone, and they see potential child abusers in every parent and caregiver.
This can go quite far, with some experts stating that the histories reported by parents and caregivers bringing an injured child to the hospital are always falsified. That can surely happen, but a foundational tenet of medicine is to listen to the patient and parents.
I've seen experts concluding abuse in 100% of their cases, including those where children had obvious, DNA-proven genetic conditions causing the observed injuries. Fortunately, some judges remain reasonable and act as "gatekeepers" by exculpating parents and caregivers despite affirmative opinions from reputable experts. But many don't.
Thanks for this insight, I'd never thought about it this way.