Comment by bondarchuk
16 days ago
Can we really conclude that people "see" what they say they see? I think most people would not think twice about saying "protesters did not block the road" when in fact they know full well protesters blocked the road and they really mean "protesters blocked the road and that's good actually".
There's a strange pattern of die-hard obstinacy, even in the face of basic, common facts that we as a society all agreed upon until fairly recently. The reason is that it works: if you admit fault or guilt, the usual consequences follow. If they remain obstinate, there's a chance they can project their crime onto someone else, which doesn't really work, except that it does retain for them a certain level of public support from those who "see" what they want to see.
It's devastating society.
I think there's a real deficit in research on, and understanding of, motivated cognition, and a lot of blurriness about attitude versus belief versus perception. I don't just mean political topics; I mean things like physical pain and much else. When someone states something, it's very difficult to distinguish between "this is honestly what I saw or felt" and "this is what I wanted to see or feel". And once you add in the fact that consensus can be wrong, it leads to all sorts of issues.
It would be nice to have some way to pin down where in the percept -> attitude -> construal chain (which is probably more of a feedback loop) we are.
Before getting to research, I think a more honest attitude towards admitting motivated cognition in oneself and others is appropriate. I may give a spur-of-the-moment remark on a political situation, but at least if someone presses me, I will readily provide more insight on my biases and values. When I take the time to contemplate, I usually try to modify my eventual response to avoid undue bias altogether. Being reminded that motivated cognition is pervasive in all of us should reduce the unintentional-but-convenient faults in our cognition.
The tricky part is that what people report seeing isn't necessarily what they actually see, and you can't look inside their brains to get at what they meaningfully perceive.
A good example of this was the inauguration crowd-size photos: people who were unfamiliar with the topic reported a unified perception of which crowd was bigger based purely on the photos, while people who knew what the photos showed varied their conclusions with their political stance.
One conclusion you could draw from this is that their beliefs were altering their perception, but how would you distinguish that from people altering their expression of what they saw based upon their beliefs?
That basketball-gorilla ("invisible gorilla") experiment seems like pretty solid evidence that people notice only what they expect to see and are primed to pay attention to, even in situations with no ideological component.