What We See May Depend on What We’re Willing to See
On November 23, 1951, Dartmouth played Princeton in a football game filled with penalties. Early in the game, Princeton’s star player left with a broken nose. Then a Dartmouth player’s leg was broken. By the end of the game, both sides had been penalized many times. Days later, two social researchers asked students from each school who they felt started the rough play. Only 36 percent of Dartmouth students blamed their own team and 86 percent of Princeton students blamed Dartmouth. Concerned that hearsay or press reports may have biased these results, the researchers asked a new group of students from both schools to watch the game film and record infractions of the rules. Dartmouth students reported about the same number for each team, but Princeton students recorded twice as many Dartmouth infractions as for their own team. Yet all saw the same game. Or did they?
The researchers acknowledged that all students received the same “sensory impingements” but interpreted what actually happened differently. But what if some students literally did not have the same “sensory impingements”? If the brain were a camera, faithfully recording everything it receives from our senses, that could not be the case. Is it possible that many students didn’t merely ignore infractions by their own players but literally didn’t see them?
A recent study by Professors Rafael Polania and Todd Hare suggests such distortions may occur because our visual perception is shaped by an unconscious desire to see (or not see) the world in a certain way. They asked 86 participants to compare pairs of black-and-white patterns, known as Gabor patches, and say which pattern in each pair was closer to a 45-degree angle. Participants earned 15 points for every correct answer, and the goal was to amass as many points as possible. In a second round with the same pairs, the points awarded ranged from 0 to 45, depending on how close to 45 degrees the chosen patch was judged to be. Since both rounds used the same Gabor pairs, the sensory input on the retina was the same – but the results were not. In the second round, participants judged more patterns to be close to 45 degrees than in the first. The researchers concluded that we may in fact unconsciously alter our perception based on the benefit we stand to gain. As Polania put it: “People flexibly and unconsciously adjust their perceptions when it works to their advantage.”
In terms of the Dartmouth-Princeton results, it’s possible that each team’s fans literally saw only the penalties they wished to see. Or, as Polania put it of his experiment: “As soon as we look at something, we try to maximize our own benefit. This means that cognitive bias starts long before we consciously think about something.”
We usually assume that cognitive bias sets in after we receive sensory input, when we interpret and distort it. But Polania and Hare suggest that we may distort perception unconsciously, before any thinking is applied to it. In short, we physically can’t see what we don’t want to see. While their research focused on visual perception, it’s possible the same distortion occurs with other sensory inputs.
In decisions with more profound consequences than watching a football game, this matters a great deal. What if a police officer suspicious of racial minorities literally cannot see that a black man is holding a cell phone, not a gun? What if voters, such as a liberal Democrat and a conservative Republican, cannot hear objective information in a candidate’s speech because they simply don’t trust her? What if partners arguing about their debt can see only what the other purchased on their credit card statement?
Polania and Hare’s research also suggests that confirmation bias may be fueled by this unconscious filtering of sensory inputs. Readers of the same climate research may come to radically different conclusions because their attention is focused only on what confirms their existing beliefs. The rest of the data in a report may be literally unseen, filtered out before it can be consciously considered.
Understanding the power of the unconscious mind to filter perception and play to our preferences ought to make us cautious about what we’re certain we know, especially in interactions with those who “see” the same situation differently. The question “how can you be so blind?”, which we ask of others who ignore what is plainly visible to us, is one we should realize applies to ourselves as well.
“We don’t see things as they are, we see them as we are,” the writer Anaïs Nin once remarked. Sharing and discussing our differing perceptions with some humility could make us better family members, friends and citizens. We may not change our minds, but at least we won’t be closed to what others see that we do not.