"What is the most resilient parasite?” asks Leonardo DiCaprio in the 2011 movie Inception. “Bacteria? A virus? An intestinal worm?”
Before anyone can ask what exactly he’s getting at (or volunteer some gruesome facts about, say, toxoplasmosis), DiCaprio answers his own question. “An idea,” he declares, with a meaningful squint. “Resilient... highly contagious. Once an idea has taken hold of the brain, it’s almost impossible to eradicate.”
Metaphors aside, Leo’s got a point. We all know someone clinging steadfastly to a belief that is provably false. How can an otherwise intelligent person end up with such blatant blind spots, seemingly immune to logic or fact? One culprit is confirmation bias.
Confirmation bias occurs when you selectively choose which information to value, and which information to discard, based on the picture you want to paint for yourself. Imagine a detective who needs to believe his client is innocent so badly that he keeps insisting the client’s alibi is the most important piece of evidence, all while maintaining that the bloody knife found in the client’s glove compartment means nothing at all.
There’s a reason we do this. Life is full of contradictions, and the human brain was not built to wrestle with constant paradoxes. In order to assemble a coherent view of the world around us, sooner or later, we all have to do some mental sorting: whom to listen to, how to interpret what they say, and what even matters. Ideally, we’ll pursue the truth above all else, considering the evidence as it emerges, questioning our assumptions, and constructing the best possible conclusion from the facts we’ve gathered.
The trouble is, we are far more emotional than we want to believe. Left unchecked, our feelings seep into the proceedings, and that means all too often, not only do we give our idea-parasites a free ride—we actually let them call the shots.
Take job interviews. What qualities are important in a police chief? One study had participants examine two (fictitious) resumes: one for a streetwise risk-taker beloved by local cops but with a low aptitude for paperwork, and one for an educated administrator with an excellent understanding of procedure but no rapport with the rest of the force. The survey respondents then had to decide which traits to prioritize in hiring. The catch: at random, one applicant was assigned the name “Michael” and the other was “Michelle.”
Which qualities were overall valued more highly? Whichever qualities were attached to Michael.
When the more educated candidate was assumed to be male, the respondents thought education was a key requirement for a police chief. When Michelle was the educated one, respondents were noticeably more likely to devalue education. Overall, this pattern was slightly true among female survey takers, and significantly more present among men.
Interestingly, the pro-male bias was especially pronounced among survey takers who rated themselves as “highly objective.”
So, how do we avoid these confirmation bias pitfalls?
1. Have the courage to examine your own built-in biases.
There’s a word for people who insist they are always purely rational and objective: wrong. Your point of view is informed by your experiences, your politics, and the types of news you consume. Ask yourself: “What do I already believe? What do I want to believe? How might this color my perceptions?”
2. Try to define your guidelines ahead of time, and apply them evenly to all cases.
If a scientific study surveys only twenty people, even a freshman biology student could tell you that the sample is too small to draw any real conclusions. But we’re far more likely to notice this sort of shortcoming when we disagree with the results.
Take the police chief hiring study cited above. It was an attractive example to use in this blog post because it so keenly illustrated our point. However, had it failed to demonstrate the power of confirmation bias, a person writing a post like this one might have been more likely to decry its sample size of 73 people and caution against drawing broad conclusions. In this case, a 2015 meta-study examined 136 hiring bias studies, with a combined sample size of more than 20,000, and found similar results. Still, this is something to always look out for.
3. Make an effort to reject black-and-white thinking.
We can all name public figures whose work is important to us. However, be careful not to put any human on a pedestal. If you can begin by accepting that someone you like can still make mistakes—even big mistakes—you are more likely to retain a degree of critical thinking about them, thus leaving the door open to re-evaluate your opinion if need be.
4. Consider your sources.
Expertise matters. A climate scientist is more likely to have an accurate stance on climate change than a politician. A nutritionist is a better source on the health benefits of pomegranate juice than, say, the label on a bottle of pomegranate juice. And in both cases, if ninety-five percent of climate scientists or nutritionists hold the same position on something, their word should be weighted more heavily than a random holdout’s, especially if that random holdout is, say, in the pocket of Pom Wonderful.
When it comes to ideas we cherish and defend in the face of any challenge, we all have a parasite or two. But to escape the clutches of confirmation bias, we need to remember not to let that parasite steer.