Confirmation bias is the human tendency to seek out and remember examples which support our existing beliefs.
It’s easiest to see this with controversial issues. Your friend who doesn’t believe in evolution will point to gaps in the fossil record, but won’t apply the same rigor to the authenticity of religious texts. Another friend who believes drugs should be legalized may be overly critical of research showing damaging health effects, even if she doesn’t consider that point essential to her argument.
Confirmation bias corrupts ordinary thinking too. In a famous experiment, participants were shown a number sequence (2, 4, 6) and asked to guess the rule behind it. They could test other sequences, and gravitated toward ones that fit their current theory. If they thought the rule was “1x, 2x, 3x,” they might test (3, 6, 9). The real rule was more general: any ascending sequence would do. Yet participants failed to discover this because they were biased toward confirming their preexisting beliefs.
Can Bias Be Fixed?
Given the pervasive implications of biases, a lot of people (myself included) have put a great deal of thought into how to overcome them. That’s why I found it striking what Daniel Kahneman, the psychologist whose work on biases earned him a Nobel, had to say after a lifetime of studying these errors:
“What can be done about biases? … The short answer is that little can be achieved without a considerable investment of effort. As I know from experience, [the system of generating impressions and intuitions] is not readily educable. Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues. …
“Unfortunately, [the procedure of recognizing errors] is least likely to be applied when needed most. We would all like to have a warning bell that rings loudly whenever we are about to make a serious error, but no such bell is available, and cognitive illusions are generally much more difficult to recognize than perceptual illusions.”
He continues, arguing that the practical purpose of his book was to give people tools to point out the cognitive errors and biases of others, since he doubts we can fix our own.
I suggest that if the man who practically invented the field is skeptical about his ability to overcome bias, there’s not much hope for the rest of us.
Why Can’t Bias Be Fixed?
It may be the case that bias is unfixable because it’s integral to how our cognitive machinery works.
Daniel Willingham, whose work I previously featured here, makes a clear point about learning: we learn things based on the context of what we already know. It may even be the case that everything we learn must be built off of previous experience.
Evidence of this comes from research on reading comprehension. In one study, students were given an essay about baseball to read. Those who knew more about baseball remembered far more from the essay than others, holding reading ability constant. Prior knowledge about baseball is what made it possible to learn new information about baseball.
If this is so, it suggests the confirmation bias is also a feature, not just a bug. We form mental models about how the world works and then, whenever we encounter new information, we learn it by trying to integrate it into our existing framework. If that integration fails, then the information is simply forgotten or rejected, often unconsciously.
The problem is that baseball facts are usually uncontroversial. Our ability to fit other facts into our worldview often depends crucially on tenuous assumptions we may not even realize we have made. Worse, this entire process may be a completely inescapable by-product of learning anything.
What Can We Do?
I think recognizing that you can’t not be biased is a good first step. If forming biased beliefs is an irresistible compulsion, then the rationalist piety of simply trying to exercise more discipline quickly falls away as a solution, as it should.
Instead, like anything that is irresistibly tempting, the only solution must be to change the environment to reduce the temptation or limit its damage.
If your prior beliefs will color everything you read, and what you read over and over becomes your prior beliefs, then only reading one viewpoint is incredibly dangerous. So if you’re a liberal who never reads conservatives, or vice versa, the only thing you can be certain of is how biased your opinions are.
Cultivating plurality in your input sources won’t eliminate bias, but it prevents you from naturally exacerbating it as you learn more. If all your friends are academics or travelers or vegetarians or entrepreneurs, you’ll develop tunnel vision. A good intellectual social circle shouldn’t just include mild disagreement; it should have you talking to at least a few people who you think are completely wrong about everything.
Finally, you can strive to make your beliefs more explicit. Articulated beliefs aren’t immune to bias, as the number-sequence case illustrates. But an articulated belief can be debated, considered and possibly rejected. It’s the unarticulated assumptions about the world that ensnare us.
Bias may be inescapable. But you can maintain an intellectual environment which jostles you in enough random directions that getting stuck on the wrong track becomes less likely.