Finding answers to our questions has never been easier. We no longer have to go to a library, pick up an encyclopedia, or take a course to satisfy our curiosity. The internet makes available to us the answers to almost any question we could imagine.
With so much knowledge literally at our fingertips, it should be easy to change our minds when we discover one of our opinions or beliefs is incorrect. Shouldn’t it?
Unfortunately, changing our minds can be difficult, because our minds play a series of tricks on us in order to resist change.
In the field of psychology, the brain is commonly referred to as a “cognitive miser.” This means the brain conserves mental resources by thinking, attending to detail, and solving problems in ways that require as little effort as possible. Holding an opinion requires little effort, but actually changing an opinion requires our brains to engage in difficult, sophisticated processes that are expensive in terms of mental resources.
The result of this cognitive miserliness is a strong reluctance to change our minds. We don’t naturally gravitate towards information that challenges our perspectives, makes us uncomfortable, or requires us to grow. We do naturally gravitate towards information that confirms our perspectives and allows us to stay the same—even when that information may go against the best data we currently have available.
This is an example of what is called cognitive bias. Cognitive biases are errors in thinking that affect our ability to reason and solve problems.
Renowned psychologists Amos Tversky and Daniel Kahneman published their research on cognitive bias in the early 1970s, and the concept quickly became influential in many other fields, including philosophy, education, management, and economics.
In the years since Tversky and Kahneman’s initial work, many different types of cognitive bias have been identified. Here are some of the most common, which you are almost certainly already encountering among your friends and family, and on the internet generally.
1. Actor-observer bias. This refers to our tendency to give more importance to external causes for our own bad behaviour while at the same time giving more importance to internal causes for others’ bad behaviour. For example, “I was rude because I had a difficult day at work, but that person was rude because they are a naturally rude person.”
2. Anchoring bias. This refers to our tendency to give the first piece of information we hear on a subject the most weight. For example, once we’ve heard an interesting theory on a subject, it may be more difficult for us to accept alternative theories, even if those alternatives are better supported by the evidence.
3. Confirmation bias. This refers to our tendency to favour information that supports what we already believe and discount information that contradicts it. This is the most easily recognizable form of cognitive bias, and it causes people to register only the evidence they want to accept.
4. The Dunning-Kruger effect. This is our tendency to overestimate our own knowledge and abilities, particularly in areas where we have little competence. For example, we may be reluctant to accept the findings of experts in a field, relying instead on our own research and reasoning even though we lack skill and training in that field. In other words, we’re reluctant to acknowledge our own incompetence.
5. The misinformation effect. This refers to our tendency to alter our own memories based on new information, as when our recollection of an important life event changes after we watch news coverage of it. Many people experienced this after 9/11, remembering that they had watched the first plane hit live on television when in reality footage of the first impact did not air until later.
6. Self-serving bias. This refers to our tendency to give ourselves credit when we succeed, but to blame external causes when we fail. For example, “I won the first hand of poker because I’m amazing, but I lost the rest of the game because I had bad cards.”
Falling prey to cognitive biases doesn’t just happen to so-called “unintelligent” people. Scientists are vulnerable to these biases as well, which is why they generally rely on the scientific method—a rigorous, unforgiving rock against which many pet theories and hypotheses have been broken over the last few centuries.
A good scientist will abandon a theory, however reluctantly, if it cannot meet the standards of evidence we have put in place to help root out the biases we all share. We can all be good scientists by doing the same.
Why is it so important to learn about cognitive biases? Because we now have greater access to information than ever before. The internet puts the sum of available human knowledge literally into our pockets.
Yet not all of this information is correct. Among it is misinformation, false advertising, and outright lies. Every day, bad science convinces people that the world is flat. Conspiracy theories abound. Consider the latest batch of conspiracy theories around the COVID-19 pandemic.
In the middle of all this, our biases lead us to reach for information that supports what we want to believe and discard the rest, leaving us vulnerable to being taken advantage of. Or, in the case of the anti-vaccination movement, we’re left vulnerable to very real public health risks.
Again, each and every one of us is vulnerable to cognitive biases. It’s in our nature. But that doesn’t mean we should give up trying to figure out the truth, nor does it mean we should stop trying to communicate what we have learned to others.
What it does mean is that we need to do the work to uncover our biases, set them aside, wrestle with uncomfortable facts, and (when necessary) change our opinions based on the best data available.
Most importantly, it means we ought to deal more kindly with those we find ourselves in disagreement with. We’re all working to make sense of our world with the same cognitive tools, and we’re all vulnerable to the same mental pitfalls.
That, at least, is something we all have in common.