This is a series in which my son Ben learns about the components of critical thinking, starting with the important topic of bias. Because biases can short-circuit our reasoning, learning to recognize them in ourselves and others, and to control for them, is one of the most important steps in becoming a critical thinker. The material Ben studied to learn about biases is linked at the end of his article.
Ben on Bias
How much context shapes what we do may seem obvious, yet when our existing mental slants are pointed out, we often find ourselves shocked by how much outside sources control our viewpoints. As I learned about biases, what I discovered, among other things, was how long biases can go unnoticed until those rare instances when someone points them out.
In my reading, I learned that the ways most people normally form beliefs fall into three categories. With a priori thinking, you view the world through preconceived notions, using information you already have to decide whether you agree or disagree with something. This idea is very similar to what is commonly known as confirmation bias.
Through the lens of authority, you are inclined to believe what generally respected opinion already says. With tenacity, you reject commonly held beliefs or what authority figures tell you. Each of these leads to biased thinking.
When I first read about these, I was intrigued by the contrast of the last two – authority and tenacity. It gave me the pessimistic feeling that the only way to shake one bias was by practicing another. Might it be possible to practice neither?
Learning about what leads to biased thinking reminded me of a Democratic debate the family watched in January. At the time, there were seven remaining major candidates for the presidential nomination. The debate seemed like a shocking anomaly, a rare moment when almost every candidate spoke eloquently, convincingly, and passionately. Even the candidates I had no interest in struck me as impassioned and effective. But the next day, I was surprised that no one agreed with me about how intelligent and well-matched a debate it was. The political articles and forums responding to the debate didn't seem to share my reaction at all!
I realized that this quality of the discussion during the debate was apparent to me not because I had some great insight, but because the person I was watching the debate with pointed out how well he thought the candidates did. People who didn’t have the luxury of having this pointed out to them came to a different conclusion based on their existing (a priori) beliefs about the candidates.
But was I really being unbiased? Or was my thinking impacted by authority based on opinions of someone I respected? Did I really see a great debate, or did I just substitute one source of bias for another?
I also learned that biases may come from how our brain’s “fast process” works. The Israeli psychologist Daniel Kahneman explains the fast process as a function by which our brains fill in blanks while quickly taking in and processing information to help us better understand it. In contrast, we also have a slow process involved with more insightful and in-depth thinking. By learning about cognitive illusions – easy ways to trick the brain – I realized that the faster we’re thinking, the more mistakes we make.
For instance, when I scrolled through the cheap clickbait articles at the bottom of web pages, I saw a host of advertising techniques designed to appeal to our fast thinking. For example, there was a thumbnail with a picture of Jason Bateman (a shot from his Netflix hit drama, Ozark) with the caption “All the Shows Netflix Is Cancelling (And What’s Getting Renewed).” The title was made intentionally long so that our fast-thinking brains would only read the first part, which talks about shows getting cancelled. Even if you read the whole title, you’re inclined to believe that the image of Bateman is meant to connect to the main title about what’s being cancelled, and not to the parenthetical part (what’s getting renewed).
When I clicked through to the story, I discovered that not only is Ozark not being cancelled, it’s not even in the slide deck for the piece! No verdict on season four of Ozark has even been announced yet. So the publisher of that article was taking advantage of how our brains fill in the gaps, in this case by assuming we would link the thumbnail image to the headline. While they never said outright that Ozark was ending, they let the fast-thinking part of our brains make that connection so we’d do what they wanted us to do (click on their link).
When you’re reading a book, your brain takes in the information fast because you know the language; you’re able to read without great effort. In contrast, when you do double-digit math in your head or deal with multi-faceted reasoning, your slow process is required. So the more we use our slow processes, the more we might be able to shake our biases.
Another way to control for bias I learned about is the Principle of Charity. When you’re in an argument, a natural but dangerous instinct is to search for weak points, or to assume the most illogical version of someone else’s argument. That version, created through “strawman-ing,” becomes much easier to argue against but also strips the discussion of much of its substance.
When we opt to use the Principle of Charity by arguing with the strongest possible version of our opponent’s argument, we can set aside our inherent reservations about our opponent and have a valuable discussion. As we embrace our slow process, it becomes easier to consider an argument this way.
Knowing where my own biases come from helps me understand that they can be resisted. And when we refuse to let them fully define how we think, we’re in a better position to share ideas.
If you want to learn along with Ben, here is what he read in preparation for writing this piece: