Metacognition Part 3: Your Ideological Immune System
It's important to consider that some of our more automatic thought processes, which we explored in part 2 of this module, are set up to preserve our preexisting and/or desired views. For example, we often look for information that supports our beliefs and broader ideologies and ignore or reject information that seems to conflict with what we believe to be true.
Scott Lilienfeld and his colleagues (2009) outline several ways the human mind is set up to preserve preexisting beliefs and ideological structures. They, along with Jay Snelson, group these mental processes together under the label ideological immune system. In short, our minds are equipped to fend off or shield us from information suggesting we’re wrong. By analogy, opposing ideas are like infectious organisms out in the world, and the mind takes protective measures against them, much as the body's immune system fights disease-causing agents.
For instance, if you’re someone who believes that childhood Measles, Mumps, Rubella (MMR) vaccines cause autism, your mind is equipped with several ways of shielding that belief from information that could undermine it (note: MMR vaccines do not cause autism; see, for example, Hviid, Hansen, Frisch, & Melbye, 2019).
These defences may offer some comfort—particularly if you're averse to uncertainty—but when they're functioning to preserve false beliefs, they present a significant barrier to getting closer to an accurate understanding of the world.
If you're hoping to get closer to the truth about a particular issue—like vaccine safety—understanding, identifying, and working to surmount these mental processes is an important step toward that goal. Below, you'll encounter some key features of our thinking to reflect on as you consume information and discuss important issues with the people around you.
Confirmation bias
When we're seeking to discover what's real about the world around us, understanding the role of confirmation bias in our thinking is essential. Confirmation bias is “the tendency to seek out evidence consistent with one’s views, and to ignore, dismiss, or selectively reinterpret evidence that contradicts them” (Lilienfeld et al., 2009, p. 391). Confirmation bias is often at play when you’re skimming headlines and you see one suggesting that, upon clicking the link, you’ll find content that conflicts with one of your core or strongly held beliefs. For instance, a person who believes MMR vaccines cause autism might come across an article stating that there's no scientific evidence of a link to autism (like the one linked here). Confirmation bias is active when the person ignores the information in the article simply because it conflicts with their belief. If the person does happen to click through and read further, confirmation bias may be active when they dismiss what they read as nonsense or twist it to be less in conflict with their belief.
Similarly, if you're a person who believes MMR vaccines are safe and have no relationship with autism at all, you might at some point come across an article that challenges that view. Confirmation bias may impact our thinking and behaviour regardless of the veracity of the belief. It might sound, then, like confirmation bias can be a good thing, allowing us to hold onto our true beliefs or beliefs we cherish. However, it's important to note that it's not applied selectively to preserve true beliefs or beliefs we hold with good reason. It should concern us that we have a tendency to hold onto views solely because they're ours. There are times when doing so can prevent important learning or might even be putting us in harm's way.
Confirmation bias can have quite subtle effects that are not easy to identify through casual reflection on how we process information. For instance, you might decide to go on reading the article that conflicts with your belief, but become more adversarial or nit-picky than usual with its arguments and evidence. Although this careful analysis could be part of good critical thinking, if strong critique is practiced only, or to a greater extent, on arguments and evidence with which you disagree, there's bias at play, and that selective critique can be a problem. A specific instance of apparently effective critical thinking may be embedded in a larger pattern of bias (i.e., a tendency to approach things you find disagreeable more critically than things with which you agree), rendering one's information processing relatively ineffective overall. We want to strive toward effective critical thinking regardless of whether we agree or disagree with what we read, watch, and hear.
Importantly, none of this is deliberate and most often we're quite unconscious of the process—we're not thinking, "I better read this hyper-critically because it could be damaging to my beliefs about vaccines." Rather, we may tend to go on believing that we're diligent truth seekers. It's valuable to question that assumption we have about ourselves.
Naïve realism
Naïve realism involves the view that the world truly is as we perceive it to be (Ross & Ward, 1996). This idea that “seeing is believing” can distort our thinking in several ways as we consume media.
For instance, if you believe you tend to see the world as it truly is, you may be prone to seeing people who disseminate views that are contrary to your own as ignorant, irrational, or deceptive. That is, if you believe that you see the world accurately, you might think that those with contrary views must have things totally wrong (and are perhaps unintelligent) or are willfully misleading their audience (and are, therefore, perhaps evil!). As a result, you might dismiss any arguments for their position without further consideration, or dismiss anything they might say on other topics.
Believing that our raw perceptions reflect the true nature of the world can also lead us to view our personal experiences as evidence that is of greater value than the evidence that has accumulated in the scientific literature. For instance, naïve realism could be guiding your thinking if you argue that “the world just seems flat to me so it probably is flat—science is wrong!” or “it just doesn’t feel right to me that humans evolved from a common ancestor with chimps, so it can't be true!” This tendency can lead us to dismiss good evidence and sound arguments that do not align with our own perceptions, beliefs, attitudes, and past experiences.
Of course, sometimes you'll encounter such a wacky perspective that perhaps it should be dismissed without further consideration. But have you given much thought to when such dismissal is sensible and when it might be unwarranted? When might the world not be as we see it? An easy example anyone can get on board with concerns optical illusions. With optical illusions, we can all agree that what we see in front of us does not represent reality.
But if what we see directly in front of us with our own eyes doesn't always reflect reality, then what about more abstract opinions about the world? If we sometimes must question whether what we see in front of us is real, we certainly must question our more abstract beliefs. Think politics, for example—to what extent do you trust that your views on a political or social issue accurately reflect something true? Where is that trust in your perspective coming from? Could it be that others hold reasonable views that simply look different from yours?
Hot Cognition
It's impossible to disconnect emotions and motivation from our thinking, no matter how cool and calculated we think we're being. And you wouldn't want to! Without the capacity for emotion, much of our thinking, like decision-making and moral judgments, just wouldn't work (see Descartes' Error by Antonio Damasio). Sometimes, though, our emotions and motives can lead us astray. When our thought is heavily influenced by feelings and desires, we can call this hot cognition. Imagine, for instance, that your favourite NBA team loses an important playoff game. If you're a big fan, emotions are running high, and it can make you jump to unwarranted conclusions about the causes of the loss (e.g., the refs always have it out for us!).
Similarly, many of us identify with, and feel strong emotional attachments to, political teams. We can be motivated to evaluate political arguments, debates, and election results with bias toward parties and candidates we're attached to or invested in. In the realm of sports fandom it's often all in fun, but elsewhere—like in politics or when we're drawing conclusions about whether to vaccinate our children—judgments and decisions based heavily or solely on feelings like anger and fear, or even admiration, can get us into serious trouble.
In the domain of digital media, it’s a good idea to consider whether you’re experiencing a particular emotional state as you consume content (which, again, isn't inherently bad—it's just important to reflect on what that emotion could be doing). Could your feelings be altering your capacity for critical thinking in one way or another? How do you feel, for instance, when you’re reading a political rant on social media, whether it supports or runs counter to your views? Is your emotion making it harder to objectively assess the arguments that are set forth? For example, did you feel moral outrage about a word the author used that you find offensive? Do you feel a little joy when the author has expressed a view with which you agree?
Next, you might consider whether your emotional state might be motivating you to act in a way you might later regret. For example, do you feel compelled to express indignation upon encountering apparently false information some stranger has shared online? What are some possible consequences—both good and bad—of acting on such feelings?
Consider the following too:
- How do you feel when you are faced with arguments counter to your core beliefs or ideas that contradict your own? Do you feel tense or angry? On the other hand, how do you feel when you are faced with beliefs or ideas that support your own? Do you feel good? Does it seem like this could be helping or hindering your capacity to think critically? What can you do to work around that issue?
- How do you feel about those authors or speakers whose politics or affiliations you find disagreeable? Does it seem like your feelings about the person and their affiliations could be undermining your capacity to accurately assess things that they say? (see People & Context module for more)
Bias Blind Spot
As you scratch the surface of some biases and other mental processes that serve as barriers to critical thinking, there's good reason to question whether you're getting the point. You may have found yourself thinking that, although other people think and act according to biased mental processes, you're personally relatively free from bias. How wonderful if that were true! Unfortunately, your personal insight into your own mental life—your metacognitive capacity—is also fallible. You harbour these biasing tendencies whether or not you believe it to be the case.
Here, a new concept could help. It's called the bias blind spot—for various reasons, while a relatively informed person can sometimes do a reasonable job of identifying biased tendencies in others, seeing biases in ourselves can be very hard (this is like a bias about biases—take a minute to wrap your head around that!). This can be a huge barrier to effective reflection on one's thinking, even after attaining extensive understanding of human cognition. That is, people can perfectly well understand some of the fallible mental processes we've explored above while utterly failing to see their relevance to their own reasoning.
Social Pressures
As social beings, our minds obviously don't function independently of what's going on around us. Thus, it's not just our own ideas that we're liable to protect—there are beliefs and systems of belief floating around in the social world that we may wish to preserve. Perhaps you want to align yourself with what seems to be a common belief among your peers or those you respect (or fear!) on social media. Perhaps you feel pressure from authority figures, like higher-ups at work, or the norms you perceive in a social group, like your classmates. We can even end up working to sustain particular viewpoints found in our social world when we don't personally endorse them at all.
Are there pressures in your social world guiding what you read, watch, and hear online or your interpretation of the info you consume? In general, are the people, groups, and organizations with which you affiliate supportive of a particular ideology (e.g., activist friends or groups)? How do you feel about potentially discovering and sharing evidence that might rub your affiliates or group-mates the wrong way? Does it feel bad to hold beliefs that run up against the views of your affiliations? It might sometimes even feel wrong to privately debate those ideas. Does it feel like you hold a view that you might have to keep hidden? These are signs that social forces are affecting how you explore and think about information, and ultimately what you do with it.
Here are a few sources of social pressure to consider:
- Authority figures (e.g., educators, professional or community organizations, certain family members and peers with higher social status) who are invested in a particular viewpoint. For example, a professor in one of your courses may have a political ideology that you, as a student, feel you must adopt, or at least not contradict in class or assignments, for the sake of your grade or approval (hopefully this isn't the case, but it's not uncommon!).
- Group norms for a particular belief or way of thinking. What groups are you a part of—activist, family, academic, etc.—and what are their ideological norms? For instance, do you fear reprisal from your social group or an external activist group if you express an opinion that contradicts the normative or expected view within the group?
- Inconsistency. We don’t often enjoy feeling like, or being perceived as though, we are being inconsistent with ourselves. Are other people aware of your previous commitment to an ideology or belief? Are you worried that someone will find out you've changed your mind? Would it feel bad if they knew you were exploring ideas or evidence that challenge your prior beliefs, or that you are in the process of potentially changing your mind?
It's important to note that the above are a small sampling of the social and cognitive forces that can interfere with our capacity to reason effectively. It's a foundation to get you started, but in no way a complete portrait of what we have to cope with and surmount to get closer to good critical thinking about the information we're processing.
If you find the above interesting or useful, I suggest checking out a vast array of cognitive biases at the Decision Lab.
Key Terms & Ideas
Bias blind spot: We tend to find it harder to identify biased judgments in ourselves than in other people.
Confirmation bias: “The tendency to seek out evidence consistent with one’s views, and to ignore, dismiss, or selectively reinterpret evidence that contradicts them” (Lilienfeld et al., 2009, p. 391).
Hot Cognition: Thought that is heavily influenced by feelings and desires.
Ideological immune system: In many ways our minds are set up to preserve our pre-existing beliefs and ideological structures. The concept of the ideological immune system is a practical label for those mental processes that protect our beliefs from information that could undermine them.
Naïve realism: The view that the world is truly the way we perceive or believe it to be.
Social Pressures: Various aspects of the social world may guide what you read, watch, and hear online, or your interpretation of the info you consume. Think about social norms, authority figures, etc. Think about what might happen if you express competing views. Are you compelled to keep quiet or express the opposite of what you feel? Do you feel compelled by those around you to follow particular social media influencers or read certain material? When might this undermine competent reasoning?
Applying It
Briefly write down your beliefs about whether there is a link between childhood vaccines and autism. You might also consider the relative strength of your beliefs—does it feel like there's some uncertainty there, or does it feel firm and unshakeable?
Read the following very short summary of a recent research project investigating the link between MMR vaccines and autism: Measles, Mumps, Rubella Vaccination and Autism. What's the relationship between the findings of this study and your belief? That is, does it align with or counter what you believed before you read it?
What kinds of things were going on mentally as you read? For instance, did reading the article make you feel vaguely positive or negative? Did you find yourself reading closely or quickening your pace as you read? Were you coming up with counterarguments or alternative explanations?
Now imagine that the article had reported the opposite findings. How would this change how you read it, what you thought, and how you felt? If possible, bring in confirmation bias to help you explain.
Further Reading
Social Cognition and Attitudes | Yanine D. Hess and Cynthia L. Pickett | Noba
This chapter from Noba will help you expand your thinking about cognitive processes like biases and mental shortcuts beyond just those that preserve what we believe or others in our social world believe to be true. There's a lot to explore here to expand your thinking about your own mind.
“Reality” is constructed by your brain. Here’s what that means, and why it matters | Brian Resnick | Vox
This is a very cool article that may help you to understand the importance of naïve realism via a look at optical illusions. If what we see directly in front of us with our own eyes doesn't always reflect reality, then what about the more abstract opinions about the world that vary from person to person? If we sometimes must question whether what we see in front of us is real, we certainly must question our more abstract beliefs.
Learning Check
