
Metacognition 2: Becoming Mindful of Mental Processes

If the goal is for us to get better at processing what we read, watch, and hear, it's necessary to understand and address aspects of our thinking that can get in the way. However, we shouldn't be overly optimistic about how easy this is—psychologists have found that mere awareness of fallible mental processes, like cognitive shortcuts and biases, may do very little to help us deal with them. For instance, simply being aware that feeling morally outraged can cause us to act hastily may do little to help us manage our online behaviour when we're actually feeling outraged. Likewise, simply being aware that humans are prone to several cognitive biases doesn't necessarily help us as individuals to surmount instances of bias in our own thinking.

 

To make the awareness of hidden mental processes practically useful, we'll need to go a few steps further. We don't just want to know about systematic mental errors and stop there. Rather, we want to develop the capacity to identify our mental mistakes as they occur and take advantage of mental tools found to be effective in regulating our thinking. That's harder. 

The next two sections of the Metacognition module will get us started on the features and bugs of human mental life. In later sections, we'll take the necessary next step of looking at some tools for reflecting on and addressing these mental processes as we take in the world around us.

Limiting Our Focus

To get started, consider that researchers have developed a massive—and sometimes messy—literature on mental shortcuts, biases, and fallacious modes of reasoning that affect our thinking. Most of us, of course, don’t have time to wade through mountains of psychological concepts—this section will get you started on the right track without overburdening your mind. For those who wish to read further on these topics, some great entry points for thinking about human rationality and irrationality include What Intelligence Tests Miss: The Psychology of Rational Thought by Keith Stanovich and Thinking Fast and Slow by Daniel Kahneman.


A small subset of ideas can get us started with achieving our aims. We can begin with the basic idea that most of our thinking is inaccessible to us—it's fast, automatic, and often takes place outside of conscious awareness. While we can't live without it, our automatic and unconscious thought often oversimplifies things, hastily jumps to conclusions, and exhibits biased information processing. As we move forward, it will help to be aware of the things happening back there in the black box of our minds. The section below introduces a simple idea to help us frame automatic and controlled thought.

Automatic and Controlled Thought

Psychologists often make use of the idea of two general types of thought, which we'll refer to as Type 1 thinking (fast, automatic, relatively low-effort, & often nonconscious) and Type 2 thinking (slow, controlled, relatively high-effort, & accessible to consciousness). As Daniel Kahneman (2011) argues, it's useful as shorthand to have a mental representation of these two "types" of thought, although this dichotomy is not a perfect representation of the mind's complexity (see Stanovich, 2009, for a slightly more complex but compatible view and Melnikoff & Bargh, 2018, for a critique).

 

In short, many mental processes are automatic and occur outside of our awareness (Type 1 thinking). By contrast, we're at least partly consciously aware of, and feel like we can control, other mental processes (Type 2 thinking). The following are typically associated with Type 1 thought:

  • Intelligible formation of sentences as you converse with a friend.

  • Arrival at intuitive answers to simple math questions (e.g., 2+2).

  • Feelings of shame and the motivation to escape the moment as you convey an embarrassing story to a friend. 

 

The following are typical of Type 2 thought:

 

  • The process of solving a math problem of moderate difficulty (e.g., 13 x 24).

  • Learning how to drive or play a new song on the piano.

  • Identifying and applying tools to overcome cognitive biases or an undesired emotional state.

 

Importantly, automatic processes (Type 1) continue to operate outside of awareness alongside effortful and deliberate processing (Type 2). Our more controlled and deliberate decisions, like choosing which sort of phone to buy, are always informed by underlying Type 1 processes. For instance, even though you feel like you're making a rational decision, you may harbour an unrecognized bias in favour of a particular company because you think the CEO is smart and you like his sweaters. That is, no matter how well-reasoned and deliberate you may believe your decisions to be, there are generally things going on behind the facade of conscious deliberation (Type 1 thought) that sway your choices. As we'll see, it's very hard—very often impossible—for individuals to uncover these hidden mental processes.

Think about this example, adapted from Daniel Kahneman's Thinking Fast and Slow, which illustrates the various automatic & often unconscious processes that operate while we do other things with our controlled and conscious thought. As an experienced driver, do you ever have to put effort into determining what a stop sign means when you see one? Do you have to effortfully work through which pedal stops the car? Do you always consciously process when you should begin to slow down before coming to a stop? The answer to these questions is, of course, no (and if you answered yes, please don't drive in my neighbourhood!). Once learned, these are seamless processes requiring little deliberate thought—most of us can simultaneously have a meaningful conversation or follow a complex argument on a podcast while driving.

 

When you're scrolling your social media feed, glancing at several headlines, or perusing videos on YouTube, your unconscious mental life is operating in a similarly effortless way. Generally speaking, we are unaware of what's going on back there in Type 1 thought. For instance, we often move swiftly over and accept arguments that confirm our pre-existing beliefs while dismissing or spending more time criticizing arguments that run counter to our personal views (see part 3 of this module for more on this idea, which is referred to as confirmation bias). We aren't usually doing this intentionally—that'd be odd.

 

Although we're generally unaware of how our intuitions affect the way we process information, with some learning and effortful attention, we can sometimes take an educated guess about when emotional states and cognitive biases might be causing us to think and act in particular ways. We'll work on these ideas in parts 3-5.

Now, think about a hypothetical time on the road when you would need to switch into more deliberate and effortful (Type 2) thought. You're driving through a storm, at night, through twisting country roads—oh, and you're also in Ireland (i.e., assuming you normally drive on the right side of the road, you're now on the left). Further, your best friend and travel buddy is sitting beside you singing along to Taylor Swift songs. This situation requires way more conscious attention than your casual drive to work. For example, you still know what a stop sign looks like, but now they're not where you expect them to be because they're on the opposite side of the road. You need to make an effort and override the habitual response of where you tend to look and direct your eyes elsewhere. 

 

This extra effort requires deliberate, controlled attention. But because your conscious effort is a limited resource, you can probably tell that if you were in this situation, you'd likely need your friend to stop singing and turn down her music so you can focus. This is very different from the casual drive to work, in which a little more multitasking is possible.

Also noteworthy is that you would likely intuitively move into a more conscious and deliberate mode of thought. That's not the case for all of our thinking. Often when we're digesting political or scientific news, for example, we may be on a sort of autopilot (Type 1) and might need to shift to a more deliberate mode of thought (Type 2) to fully digest or effectively critique the material.

 

Effortful & Deliberate Thought is Needed for Critical Thinking

The drive to work versus drive in Ireland example is analogous to two different ways of reading information you find online. A leisurely, relatively carefree read is often easy—you might even watch a movie or chat with a friend all while skimming the article. By contrast, a read in which you want to fully grasp the content requires effortful attention. The limitations of Type 2 thinking mean that you're probably not going to be maximizing your capacity to take in and critique information if you're simultaneously watching a movie or having a conversation.

To get all the information you're looking for, you'd need to pay attention to how your biases and pre-existing beliefs might be directing your attention and thinking. Overriding your gut-level, intuitive evaluation of the information coming your way generally takes conscious effort. There is an important difference from the above "driving in Ireland" example—as you read, you're not in a life-and-death situation in which poor performance can cause you to go over a cliff or hit another car. Because the stakes are lower as we're consuming content, we often cruise on autopilot (Type 1 thinking), imagining we're processing what we read just fine, when we might actually be making some big mistakes.

 

Luckily, just like you'd eventually get used to driving in Ireland, effective ways of processing information can start to become automatic too—this requires knowledge of and integration of useful mental tools, which we'll continue to explore as we move through this module. 

To summarize, most of our day-to-day media consumption is treated like recognizing and responding to a stop sign on the way to work—our intuitions and assumptions based on past experiences play a huge role (Type 1 thinking). We don't usually consider that effectively processing new information is generally a high-effort task (Type 2 thinking). That is, recognizing and overriding biases to effectively evaluate truth claims, arguments, and explanations is like successfully traveling through a storm, at night, through the weaving roads of the Irish countryside with your best friend nostalgically revisiting old songs her dad used to sing along to in the car. It's novel and difficult, so it requires focused, effortful attention.

What's great, though, is that critical thinking can be like driving in Ireland in another way. After a while, you get used to it—some elements of good driving and good critical thinking become habitual or more automatic. It's all about practice.

Unconscious, Automatic Mental Shortcuts

 

Everyone, every day, takes shortcuts in their thinking so that they can make judgements more quickly and efficiently. This might not be something you’ve thought much about, as these shortcuts generally occur automatically, outside of conscious awareness and without deliberate effort (see Type 1 thinking above). They’re a good place to start as we begin thinking about some of the things our brains are doing without our conscious intervention.

 

Imagine, for example, that you need a hot sauce for a recipe, and you don’t have any in the fridge. You head to the appropriate aisle at the grocery store only to find that there are dozens of hot sauces from which to choose.


One approach you could theoretically take is to try to find the sauce you prefer most, but that would take ages and quite possibly get you into trouble (you probably shouldn’t be trying all these sauces in the store!). Another possibility, if you’re interested in others’ opinions on hot sauces, is that you could go online to look at reviews, but looking through information other people have shared about all of these sauces would also take ages and, anyway, you’re really not interested in spending all afternoon looking at reviews.

 

What, then, can you do? You’re left, I think, with the option of taking some shortcuts to help you make a relatively quick decision and move on with your life. Instead of seeking to answer a relatively hard question—like which is best—you can substitute an easier question and answer that one. There are lots of things we can ask, for example:

  • Which bottle is the coolest or most eye-catching?

  • Which is most expensive?

  • Which looks hottest?

 

Of course, it's rare that we would ever deliberately run through the various strategies for making a decision to sort out an optimal or efficient choice. Instead, we're usually automatically and unconsciously using cognitive shortcuts to make judgements quickly and efficiently. Cognitive scientists refer to these shortcuts as heuristics.
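As a loose illustration of this kind of question substitution (the sauce names and scores below are invented for the example), a heuristic swaps an expensive evaluation for a cheap, immediately visible proxy:

```python
# Hypothetical data: names and scores are made up purely for illustration.
sauces = [
    {"name": "Inferno Gold", "eye_catching": 9, "avg_review": 3.9},
    {"name": "Mild Mundane", "eye_catching": 2, "avg_review": 4.1},
    {"name": "Cliff's Classic", "eye_catching": 6, "avg_review": 4.7},
]

def heuristic_pick(options):
    # Substitute the easy question ("which bottle stands out most?")
    # for the hard one ("which sauce is actually best?").
    return max(options, key=lambda s: s["eye_catching"])

def exhaustive_pick(options):
    # The slow, effortful route: compare every option on the real criterion.
    return max(options, key=lambda s: s["avg_review"])

print(heuristic_pick(sauces)["name"])   # fast, cheap answer: Inferno Gold
print(exhaustive_pick(sauces)["name"])  # deliberate answer: Cliff's Classic
```

The heuristic is cheap because the proxy attribute is visible at a glance; the deliberate route requires gathering and comparing information you may not have on hand.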

 

Imagine that you just decide to grab the most eye-catching bottle. You make the recipe and find it turned out just fine. Great! The shortcut worked. Perhaps there were a dozen or so better sauces. Who cares? The heuristic served you well in the context.

 

But mental shortcuts are not perfect—if we use them in situations where the right answer really matters, we can make disastrous mistakes.

 

An Example Shortcut: The Availability Heuristic

 

Many heuristics are not as situation specific as “find the most eye-catching bottle” or "purchase one that's relatively expensive". Rather, there are a number of heuristics that the human mind is equipped to use across a wide range of contexts. These heuristics are pervasive in human judgement. Some examples are noted and linked below:

 

Availability heuristic

Representativeness heuristic

Affect heuristic

Anchoring and adjustment

 

We can look at the availability heuristic now, as one example of a commonly used heuristic that can often be helpful but can sometimes lead us astray.

 

According to Amos Tversky and Daniel Kahneman, who first studied it in the 1970s, the availability heuristic is a mental shortcut whereby we “assess the frequency of a class or the probability of an event by the ease with which instances or occurrences can be brought to mind." We use it to efficiently estimate likelihoods ranging from the relatively innocuous, like the chances of getting stuck in traffic this morning, to the very serious, like whether contracting a particular virus will kill us.

 

Essentially, when we lack a clear answer or relevant data, we substitute the question we really want to or ought to ask (“What’s the probability of contracting measles?”) with a simpler question that harnesses ease of recall (“How easily do instances of measles come to mind?”). When instances feel easier to recall, we tend to estimate them as more likely.

 

It works relatively well much of the time—particularly when the ease with which something can be brought to mind is tethered to real-world frequency. An example relayed by Kahneman in Thinking Fast and Slow asks readers to judge which of the following lists of letters can be used to construct more words:

 

XUZONLCJM

TAPCERHOB  

 

With a quick glance at the two lists, you likely arrived at an answer. You don’t know how many words either list can form—you didn’t even have to generate a single word—but the intuitive judgment of ease in this case is apt. Similarly, if you've spent some time following the news over the past month, you should be able to take a decent guess as to which countries at war will receive the most media attention in the upcoming week. Though we must factor in the relative unpredictability of future events, ease of recall for prior headlines and stories should be fairly well-calibrated to frequency, and the availability heuristic can help us take guesses about the near future, at least under the condition that world events and media interest remain relatively constant.
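If you're curious, the intuition about the two letter lists can be checked mechanically. Below is a minimal Python sketch that tests which words each list can spell; the word sample is a tiny, hand-picked stand-in for a real dictionary:

```python
from collections import Counter

def can_form(word, letters):
    """True if every letter of `word` appears in `letters` often enough."""
    need = Counter(word.upper())
    have = Counter(letters.upper())
    return all(have[ch] >= n for ch, n in need.items())

# A tiny illustrative sample; a real check would scan a full dictionary file.
SAMPLE_WORDS = ["BOAT", "CHAP", "PORCH", "REACH", "CON", "JUMP"]

for letters in ("XUZONLCJM", "TAPCERHOB"):
    formable = [w for w in SAMPLE_WORDS if can_form(w, letters)]
    print(letters, "->", formable)
```

Even on this small sample, TAPCERHOB spells several words while XUZONLCJM spells almost none; against a real dictionary the gap should be far larger, matching the intuitive judgment of ease.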

 

What features of events make them easier to recall? Recency is a big one. When we’ve recently seen news stories about airline accidents, for example, we’ll tend to judge them as more likely than if we haven’t seen such stories in a while. Frequency is also powerful. The more we see stories about plane crashes, the more likely we’ll tend to think they are.

 

Stories and images with more vividness also increase ease of recall. Video, imagery, and stories about accidents are easier to bring to mind than statistics about them. Finally, more emotional content makes recall easier. Stories about sad, scary, or anger-inducing individual cases, for instance, are easier to bring to mind than more abstract or detached reporting of events.

 

Here's where it gets interesting. The availability heuristic can go awry when it’s poorly adapted to the environment, such as when it’s used to judge the extent to which we’re in danger from the kinds of deadly events that receive disproportionate media attention.

 

Considerable exposure to sensational stories, like those involving murder and natural disasters, can make them appear to be greater threats than, say, diabetes and heart disease, which in fact kill more people. Similarly, air travel accidents receive more attention in the news than car accidents despite the relative safety of flying. This media attention is one reason for the common fear of flying, and the mistaken view that air travel is more dangerous than driving. Indeed, in the wake of 9/11, while media and public attention to the attacks was extremely high, many people opted for the roads rather than the air, ultimately putting more lives at risk.
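To make the mismatch concrete, here's a toy simulation of how heavy media coverage can decouple ease of recall from actual frequency. All numbers are invented for illustration and are not real mortality or coverage statistics:

```python
# Hypothetical figures: relative annual frequency of each cause of death
# versus how heavily the media covers each one (invented for illustration).
true_freq = {"heart disease": 650, "diabetes": 100, "plane crash": 1, "shark attack": 0.1}
media_weight = {"heart disease": 0.5, "diabetes": 0.3, "plane crash": 1000, "shark attack": 4000}

# Stand-in for ease of recall: roughly, how many vivid stories you've seen.
recall_ease = {cause: true_freq[cause] * media_weight[cause] for cause in true_freq}

by_truth = sorted(true_freq, key=true_freq.get, reverse=True)
by_recall = sorted(recall_ease, key=recall_ease.get, reverse=True)
print("Ranked by actual frequency:", by_truth)
print("Ranked by ease of recall: ", by_recall)
```

Ranked by actual frequency, heart disease and diabetes top the list; ranked by ease of recall, the rare but vividly covered events do, which is exactly the availability-based error described above.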

 

On the flip side, when we’re using the availability heuristic to judge frequency and probability, harder-to-imagine information may be weighted less heavily, even if prominent in the media. In a 1985 experiment by Steven Sherman and colleagues, two groups of students received information about a disease that was purportedly becoming prevalent on campus.

 

One group was given a summary of the disease that included easy-to-imagine symptoms like low energy and muscle aches. The other group received a summary that included harder-to-imagine, abstract symptoms like a malfunctioning nervous system. Both groups were told to imagine a hypothetical future period during which they had contracted the illness. Two control groups were also distinguished from each other by these easy- or hard-to-imagine summaries, but they were not asked to imagine a future with the illness.

 

Finally, all groups were asked to estimate their likelihood of later contracting the illness. In accordance with the availability heuristic, relative to the participants who read about the illness but did not imagine having it, those who thought actively about easy-to-imagine symptoms reported believing they had a higher likelihood of contracting the disease. Critically, those who thought actively about the hard-to-imagine symptoms predicted that they would be less likely to contract it.

 

Thus, if it’s hard to bring certain information to mind, we tend to judge it as less probable.

 

Valuable information like abstract data doesn’t have the narrative structure or striking imagery that a story about a person’s experience with a shark attack or war does. As such, it takes more effort to bring it to mind when we need it. The information we truly ought to weigh most heavily—statistical data, scientific consensus—is harder to recall and, therefore, may be placed at a significant disadvantage as we draw inferences about dangers in our environment and other likelihoods.

 

As a final key point, consider that different online networks and communities may receive vastly different information about the world. Some of us are repeatedly exposed to frightening and memorable falsities and other misleading content, such as unfounded conspiracy theories and foolish health advice (see also the concept of epistemic bubbles).

 

Because the content of false news tends to be more novel, surprising, and fear inducing than true stories, it is also often easier to call to mind. The falsities that come to mind for one epistemically siloed group of people may be completely absent from the thoughts of others. As such, people in particular online communities may be prone to serious availability-based errors whereby they overestimate threats that do not even remotely reflect reality.

Moving Forward

Next, we'll look at some of the automatic mental tendencies that can impair information processing and critical thinking. We'll loosely chunk these into what we'll call the ideological immune system. In many ways, your mind is set up in such a way that it tends to preserve your preexisting and desired views of the world, fending off ideas that seem to conflict with those views. These tendencies are often at the root of distortions in thought, misunderstandings between people, and strife on social media.

Key Terms & Ideas

Type 1 thought: Thinking that is fast, automatic, and/or nonconscious. Absolutely critical to catching a ball, driving competently, speaking, reading, etc. It's often doing things that you are not aware of. But knowledge of some of the things going on in Type 1 thought can help us grasp when our thinking may be going astray.

Type 2 thought: Thinking that is relatively slow, deliberate, and effortful. We can often consciously report the thought process. It's needed—along with Type 1 thought—in emergencies and otherwise novel situations, learning a new skill, and solving hard problems. We often need to harness this kind of thinking to correct errors we make when we encounter new information, such as when we're listening to a podcast or reading an article.

Why is reflection on type 1 and 2 thought important to critical thinking? Much of our thinking happens quickly and is inaccessible to us (Type 1 thought). Sometimes, when we're trying to understand things, come to conclusions, or solve problems, this kind of thinking takes shortcuts that harness limited information to get close to the right answer. At other times, this type of thinking is not trying to get the right answer at all, but is working to preserve pre-existing beliefs. Being aware that these processes are at work can help you to acknowledge that, like everyone else, you are at the whim of the mind's hidden processes. Many of these hidden processes are crucial, but others bias your thinking in counterproductive ways. When it's important, we need to find ways to take greater control over our thinking (harnessing Type 2 thought).

Heuristics: cognitive shortcuts we use to make judgements quickly and efficiently. These generally operate unconsciously, but can sometimes be used consciously and intentionally.

Availability heuristic: a mental shortcut whereby we “assess the frequency of a class or the probability of an event by the ease with which instances or occurrences can be brought to mind" (Tversky & Kahneman, 1974).

Further Reading

Of 2 Minds: How Fast and Slow Thinking Shape Perception and Choice | Daniel Kahneman | Scientific American

 

This is an excerpt from Kahneman's "Thinking Fast and Slow" — it's a nice alternate introduction to Type 1 and 2 thought.


© Darcy Dupuis 2025

Contact

To provide feedback or to learn about using Fallible Fox content for personal, educational or organizational purposes, contact Darcy at dupuisdarcy@gmail.com
