Cognitive Biases Defined: 7 Examples and Resources

We are often presented with situations in life where we need to make a decision with imperfect information, and instead of weighing that information carefully, we unknowingly rely on prejudices or biases.

For example, we might:

  • Trust someone more if they’re an authority figure than if they’re not.
  • Assume someone’s gender based on their profession.
  • Make poor decisions based on the information that we’re given.

Our poor decision making is often a consequence of heuristics and biases. In general, heuristics and biases describe a set of decision-making strategies and the way that we weight certain types of information. The existing literature on cognitive biases and heuristics is extensive, but this post is a user-friendly summary.

Central to this post is how cognitive heuristics and biases influence our decision making, and how we can overcome them.

Before you read on, we thought you might like to download Positive CBT Exercises for free. These science-based exercises will provide you with detailed insight into Positive CBT and will give you the tools to apply it in your therapy or coaching.

What are Cognitive Biases?

Before defining the term ‘cognitive biases,’ be aware that there is an overlap between cognitive biases and heuristics. Sometimes it appears that these two terms are used interchangeably, as though they are synonyms; however, their relationship is nuanced.

In his book Thinking, Fast and Slow, Professor Daniel Kahneman defined heuristics as

“a simple procedure that helps find adequate, though often imperfect, answers to difficult questions” (p. 98).

Tversky and Kahneman (1974) defined the relationship between biases and heuristics as follows:

“… cognitive biases that stem from the reliance on judgmental heuristics” (p. 1130).

Gonzalez (2017) reworded their definition as:

“Heuristics are the ‘shortcuts’ that humans use to reduce task complexity in judgment and choice, and biases are the resulting gaps between normative behavior and the heuristically determined behavior” (p. 251).


Lists and Types of Biases: The Codex

According to the Cognitive Bias Codex, there are an estimated 180 cognitive biases (the list is frequently updated).

Created by John Manoogian III and Buster Benson, the Codex is a useful tool for visually representing all of the biases documented to date.

The biases are arranged in a circle, which can be divided into four quadrants. Each quadrant, listed below, is dedicated to a specific group of cognitive biases:

  1. What should we remember?
    Biases that affect our memory for people, events, and information.

  2. Too much information
    Biases that affect how we perceive certain events and people.

  3. Not enough meaning
    Biases that we use when we have too little information and need to ‘fill in’ the gaps.

  4. Need to act fast
    Biases that affect how we make decisions.

The Cognitive Bias Codex is a handy visual tool that organizes biases in a meaningful way; however, it is worth pointing out that the Codex lists both heuristics and biases together as ‘biases.’

If you decide to rely on the Cognitive Bias Codex, keep in mind the distinction between heuristics and biases mentioned above.


4 Examples of Cognitive Biases

There are numerous examples of cognitive biases, and the list keeps growing. Here are four that frequently occur.


1. Confirmation Bias

Looking for or overvaluing information that confirms our beliefs or expectations (Edgar & Edgar, 2016; Nickerson, 1998). For example, a police officer who is looking for physical signs of lying might mistakenly classify other behaviors as evidence of lying.


2. Gambler’s Fallacy

This fallacy describes our tendency to believe that a random outcome is ‘due’ simply because it hasn’t happened recently (Ayton & Fischer, 2004; Clotfelter & Cook, 1993).

For example, when betting at a roulette table, if several previous outcomes were ‘Red,’ we might mistakenly assume that the next outcome must be ‘Black’; however, these events are independent of each other (i.e., previous outcomes do not affect the probability of future ones).
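The independence claim is easy to verify with a quick simulation. The sketch below (in Python; the simplified red/black wheel with no green zero and the streak length of three are my own illustrative choices) counts how often ‘Black’ follows a run of three ‘Reds’:

```python
import random

random.seed(42)

def spin():
    """One simplified roulette spin: red or black, equally likely (no green zero)."""
    return random.choice(["red", "black"])

black_after_streak = 0   # times 'black' followed a streak of three reds
trials_after_streak = 0  # times we observed a spin right after such a streak
streak = 0               # current run of consecutive reds

for _ in range(1_000_000):
    outcome = spin()
    if streak >= 3:  # we just saw at least three reds in a row
        trials_after_streak += 1
        if outcome == "black":
            black_after_streak += 1
    streak = streak + 1 if outcome == "red" else 0

print(black_after_streak / trials_after_streak)  # ≈ 0.5
```

The proportion comes out at roughly 0.5: a streak of reds does nothing to make black more likely on the next spin.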


3. Gender Bias

Our tendency to assign specific behavior and characteristics to one gender without supporting evidence (Garb, 1997).

For example, complaints of pain are taken more seriously when made by male rather than female patients (Gawande, 2014); women are perceived as better caregivers than men (Anthony, 2004); specific clinical syndromes are more readily diagnosed in women than in men (Garb, 1997); and students often rate female lecturers lower than male lecturers (MacNell, Driscoll, & Hunt, 2015; Mitchell & Martin, 2018).


4. Group Attribution Error

This error describes our tendency to overgeneralize how a group of people will behave based on an interaction with only one person from that group (Pettigrew, 1979).

For example, a negative experience with someone from a different group (e.g., a different culture, gender, religion, political party, etc.) might make us say that all members of that group share the same negative characteristics. Group attribution error forms part of the explanation for prejudice in social psychology.


Examples in Business and Everyday Life

Gender bias in the workplace is a well-documented and well-researched area of cognitive bias. Women often do not occupy top senior positions: for example, women held only 15.2% of the top positions at US Fortune 500 companies in 2010 (Soares, 2010). Even when women are employed, they tend to earn less than their male counterparts, and women’s salaries differ according to their marital status.

For example, consider these statistics reported by Güngör and Biernat (2009):

“[In 2005] … 68.1% of married and 79.8% of single mothers in the U.S. participate in the workforce, but while non-mothers earn 90 cents to a man’s dollar, mothers earn 73 cents, and single mothers earn about 60 cents” (p. 232).

The social desirability bias is a concern for anyone who uses self-report data. Companies that run internal surveys investigating topics that may cast an employee in a poor light must be aware of how the social desirability bias will affect the validity of their data.

Knowing that people adjust their answers to appear more socially desirable, investigators (such as researchers and clinicians) can try to reframe their questions to be less direct, use formal tests, or anonymize responses.

Another sphere of our lives where biases can have devastating effects is personal finance. According to Hershey, Jacobs-Lawson, and Austin (2012), there are at least 40 cognitive biases that negatively affect our ability to make sound financial decisions, thus hindering our ability to plan properly for retirement. Some of these biases include:

  • Halo Effect (just because that real estate agent was nice doesn’t mean it’s a good deal).
  • Optimistic Overconfidence (“I’ll be fine in the future, so I don’t need to save that much now”).
  • Confirmation Bias (looking for information to confirm or validate unwise financial decisions).


Role of Biases in Decision Making

The Monty Hall Problem

To illustrate how biases affect our decision making, consider the classic game show puzzle known as the Monty Hall problem.

Assume that there are three doors;

  • behind one door is a fantastic prize – a car;
  • behind the other two doors are mediocre prizes – $1,000.

You choose Door 1. The presenter, who knows what is behind each door, opens one of the other two doors to reveal a mediocre prize. You are then given a choice: stick with Door 1, or switch to the remaining unopened door. What should you do now: Should you stay with your original door, or should you switch?

The correct answer is that you should switch to the remaining door. The reasons why are as follows:

  1. when you made your initial choice, each door was equally likely to hide the car, so Door 1 wins with a probability of only ⅓,
  2. but the presenter’s action gives you new information, because he will never open the door hiding the car,
  3. and based on this information, the remaining unopened door now wins with a probability of ⅔ (or 0.66), so switching doubles your chances of finding the car.

Most people, however, decide to stick with their original door.
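If the ⅔ figure feels counterintuitive, a quick Monte Carlo simulation can confirm it. This is a minimal sketch (in Python; the `play` helper and the numeric door labels are my own illustrative choices, not part of the original problem statement):

```python
import random

random.seed(0)

def play(switch: bool) -> bool:
    """Play one round of the Monty Hall game; return True if the player wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that is neither the player's pick nor the car.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

n = 100_000
wins_stay = sum(play(switch=False) for _ in range(n)) / n
wins_switch = sum(play(switch=True) for _ in range(n)) / n
print(f"stay: {wins_stay:.3f}, switch: {wins_switch:.3f}")  # ≈ 0.333 vs 0.667
```

Over many simulated games, staying wins about one third of the time and switching about two thirds, matching the analysis above.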


Other Cognitive Biases

The Monty Hall problem is an excellent example of how our intuitions and heuristics lead us to make poor decisions. However, there are lots of other cognitive biases and heuristics that also affect our decision making.

Kahneman, Slovic, and Tversky (1982) list 13 biases that arise from the following three heuristics:

  • Representativeness
    We judge the likelihood of an event by how closely it resembles a typical case or a familiar stereotype:

    • A cognitive bias that may result from this heuristic is base-rate neglect: we ignore how often events actually occur when making decisions. For example, I am afraid of flying, even though I am far more likely to be in a car crash than in a plane crash. Despite this, I still hate flying but am indifferent to hopping into my car.

  • Availability
    We tend to overestimate how likely something is to happen based on how easily we can remember it happening before.

    • For example, after a violent crime occurs in a neighborhood, residents will estimate the frequency of such crimes to be much higher than the reported statistics suggest. The reason for this overestimate is that the ‘memory’ of the violent crime is easy to retrieve, which makes it seem like violent crime happens more frequently than it actually does.

  • Adjustment and Anchoring
    Our decisions, and the changes to our decisions, are influenced by the first bit of information that we’re given.

    • For example, assume that I offer to sell you a car and I ask for $250. You counter with $200. You might think that this is a good deal because you bought the car for less than the asking price; however, your counter-offer was heavily influenced by my asking price, and you’re not likely to deviate too much from my asking price.
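The base-rate point under Representativeness can be made concrete with Bayes’ rule. The numbers below are purely hypothetical, chosen only to show how a strong cue can still leave a rare event unlikely:

```python
# A minimal sketch of why base rates matter, using Bayes' rule.
# All probabilities here are hypothetical, chosen only to illustrate the arithmetic.

base_rate = 0.001            # P(event): the event is rare
p_cue_given_event = 0.95     # P(cue | event): the cue is very sensitive
p_cue_given_no_event = 0.05  # P(cue | no event): but it also fires on non-events

# Total probability of seeing the cue, then Bayes' rule for P(event | cue).
p_cue = (p_cue_given_event * base_rate
         + p_cue_given_no_event * (1 - base_rate))
p_event_given_cue = p_cue_given_event * base_rate / p_cue

print(round(p_event_given_cue, 3))  # ≈ 0.019
```

Even with a cue that fires 95% of the time when the event occurs, the event remains unlikely (under 2%) after seeing the cue, because the far more numerous non-events generate most of the alarms. Ignoring the base rate means ignoring exactly this effect.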


2 Popular Experiments

1. Anchoring and Adjustment

Tversky and Kahneman (1974) found that our estimates are heavily influenced by the first number given to us. For example, participants were asked to estimate the percentage of African countries in the United Nations.

Before giving their answer, each participant had to spin a ‘Wheel of Fortune,’ which determined their initial starting percentage. The result of the wheel was random and meaningless. Despite this, participants’ estimates were pulled toward the starting value: those who spun a high number gave much higher estimates than those who spun a low number.


2. The Attractiveness Halo Effect

Male students were asked to rate a pack of essays written by female authors (Landy & Sigall, 1974). The quality of the essays varied: some were poorly written, and others were well written.

Additionally, some of the essays were accompanied by a photograph of the author (who was either attractive or unattractive), and others were submitted without a photograph. The male students rated the quality of the essay and the talent of the author higher when the essay was accompanied by a photograph of an attractive author, and this effect was most pronounced when the essay was of poor quality.

In other words, the male students demonstrated a halo effect driven by the perceived attractiveness of the female author: they gave better ratings based on the author’s appearance rather than the true quality of the paper.


4 Ways to Overcome Your Biases

Here are four methods that you can use to overcome your own biases.


1. Reflect on Past Decisions

If you’ve been in a similar situation before, then you can reflect on the outcomes of those previous decisions to learn how to overcome your biases.

An example of this is budgeting: We tend to underestimate how much money we need to budget for certain areas of our life. However, you can learn how much money to budget by tracking your expenditure for the last few months. Using this information, you can better predict how much money you need for different financial categories.


2. Include External Viewpoints

There is some evidence that we make better decisions and negotiations when we consult with other people who are objective, such as mediators and facilitators (Caputo, 2016).

Therefore, before making a decision, talk to other people so that you have an opportunity to consider different viewpoints and have your own views challenged. Importantly, other people may spot cognitive biases that you have missed.


3. Challenge Your Viewpoints

When making a decision, try to see the weaknesses in your thinking regardless of how small, unlikely, or inconsequential these weaknesses might seem. You can be more confident in your decision if it withstands serious, critical scrutiny.


4. Do Not Make Decisions Under Pressure

A final way to protect yourself from relying on your cognitive biases is to avoid making decisions under time pressure. Although it might not feel like it, there are very few instances where you need to make a decision immediately. Therefore, when faced with a decision that can have substantial consequences, rather than deciding on the spot:

  1. take the necessary time to reflect,
  2. list the pros and cons in a notebook,
  3. talk to friends or family members for advice (but remember that they may have their own biases) and
  4. try to poke holes in your reasoning.


Bias Modification Exercises and Activities

In the last decade, research has looked at cognitive bias modification (CBM) since cognitive biases are associated with the severity of anxiety and depression. The relationship between cognitive biases and anxiety and depression is assumed to be causal: that is, cognitive biases cause an increase in the severity of symptoms.

CBM exercises are designed with this causal relationship in mind: If the cognitive bias is removed or reduced, then the severity of the symptoms should also lessen.

There are two categories of CBM exercises:

  1. Changing attentional bias: In this type of exercise, participants are trained to pay more attention to positive stimuli instead of negative stimuli.

  2. Changing interpretation bias: Participants are primed with positive information before completing an emotionally ambiguous task.

To date, there are at least six meta-analyses, and they report conflicting findings (Beard, Sawyer, & Hofmann, 2012; Cristea, Kok, & Cuijpers, 2015; Hakamata et al., 2010; Hallion & Ruscio, 2011; Heeren, Mogoase, Philippot, & McNally, 2015; Mogoase, David, & Koster, 2014).

These differences have many possible sources: the types of studies included, the moderators examined, how the interventions were defined, the outcome variables used, the clinical condition studied, and so forth. Therefore, the jury is still out on whether CBM reliably affects symptom severity.


A Look At Cognitive Bias Modification Apps

There are many cognitive bias modification apps available to download. Before purchasing one, check whether its creator followed sound research principles, or conducted any research at all, when developing the app (Zhang, Ying, Song, Fung, & Smith, 2018).

Most bias modification apps aim to change the attentional bias. For example, several apps train users to respond more quickly to happy faces than to sad or angry faces, hypothesizing that repeated use will result in a more positive mood.

The Cognitive Bias Cheatsheet is a useful way to remind oneself of the different cognitive biases that exist.


4 Relevant Books

Here is a list of books most relevant for anyone interested in cognitive biases.

Firstly, any list would be remiss without Thinking, Fast and Slow by Daniel Kahneman. In this book, Kahneman unpacks some of the most common biases that we experience when making decisions.

In the same vein is The Drunkard’s Walk: How Randomness Rules Our Lives (Leonard Mlodinow). This book addresses how badly humans misjudge the effect of randomness on our decision making.

Predictably Irrational (Dan Ariely) is an excellent and very accessible book about how our behavior is often governed by seemingly random and illogical thought processes. The opening chapter is jaw-dropping.

Nassim Nicholas Taleb published a series of books – five, in fact – and I include two of them on this list: Fooled by Randomness and The Black Swan. The entire series discusses various aspects of uncertainty.


Our Favorite TED Talks on the Topic

We’ve put together an impressive list of our favorite TED and TEDx talks on cognitive biases.

If you want to learn more about cognitive biases, then watch these talks:


Are we in control of our own decisions?  – Dan Ariely


Confirmation bias – Nassor Al Hilal


Why you think you’re right – even if you’re wrong – Julia Galef


If you want to learn how to overcome your biases, then we can recommend the following:


How to outsmart your own unconscious bias – Valerie Alexander


How to design gender bias out of your workplace – Sara Sanford


Unpacking the biases that shape our beliefs – Mike Hartmann


Relevant Resources

We have useful resources that you can use when tackling cognitive biases.

First, we have a mindfulness tool called Increasing Awareness of Cognitive Distortion that aims to change the way you think about yourself and your environment. Ultimately, users will increase their awareness of their cognitive biases and, through this awareness, be able to change their behavior.

Our Nonjudgmental Reflection tool is also useful for combating negative thoughts and biases. This exercise promotes awareness of nonjudgmental thinking, rather than negative thinking, as a way to reframe evaluations of personal events.

The Observer Meditation tool and the Moving From Cognitive Fusion to Defusion tool are useful ways to help us relate to our thoughts more objectively. The Observer Meditation guides us in observing our internal states, and adopting this observer’s stance can also benefit decision-making.

By employing an observer mindset, it might be possible to separate ourselves from our cognitive biases. Similarly, the second tool helps us change the way we perceive our thoughts and become less attached to their content. This technique might also help us ‘step away’ from our biases.


A Take-Home Message

We often rely on cognitive heuristics and biases when making decisions.

Heuristics can be useful in certain circumstances; however, heuristics and biases can result in poor decision making and can reinforce unhealthy behavior.

There are many different types of cognitive biases, and all of us fall victim to one or more of them.

However, being aware of our biases and how they affect our behavior is the first step towards resisting them.

We hope you enjoyed reading this article. For more information, don’t forget to download our three Positive CBT Exercises for free.

  • Anthony, A. S. (2004). Gender bias and discrimination in nursing education: can we change it?. Nurse Educator, 29(3), 121-125.
  • Ariely, D. (2008). Predictably irrational. New York, NY: Harper Perennial.
  • Ayton, P., & Fischer, I. (2004). The hot hand fallacy and the gambler’s fallacy: Two faces of subjective randomness? Memory & Cognition, 32(8), 1369-1378.
  • Beard, C., Sawyer, A. T., & Hofmann, S. G. (2012). Efficacy of attention bias modification using threat and appetitive stimuli: A meta-analytic review. Behavior therapy, 43(4), 724-740.
  • Caputo, A. (2016). Overcoming judgmental biases in negotiations: A scenario-based survey analysis on third party direct intervention. Journal of Business Research, 69(10), 4304-4312.
  • Clotfelter, C. T., & Cook, P. J. (1993). The “gambler’s fallacy” in lottery play. Management Science, 39(12), 1521-1525.
  • Cristea, I. A., Kok, R. N., & Cuijpers, P. (2015). Efficacy of cognitive bias modification interventions in anxiety and depression: meta-analysis. The British Journal of Psychiatry, 206(1), 7-16.
  • Davison, W. P. (1983). The third-person effect in communication. Public opinion quarterly, 47(1), 1-15.
  • Edgar, G., & Edgar, H. (2016). Perception and attention: Errors and accidents. In D. Groome & M. W. Eysenck (Eds.), An introduction to applied cognitive psychology (2nd ed., pp. 9–38). New York, NY: Routledge.
  • Feldmann‐Wüstefeld, T., Schmidt‐Daffy, M., & Schubö, A. (2011). Neural evidence for the threat detection advantage: differential attention allocation to angry and happy faces. Psychophysiology, 48(5), 697-707.
  • Garb, H. N. (1997). Race bias, social class bias, and gender bias in clinical judgment. Clinical Psychology: Science and Practice, 4(2), 99-120.
  • Gawande, A. (2014). Being mortal: Medicine and what matters in the end. Metropolitan Books.
  • Gonzalez, C. (2017). Decision-Making: A Cognitive Science Perspective. In S. Chipman (Eds.), The Oxford Handbook of Cognitive Science (pp. 249 – 264). Oxford, United Kingdom: Oxford University Press. Accessed on 9 July 2020 from
  • Güngör, G., & Biernat, M. (2009). Gender bias or motherhood disadvantage? Judgments of blue-collar mothers and fathers in the workplace. Sex Roles, 60(3-4), 232-246.
  • Hakamata, Y., Lissek, S., Bar-Haim, Y., Britton, J. C., Fox, N. A., Leibenluft, E., … & Pine, D. S. (2010). Attention bias modification treatment: a meta-analysis toward the establishment of novel treatment for anxiety. Biological psychiatry, 68(11), 982-990.
  • Hallion, L. S., & Ruscio, A. M. (2011). A meta-analysis of the effect of cognitive bias modification on anxiety and depression. Psychological Bulletin, 137(6), 940.
  • Heeren, A., Mogoașe, C., Philippot, P., & McNally, R. J. (2015). Attention bias modification for social anxiety: a systematic review and meta-analysis. Clinical psychology review, 40, 76-90.
  • Hershey, D. A., Jacobs-Lawson, J. M., & Austin, J. T. (2012). Effective financial planning for retirement. In M. Wang (Ed.), Oxford handbook of retirement (pp. 402–430). New York:
    Oxford University Press.
  • Kahneman, D. (2011). Thinking, fast and slow. New York, United States: Farrar, Straus, and Giroux
  • Kahneman, D., Slovic, P., & Tversky, A. (Eds.). (1982). Judgment under uncertainty: Heuristics and biases. Cambridge University Press.
  • Kahneman, D., & Tversky, A. (2013). Choices, values, and frames. In Handbook of the fundamentals of financial decision making: Part I (pp. 269-278).
  • Kaplan, R. M. (1978). Is beauty talent? Sex interaction in the attractiveness halo effect. Sex Roles, 4(2), 195-204.
  • Landy, D., & Sigall, H. (1974). Beauty is talent: Task evaluation as a function of the performer’s physical attractiveness. Journal of Personality and Social Psychology, 29(3), 299.
  • MacNell, L., Driscoll, A., & Hunt, A. N. (2015). What’s in a name: Exposing gender bias in student ratings of teaching. Innovative Higher Education, 40(4), 291-303.
  • Malpass, R. S., & Kravitz, J. (1969). Recognition for faces of own and other races. Journal of personality and social psychology, 13(4), 330.
  • Mitchell, K. M., & Martin, J. (2018). Gender bias in student evaluations. P.S.: Political Science & Politics, 51(3), 648-652.
  • Mlodinow, L. (2009). The drunkard’s walk: How randomness rules our lives. Vintage.
  • Mogoaşe, C., David, D., & Koster, E. H. (2014). Clinical efficacy of attentional bias modification procedures: An updated meta‐analysis. Journal of Clinical Psychology, 70(12), 1133-1157.
  • Nederhof, A. J. (1985). Methods of coping with social desirability bias: A review. European journal of social psychology, 15(3), 263-280.
  • Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of general psychology, 2(2), 175-220.
  • Nisbett, R. E., & Wilson, T. D. (1977). The halo effect: evidence for unconscious alteration of judgments. Journal of personality and social psychology, 35(4), 250.
  • Northcraft, G. B., & Neale, M. A. (1987). Experts, amateurs, and real estate: An anchoring-and-adjustment perspective on property pricing decisions. Organizational behavior and human decision processes, 39(1), 84-97.
  • Peeters, G., & Czapinski, J. (1990). Positive-negative asymmetry in evaluations: The distinction between affective and informational negativity effects. European review of social psychology, 1(1), 33-60.
  • Pettigrew, T. F. (1979). The ultimate attribution error: Extending Allport’s cognitive analysis of prejudice. Personality and social psychology bulletin, 5(4), 461-476.
  • Rosenthal, R., & Fode, K. L. (1963a). The effect of experimenter bias on the performance of the albino rat. Behavioral Science, 8(3), 183-189.
  • Rosenthal, R., & Fode, K. L. (1963b). Psychology of the scientist: V. Three experiments in experimenter bias. Psychological Reports, 12(2), 491-511.
  • Soares, R. (2010). 2010 catalyst census: Fortune 500 women board directors. New York: Catalyst.
  • Taleb, N. (2005). Fooled by randomness: The hidden role of chance in life and in the markets (Vol. 1). Random House Incorporated.
  • Taleb, N. N. (2007). The black swan: The impact of the highly improbable (Vol. 2). Random House Incorporated.
  • Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185, 1124-1131.
  • Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90(4), 293.
  • Zhang, M., Ying, J., Song, G., Fung, D. S., & Smith, H. (2018). Attention and cognitive bias modification apps: review of the literature and of commercially available apps. JMIR mHealth and uHealth, 6(5), e10034.

About the Author

Alicia Nortje, Ph.D. is a research fellow at the University of Cape Town, where she is involved in multiple projects investigating eyewitness memory and face recognition. She’s highly skilled in research design, data analysis, and critical thinking. When she’s not working, she indulges in running on the road or the trails, and enjoys cooking.

