Robert Evans Wilson Jr.

Cognitive Bias Is the Loose Screw in Critical Thinking

Recognizing your biases enhances understanding and communication.

Posted May 17, 2021 | Reviewed by Jessica Schrader

  • People cannot think critically unless they are aware of their cognitive biases, which can alter their perception of reality.
  • Cognitive biases are mental shortcuts people take in order to process the mass of information they receive daily.
  • Cognitive biases include confirmation bias, anchoring bias, bandwagon effect, and negativity bias.

When I was a kid, I was enamored of cigarette-smoking movie stars. When I was a teenager, some of my friends began to smoke; I wanted to smoke too, but my parents forbade it. I was also intimidated by the ubiquitous anti-smoking commercials I saw on television warning me that smoking causes cancer. As much as I wanted to smoke, I was afraid of it.

When I started college as a pre-med major, I also started working in a hospital emergency room. I was shocked to see that more than 90% of the nurses working there were smokers, but that was not quite enough to convince me that smoking was OK. It was the doctors: 11 of the 12 emergency room physicians I worked with were smokers. That was all the convincing I needed. If actual medical doctors thought smoking was safe, then so did I. I started smoking without concern because I had fallen prey to an authority bias, which is a type of cognitive bias. Fortunately for my health, I wised up and quit smoking 10 years later.

It's Likely You're Unaware of These Habits

Have you ever thought someone was intelligent simply because they were attractive? Have you ever dismissed a news story because it ran in a media source you didn’t like? Have you ever thought or said, “I knew that was going to happen!” in reference to a team winning, a stock going up in value, or some other unpredictable event occurring? If you replied “yes” to any of these, then you may be guilty of relying on a cognitive bias.

In my last post, I wrote about the importance of critical thinking, and how in today’s information age, no one has an excuse for living in ignorance. Since then, I recalled a huge impediment to critical thinking: cognitive bias. We are all guilty of leaning on these mental crutches, even though we don’t do it intentionally.

What Are Cognitive Biases?

The Cambridge English Dictionary defines cognitive bias as the way a particular person understands events, facts, and other people, which is based on their own particular set of beliefs and experiences and may not be reasonable or accurate.

PhilosophyTerms.com calls it a bad mental habit that gets in the way of logical thinking.

PositivePsychology.com describes it this way: “We are often presented with situations in life when we need to make a decision with imperfect information, and we unknowingly rely on prejudices or biases.”

And, according to Alleydog.com, a cognitive bias is an involuntary pattern of thinking that produces distorted perceptions of people, surroundings, and situations around us.

In brief, a cognitive bias is a shortcut to thinking. And, it’s completely understandable; the onslaught of information that we are exposed to every day necessitates some kind of time-saving method. It is simply impossible to process everything, so we make quick decisions. Most people don’t have the time to thoroughly think through everything they are told. Nevertheless, as understandable as depending on biases may be, it is still a severe deterrent to critical thinking.

Here's What to Watch Out For

Wikipedia lists 197 different cognitive biases. I am going to share with you a few of the more common ones so that in the future, you will be aware of the ones you may be using.

Confirmation bias is when you prefer to attend to media and information sources that are in alignment with your current beliefs. People do this because it helps maintain their confidence and self-esteem when the information they receive supports their knowledge set. Exposing oneself to opposing views and opinions can cause cognitive dissonance and mental stress. On the other hand, exposing yourself to new information and different viewpoints helps open up new neural pathways in your brain, which will enable you to think more creatively (see my post: Surprise: Creativity Is a Skill, Not a Gift!).

Anchoring bias occurs when you become committed or attached to the first thing you learn about a particular subject. A first impression of something or someone is a good example (see my post: Sometimes You Have to Rip the Cover Off). Similar to anchoring is the halo effect, which is when you assume that a person’s positive or negative traits in one area will be the same in some other aspect of their personality. For example, you might think that an attractive person will also be intelligent without seeing any proof to support it.


Hindsight bias is the inclination to see some events as more predictable than they are; also known as the “I knew it all along" reaction. Examples of this bias would be believing that you knew who was going to win an election, a football or baseball game, or even a coin toss after it occurred.

Misinformation effect is when your memories of an event can become affected or influenced by information you received after the event occurred. Researchers have proven that memory is inaccurate because it is vulnerable to revision when you receive new information.

Actor-observer bias is when you attribute your actions to external influences and other people's actions to internal ones. You might think you missed a business opportunity because your car broke down, but your colleague failed to get a promotion because of incompetence.

False consensus effect is when you assume more people agree with your opinions and share your values than actually do. This happens because you tend to spend most of your time with others, such as family and friends, who actually do share beliefs similar to yours.

Availability bias occurs when you believe the information you possess is more important than it actually is. This happens when you watch or listen to media news sources that tend to run dramatic stories without sharing any balancing statistics on how rare such events may be. For example, if you see several stories on fiery plane crashes, you might start to fear flying because you assume they occur with greater frequency than they actually do.

Bandwagon effect, also known as herd mentality or groupthink, is the propensity to accept beliefs or values because many other people hold them as well. This is a conformity bias that occurs because most people desire acceptance, connection, and belonging with others, and fear rejection if they hold opposing beliefs. Most people will not think through an opinion and will assume it is correct because so many others agree with it.

Authority bias is when you accept the opinion of an authority figure because you believe they know more than you. You might assume that they have already thought through an issue and made the right conclusion. And, because they are an authority in their field, you grant more credibility to their viewpoint than you would for anyone else. This is especially true in medicine where experts are frequently seen as infallible. An example would be an advertiser showing a doctor, wearing a lab coat, touting their product.

Negativity bias is when you pay more attention to bad news than good. This is a natural bias that dates back to humanity’s prehistoric days when noticing threats, risks, and other lethal dangers could save your life. In today’s civilized world, this bias is not as necessary (see my post Fear: Lifesaver or Manipulator).

Illusion of control is the belief that you have more control over a situation than you actually do. An example of this is when a gambler believes he or she can influence a game of chance.

Understand More and Communicate Better

Learning these biases, and being on the alert for them when you make a decision to accept a belief or opinion, will help you become more effective at critical thinking.

Source: Cognitive Bias Codex by John Manoogian III/Wikimedia Commons

Robert Wilson is a writer and humorist based in Atlanta, Georgia.



2.2 Overcoming Cognitive Biases and Engaging in Critical Reflection

Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. Additionally, the critical aspect engages in a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.

Connections

See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.

Watch the video to orient yourself before reading the text that follows.

Cognitive Biases 101, with Peter Bauman

Confirmation Bias

One of the most common cognitive biases is confirmation bias , which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 8. Subjects were told to generate examples to test their hypothesis. What he found is that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
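To make the contrast concrete, here is a minimal, illustrative Python sketch (not from Wason's study; the function names and test sequences are my own) showing why confirming tests cannot expose an overly narrow hypothesis, while a single falsifying test can:

```python
# Illustrative sketch of the 2, 4, 8 task: the hidden rule is "any ascending
# sequence," while the subject hypothesizes "each number doubles the previous one."

def true_rule(seq):
    """The experimenter's hidden rule: the numbers strictly increase."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def subject_hypothesis(seq):
    """The subject's guess: each number is double the one before it."""
    return all(b == 2 * a for a, b in zip(seq, seq[1:]))

# Confirmation-style tests: sequences deliberately chosen to FIT the hypothesis.
confirming_tests = [[3, 6, 12], [5, 10, 20], [10, 20, 40]]

# Falsification-style test: a sequence that BREAKS the hypothesis on purpose.
falsifying_test = [1, 2, 3]

for seq in confirming_tests:
    # Every confirming test also satisfies the hidden rule, so the subject keeps
    # getting "yes" answers and never learns the hypothesis is too narrow.
    print(seq, "rule:", true_rule(seq), "hypothesis:", subject_hypothesis(seq))

# The falsifying test violates the hypothesis yet still satisfies the hidden
# rule -- the "yes" answer here is what reveals the hypothesis must be wrong.
print(falsifying_test, "rule:", true_rule(falsifying_test),
      "hypothesis:", subject_hypothesis(falsifying_test))
```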

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let’s say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true. Watch the video below to improve your “tribal literacy” and understand the dangers of this type of thinking.

The Dangers of Tribalism, Kevin deLaplante

Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is thinking that attaches a value to things in which you have already invested resources that is greater than the value those things have today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad” by continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don’t want to lose their investment even when they are clearly headed for failure and ought to cut their losses.

A similar type of faulty reasoning leads to the gambler’s fallacy , in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
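The independence claim is easy to check numerically. Below is a small, illustrative Python simulation (my own sketch, assuming a fair coin and the standard random module) showing that the flip immediately following a streak of three heads still comes up tails only about half the time:

```python
# Illustrative sketch: simulate a fair coin and look at the outcome of the flip
# that immediately follows a run of three heads.
import random

random.seed(0)  # fixed seed so the run is reproducible
flips = [random.choice("HT") for _ in range(1_000_000)]

# Collect the flip that follows every "H, H, H" streak.
after_streak = [flips[i + 3] for i in range(len(flips) - 3)
                if flips[i:i + 3] == ["H", "H", "H"]]

# The gambler's fallacy predicts tails should be overrepresented here;
# statistical independence predicts a proportion close to 0.5.
print("flips following three heads:", len(after_streak))
print("proportion of tails:", after_streak.count("T") / len(after_streak))
```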

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.

Table 2.1 summarizes these common cognitive biases.

Think Like a Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the five cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?


Source: Nathan Smith, Introduction to Philosophy (OpenStax, 2022), https://openstax.org/books/introduction-philosophy/pages/2-2-overcoming-cognitive-biases-and-engaging-in-critical-reflection. OpenStax textbook content is licensed under a Creative Commons Attribution License.

Published: 08 March 2021

Understanding unconscious bias

Barry Oulton

BDJ In Practice, volume 34, pages 26–27 (2021)


What if you could improve the way you communicate with your patients, peers and team by recognising your unconscious biases and challenging them? Dr Barry Oulton explores.

Most of us don't like to think of ourselves as discriminating against others or having biases towards or against certain groups, but unconscious bias or 'implicit bias', as it is also called, is innate to human nature.





How unconscious bias shapes your thinking (and what you can do about it)


Most Australians believe in values such as fairness and equality – but these can be harder to act on than you would think.

All applicants for permanent visas in Australia sign off on values that include “freedom and dignity of the individual”, “equality of men and women and a spirit of egalitarianism that embraces mutual respect, tolerance, fair play and compassion for those in need”, and “equality of opportunity for individuals, regardless of their race, religion or ethnic background”.

Yet women are structurally paid less than men, research shows that people with Chinese names must send in 68 per cent more CVs than Anglo-Saxon applicants to land a job interview (followed by people with Middle Eastern names at 64 per cent), and Aboriginal and LGBTIQ community members are at least twice as likely to end their lives as people in the wider community.

To better understand these statistics we need to talk about “unconscious biases”.

Snap judgments

Unconscious biases are thought patterns; mental shortcuts. Everybody has them. We learn these tendencies over our lifetime because they help us.

When we walk home from the train on a dark, rainy night, our past experiences culminate in a snap judgment about that approaching stranger. Can we trust them? Did you picture the stranger as a man?

This is a case in point. We can do a complex activity like riding a bike, without consciously thinking about it. In a very similar way, biases help us navigate a complex social world.

Unfortunately, biases also have negative effects. We make snap judgments about others all the time: on the street, online, or when interviewing for a job. We use stereotypes to judge people from other groups.

If you feel uncomfortable reading this, you are experiencing what it feels like to be confronted with your own biases. Don’t run away – embrace them and commit to keeping them in check.

Dr Tim Soutphommasane, Australia’s Race Discrimination Commissioner, speaks about how Asians are generally seen as inoffensive, diligent, and productive.

Yet these traits are too easily interpreted as passivity, acquiescence and subservience, for example when promotions time comes around. It is also too easy to only socialise with people that are similar to us – those with the same interests, language, and problems.

Take Facebook: who does it recommend as your possible friends? Which pages does it suggest you might “like”? Research shows that we don’t know “different others” as well, that we don’t trust and respect them as much.

What do you really know about your neighbours, colleagues, homeless people, or the Aboriginal people of the Kimberley? And what do you assume? We all are victims and perpetrators of these unconscious biases.


We make snap judgements based on appearance all the time. Image: iStock

Bias checklist

So what can we do to keep our biases in check?

1. Check your distance.

Distance is everything that stands between you and others. Perhaps you are really close to your partner or immediate family but further removed from relatives, colleagues, or the refugees at Manus Island.

Languages, technology, country borders, and generations (age) all create distance and make interaction harder. Do you have cliques at work of people from the same university, expertise, or nationality? Do you reach out beyond your cliques’ boundaries?

Go see your relatives, ask how your colleagues are, read newspapers, surf beyond what Facebook recommends. Be curious and get to know what you are now only assuming.

2. Check yourself.

Discover your biases by taking Harvard University’s Implicit Association Tests. Several surveys can help you identify your biases related to skin tone, religion, age, weight, sexuality, disability and many more.

As you start becoming aware of your biases, you might catch yourself acting on them. Embarrassment and shame are common with this realisation, but remember that everyone has biases. Apologise, rephrase, and move on.

Keep challenging your stereotypes. If you are biased against considering women for leadership positions, find female role models who challenge that bias or put yourself in an applicant’s shoes for a moment.

3. Check others.

Biases are everywhere, in your family, work team, even in your book club. We need to call others on their biases without embarrassing them or yourself.

Primarily, raise awareness about biases. Many organisations have bias trainings; Google makes the company’s training publicly available.

Once aware, everyone can commit to checking each other’s biases. In teams, for example, members can gently knock on the table to call out bias in a meeting without disrupting.

Simple processes can help teams and organisations alike. Company policies around the use of unbiased language in job advertisements, blind CVs, and structured interviews are the first steps to unbiased selection and hiring practices.


Sweet serenity – for some

For many people living in Australia, the serenity is pretty sweet. We are lucky to live in a country that has the opportunity to deliver comfort and prosperity for all its residents.

Yet every day, women, the elderly, gay people, immigrants, Aboriginal people, the homeless, refugees, and other groups are excluded from larger society as a result of unconscious biases.

We can’t get rid of bias, but if we carefully check our own and others’ biases, will all Australians one day experience the serenity?

Published on 24 Nov 2017





Taking steps to recognize and correct unconscious assumptions toward groups can promote health equity.

JENNIFER EDGOOSE, MD, MPH, MICHELLE QUIOGUE, MD, FAAFP, AND KARTIK SIDHAR, MD

Fam Pract Manag. 2019;26(4):29-33

Author disclosures: no relevant financial affiliations disclosed.


Jamie is a 38-year-old woman and the attending physician on a busy inpatient teaching service. On rounds, she notices several patients tending to look at the male medical student when asking a question and seeming to disregard her.

Alex is a 55-year-old black man who has a history of diabetic polyneuropathy with significant neuropathic pain. His last A1C was 7.8. He reports worsening lower extremity pain and is frustrated that, despite his bringing this up repeatedly to different clinicians, no one has addressed it. Alex has been on gabapentin 100 mg before bed for 18 months without change, and his physicians haven't addressed the dose or changed his medication to help with pain relief.

Alisha is a 27-year-old Asian family medicine resident who overhears labor and delivery nurses and the attending complain that Indian women are resistant to cervical exams.

These scenarios reflect the unconscious assumptions that pervade our everyday lives, not only as practicing clinicians but also as private citizens. Some of Jamie's patients assume the male member of the team is the attending physician. Alex's physicians perceive him to be a “drug-seeking” patient and miss opportunities to improve his care. Alisha is exposed to stereotypes about a particular ethnic group.

Although assumptions like these may not be directly ill-intentioned, they can have serious consequences. In medical practice, these unconscious beliefs and stereotypes influence medical decision-making. In the classic Institute of Medicine report “Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care,” the authors concluded that “bias, stereotyping, and clinical uncertainty on the part of health care providers may contribute to racial and ethnic disparities in health care” often despite providers' best intentions.1 For example, studies show that discrimination and bias at both the individual and institutional levels contribute to shocking disparities for African-American patients in terms of receiving certain procedures less often or experiencing much higher infant mortality rates when compared with non-Hispanic whites.2,3 As racial and ethnic diversity increases across our nation, it is imperative that we as physicians intentionally confront and find ways to mitigate our biases.

Implicit bias is the unconscious collection of stereotypes and attitudes that we develop toward certain groups of people, which can affect our patient relationships and care decisions.

You can overcome implicit bias by first discovering your blind spots and then actively working to dismiss stereotypes and attitudes that affect your interactions.

While individual action is helpful, organizations and institutions must also work to eliminate systemic problems.

DEFINING AND REDUCING IMPLICIT BIAS

For the last 30 years, science has demonstrated that automatic cognitive processes shape human behavior, beliefs, and attitudes. Implicit or unconscious bias derives from our ability to rapidly find patterns in small bits of information. Some of these patterns emerge from positive or negative attitudes and stereotypes that we develop about certain groups of people and form outside our own consciousness from a very young age. Although such cognitive processes help us efficiently sort and filter our perceptions, these reflexive biases also promote inconsistent decision making and, at worst, systematic errors in judgment.

Cognitive processes lead us to associate unconscious attributes with social identities. The literature explores how this influences our views on race, ethnicity, age, gender, sexual orientation, and weight, and studies show many people are biased in favor of people who are white, young, male, heterosexual, and thin.4 Unconsciously, we not only learn to associate certain attributes with certain social groupings (e.g., men with strength, women with nurturing) but also develop preferential ranking of such groups (e.g., preference for whites over blacks). This unconscious grouping and ranking takes root early in development and is shaped by many outside factors such as media messages, institutional policies, and family beliefs. Studies show that health care professionals have the same level of implicit bias as the general population and that higher levels are associated with lower quality care.5 Providers with higher levels of bias are more likely to demonstrate unequal treatment recommendations, disparities in pain management, and even lack of empathy toward minority patients.6 In addition, stressful, time-pressured, and overloaded clinical practices can actually exacerbate unconscious negative attitudes. Although the potential impact of our biases can feel overwhelming, research demonstrates that these biases are malleable and can be overcome by conscious mitigation strategies.7

We recommend three overarching strategies to mitigate implicit bias – educate, expose, and approach – which we will discuss in greater detail. We have further broken down these strategies into eight evidence-based tactics you can incorporate into any quality improvement project, diagnostic dilemma, or new patient encounter. Together, these eight tactics spell out the mnemonic IMPLICIT. (See “Strategies to combat our implicit biases.”)

When we fail to learn about our blind spots, we miss opportunities to avoid harm. Educating ourselves about the reflexive cognitive processes that unconsciously affect our clinical decisions is the first step. The following tactics can help:

Introspection. It is not enough to just acknowledge that implicit bias exists. As clinicians, we must directly confront and explore our own personal implicit biases. As the writer Anais Nin is often credited with saying, “We don't see things as they are, we see them as we are.” To shed light on your potential blind spots and unconscious “sorting protocols,” we encourage you to take one or more implicit association tests. Discovering a moderate to strong bias in favor of or against certain social identities can help you begin this critical step in self-exploration and understanding.8 You can also complete this activity with your clinic staff and fellow physicians to uncover implicit biases as a group and set the stage for addressing them. For instance, many of us may be surprised to learn after taking an implicit association test that we follow the typical bias of associating males with science — an awareness that may explain why the patient in our first case example addressed questions to the male medical student instead of the female attending.

Mindfulness. It should come as no surprise that we are more likely to use cognitive shortcuts inappropriately when we are under pressure. Evidence suggests that increasing mindfulness improves our coping ability and modifies biological reactions that influence attention, emotional regulation, and habit formation.9 There are many ways to increase mindfulness, including meditation, yoga, or listening to inspirational texts. In one study, a 10-minute meditative audiotape that focused individuals and made them more aware of their sensations and thoughts in a nonjudgmental way caused them to rely less on instinct and show less implicit bias against black people and the aged.10

It is also helpful to expose ourselves to counter-stereotypes and to focus on the unique individuals we interact with. Similarity bias is the tendency to favor ourselves and those like us. When our brains label someone as being within our same group, we empathize better and use our actions, words, and body language to signal this relatedness. Experience bias can lead us to overestimate how much others see things the same way we do, to believe that we are less vulnerable to bias than others, and to assume that our intentions are clear and obvious to others. Gaining exposure to other groups and ways of thinking can mitigate both of these types of bias. The following tactics can help:

Perspective-taking. This tactic involves taking the first-person perspective of a member of a stereotyped group, which can increase psychological closeness to that group.8 Reading novels, watching documentaries, and listening to podcasts are accessible ways to reach beyond our comfort zone. To authentically perceive another person's perspective, however, you should engage in positive interactions with stereotyped group members in real life. Increased face-to-face contact with people who seem different from you on the surface undermines implicit bias.

Learn to slow down. To recognize our reflexive biases, we must pause and think. For example, the next time you interact with someone in a stereotyped group or observe societal stereotyping, such as through the media, recognize what responses are based on stereotypes, label those responses as stereotypical, and reflect on why the responses occurred. You might then consider how the biased response could be avoided in the future and replace it with an unbiased response. The physician treating Alex in the introduction could use this technique by slowing down and reassessing his medical care. By acknowledging the potential for bias, the physician may recognize that safe options remain for managing Alex's neuropathic pain.

Additionally, research strongly supports the use of counter-stereotypic imaging to replace automatic responses.11 For example, when seeking to contradict a prevailing stereotype, substitute highly defined images, which can be abstract (e.g., modern Native Americans), famous (e.g., minority celebrities like Oprah Winfrey or Lin-Manuel Miranda), or personal (e.g., your child's teacher). As positive exemplars become more salient in your mind, they become cognitively accessible and challenge your stereotypic biases.

Individuation. This tactic relies on gathering specific information about the person interacting with you to prevent group-based stereotypic inferences. Family physicians are trained to build and maintain relationships with each individual patient under their care. Our own social identities intersect with multiple social groupings, for example, related to sexual orientation, ethnicity, and gender. Within these multiplicities, we can find shared identities that bring us closer to people, including shared experiences (e.g., parenting), common interests (e.g., sports teams), or mutual purpose (e.g., surviving cancer). Individuation could have helped the health care workers in Alisha's labor and delivery unit to avoid making judgments based on stereotypes. We can use this tactic to help inform clinical decisions by using what we know about a person's specific, individual, and unique attributes.11

Like any habit, it is difficult to change biased behaviors with a “one shot” educational approach or awareness campaign. Taking a systematic approach at both the individual and institutional levels, and incorporating a continuous process of improvement, practice, and reflection, is critical to improving health equity.

Check your messaging. Using very specific messages designed to create a more inclusive environment and mitigate implicit bias can make a real difference. As opposed to claiming “we don't see color” or using other colorblind messaging, statements that welcome and embrace multiculturalism can have more success at decreasing racial bias.

Institutionalize fairness. Organizations have a responsibility to support a culture of diversity and inclusion because individual action is not enough to deconstruct systemic inequities. To overcome implicit bias throughout an organization, consider implementing an equity lens – a checklist that helps you consider your blind spots and biases and assures that great ideas and interventions are not only effective but also equitable (an example is included in the table above). Another example would be to find opportunities to display images in your clinic's waiting room that counter stereotypes. You could also survey your institution to make sure it is embracing multicultural (and not colorblind) messaging.

Take two. Resisting implicit bias is lifelong work. The strategies introduced here require constant revision and reflection as you work toward cultural humility. Examining your own assumptions is just a starting point. Talking about implicit bias can trigger conflict, doubt, fear, and defensiveness. It can feel threatening to acknowledge that you participate in and benefit from systems that work better for some than others. This kind of work can mean taking a close look at the relationships you have and the institutions of which you are a part.

MOVING FORWARD

Education, exposure, and a systematic approach to understanding implicit bias may bring us closer to our aspirational goal to care for all our patients in the best possible way and move us toward a path of achieving health equity throughout the communities we serve. The mnemonic IMPLICIT can help us to remember the eight tactics we all need to practice. While disparities in social determinants of health are often beyond the control of an individual physician, we can still lead the fight for health equity for our own patients, both from within and outside the walls of health care. With our specialty-defining goal of getting to know each patient as a unique individual in the context of his or her community, family physicians are well suited to lead inclusively by being humble, respecting the dignity of each person, and expressing appreciation for how hard everyone works to overcome bias.

1. Smedley BD, Stith AY, Nelson AR, eds. Unequal Treatment: Confronting Racial and Ethnic Disparities in Health Care. Washington, DC: Institute of Medicine, National Academy Press; 2003.

2. Hannan EL, van Ryn M, Burke J, et al. Access to coronary artery bypass surgery by race/ethnicity and gender among patients who are appropriate for surgery. Med Care. 1999;37(1):68-77.

3. Infant mortality and African Americans. U.S. Department of Health and Human Services Office of Minority Health website. https://minorityhealth.hhs.gov/omh/browse.aspx?lvl=4&lvlid=23. Updated Nov. 9, 2017. Accessed June 10, 2019.

4. Nosek BA, Smyth FL, Hansen JJ, et al. Pervasiveness and correlates of implicit attitudes and stereotypes. Eur Rev Soc Psychol. 2007;18(1):36-88.

5. FitzGerald C, Hurst S. Implicit bias in healthcare professionals: a systematic review. BMC Med Ethics. 2017;18(1):19.

6. Maina IW, Belton TD, Ginzberg S, Singh A, Johnson TJ. A decade of studying implicit racial/ethnic bias in healthcare providers using the implicit association test. Soc Sci Med. 2018;199:219-229.

7. Charlesworth TES, Banaji MR. Patterns of implicit and explicit attitudes: I. Long-term change and stability from 2007 to 2016. Psychol Sci. 2019;30(2):174-192.

8. Sukhera J, Wodzinski M, Teunissen PW, Lingard L, Watling C. Striving while accepting: exploring the relationship between identity and implicit bias recognition and management. Acad Med. 2018;93(11S):S82-S88.

9. Burgess DJ, Beach MC, Saha S. Mindfulness practice: a promising approach to reducing the effects of clinician implicit bias on patients. Patient Educ Couns. 2017;100(2):372-376.

10. Lueke A, Gibson B. Mindfulness meditation reduces implicit age and race bias: the role of reduced automaticity of responding. Soc Psychol Personal Sci. 2015;6(3):284-291.

11. Devine PG, Forscher PS, Austin AJ, Cox WTL. Long-term reduction in implicit race bias: a prejudice habit-breaking intervention. J Exp Soc Psychol. 2012;48(6):1267-1278.



Unconscious Bias Training That Works

  • Francesca Gino
  • Katherine Coffman


To become more diverse, equitable, and inclusive, many companies have turned to unconscious bias (UB) training. By raising awareness of the mental shortcuts that lead to snap judgments—often based on race and gender—about people’s talents or character, it strives to make hiring and promotion fairer and improve interactions with customers and among colleagues. But most UB training is ineffective, research shows. The problem is, increasing awareness is not enough—and can even backfire—because sending the message that bias is involuntary and widespread may make it seem unavoidable.

UB training that gets results, in contrast, teaches attendees to manage their biases, practice new behaviors, and track their progress. It gives them information that contradicts stereotypes and allows them to connect with colleagues whose experiences are different from theirs. And it’s not a onetime session; it entails a longer journey and structural organizational changes.

In this article the authors describe how rigorous UB programs at Microsoft, Starbucks, and other organizations help employees overcome denial and act on their awareness, develop the empathy that combats bias, diversify their networks, and commit to improvement.

Increasing awareness isn’t enough. Teach people to manage their biases, change their behavior, and track their progress.

Idea in Brief

The Problem

Conventional training to combat unconscious bias and make the workplace more diverse, equitable, and inclusive isn’t working.

This training aims to raise employees’ awareness of biases based on race or gender. But by also sending the message that such biases are involuntary and widespread, it can make people feel that they’re unavoidable.

The Solution

Companies must go beyond raising awareness and teach people to manage biases and change behavior. Firms should also collect data on diversity, employees’ perceptions, and training effectiveness; introduce behavioral “nudges”; and rethink policies.

Across the globe, in response to public outcry over racist incidents in the workplace and mounting evidence of the cost of employees’ feeling excluded, leaders are striving to make their companies more diverse, equitable, and inclusive. Unconscious bias training has played a major role in their efforts. UB training seeks to raise awareness of the mental shortcuts that lead to snap judgments—often based on race and gender—about people’s talents or character. Its goal is to reduce bias in attitudes and behaviors at work, from hiring and promotion decisions to interactions with customers and colleagues.

  • Francesca Gino is a behavioral scientist and the Tandon Family Professor of Business Administration at Harvard Business School. She is the author of Rebel Talent and Sidetracked.
  • Katherine Coffman is an associate professor of business administration at Harvard Business School. Her research focuses on how stereotypes affect beliefs and behavior.


Unconscious bias: what it is and how to avoid it in the workplace

Callum Hughson | September 23rd, 2019

Unconscious Bias

An unconscious bias is a thinking error that can cloud judgment and lead to poor decisions.

As a leader, it’s important to look for and process a broad range of information from many perspectives. It’s equally important to be open to alternatives not previously considered. The more perspectives and strategies you have to choose from, the more likely it is you will make the best decisions for your team and organization as a whole.

But a powerful yet subtle obstacle can stand in the way of open-mindedness in leadership: unconscious bias.

What is unconscious bias?

For most of human history, people encountered very little new information during their lifetimes, and decisions were driven by the need for survival. In our modern world, we constantly receive new information and have to make numerous complicated choices each day. As many researchers have explained, our minds are ill-equipped to handle the modern world’s decision-making demands. Evaluating evidence (especially when it is complex or ambiguous) requires a great deal of mental energy, so to keep from becoming overwhelmed, our brains have a natural tendency to take shortcuts. Unconscious bias – also known as cognitive bias – refers to the way our minds take these shortcuts when processing information. That speed is especially helpful when we’re under pressure and need to meet deadlines. But while these shortcuts save time, an unconscious bias is a systematic thinking error that can cloud our judgment and, as a result, distort our decisions.

See if you can answer this riddle: A bat and a ball cost $1.10 in total. The bat costs $1.00 more than the ball. How much does the ball cost?


Did you answer 10 cents? Most people do. Although this response intuitively comes to mind, it is incorrect. If the ball costs 10 cents and the bat costs $1.00 more than the ball, then the bat would cost $1.10 for a grand total of $1.20 for the bat and the ball. The correct answer to this problem is that the ball costs five cents and the bat costs (at $1.00 more) $1.05, for a grand total of $1.10.

If you answered 10 cents to the example above, your mind took a shortcut by unconsciously substituting the “more than” statement in the problem (the bat costs $1.00 more than the ball) with an “absolute” statement – the bat costs $1.00. It makes the equation easier to process: if a ball and bat together cost $1.10 and the bat costs $1.00, then the ball must cost 10 cents.
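To see the shortcut laid bare, here is a minimal Python check of the arithmetic; the variable names are ours, and the calculation simply restates the algebra above in integer cents:

```python
# Bat-and-ball riddle, worked in integer cents to avoid floating-point noise.
total_cents = 110        # bat + ball = $1.10
difference_cents = 100   # the bat costs $1.00 more than the ball

# Substituting bat = ball + difference into bat + ball = total gives
#   2 * ball + difference = total  =>  ball = (total - difference) / 2
ball_cents = (total_cents - difference_cents) // 2
bat_cents = ball_cents + difference_cents
print(ball_cents, bat_cents)   # 5 105 -> the ball costs 5 cents, the bat $1.05
assert ball_cents + bat_cents == total_cents

# The intuitive answer fails once the "more than" constraint is applied properly:
wrong_ball = 10
wrong_bat = wrong_ball + difference_cents   # a 10-cent ball forces a $1.10 bat
assert wrong_ball + wrong_bat != total_cents  # 120 cents, not $1.10
```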

Our unconscious mind uses embedded, unconscious beliefs formed from our cultural environment and personal experiences to make immediate decisions about everything around us. The problem is that these shortcuts result in wrong decisions much of the time – especially when rational, logical thinking is required. We all have unconscious bias, and it influences our decisions without us even realizing it.

Common types of unconscious bias

An article in The Atlantic states there are at least 100 distinct cognitive biases, while Wikipedia’s List of Cognitive Biases contains more than 185 entries. Many of the unconscious biases listed, such as the IKEA effect (placing disproportionately high value on products you helped create yourself), don’t present themselves often in the workplace. The following unconscious biases are the most common in the workplace and have the potential to derail your decision-making ability as a leader:

Sunk cost bias

You irrationally cling to things that have already cost you something. When you’ve invested time, money, or emotion into something, it can be difficult to let it go – even when it is clearly no longer viable. Aversion to that loss can distort your judgment and cause you to make ill-advised investments.

To combat this bias: ask yourself, if you hadn’t already invested time, money, effort, or emotion into this, would you still invest now? What advice would you give to a friend in the same situation?

Halo effect

The halo effect occurs when you allow your personal perception of someone (how attractive they are, how much you like them, how much they remind you of yourself) to influence your judgments about them, especially their performance. In sociology, this is known as homophily: people like people who are like themselves.

To combat this bias: if you notice you are giving consistently high (or low) performance grades across the board to particular individuals, it’s worth considering whether your judgment has been compromised by the halo effect. Focus on the performance, not the person.

The Dunning-Kruger effect

The Dunning-Kruger effect describes what happens when people mistakenly overestimate their own ability because of a lack of self-awareness. Have you ever heard the phrase “you don’t know what you don’t know”? It’s easy to be overconfident when you have only a rudimentary understanding of how things work.

It also works the other way. Because experts are keenly aware of how much they don’t know, they can drastically underestimate their own ability and lose confidence in themselves and their decision-making ability. This bias is also known as “imposter syndrome.”

To combat this bias: acknowledge the thoughts you have about yourself and put them in perspective. Learn to value constructive criticism, and understand that you’re slowing your team down when you don’t ask for help.

If you’re feeling like an imposter, it can be helpful to share what you’re feeling with trusted friends or mentors. People who have more experience can reassure you that what you’re feeling is normal. Knowing that others have been in your position can make it seem less scary.

Availability heuristic

This unconscious bias influences your judgments by favouring the ideas that come most easily to mind. Similar to the recency effect, memories that are recent and emotionally powerful can seem more relevant than they are. This can cause you to place an inordinate amount of importance on recent memories and apply them to decisions too readily.

To combat this bias: use metrics and statistical information rather than relying on first instincts and emotional influences when making a decision.

Groupthink

The desire for conformity and harmony within a group results in an irrational or dysfunctional decision-making outcome.

To combat this bias: seek to facilitate objective means of evaluating situations and encourage critical thinking practices as a group activity.

Confirmation bias

Confirmation bias causes us to look for evidence that confirms what we already think or believe and to discount or ignore information that supports an alternate view. It is often considered the most pervasive and most damaging unconscious bias in the workplace.

“What the human being is best at doing is interpreting all new information so that their prior conclusions remain intact.” — Warren Buffett

Accepting information that confirms our beliefs is easy and requires little mental energy. With confirmation bias, when we encounter contradicting information we avoid processing it and find a reason to ignore it. In The Case for Motivated Reasoning, social psychologist Ziva Kunda wrote that “we give special weight to information that allows us to come to the conclusion we want to reach.” In fact, neuroscientists have demonstrated that our brain reacts differently to information that confirms our previously held beliefs than it does to evidence that contradicts them.

If a leader’s view is limited by confirmation bias, they may not pay enough attention to information that could be crucial to their work. Leaders need to be aware of how their biases might impact the people that work for them and with them. For example, direct reports may not share all available information, or may only tell a leader what they think their leader wants to hear. This can lead to poor decision-making, missed opportunities, and negative outcomes.

To combat this bias: think of your ideas and belief system as a piece of software you’re trying to de-bug, rather than a list of things to be defended. Ask yourself the following questions and be mindful of your thought process when answering them:

  • Where do I get information about the issues I care about?
  • Do my most common sources of information confirm or challenge my perspective?
  • How much time do I spend listening to or reading opposing points of view?
  • When I make decisions, am I likely to choose the option that the people closest to me will agree with?

Being cognizant of confirmation bias is not easy, but with practice, it is possible to recognize the role it plays in the way we interpret information and make decisions.

How unconscious bias can impact inclusion and diversity in an organization

The correlation between diversity and financial performance is clear across industries and regions: more diverse teams translate directly into stronger financial performance. Between 2011 and 2015, the most gender-diverse companies were 20 per cent more likely than the least diverse to have above-average financial performance.

For organizations to attract the most talented people and ensure a vibrant and diverse workforce, they need to select from a wide-ranging and diverse talent pool. Unfortunately, when hiring, assessing, or promoting employees, we often evaluate people against our unconscious assumptions of what top talent looks like. These assumptions can favour one group over others, even if members of each group are equally likely to be successful.

During the hiring process, hiring managers gather a wide array of information about job candidates. Through interviews, candidates will share their educational background, work and personal experiences, and how they would behave in hypothetical situations. But most of the time hiring managers are measuring this information against their own personal belief of what the successful candidate “should” look like. Did they go to the right school? Would they behave in the same manner as I would in the same situation? Is their personality a close match to mine (see halo effect above) and the rest of my team?

Most hiring managers will select candidates who best match their unconscious template of what a successful candidate looks and sounds like. This approach can give preference to the “safe” choice. For example, a hiring manager may believe that only MBA graduates from elite business schools are suitable for leadership roles. If those criteria were applied to all vacancies, the organization would soon develop a leadership team of predominantly white males, as most MBA graduates are white and male. Because diversity spurs innovation, the organization would then be at a competitive disadvantage.

Innovation is not just a nice-to-have benefit of having diverse work teams. It is an integral part of any revenue-generating organization. A Boston Consulting Group study found that organizations with more diverse management teams have 19% higher revenues from innovation alone. 

How unconscious bias can be avoided

Although unconscious bias can’t be cured, there are many steps that can be taken to mitigate it. Leaders who can recognize their unconscious biases and make adjustments to overcome them are more likely to make better decisions. To stay mindful of unconscious bias, it’s important to practice self-awareness and to slow down decision making enough to consider what is driving you. Ask yourself whether your decisions are data-driven and evidence-based or made on gut instinct. Have you asked for and considered different perspectives? It can be helpful to discuss your decisions and behaviour at work with an Ivey Academy executive coach. An executive coach can provide a sounding board, a neutral perspective, and applicable strategies to help you overcome your unique unconscious biases.

Promoting inclusion and diversity

To promote inclusion and diversity in your organization's hiring practices, appropriate procedures and processes need to be put in place. To eliminate bias in hiring decisions, make promotions fairer, and increase diversity, organizations are using data-driven talent assessments .

Organizations that use robust assessment tools have improved hiring success rates, lowered employee turnover, increased employee engagement and productivity, and fostered a resilient corporate culture. Assessments provide organizations with a consistent definition of what leadership potential looks like, regardless of race, gender, or ethnicity. With the help of assessment tools, leaders are able to find “ hidden gems ” — employees who have low visibility or who previously were not seen to have leadership potential. Most importantly, talent assessment tools help to educate leaders about the difference between an employee’s experience and his or her capability to take on new and more challenging responsibilities. With the help of talent assessments, you can be confident in knowing your organization is taking a needed step in removing unconscious bias from the hiring process.

The Ivey Academy’s talent assessment tools enable your organization to identify the best candidates for vacant roles and professional development. With our help, your organization can create and maintain a competitive edge in the recruitment, development, and retention of top talent. Learn more about our talent assessments here .

About The Ivey Academy at Ivey Business School
The Ivey Academy at Ivey Business School is the home for executive Learning and Development (L&D) in Canada. It is Canada’s only full-service L&D house, blending Financial Times top-ranked university-based executive education with talent assessment, instructional design and strategy, and behaviour change sustainment.

Rooted in Ivey Business School’s real-world leadership approach, The Ivey Academy is a place where professionals come to get better, to break old habits and establish new ones, to practice, to change, to obtain coaching and support, and to join a powerful peer network. Follow The Ivey Academy on LinkedIn, Twitter, Facebook, and Instagram.



19 unconscious biases to overcome and help promote inclusivity

Unconscious biases are learned assumptions, beliefs, or attitudes that we aren’t necessarily aware of. While bias is a normal part of human brain function, it can often reinforce stereotypes. To combat unconscious bias, learn about different types of biases, how they might surface at work, and how to avoid them so you can build a more inclusive and diverse workplace.

Left unchecked, however, these biases can lead to skewed judgments and reinforce stereotypes, doing more harm than good for companies when it comes to recruitment and decision-making.

It’s especially important to be aware of these biases during the hiring process since they can impact the success of your future team.  

To help you recognize and combat unconscious bias in the workplace, we cover 19 unconscious bias examples and prevention strategies. Taking the steps to reduce biases will help you improve inclusivity, trust, and productivity within your company. 

What is unconscious bias?

Unconscious bias, also known as implicit bias, is a learned assumption, belief, or attitude that exists in the subconscious. Everyone has these biases and uses them as mental shortcuts for faster information-processing.

Implicit biases are developed over time as we accumulate life experiences and get exposed to different stereotypes. 

According to the Kirwan Institute for the Study of Race and Ethnicity , “These biases, which encompass both favorable and unfavorable assessments, are activated involuntarily and without an individual’s awareness or intentional control.”


As a result, unconscious biases can have a big influence on our limiting beliefs and behaviors. When this translates to our professional lives, it can affect the way we hire, interact with colleagues, and make business decisions. 

If not properly addressed, these biases can negatively impact a company’s workplace culture and team dynamics. 

Although these biases are pervasive, you can reduce their impact with deliberate attention and effort. Being aware of and understanding the different types of biases that exist can help you find ways to combat them. 


Types of unconscious bias

Unconscious biases manifest in different ways and have varying consequences. Some biases arise from judging people’s appearances, some are derived from preconceived notions, and others are borne of logical fallacies. We explore these common biases in detail below. 

1. Gender bias


Gender bias, the favoring of one gender over another, is also often referred to as sexism. This bias occurs when someone unconsciously associates certain stereotypes with different genders.  

This type of bias may affect recruitment practices and relationship dynamics within the company. An example of this bias during hiring is if the hiring panel favors male candidates over female candidates even though they have similar skills and job experience. 

Another well-known example is the gender pay gap. As of 2021, the median salary for men is about 18% higher than the median salary for women.

Gender bias can reduce job and career advancement opportunities for certain groups.

How to avoid gender bias

Here are some ways to create a more gender-diverse workplace: 

Set gender-neutral recruitment standards: Define the ideal candidate profile ahead of time and evaluate all candidates against those standards. 

Create diversity goals: Set qualitative gender diversity goals to create a more gender-balanced team. Support and provide resources for women to take on leadership roles. 

2. Ageism

Ageism refers to stereotyping or discriminating against others based on their age, and it most often affects older team members.

Although workers ages 40 and older are protected from workplace discrimination under the Age Discrimination in Employment Act, filing for a lawsuit against an employer can be a lengthy and costly process. 

Because not everyone files a complaint, ageism is still a prevalent issue. An AARP survey found that about 60% of workers age 45 and older have seen or experienced age discrimination in the workplace.

An example of ageism is if an older team member was passed over for a promotion, which ended up going to a younger team member with less seniority and experience. 

Companies that discriminate based on age may lose out on the valuable knowledge and experience that older workers bring. There may also be serious legal consequences if a team member decides to file a job discrimination lawsuit. 

How to avoid ageism bias

Preventing ageism involves combatting age-related stereotypes as well as engaging older team members in the workplace. Here are some ways to do that:

Don’t make assumptions based on age: For example, don’t automatically presume that older workers don’t know how to use technology or aren’t open to learning new skills. Provide equal learning opportunities for everyone. 

Foster cross-generational collaboration: Create two-way mentorship programs in which a senior team member is paired with a new hire. This kind of collaboration facilitates communication between team members at different career stages, which can help break down misconceptions about age.

3. Name bias

Name bias is the tendency to prefer certain names over others, usually Anglo-sounding names.

Name bias is most prevalent in recruitment. If a recruiter tends to offer interviews to candidates with Anglo-sounding names over equally qualified candidates with non-Anglo names, this bias is present.  

Name bias can have a negative impact on diversity hiring and result in companies missing out on talented candidates. 

How to avoid name bias

A simple solution to avoid name bias is to omit names of candidates when screening. To do this, you can:

Use software: Use blind hiring software to block out candidates’ personal details on resumes (a minimal redaction sketch follows after this list).

Do it manually: Designate a team member to remove personal information on resumes for the hiring team. 
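As a rough illustration of what the software route does, the Python sketch below strips identifying fields from a candidate record before reviewers see it. The record layout and field names are hypothetical, and real blind-hiring tools do considerably more, such as masking photos, schools, and addresses:

```python
import re

def redact_candidate(record: dict) -> dict:
    """Return a copy of a candidate record with identifying details removed.

    The record is assumed to contain 'name', 'email', and free-text
    'resume_text' keys; these field names are illustrative only.
    """
    redacted = dict(record)
    name = redacted.pop("name", "")
    redacted.pop("email", None)

    text = redacted.get("resume_text", "")
    # Mask occurrences of the candidate's full name and each name part in the free text.
    for token in filter(None, [name] + name.split()):
        text = re.sub(re.escape(token), "[REDACTED]", text, flags=re.IGNORECASE)
    redacted["resume_text"] = text
    return redacted

candidate = {
    "name": "Julia Example",
    "email": "julia@example.com",
    "resume_text": "Julia Example led a team of five analysts...",
    "years_experience": 6,
}
print(redact_candidate(candidate))
```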

4. Beauty bias

Beauty bias refers to the favorable treatment and positive stereotyping of individuals who are considered more attractive. This has also given rise to the term “lookism,” which is discrimination based on physical appearance.

An example of beauty bias is a hiring manager who is more inclined to hire candidates they think are good-looking. 

Hiring decisions should be based on skills, experience, and culture fit rather than physical appearance.

How to avoid beauty bias

Here are some ways to avoid beauty bias when screening job applicants:

Omit pictures from resumes: Focus on an applicant’s qualifications and experience when screening resumes.

Conduct telephone screening: Before scheduling an interview, consider doing a short telephone interview to get to know the applicant better without being influenced by their appearance. 

5. Halo effect

The halo effect, a term coined by psychologist Edward Thorndike in the 1920s, occurs when we develop an overall positive impression of someone because of one of their qualities or traits. 

This effect may lead us to inadvertently put people on a pedestal since we’re constructing an image of a person based on limited information. 

An example of this effect in recruitment is when a hiring manager sees that a candidate graduated from a prestigious school and assumes that they excel at their job. 

This halo is based on the hiring manager’s academic preferences. However, the school that someone went to doesn’t necessarily determine their level of job competency.  

By focusing too much on one positive trait, we may overlook negative behavior that could end up harming the company—for example, if a candidate was fired for misconduct in a previous job. 

How to avoid the halo effect

To reduce the impact of the halo effect, you could try out different interviewing strategies:

Conduct multiple interviews: Set up several rounds of interviews for candidates with different levels of management. That way, a candidate can be evaluated from various perspectives. 

Diversify your interview team: Getting someone from another team to interview the candidate may help since they’ll have less reason to “halo” them as they won’t be working with them directly. 

6. Horns effect

The horns effect is the opposite of the halo effect. This bias causes us to have a negative impression of someone based on one trait or experience. 

Putting too much weight on a single trait or interaction with someone can lead to inaccurate and unfair judgments of their character. 

For example, a new team member thinks the constructive criticism they received from their manager is harsh and assumes that their manager is a critical and stern person. 

If left unchecked, the horns effect can damage the cohesiveness and trust between team members. 

How to avoid the horns effect

In order to reduce the horns effect when interacting with others, try to: 

Challenge your first impressions: Take the time to get to know someone so you can develop a more concrete impression of that person as a whole.

Make judgments based on evidence: Ask yourself how you developed your first impression of someone and find evidence to support or refute that impression based on additional interactions. 

7. Confirmation bias


Confirmation bias is the tendency to seek out and use information that confirms one’s views and expectations. In other words, cherry-picking information to validate certain points. 

This affects our ability to think critically and objectively, which can lead to skewed interpretations of information and overlooking information with opposing views. 

For example, a product developer comes up with a product idea for the athletic market. Although market research shows little interest in the product, they try to validate the idea by reaching out to athlete friends who they know will support the idea. 

Although there’s gratification in validating a current idea, it’s important to consider the potential consequences of following through with the idea. 

How to avoid confirmation bias

Here are some ways to reduce confirmation bias:

Gather multiple sources: Whenever you’re testing a hypothesis or conducting research, gather information from a wide variety of sources to get a balanced perspective. 

Standardize interview questions: When recruiting new talent, come up with a list of standard interview questions to prevent asking off-topic or pointed questions that may or may not confirm your beliefs about a candidate.

8. Conformity bias

Conformity bias is similar to groupthink: it occurs when we change our opinions or behaviors to match those of the larger group, even when they don’t reflect our own views.

This bias may occur when we encounter peer pressure or are trying to fit into a certain social group or professional environment. 

For example, a team is deciding between two proposals. One person thinks proposal A is better, but the rest of the team is leaning towards proposal B. Swayed by the group’s opinion, that person ends up voting for proposal B because everyone else did.

Although conformity can help prevent conflicts, it can also limit creativity, open discussion, and the range of perspectives available.

How to avoid conformity bias

Here are some ways to help encourage honest opinions in the workplace:

Use anonymous votes or surveys: The option to give feedback anonymously allows the freedom to express opinions without worrying about others’ preferences. 

Ask for opinions in advance: Before going into a meeting, have a private conversation with each team member to get their opinions. This gives everyone plenty of time to think about a topic and express their thoughts without the pressure of presenting in front of colleagues. 

9. Affinity bias

Affinity bias is also known as the similarity bias and refers to the tendency to favor people who share similar interests, backgrounds, and experiences. We tend to feel more comfortable around people who are like us. 

This bias may affect hiring decisions. For example, a hiring manager gravitates towards a job applicant because they share the same alma mater.

Over time, the affinity bias in hiring can hamper a company’s diversity and inclusion efforts. 

How to avoid affinity bias

While eliminating affinity bias entirely may not be possible, there are ways to reduce its effects:

Create a diverse hiring panel: Having interviewers with varying perspectives, backgrounds, and interests conduct interviews can help offset any one individual’s affinity bias.

Go beyond hiring for “culture fit": The more hiring managers have in common with candidates, the more likely they are to evaluate them as a good “culture fit.” But the term "culture fit" is vague, and it can mean different things to different people. To assess candidates fairly, use specific language and examples when sharing feedback about them. Describe how well they embody company values or align with company missions. 

10. Contrast effect

We often make judgments by making comparisons. As a result, our judgments may be altered depending on what standard we’re comparing something to. This is known as the contrast effect.  

For instance, a team member is happy to receive a “meets expectations” on their performance review. However, they start to feel inadequate after finding out most of their colleagues got “exceeds expectations” on their reviews. 

Even though they got a decent review, the team member judges themselves more critically since their comparison standard is their colleagues’ results. 

There can also be positive contrast effects, which occur when something is perceived to be better than usual because it’s being compared to something worse. 

How to avoid the contrast effect

Here are some strategies to try when using comparisons to make decisions:

Make multiple comparisons: Instead of coming to a conclusion after making one comparison, compare something against different standards to broaden your perspective. 

Talk it out: Explain how you came to a given conclusion to your colleagues so they can understand your point of view. 

11. Status quo bias

This bias describes our preference for the way things are or for things to remain as they are, which can result in resistance to change. 

Following the status quo is a safe option and takes less effort, but it also results in becoming stagnant. As the business landscape continues to shift, change is necessary for business longevity and innovation. 

An example of the status quo bias in a company is continuing to hire team members from the same demographic group, making no effort to move forward with diversity goals. 

By repeatedly engaging in the same hiring practices, you may miss out on great candidates who can bring fresh ideas and perspectives to your company. 

How to avoid the status quo bias

Here are some ways you can challenge the status quo:

Use the framing effect: We often follow the status quo to avoid a loss, because we weight losses more heavily than gains. The framing effect involves reframing the default option as a loss, which encourages exploring alternative options as gains.

Encourage outside-the-box thinking: Create an environment that celebrates creativity and innovation. Adopt an open mindset to change so that your team can continue to challenge the status quo.

12. Anchor bias

Anchor bias occurs when we overly rely on the first piece of information we receive as an anchor to base our decision-making upon. This causes us to see things from a narrow perspective. 

For example, the first thing a recruiter finds out about a candidate they’re interviewing is that they were unemployed for the past year. The recruiter focuses on this fact rather than the candidate’s solid qualifications and skills.

Instead of relying on one piece of information to make a decision, it’s important to look at the whole picture. 

How to avoid anchor bias

It takes time to make a thoughtful decision. Here are some tips to keep in mind:

Conduct thorough research: The first option may not always be the best one. Explore various possible options and their pros and cons before deciding.

Brainstorm with your team: Discussing a given decision with your teammates can help reveal the strengths and weaknesses of a plan. 

13. Authority bias


Authority bias refers to the tendency to believe in authority figures and follow their instructions. 

Generally, following a trusted authority figure with relevant expertise is a good idea. However, blindly following a leader’s direction without your own critical thinking may cause future issues.

For example, if a team member unquestioningly follows their manager’s instructions to write a report in a way that matches the manager’s opinions, this could jeopardize the integrity of the report.

When receiving instructions on an area outside of your manager’s expertise, it can be worthwhile to seek additional information or expertise to minimize potential issues that may arise.

How to avoid authority bias

As with many unconscious biases, developing awareness of the bias is a good first step to countering it. 

Here is how to avoid being influenced by authority bias:

Ask questions: Don’t be afraid to ask your manager or company leader questions. The level of detail they provide may be an indicator of whether an idea was well thought-out or if it’s their authority coming into play. 

Do your research: Conduct your own research on a given topic to identify other credible sources or experts and see whether their suggestions align with your manager’s suggestions. 

14. Overconfidence bias

Overconfidence bias is the tendency for people to think they are better at certain abilities and skills than they actually are. 

This false assessment of our skill levels, stemming from an illusion of knowledge or control, can lead us to make rash decisions. 

For instance, an overconfident CEO decides to acquire a startup that they see high potential in and believe will bring high returns, even though the startup’s performance indicates otherwise.

Previous success or accomplishments may lead to an inflated ego. While leading with confidence is a good thing, it’s important to not let it get in the way of logical thinking and decision-making. 

How to avoid overconfidence bias

Here are tips to follow when you’re making decisions:

Consider the consequences: The decisions you make can have an impact on your company. Before committing to a decision, determine all the possible outcomes to ensure you’re prepared for them.

Ask for feedback: Getting feedback from your team can help you identify areas of improvement, whether it’s related to your performance or your ideas. Constructive criticism can keep egos in check.  

15. Perception bias

Perception bias occurs when we judge or treat others based on often inaccurate, overly simplistic stereotypes and assumptions about the group they belong in. It may involve other biases such as gender, age, and appearance. 

This type of bias may result in social exclusion, discrimination, and an overall reduction of a company’s diversity goals.

Say, for example, a team member doesn’t invite a teammate to an after-work social event because they assumed that they wouldn’t share similar interests with the group. 

Perception bias can make it difficult to have an objective understanding about members from diverse groups.

How to avoid perception bias

Reducing the impact of perception bias requires recognizing your biases:

Challenge your assumptions: Ask yourself, “How well do I really know that person or the group they belong to?” Don’t let preconceived notions prevent you from meeting or including new people. 

Think about the accuracy of statements: When you find yourself using strong words like “all,” “always,” and “never” to describe a certain group, pause and take a moment to ask yourself how accurate the description is. 

16. Illusory correlation

Illusory correlation is when we associate two variables, events, or actions together even though they’re unrelated to each other. 

For example, a hiring manager asks a candidate interview questions that are intended to give insight into their personality but are unrelated to the job itself. Because the candidate struggles to come up with answers, the hiring manager decides they would not be a good fit.

These illusions can lead us to make decisions based on inaccurate correlations.

How to avoid illusory correlation bias

We may be more prone to see false correlations in circumstances that we’re unfamiliar with or have little knowledge of. 

Here are tips to avoid making illusory correlations:

Get informed: Learning more about the areas you’re not familiar with can help you find evidence to support or refute the correlation. 

Consider all possibilities: When you associate two things, consider how likely it is that one actually causes the other. You can also use a contingency table to visualize the relationship between the suspected cause and effect (see the sketch after this list).
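For instance, a two-by-two contingency table makes it easy to check whether a suspected link, say between struggling with off-topic interview questions and weak on-the-job performance, actually holds up. The sketch below uses invented counts purely for illustration and assumes pandas and SciPy are available:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = interview outcome, columns = on-the-job performance.
table = pd.DataFrame(
    {"performed_well": [18, 21], "performed_poorly": [7, 9]},
    index=["answered_smoothly", "struggled_with_questions"],
)
print(table)

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, p = {p_value:.2f}")
# A large p-value here would suggest the two variables are not meaningfully related,
# i.e. the perceived correlation may be illusory.
```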

17. Affect heuristic

Heuristics are mental shortcuts that help us make decisions more efficiently. The affect heuristic occurs when we rely on our emotions to make decisions. This may help us reach a conclusion more quickly, though it may not always be accurate or fair. 

For example, an interview candidate makes an off-hand comment that offends a recruiter, though that wasn’t the candidate’s intention. The recruiter rejects the candidate because the comment vexed them, even though the candidate was the most qualified for the role.

Since emotions can cloud your judgment, it’s important not to make decisions in the heat of the moment.

How to avoid the affect heuristic bias

Here are ways to lower the influence of emotions in different circumstances: 

Be aware of your emotions: Simply being aware of our level of emotions in a situation can help us step back from the situation and evaluate it more logically. 

Take time to reflect: Reflect on an event some time after it occurs. Your emotions likely won’t be as strong as they were during the event, so you’ll be able to come to a more objective conclusion. 

18. Recency bias

Recency bias occurs when we attribute greater importance to recent events over past events because they’re easier to remember. 

This bias is more likely to occur when we have to process a large amount of information. For example, since hiring managers often review a high volume of job applications in a day, it may be harder to recall candidates screened earlier during the day. 

Recency bias can also manifest during the interview process when a hiring manager becomes more inclined to make hiring decisions based on the most recent candidate they interviewed. 

To overcome this bias, using techniques to strengthen your memory can be helpful. 

How to avoid recency bias

Here are some tips to prevent recency bias when interviewing candidates: 

Take notes: Take detailed notes during each interview and review them afterward. This can help you keep track of notable candidates regardless of when you interviewed them. 

Give yourself mental breaks: Doing back-to-back interviews can be mentally draining. When your working memory is overtaxed, you’re more likely to be affected by recency bias. Stay mentally alert by taking breaks between interviews so your brain has time to absorb and remember the information.

19. Idiosyncratic rater bias

Idiosyncratic rater bias affects the way we evaluate the performance of others. We often rate others based on our subjective interpretations of the assessment criteria and our own definition of what “success” looks like. 

In other words, we’re generally unreliable when it comes to rating other people. Research has found that about 60% of a manager’s rating is a reflection of the manager rather than the team member they’re rating. 

For example, a manager who excels at project management has higher standards for this skill and gives harsher ratings to team members for this skill. On the other hand, the manager is more lenient when rating team members’ marketing skills because they are less familiar with that area. 

Sources of rater bias may come from other biases, such as the halo effect, affinity bias, and confirmation bias. 

How to avoid idiosyncratic rater bias

Here are some strategies to avoid this bias when doing performance reviews: 

Set specific and clear assessment criteria: Create a rubric or a specific set of standards for evaluating performance. This prompts managers to provide supporting evidence based on a team member’s performance or achievements to determine how well they did.  

Conduct multi-rater reviews: This process involves a team member getting feedback from their colleagues and managers in addition to doing a self-evaluation. Having multiple reviews to draw from can help managers gain a more holistic view of a team member’s performance and identify potential areas for growth (the short sketch after this list illustrates the intuition).
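To illustrate the intuition behind multi-rater reviews, this small Python sketch (with made-up scores) averages several raters' scores for one team member and flags any rater whose score sits far from the group, a rough proxy for an idiosyncratic rating:

```python
from statistics import mean, stdev

# Hypothetical 1-5 ratings of the same team member on one competency.
ratings = {"manager": 2.5, "peer_a": 4.0, "peer_b": 3.5, "self": 3.5}

overall = mean(ratings.values())
spread = stdev(ratings.values())
print(f"aggregated score: {overall:.2f}")

# Flag raters more than one standard deviation from the group mean;
# their scores may say more about the rater than about the person being rated.
for rater, score in ratings.items():
    if abs(score - overall) > spread:
        print(f"note: {rater}'s rating ({score}) is an outlier relative to the group")
```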

Why it’s important to tackle unconscious biases

As these examples show, unconscious biases can hinder decision-making, impact team dynamics and leadership styles , and limit company diversity. This, in turn, can reduce equal opportunities for team members and job applicants. 

Tackling unconscious biases can help address these issues, as well as improve company diversity. 

Benefits of tackling unconscious bias

Increased company diversity can bring additional benefits such as:

Increasing company profitability: Teams that have solid problem-solving and decision-making skills can bring a competitive advantage to a company. For example, a McKinsey study found that gender-diverse companies were 21% more likely to gain above-average profitability.

Attracting diverse talent through inclusive hiring practices: By implementing inclusive recruitment strategies, companies are able to reach out to a wider talent pool. Job seekers would also be more likely to apply to companies that prioritize diversity. 

Increasing innovation: Diverse teams can bring a variety of fresh ideas to the table, allowing teams to come up with creative solutions that can drive sales. For example, a study by the Boston Consulting Group found that companies with diverse management teams bring 19% higher innovation revenue. 

Boosting company productivity: University research found that tech firms with diverse management teams have 1.32 times higher levels of productivity . Increased productivity can lead to more efficient project management and implementation. 

Encouraging higher employee engagement: Deloitte research showed that company diversity is directly related to employee engagement . Higher employee engagement can lead to higher job satisfaction, which in turn, can lower the turnover rate. 

Making fair and more efficient business decisions: Inclusive teams can make better business decisions up to 87% of the time. These business decisions can help improve a company’s performance and revenue. 

Be conscious of your unconscious biases

The good news: Once you’re aware of your unconscious biases, you can take steps to mitigate their effects. By taking micro-steps such as revamping your interview questions template and encouraging cross-team collaboration , you’re working towards a more diverse and inclusive workplace environment for you and your team.



Implicit bias in healthcare: clinical practice, research and decision making

Dipesh P Gopal

A Barts and The London School of Medicine and Dentistry, London, UK

B University of Glasgow, Glasgow, UK

Patrick O'Donnell

C University of Limerick, Limerick, Ireland

Camille Gajria

D Imperial College London, London, UK

Jodie Blackadder-Weinstein

E Royal Centre of Defence Medicine, Edgbaston, UK

Bias is the evaluation of something or someone that can be positive or negative, and implicit or unconscious bias is when the person is unaware of their evaluation. This is particularly relevant to policymaking during the coronavirus pandemic and to the racial inequalities highlighted by the Black Lives Matter movement. A literature review was performed to define bias and to identify the impact of bias on clinical practice and research, as well as on clinical decision making (cognitive bias). Bias training could bridge the gap from a lack of awareness of bias to the ability to recognise bias in others and within ourselves. However, no debiasing strategies have yet been shown to be effective. Awareness of implicit bias must not deflect attention from wider socio-economic, political and structural barriers, nor ignore explicit bias such as prejudice.

Introduction

Bias is the evaluation of something or someone that can be positive or negative, and implicit or unconscious bias is when the person is unaware of their evaluation. 1,2 It is negative implicit bias that is of particular concern within healthcare. Explicit bias, on the other hand, implies that there is awareness that an evaluation is taking place. Bias can have a major impact on the way clinicians conduct consultations and make decisions for patients, yet it receives little attention in medicine outside the teaching of clinical reasoning. By contrast, it is commonly highlighted in the world of business. 3,4 A lack of awareness of implicit bias may perpetuate systemic inequalities, resulting, for example, in lower pay for clinicians from ethnic minorities and a lack of female surgeons in senior positions. 5,6

Cognitive bias may explain political decisions made during the coronavirus pandemic: framing ventilators as ‘lifesaving’ may have driven investment in them over public health non-pharmaceutical measures (framing bias). 7 Clinicians during the pandemic may have been tempted to prescribe medication despite a lack of clear evidence, for fear of failing to act (action bias). 8 Action bias may also have been exhibited by stressed members of the public panic buying groceries despite reassurance of stable supply. 9 Cognitive bias may affect the way clinicians make decisions about healthcare given the novelty of the disease and an evolving evidence base. Politicians may prioritise resources towards goals that provide short-term rather than long-term benefit, such as increasing critical care capacity over public health investment (present bias). 7 Given the amount of poorly reported and poorly implemented non-peer-reviewed pre-print research during the pandemic, many clinicians may implement easily available research amplified by the media rather than taking a critical look at the data (availability bias). 8,10 This may be compounded by physical and emotional stress. Media reporting of the coronavirus in the USA as the ‘Chinese virus’ was linked with an increasing implicit bias that east Asians are less American. 11

This article aims to identify the potential impact of bias on clinical practice and research as well as clinical decision making (cognitive bias) and how biases may be mitigated overall.

A non-systematic literature review approach was used given the heterogeneous, mixed-method nature of research into bias in healthcare; such a topic is not amenable to systematic review methodology. Inclusion was limited to English-language articles identified by searching PubMed and the Cochrane database from January 1957 to December 2020 using the following search terms: ‘implicit bias’, ‘unconscious bias’, ‘cognitive bias’ and ‘diagnostic error and bias’. The highest levels of evidence (such as recent systematic reviews, meta-analyses and literature reviews) were prioritised for inclusion. Opinion articles were included to set context in the introduction and to identify possible future directions in the discussion. Articles on bias modification in clinical psychiatry were excluded because they focused on specific examples of clinical care rather than contributing to a broad overview of the potential impact of bias in medicine.
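For readers who want to run a comparable search themselves, the sketch below uses Biopython's Entrez wrapper for the PubMed E-utilities. It is only an approximation of the authors' stated terms and date range, not their actual protocol, and the email address is a placeholder:

```python
from Bio import Entrez

Entrez.email = "your.name@example.org"  # placeholder; NCBI asks for a real contact address

# Our own rough combination of the search terms quoted above.
query = (
    '"implicit bias" OR "unconscious bias" OR "cognitive bias" '
    'OR ("diagnostic error" AND bias)'
)

# Restrict to the stated window, January 1957 to December 2020, by publication date.
handle = Entrez.esearch(
    db="pubmed",
    term=query,
    mindate="1957/01/01",
    maxdate="2020/12/31",
    datetype="pdat",
    retmax=20,
)
record = Entrez.read(handle)
handle.close()

print("matching records:", record["Count"])
print("first PMIDs:", record["IdList"])
```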

How does bias work and where does it come from?

Decision making can be understood to involve type 1 and type 2 processes (see Fig 1). 12,13 Type 1 processes are fast, unconscious, ‘intuitive’ and require limited cognitive resources. 13,14 They are often known as mental shortcuts or heuristics, which allow rapid decision making. In contrast, type 2 processes are slower, conscious, ‘analytic’ and require more cognitive resources. 13 Together these make up dual process theory (DPT). Type 1 processing accounts for the majority of decision making and is vulnerable to error. If errors occur across consecutive decisions, they can accumulate into systematic failures, much as a car crash can follow errors in a few of the hundreds of tiny decisions made while driving. 13 Despite the critique of implicit bias, such automatic decisions are necessary for human function, and this kind of pattern recognition may have developed in early humans to identify threats (such as predators) and secure survival. 3 It is thought that our biases are formed in early life through reinforcement of social stereotypes, our own learned experience and the experience of those around us. 15

Fig 1. Decision-making processes. a) The interaction between type 1 and type 2 processes allows diagnoses to be made from patient presentations. T = ‘toggle function’; the ability to switch between type 1 and type 2 processes. b) The type 1 processes that control calibration of decision making to make a diagnosis. Adapted with permission from Croskerry P, Singhal G, Mamede S. Cognitive debiasing 1: origins of bias and theory of debiasing. BMJ Qual Saf 2013;22(Suppl 2):ii58–64.

The Implicit Association Test (IAT) is the commonest measure of bias within research literature. It was developed from review work which identified that much of social behaviour was unconscious or implicit and may contribute to unintended discrimination. 16,17 The test involves users sorting words into groups as quickly and accurately as possible and comes in different categories from disability to age, and even presidential popularity. For the gender-career IAT, one vignette might include sorting gender, or names (eg Ben or Julia), into the family or career categories. This has been well summarised in meta-analyses comparing the ability of the IAT to predict social behaviour. 18,19 Furthermore, Oswald and colleagues found that the IAT was not a predictor of markers of discrimination when looking at race and ethnicity. 19
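As a rough illustration of how IAT response-latency data is commonly scored, the sketch below computes a simplified version of the widely used D score: the difference in mean latency between stereotype-inconsistent and stereotype-consistent blocks, divided by the pooled standard deviation. The latencies are invented, and real IAT scoring adds further steps such as error penalties and trial exclusions:

```python
from statistics import mean, stdev

# Hypothetical response latencies in milliseconds for one participant.
congruent_ms = [612, 580, 645, 598, 630, 571, 602, 615]      # e.g. stereotype-consistent pairings
incongruent_ms = [734, 701, 768, 722, 690, 745, 710, 733]    # e.g. stereotype-inconsistent pairings

# Simplified D score: latency difference scaled by the pooled SD of all trials.
pooled_sd = stdev(congruent_ms + incongruent_ms)
d_score = (mean(incongruent_ms) - mean(congruent_ms)) / pooled_sd
print(f"simplified D score: {d_score:.2f}")  # larger positive values suggest a stronger implicit association
```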

While the IAT is used widely in research literature, opponents of the IAT highlight that it is unclear what the test actually measures, and comment that the test cannot differentiate between association and automatically activated responses. 20 Furthermore, it is difficult to identify associations, bringing further confusion to the question of how to measure the activity of the unconscious mind. Given these conflicting views, while IAT testing is commonly used, it cannot be universally recommended. 21 There are ethical concerns that the IAT could be used as a ‘predictive’ tool for crimes that have not yet occurred, or a ‘diagnostic’ tool for prejudice such as racism. 22 The IAT should be used as a tool for self-reflection and learning, rather than a punitive measure of one's biases or stereotypes. 23 The test highlights individual deficiencies rather than looking at system faults.

A systematic review focusing on the medical profession showed that most studies found healthcare professionals have negative bias towards non-White people, graded by the IAT, which was significantly associated with treatment adherence and decisions, and poorer patient outcomes (n=4,179; 15 studies). 24 A further systematic review showed that healthcare professionals have negative bias in multiple categories from race to disability as graded by the IAT (n=17,185; 42 studies) but it did not link this to outcomes. 25 The reviews bring into question healthcare provider impartiality which may conflict with their ethical and moral obligations. 25,26

Bias in clinical medicine

Using the IAT, US medical students (n=4,732) and doctors (n=2,284) have been shown to hold weight bias (ie prejudice against those who are overweight or obese), which may stem from a lack of undergraduate education in the causes of obesity and in how to consult sensitively. 27–29 Many healthcare professionals believe that obesity is due to a lack of willpower and personal responsibility, but it may be due to other factors such as poverty and worsening generational insomnia. 30–32 Similarly, an evaluation of the obesity IAT across 71 countries (n=338,121) between 2006 and 2010 found that overweight individuals had lower bias towards overweight people, while countries with high levels of obesity had greater bias towards obese people. 33

There is evidence to corroborate anecdotal reports of female doctors being mistaken for nurses while at work, and of male staff and male students being mistaken for doctors despite the presence of a clear female leader. 34,35 Boge and colleagues found that patients (n=150) were significantly less likely (by 17.1%) to recognise female consultants as leaders compared with their male counterparts, and significantly more likely (by 14%) to recognise female nurses as nurses compared with male nurses. 34 In addition, female residents (registrars) receive significantly more negative evaluations from nursing staff than their male colleagues, despite similar objective clinical evaluations. 36,37

One alarming disparity that deserves mention is the gender-specific difference in myocardial infarction presentation and survival. While both genders present with chest pain, women often present with so-called ‘atypical’ symptoms such as nausea, vomiting and palpitations. 38,39 The label ‘atypical’ in the literature is misleading given that women make up half of an average population. Large cohort studies (n=23,809; n=82,196) have found 15–20% higher in-hospital mortality (adjusted odds ratios) for female patients compared with male patients, which contrasts with smaller cohorts (n=4,918; n=17,021) that found no differences. 40–43 Interviews with patients under the age of 55 (n=2,985) who had suffered myocardial infarctions revealed that women were 7.4% (absolute risk) more likely to seek medical attention, and were 16.7% less likely to be told their symptoms were cardiac in origin. 44 These data indicate a need to educate the public and healthcare professionals alike about the symptoms of myocardial infarction in women.
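
Because the figures above mix adjusted odds ratios with an absolute risk difference, a small worked example may help keep the measures apart. The counts below are made up purely for illustration and are not taken from the cited cohorts.

```python
# Illustrative arithmetic only (hypothetical counts): how an absolute risk
# difference, a relative risk and an odds ratio describe the same 2x2 comparison.
def compare_groups(events_a, total_a, events_b, total_b):
    risk_a, risk_b = events_a / total_a, events_b / total_b
    odds_a, odds_b = risk_a / (1 - risk_a), risk_b / (1 - risk_b)
    return {
        "absolute_risk_difference": risk_a - risk_b,  # the '7.4% (absolute risk)' figure is this kind of measure
        "relative_risk": risk_a / risk_b,
        "odds_ratio": odds_a / odds_b,                # the 15-20% excess mortality is reported as odds ratios
    }

# Hypothetical in-hospital mortality: 60 of 1,000 women vs 50 of 1,000 men
print(compare_groups(60, 1000, 50, 1000))
# absolute risk difference ≈ 0.01, relative risk 1.2, odds ratio ≈ 1.21
```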

In 2019, the MBRRACE-UK report revealed that maternal and perinatal mortality in pregnancy was five times higher in Black women than in White women, and this finding has been replicated in US data with a similar order of magnitude of three to four times. 45,46 While official reports have not offered clear explanations for these differences, it has been suggested that a combination of stigma, systemic racism and socio-economic inequality, rather than biological factors alone, are relevant causative factors. 47,48 Lokugamage calls for healthcare professionals to challenge their own biases and assumptions when providing care using a ‘cultural safety’ model. 49,50 Such a model could help identify power imbalances in the healthcare provider–patient relationship and the resultant inequalities. Cultural competence training has been evaluated in a Cochrane systematic review; a number of the included randomised controlled trials (RCTs) showed that training courses of varying lengths provided some improvement in cultural competency and perceived care quality at 6–12 months’ follow-up (five studies; 337 professionals; 8,400 patients). 51 However, there was limited effect on objective clinical markers, such as blood pressure in ethnic minority patients.

Bias in research, evidence synthesis and policy

While scientific and medical research is thought to be free from outside influence, ‘science is always shaped by the time and the place in which it is carried out’. 52 The research questions that are developed and answered depend on the culture and institutions in our societies, including public–private industry partnerships. During research conduct, minimising bias (specifically selection and measurement bias) is an important factor in producing generalisable and robust data. Canadian life science researchers note a consistent trend of small research institutions having a 42% lower chance of a research grant application being successful compared with large research institutions. 53,54 Similarly, gender bias within the wider realm of research may discriminate against women in the selection of grant funding, as well as in the hierarchical structure of promotion in academic institutions. 55,56 At academic conferences and grand rounds, men introduced by women were 21–46% more likely to be introduced by their professional titles than women introduced by men. 57–59 Women were 8–25% more likely to introduce a fellow woman by her title than men were when introducing other men, although these differences were not always observed. 60

Taking an international perspective, when an IAT was used to assess healthcare professionals’ and researchers’ (n=321) views on the quality of research emanating from ‘rich’ and ‘poor’ countries (assessed by gross domestic product), the majority associated ‘good’ (eg trustworthy and valuable) research with ‘rich’ countries. 61 This alone does not mean much, but a randomised blinded crossover experiment (n=347) found that swapping the stated source of a research abstract from a low-income to a high-income country improved English healthcare professionals’ assessment of the research. 62 A systematic review (three randomised controlled trials; n=2,568) found geographic bias favouring research from high-income countries or more prestigious journals over that from low-income countries or less prestigious journals. 63 This highlights how publication bias in favour of research from high-income countries could neglect a wealth of valid data from low-income countries that goes unpublished or is published only in lower-impact journals. These data highlight the need for more objective assessment of research, including multiple layers of blinding with journal review boards and peer reviewers from low-income countries. 63 Blinding may also be beneficial when recruiting to jobs, given that application photos may influence the selection process at resident or registrar level. 64 However, it may be difficult to anonymise citations or publication data during academic selection processes.

Bias comes into play during evidence generation and the application of evidence-based policy (EBP), where science-based, single-faceted solutions can seldom be applied to multi-faceted or ‘wicked’ problems. 65 These are poorly defined, complex, dynamic issues where solutions may have unpredictable consequences (such as climate change or obesity). 66 Parkhurst identifies two forms of evidentiary bias in policymaking that can occur in the creation, selection and interpretation of evidence: technical bias and issue bias. 67 Technical bias occurs where the use of evidence does not follow scientific best practice, such as ‘cherry-picking’ rather than systematically reviewing the evidence to support a certain position. In contrast, issue bias occurs when the use of evidence shifts political debate in a certain direction, such as presenting a policy with evidence reflecting only one side of the debate.

Cognitive biases and diagnostic errors

Errors are inevitable in all forms of healthcare. 68 The prevalence of diagnostic errors varies between healthcare settings and may be partly due to cognitive factors as well as system-related factors. 69,70 Systematic reviews of autopsy studies (76 studies; 19,123 autopsies) that detected clinically important or ‘major’ errors involving the principal underlying disease or primary cause of death found error rates of 23.5–28% in adult and child inpatient settings. 71,72 A systematic review conducted in primary care identified a median error rate of 2.5 per 100 consultations or records reviewed (107 studies (nine systematic reviews and 98 primary studies); 128.8 million consultations/records). 73 Existing human factors research using checklists to decrease hospital-associated infections and perioperative mortality supports emerging research that links bias to diagnostic errors. 74–76

A systematic review assessing associations between cognitive biases and medical decisions found cognitive biases were associated with diagnostic inaccuracies in 36.5%–77% of case scenarios (seven studies; n=726), mostly from clinician survey-based data. 77 An association between cognitive bias and management errors was found in five studies (n=2,301). There were insufficient data to link physician biases to patient outcomes. The review was limited by a lack of definitions of the different types of cognitive bias in 40% of all studies (n=20) and by a lack of systematic assessment of cognitive bias. Cognitive biases are one of several interweaving individual-related factors linked to errors, alongside inadequate communication, an inadequate knowledge–experience skill set and not seeking help. 78 There are many different types of cognitive bias, which can be illustrated in the healthcare diagnostic context (see Table 1).

Table 1. Selected cognitive biases in a healthcare context, with definitions, illustrated with an example of a patient presenting with chest pain. 79–86

Evidence-based bias training

Making diagnoses is thought to depend on the previously mentioned type 1 and type 2 processes that make up DPT. 87 Despite this, a growing body of evidence suggests that type 2 processing, or ‘thinking slow’, is not necessarily better than type 1 processing, ‘thinking fast’, in clinicians. 12,88–90 Furthermore, it has been suggested that proposed solutions to identify and minimise biases, such as reflection, cognitive forcing (strategies that force reconsideration of diagnoses) and debiasing checklists, have limited effect on bias and error reduction. 91–94 Small-scale survey-based data (n=37) suggested the presence of hindsight bias, where clinicians disagreed on the exact cognitive biases at play depending on the outcome of a diagnostic error (see Table 1). 95

A systematic review (28 studies; n=2,665) of cognitive interventions targeting DPT for medical students and qualified doctors found that several interventions had mixed or no significant results in decreasing diagnostic error rates. 70 The vast majority of included studies had small samples (n<200) and effects often did not extend beyond 4 weeks. Interventions included integration into educational curricula, checklists when making diagnoses, cognitive forcing, reflection and direct instructions; these often come under the umbrella term ‘meta-cognition’. A more recent systematic review and meta-analysis determined that diagnostic reflection improved diagnostic accuracy by 38% in medical students and doctors (n=1,336; 13 studies) with short-term follow-up. 96 This implies that bias can only be decreased after a diagnostic error has taken place. The limited evidence base for decreasing bias may be due to methodological differences or intrinsic differences in study subjects across the clinical studies and reviews. Some clinicians may find a practical checklist helpful for minimising their own biases when making decisions while providing healthcare (Box 1). 97–99 Decreasing bias through a single-faceted intervention may be very difficult, as bias is a ‘wicked’, multi-faceted problem. 65 Unconventional methods of teaching about bias to medical students may include teaching in a non-clinical setting (such as a museum), a weekly series of case conferences examining health equity and implicit bias, and transformative learning theory. 100–102 Transformative learning theory resembles what many consider to be key components of Balint groups and combines multiple single interventions (such as experience, reflection, discussion and simulation). 102,103

Box 1. Suggested checklist for making good clinical decisions. 97–99

Hagiwara and colleagues outlined three translational gaps from social psychology to medical training that may hinder the effectiveness of bias training in improving health outcomes. 104 The first is a lack of evaluation of a person's motivation to change alongside their awareness of bias. The second is that bias training does not come with clear strategies to mitigate bias, and may result in avoidance or overfriendliness that can come across as contrived in specific situations (such as clinics serving marginalised groups). The third is a lack of verbal and non-verbal communication training alongside bias training, given that communication is the mediator between bias and patient outcomes. Verbal communication training may also need to address micro-aggressions. 105

There are limited data to suggest reflective practice as a clear evidence-based strategy to decrease our biases at the clinician–patient level, but options such as cultural safety checklists and the previously outlined strategies (Box 1) could provide support to coalface clinicians. 97–99 Better appreciation of biases in clinical reasoning could help clinicians reduce clinical errors, improve patient safety and provide better care for marginalised communities, who have the worst healthcare outcomes. 106,107 It is hoped that training would help bridge the gap from unawareness of bias to the ability to recognise bias in others and in ourselves, to mitigate personal biases and to identify how discrimination may occur. 108 Awareness of implicit bias allows individuals to examine their own reasoning in the workplace and the wider environment. It asks for personal accountability and a single question: ‘If this person were different in terms of race, age, gender, etc, would we treat them the same?’

However, there is a conflict between those suggesting bias training, which may increase awareness of bias, and the limited evidence for any effective debiasing strategy once biases have been identified. 109 Advocates of bias training suggest that it should not be taught as an isolated topic but integrated into clinical specialty training. 110 Others argue that bias training would be more effective if combined with measures of personal motivation, communication training and evidence-based strategies to decrease implicit bias. 101 Similarly, IAT testing should be administered with appropriate caveats.

To our knowledge at the time of writing, only the Royal College of Surgeons of England has identified the importance of unconscious bias through an information booklet. 111 The booklet's title, Avoiding unconscious bias, describes a goal that seems unlikely to be achievable, because type 1 processing is integral to human thinking. There is a need for better-powered research into the effectiveness of strategies that can decrease implicit and cognitive bias, especially in the long term. Furthermore, organisations should consider whether bias training should be integrated into undergraduate and postgraduate curricula, given that no clearly effective debiasing strategies have yet been identified.

As we move into data-driven societies, the impact of bias becomes ever more important. 112 A simple example is a step-counting mobile application that undercounted steps, probably because it was built to count the steps of an ‘average person’, ignoring differences in gender, body mass index and ethnic origin. 113 Within artificial intelligence, testing data algorithms in different groups of people can help make algorithms more applicable to diverse populations; ideally, diversely created algorithms should limit bias and increase applicability. 114,115
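
As a sketch of the kind of check implied above, performance can be evaluated separately for each demographic group before a tool is deployed; a large gap between groups flags the ‘average person’ problem. The data and function below are hypothetical, not taken from the cited studies.

```python
# Disaggregated evaluation sketch: mean absolute error of a predictive tool,
# computed per demographic group. Hypothetical records for illustration only.
from collections import defaultdict

def error_by_group(records):
    """records: iterable of (group, true_value, predicted_value) tuples."""
    totals, counts = defaultdict(float), defaultdict(int)
    for group, truth, prediction in records:
        totals[group] += abs(truth - prediction)
        counts[group] += 1
    return {group: totals[group] / counts[group] for group in totals}

# (group, steps actually taken, steps the app counted) - made-up numbers
records = [
    ("group_a", 10_000, 9_900), ("group_a", 8_000, 7_950),
    ("group_b", 10_000, 8_700), ("group_b", 8_000, 7_000),
]
print(error_by_group(records))  # {'group_a': 75.0, 'group_b': 1150.0}
```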

Since the Black Lives Matter movement, many institutions may consider implementing bias training to mitigate racism. However, awareness of implicit bias or tokenistic bias training must not deflect from the wider socio-economic, political and structural barriers that individuals face. 116,117 Similarly, implicit bias should not be used to absolve responsibility, nor to ignore explicit bias that may perpetuate prejudice and stereotypes. 117 Action to correct the lack of non-White skin in the research literature and medical textbooks is welcome. 118–120 Furthermore, there has been much work to challenge the role of biological race in clinical algorithms and guidance (such as estimated glomerular filtration rate and blood pressure). 121,122 Most pertinent to the pandemic, Sjoding and colleagues compared almost 11,000 pairs of oxygen saturation measurements taken by pulse oximetry and arterial blood gas among Black and White patients. 123 Black patients were 8–11% (a relative risk of roughly three) more likely than White patients to have arterial saturations lower than their pulse oximetry readings suggested. This has implications for the coronavirus pandemic and other respiratory conditions, and is a call to tackle racial bias in medical devices.
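
The paired comparison described above can be illustrated with a short sketch. The readings below are invented, not the Sjoding dataset, and the ‘occult hypoxaemia’ rule used here (arterial saturation below 88% despite a reassuring pulse oximetry reading of 92–96%) is one commonly used definition rather than a detail taken from the text.

```python
# Illustrative only: rate of 'occult hypoxaemia' per group from paired
# (pulse oximetry SpO2 %, arterial blood gas SaO2 %) readings. Made-up data.
def occult_hypoxaemia_rate(pairs, spo2_range=(92, 96), sao2_cutoff=88):
    reassuring = [(spo2, sao2) for spo2, sao2 in pairs
                  if spo2_range[0] <= spo2 <= spo2_range[1]]
    missed = [pair for pair in reassuring if pair[1] < sao2_cutoff]
    return len(missed) / len(reassuring) if reassuring else 0.0

group_a = [(94, 86), (95, 90), (93, 87), (96, 92), (94, 89)]  # hypothetical patients
group_b = [(94, 91), (95, 93), (93, 90), (96, 92), (94, 88)]  # hypothetical patients

print(occult_hypoxaemia_rate(group_a), occult_hypoxaemia_rate(group_b))  # 0.4 0.0
```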

With regards to the structure of our healthcare systems, an understanding of personal bias can help identify judgements made during recruitment processes and help build leadership and a workforce that are representative of the population the healthcare system serves. 124 This is likely to help deliver better patient outcomes. Other strategies to decrease the impact of bias include using objective recruitment criteria, blind evaluations and salary disclosures. 125 Additional measures include providing a system for reporting discrimination, measuring outcomes such as employee pay and hiring, and routinely measuring employee perceptions of inclusion and fairness. Such measures are fundamental to help mitigate inequality and its associated adversity.

Acknowledgements

We thank Prof Damien Ridge for his suggestions on this manuscript.

Conflicts of interest

Dipesh Gopal is an in-practice fellow supported by the Department of Health and Social Care and the National Institute for Health Research.

The views expressed are those of the author(s) and not necessarily those of the NHS, the NIHR or the Department of Health.

No honoraria paid to promote media cited including books, podcasts and websites.


13 Types of Common Cognitive Biases That Might Be Impairing Your Judgment

Which of these sway your thinking the most?

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Amy Morin, LCSW, is a psychotherapist and international bestselling author. Her books, including "13 Things Mentally Strong People Don't Do," have been translated into more than 40 languages. Her TEDx talk,  "The Secret of Becoming Mentally Strong," is one of the most viewed talks of all time.


In this article: the confirmation bias, the hindsight bias, the anchoring bias, the misinformation effect, the actor-observer bias, the false consensus effect, the halo effect, the self-serving bias, the availability heuristic, the optimism bias, and other kinds of cognitive bias.

Although we like to believe that we're rational and logical, the fact is that we are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.

Sometimes, cognitive biases are fairly obvious. You might even find that you recognize these tendencies in yourself or others. In other cases, these biases are so subtle that they are almost impossible to notice.

At a Glance

Attention is a limited resource. This means we can't possibly evaluate every possible detail and event ​when forming thoughts and opinions. Because of this, we often rely on mental shortcuts that speed up our ability to make judgments, but this can sometimes lead to bias. There are many types of biases—including the confirmation bias, the hindsight bias, and the anchoring bias, just to name a few—that can influence our beliefs and actions daily.

The following are just a few types of cognitive biases that have a powerful influence on how you think, how you feel, and how you behave.


The confirmation bias is the tendency to listen more often to information that confirms our existing beliefs. Through this bias, people tend to favor information that reinforces the things they already think or believe.

Examples include:

  • Only paying attention to information that confirms your beliefs about issues such as gun control and global warming
  • Only following people on social media who share your viewpoints
  • Choosing news sources that present stories that support your views
  • Refusing to listen to the opposing side
  • Not considering all of the facts in a logical and rational manner

There are a few reasons why this happens. One is that only seeking to confirm existing opinions helps limit mental resources we need to use to make decisions. It also helps protect self-esteem by making people feel that their beliefs are accurate.

People on two sides of an issue can listen to the same story and walk away with different interpretations that they feel validates their existing point of view. This is often indicative that the confirmation bias is working to "bias" their opinions.

The problem with this is that it can lead to poor choices, an inability to listen to opposing views, or even contribute to othering people who hold different opinions.

Things that we can do to help reduce the impact of confirmation bias include being open to hearing others' opinions, actively seeking out and researching opposing views, reading full articles (and not just headlines), questioning the source, and doing the research yourself to see whether it is reliable.

The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are. It's also commonly referred to as the "I knew it all along" phenomenon.

Some examples of the hindsight bias include:

  • Insisting that you knew who was going to win a football game once the event is over
  • Believing that you knew all along that one political candidate was going to win an election
  • Saying that you knew you weren't going to win after losing a coin flip with a friend
  • Looking back on an exam and thinking that you knew the answers to the questions you missed
  • Believing you could have predicted which stocks would become profitable

Classic Research

In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court.

Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.  

The hindsight bias occurs for a combination of reasons, including our ability to "misremember" previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.

The effect of this bias is that it causes us to overestimate our ability to predict events. This can sometimes lead people to take unwise risks.

The anchoring bias is the tendency to be overly influenced by the first piece of information that we hear. Some examples of how this works:

  • The first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based.
  • Hearing a random number can influence estimates on completely unrelated topics.
  • Doctors can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments.

While the existence of the anchoring bias is well documented, its causes are still not fully understood. Some research suggests that the source of the anchor information may play a role. Other factors such as priming and mood also appear to have an influence.

Like other cognitive biases, anchoring can have an effect on the decisions you make each day. For instance, it can influence how much you are willing to pay for your home. However, it can sometimes lead to poor choices and make it more difficult for people to consider other factors that might also be important.

The misinformation effect is the tendency for memories to be heavily influenced by things that happened after the actual event itself. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.

For example:

  • Research has shown that simply asking questions about an event can change someone's memories of what happened.
  • Watching television coverage may change how people remember the event.
  • Hearing other people talk about a memory from their perspective may change your memory of what transpired.

Classic Memory Research

In one classic experiment by memory expert Elizabeth Loftus , people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”  

When the witnesses were then questioned a week later whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.

There are a few factors that may play a role in this phenomenon. New information may get blended with older memories.   In other cases, new information may be used to fill in "gaps" in memory.

The effects of misinformation can range from the trivial to much more serious. It might cause you to misremember something you thought happened at work, or it might lead to someone incorrectly identifying the wrong suspect in a criminal case.

The actor-observer bias is the tendency to attribute our actions to external influences and other people's actions to internal ones. The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation.

When it comes to our own actions, we are often far too likely to attribute things to external influences. For example:

  • You might complain that you botched an important meeting because you had jet lag.
  • You might say you failed an exam because the teacher posed too many trick questions.

When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. For example:

  • A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag).
  • A fellow student bombed a test because they lack diligence and intelligence (and not because they took the same test as you with all those trick questions).

While there are many factors that may play a role, perspective plays a key role. When we are the actors in a situation, we are able to observe our own thoughts and behaviors. When it comes to other people, however, we cannot see what they are thinking. This means we focus on situational forces for ourselves, but guess at the internal characteristics that cause other people's actions.

The problem with this is that it often leads to misunderstandings. Each side of a situation is essentially blaming the other side rather than thinking about all of the variables that might be playing a role.

The false consensus effect is the tendency people have to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values. For example:

  • Thinking that other people share your opinion on controversial topics
  • Overestimating the number of people who are similar to you
  • Believing that the majority of people share your preferences

Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we start to think that this way of thinking is the majority opinion even when we are with people who are not among our group of family and friends.

Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem . It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.

This can lead people not only to incorrectly think that everyone else agrees with them—it can sometimes lead them to overvalue their own opinions. It also means that we sometimes don't consider how other people might feel when making choices.

The halo effect is the tendency for an initial impression of a person to influence what we think of them overall. Also known as the "physical attractiveness stereotype" or the "what is beautiful is good" principle, the halo effect influences us, or is used by us to influence others, almost every day. For example:

  • Thinking people who are good-looking are also smarter, kinder, and funnier than less attractive people
  • Believing that products marketed by attractive people are also more valuable
  • Thinking that a political candidate who is confident must also be intelligent and competent

One factor that may influence the halo effect is our tendency to want to be correct. If our initial impression of someone was positive, we want to look for proof that our assessment was accurate. It also helps people avoid experiencing cognitive dissonance , which involves holding contradictory beliefs.

This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.

The self-serving bias is the tendency for people to give themselves credit for successes but lay the blame for failures on outside causes. When you do well on a project, you probably assume that it’s because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck.

Some examples of this:

  • Attributing good grades to being smart or studying hard
  • Believing your athletic performance is due to practice and hard work
  • Thinking you got the job because of your merits

The self-serving bias can be influenced by a variety of factors. Age and sex have been shown to play a part. Older people are more likely to take credit for their successes, while men are more likely to pin their failures on outside forces.  

This bias does serve an important role in protecting self-esteem. However, it can often also lead to faulty attributions such as blaming others for our own shortcomings.

The availability heuristic is the tendency to estimate the probability of something happening based on how many examples readily come to mind. Some examples of this:

  • After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are.
  • You might believe that plane crashes are more common than they really are because you can easily think of several examples.

It is essentially a mental shortcut designed to save us time when we are trying to determine risk. The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions.

Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking. In contrast, if you have two sisters and five neighbors who have had breast cancer, you might believe it is even more common than statistics suggest.

The optimism bias is a tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good.

For example, we may assume that negative events simply won't affect us.

The optimism bias has roots in the availability heuristic. Because you can probably think of examples of bad things happening to other people, it seems more likely that others will be affected by negative events.

This bias can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt. The bad news is that research has found that this optimism bias is incredibly difficult to reduce.

There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals.

Other Kinds of Cognitive Bias

Many other cognitive biases can distort how we perceive the world. Just a partial list:

  • Status quo bias reflects a desire to keep things as they are.
  • Apophenia is the tendency to perceive patterns in random occurrences.
  • Framing is presenting a situation in a way that gives a certain impression.

Keep in Mind

The cognitive biases above are common, but this is only a sampling of the many biases that can affect your thinking. These biases collectively influence much of our thoughts and ultimately, decision making.

Many of these biases are inevitable. We simply don't have the time to evaluate every thought in every decision for the presence of any bias. Understanding these biases is very helpful in learning how they can lead us to poor decisions in life.

Dietrich D, Olson M. A demonstration of hindsight bias using the Thomas confirmation vote. Psychol Rep. 1993;72(2):377-378. doi:10.2466/pr0.1993.72.2.377

Lee KK.  An indirect debiasing method: Priming a target attribute reduces judgmental biases in likelihood estimations .  PLoS ONE . 2019;14(3):e0212609. doi:10.1371/journal.pone.0212609

Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: A systematic review .  BMC Med Inform Decis Mak . 2016;16(1):138. doi:10.1186/s12911-016-0377-1

Furnham A, Boo HC. A literature review of anchoring bias. The Journal of Socio-Economics. 2011;40(1):35-42. doi:10.1016/j.socec.2010.10.008

Loftus EF.  Leading questions and the eyewitness report .  Cognitive Psychology . 1975;7(4):560-572. doi:10.1016/0010-0285(75)90023-7

Challies DM, Hunt M, Garry M, Harper DN. Whatever gave you that idea? False memories following equivalence training: a behavioral account of the misinformation effect .  J Exp Anal Behav . 2011;96(3):343-362. doi:10.1901/jeab.2011.96-343

Miyamoto R, Kikuchi Y.  Gender differences of brain activity in the conflicts based on implicit self-esteem .  PLoS ONE . 2012;7(5):e37901. doi:10.1371/journal.pone.0037901

Weinstein ND, Klein WM.  Resistance of personal risk perceptions to debiasing interventions .  Health Psychol . 1995;14(2):132–140. doi:10.1037//0278-6133.14.2.132

Gratton G, Cooper P, Fabiani M, Carter CS, Karayanidis F. Dynamics of cognitive control: theoretical bases, paradigms, and a view for the future . Psychophysiology . 2018;55(3). doi:10.1111/psyp.13016



A. Enrique Caballero, Nuha A. ElSayed, Sherita Hill Golden, Raveendhara R. Bannuru, Brigid Gregg; Implicit or Unconscious Bias in Diabetes Care. Clin Diabetes 15 April 2024; 42(2): 308–313. https://doi.org/10.2337/cd23-0048


“Implicit bias,” also called “unconscious bias,” refers to associations outside of conscious awareness that adversely affect one’s perception of a person or group. Awareness of implicit bias has been increasing in the realm of diabetes care. Here, the authors highlight several types of unconscious bias on the part of clinicians and patients, including biases based on race, ethnicity, and obesity. They discuss how these biases can negatively affect patient-centered clinical interactions and diabetes care delivery, and they recommend implementation of evidence-based interventions and other health system policy approaches to reduce the potential impact of such biases in health care settings.

The health care environment is fraught with structural, interpersonal, and systemic disparities. “Explicit bias” refers to conscious attitudes and beliefs about a person or group. In contrast, “implicit bias” or “unconscious bias” involves associations outside of conscious awareness that negatively evaluate a person based on the groups they belong to ( 1 ). This type of bias is often seen when one is interacting with members of racially or ethnically minoritized or otherwise underrepresented groups ( 2 , 3 ). The term “minoritization” is used to acknowledge the marginalization of a group by the dominant society. Interestingly, many populations thought of as minority groups in the United States are quite sizeable and soon will no longer represent a numerical minority compared with White Americans.

In the sphere of health care, racially and ethnically minoritized patients are often viewed as less intelligent, less able to comprehend and adhere to treatment recommendations, and less interested in their health than nonminoritized patients. Studies assessing implicit bias using the Implicit Association Test (IAT) have demonstrated that these biases are associated with disparities in empathy, treatment recommendations, and expectations of therapy adherence ( 4 , 5 ). Less intensive lifestyle modification and pharmacological approaches in racially and ethnically minoritized patients may be related to implicit biases on the part of diabetes care professionals.

The presence of obesity is another characteristic that seems to draw implicit bias in health care, particularly in diabetes care. Physicians have been found to show a preference for patients who are thin, which may negatively affect the care experiences of individuals with overweight or obesity ( 6 ).

Unconscious bias of all types leads to fewer patient-centered clinical interactions, more verbal dominance by health care professionals (HCPs), decreased participatory decision-making in the patient-provider relationship, and less effective health care ( 7 ). Although identifying and evaluating implicit bias in clinicians is not a straightforward task, several studies have shown a significant correlation between degree of implicit bias and quality of care ( 8 , 9 ). Recently, the negative impact of bias on the part of patients and their family members toward HCPs also has been described, and this bias can manifest both implicitly and explicitly ( 10 – 14 ). Experiencing and witnessing instances of bias and discrimination adversely affects the clinical care environment.

Although other forms of bias, such as sex bias, exist and also affect health care, the scope of this article focuses on how racial, ethnic, and obesity biases affect diabetes care delivery. We provide recommendations derived from evidence-based interventions to reduce both individual and systemic biases in health care settings. In addition, we describe health system policy approaches for addressing biases held by patients and family members and discrimination directed toward HCPs.

Bias spans the continuum of diabetes care. In type 2 diabetes prevention, people with overweight or obesity experience bias that may manifest through physical examination and weighing procedures that lack sensitivity or even engender feelings of shame and embarrassment (e.g., gowns and medical equipment that are not appropriate for a patient’s size) ( 15 ). These individuals are often stereotyped as lazy, unmotivated, and undisciplined. In addition to biased treatment in health care, they often experience employment discrimination, educational barriers, media stereotyping, and interpersonal problems ( 8 , 16 ). In a survey of 2,449 people with overweight or obesity, 69% reported experiencing weight bias by physicians, including 50% on multiple occasions; 46% by nurses; 37% by registered dietitians; and 21% by other HCPs ( 8 , 16 ). Even specialists in weight loss were identified as having significant pro-thin, anti-fat implicit bias on the IAT ( 17 ). This bias may reduce HCPs’ likelihood of being mindful of their patients’ needs.

Bias-based negative interactions can have significant impacts on health outcomes. Stigmatized people hide to avoid being subjected to discrimination and bias. People with obesity and high risk for diabetes are less likely to benefit from preventive health services and cancer screenings because they are more likely to cancel or delay such appointments ( 8 , 9 , 16 ). Furthermore, women with obesity reported delaying their preventive services because they have experienced disrespect and negative attitudes from clinicians and embarrassment from being weighed and from having medical equipment that is too small for them ( 9 ). Exercise and higher levels of physical activity are known to provide benefits in both the prevention and management of diabetes ( 18 ). However, weight bias experienced by people who have or are at high risk of developing diabetes results in avoidance of physical activity, less desire to exercise, and thus decreased levels of strenuous and moderate physical activity ( 19 ).

In diabetes care, implicit bias manifests in many ways that negatively affect the health and wellness of people with diabetes. For example, in one study of 1,227 people with type 2 diabetes who reported internalized weight stigma and diabetes self-stigma, a significant association with higher levels of diabetes-specific distress was identified ( 15 ). Adults who also expressed self-stigma regarding their diabetes reported less diabetes self-management and lower self-efficacy. Those who reported being judged about their weight by a doctor also exhibited more significant diabetes-specific distress ( 15 ). In primary care, people with type 2 diabetes who were Black and/or on Medicaid or Medicare were found to have increased risk of being labeled nonadherent compared with their White counterparts and/or those on private insurance, even after adjusting for A1C ( 20 ). This stigma in health care can hinder diabetes diagnosis and management, resulting in other poor health outcomes ( 21 ).

Bias does not affect only people with type 2 diabetes and/or obesity. Findings from a cross-sectional study of people with type 1 diabetes demonstrated that diabetes stigma is negatively associated with both diabetes distress and glycemic control ( 22 ). Although technology can confer benefits to people with type 1 diabetes, youth with type 1 diabetes also experience bias with regard to technology access. For example, one study showed that having public insurance (a proxy of low socioeconomic status) resulted in less use of diabetes technology compared with people with private insurance ( 23 ). Also, real-world data from the T1D Exchange clinic registry in 2019 showed that barriers to technology adoption included implicit bias/institutional racism, social determinants of health, cost, access, geography, education, culture, individuals’ and HCPs’ preference, and health literacy, and that these barriers resulted in significant care disparities ( 24 ). Of note, the lowest technology utilization rates were among Black patients, followed by Hispanics. This sequence in utilization rates remains even after adjusting for age, sex, study site, insurance type, education level, and neighborhood poverty level ( 24 ). More work is needed in this field to fully elucidate the causes of disparities in diabetes care.

Despite all of the benefits of having increased diversity in the health care workforce, HCPs also report experiencing bias. This bias may be explicit rather than unconscious and can have a negative impact on HCPs, the overall workforce, and patients themselves.

Mistreatment of physicians has been recognized and reported since the early 1980s. Mistreatment is best characterized in medical trainees. A recent single-center study reported that 93% of first-year residents had experienced some form of disruptive behavior directed toward them ( 25 ). Racial discrimination is the most common form of discrimination clinicians report, with overall rates varying between 19 and 71% ( 10 ). Although most research on mistreatment of HCPs has been done with medical trainees ( 11 , 12 ), such mistreatment has also been reported by nursing professionals and practicing physicians ( 26 , 27 ).

Less well characterized are bias on the part of, and discrimination carried out by, patients and their families. One study reported that such discrimination accounted for 40% of physician mistreatment ( 25 ). Patient bias toward HCPs has been defined as “behavior or use of language that demeans clinicians based on their social identity traits, such as race, ethnicity, sex, disability, gender presentation, and sexual orientation” ( 13 ). The types of biased behavior by patients can include a variety of overt and more subtle manifestations. Commonly reported instances of mistreatment gathered from focus groups include refusal of care, explicitly biased remarks, questioning of the role of the clinician, nonverbal disrespect, ethnic stereotypes, assertive inquiries into background, and flirtatious remarks ( 13 , 14 ).

Although more recent medical curricula have included training on how to confront these difficult situations, clinicians traditionally have not had such training. The Accreditation Council for Graduate Medical Education, an independent, not-for-profit organization that sets and monitors voluntary professional educational standards essential in preparing physicians to deliver safe, high-quality medical care to all Americans, has encouraged the development of strategies to address mistreatment during residency across academic institutions ( 25 ). There is a concern that confrontation may be more difficult in the medical professional setting for practicing physicians ( 13 ). There is also a concern that responding to or reporting mistreatment could negatively affect physicians’ promotion opportunities and professional success ( 10 – 14 ).

Clinicians who have experienced mistreatment or witnessed the mistreatment of colleagues have reported negative consequences. These experiences have been described as leading to burnout, emotional burden, withdrawal from roles, and decreased clinical learning ( 13 , 14 ). On the other hand, a timely response to discrimination with a team debriefing or support team may improve feelings of inclusion. HCPs armed with training on how to confront mistreatment may also desire to take on leadership roles and model behaviors to improve the workplace for others ( 14 ).

Interventions to Address HCP Bias

Achieving a diverse biomedical workforce is important to reduce implicit bias in medical care. The Bias Reduction in Internal Medicine (BRIM) intervention developed by Carnes et al. ( 28 ) at the University of Wisconsin, Madison, is an evidence-based approach to mitigating unconscious bias in health care.

The BRIM intervention is delivered as a 3-hour interactive workshop consisting of three modules titled “Implicit Bias as a Habit,” “Becoming Bias Literate: If You Name It, You Can Tame It,” and “Evidence-Based Strategies to Break the Bias Habit.” The bias mitigation strategies taught are summarized in Table 1 ( 28 ).

BRIM Program Unconscious Bias Mitigation Strategies

The BRIM intervention was evaluated in a cluster-randomized study at the University of Wisconsin. Compared with faculty in the 46 control departments, faculty in the 46 intervention departments reported increased awareness, motivation, self-efficacy, and action for engaging in gender equity–promoting activities and reported a more positive departmental climate 3 months after the workshop ( 28 ). In addition, intervention departments had greater diversity in new hires 2–3 years after workshop participation ( 29 ).

Although the BRIM intervention was initially focused on addressing bias to diversify the biomedical workforce, its principles can be translated easily to mitigating HCPs’ bias toward patients from diverse backgrounds. We strongly recommend that clinicians carefully reflect on the importance of bias in our day-to-day activities. Recognizing that we all have biases is an important way to start our awareness and growth process in this important area.

Interventions to Address Health Care System Bias in Diabetes

The U.S. Department of Health and Human Services Office of Minority Health has established national Culturally and Linguistically Appropriate Services (CLAS) standards to guide health care organizations in achieving and promoting health equity and reducing health care disparities ( Table 2 ) ( 30 ). These standards require health care organizations to establish culturally and linguistically appropriate goals, policies, and procedures that are incorporated into clinical operations with a plan for evaluation, continuous quality improvement, and accountability ( 30 , 31 ). These standards should be part of the core value for any health care organization.

National CLAS Standards in Health and Health Care

To eliminate bias and achieve health equity, it is important to collect and maintain accurate and reliable demographic data. This effort will allow monitoring and evaluation of disease outcomes based on race, ethnicity, sex, English proficiency, ability status, sexual orientation, and gender identity (another CLAS recommendation). Health care organizations need to train their staff in the proper interview techniques to ascertain these data.

The CLAS standards call for health systems to provide proper assistance to patients with disabilities who have diabetes and language assistance to those with limited English proficiency and/or other communication needs. Among Latinos with diabetes and limited English proficiency, those who switched from a nonlanguage-concordant primary care professional to one who was language-concordant (i.e., Spanish-speaking) had significant improvement in glycemic and LDL cholesterol control ( 31 , 32 ). It is also crucial to provide easy-to-understand patient education materials and signage in clinical areas in the most commonly spoken languages for the population served by a health care organization ( 30 , 31 ).

Health services research in the field of diabetes has demonstrated the effectiveness of multilevel, culturally tailored interventions in improving diabetes outcomes for these vulnerable populations ( 31 ). Multilevel interventions target all aspects of health care, including patients, HCPs, and the health care system. Features of effective interventions that improved A1C included that they were culturally and health literacy–tailored, were led by community educators or laypeople, were provided as one-to-one (vs. group) interactions, incorporated treatment algorithms, focused on behavior-related tasks, provided feedback, and were high-intensity over a long duration ( 31 ). Interventions specific to health care organizations that have resulted in improved glycemic control for minority patients with diabetes have included systems for rapid-turnaround (e.g., point-of-care) A1C measurement, circumscribed appointments, support staff involvement (e.g., from nurse case managers, community health workers, and pharmacists), and enhanced follow-up with home visits or telephone/mail contact ( 31 ).

It is certain that the implementation of such interventions would require financial support. Health care system leaders should not forget that optimizing diabetes care for all is the ultimate goal, which may in fact reduce overall health care costs. Therefore, investing in strategies that improve the quality of care across the board should be embraced.

Interventions to Address Patient Bias Toward HCPs

Policy intervention is required to create a health care environment that respects patients’ right to be an active participant in their treatment and care while also protecting clinicians from discrimination by patients. Yielding to a patient’s request to switch to a different HCP based on a protected class characteristic (e.g., an HCP’s race, ethnicity, sex, religion, gender identity, sexual orientation, age, or veteran status) is discriminatory and a violation of the HCP’s civil rights. Health care systems should clearly articulate that, under the majority of clinical circumstances, these requests will not be honored. Educational support should be provided for clinical staff on how to respond to such incidents in a manner that is respectful to patients and shows allyship to clinicians on the receiving end of the discrimination/bias ( 33 ).

This review highlights the need for multifaceted approaches to abolish bias and its consequences in diabetes care. To address implicit bias, we need to rally all stakeholders in various health care settings and communities to share facts, promote cultural competency, and correct misconceptions by amplifying the voices and stories of those marginalized and discriminated against. Moreover, deliberate efforts are needed to represent and portray marginalized groups in media and public information to correct misinformation, myths, and rumors. Furthermore, the adoption of existing and the creation of new systematic, effective, and evidence-based approaches to mitigating unconscious bias should be encouraged and supported.

Acknowledgments

American Diabetes Association staff support was provided by Caroline Blanco, MS, RDN, LDN.

Duality of Interest

No potential conflicts of interest relevant to this article were reported.

Author Contributions

All authors wrote the manuscript, edited and revised the manuscript, and approved the final version for submission. A.E.C. is the guarantor of this work and, as such, had full access to all the information presented and takes responsibility for the integrity and accuracy of the work.


IMAGES

  1. Infographic : 18 Cognitive Bias Examples Show Why Mental Mistakes Get Made

    critical thinking and unconscious bias

  2. Cognitive and Unconscious Bias: What It Is and How to Overcome It

    critical thinking and unconscious bias

  3. 25 Unconscious Bias Examples (2024)

    critical thinking and unconscious bias

  4. 8 of the Most Common Biases in the Workplace

    critical thinking and unconscious bias

  5. Understanding Unconscious Bias

    critical thinking and unconscious bias

  6. Cognitive and Unconscious Bias: What It Is and How to Overcome It

    critical thinking and unconscious bias

VIDEO

  1. Critical Thinking Concepts: Status Quo Bias

  2. Types of Unconscious Bias in the Workplace

  3. Say Goodbye to Sleepless Nights ★ Melatonin Release ★ Overcome Stress, Anxiety

  4. Sleep Soundly In 5 Minutes ★ Release All Blockages ★ Emotional And Spiritual Healing

  5. The Curse of Knowledge Bias: Struggles of Highly Intelligent Person

  6. AOM Unconscious Biases In Peer Review
