Confirmation Bias In Psychology: Definition & Examples

Julia Simkus

Editor at Simply Psychology

BA (Hons) Psychology, Princeton University

Julia Simkus is a graduate of Princeton University with a Bachelor of Arts in Psychology. She began a Master's degree in Counseling for Mental Health and Wellness in September 2023. Julia's research has been published in peer-reviewed journals.


Saul Mcleod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul Mcleod, PhD, is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.


Confirmation Bias is the tendency to look for information that supports, rather than rejects, one’s preconceptions, typically by interpreting evidence to confirm existing beliefs while rejecting or ignoring any conflicting data (American Psychological Association).

One of the earliest demonstrations of confirmation bias appeared in an experiment by Peter Wason (1960) in which participants were asked to discover the experimenter’s rule for sequencing numbers.

The results showed that participants proposed tests that supported their hypotheses while avoiding tests that could contradict them, and they quickly became confident in their hypotheses even when those hypotheses were incorrect (Gray, 2010, p. 356).

Though such evidence of confirmation bias has appeared in psychological literature throughout history, the term ‘confirmation bias’ was first used in a 1977 paper detailing an experimental study on the topic (Mynatt, Doherty, & Tweney, 1977).


Biased Search for Information

This type of confirmation bias describes people searching for evidence in a one-sided way to support their hypotheses or theories.

Experiments have shown that people devise tests and questions designed to yield “yes” if their favored hypothesis is true, while ignoring alternative hypotheses that would produce the same answer.

This is also known as the congruence heuristic (Baron, 2000, pp. 162–164). Though the preference for affirmative questions itself may not be biased, experiments have shown that congruence bias does exist.

For Example:

If you were to search “Are cats better than dogs?” on Google, the results would consist largely of sites arguing that cats are better.

However, if you were to search “Are dogs better than cats?” Google would mostly return sites arguing that dogs are better than cats.

This shows that phrasing questions in a one-sided (i.e., affirmative) manner tends to yield evidence consistent with your hypothesis.

Biased Interpretation

This type of bias describes how people interpret evidence in light of their existing beliefs, evaluating confirming evidence differently from evidence that challenges their preconceptions.

Various experiments have shown that people tend not to change their beliefs on complex issues even after being provided with research evidence, because of the way they interpret that evidence.

Additionally, people accept “confirming” evidence more readily while critically scrutinizing “disconfirming” evidence (this is known as disconfirmation bias) (Taber & Lodge, 2006).

When provided with the same evidence, people’s interpretations could still be biased.

For example:

Biased interpretation was shown in an experiment conducted at Stanford University on the topic of capital punishment. It included participants who supported capital punishment and others who opposed it.

All subjects were provided with the same two studies, one appearing to support and one appearing to challenge the deterrent effect of the death penalty.

After reading the detailed descriptions of the studies, participants still held their initial beliefs. They supported their reasoning by citing “confirming” evidence from the studies and rejecting contradictory evidence, or treating it as inferior to the “confirming” evidence (Lord, Ross, & Lepper, 1979).
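To see how biased assimilation can produce polarization, here is a minimal sketch (Python, with made-up weights; a toy illustration, not the model used in the original study). Two agents read an identical, mixed stream of evidence, but each discounts the items that cut against their prior stance:

```python
# Toy model of biased assimilation: identical evidence, diverging beliefs.
def update(belief, evidence, discount=0.2):
    # Congruent evidence gets full weight; incongruent evidence is discounted.
    weight = 1.0 if (evidence > 0) == (belief > 0) else discount
    return belief + weight * evidence

evidence_stream = [+1, -1, +1, -1]   # alternating pro and anti findings
pro, anti = 0.5, -0.5                # opposite initial stances

for e in evidence_stream:
    pro, anti = update(pro, e), update(anti, e)

print(pro, anti)  # 2.1 and -2.1: both agents end up more extreme than they began
```

Although both agents saw exactly the same four findings, the asymmetric weighting pushes their beliefs further apart, mirroring the attitude polarization Lord, Ross, and Lepper observed.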

Biased Memory

To confirm their current beliefs, people may remember or recall information selectively. Psychological theories vary in how they define memory bias.

Some theories state that information confirming prior beliefs is stored in memory while contradictory evidence is not (e.g., schema theory). Others claim that striking information is remembered best (e.g., the humor effect).

Memory confirmation bias also serves a role in stereotype maintenance. Experiments have shown that the mental association between expectancy-confirming information and the group label strongly affects recall and recognition memory.

Though a certain stereotype about a social group might not be true for an individual, people tend to remember the stereotype-consistent information better than any disconfirming evidence (Fyock & Stangor, 1994).

In one experimental study, participants were asked to read a woman’s profile (detailing her extroverted and introverted skills) and assess her suitability for a job as either a librarian or a real-estate salesperson.

Those assessing her as a salesperson better recalled her extroverted traits, while the other group recalled more examples of her introversion (Snyder & Cantor, 1979).

These experiments, along with others, offer insight into selective memory and provide evidence of biased memory, showing that people search for and better remember confirming evidence.


Social Media

The information we are presented with on social media reflects not only what we as users want to see but also the beliefs and values of the platforms’ designers. Today, people are exposed to an overwhelming number of news sources, each varying in credibility.

To form conclusions, people tend to read news that aligns with their perspectives. For instance, news channels present information (even the same news story) differently from each other on complex issues (e.g., racism, political parties), with some using sensational headlines, pictures, and one-sided information.

Because of this biased coverage, people who rely on only certain channels or sites for their information are prone to drawing biased conclusions.

Religious Faith

People also tend to search for and interpret evidence with respect to their religious beliefs (if any).

For instance, on topics such as abortion and transgender rights, people whose religions oppose them will interpret the relevant information differently than others and will look for evidence that validates what they believe.

Similarly, those who reject the theory of evolution on religious grounds will either gather information disproving evolution or hold no official stance on the topic.

Likewise, irreligious people might perceive events that religious people consider “miracles” or “tests of faith” as reinforcing their lack of faith in a religion.

When Does Confirmation Bias Occur?

There are several explanations for why humans exhibit confirmation bias, including that the tendency is an efficient way to process information, protects self-esteem, and minimizes cognitive dissonance.

Information Processing

Confirmation bias serves as an efficient way to process information given the nearly limitless information humans are exposed to.

To form an unbiased decision, one would have to critically evaluate every piece of information available, which is unfeasible. Therefore, people tend to look only for the information they need to form their conclusions (Casad, 2019).

Protect Self-esteem

People are susceptible to confirmation bias in order to protect their self-esteem (to feel that their beliefs are accurate).

To make themselves feel confident, they tend to look for information that supports their existing beliefs (Casad, 2019).

Minimize Cognitive Dissonance

Cognitive dissonance also explains why confirmation bias is adaptive.

Cognitive dissonance is the mental conflict that occurs when a person holds two contradictory beliefs; it causes psychological stress and unease.

To minimize this dissonance, people resort to confirmation bias, avoiding information that contradicts their views and seeking evidence that confirms their beliefs.

Challenge avoidance and reinforcement seeking affect people’s thoughts and reactions differently, since exposure to disconfirming information produces negative emotions that are absent when one seeks reinforcing evidence (“The Confirmation Bias: Why People See What They Want to See”).

Implications

Confirmation bias consistently shapes the way we look for and interpret information that influences our decisions, from our homes to global platforms. This bias prevents people from gathering information objectively.

During election campaigns, people tend to look for information confirming their perspectives on the different candidates while ignoring information that contradicts their views.

This subjective manner of obtaining information can lead to overconfidence in a candidate and to misinterpreting or overlooking important information, thus influencing voting decisions and, eventually, the country’s leadership (Cherry, 2020).

Recruitment and Selection

Confirmation bias also affects employment diversity because preconceived ideas about different social groups can introduce discrimination (though it might be unconscious) and impact the recruitment process (Agarwal, 2018).

Existing beliefs that certain groups are more competent than others are part of the reason why particular races and genders are overrepresented in companies today. This bias can hamper a company’s attempts to diversify its workforce.

Mitigating Confirmation Bias

Change in intrapersonal thought:

To avoid being susceptible to confirmation bias, start questioning your research methods and the sources you use to obtain information.

Expanding the types of sources you consult can expose you to different aspects of a topic and help you gauge each source’s credibility.

  • Read entire articles rather than forming conclusions based on the headlines and pictures.
  • Search for credible evidence presented in the article.
  • Analyze whether the statements being asserted are backed up by trustworthy evidence (tracking the source of evidence can help establish its credibility).
  • Encourage yourself and others to gather information in a conscious manner.

Alternative hypothesis:

Confirmation bias occurs when people tend to look for information that confirms their beliefs or hypotheses, but this bias can be reduced by taking alternative hypotheses and their consequences into account.

Considering the possibility that beliefs or hypotheses other than your own could be true may help you gather information in a more balanced (rather than one-sided) way.

Related Cognitive Biases

Many cognitive biases can be characterized as subtypes of confirmation bias. Two of these subtypes are described below:

Backfire Effect

The backfire effect occurs when people’s preexisting beliefs strengthen when challenged by contradictory evidence (Silverman, 2011).

  • Therefore, disproving a misconception can actually strengthen a person’s belief in that misconception.

One piece of disconfirming evidence does not change people’s views, but a constant flow of credible refutations could correct misinformation/misconceptions.

This effect is considered a subtype of confirmation bias because it explains people’s reactions to new information based on their preexisting hypotheses.

A study by Brendan Nyhan and Jason Reifler (two researchers on political misinformation) explored the effects of different types of statements on people’s beliefs.

While examining two statements, “I am not a Muslim, Obama says” and “I am a Christian, Obama says,” they concluded that the latter statement was more persuasive and produced a change in people’s beliefs, showing that affirmative statements are more effective at correcting incorrect views (Silverman, 2011).

Halo Effect

The halo effect occurs when people use impressions from a single trait to form conclusions about other unrelated attributes. It is heavily influenced by the first impression.

Research on this effect was pioneered by the American psychologist Edward Thorndike, who in 1920 described how officers rated their soldiers on different traits based on first impressions (Neugaard, 2019).

Experiments have shown that when positive attributes are presented first, a person is judged more favorably than when negative traits are shown first. This is a subtype of confirmation bias because it allows us to structure our thinking about other information using only initial evidence.

Learning Check

When does confirmation bias occur?

  • A. When an individual only researches information that is consistent with personal beliefs.
  • B. When an individual only makes a decision after all perspectives have been evaluated.
  • C. When an individual becomes more confident in one’s judgments after researching alternative perspectives.
  • D. When an individual believes that the odds of an event occurring increase if the event hasn’t occurred recently.

The correct answer is A. Confirmation bias occurs when an individual only researches information consistent with personal beliefs. This bias leads people to favor information that confirms their preconceptions or hypotheses, regardless of whether the information is true.

Take-home Messages

  • Confirmation bias is the tendency of people to favor information that confirms their existing beliefs or hypotheses.
  • Confirmation bias happens when a person gives more weight to evidence that confirms their beliefs and undervalues evidence that could disprove them.
  • People display this bias when they gather or recall information selectively or when they interpret it in a biased way.
  • The effect is stronger for emotionally charged issues and for deeply entrenched beliefs.

References

Agarwal, P. (2018, October 19). Here is how bias can affect recruitment in your organisation. Forbes. https://www.forbes.com/sites/pragyaagarwaleurope/2018/10/19/how-can-bias-during-interviewsaffect-recruitment-in-your-organisation

American Psychological Association. (n.d.). APA Dictionary of Psychology. https://dictionary.apa.org/confirmation-bias

Baron, J. (2000). Thinking and deciding (3rd ed.). Cambridge University Press.

Casad, B. (2019, October 9). Confirmation bias. Encyclopedia Britannica. https://www.britannica.com/science/confirmation-bias

Cherry, K. (2020, February 19). Why do we favor information that confirms our existing beliefs? Verywell Mind. https://www.verywellmind.com/what-is-a-confirmation-bias-2795024

Fyock, J., & Stangor, C. (1994). The role of memory biases in stereotype maintenance. British Journal of Social Psychology, 33(3), 331–343.

Gray, P. O. (2010). Psychology. New York: Worth Publishers.

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109.

Mynatt, C. R., Doherty, M. E., & Tweney, R. D. (1977). Confirmation bias in a simulated research environment: An experimental study of scientific inference. Quarterly Journal of Experimental Psychology, 29(1), 85–95.

Neugaard, B. (2019, October 9). Halo effect. Encyclopedia Britannica. https://www.britannica.com/science/halo-effect

Silverman, C. (2011, June 17). The backfire effect. Columbia Journalism Review. https://archives.cjr.org/behind_the_news/the_backfire_effect.php

Snyder, M., & Cantor, N. (1979). Testing hypotheses about other people: The use of historical knowledge. Journal of Experimental Social Psychology, 15(4), 330–342.

Further Information

  • What Is Confirmation Bias and When Do People Actually Have It?
  • Confirmation Bias: A Ubiquitous Phenomenon in Many Guises
  • The importance of making assumptions: why confirmation is not necessarily a bias
  • Decision Making Is Caused By Information Processing And Emotion: A Synthesis Of Two Approaches To Explain The Phenomenon Of Confirmation Bias

Confirmation bias occurs when individuals selectively collect, interpret, or remember information that confirms their existing beliefs or ideas, while ignoring or discounting evidence that contradicts these beliefs.

This bias can happen unconsciously and can influence decision-making and reasoning in various contexts, such as research, politics, or everyday decision-making.

What is confirmation bias in psychology?

Confirmation bias in psychology is the tendency to favor information that confirms existing beliefs or values. People exhibiting this bias are likely to seek out, interpret, remember, and give more weight to evidence that supports their views, while ignoring, dismissing, or undervaluing the relevance of evidence that contradicts them.

This can lead to faulty decision-making because one-sided information doesn’t provide a full picture.


Confirmation Bias: Seeing What We Want to Believe


Confirmation bias is a widely recognized phenomenon and refers to our tendency to seek out evidence in line with our current beliefs and stick to ideas even when the data contradicts them (Lidén, 2023).

Evolutionary and cognitive psychologists agree that we naturally tend to be selective and look for information we already know (Buss, 2016).

This article explores this tendency, how it happens, why it matters, and what we can do to get better at recognizing it and reducing its impact.

Before you continue, we thought you might like to download our three Positive CBT Exercises for free. These science-based exercises will provide you with detailed insight into positive Cognitive-Behavioral Therapy (CBT) and give you the tools to apply it in your therapy or coaching.

This Article Contains

  • Understanding Confirmation Bias
  • Fascinating Confirmation Bias Examples
  • 10 Reasons We Fall for It
  • 10 Steps to Recognizing and Reducing Confirmation Bias
  • How Confirmation Bias Impacts Research
  • Can Confirmation Bias Be Good?
  • Resources From PositivePsychology.com
  • A Take-Home Message

Understanding Confirmation Bias

We can understand the confirmation bias definition as the human tendency “to seek out, to interpret, to favor, and to selectively recall information that confirms beliefs they already hold, while avoiding or ignoring information that disconfirms these beliefs” (Gabriel & O’Connor, 2024, p. 1).

While it has been known and accepted since at least the 17th century that humans are inclined to form and hold on to ideas and beliefs — often tenaciously — even when faced with contradictory evidence, the term “confirmation bias” only became popular in the 1960s with the work of cognitive psychologist Peter Cathcart Wason (Lidén, 2023).

Wason’s (1960) famous 2–4–6 experiment was devised to investigate the nature of hypothesis testing.

Participants were given the numbers 2, 4, and 6 and told the numbers adhered to a rule.

They were then asked to arrive at a hypothesis explaining the sequence and try a new three-number series to test their rule (Wason, 1960; Lidén, 2023).

For example, if a participant thought the second number was twice the first and the third was three times the first, they might suggest the numbers 10, 20, and 30.

However, if another participant thought it was a simple series increasing by two each time, they might suggest 13, 15, and 17 (Wason, 1960; Lidén, 2023).

The actual rule is more straightforward; the numbers are in ascending order. That’s all.

As we typically offer tests that confirm our initial beliefs, both example hypotheses appear to work, even if they are not the answer (Wason, 1960; Lidén, 2023).

The experiment demonstrates our confirmation bias; we seek information confirming our existing beliefs or hypotheses rather than challenging or disproving them (Lidén, 2023).
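The logic of the task is easy to make concrete in code. Below is a minimal sketch (Python; the triples and the participant’s hypothesis are invented for illustration) contrasting a confirmatory test strategy with a single falsifying test:

```python
# Toy version of Wason's 2-4-6 task.
# The true rule (unknown to the participant): the numbers simply ascend.
def true_rule(a, b, c):
    return a < b < c

# The participant's overly specific hypothesis: "each number increases by 2."
def hypothesis(a, b, c):
    return b == a + 2 and c == b + 2

# Confirmatory strategy: test only triples the hypothesis already predicts fit.
for triple in [(2, 4, 6), (10, 12, 14), (31, 33, 35)]:
    print(triple, "fits the rule:", true_rule(*triple))   # True every time

# A single disconfirmatory test settles it: the hypothesis predicts (1, 2, 10)
# should fail, yet the true rule accepts it, so the hypothesis must be wrong.
print(hypothesis(1, 2, 10), true_rule(1, 2, 10))          # False True
```

Every confirming test returns “yes” under both the participant’s rule and the true rule, so no amount of confirmation can distinguish them; only a test the hypothesis predicts should fail can.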

In the decades since, and with developments in cognitive science, we have come to understand that people don’t typically have everything they need, “and even if they did, they would not be able to use all the information due to constraints in the environment, attention, or memory” (Lidén, 2023, p. 8).

Instead, we rely on heuristics. Such “rules of thumb” are easy to apply and fairly accurate, yet they can potentially result in systematic and serious biases and errors in judgment (Lidén, 2023; Eysenck & Keane, 2015).

Confirmation bias in context

Confirmation bias is one of several cognitive biases (Lidén, 2023).

They are important because researchers have recognized that “vulnerability to clinical anxiety and depression depends in part on various cognitive biases” and that mental health treatments such as CBT should support the goals of reducing them (Eysenck & Keane, 2015, p. 668).

Cognitive biases include (Eysenck & Keane, 2015):

  • Attentional bias: attending to threat-related stimuli more than neutral stimuli
  • Interpretive bias: interpreting ambiguous stimuli, situations, and events as threatening
  • Explicit memory bias: the likelihood of retrieving mostly unpleasant thoughts rather than positive ones
  • Implicit memory bias: the tendency to perform better for negative or threatening information on memory tests

Individuals possessing all four biases focus too much on environmental threats, interpret most incidents as concerning, and identify themselves as having experienced mostly unpleasant past events (Eysenck & Keane, 2015).

Similarly, confirmation bias means that individuals give too much weight to evidence that confirms their preconceptions or hypotheses, even incorrect and unhelpful ones. It can lead to poor decision-making because it limits their ability to consider alternative viewpoints or evidence that contradicts their beliefs (Lidén, 2023).

Unsurprisingly, such a negative outlook or bias will lead to unhealthy outcomes, including anxiety and depression (Eysenck & Keane, 2015).

Check out Tali Sharot’s video for a deeper dive.

Fascinating Confirmation Bias Examples

Confirmation bias is commonplace and typically has a low impact, yet there are times when it is significant and newsworthy (Eysenck & Keane, 2015; Lidén, 2023).

Limits of information

In 2005, terrorists detonated four bombs in London (three on the London Underground and one on a bus), killing 52 civilians and injuring around 700. In the chaotic weeks that followed, another attacker failed to detonate a suicide bomb and got away (Lidén, 2023).

Unsurprisingly, a mass hunt was launched to capture the escaped bomber, and many suspects came under surveillance. Yet, the security services made several significant mistakes.

On July 22, 2005, a man living in the same house as two suspects and bearing a resemblance to one of them was shot dead on an Underground train by officers.

“The context with the previous bombings, the available intelligence, and the pre-operation briefings, created expectations that the surveillance team would spot a suicide bomber leaving the doorway” (Lidén, 2023, p. 37).

The wrong man died because the officers involved failed to see the limits of the information available to them at the time.

Witness identification

In 1976, factory worker John Demjanjuk from Cleveland, Ohio, was identified as a Nazi war criminal known as Ivan the Terrible, perpetrator of many killings within prison camps in the Second World War (Lidén, 2023).

Due to the individual’s denial and limited evidence, the case rested on proof of identity via a photo line-up. However, it became known that “Ivan the Terrible” had a round face and was bald.

As the defendant was the only individual who matched the description, he was chosen by all the witnesses (Lidén, 2023).

Whether or not the witnesses were genuinely able to identify the factory worker as the criminal became irrelevant. The case centered around the unfairness of the line-up and the confirmation bias that resulted from the information they had been given (Lidén, 2023).

Years later, in 2012, following continuing challenges to his identity, John Demjanjuk died while an appeal of his conviction in a German court was pending. His identity remained unclear, as the confirmation bias remained (“Ivan the Terrible,” 2024).



10 Reasons We Fall for It

Confirmation bias can significantly impact our own and others’ lives (Lidén, 2023; Kappes et al., 2020).

For that reason, it is helpful to understand why it happens and the psychological factors involved. Research confirms that people (Lidén, 2023; Kappes et al., 2020; Eysenck & Keane, 2015):

  • Don’t like to let go of their initial hypothesis
  • Prefer to use as much information as is initially available, often resulting in a too-specific hypothesis
  • Show more confirmation bias toward their own hypotheses than toward other people’s
  • Are more likely to adopt a confirmation bias when under high cognitive load
  • Are more likely to engage in confirmation bias when they have a lower degree of intelligence (most likely because they are less able to manage higher cognitive loads and see the overall picture)
  • Are more impacted by confirmation bias when they have cognitive impairments
  • Are often unable to actively consider and understand all relevant information needed to challenge the existing hypothesis or form a new one
  • Are influenced by their emotions and motivations and potentially “blinded” to the facts
  • Are biased by existing thoughts and beliefs (sometimes cultural), even if incorrect
  • Are influenced by the beliefs and arguments of those around them

10 Steps to Recognizing and Reducing Confirmation Bias

  • Recognize that confirmation bias exists and understand its impact on decision-making and how you interpret information.
  • Actively seek out and consider different viewpoints, opinions, and sources of information that challenge your existing beliefs and hypotheses.
  • Develop critical thinking skills that evaluate evidence and arguments objectively without favoring preconceived notions or desired outcomes.
  • Be aware of your biases and open to questioning your beliefs and assumptions.
  • Explore alternative explanations or hypotheses that may contradict your initial beliefs or interpretations.
  • Welcome feedback and criticism from others, even if they challenge your ideas; recognize it as an opportunity to learn and grow.
  • Apply systematic and rigorous methods to gather and analyze data, ensuring your conclusions are evidence-based rather than a result of personal biases.
  • Engage in collaborative discussions and debates with individuals with different perspectives to help see other viewpoints and challenge your biases.
  • Continuously seek new information and update your knowledge base to avoid becoming entrenched and support more-informed decision-making.
  • Practice analytical thinking, questioning assumptions, evaluating evidence objectively, and considering alternate explanations.

How Confirmation Bias Impacts Research

As far back as 1968, Karl Popper recognized that falsifiability (being able to prove that something can be incorrect or false) is crucial to all scientific inquiry, impacting researchers’ behavior and experimental outcomes.

As scientists, Popper argued, we should focus on looking for examples of why a theory does not work instead of seeking confirmation of its correctness. More recently, researchers have also considered that when findings suggest a theory is false, it may be due to issues with the experimental design or data accuracy (Eysenck & Keane, 2015).

Yet, confirmation bias has been an issue for a long time in scientific discovery and remains a challenge.

When researchers looked back at the work of Alexander Graham Bell in developing the telephone, they found that, due to confirmation bias, he ignored promising new approaches in favor of his tried-and-tested ones. It ultimately led to Thomas Edison being the first to develop the forerunner of today’s telephone (Eysenck & Keane, 2015).

More recently, a study showed that 88% of professional scientists working on issues in molecular biology responded to unexpected and inconsistent findings by blaming their experimental methods; they ignored the suggestion that they may need to modify, or even replace, their theories (Eysenck & Keane, 2015).

However, when those same scientists changed their approach yet obtained similarly inconsistent results, 61% revisited their theoretical assumptions (Eysenck & Keane, 2015).

Failure to report null research findings is also a problem. It is known as the “file drawer problem” because data remains unseen in the bottom drawer as the researcher does not attempt to get findings published or because journals show no interest in them (Lidén, 2023).

Can Confirmation Bias Be Good?

Researchers have recognized several potential benefits that arise from our natural inclination to seek out confirmation that we are right, including (Peters, 2022; Gabriel & O’Connor, 2024; Bergerot et al., 2023):

  • Assisting in the personal development of individuals by reinforcing their positive self-conceptions and traits
  • Helping individuals shape social structures by persuading others to adopt their viewpoints
  • Supporting increased confidence by reinforcing individuals’ beliefs and ignoring contradictory evidence
  • Contributing to social conformity and stability by reinforcing shared beliefs and values within a group, potentially boosting cooperation and coordination
  • Encouraging decision-making by removing uncertainty and doubt
  • Increasing the knowledge-producing capacity of a group by supporting a deeper exploration of individual members’ perspectives

It’s vital to note that the possible benefits also have their limitations. They potentially favor the individual at the cost of others’ needs while potentially distorting and hindering the formation of well-founded beliefs (Peters, 2022).



Resources From PositivePsychology.com

We have many resources for coaches and therapists to help individuals and groups understand and manage their biases.

Why not download our free 3 Positive CBT Exercises Pack and try out the powerful tools contained within? Some examples include the following:

  • Re-Framing Critical Self-Talk: Self-criticism typically involves judgment and self-blame regarding our shortcomings (real or imagined), such as our inability to accomplish personal goals and meet others’ expectations. In this exercise, we use self-talk to help us reduce self-criticism and cultivate a kinder, compassionate relationship with ourselves.
  • Solution-Focused Guided Imagery: Solution-focused therapy assumes we have the resources required to resolve our issues. Here, we learn how to connect with our strengths and overcome the challenges we face.

Other free resources include:

  • The What-If Bias: We often get caught up in our negative biases, thinking about potentially dire outcomes rather than adopting rational beliefs. This exercise helps us regain a more realistic and balanced perspective.
  • Becoming Aware of Assumptions: We all bring biases into our daily lives, particularly conversations. In this helpful exercise, we picture how things might be in five years to put them into context.

More extensive versions of the following tools are available with a subscription to the Positive Psychology Toolkit©, but they are described briefly below.

  • Increasing Awareness of Cognitive Distortions

Cognitive distortions refer to our biased thinking about ourselves and our environment. This tool helps reduce the effect of the distortions by dismantling them.

  • Step one – Begin by exploring cognitive distortions, such as all-or-nothing thinking, jumping to conclusions, and catastrophizing.
  • Step two – Next, identify the cognitive distortions relevant to your situation.
  • Step three – Reflect on your thinking patterns, how they could harm you, and how you interact with others.
  • Finding Silver Linings

We tend to dwell on the things that go wrong in our lives. We may even begin to think our days are filled with mishaps and disappointments.

Rather than solely focusing on things that have gone wrong, it can help to look on the bright side. Try the following:

  • Step one – Create a list of things that make you feel life is worthwhile, enjoyable, and meaningful.
  • Step two – Think of a time when things didn’t go how you wanted them to.
  • Step three – Reflect on what this difficulty cost you.
  • Step four – Finally, consider what you may have gained from the experience. Write down three positives.

If you’re looking for more science-based ways to help others through CBT, check out this collection of 17 validated positive CBT tools for practitioners. Use them to help others overcome unhelpful thoughts and feelings and develop more positive behaviors.

A Take-Home Message

We can’t always trust what we hear or see because our beliefs and expectations influence so much of how we interact with the world.

Confirmation bias refers to our natural inclination to seek out and focus on what confirms our beliefs, often ignoring anything that contradicts them.

While we have known of its effect for over 200 years, it still receives considerable research focus because of its impact on us individually and as a society, often causing us to make poor decisions and leading to damaging outcomes.

Confirmation bias has several sources and triggers, including our unwillingness to relinquish our initial beliefs (even when incorrect), preference for personal hypotheses, cognitive load, and cognitive impairments.

However, most of us can reduce confirmation bias with practice and training. We can become more aware of such inclinations and seek out challenges or alternate explanations for our beliefs.

It matters because confirmation bias can influence how we work, the research we base decisions on, and how our clients manage their relationships with others and their environments.

We hope you enjoyed reading this article. For more information, don’t forget to download our three Positive CBT Exercises for free.

References

  • Bergerot, C., Barfuss, W., & Romanczuk, P. (2023). Moderate confirmation bias enhances collective decision-making. bioRxiv. https://www.biorxiv.org/content/10.1101/2023.11.21.568073v1.full
  • Buss, D. M. (2016). Evolutionary psychology: The new science of the mind. Routledge.
  • Eysenck, M. W., & Keane, M. T. (2015). Cognitive psychology: A student’s handbook. Psychology Press.
  • Gabriel, N., & O’Connor, C. (2024). Can confirmation bias improve group learning? PhilSci Archive. https://philsci-archive.pitt.edu/20528/
  • Ivan the Terrible (Treblinka guard). (2024). In Wikipedia. https://en.wikipedia.org/wiki/Ivan_the_Terrible_(Treblinka_guard)
  • Kappes, A., Harvey, A. H., Lohrenz, T., Montague, P. R., & Sharot, T. (2020). Confirmation bias in the utilization of others’ opinion strength. Nature Neuroscience, 23(1), 130–137.
  • Lidén, M. (2023). Confirmation bias in criminal cases. Oxford University Press.
  • Peters, U. (2022). What is the function of confirmation bias? Erkenntnis, 87, 1351–1376.
  • Popper, K. R. (1968). The logic of scientific discovery. Hutchinson.
  • Rist, T. (2023). Confirmation bias studies: Towards a scientific theory in the humanities. SN Social Sciences, 3(8).
  • Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129–140.





What Is the Function of Confirmation Bias?

  • Original Research
  • Open access
  • Published: 20 April 2020
  • Volume 87, pages 1351–1376 (2022)


Uwe Peters


Confirmation bias is one of the most widely discussed epistemically problematic cognitions, challenging reliable belief formation and the correction of inaccurate views. Given its problematic nature, it remains unclear why the bias evolved and is still with us today. To offer an explanation, several philosophers and scientists have argued that the bias is in fact adaptive. I critically discuss three recent proposals of this kind before developing a novel alternative, what I call the ‘reality-matching account’. According to the account, confirmation bias evolved because it helps us influence people and social structures so that they come to match our beliefs about them. This can result in significant developmental and epistemic benefits for us and other people, ensuring that over time we don’t become epistemically disconnected from social reality but can navigate it more easily. While that might not be the only evolved function of confirmation bias, it is an important one that has so far been neglected in the theorizing on the bias.


In recent years, confirmation bias (or ‘myside bias’), that is, people’s tendency to search for information that supports their beliefs and ignore or distort data contradicting them (Nickerson 1998; Myers and DeWall 2015: 357), has frequently been discussed in the media, the sciences, and philosophy. The bias has, for example, been mentioned in debates on the spread of “fake news” (Stibel 2018), on the “replication crisis” in the sciences (Ball 2017; Lilienfeld 2017), the impact of cognitive diversity in philosophy (Peters 2019a; Peters et al. forthcoming; Draper and Nichols 2013; De Cruz and De Smedt 2016), the role of values in inquiry (Steel 2018; Peters 2018), and the evolution of human reasoning (Norman 2016; Mercier and Sperber 2017; Sterelny 2018; Dutilh Novaes 2018).

Confirmation bias is typically viewed as an epistemically pernicious tendency. For instance, Mercier and Sperber (2017: 215) maintain that the bias impedes the formation of well-founded beliefs, reduces people’s ability to correct their mistaken views, and makes them, when they reason on their own, “become overconfident” (Mercier 2016: 110). In the same vein, Steel (2018) holds that the bias involves an “epistemic distortion [that] consists of unjustifiably favoring supporting evidence for [one’s] belief, which can result in the belief becoming unreasonably confident or extreme” (897). Similarly, Peters (2018) writes that confirmation bias “leads to partial, and therewith for the individual less reliable, information processing” (15).

The bias is not only taken to be epistemically problematic, but also thought to be a “ubiquitous” (Nickerson 1998: 208), “built-in feature of the mind” (Haidt 2012: 105), found in both everyday and abstract reasoning tasks (Evans 1996), independently of subjects’ intelligence, cognitive ability, or motivation to avoid it (Stanovich et al. 2013; Lord et al. 1984). Given its seemingly dysfunctional character, the apparent pervasiveness of confirmation bias raises a puzzle: If the bias is indeed epistemically problematic, why is it still with us today? By definition, dysfunctional traits should be more prone to extinction than functional ones (Nickerson 1998). Might confirmation bias be or have been adaptive?

Some philosophers are optimistic, arguing that the bias has in fact significant advantages for the individual, groups, or both (Mercier and Sperber 2017; Norman 2016; Smart 2018; Peters 2018). Others are pessimistic. For instance, Dutilh Novaes (2018) maintains that confirmation bias makes subjects less able to anticipate other people’s viewpoints, and so, “given the importance of being able to appreciate one’s interlocutor’s perspective for social interaction”, is “best not seen as an adaptation” (520).

In the following, I discuss three recent proposals of the adaptationist kind, mention reservations about them, and develop a novel account of the evolution of confirmation bias that challenges a key assumption underlying current research on the bias, namely that the bias thwarts reliable belief formation and truth tracking. The account holds that while searching for information supporting one’s pre-existing beliefs and ignoring contradictory data is disadvantageous when what one takes to be reality is and stays different from what one believes it to be, it is beneficial when, as the result of one’s processing information in that way, that reality is changed so that it matches one’s beliefs. I call this process reality matching and contend that it frequently occurs when the beliefs at issue are about people and social structures (i.e., relationships between individuals, groups, and socio-political institutions). In these situations, confirmation bias is highly effective for us to be confident about our beliefs even when there is insufficient evidence or subjective motivation available to us to support them. This helps us influence and ‘mould’ people and social structures so that they fit our beliefs, which is an adaptive property of confirmation bias. It can result in significant developmental and epistemic benefits for us and other people, ensuring that over time we don’t become epistemically disconnected from social reality but can navigate it more easily.

I shall not argue that the adaptive function of confirmation bias that this reality-matching account highlights is the only evolved function of the bias. Rather, I propose that it is one important function that has so far been neglected in the theorizing on the bias.

In Sects. 1 and 2, I distinguish confirmation bias from related cognitions before briefly introducing some recent empirical evidence supporting the existence of the bias. In Sect. 3, I motivate the search for an evolutionary explanation of confirmation bias and critically discuss three recent proposals. In Sects. 4 and 5, I then develop and support the reality-matching account as an alternative.

1 Confirmation Bias and Friends

The term ‘confirmation bias’ has been used to refer to various distinct ways in which beliefs and expectations can influence the selection, retention, and evaluation of evidence (Klayman 1995; Nickerson 1998). Hahn and Harris (2014) offer a list of them including four types of cognitions: (1) hypothesis-determined information seeking and interpretation, (2) failures to pursue a falsificationist strategy in contexts of conditional reasoning, (3) a resistance to change a belief or opinion once formed, and (4) overconfidence or an illusion of validity of one’s own view.

Hahn and Harris note that while all of these cognitions have been labeled ‘confirmation bias’, (1)–(4) are also sometimes viewed as components of ‘motivated reasoning’ (or ‘wishful thinking’) (ibid: 45), i.e., information processing that leads people to arrive at the conclusions they favor (Kunda 1990). In fact, as Nickerson (1998: 176) notes, confirmation bias comes in two different flavors: “motivated” and “unmotivated” confirmation bias. And the operation of the former can be understood as motivated reasoning itself, because it too involves partial information processing to buttress a view that one wants to be true (ibid). Unmotivated confirmation bias, however, operates when people process data in one-sided, partial ways that support their predetermined views no matter whether they favor them. So confirmation bias is also importantly different from motivated reasoning, as it can take effect in the absence of a preferred view and might lead one to support even beliefs that one wants to be false (e.g., when one believes the catastrophic effects of climate change are unavoidable; Steel 2018).

Despite overlapping with motivated reasoning, confirmation bias can thus plausibly be (and typically is) construed as a distinctive cognition. It is thought to be a subject’s largely automatic and unconscious tendency to (i) seek support for her pre-existing, favored or not favored beliefs and (ii) ignore or distort information compromising them (Klayman 1995: 406; Nickerson 1998: 175; Myers and DeWall 2015: 357; Palminteri et al. 2017: 14). I here endorse this standard, functional concept of confirmation bias.

2 Is Confirmation Bias Real?

Many psychologists hold that the bias is a “pervasive” (Nickerson 1998: 175; Palminteri et al. 2017: 14), “ineradicable” feature of human reasoning (Haidt 2012: 105). Such strong claims are problematic, however. For there is evidence that, for instance, disrupting the fluency in information processing (Hernandez and Preston 2013) or priming subjects for distrust (Mayo et al. 2014) reduces the bias. Moreover, some researchers have recently re-examined the relevant studies and found that confirmation bias is in fact less common and the evidence of it less robust than often assumed (Mercier 2016; Whittlestone 2017). These researchers grant, however, the weaker claim that the bias is real and often, in some domains more than in others, operative in human cognition (Mercier 2016: 100, 108; Whittlestone 2017: 199, 207). I shall only rely on this modest view here. To motivate it a bit more, consider the following two studies.

Hall et al. (2012) gave their participants (N = 160) a questionnaire, asking them about their opinion on moral principles such as ‘Even if an action might harm the innocent, it can still be morally permissible to perform it’. After the subjects had indicated their view using a scale ranging from ‘completely disagree’ to ‘completely agree’, the experimenter performed a sleight of hand, inverting the meaning of some of the statements so that the question then read, for instance, ‘If an action might harm the innocent, then it is not morally permissible to perform it’. The answer scales, however, were not altered. So if a subject had agreed with the first claim, she then agreed with the opposite one. Surprisingly, 69% of the study participants failed to detect at least one of the changes. Moreover, they subsequently tended to justify positions they thought they held despite just having chosen the opposite. Presumably, subjects accepted that they favored a particular position, didn’t know the reasons, and so were now looking for support that would justify their position. They displayed a confirmation bias.

Using a similar experimental set-up, Trouche et al. (2016) found that subjects also tend to exhibit a selective ‘laziness’ in their critical thinking: they are more likely to avoid raising objections to their own positions than to other people’s. Trouche et al. first asked their test participants to produce arguments in response to a set of simple reasoning problems. Directly afterwards, they had them assess other subjects’ arguments concerning the same problems. About half of the participants didn’t notice that, by the experimenter’s intervention, in some trials they were in fact presented with their own arguments again; the arguments appeared to these participants as if they were someone else’s. Furthermore, more than half of the subjects who believed they were assessing someone else’s arguments now rejected those that were in fact their own, and were more likely to do so for invalid than for valid ones. This suggests that subjects are less critical of their own arguments than of other people’s, indicating that confirmation bias is real and perhaps often operative when we are considering our own claims and arguments.

3 Evolutionary Accounts of the Bias

Confirmation bias is typically taken to be epistemically problematic, as it leads to partial and therewith for the individual less reliable information processing and contributes to failures in, for instance, perspective-taking, with clear costs for social and other types of cognition (Mercier and Sperber 2017: 215; Steel 2018; Peters 2018; Dutilh Novaes 2018). Prima facie, the bias thus seems maladaptive.

But then why does it still exist? Granted, even if the bias isn’t an adaptation, we might still be able to explain why it is with us today. We might, for instance, argue that it is a “spandrel”, a by-product of the evolution of another trait that is an adaptation (Gould and Lewontin 1979). Or we may abandon the evolutionary approach to the bias altogether and hold that it emerged by chance.

However, evolutionary explanations of psychological traits are often fruitful. They can create new perspectives on these traits that may allow developing means to reduce the traits’ potential negative effects (Roberts et al. 2012; Johnson et al. 2013). Evolutionary explanations might also stimulate novel, testable predictions that researchers who aren’t evolutionarily minded would overlook (Ketelaar and Ellis 2000; De Bruine 2009). Moreover, they typically involve integrating diverse data from different disciplines (e.g., psychology, biology, anthropology, etc.), and thereby contribute to the development of a more complete understanding of the traits at play and human cognition in general (Tooby and Cosmides 2015). These points equally apply when it comes to considering the origin of confirmation bias. They provide good reasons for searching for an evolutionary account of the bias.

Different proposals can be discerned in the literature. I will discuss three recent ones, what I shall call (1) the argumentative-function account, (2) the group-cognition account, and (3) the intention-alignment account. I won’t offer conclusive arguments against them here. The aim is just to introduce some reservations about these proposals to motivate the exploration of an alternative.

3.1 The Argumentative-Function Account

Mercier and Sperber (2011, 2017) hold that human reasoning didn’t evolve for truth tracking but for making us better at convincing other people and evaluating their arguments so as to be convinced only when their points are compelling. In this context, when persuasion is paramount, the tendency to look for material supporting our preconceptions and to discount contradictory data allows us to accumulate argumentative ammunition, which strengthens our argumentative skill, Mercier and Sperber maintain. They suggest that confirmation bias thus evolved to “serve the goal of convincing others” (2011: 63).

Mercier and Sperber acknowledge that the bias also hinders us in anticipating objections, which should make it more difficult for us to develop strong, objection-resistant arguments (2017: 225f). But they add that it is much less cognitively demanding to react to objections than to anticipate them, because objections might depend on particular features of one’s opponents’ preferences or on information that only they have access to. It is thus more efficient to be ‘lazy’ in anticipating criticism and let the audience make the moves, Mercier and Sperber claim.

There is reason to be sceptical about their proposal, however. For instance, an anticipated objection is likely to be answered more convincingly than an immediate response from one’s audience. After all, “forewarned is forearmed”; it gives a tactical advantage (e.g., more time to develop a reply) (Sterelny 2018: 4). And even if it is granted that objections depend on private information, they also often derive from obvious interests and public knowledge, making an anticipation of them easy (ibid). Moreover, as Dutilh Novaes (2018: 519) notes, there is a risk of “looking daft” when producing poor arguments, say, due to laziness in scrutinizing one’s thoughts. Since individuals within their social groups depend on their reputation so as to find collaborators, anticipating one’s audience’s responses should be and have been more adaptive than having a confirmation bias (ibid). If human reasoning emerged for argumentative purposes, the existence of the bias remains puzzling.

3.2 The Group-Cognition Account

Even if confirmation bias is maladaptive for individuals, it might still be adaptive for groups. For instance, Smart (2018) and Peters (2018) hold that in groups with a sufficient degree of cognitive diversity at the outset of solving a particular problem, each individual’s confirmation bias might help the group as a whole conduct a more in-depth analysis of the problem space than otherwise. When each subject is biased towards a different particular proposal on how to solve the problem, the bias will push them to invest greater effort in defending their favored proposals and might, in the light of counterevidence, motivate them to consider rejecting auxiliary assumptions rather than the proposals themselves. This contributes to a thorough exploration of them that is less likely with less committed thinkers. Additionally, since individuals appear to have a particular strength in detecting flaws in others’ arguments (Trouche et al. 2016), open social criticism within the group should ensure that the group’s conclusions remain reliable even if some, or at times most, of its members are led astray by their confirmation bias (Smart 2018: 4190; Peters 2018: 20).

Mercier and Sperber (2011: 65) themselves already float the idea of such a social “division of cognitive labor”. They don’t yet take its group-level benefits to explain why confirmation bias evolved, however (Dutilh Novaes 2018: 518f). Smart (2018) and Peters (2018) also don’t introduce their views as accounts of the evolved function of the bias. But Dutilh Novaes (2018: 519) and Levy (2019: 317) gesture toward, and Smith and Wald (2019) make the case for, an evolutionary proposal along these lines, arguing that the bias was selected for making a group’s inquiry more thorough, effective, and reliable.

While I have sympathies with this proposal, several researchers have noted that the concept of ‘group selection’ is problematic (West et al. 2007; Pinker 2012). One of the issues is that since individuals reproduce faster than groups, a trait T that is an adaptation that is good for groups but bad for an individual’s fitness won’t spread, because the rate of proliferation of groups is undermined by the evolutionary disadvantage of T within groups (Pinker 2012). The point equally applies to the proposal that confirmation bias was selected for its group-level benefits.

Moreover, a group arguably only benefits from each individual’s confirmation bias if there is a diversity of viewpoints in the group and members express their views, as otherwise “group polarization” is likely to arise (Myers and Lamm 1976): arguments for shared positions will accumulate without being criticized, making the group’s average opinion more extreme and less reliable, which is maladaptive. Crucially, ancestral ‘hunter-gatherer’ groups are perhaps unlikely to have displayed a diversity of viewpoints. After all, their members traveled less, interacted less with strangers, and were less economically dependent on other groups (Simpson and Beckes 2010: 37). This should have homogenized them with respect to race, culture, and background (Schuck 2001: 1915). Even today groups often display such homogeneity, as calls for diversity in academia, companies, etc. indicate. These points provide reasons to doubt that ancestral groups provided the kind of conditions in which confirmation bias could have produced the benefits that the group-cognition account highlights rather than maladaptive effects tied to group polarization.

3.3 The Intention–Alignment Account

Turning to a third and final extant proposal on the evolution of confirmation bias, Norman (2016) argues that human reasoning evolved for facilitating an “intention alignment” between individuals: in social interactions, reasons typically ‘overwrite’ nonaligned mental states (e.g., people’s divergent intentions or beliefs) with aligned ones by showing the need for changing them. Norman holds that human reasoning was selected for this purpose because it makes cooperation easier. He adds that, in this context, “confirmation bias would have facilitated intention alignment, for a tribe of hunter-gatherers prone to [the bias] would more easily form and maintain the kind of shared outlook needed for mutualistic collaboration. The mythologies and ideologies taught to the young would accrue confirming evidence and tend to stick, thereby cementing group solidarity” (2016: 700). Norman takes his view to be supported by the “fact that confirmation bias is especially pronounced when a group’s ideological preconceptions are at stake” (ibid).

However, the proposal seems at odds with the finding that the bias inclines subjects to ignore or misconstrue their opponents’ objections. In fueling one-sided information processing in support of one’s own view, the bias makes people less able to anticipate and adequately respond to their interlocutor’s point of view (Dutilh Novaes 2018: 520). Due to that effect, the bias arguably makes an intention alignment with others (especially with one’s opponents) harder, not easier. Moreover, since our ancestral groups are (as noted above) likely to have been largely viewpoint-homogeneous, in supporting intention alignment in these social environments, confirmation bias would again have facilitated group polarization, which is prima facie evolutionarily disadvantageous.

All three proposals about the adaptive role of confirmation bias considered so far thus face open questions. While the points mentioned aren’t meant to be fatal for the proposals and might be answerable within their frameworks, they do provide a motivation to explore an alternative.

4 Towards an Alternative

The key idea that I want to develop is the following. Confirmation bias is typically taken to work against an individual’s truth tracking (Mercier and Sperber 2017: 215; Peters 2018: 15), and indeed searching for information supporting one’s beliefs and ignoring contradictory data is epistemically disadvantageous when what one takes to be reality is and stays different from what one believes it to be. However, reality doesn’t always remain unchanged when we form beliefs about it. Consider social beliefs, that is, beliefs about people (oneself, others, and groups) and social structures (i.e., relationships between individuals, groups, and socio-political institutions). I shall contend that a confirmation bias pertaining to social beliefs reinforces our confidence in these beliefs, thereby strengthening our tendency to behave in ways that change reality so that it corresponds to the beliefs, turning them (when they are initially inaccurate) into self-fulfilling prophecies (SFPs) (Merton 1948; Biggs 2009). Due to its role in helping us make social reality match our beliefs, confirmation bias is adaptive, or so I will argue. I first introduce examples of SFPs of social beliefs. Then I explore the relevance of these beliefs in our species, before making explicit the adaptive role of confirmation bias in facilitating SFPs.

4.1 Social Beliefs and SFPs

Social beliefs often lead to SFPs with beneficial outcomes. Here are four examples.

S falsely believes he is highly intelligent. His self-view motivates him to engage with intellectuals, read books, attend academic talks, etc. This makes him increasingly intelligent, gradually confirming his initially inaccurate self-concept (for relevant empirical data, see Swann 2012).

Without a communicative intention, a baby boy looking at a kitten produces a certain noise: ‘ma-ma’. His mother is thrilled, believing (falsely) that he is beginning to talk and wants to call her. She responds accordingly, rushing to him, attending to him, and indicating excitement. This leads the boy to associate ‘ma-ma’ with the arrival and attention of his mother. And so he gradually begins using the sounds to call her, confirming her initially false belief about his communicative intention (for relevant empirical data, see Mameli 2001).

A father believes his adolescent daughter doesn’t regularly drink alcohol, but she does. He acts in line with his belief and expresses it in communication with other people. His daughter notices and likes his positive view of her, which motivates her to increasingly resist drinks, gradually fulfilling her father’s optimistic belief about her (for relevant empirical data, see Willard et al. 2008).

A teacher (falsely) believes that a student’s current academic performance is above average. She thus gives him challenging material, encourages him, and communicates high expectations. This leads the student to increase his efforts, which gradually results in above-average academic performance (for relevant evidence, see Madon et al. 1997).

SFPs of initially false positive trait ascriptions emerge in many other situations too. They also occurred, for instance, when adults ascribed to children traits such as being tidy (Miller et al. 1975), charitable (Jensen and Moore 1977), or cooperative (Grusec et al. 1978). Similarly, in adults, attributions of, for example, kindness (Murray et al. 1996), eco-friendliness (Cornelissen et al. 2007), military competence (Davidson and Eden 2000), athletic ability (Solomon 2016), and even physiological changes (Turnwald et al. 2018) have all had self-fulfilling effects. Moreover, these effects don’t necessarily take much time to unfold but can happen swiftly in a single interaction (e.g., in interview settings; Word et al. 1974) right after the ascription (Turnwald et al. 2018: 49).

SFPs are, however, neither pervasive nor all-powerful (Jussim 2012), and there are various conditions for them to occur (Snyder and Klein 2007). For instance, they tend to occur only when targets are able to change in accordance with the trait ascriptions, when the latter are believable rather than unrealistic (Alfano 2013: 91f), and when the ascriber holds more power than the ascribee (Copeland 1994: 264f). But comprehensive literature reviews confirm that SFPs are “real, reliable, and occasionally quite powerful” (Jussim 2017: 8; Willard and Madon 2016).

4.2 The Distribution of Social Beliefs and Role of Prosociality in Humans

Importantly, SFPs can be pernicious when the beliefs at their center capture negative social conceptions, for instance, stereotypes, anxious expectations, fear, or hostility (Darley and Gross 1983; Downey et al. 1998; Madon et al. 2018). In these cases, SFPs would be maladaptive. Given this, what do we know about the distribution of social beliefs in general, and positive ones in particular, in ancestral human groups?

Many researchers hold that our evolutionary success as a species rests on our being “ultra-social” and “ultra-cooperative” animals (e.g., Tomasello 2014: 187; Henrich 2016). Human sociality is “spectacularly elaborate, and of profound biological importance” because “our social groups are characterized by extensive cooperation and division of labour” (Sterelny 2007: 720). Since we live in an almost continuous flow of interactions with conspecifics, “solving problems of coordination with our fellows is [one of] our most pressing ecological tasks” (Zawidzki 2008: 198). A significant proportion of our beliefs are thus likely to be social ones (Tomasello 2014: 190f).

Moreover, when it comes to oneself, to group or “tribe” members, and to collaborators, these beliefs often capture positive to overly optimistic ascriptions of traits (e.g., communicativeness, skills, etc.; Simpson and Beckes 2010). This is well established for beliefs about oneself (about 70% of the general population has a positive self-conception; Talaifar and Swann 2017: 4) and about one’s family members (Wenger and Fowers 2008). The assumption that the point also holds for ‘tribe’ members and collaborators more generally receives support from the “tribal-instincts hypothesis” (Richerson and Boyd 2001), which holds that humans tend to harbor “ethnocentric attitudes in favor of [their] own tribe along with its members, customs, values and norms”, as this facilitates social predictability and cooperation (Kelly 2013: 507). For instance, in the past as much as today, humans “talk differently about their in-groups than their out-groups, such that they describe the in-group and its members [but not out-groups] as having broadly positive traits” (Stangor 2011: 568). In subjects with such ‘tribal instincts’, judgments about out-group members might easily be negative. But within these subjects’ groups, among in-group members, overly optimistic, cooperation-enhancing conceptions of others should have been dominant, particularly during “intergroup conflict, [which] is undeniably pervasive across human societies” (McDonald et al. 2012: 670). Indeed, such conflicts are known to fuel in-group “glorification” (Leidner et al. 2010; Golec De Zavala 2011).

Given these points, in ‘ultra-cooperative’ social environments in which ‘tribe’ members held predominantly positive social conceptions and expectations about in-group subjects, positive SFPs should have been overall more frequent and stronger than negative ones. Indeed, there is evidence that even today, positive SFPs in individual, dyadic interactions are more likely and pronounced than negative ones (see Footnote 4). For instance, focusing on mothers’ beliefs about their sons’ alcohol consumption, Willard et al. (2008) found that children “were more susceptible to their mothers’ positive than negative self-fulfilling effects” (499): “mothers’ false beliefs buffered their adolescents against increased alcohol use rather than putting them at greater risk” (Willard and Madon 2016: 133). Similarly, studies found that “teachers’ false beliefs raised students’ achievement more than they lowered it” (Willard and Madon 2016: 118): teacher overestimates “increase[d] achievement more than teacher underestimates tended to decrease achievement among students” (Madon et al. 1997: 806). Experiments with stigmatized subjects corroborate these results further (ibid), leading Jussim (2017) in his literature review to conclude that high teacher expectations help students “more than low expectations harm achievement” (8).

One common explanation of this asymmetry is that SFPs typically depend on whether the targets of the trait ascriptions accept the expectations imposed on them via the ascriptions (Snyder and Klein 2007). And since subjects tend to strive to think well of themselves (Talaifar and Swann 2017), they respond more to positive than negative expectations (Madon et al. 1997: 792). If we combine these considerations with the assumption that in ancestral groups of heavily interdependent subjects, positive social beliefs about in-group members (in-group favoritism) are likely to have been more prevalent than negative ones, then there is reason to hold that the SFPs of the social conceptions in these groups were more often than not adaptive. With these points in mind, it is time to return to confirmation bias.

4.3 From SFPs to Confirmation Bias

Notice that SFPs depend on trait or mental-state ascriptions that are ‘ahead’ of their own truth: they are formed when an objective assessment of the available evidence doesn’t yet support their truth. Assuming direct doxastic voluntarism is false (Matheson and Vitz 2014), how can they nonetheless be formed and confidently maintained?

I suggest that confirmation bias plays an important role: it allows subjects to become and remain convinced about their social beliefs (e.g., trait ascriptions) when the available evidence doesn’t yet support their truth. This makes SFPs of these beliefs more likely than if the ascriber merely verbally attributed the traits without committing to the truth of the ascriptions, or believed in them but readily revised the beliefs. I shall argue that this is in fact adaptive not only when it comes to positive trait ascriptions, but also to negative ones. I will illustrate the point first with respect to positive trait ascriptions.

4.3.1 Motivated Confirmation Bias and Positive Trait Ascriptions

Suppose that you ascribe a positive property T to a subject A, who is your ward, but (unbeknownst to you) the available evidence doesn’t yet fully support that ascription. The more convinced you are about your view of A even in the light of counterevidence, the better you are at conveying your conviction to A because, generally, “people are more influenced [by others] when [these] others express judgments with high confidence than low confidence” (Kappes et al. 2020: 1; von Hippel and Trivers 2011). Additionally, the better you are at conveying to A your conviction that he has T, the more confident he himself will be that he has that trait (assuming he trusts you) (Sniezek and Van Swol 2001). Crucially, if A too is confident that he has T, he will be more likely to conform to the corresponding expectations than if he doesn’t believe the ascription, say, because he notices that you only say but don’t believe that he has T. Relatedly, the more convinced you are about your trait ascription to A, the clearer your signaling of the corresponding expectations to A in your behavior (Tormala 2016) and the higher the normative impetus on him, as a cooperative subject, to conform so as to avoid disrupting interactions with you.

Returning to confirmation bias, given what we know about the cognitive effect of the bias, the more affected you are by it, the stronger your belief in your trait ascriptions to A (Rabin and Schrag 1999), and so the lower the likelihood that you will reveal in your behavior a lack of conviction that could undermine SFPs. Thus, the more affected you are by the bias, the higher the likelihood of SFPs of the ascriptions, because conviction about the ascriptions plays a key facilitative role for SFPs. This is also experimentally supported. For several studies found that SFPs of trait ascriptions occurred only when ascribers were certain of the ascriptions, not when they were less confident (Swann and Ely 1984; Pelham and Swann 1994; Swann 2012: 30). If we add to these points that, in developmental and educational contexts in ancestral tribal groups, SFPs of trait ascriptions were more often beneficial for their targets than not, then there is a basis for holding that confirmation bias might in fact have been selected for sustaining SFPs.
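The confidence-amplifying effect attributed to the bias here can be illustrated with a small simulation in the spirit of Rabin and Schrag’s (1999) model (a simplified sketch; the parameter values are my assumptions, and the details of their model differ). An agent receives noisy signals about which of two states holds; with probability q, a signal contradicting her current lean is misperceived as confirming it, and she updates like a Bayesian on what she perceived:

```python
import random

def posterior_A(n_a, n_b, theta=0.6):
    """P(state = A | perceived signal counts), flat prior;
    signals are correct with probability theta."""
    like_a = theta ** n_a * (1 - theta) ** n_b
    like_b = (1 - theta) ** n_a * theta ** n_b
    return like_a / (like_a + like_b)

def final_confidence(q, theta=0.6, n_signals=30, seed=7):
    """q: probability that a lean-contradicting signal is misread."""
    rng = random.Random(seed)
    n_a = n_b = 0
    for _ in range(n_signals):
        sig = 'a' if rng.random() < theta else 'b'  # true state is A
        lean_A = posterior_A(n_a, n_b, theta) > 0.5
        if lean_A and sig == 'b' and rng.random() < q:
            sig = 'a'  # counterevidence misread as support
        elif not lean_A and sig == 'a' and rng.random() < q:
            sig = 'b'
        n_a, n_b = (n_a + 1, n_b) if sig == 'a' else (n_a, n_b + 1)
    return posterior_A(n_a, n_b, theta)

print(f"unbiased agent (q=0.0): P(A) = {final_confidence(0.0):.3f}")
print(f"biased agent   (q=0.5): P(A) = {final_confidence(0.5):.3f}")
```

The biased agent typically ends up far more confident than the unbiased one given comparable evidence; and, as Rabin and Schrag show, with unlucky early draws the same mechanism can lock her onto the false state with high confidence.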

Notice that the argument so far equally applies to motivated reasoning. This is to be expected because, as mentioned above, motivated confirmation bias is an instance of motivated reasoning (Nickerson 1998). To pertain specifically to confirmation bias, however, the evolutionary proposal that the bias was selected for facilitating SFPs of social conceptions also has to hold for unmotivated confirmation bias. Is this the case?

4.3.2 Unmotivated Confirmation Bias and Negative Trait Ascriptions

Notice that when we automatically reinforce any of our views, no matter whether we favor them, then our preferences are neither required for the reinforcement process nor able to undermine it or the SFPs it promotes. This means that such a general tendency, i.e., a confirmation bias, can fulfil the function of facilitating SFPs more frequently than motivated cognitions can, namely whenever the subject has acquired a social conception (e.g., as the result of upbringing, learning, or testimony). This is adaptive for at least three reasons.

First, suppose that as a parent, caretaker, or teacher you (unknowingly) wishfully believe that A, who is your ward, has a positive trait T. You tell another subject (B) that A has T, and, on your testimony, B subsequently believes this too. But suppose that unlike you, B has no preference as to whether A has T. Yet, as it happens, she still has a confirmation bias toward her beliefs. Just like you, B will now process information so that it strengthens her view about A. This increases her conviction in, and so the probability of an SFP of, the trait ascription to A, because now both you and B are more likely to act toward A in ways indicating ascription-related expectations. As a general tendency to support any of one’s beliefs rather than only favored ones, the bias thus enables a social ‘ripple’ effect in the process of making trait ascriptions match reality. Since this process is, in ultra-social and ultra-cooperative groups, more often than not adaptive (e.g., boosting the development of a positive trait in A), in facilitating a social extension of it, confirmation bias is adaptive too.

Secondly, in ancestral groups, many of the social conceptions (e.g., beliefs about social roles, gender norms, stereotypes, etc.) that subjects unreflectively acquired during their upbringing and socialization will have been geared toward preserving the group’s function and status quo and aligning individuals with them (Sterelny 2006: 148). Since it can operate independently of a subject’s preferences, a confirmation bias in each member of the group would have helped the group enlist each of its members for reproducing social identities, social structures, traits, and roles in the image of the group’s conceptions, even when these individuals disfavored them. In sustaining SFPs of these conceptions, which might have included various stereotypes or ethnocentric, prejudicial attitudes that we today consider offensive negative trait ascriptions (e.g., gender or racial stereotypes) (Whitaker et al. 2018), confirmation bias would have been adaptive in the past. For, as Richerson and Boyd (2005: 121f) note too, in ancestral groups, selection pressure favored social conformity, predictability, and stability. That confirmation bias might have evolved for facilitating SFPs that serve the ‘tribal’ collective, possibly even against the preference, autonomy, and better judgment of the individual, is in line with recent research suggesting that many uniquely human features of cognition evolved through pressures selecting for the ability to conform to other people and to facilitate social projects (Henrich 2016). These features may work against common ideals associated with self-reliance or “achieving basic personal autonomy, because the main purpose of [them] is to allow us to fluidly mesh with others, making us effective nodes in larger networks” (Kelly and Hoburg 2017: 10). I suggest that confirmation bias too was selected for making us effective ‘nodes’ in social networks by inclining us to create social reality that corresponds to these networks’ conceptions even when we dislike them or they are harmful to others (e.g., out-group members).

Thirdly, in helping us make social affairs match our beliefs about them even when we don’t favor those beliefs, confirmation bias also provides us with significant epistemic benefits in social cognition. Consider Jack and Jill. Both have just seen an agent A act ambiguously, and both have formed a first impression of A according to which A is acting the way he is because he has trait T. Suppose neither Jack nor Jill has any preference as to whether A has that trait, but they subsequently process information in two different ways. Jack does not have a confirmation bias but impartially assesses the evidence and swiftly revises his beliefs when encountering contradictory data. As it happens, A’s behavior soon does provide him with just such evidence, leading him to abandon his first impression of A and reopen the search for an explanation of A’s action. In contrast, Jill does have a confirmation bias with respect to her beliefs and interprets the available evidence so that it supports them. Jill too sees A act in a way that contradicts her first impression of him. But unlike Jack, she doesn’t abandon her view. Rather, she reinterprets A’s action so that it bolsters her view. Whose information processing might be more adaptive? For Jack, encountering data challenging his view removes certainty and initiates a new cycle of computations about A, which requires him to postpone a possible collaboration with A. For Jill, however, the new evidence strengthens her view, leading her to keep the issue of explaining A’s action settled and to be ready to collaborate with him. Jack’s approach might still seem better for attaining an accurate view of A and predicting what he’ll do next. But suppose Jill confidently signals to A her view of him in her behavior. Since people have a general inclination to fulfil others’ expectations (especially positive ones) out of an interest in coordinating and getting along with them (Dardenne and Leyens 1995; Bacharach et al. 2007), when A notices Jill’s conviction that he displays T, he too is likely to conform, which provides Jill with a correct view of what he will do next. Jill’s biased processing is thus more adaptive than Jack’s approach: a confirmation bias provides her with certainty and simpler information processing that simultaneously facilitates accurate predictions (via contributing to SFPs). Generalizing from Jill: in everyday social interactions, we all form swift first impressions of others without having any particular preference with respect to these impressions either way. Assuming that confirmation bias operates on them nonetheless, the bias will frequently be adaptive in the ways just mentioned.
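The Jack–Jill comparison can be put in deliberately stylized numbers (an illustration of the argument’s structure only; the conformity function and its values are my assumptions). Suppose the target’s chance of acting in line with trait T rises with how confidently the ascriber signals the corresponding expectation:

```python
# Stylized rendering of the Jack/Jill comparison (illustrative only).

def conformity_prob(confidence, base=0.3, gain=0.5):
    """Assumed linear response of the target to signaled expectations."""
    return base + gain * confidence

# Jack revises on the ambiguous counterevidence; his confidence collapses.
# Jill reinterprets it to fit her first impression; hers stays high.
for name, confidence in [("Jack", 0.2), ("Jill", 0.9)]:
    p_conform = conformity_prob(confidence)
    predicts_T = confidence > 0.5
    p_correct = p_conform if predicts_T else 1 - p_conform
    print(f"{name}: confidence that A has T = {confidence:.1f}, "
          f"P(prediction about A correct) = {p_correct:.2f}")
```

On these assumptions, Jill’s prediction is correct more often than Jack’s (0.75 vs. 0.60) precisely because her confidence partly produces the behavior she predicts.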

4.3.3 Summing Up: The Reality-Matching Account

By helping subjects make social reality match their beliefs about it, no matter whether they favor these beliefs or whether the beliefs are sufficiently supported by evidence, confirmation bias is adaptive: when the bias targets positive social beliefs and trait ascriptions, it serves both the subject and the group by producing effects that (1) assist them in their development (to become, e.g., more communicative, cooperative, or knowledgeable) and (2) make social cognition more tractable (by increasing social conformity and predictability). To be sure, when it targets negative trait ascriptions (pernicious stereotypes, etc.), the bias can have ethically problematic SFP effects. But, as noted, especially in ancestral ‘tribal’ groups, it would perhaps still have contributed to social conformity, predictability, and sustaining the status quo, which would have been adaptive in these groups (Richerson and Boyd 2005), inter alia by facilitating social cognition. Taken together, these considerations provide a basis for holding that confirmation bias was selected for promoting SFPs. I shall call the proposal introduced in this section the reality-matching (RM) account of the function of confirmation bias.

5 Supporting the RM Account

Before offering empirical support for the RM account and highlighting its explanatory benefits, it is useful to disarm an objection: if confirmation bias was selected for its SFP-related effects, then people should not also display the bias with respect to beliefs that can’t produce SFPs (e.g., beliefs about physics, climate change, religion, etc.). But they do (Nickerson 1998).

5.1 From Social to Non-social Beliefs

In response to the objection just mentioned, two points should be noted. First, the RM account is compatible with the view that confirmation bias was also selected for adaptive effects related to non-social beliefs. It only claims that facilitating the alignment of social reality with social beliefs (i.e., reality matching) is one important adaptive feature for which the bias was selected that has so far been neglected.

Second, it doesn’t follow from the fact that confirmation bias also affects beliefs that can’t initiate SFPs that it could not have been selected for affecting beliefs that can and do initiate them. The literature offers many examples of biological features or cognitive traits that were selected for fulfilling a certain function despite rarely doing so or even having maladaptive effects (Millikan 1984; Haselton and Nettle 2006). Consider the “baby-face overgeneralization” bias (Zebrowitz and Montepare 2008). Studies suggest that people have a strong readiness to respond favorably to babies’ distinctive facial features. And this tendency is overgeneralized such that even adults are viewed more favorably and treated as likeable (but also as physically weak and naïve) when they display babyface features. While this overgeneralization tendency often leads to errors, it is thought to have evolved because failures to respond favorably to babies (i.e., false negatives) are evolutionarily more costly than overgeneralizing (i.e., false positives) (ibid).
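The error-management logic invoked here (Haselton and Nettle 2006) is, at bottom, an expected-cost comparison, which can be made explicit with illustrative numbers (mine, not drawn from the cited studies):

```python
# Error-management arithmetic behind the overgeneralization argument
# (illustrative numbers only).
p_infant = 0.10   # share of encountered faces belonging to real infants
cost_fn = 10.0    # cost of failing to respond favorably to a real infant
cost_fp = 1.0     # cost of a wasted favorable response to a babyfaced adult

# Strict strategy: responds only to unambiguous cues; misses 30% of infants.
cost_strict = p_infant * 0.30 * cost_fn
# Overgeneralizing strategy: responds to any babyface-like cue;
# catches every infant but misfires on 20% of adults.
cost_overgen = (1 - p_infant) * 0.20 * cost_fp

print(f"expected cost, strict:           {cost_strict:.2f}")   # 0.30
print(f"expected cost, overgeneralizing: {cost_overgen:.2f}")  # 0.18
```

As long as false negatives are costly enough relative to false positives, the overgeneralizing strategy wins despite producing more errors overall, which is the pattern the main text extends to confirmation bias.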

Might our domain-general tendency to confirm our own beliefs be similarly less evolutionarily costly than the lack of such a general tendency? It is not implausible to assume so because, as noted, we are ultra-social and ultra-cooperative, and our beliefs about people’s social standing, knowledge, intentions, abilities, etc. are critical for our flourishing (Sterelny 2007: 720; Tomasello 2014: 190f; Henrich 2016). Importantly, these beliefs, unlike beliefs about the non-social world, are able to and frequently do initiate SFPs contributing to the outlined evolutionary benefits. This matters because if social beliefs are pervasive and SFPs of them significant for our flourishing, then a domain-general tendency to confirm any of our beliefs ensures that we don’t miss opportunities to align social reality with our conceptions and to reap the related developmental and epistemic benefits. Granted, this tendency overgeneralizes, which creates clear costs. But given the special role of social beliefs in our species and our dependence on social learning and social cognition, which are facilitated by SFPs, it is worth taking seriously the possibility that the benefits can often outweigh these costs.

While this thought doesn’t yet show that the RM account is correct, it does help disarm the above objection. For it explains why the fact that confirmation bias also affects beliefs that cannot initiate SFPs doesn’t disprove the view that the bias was selected for reality matching: the special role of social beliefs in our species (compared to other species) lends plausibility to the assumption that the costs of the bias’s overgeneralizing might be lower than the costs of its failing to generalize. I now turn to the positive support for the RM account.

5.2 Empirical Data

If, as the RM account proposes, confirmation bias was selected for facilitating the process of making reality match our beliefs, then the bias should be common and pronounced when (1) it comes to social beliefs, that is, beliefs (a) about oneself, (b) about other people, and (c) about social structures that the subject can determine, and when (2) social conditions are conducive to reality matching. While there are no systematic comparative studies on whether the bias is more frequent or stronger with respect to some beliefs but not others (e.g., social vs. non-social beliefs), there is related empirical research that does provide some support for these predictions.

Self-Related Beliefs

In a number of studies, Swann and colleagues (Swann 1983; Swann et al. 1992; for an overview, see Swann 2012) found that the selective information processing characteristic of confirmation bias is “especially pronounced with regards to self-concepts” and so self-related beliefs (Müller-Pinzler et al. 2019: 9; see Footnote 5). Interestingly, and counterintuitively, the data show that “just as people with positive self-views preferentially seek positive evaluations, those with negative self-views preferentially seek negative evaluations” (Talaifar and Swann 2017: 3). For instance, those “who see themselves as likable seek out and embrace others who evaluate them positively, whereas those who see themselves as dislikeable seek out and embrace others who evaluate them negatively” (ibid). Much in line with the RM account, Swann (2012) notes that this confirmatory tendency “would have been advantageous” in “hunter-gatherer groups”: once “people used input from the social environment to form self-views, self-verification strivings would have stabilized their identities and behavior, which in turn would make each individual more predictable to other group members” (26).

Similarly, in a study in which subjects received feedback about aspects of the self that can be relatively easily changed (e.g., their ability to estimate the weights of animals), Müller-Pinzler et al. (2019) found that “prior beliefs about the self modulate self-related belief-formation” in that subjects updated their performance estimates “in line with a confirmation bias”: individuals with prior negative self-related beliefs (e.g., low self-esteem) showed increased biases towards factoring in negative (vs. positive) feedback, and, interestingly, this tendency was “modulated by the social context and only present when participants were exposed to a potentially judging audience” (ibid: 9–10). This coheres with the view that confirmation bias might serve the ‘collective’ to bring subjects into accordance with its social conceptions (positive or negative).

Other-Related Beliefs

If confirmation bias was selected for sustaining social beliefs for the sake of reality matching, then the bias should also be particularly pronounced when it comes to beliefs about other people, especially in situations conducive to reality matching. For instance, powerful individuals have been found to be more likely to prompt subordinates to behaviorally confirm their social conceptions than relatively powerless subjects (Copeland 1994; Leyens et al. 1999). That is, interactions between powerful and powerless individuals are conducive to reality matching of the powerful individuals’ social beliefs. According to the RM account, powerful individuals should therefore display a stronger confirmation bias with respect to the relevant social beliefs. Goodwin et al. (2000) found just that: powerful people, in particular, tend to fail to take into account data that may contradict their social beliefs (capturing, e.g., stereotypes) about subordinates and attend more closely to information that supports their expectations. Relative to the powerless, powerful people displayed a stronger confirmation bias in their thinking about subordinates (ibid: 239f).

Similarly, if confirmation bias serves to facilitate social interaction by contributing to a match between beliefs and social reality, then the bias should be increased with respect to trait attributions to other people in subjects who care about social interactions compared to other subjects. Dardenne and Leyens (1995) reasoned that when testing a hypothesis about the personality of another individual (e.g., their being introverted or extroverted), a preference for questions that match the hypothesis (e.g., that the subject is introverted) indicates social skill, conveying a feeling of being understood to the individual and contributing to a smooth conversation. Socially skilled people (‘high self-monitors’) should thus prefer ‘matching questions’. In an interview setting, for instance, when testing the introvert hypothesis, an interviewer could ask questions that a typical introvert would answer with ‘yes’ (e.g., ‘Do you like to stay alone?’), confirming the presence of the hypothesized trait (ibid). Dardenne and Leyens did find that matching questions pertaining to an introvert or an extrovert hypothesis were selected most often by high self-monitors: socially skilled subjects displayed a stronger confirmatory tendency than less socially skilled subjects (ibid).

Finally, there is also evidence that confirmation bias is more pronounced with respect to social beliefs than non-social beliefs. For instance, Marsh and Hanlon (2007) gave one group of behavioral ecologists a specific set of expectations with respect to sex differences in salamander behavior, while a second group was given the opposite set of expectations. In one experiment, subjects collected data on variable sets of live salamanders, while in the other experiment, observers collected data from identical videotaped trials. Across experiments and observed behaviors, the expectations of the observers biased their observations “only to a small or moderate degree”, Marsh and Hanlon note, concluding that these “results are largely optimistic with respect to confirmation bias in behavioral ecology” (2007: 1089). This insignificant confirmation bias with respect to beliefs about non-social matters contrasts with findings of a significant confirmation bias with respect to beliefs about people (Talaifar and Swann 2017; Goodwin et al. 2000; Marks and Fraley 2006; Darley and Gross 1983) and, as I shall argue now, about social affairs whose reality the subject can determine.

Non-Personal, Social Beliefs

One important kind of social belief is political belief, i.e., belief concerning social states of affairs pertaining to politics. Political beliefs are especially interesting in the context of the RM account because they are very closely related to reality matching. This is not only because subjects can often directly influence political affairs via voting, running as a candidate, campaigning, etc. It is also because subjects who are highly confident about their political beliefs are more likely to convince other people of them too (Kappes et al. 2020). And the more widespread a political conviction in a population, the higher the probability that the population will adopt political structures that shape reality in line with it (Jost et al. 2003; Ordabayeva and Fernandes 2018).

If, as the RM account proposes, confirmation bias was selected for sustaining social beliefs for the sake of reality matching, then the bias should be particularly strong when it comes to beliefs about political states of affairs. And indeed, Taber and Lodge (2006) did find that “motivated [confirmation] biases come to the fore in the processing of political arguments”, in particular, and, crucially, that subjects “with weak […] [political] attitudes show less [confirmation] bias in processing political arguments” (767). In fact, in psychology, attitude strength, especially in politically relevant domains of thinking, has long been and still is widely accepted to increase the kind of selective exposure constitutive of confirmation bias (Knobloch-Westerwick et al. 2015: 173). For instance, Brannon et al. (2007) found that stronger, more extreme political attitudes are correlated with higher ratings of interest in attitude-consistent versus attitude-discrepant political articles. Similarly, Knobloch-Westerwick et al. (2015) found that online users who attached high importance to particular political topics spent more time on attitude-consistent messages than users who attached low importance to the topics, and “[a]ttitude-consistent messages […] were preferred”, reinforcing the attitudes further (171). While this can contribute to political group polarization, such polarization also boosts the group-wide reality-matching endeavour and can thus be adaptive itself (Johnson and Fowler 2011: 317).

In short, then, while there are currently no systematic comparative studies on whether confirmation bias is more frequent or stronger with respect to social beliefs, related empirical studies do suggest that when it comes to (positive or negative) social beliefs about oneself, other people, and social states of affairs that the subject can determine (e.g., political beliefs), confirmation bias is both particularly common and pronounced. Empirical data thus corroborate some of the predictions of the RM account.

5.3 Explanatory Benefits

The theoretical and empirical considerations from the preceding sections offer support for the RM account. Before concluding, it is worth mentioning three further reasons for taking the account seriously. First, it has greater explanatory power than the three alternative views outlined above. Second, it is consistent with, and provides new contributions to, different areas of evolutionary theorizing on human cognition. Third, it casts new light on the epistemic character of confirmation bias. I’ll now support these three points.

Consider first the argumentative-function account, which holds that confirmation bias is adaptive in making us better arguers. This was problematic because the bias hinders us in anticipating people’s objections, which weakens our argumentative skill and increases the risk that we appear incompetent in argumentative exchanges. The RM account avoids these problems: if confirmation bias was selected for reinforcing our preconceptions about people to promote SFPs, then, since in one’s own reasoning one only needs to justify one’s beliefs to oneself, the first point one finds acceptable will suffice. To convince others, one would perhaps need to anticipate objections. But if the bias functions primarily to boost one’s own conviction about particular beliefs so as to facilitate SFPs, then ‘laziness’ in critical thinking about one’s own positions (Trouche et al. 2016) shouldn’t be surprising.

Turning to the group-cognition account, the proposal was that confirmation bias is adaptive in and was selected for making group-level inquiries more thorough, reliable, and efficient. In response, I noted that the concept of ‘group selection’ is problematic when it comes to traits threatening an individual’s fitness (West et al. 2007; Pinker 2012), and that confirmation bias would arguably only lead to the group-level benefits at issue in groups with viewpoint diversity. Yet it is doubtful that ancestral groups met this condition. The RM account is preferable to the group-cognition view because it doesn’t rely on a notion of group selection but concerns primarily individual-level benefits, and it doesn’t tie the adaptive effects of the bias to conditions of viewpoint diversity. It proposes instead that the adaptive SFP-related effects of the bias increase individuals’ fitness (e.g., by facilitating their navigation of the social world, aligning them and others with their group’s conceptions, etc.) and can emerge whenever people hold beliefs about each other, interact, and fulfill social expectations. This condition is satisfied even in groups with viewpoint homogeneity.

The RM account also differs from the intention–alignment view, which holds that confirmation bias evolved for allowing us to synchronize intentions with others. One problem with this view was that the bias seems to hinder an intention alignment of individuals by weakening their perspective-taking capacity and inclining them to ignore or distort people’s objections. The RM account avoids this problem because it suggests that by disregarding objections or counterevidence to one’s beliefs, one can remain convinced about them, which helps align social reality (not only, e.g., people’s intentions) with them, producing the adaptive outcomes outlined above. The account can also explain why confirmation bias is particularly strong in groups in which shared ideologies are at stake (Taber and Lodge 2006; Gerken 2019). For subjects have a keen interest in reality corresponding to their ideological conceptions. Since the latter shape social reality via their impact on behavior and are more effective in doing so the more convinced people are about them (Kappes et al. 2020), it is to be expected that confirmation bias is more pronounced when it comes to ideological propositions in like-minded groups. And, as noted, the resulting group polarization can itself then be adaptive in strengthening the reality-matching process.

Moving beyond extant work on the evolution of confirmation bias, the RM account also contributes to, and raises new questions for, other areas of research in different disciplines. For instance, it yields predictions that psychologists can experimentally explore in comparative studies, such as the prediction that confirmation bias is more common and stronger when targeting social versus non-social beliefs, or when conditions are conducive to reality matching as opposed to when they are not. The account also adds a new perspective to research on SFPs and on how social conceptions interact with their targets (Hacking 1995; Snyder and Klein 2007; Jussim 2017). Relatedly, the RM account contributes to recent philosophical work on folk psychology, i.e., our ability to ascribe mental states to agents to make sense of their behavior. In that work, some philosophers argue that folk psychology serves “mindshaping”, that is, the moulding of people’s behavior and minds so that they fit our conceptions, making people more predictable and cooperation with them easier (Mameli 2001; Zawidzki 2013; Peters 2019b). There are clear connections between the mindshaping view of folk psychology and the RM account, but also important differences. For instance, the RM account pertains to the function of confirmation bias, not folk psychology. Moreover, advocates of the mindshaping view have so far left unexplored the conditions for effective mindshaping via folk-psychological ascriptions and the possible role of confirmation bias in it. The RM account begins to fill this gap and in doing so adds to work on the question of how epistemic (or ‘mindreading’) and non-epistemic (or ‘mindshaping’, e.g., motivational) processes are related in folk psychology (Peters 2019b: 545f; Westra 2020; Fernández-Castro and Martínez-Manrique 2020).

In addition to offering contributions to a range of different areas of research, the RM account also casts new light on the epistemic character of confirmation bias. Capturing the currently common view on the matter, Mercier (2016) writes that “piling up reasons that support our preconceived views is not the best way to correct them. […] [It] stop[s] people from fixing mistaken beliefs” (110). The RM account offers a different perspective, suggesting that when it is directed at beliefs about social affairs, confirmation bias does often help subjects correct their mistaken conceptions to the extent that it contributes to SFPs of them. Similarly, Dutilh Novaes (2018) holds that the bias involves or contributes to a failure of perspective taking and so, “given the importance of being able to appreciate one’s interlocutor’s perspective for social interaction”, is “best not seen as an adaptation” (520). The RM account, on the other hand, proposes that the bias often facilitates social understanding: in making us less sensitive to our interlocutor’s opposing perspective, it helps us remain confident about our social beliefs, which increases the probability of SFPs that in turn make people more predictable and mindreadable.

6 Conclusion

After outlining limitations of three recent proposals on the evolution of confirmation bias, I developed and supported a novel alternative, the reality-matching (RM) account, which holds that one of the adaptive features for which the bias evolved is that it helps us bring social reality into alignment with our beliefs. When the bias targets positive social beliefs, this serves both the subject and the group, assisting them in their development (to become, e.g., more communicative or knowledgeable) while also making their social cognition more effective and tractable. When it targets negative social beliefs, in promoting reality matching, the bias might contribute to ethically problematic outcomes, but it can then still support social conformity and predictability, which were perhaps adaptive especially in ancestral tribal groups. While the socially constructive aspect of confirmation bias highlighted here may not be the main or only feature of the bias that led to its evolution, it is one that has so far been overlooked in evolutionary theorizing on confirmation bias. If we attend to it, an account of the function of confirmation bias becomes available that coheres with data from across the psychological sciences, avoids many of the shortcomings of competitor views, and has explanatory benefits that help advance research on the function, nature, and epistemic character of the bias.

Notes

1. Mercier and Sperber (2017) and others prefer the term ‘myside bias’ to ‘confirmation bias’ because people don’t have a general tendency to confirm any hypothesis that comes to their mind, but only ones that are on ‘their side’ of a debate. I shall here use the term ‘confirmation bias’ because it is more common and in any case typically understood in the way just mentioned.

2. Researchers working on folk psychology might be reminded of the ‘mindshaping’ view of folk psychology (Mameli 2001; Zawidzki 2013). I will come back to this view and demarcate it from my account of confirmation bias in Sect. 5.

3. It might be proposed that when participants in the experiment seek reasons for their judgments, perhaps they take themselves already to have formed the judgments for good reasons and then wonder what these reasons might have been. Why would they seek reasons against a view that they have formed (by their own lights) for good reasons? However, we might equally well ask why they would take themselves to have formed a judgment for good reasons in the first place even though they don’t know any of those reasons. If there is a general default tendency to assume that any view one holds rests on good reasons, then that would again suggest the presence of a confirmation bias. For a general tendency to think that one’s views rest on good reasons even when one doesn’t know them is a tendency to favor and confirm these views while resisting balanced scrutiny of their basis.

4. SFPs can also accumulate when they occur across different interactions, and in contemporary societies, overall accumulative SFP effects of negative social beliefs capturing, e.g., stereotypes might be stronger than those of positive social beliefs in individual dyadic interactions (Madon et al. 2018). However, in ancestral ‘tribal’ groups of highly interdependent subjects, even accumulative SFPs of, e.g., stereotypes would perhaps still have contributed to conformity and social stability. I shall return to the possible SFP-related benefits of nowadays highly negative social conceptions (stereotypes, ethnocentrism, etc.) below.

5. Relatedly, neuroscientific data show that a positive view of one’s own traits tends to correlate with a reduced activation of the right inferior prefrontal gyrus, the area of the brain processing self-related content, when the subject receives negative self-related information (Sharot et al. 2011). That is, optimists about themselves display a diminished sensitivity to negative information that is in tension with self-related trait optimism (ibid).

References

Alfano, M. (2013). Character as moral fiction. Cambridge: CUP.

Bacharach, M., Guerra, G., & Zizzo, D. J. (2007). The self-fulfilling property of trust: An experimental study. Theory and Decision, 63, 349–388.

Ball, P. (2017). The trouble with scientists. How one psychologist is tackling human biases in science. Nautilus. Retrieved May 2, 2019 from http://nautil.us/issue/54/the-unspoken/the-trouble-with-scientists-rp.

Biggs, M. (2009). Self-fulfilling prophecies. In P. Bearman & P. Hedstrom (Eds.), The Oxford handbook of analytical sociology (pp. 294–314). Oxford: OUP.

Brannon, L. A., Tagler, M. J., & Eagly, A. H. (2007). The moderating role of attitude strength in selective exposure to information. Journal of Experimental Social Psychology, 43, 611–617.

Copeland, J. (1994). Prophecies of power: Motivational implications of social power for behavioral confirmation. Journal of Personality and Social Psychology, 67, 264–277.

Cornelissen, G., Dewitte, S., & Warlop, L. (2007). Whatever people say I am that’s what I am: Social labeling as a social marketing tool. International Journal of Research in Marketing, 24 (4), 278–288.

Dardenne, B., & Leyens, J. (1995). Confirmation bias as a social skill. Personality and Social Psychology Bulletin, 21 (11), 1229–1239.

Darley, J. M., & Gross, P. H. (1983). A hypothesis-confirming bias in labeling effects. Journal of Personality and Social Psychology, 44, 20–33.

Davidson, O. B., & Eden, D. (2000). Remedial self-fulfilling prophecy: Two field experiments to prevent Golem effects among disadvantaged women. Journal of Applied Psychology, 85 (3), 386–398.

De Bruine, L. M. (2009). Beyond ‘just-so stories’: How evolutionary theories led to predictions that non-evolution-minded researchers would never dream of. Psychologist, 22 (11), 930–933.

De Cruz, H., & De Smedt, J. (2016). How do philosophers evaluate natural theological arguments? An experimental philosophical investigation. In H. De Cruz & R. Nichols (Eds.), Advances in religion, cognitive science, and experimental philosophy (pp. 119–142). New York: Bloomsbury.

Downey, G., Freitas, A. L., Michaelis, B., & Khouri, H. (1998). The self-fulfilling prophecy in close relationships: Rejection sensitivity and rejection by romantic partners. Journal of Personality and Social Psychology, 75, 545–560.

Draper, P., & Nichols, R. (2013). Diagnosing bias in philosophy of religion. The Monist, 96, 420–446.

Dutilh Novaes, C. (2018). The enduring enigma of reason. Mind and Language, 33, 513–524.

Evans, J. (1996). Deciding before you think: Relevance and reasoning in the selection task. British Journal of Psychology, 87, 223–240.

Fernández-Castro, V., & Martínez-Manrique, F. (2020). Shaping your own mind: The self-mindshaping view on metacognition. Phenomenology and the Cognitive Sciences. https://doi.org/10.1007/s11097-020-09658-2.

Gerken, M. (2019). Public scientific testimony in the scientific image. Studies in History and Philosophy of Science Part A. https://doi.org/10.1016/j.shpsa.2019.05.006.

Golec de Zavala, A. (2011). Collective narcissism and intergroup hostility: The dark side of ‘in-group love’. Social and Personality Psychology Compass, 5, 309–320.

Goodwin, S., Gubin, A., Fiske, S., & Yzerbyt, V. (2000). Power can bias impression formation: Stereotyping subordinates by default and by design. Group Processes and Intergroup Relations, 3, 227–256.

Gould, S. J., & Lewontin, R. C. (1979). The spandrels of San Marco and the Panglossian paradigm: A critique of the adaptationist programme. Proceedings of the Royal Society of London. Series B, 205 (1161), 581–598.

Grusec, J., Kuczynski, L., Rushton, J., & Simutis, Z. (1978). Modeling, direct instruction, and attributions: Effects on altruism. Developmental Psychology, 14, 51–57.

Hacking, I. (1995). The looping effects of human kinds. In D. Sperber, et al. (Eds.), Causal cognition (pp. 351–383). New York: Clarendon Press.

Hahn, U., & Harris, A. J. L. (2014). What does it mean to be biased: Motivated reasoning and rationality. In H. R. Brian (Ed.), Psychology of learning and motivation (pp. 41–102). New York: Academic Press.

Haidt, J. (2012). The righteous mind . New York: Pantheon.

Hall, L., Johansson, P., & Strandberg, T. (2012). Lifting the veil of morality: Choice blindness and attitude reversals on a self-transforming survey. PLoS ONE, 7 (9), e45457.

Haselton, M. G., & Nettle, D. (2006). The paranoid optimist: An integrative evolutionary model of cognitive biases. Personality and Social Psychology Review, 10, 47–66.

Henrich, J. (2016). The secret of our success . Princeton, NJ: Princeton University Press.

Hernandez, I., & Preston, J. L. (2013). Disfluency disrupts the confirmation bias. Journal of Experimental Social Psychology, 49 (1), 178–182.

Jensen, R. E., & Moore, S. G. (1977). The effect of attribute statements on cooperativeness and competitiveness in school-age boys. Child Development, 48 (1), 305–307.

Johnson, D. D. P., Blumstein, D. T., Fowler, J. H., & Haselton, M. G. (2013). The evolution of error: Error management, cognitive constraints, and adaptive decision-making biases. Trends in Ecology & Evolution, 28, 474–481.

Johnson, D. D. P., & Fowler, J. H. (2011). The evolution of overconfidence. Nature, 477, 317–320.

Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political conservatism as motivated social cognition. Psychological Bulletin, 129 (3), 339–375.

Jussim, L. (2012). Social perception and social reality . Oxford: OUP.

Jussim, L. (2017). Précis of social perception and social reality: Why accuracy dominates bias and self-fulfilling prophecy. Behavioral and Brain Sciences, 40, 1–20.

Kappes, A., Harvey, A. H., Lohrenz, T., et al. (2020). Confirmation bias in the utilization of others’ opinion strength. Nature Neuroscience, 23, 130–137.

Kelly, D. (2013). Moral disgust and the tribal instincts hypothesis. In K. Sterelny, R. Joyce, B. Calcott, & B. Fraser (Eds.), Cooperation and its evolution (pp. 503–524). Cambridge, MA: The MIT Press.

Kelly, D., & Hoburg, P. (2017). A tale of two processes: On Joseph Henrich’s the secret of our success: How culture is driving human evolution, domesticating our species, and making us smarter. Philosophical Psychology, 30 (6), 832–848.

Ketelaar, T., & Ellis, B. J. (2000). Are evolutionary explanations unfalsifiable? Evolutionary psychology and the Lakatosian philosophy of science. Psychological Inquiry, 11 (1), 1–21.

Klayman, J. (1995). Varieties of confirmation bias. Psychology of Learning and Motivation, 32, 385–418.

Knobloch-Westerwick, S., Johnson, B. K., & Westerwick, A. (2015). Confirmation bias in online searches: Impacts of selective exposure before an election on political attitude strength and shifts. Journal of Computer-Mediated Communication, 20, 171–187.

Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108 (3), 480–498.

Leidner, B., Castano, E., Zaiser, E., & Giner-Sorolla, R. (2010). Ingroup glorification, moral disengagement, and justice in the context of collective violence. Personality and Social Psychology Bulletin, 36 (8), 1115–1129.

Levy, N. (2019). Due deference to denialism: Explaining ordinary people’s rejection of established scientific findings. Synthese, 196 (1), 313–327.

Leyens, J., Dardenne, B., Yzerbyt, V., Scaillet, N., & Snyder, M. (1999). Confirmation and disconfirmation: Their social advantages. European Review of Social Psychology, 10 (1), 199–230.

Lilienfeld, S. O. (2017). Psychology’s replication crisis and the grant culture: Righting the ship. Perspectives on Psychological Science, 12 (4), 660–664.

Lord, C., Lepper, M., & Preston, E. (1984). Considering the opposite: A corrective strategy for social judgment. Journal of Personality and Social Psychology, 47, 1231–1243.

Madon, S., Jussim, L., & Eccles, J. (1997). In search of the powerful self-fulfilling prophecy. Journal of Personality and Social Psychology, 72, 791–809.

Madon, S., Jussim, L., Guyll, M., Nofziger, H., Salib, E. R., Willard, J., et al. (2018). The accumulation of stereotype-based self-fulfilling prophecies. Journal of Personality and Social Psychology, 115 (5), 825–844.

Mameli, M. (2001). Mindreading, mindshaping, and evolution. Biology and Philosophy, 16, 597–628.

Marks, M. J., & Fraley, R. C. (2006). Confirmation bias and the sexual double standard. Sex Roles: A Journal of Research, 54 (1–2), 19–26.

Marsh, D. M., & Hanlon, T. J. (2007). Seeing what we want to see: Confirmation bias in animal behavior research. Ethology, 113, 1089–1098.

Matheson, J., & Vitz, R. (Eds.). (2014). The ethics of belief: Individual and social . Oxford: OUP.

Mayo, R., Alfasi, D., & Schwarz, N. (2014). Distrust and the positive test heuristic: Dispositional and situated social distrust improves performance on the Wason Rule Discovery Task. Journal of Experimental Psychology: General, 143 (3), 985–990.

McDonald, M. M., Navarrete, C. D., & van Vugt, M. (2012). Evolution and the psychology of intergroup conflict: The male warrior hypothesis. Philosophical Transactions of the Royal Society, B, 367, 670–679.

Mercier, H. (2016). Confirmation (or myside) bias. In R. Pohl (Ed.), Cognitive illusions (pp. 99–114). London: Psychology Press.

Mercier, H., & Sperber, D. (2011). Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences, 34 (2), 57–111.

Mercier, H., & Sperber, D. (2017). The enigma of reason . Cambridge, MA: Harvard University Press.

Merton, R. (1948). The self-fulfilling prophecy. The Antioch Review, 8 (2), 193–210.

Miller, R., Brickman, P., & Bolen, D. (1975). Attribution versus persuasion as a means for modifying behavior. Journal of Personality and Social Psychology, 31 (3), 430–441.

Millikan, R. G. (1984). Language, thought and other biological categories. Cambridge, MA: MIT Press.

Müller-Pinzler, L., Czekalla, N., Mayer, A. V., et al. (2019). Negativity-bias in forming beliefs about own abilities. Scientific Reports, 9, 14416. https://doi.org/10.1038/s41598-019-50821-w.

Murray, S. L., Holmes, J. G., & Griffin, D. W. (1996). The self-fulfilling nature of positive illusions in romantic relationships: Love is not blind, but prescient. Journal of Personality and Social Psychology, 71, 1155–1180.

Myers, D., & DeWall, N. (2015). Psychology. New York: Worth Publishers.

Myers, D. G., & Lamm, H. (1976). The group polarization phenomenon. Psychological Bulletin, 83, 602–627.

Nickerson, R. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175–220.

Norman, A. (2016). Why we reason: Intention–alignment and the genesis of human rationality. Biology and Philosophy, 31, 685–704.

Ordabayeva, N., & Fernandes, D. (2018). Better or different? How political ideology shapes preferences for differentiation in the social hierarchy. Journal of Consumer Research, 45 (2), 227–250.

Palminteri, S., Lefebvre, G., Kilford, E. J., & Blakemore, S. J. (2017). Confirmation bias in human reinforcement learning: Evidence from counterfactual feedback processing. PLoS Computational Biology, 13 (8), e1005684.

Pelham, B. W., & Swann, W. B. (1994). The juncture of intrapersonal and interpersonal knowledge: Self-certainty and interpersonal congruence. Personality and Social Psychology Bulletin, 20 (4), 349–357.

Peters, U. (2018). Illegitimate values, confirmation bias, and mandevillian cognition in science. British Journal for the Philosophy of Science. https://doi.org/10.1093/bjps/axy079.

Peters, U. (2019a). Implicit bias, ideological bias, and epistemic risks in philosophy. Mind & Language, 34, 393–419. https://doi.org/10.1111/mila.12194.

Peters, U. (2019b). The complementarity of mindshaping and mindreading. Phenomenology and the Cognitive Sciences, 18 (3), 533–549.

Peters, U., Honeycutt, N., De Block, A., & Jussim, L. (forthcoming). Ideological diversity, hostility, and discrimination in philosophy. Philosophical Psychology. Available online: https://philpapers.org/archive/PETIDH-2.pdf.

Pinker, S. (2012). The false allure of group selection. Retrieved July 20, 2012 from http://edge.org/conversation/the-false-allure-of-group-selection .

Rabin, M., & Schrag, J. L. (1999). First impressions matter: A model of confirmatory bias. Quarterly Journal of Economics, 114 (1), 37–82.

Richerson, P., & Boyd, R. (2001). The evolution of subjective commitment to groups: A tribal instincts hypothesis. In R. M. Nesse (Ed.), Evolution and the capacity for commitment (pp. 186–202). New York: Russell Sage Foundation.

Richerson, P., & Boyd, R. (2005). Not by genes alone: How culture transformed human evolution. Chicago: University of Chicago Press.

Roberts, S. C., van Vugt, M., & Dunbar, R. I. M. (2012). Evolutionary psychology in the modern world: Applications, perspectives, and strategies. Evolutionary Psychology, 10, 762–769.

Schuck, P. H. (2001). The perceived values of diversity, then and now. Cardozo Law Review, 22, 1915–1960.

Sharot, T., Korn, C. W., & Dolan, R. J. (2011). How unrealistic optimism is maintained in the face of reality. Nature Neuroscience, 14, 1475–1479.

Simpson, J. A., & Beckes, L. (2010). Evolutionary perspectives on prosocial behavior. In M. Mikulincer & P. Shaver (Eds.), Prosocial motives, emotions, and behavior: The better angels of our nature (pp. 35–53). Washington, DC: American Psychological Association.

Smart, P. (2018). Mandevillian intelligence. Synthese, 195, 4169–4200.

Smith, J. J., & Wald, B. (2019). Collectivized intellectualism. Res Philosophica, 96 (2), 199–227.

Sniezek, J. A., & Van Swol, L. M. (2001). Trust, confidence, and expertise in a judge–advisor system. Organizational Behavior and Human Decision Processes, 84, 288–307.

Snyder, M., & Klein, O. (2007). Construing and constructing others: On the reality and the generality of the behavioral confirmation scenario. In P. Hauf & F. Forsterling (Eds.), Making minds (pp. 47–60). Amsterdam/Philadelphia: John Benjamins.

Solomon, G. B. (2016). Improving performance by means of action–cognition coupling in athletes and coaches. In M. Raab, B. Lobinger, S. Hoffman, A. Pizzera, & S. Laborde (Eds.), Performance psychology: Perception, action, cognition, and emotion (pp. 88–101). London, England: Elsevier Academic Press.

Stangor, C. (2011). Principles of social psychology . Victoria, BC: BCcampus.

Stanovich, K., West, R., & Toplak, M. (2013). Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science, 22, 259–264.

Steel, D. (2018). Wishful thinking and values in science: Bias and beliefs about injustice. Philosophy of Science. https://doi.org/10.1086/699714.

Sterelny, K. (2006). Memes revisited. British Journal for the Philosophy of Science, 57, 145–165.

Sterelny, K. (2007). Social intelligence, human intelligence and niche construction. Philosophical Transactions of the Royal Society B, 362, 719–730.

Sterelny, K. (2018). Why reason? Hugo Mercier’s and Dan Sperber’s the enigma of reason: A new theory of human understanding. Mind and Language, 33 (5), 502–512.

Stibel, J. (2018). Fake news: How our brains lead us into echo chambers that promote racism and sexism. USA Today. Retrieved October 8, 2018 from https://eu.usatoday.com/story/money/columnist/2018/05/15/fake-news-social-media-confirmation-bias-echo-chambers/533857002/.

Swann, W. B. (1983). Self-verification: Bringing social reality into harmony with the self. In J. Suls & A. G. Greenwald (Eds.), Social psychological perspectives on the self (Vol. 2, pp. 33–66). London: Erlbaum.

Swann, W. B., Jr. (2012). Self-verification theory. In P. A. M. Van Lange, A. W. Kruglanski, & E. T. Higgins (Eds.), Handbook of theories of social psychology (pp. 23–42). Beverly Hills, CA: Sage Publications Ltd.

Swann, W., & Ely, R. (1984). A battle of wills: Self-verification versus behavioral confirmation. Journal of Personality and Social Psychology, 46, 1287–1302.

Swann, W. B., Jr., Stein-Seroussi, A., & Giesler, B. (1992). Why people self-verify. Journal of Personality and Social Psychology, 62, 392–406.

Taber, C., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50, 755–769.

Talaifar, S., & Swann, W. B. (2017). Self-verification theory. In L. Goossens, M. Maes, S. Danneel, J. Vanhalst, & S. Nelemans (Eds.), Encyclopedia of personality and individual differences (pp. 1–9). Berlin: Springer.

Tomasello, M. (2014). The ultra-social animal. European Journal of Social Psychology, 44, 187–194.

Tooby, J., & Cosmides, L. (2015). The theoretical foundations of evolutionary psychology. In D. M. Buss (Ed.), The handbook of evolutionary psychology (pp. 3–87). Hoboken, NJ: Wiley.

Tormala, Z. L. (2016). The role of certainty (and uncertainty) in attitudes and persuasion. Current Opinion in Psychology, 10, 6–11.

Trouche, E., et al. (2016). The selective laziness of reasoning. Cognitive Science, 40, 2122–2136.

Turnwald, B., et al. (2018). Learning one’s genetic risk changes physiology independent of actual genetic risk. Nature Human Behaviour . https://doi.org/10.1038/s41562-018-0483-4 .

von Hippel, W., & Trivers, R. (2011). The evolution and psychology of self-deception. Behavioral and Brain Sciences, 34 (1), 1–16.

Wenger, A., & Fowers, B. J. (2008). Positive illusions in parenting: Every child is above average. Journal of Applied Social Psychology, 38 (3), 611–634.

West, S. A., Griffin, A. S., & Gardner, A. (2007). Social semantics: How useful has group selection been? Journal of Evolutionary Biology, 21, 374–385.

Westra, E. (2020). Folk personality psychology: Mindreading and mindshaping in trait attribution. Synthese. https://doi.org/10.1007/s11229-020-02566-7.

Whitaker, R. M., Colombo, G. B., & Rand, D. G. (2018). Indirect reciprocity and the evolution of prejudicial groups. Scientific Reports, 8 (1), 13247. https://doi.org/10.1038/s41598-018-31363-z.

Whittlestone, J. (2017). The importance of making assumptions: Why confirmation is not necessarily a bias. Ph.D. Thesis. Coventry: University of Warwick.

Willard, J., & Madon, S. (2016). Understanding the connections between self-fulfilling prophecies and social problems. In S. Trusz & P. Przemysław Bąbel (Eds.), Interpersonal and intrapersonal expectancies (pp. 117–125). London: Routledge.

Willard, J., Madon, S., Guyll, M., Spoth, R., & Jussim, L. (2008). Self-efficacy as a moderator of negative and positive self-fulfilling prophecy effects: Mothers’ beliefs and children’s alcohol use. European Journal of Social Psychology, 38, 499–520.

Word, C. O., Zanna, M. P., & Cooper, J. (1974). The nonverbal mediation of self-fulfilling prophecies in interracial interaction. Journal of Experimental Social Psychology, 10, 109–120.

Zawidzki, T. (2008). The function of folk psychology: Mind reading or mind shaping? Philosophical Explorations, 11 (3), 193–210.

Zawidzki, T. (2013). Mindshaping: A new framework for understanding human social cognition. Cambridge: MIT Press.

Zebrowitz, L. A., & Montepare, J. M. (2008). Social psychological face perception: Why appearance matters. Social and Personality Psychology Compass, 2, 1497–1517.

Acknowledgements

Many thanks to Andreas De Block, Mikkel Gerken, and Alex Krauss for comments on earlier drafts. The research for this paper was partly funded by the Danmarks Frie Forskningsfond Grant no: 8018-00053B allocated to Mikkel Gerken.

Author information

Authors and Affiliations

Department of Philosophy, University of Southern Denmark, Odense, Denmark

Department of Psychology, King’s College London, De Crespigny Park, Camberwell, London, SE5 8AB, UK

Corresponding author

Correspondence to Uwe Peters .

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Peters, U. What Is the Function of Confirmation Bias? Erkenntnis, 87, 1351–1376 (2022). https://doi.org/10.1007/s10670-020-00252-1

Received: 07 May 2019

Accepted: 27 March 2020

Published: 20 April 2020

Issue Date: June 2022

DOI: https://doi.org/10.1007/s10670-020-00252-1

What Is Confirmation Bias? | Definition & Examples

Published on September 19, 2022 by Kassiani Nikolopoulou. Revised on March 10, 2023.

Confirmation bias is the tendency to seek out and prefer information that supports our preexisting beliefs. As a result, we tend to ignore any information that contradicts those beliefs.

Confirmation bias is often unintentional but can still lead to poor decision-making in research and in legal or other real-life contexts.

Table of contents

  • What is confirmation bias?
  • Types of confirmation bias
  • Confirmation bias examples
  • How to avoid confirmation bias
  • Other types of research bias
  • Frequently asked questions about confirmation bias

Confirmation bias is a type of cognitive bias , or an error in thinking. Processing all the facts available to us costs us time and energy, so our brains tend to pick the information that agrees most with our preexisting opinions and knowledge. This leads to faster decision-making. Mental “shortcuts” like this are called heuristics.

When confronted with new information that confirms what we already believe, we are more likely to:

  • Accept it as true and accurate
  • Overlook any flaws or inconsistencies
  • Incorporate it into our belief system
  • Recall it later, using it to support our belief during a discussion

On the other hand, if the new information contradicts what we already believe, we respond differently. We are more likely to:

  • Become defensive about it
  • Focus on criticizing any flaw, while that same flaw would be ignored if the information confirmed our beliefs
  • Forget this information quickly, not recalling reading or hearing about it later on

There are three main ways that people display confirmation bias:

  • Selective search
  • Selective interpretation
  • Selective recall

Biased search for information

This type of bias occurs when only positive evidence is sought, or evidence that supports your expectations or hypotheses. Evidence that could prove them wrong is systematically disregarded.

For example, if you type “are dogs better than cats?” into a search engine, the results will mostly favor dogs. If you reverse the question and type “are cats better than dogs?”, you will get results in support of cats.

This will happen with any two variables: the search engine “assumes” that you think variable A is better than variable B, and shows you the results that agree with your opinion first.
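To make the mechanism concrete, here is a minimal, purely hypothetical sketch in Python of how matching on a query’s affirmative phrasing yields one-sided results. The corpus and matching rule are invented for illustration; real search engines rank results using far more signals.

```python
# A toy "search engine" (entirely hypothetical corpus and matching rule)
# showing how affirmative query phrasing returns one-sided results.

corpus = [
    "10 reasons cats are better than dogs",
    "why cats are better pets",
    "dogs are better than cats for active families",
    "vets explain why dogs are better companions",
]

def search(query: str) -> list[str]:
    """Return documents containing the query's affirmative phrasing."""
    key = query.lower().rstrip("?")
    return [doc for doc in corpus if key in doc.lower()]

print(search("cats are better?"))  # only pro-cat pages match
print(search("dogs are better?"))  # only pro-dog pages match
```

Each query returns only the pages that echo its built-in premise, which is the congruence heuristic in miniature.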

Biased interpretation of information

Confirmation bias is not limited to the type of information we search for. Even if two people are presented with the same information, it is possible that they will interpret it differently.

Suppose two readers encounter the same news article about climate change. The reader who doubts climate change may interpret the article as evidence that climate change is natural and has happened at other points in history. Any arguments raised in the article about the negative impact of fossil fuels will be dismissed.

On the other hand, the reader who is concerned about climate change will view the information as evidence that climate change is a threat and that something must be done about it. Appeals to cut down fossil fuel emissions will be viewed favorably.
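A toy model can show how this plays out. In the sketch below (our own illustration with invented numbers, not data from any study), evidence that points the same way as a reader’s current belief gets full weight, while conflicting evidence is discounted. Fed the same mixed stream of evidence, a doubter and a believer each end up close to where they started.

```python
# A toy model of biased interpretation: confirming evidence gets full
# weight, conflicting evidence is discounted. All numbers are invented.

def update(belief: float, evidence: float,
           confirm_weight: float = 1.0, conflict_weight: float = 0.3,
           rate: float = 0.1) -> float:
    """Nudge a belief (0-1) toward a piece of evidence (0-1).

    Evidence on the same side of 0.5 as the current belief counts as
    confirming and gets full weight; evidence on the other side is
    discounted, mimicking disconfirmation bias.
    """
    confirms = (evidence - 0.5) * (belief - 0.5) > 0
    weight = confirm_weight if confirms else conflict_weight
    return belief + rate * weight * (evidence - belief)

# Both readers process the SAME mixed stream of evidence about a claim
# (1.0 = strongly supports the claim, 0.0 = strongly undermines it).
stream = [0.9, 0.2, 0.8, 0.3, 0.9, 0.8]

doubter, believer = 0.3, 0.7
for e in stream:
    doubter = update(doubter, e)
    believer = update(believer, e)

# Despite identical input, each reader stays on their own side of 0.5.
print(round(doubter, 2), round(believer, 2))
```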

Biased recall of information

Confirmation bias also affects what type of information we are able to recall.

A week after encountering the article, the reader who is concerned about climate change is more likely to recall its arguments in a discussion with friends. The climate change doubter, by contrast, likely won’t be able to recall the points made in the article.

Confirmation bias has serious implications for our ability to seek objective facts. It can lead individuals to “cherry-pick” bits of information that reinforce any prejudices or stereotypes.

Suppose a man arrives at the emergency room complaining of pain. An overworked physician, believing this is just drug-seeking behavior, examines him hastily in the hall. The physician confirms that all of the man’s vital signs are fine: consistent with what was expected.

The man is discharged. Because the physician was only looking for what was already expected, she missed the signs that the man was actually having a problem with his kidneys.

Confirmation bias can lead to poor decision-making in various contexts, including interpersonal relationships, medical diagnoses, or applications of the law.

Suppose you are researching whether playing memory games can delay memory loss, and you already expect that it does. Due to this expectation, you unconsciously seek information to support your hypothesis during the data collection phase, rather than remaining open to results that could disprove it. At the end of your research, you conclude that memory games do indeed delay memory loss.

Although confirmation bias cannot be entirely eliminated, there are steps you can take to avoid it:

  • First and foremost, accept that you have biases that impact your decision-making. Even though we like to think that we are objective, it is our nature to use mental shortcuts. This allows us to make judgments quickly and efficiently, but it also makes us disregard information that contradicts our views.
  • Do your research thoroughly when searching for information. Actively consider all the evidence available, rather than just the evidence confirming your opinion or belief. Only use credible sources that can pass the CRAAP test.
  • Make sure you read entire articles, not just the headline, prior to drawing any conclusions. Analyze the article to see if there is reliable evidence to support the argument being made. When in doubt, do further research to check if the information presented is trustworthy.

Other types of research bias

Cognitive bias

  • Confirmation bias
  • Baader–Meinhof phenomenon

Selection bias

  • Sampling bias
  • Ascertainment bias
  • Attrition bias
  • Self-selection bias
  • Survivorship bias
  • Nonresponse bias
  • Undercoverage bias
  • Hawthorne effect
  • Observer bias
  • Omitted variable bias
  • Publication bias
  • Pygmalion effect
  • Recall bias
  • Social desirability bias
  • Placebo effect

Reliability and validity are both about how well a method measures something:

  • Reliability refers to the consistency of a measure (whether the results can be reproduced under the same conditions).
  • Validity refers to the accuracy of a measure (whether the results really do represent what they are supposed to measure).

If you are doing experimental research, you also have to consider the internal and external validity of your experiment.

Research bias affects the validity and reliability of your research findings, leading to false conclusions and a misinterpretation of the truth. This can have serious implications in areas like medical research where, for example, a new form of treatment may be evaluated.

It can sometimes be hard to distinguish accurate from inaccurate sources, especially online. Published articles are not always credible and can reflect a biased viewpoint without providing evidence to support their conclusions.

Information literacy is important because it helps you to be aware of such unreliable content and to evaluate sources effectively, both in an academic context and more generally.

Confirmation bias is the tendency to search for, interpret, and recall information in a way that aligns with our pre-existing values, opinions, or beliefs. Selective recall, one component of this bias, is the tendency to best remember information that amplifies what we already believe; relatedly, we tend to forget information that contradicts our opinions.

Although selective recall is a component of confirmation bias, it should not be confused with recall bias.

On the other hand, recall bias refers to the differences in the ability between study participants to recall past events when self-reporting is used. This difference in accuracy or completeness of recollection is not related to beliefs or opinions. Rather, recall bias relates to other factors, such as the length of the recall period, age, and the characteristics of the disease under investigation.

Cite this Scribbr article

If you want to cite this source, you can copy and paste the citation below.

Nikolopoulou, K. (2023, March 10). What Is Confirmation Bias? | Definition & Examples. Scribbr. Retrieved April 15, 2024, from https://www.scribbr.com/research-bias/confirmation-bias/


Effectiviology

The Confirmation Bias: Why People See What They Want to See

The confirmation bias is a cognitive bias that causes people to search for, favor, interpret, and recall information in a way that confirms their preexisting beliefs. For example, if someone is presented with a lot of information on a certain topic, the confirmation bias can cause them to only remember the bits of information that confirm what they already thought.

The confirmation bias influences people’s judgment and decision-making in many areas of life, so it’s important to understand it. In the following article, you will first learn more about the confirmation bias, and then see how you can reduce its influence, both in other people’s thinking and in your own.

How the confirmation bias affects people

The confirmation bias promotes various problematic patterns of thinking, such as people’s tendency to ignore information that contradicts their beliefs. It does so through several types of biased cognitive processes:

  • Biased search for information. This means that the confirmation bias causes people to search for information that confirms their preexisting beliefs, and to avoid information that contradicts them.
  • Biased favoring of information. This means that the confirmation bias causes people to give more weight to information that supports their beliefs, and less weight to information that contradicts them.
  • Biased interpretation of information. This means that the confirmation bias causes people to interpret information in a way that confirms their beliefs, even if the information could be interpreted in a way that contradicts them.
  • Biased recall of information. This means that the confirmation bias causes people to remember information that supports their beliefs and to forget information that contradicts them, or to remember supporting information as having been more supporting than it really was, or to incorrectly remember contradictory information as having supported their beliefs.

Note: One closely related phenomenon is cherry picking. It involves focusing only on evidence that supports one’s stance, while ignoring evidence that contradicts it. People often engage in cherry picking due to the confirmation bias, though it’s possible to engage in cherry picking even if a person is fully aware of what they’re doing, and is unaffected by the bias.

Examples of the confirmation bias

One example of the confirmation bias is someone who searches online to supposedly check whether a belief that they have is correct, but ignores or dismisses all the sources that state that it’s wrong. Similarly, another example of the confirmation bias is someone who forms an initial impression of a person, and then interprets everything that this person does in a way that confirms this initial impression.

Furthermore, other examples of the confirmation bias appear in various domains. For instance, the confirmation bias can affect:

  • How people view political information. For example, people generally prefer to spend more time looking at information that supports their political stance and less time looking at information that contradicts it.
  • How people assess pseudoscientific beliefs. For example, people who believe in pseudoscientific theories tend to ignore information that disproves those theories.
  • How people invest money. For example, investors give more weight to information that confirms their preexisting beliefs regarding the value of certain stocks.
  • How scientists conduct research. For example, scientists often display the confirmation bias when they selectively analyze and interpret data in a way that confirms their preferred hypothesis.
  • How medical professionals diagnose patients. For example, doctors often search for new information in a selective manner that will allow them to confirm their initial diagnosis of a patient, while ignoring signs that this diagnosis could be wrong.

In addition, an example of how the confirmation bias can influence people appears in the following quote, which references the prevalent misinterpretation of evidence during witch trials in the 17th century:

“When men wish to construct or support a theory, how they torture facts into their service!” — From “Extraordinary Popular Delusions and the Madness of Crowds”

Similarly, another example of how people display the confirmation bias is the following:

“… If the new information is consonant with our beliefs, we think it is well founded and useful: ‘Just what I always said!’ But if the new information is dissonant, then we consider it biased or foolish: ‘What a dumb argument!’ So powerful is the need for consonance that when people are forced to look at disconfirming evidence, they will find a way to criticize, distort, or dismiss it so that they can maintain or even strengthen their existing belief.” — From “Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts”

Overall, examples of the confirmation bias appear in various domains. These examples illustrate the many ways in which it can affect people, and show that this bias is highly prevalent, including among trained professionals who are often assumed to assess information in a purely rational manner.

Psychology and causes of the confirmation bias

The confirmation bias can be attributed to two main cognitive mechanisms:

  • Challenge avoidance, which is the desire to avoid finding out that you’re wrong.
  • Reinforcement seeking, which is the desire to find out that you’re right.

These forms of motivated reasoning can be attributed to people’s underlying desire to minimize their cognitive dissonance, the psychological distress that occurs when people hold two or more contradictory beliefs simultaneously. Challenge avoidance reduces dissonance by reducing engagement with information that contradicts preexisting beliefs. Conversely, reinforcement seeking reduces dissonance by increasing engagement with information that affirms people’s sense of correctness, which can offset contradictory information encountered later.

Furthermore, the confirmation bias also occurs due to flaws in the way we test hypotheses. For example, when people try to find an explanation for a certain phenomenon, they tend to focus on only one hypothesis at a time and disregard alternatives, even when they have no emotional stake in confirming the initial hypothesis. Rather than checking whether that hypothesis is actually true, they simply try to prove it, and so ignore the possibility that the information they encounter could disprove it or support an alternative hypothesis.

An example of this is a doctor who forms an initial diagnosis of a patient, and who then focuses solely on trying to prove that this diagnosis is right, instead of trying to actively determine whether alternative diagnoses could make more sense.

This explains why people can experience unmotivated confirmation bias in situations where they have no emotional reason to favor a specific hypothesis over others. This is contrasted with a motivated confirmation bias, which occurs when the person displaying the bias is motivated by some emotional consideration.

Finally, the confirmation bias can also be attributed to a number of additional causes. For example, in the case of the motivated confirmation bias, an additional reason why people experience the bias is that the brain sometimes suppresses neural activity in areas associated with emotional regulation and emotionally neutral reasoning. This causes people to process information based on how their emotions guide them to, rather than based on how their logic would guide them.

Overall, people experience the confirmation bias primarily because they want to minimize psychological distress, and specifically due to challenge avoidance, which is the desire to avoid finding out that they’re wrong, and reinforcement seeking, which is the desire to find out that they’re right. Furthermore, people can also experience the confirmation bias due to other causes, such as the flawed way they test hypotheses, as in the case where people fixate on confirming a single hypothesis while ignoring alternatives.

Note: Some of the behaviors that people engage in due to the confirmation bias can be viewed as a form of selective exposure. This involves people choosing to engage only with information that supports their preexisting beliefs and decisions, while ignoring information that contradicts them.

How to reduce the confirmation bias

Reducing other people’s confirmation bias

There are various things that you can do to reduce the influence that the confirmation bias has on people. These methods generally revolve around trying to counteract the cognitive mechanisms that promote the confirmation bias in the first place.

As such, these methods generally involve trying to get people to overcome their tendency to focus on and prefer confirmatory information, or their tendency to avoid and reject challenging information, while also encouraging them to conduct a valid reasoning process.

Specifically, the following are some of the most notable techniques that you can use to reduce the confirmation bias in people:

  • Explain what the confirmation bias is, why we experience it, how it affects us, and why it can be a problem, potentially using relevant examples. Understanding this phenomenon better can motivate people to avoid it, and can help them deal with it more effectively, by helping them recognize when and how it affects them. Note that in some cases, it may be beneficial to point out the exact way in which a person is displaying the confirmation bias.
  • Make it so that the goal is to find the right answer, rather than defend an existing belief. For example, consider a situation where you’re discussing a controversial topic with someone, and you know for certain that they’re wrong. If you argue hard against them, that might cause them to get defensive and feel that they must stick by their initial stance regardless of whatever evidence you show them. Conversely, if you state that you’re just trying to figure out what the right answer is, and discuss the topic with them in a friendly manner, that can make them more open to considering the challenging evidence that you present. In this case, your goal is to frame your debate as a journey that you go on together in search of the truth, rather than a battle where you fight each other to prove the other wrong. The key here is that, when it comes to a joint journey, both of you can be “winners”, while in the case of a battle, only one of you can, and the other person will often experience the confirmation bias to avoid feeling that they were the “loser”.
  • Minimize the unpleasantness and issues associated with finding out that they’re wrong. In general, the more unpleasant and problematic being wrong is, the more a person will use the confirmation bias to stick by their initial stance. There are various ways in which you can make the experience of being wrong less unpleasant or problematic, such as by emphasizing the value of learning new things, and by avoiding mocking people for having held incorrect beliefs.
  • Encourage people to avoid letting their emotional response dictate their actions. Specifically, explain that while it’s natural to want to avoid challenges and seek reinforcement, letting these feelings dictate how you process information and make decisions is problematic. This means, for example, that if you feel that you want to avoid a certain piece of information, because it might show that you’re wrong, then you should realize this, but choose to see that information anyway.
  • Encourage people to give information sufficient consideration. When it comes to avoiding the confirmation bias, it often helps to engage with information in a deep and meaningful way, since shallow engagement can lead people to rely on biased intuitions, rather than on proper analytical reasoning. There are various things that people can do to ensure that they give information sufficient consideration, such as spending a substantial amount of time considering it, or interacting with it in an environment that has no distractions.
  • Encourage people to avoid forming a hypothesis too early. Once people have a specific hypothesis in mind, they often try to confirm it, instead of trying to formulate and test other possible hypotheses. As such, it can often help to encourage people to process as much information as possible before forming their initial hypothesis.
  • Ask people to explain their reasoning. For example, you can ask them to clearly state what their stance is, and what evidence has caused them to support that stance. This can help people identify potential issues in their reasoning, such as that their stance is unsupported.
  • Ask people to think about various reasons why their preferred hypothesis might be wrong. This can help them test their preferred hypothesis in ways that they might not otherwise, and can make them more likely to accept and internalize challenging information.
  • Ask people to think about alternative hypotheses, and why those hypotheses might be right. Similarly to asking people to think about reasons why their preferred hypothesis might be wrong, this can encourage people to engage in a proper reasoning process, which they might not do otherwise. Note that, when doing this, it is generally better to focus on a small number of alternative hypotheses, rather than a large number of them.

Different techniques will be more effective for reducing the confirmation bias in different situations, and it is generally most effective to use a combination of techniques, while taking into account relevant situational and personal factors.

Furthermore, in addition to the above techniques, which are aimed at reducing the confirmation bias in particular, there are additional debiasing techniques that you can use to help people overcome their confirmation bias. This includes, for example, getting people to slow down their reasoning process, creating favorable conditions for optimal decision making, and standardizing the decision-making process.

Overall, to reduce the confirmation bias in others, you can use various techniques that revolve around trying to counteract the cognitive mechanisms that promote the confirmation bias in the first place. This includes, for example, making people aware of this bias, making discussions be about finding the right answer instead of defending an existing belief, minimizing the unpleasantness associated with being wrong, encouraging people to give information sufficient consideration, and asking people to think about why their preferred hypothesis might be wrong or why competing hypotheses could be right.

Reducing your own confirmation bias

To mitigate the confirmation bias in yourself, you can use similar techniques to those that you would use to mitigate it in others. Specifically, you can do the following:

  • Identify when and how you’re likely to experience the bias.
  • Maintain awareness of the bias in relevant situations, and even actively ask yourself whether you’re experiencing it.
  • Figure out what kind of negative outcomes the bias can cause for you.
  • Focus on trying to find the right answer, rather than on proving that your initial belief was right.
  • Avoid feeling bad if you find out that you’re wrong; for example, try to focus on having learned something new that you can use in the future.
  • Don’t let your emotions dictate how you process information, particularly when it comes to seeking confirmation or avoiding challenges to your beliefs.
  • Dedicate sufficient time and mental effort when processing relevant information.
  • Avoid forming a hypothesis too early, before you’ve had a chance to analyze sufficient information.
  • Clearly outline your reasoning, for example by identifying your stance and the evidence that you’re basing it on.
  • Think of reasons why your preferred hypothesis might be wrong.
  • Come up with alternative hypotheses, as well as reasons why those hypotheses might be right.

An added benefit of many of these techniques is that they can help you understand opposing views better, which is important when it comes to explaining your own stance and communicating with others on the topic.

In addition, you can also use general debiasing techniques, such as standardizing your decision-making process and creating favorable conditions for assessing information.

Furthermore, keep in mind that, as is the case with reducing the confirmation bias in others, different techniques will be more effective than others, both in general and in particular circumstances. You should take this into account, and try to find the approach that works best for you in any given situation.

Finally, note that in some ways, debiasing yourself can be easier than debiasing others, since other people are often not as open to your debiasing attempts as you yourself are. At the same time, however, debiasing yourself is also more difficult in some ways, since we often struggle to notice our own blind spots, and to identify areas where we are affected by cognitive biases in general, and the confirmation bias in particular.

Overall, to reduce the confirmation bias in yourself, you can use similar techniques to those that you would use to reduce it in others. This includes, for example, maintaining awareness of this bias, focusing on trying to find the right answer rather than proving that you were right, dedicating sufficient time and effort to analyzing information, clearly outlining your reasoning, thinking of reasons why your preferred hypothesis might be wrong, and coming up with alternative hypotheses.

Additional information

Related cognitive biases

There are many cognitive biases that are closely associated with the confirmation bias, either because they involve a similar pattern of reasoning, or because they occur, at least partly, due to an underlying confirmation bias.

For example, there is the backfire effect, which is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance. This bias can, for instance, cause people to increase their support for a political candidate after they encounter negative information about that candidate, or to strengthen their belief in a scientific misconception after they encounter evidence that highlights the issues with that misconception. The backfire effect is closely associated with the confirmation bias, since it involves the rejection of challenging evidence, with the goal of confirming one’s original beliefs.

Another example of a cognitive bias that is closely related to the confirmation bias is the halo effect, which causes people’s impression of someone or something in one domain to influence their impression of them in other domains. This bias can, for instance, cause people to assume that if someone is physically attractive, then they must also have an interesting personality, or it can cause people to give higher ratings to an essay if they believe that it was written by an attractive author. The halo effect is closely associated with the confirmation bias, since it can be attributed in some cases to people’s tendency to confirm their initial impression of someone, by forming later impressions of them in a biased manner.

The origin and history of the confirmation bias

The term ‘confirmation bias’ was first used in a 1977 paper titled “Confirmation bias in a simulated research environment: An experimental study of scientific inference”, published by Clifford R. Mynatt, Michael E. Doherty, and Ryan D. Tweney in the Quarterly Journal of Experimental Psychology (Volume 29, Issue 1, pp. 85–95). However, as the authors themselves note, evidence of the confirmation bias can be found earlier in the psychological literature.

Specifically, the following passage is the abstract of the paper that coined the term. It outlines the work presented in the paper, and also notes the existence of prior work on the topic:

“Numerous authors (e.g., Popper, 1959) argue that scientists should try to falsify rather than confirm theories. However, recent empirical work (Wason and Johnson-Laird, 1972) suggests the existence of a confirmation bias, at least on abstract problems. Using a more realistic, computer controlled environment modeled after a real research setting, subjects in this study first formulated hypotheses about the laws governing events occurring in the environment. They then chose between pairs of environments in which they could: (1) make observations which would probably confirm these hypotheses, or (2) test alternative hypotheses. Strong evidence for a confirmation bias involving failure to choose environments allowing tests of alternative hypotheses was found. However, when subjects did obtain explicit falsifying information, they used this information to reject incorrect hypotheses.”

In addition, a number of other past studies are discussed in the paper:

“Examples abound of scientists clinging to pet theories and refusing to seek alternatives in the face of large amounts of contradictory data (see Kuhn, 1970). Objective evidence, however, is scant. Wason (1968a) has conducted several experiments on inferential reasoning in which subjects were given conditional rules of the form ‘If P then Q’, where P was a statement about one side of a stimulus card and Q a statement about the other side. Four stimulus cards, corresponding to P, not-P, Q, and not-Q were provided. The subjects’ task was to indicate those cards—and only those cards—which had to be turned over in order to determine if the rule was true or false. Most subjects chose only P, or P and Q. The only cards which can falsify the rule, however, are P and not-Q. Since the not-Q card is almost never selected, the results indicate a strong tendency to seek confirmatory rather than disconfirmatory evidence. This bias for selecting confirmatory evidence has proved remarkably difficult to eradicate (see Wason and Johnson-Laird, 1972, pp. 171–201). In another set of experiments, Wason (1960, 1968b, 1971) also found evidence of failure to consider alternative hypotheses. Subjects were given the task of recovering an experimenter defined rule for generating numerical sequences. The correct rule was a very general one and, consequently, many incorrect specific rules could generate sequences which were compatible with the correct rule. Most subjects produced a few sequences based upon a single, specific rule, received positive feedback, and announced mistakenly that they had discovered the correct rule. With some notable exceptions, what subjects did not do was to generate and eliminate alternative rules in a systematic fashion. Somewhat similar results have been reported by Miller (1967). Finally, Mitroff (1974), in a large-scale non-experimental study of NASA scientists, reports that a strong confirmation bias existed among many members of this group. He cites numerous examples of these scientists’ verbalizations of their own and other scientists’ obduracy in the face of data as evidence for this conclusion.”
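The card logic in the passage above can be verified mechanically. The following sketch uses our own encoding, not the paper’s: each card pairs a P-side with a Q-side, and the rule “If P then Q” fails only on a (P, not-Q) card. It confirms that just two of the four cards are worth turning over.

```python
# Mechanical check of the selection-task logic (our own encoding).
# Each card has a P-side and a Q-side; the rule "If P then Q" is
# violated exactly by a (P, not-Q) card.

def must_turn(visible: str) -> bool:
    """True if some hidden face could reveal a violation of the rule."""
    if visible in ("P", "not-P"):
        possible_hidden = ("Q", "not-Q")
        return any(visible == "P" and hidden == "not-Q"
                   for hidden in possible_hidden)
    possible_hidden = ("P", "not-P")
    return any(hidden == "P" and visible == "not-Q"
               for hidden in possible_hidden)

for face in ("P", "not-P", "Q", "not-Q"):
    print(f"{face}: turn over? {must_turn(face)}")
# Only P and not-Q are informative; Q, the popular choice, is not.
```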

Summary and conclusions

  • The confirmation bias is a cognitive bias that causes people to search for, favor, interpret, and recall information in a way that confirms their preexisting beliefs.
  • The confirmation bias affects people in every area of life; for example, it can cause people to disregard negative information about a political candidate that they support, or to only pay attention to news articles that support what they already think.
  • People experience the confirmation bias due to various reasons, including challenge avoidance (the desire to avoid finding out that they’re wrong), reinforcement seeking (the desire to find out that they’re right), and flawed testing of hypotheses (e.g., fixating on a single explanation from the start).
  • To reduce the confirmation bias in yourself and in others, you can use various techniques that revolve around trying to counteract the cognitive mechanisms that promote the confirmation bias in the first place.
  • Relevant debiasing techniques you can use include maintaining awareness of this bias, focusing on trying to find the right answer rather than being proven right, dedicating sufficient time and effort to analyzing relevant information, clearly outlining the reasoning process, thinking of reasons why a preferred hypothesis might be wrong, and coming up with alternative hypotheses and reasons why those hypotheses might be right.

Other articles you may find interesting:

  • The Backfire Effect: Why Facts Don't Always Change Minds
  • Cherry Picking: When People Ignore Evidence that They Dislike
  • Belief Bias: When People Rely on Beliefs Rather Than Logic

Understanding confirmation bias in research

Last updated: 30 August 2023

One of the biggest challenges of conducting a meaningful study is removing bias. Some forms of bias are easier than others to identify and remove.

One of the forms that's hardest for us to recognize in ourselves is confirmation bias.

In this article, you'll learn what confirmation bias is, the forms it takes, and how to begin removing it from your research. 

  • History of confirmation bias

Awareness of bias goes back as far as Aristotle and Plato. Aristotle noticed people are more likely to believe arguments that support their bias. Plato noticed the challenge of overcoming bias when seeking the truth. While neither called this “confirmation bias,” they were certainly aware of its effects.

The first psychological evidence of confirmation bias came from an experiment conducted by psychologist Peter Wason. Subjects were asked to guess a rule regarding a sequence of numbers. Participants could test any numbers they wanted before guessing what the rule was. However, most only tested the numbers that confirmed their initial guess.
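The structure of that task is easy to reproduce. The sketch below is an illustrative reconstruction, not Wason’s actual materials: the experimenter’s rule is assumed to be “any ascending sequence,” while the participant guesses the narrower rule “add 2 each time.” Positive tests can never expose the mismatch; only a triple that violates the guess can.

```python
# An illustrative reconstruction of Wason's number-rule task. The
# experimenter's rule is broader than the participant's hypothesis,
# so confirming tests alone can never expose the mismatch.

def experimenter_rule(t):          # the true rule: any ascending triple
    return t[0] < t[1] < t[2]

def participant_hypothesis(t):     # the guessed rule: "add 2 each time"
    return t[1] - t[0] == 2 and t[2] - t[1] == 2

positive_tests = [(2, 4, 6), (10, 12, 14), (1, 3, 5)]   # fit the guess
probe_tests = [(1, 2, 3), (2, 4, 20), (6, 4, 2)]        # violate it

for t in positive_tests + probe_tests:
    print(t, "guess says:", participant_hypothesis(t),
          "experimenter says:", experimenter_rule(t))

# Every positive test gets "yes" from both rules, so the wrong guess is
# never challenged. The probe (1, 2, 3) gets "yes" from the experimenter
# despite breaking the guess - exactly the disconfirming test that most
# participants never ran.
```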

  • Types of confirmation bias

Confirmation bias comes in many forms. Although the result is a failure to get the complete picture of a given research area, understanding the ways this bias presents itself can help you avoid it in your methodologies.

The biases that may impact research can be grounded in beliefs found outside the lab, so you'll need to evaluate how all your preconceived notions may play a role in skewing your research. 

Information selection bias

Information selection bias occurs when you seek out information that supports your existing beliefs. This is often done subconsciously. Information that allows someone to feel correct is more enjoyable to consume than information that challenges strongly held beliefs. This can also cause you to ignore or dismiss viewpoints that don't align with the way you think. 

For the purposes of this type of confirmation bias, information doesn't just mean news sources or scientific studies. The people you spend time with are a major source of information about the world. Selecting friend groups that don't challenge your beliefs can be a significant source of confirmation bias.

Example: Social media echo chambers

Many people carefully cultivate their social media feeds. Social media can be a challenging environment, with dissenting opinions treated as unfathomable evil, rather than mere disagreement. This can create particularly strong echo chambers that enforce an equally strong resistance to understanding the perspective of those who disagree with you.

Social scientists need to be aware of how these biases may impact their conclusions.

Interpretation bias

Data can often be interpreted in more ways than one. With motivated reasoning, even clear data can be distorted to better align with your views. When data is misrepresented to fit a particular line of reasoning, it's known as interpretation bias.

A common form of interpretation bias is when the researcher places emphasis on data that supports a preconceived notion and downplays data that doesn't.

Example: Biased interpretations of scientific studies

Whether it's a study you've conducted or one that's guiding your research, it's easy to focus on the parts that reinforce what you already believe and ignore the parts that don't.

However, doing so can prevent you from finding evidence that would disprove your theory and make it difficult to solve the problem at hand. 

Memory bias

The propensity to downplay disconfirming data can hurt research in the moment, but it can also have knock-on effects later. Data that confirms your biases will stick in your mind, while data that doesn't can fade away.

When confirmation bias appears in this form, it's called memory bias. This type of bias can be harder to recognize on a particular project because you can't be aware of something you don't remember.

Example: Cherry-picking scientific studies

A big part of conducting research is relying on work that others have done before you. A review of the literature can guide your research and help you form conclusions. But if you recall, and therefore cite, only the studies that support your hypothesis, your review will be cherry-picked from the start, and your conclusions will rest on a skewed sample of the evidence.

Confirmation-seeking bias

Wason's experiment, described earlier, is an example of confirmation-seeking bias. The subjects only tested the rule they believed to be the case and didn't properly explore the options. As a result, they came to the wrong conclusion.

This can come in the form of poorly designed experiments or searching only for data and research that confirms your views. In its most extreme form, balanced or disconfirming sources are purposefully ignored or dismissed to confirm a bias instead of answering a research question.

Example: Looking for news sources that align with political views

Here's another example from outside the lab, and one to which political scientists may be particularly susceptible. Increasingly, news sources serve a particular ideological bent. Many people rely only on sources that paint a one-sided picture of the socio-political landscape.

While we're good at recognizing this behavior in others, we aren't so good at recognizing it in ourselves.

  • Impact of confirmation bias

The impacts of confirmation bias over which you have the most control are those that affect you directly. These will weaken the results of your research if you aren't careful to recognize and avoid your biases.

Some common impacts of confirmation bias are:

Biased hypotheses: Confirmation bias can lead you to form a hypothesis based more on existing beliefs than meaningful data, biasing the project from the start.

Data collection and interpretation: During the data collection phase, you may unconsciously focus on data that supports your hypotheses, leading to a distorted representation of the findings.

Selective reporting: In more extreme cases of confirmation bias, you may choose to only report on the findings that confirm your beliefs.

Misinterpretation of results: You may read ambiguous or inconclusive findings as support for your hypothesis when, without the bias, you would have treated them more cautiously.

Poor study design: You may unintentionally design experiments in ways where results are more likely to confirm a hypothesis instead of looking for a more balanced design.

Some impacts of confirmation bias affect the scientific community more broadly. When a given field is dominated by a particular ideology or belief system, several negative consequences can arise from the resulting confirmation bias.

Publication bias: Studies that align more closely with prevailing points of view or wisdom may be more likely to get published than those that push against them, regardless of the strength of the research.

Peer review and feedback: Both sides of peer review can suffer from confirmation bias. Reviewers may be more dismissive of studies they disagree with, and too lenient on those they agree with. Authors may be less likely to accept valid criticism that challenges their beliefs.

Replication issues: The best way to prove the validity of a given piece of research is for someone else to replicate it. If confirmation bias played a role in the results, those without the bias might have difficulty replicating it, resulting in the type of replication crisis we've seen some fields experience.
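A short simulation illustrates how selective reporting feeds publication bias and replication failures. In the hypothetical setup below (which assumes NumPy and SciPy are installed), both groups are drawn from the same distribution, so every “significant” result is a false positive; report only those, and the literature appears to support an effect that does not exist.

```python
# A hypothetical demonstration of selective reporting. Both groups are
# drawn from the same distribution, so there is no real effect and every
# "significant" result is a false positive.

import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(seed=42)
n_experiments, reported = 200, 0

for _ in range(n_experiments):
    control = rng.normal(loc=0.0, scale=1.0, size=30)
    treatment = rng.normal(loc=0.0, scale=1.0, size=30)  # no true effect
    _, p_value = ttest_ind(control, treatment)
    if p_value < 0.05:
        reported += 1  # only the "confirming" experiments get written up

print(f"{reported}/{n_experiments} null experiments look like real effects")
# Roughly 5% cross the threshold by chance; publish only those, and the
# field will struggle to replicate a nonexistent effect.
```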

  • Signs of confirmation bias

Understanding the signs of confirmation bias can help people recognize it in themselves and try to work past it. Confirmation bias can be a complex phenomenon, as evidenced by the numerous forms it can take.

Ignoring contradictory evidence

Unfortunately, it isn't uncommon for people to ignore evidence that contradicts their preconceived notions. Because everyone is guilty of this to some extent, it's important to know which signs to look out for, so you can catch yourself when it happens to you.

Some common signs of confirmation bias include:

Selectively focusing on data that supports your position while neglecting conflicting data

Deliberately avoiding situations that might expose you to opposing viewpoints

Suppressing or dismissing evidence that causes discomfort due to conflicting beliefs

Selective exposure to information

It's easiest to ignore disconfirming evidence if you never see it in the first place. Selective exposure to information is a major problem for those who want to get both sides of the picture and ensure their conclusions are based on fact and not bias.

Here are some signs you're guilty of selective exposure to information:

Actively seeking out sources that confirm your existing beliefs

Unconsciously avoiding information that challenges your worldview

Preferring news outlets and websites that align with your personal opinions

Over-relying on anecdotal evidence

There's a saying that the plural of “anecdote” isn't “data.” Yet, many people treat anecdotal evidence as more concrete than hard data when the anecdotes fit their preferred narrative. Some ways you may catch yourself falling into this trap are:

Giving more weight to personal stories or experiences than concrete data

Being swayed by emotionally charged stories that resonate with your current beliefs

Drawing conclusions from individual experiences to make broader claims

Misinterpreting ambiguous information

The human brain has a habit of filling in gaps. When presented with ambiguous information, there are plenty of gaps to fill. Almost always, the mind will fill these gaps with information that supports an existing belief system.

The signs you're guilty of this include:

Assigning meaning to ambiguous information that confirms your preexisting beliefs

Interpreting ambiguous external stimuli in a way that aligns with your existing notions

Incorrectly attributing motives or intentions to ambiguous actions to fit your assumptions

Group polarization and echo chambers

When you spend most of your time around people who agree with you, you limit the number of alternative perspectives you are exposed to. When everyone around you agrees with you, it can create the false impression that much of the broader population holds the same opinion.

The following signs may indicate a lack of diversity in your relationships:

In a group setting, the people you spend time with reinforce each other's beliefs more often than not

You spend time in online and offline communities that all share the same views on a subject

The people you spend time with tend to vilify those with different opinions

  • How to avoid confirmation bias in research

The purpose of research should be to find the truth or to solve a problem. Neither can be accomplished if you're merely reinforcing your own, possibly false, beliefs.

We’ve already looked at some ways to identify and potentially avoid confirmation bias. Here are some more proactive measures you can take to be more sure your results are sound:

Acknowledging personal biases: The first step is to understand which way you may want the research to go. Then you'll be better equipped to design experiments that test your idea rather than simply confirm it.

Actively seeking diverse perspectives: Intellectual diversity is a powerful way to fight confirmation bias. Although the bias itself may lead you to push away those with differing beliefs, taking them into account is the best way to shape your own.

Engaging with contradictory information: Similarly, you must seek out information that disconfirms your hypothesis. What arguments and data are against it? By taking those into account in your research, you can better test which theories are true.

Using critical thinking and skepticism: A great way to combat confirmation bias is to treat findings that confirm your suspicions with the same scrutiny you would those that disconfirm them.

Employing rigorous research methods: Putting strict protocols in place and using a robust, predefined statistical analysis of the data, if applicable, can help counteract the bias you bring to the research (see the sketch after this list).

Peer review: Just as you sought diverse perspectives when designing and conducting the research, have a trusted neutral party review your work for any signs of bias.

Continuous learning and self-improvement: As the Virginia Tech researchers showed, confirmation bias is part of how our brain works. Working to remove it takes continuous effort to better identify and mitigate it.
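To make the point about rigorous methods concrete, here is a minimal Python sketch using entirely invented data: a preregistered, two-sided significance test. The group names, sample sizes, and effect are hypothetical; the only claim is that fixing the analysis before seeing the data leaves less room for confirmation bias.

```python
# A minimal sketch, with hypothetical data, of one rigorous-methods
# safeguard: a preregistered, two-sided significance test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical outcome scores for a control group and a treatment group.
control = rng.normal(loc=50.0, scale=10.0, size=80)
treatment = rng.normal(loc=52.0, scale=10.0, size=80)

# A two-sided Welch's t-test asks "is there any difference?", not
# "is there the difference I hoped for?" The direction is not assumed.
t_stat, p_value = stats.ttest_ind(treatment, control, equal_var=False)

print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
# Committing to the test, its direction, and the significance threshold
# before seeing the data removes the temptation to pick whichever
# analysis happens to confirm the hypothesis.
```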


Why do we favor our existing beliefs?

What is confirmation bias?

The confirmation bias describes our underlying tendency to notice, focus on, and give greater credence to evidence that fits with our existing beliefs.


Where this bias occurs


Consider the following hypothetical situation: Jane is the manager of a local coffee shop. She is a firm believer in the motto, “hard work equals success.” The coffee shop, however, has seen a slump in sales over the past few months. Since Jane strongly believes that “hard work” is a means to success, she concludes that the dip in the coffee shop’s sales is because her staff is not working hard enough. To account for this, Jane puts several measures in place to ensure that her staff is working consistently. Consequently, she ends up spending more money by having a greater number of employees staffed on a shift, exceeding the shop’s budget and thus contributing to overall losses.

Consulting with other business owners in her area, Jane is able to identify her store’s new, less visible location as the primary cause of her sales slump. Her belief in hard work as the most important metric of success led her to mistakenly identify employees’ lack of effort as the reason for the store’s falling revenue while ignoring evidence that pointed to the true cause: the shop’s poor location. Jane has fallen victim to confirmation bias, which caused her to notice and give greater credence to evidence that fits with her pre-existing beliefs.

As this example illustrates, our personal beliefs can weigh us down when conflicting information is present. Not only can they stop us from finding a solution; they may keep us from even identifying the problem to begin with.

Individual effects

Confirmation bias can lead to poor decision-making because it distorts the reality from which we draw evidence. Under experimental conditions, decision-makers have been shown to actively seek out and assign greater value to information that confirms their existing beliefs rather than evidence that supports new ideas.

Confirmation bias also has implications for our interpersonal relationships; in particular, first impressions cause us to attend selectively to our peers’ subsequent behavior. Once we have an expectation about a person, we try to reinforce this belief through our later interactions with them. In doing so, we can appear “closed-minded” or, conversely, remain in relationships that do not serve us.

Systemic effects

Considering the bigger picture, confirmation bias can have troubling implications. Major social divides and stalled policy-making may begin with our tendency to favor information that confirms our existing beliefs and to ignore evidence that does not. The more entrenched we become in our preconceptions, the greater the influence confirmation bias has on our behavior and, consequently, on the people we choose to surround ourselves with. We can trap ourselves in a kind of echo chamber where, without being challenged, biased thoughts prevail. This is especially concerning for socio-political cooperation and unity among the population.

Confirmation bias can exacerbate social exclusion and tension. In-group bias is the tendency to favor those with whom you identify and to assign them positive characteristics; the same inclination is not extended to the out-group, the people with whom you feel you have less in common. Combined with confirmation bias, this creates ample opportunity for prejudgment and stereotyping. Confirmation bias may lead us to look for favorable traits in our in-group while overlooking its shortcomings. It may also cause us to be wary of the out-group and to interpret its behavior through the lens of what we already assume.

Confirmation bias is particularly present in the consumption of news and media. The ever-growing ease of access has allowed people to personally curate what they consume. While it is evident that people cling to sources that support their political orientation, confirmation bias can also influence how news is reported. Journalists and media outlets are not immune to bias: they too are selective about their sources, what they choose to present, and how that information is conveyed. 2 Zooming out, these outlets and their leanings can strongly influence consumers’ knowledge, beliefs, and even voting patterns.

How it affects product

Marketing and reviews are where we can see the largest influence of confirmation bias as it pertains to products. Most consumers rely on product reviews and advertisements to tell them about the benefits of various items. Influencers and celebrities, for example, are an effective way to promote products: they can expose new people to a brand and broaden its customer demographic. However, it is important to be careful about who becomes part of your promotional campaign. Using a controversial figure to recommend your product can damage the brand’s reputation: if any of your clients think poorly of your chosen endorser, their first impression of your company will be a negative one. This is confirmation bias at work: if we dislike a celebrity who endorses a product, we are more likely to attend to information that suggests we will also dislike the product.

Consumers will often consult reviews before buying a product to get a sense of whether the item will be useful and valuable. If their research primes them with an abundance of positive reviews, they may then seek out confirming evidence when using the product themselves.

Confirmation Bias and AI

When using artificial intelligence, we are in control of how we prompt the system. While these tools are meant to produce unbiased and objective information, the individual using them may steer the response in a direction that coincides with their preexisting beliefs. For example, if you are using AI software to research different political candidates, the manner in which you ask the question matters. Depending on the tool you use, “Why should I vote for X instead of Y?” and “What are the strengths of candidate X and candidate Y?” will turn up very different results. Depending on what we “want to hear,” we may unconsciously prompt the system to reinforce our initial thought pattern.
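As a rough illustration of how framing steers the answer, consider the sketch below. The ask_model function is a hypothetical placeholder rather than any real API; the contrast between the two prompts is the point.

```python
# A sketch of prompt framing and confirmation bias. `ask_model` is a
# hypothetical stand-in for a real chat-completion client.

def ask_model(prompt: str) -> str:
    # Placeholder: a real implementation would call an LLM API here.
    return f"[model response to: {prompt!r}]"

# A leading prompt presupposes the conclusion and invites confirmation.
leading = "Why should I vote for candidate X instead of candidate Y?"

# A neutral prompt asks for balanced evidence on both candidates.
neutral = ("What are the main strengths and weaknesses of "
           "candidate X and candidate Y?")

for prompt in (leading, neutral):
    print(ask_model(prompt))
```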

As mentioned, though we like to think of AI as unbiased, the reality may be a little murkier. Artificial intelligence learns from large data sets, and because of the size and breadth of those data sets, they may reflect the biases present in the world around us. While this may be harmless in some situations, it can also perpetuate negative stereotypes or push a certain narrative as a result of the data used to train the model.

Why it happens

Confirmation bias is a cognitive shortcut we use when gathering and interpreting information. Evaluating evidence takes time and energy, and so our brain looks for shortcuts to make the process more efficient.

Confirmation bias is aided by several processes that all act on different stages to protect the individual from cognitive dissonance or the discomfort associated with the violation of one’s beliefs. These processes include:

  • Selective exposure refers to the filtering of information: the individual simply avoids challenging or contradictory information.
  • Selective perception occurs when the individual is exposed to information that conflicts with their standing beliefs, yet manipulates the information to affirm their existing views.
  • Selective retention, a major principle in marketing, holds that individuals are more likely to remember information that has been presented to them if it is consistent with what they already know to be true. 3

Our brains use shortcuts

Heuristics are the mental shortcuts that we use for efficient, though sometimes inaccurate, decision-making. Though it is debated whether confirmation bias can be categorized as a heuristic, it is certainly a cognitive strategy: it helps us avoid cognitive dissonance by searching for and attending to information that we already believe.

It makes sense that we do this. Humans often need to make sense of information quickly; however, forming new explanations or beliefs takes time and effort. We have adapted to take the path of least resistance, sometimes out of necessity.

Imagine our ancestors hunting. An intimidating animal charges toward them, and they have only a few seconds to decide whether to hold their ground or run. There is no time to weigh all the variables involved in a fully informed decision; past experience and instinct might lead them to judge the animal’s size and run, even though the presence of other hunters tilts the odds of a successful confrontation in their favor. Evolutionary psychologists believe that the modern use of mental shortcuts for in-the-moment decision-making is rooted in these past survival instincts. 1

It makes us feel good about ourselves

No one likes to be proven wrong, and when information is presented that violates our beliefs, it is only natural to push back. Deeply held views often form our identities, so disproving them can be uncomfortable; we might even believe that being wrong suggests a lack of intelligence. As a result, we often look for information that supports rather than refutes our existing beliefs.

We can also see the effects of confirmation bias in group settings. Psychologist Jennifer Lerner, in collaboration with political psychologist Philip Tetlock, proposed that through our interactions with others, we update our beliefs to conform to the group norm. The two distinguished between confirmatory thought, which seeks to rationalize a particular belief, and exploratory thought, which considers many viewpoints before settling on a position.

Confirmatory thought in interpersonal settings can produce groupthink, in which the desire for conformity results in dysfunctional decision-making. So, while confirmation bias is often an individual phenomenon, it can also take hold in groups of people.

Why it is important

As mentioned above, confirmation bias can be expressed individually or in a group context. Both can be problematic and deserve careful attention.

At the individual level, confirmation bias affects our decision-making. Our choices cannot be fully informed if we are only focusing on evidence that confirms our assumptions. Confirmation bias causes us to overlook pivotal information both in our careers and in everyday life. A poorly informed decision is likely to produce suboptimal results because not all of the potential alternatives have been explored.  

A voter might stand by a candidate while dismissing emerging facts about the candidate’s poor behavior. A business executive might fail to investigate a new opportunity because of a negative experience with similar ideas in the past. An individual who sustains this sort of thinking may be labeled “closed-minded.” Because confirmation bias can cause us to miss out on opportunities and make less informed choices, it is important to approach situations, and the decisions they call for, with an open mind.

At a group level, it can produce and sustain the groupthink phenomenon. In a culture of groupthink, decision-making can be hindered by the assumption that harmony and group coherence are the values most crucial to success. This reduces the likelihood of disagreement within the group.

Imagine if an employee at a technology firm did not disclose a revolutionary discovery she made for fear of reorienting the firm’s direction. Likewise, this bias can prevent people from becoming informed about differing views and, by extension, from engaging in the constructive discussion that many democracies are built on.

How to avoid it

Confirmation bias is likely to occur when we are gathering information for decision-making. It occurs subconsciously, meaning that we are unaware of its influence on our decision-making.

As such, the first step to avoiding confirmation bias is being aware that it is a problem. By understanding its effects and how it works, we are more likely to identify it in our decision-making. Psychology professor and author Robert Cialdini suggests several approaches to recognizing when these biases are influencing our decisions:

First, listen to your gut feeling. We often have a physical reaction to uncomfortable stimuli, like when a salesperson is pushing us too far. Even if we have complied with similar requests in the past, we should not use that precedent as a reference point. Recall past actions and ask yourself: “Knowing what I know now, if I could go back in time, would I make the same commitment?”

Second, because the bias is most likely to occur early in the decision-making process, we should focus on starting from a neutral fact base. This can be achieved by diversifying where we get our information and consulting multiple sources. Though it is difficult to find perfectly objective reporting, reaching for reputable, neutral outlets gives us more agency over our beliefs.

Third, when hypotheses are being drawn from assembled data, decision-makers should consider holding interpersonal discussions that explicitly aim to identify individual cognitive biases in how hypotheses are selected and evaluated. Engaging in debate is a productive way to challenge our views and expose ourselves to information we might otherwise have avoided.

While it is likely impossible to eliminate confirmation bias completely, these measures may help manage cognitive bias and make better decisions in light of it.

How it all started

Confirmation bias was known to the ancient Greeks. It was described by the classical historian Thucydides in his History of the Peloponnesian War: “It is a habit of mankind to entrust to careless hope what they long for and to use sovereign reason to thrust aside what they do not want.” 4

In the 1960s, Peter Wason first described this phenomenon as confirmation bias. In what is known as Wason’s selection task, he presented participants with four cards, each with a number on one side and a color (red or brown) on the other. Two cards lay number-up, reading 3 and 8; the other two lay color-up, one red and one brown. Participants were told that if the number on a card was even, its opposite side would be red. They were then asked to test whether this rule was true by flipping over two cards of their choosing.

Many of the participants chose to turn over the card showing the number 8 as well as the red card, as this was consistent with the rule they were given. In reality, this does little to actually test the rule. Turning over the “8” card can confirm what the experimenter said, but one also needs to turn over the brown card to verify that its hidden number is odd.

This experiment demonstrates confirmation bias in action: we seek to confirm what we believe to be true while disregarding information that could potentially violate it. 5
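The logic of the task is compact enough to spell out in code. The sketch below encodes the four cards as described above; the encoding and helper function are my own illustration, not part of the original experiment.

```python
# A sketch of the logic behind Wason's selection task. Each card has a
# number on one side and a color on the other; the rule under test is
# "if the number is even, the opposite side is red". Only the visible
# face of each card is known.

cards = {
    "3":     {"number": 3,    "color": None},     # number visible
    "8":     {"number": 8,    "color": None},     # number visible
    "red":   {"number": None, "color": "red"},    # color visible
    "brown": {"number": None, "color": "brown"},  # color visible
}

def can_falsify(card: dict) -> bool:
    """True if turning this card over could reveal a rule violation,
    i.e. an even number paired with a non-red back."""
    if card["number"] is not None:       # number showing
        return card["number"] % 2 == 0   # only an even number is at risk
    return card["color"] != "red"        # only a non-red back is at risk

for face, card in cards.items():
    print(f"worth turning over '{face}'? {can_falsify(card)}")
# Only "8" and "brown" can falsify the rule. The intuitively popular
# "red" card can never disconfirm it, which is exactly the point:
# participants pick cards that could confirm, not cards that could refute.
```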

Example 1 – Blindness to our own faults

A major study carried out by researchers at Stanford University in 1979 explored the psychological dynamics of confirmation bias. The study was composed of undergraduate students who held opposing viewpoints on the topic of capital punishment. Unbeknownst to them, the participants were asked to evaluate two fictitious studies on the topic.

One of the fictitious studies provided data supporting the argument that capital punishment deters crime, while the other supported the opposing view: that capital punishment had no appreciable effect on overall criminality in the population.

While both studies were entirely fabricated by the Stanford researchers, they were designed to present “equally compelling” objective statistics. The researchers discovered that responses to the studies were heavily influenced by participants’ pre-existing opinions:

  • The participants who initially supported the deterrence argument in favor of capital punishment considered the anti-deterrence data unconvincing and thought the data in support of their position was credible;
  • Participants who held the opposing view at the beginning of the study reported the same but in support of their stance against capital punishment.

So, after being confronted both with evidence that supported capital punishment and evidence that refuted it, both groups reported feeling more committed to their original stance. The net effect of having their position challenged was a re-entrenchment of their existing beliefs. 6

Example 2 – Effects of the internet

The “filter bubble effect” is an example of technology amplifying and facilitating our cognitive tendency toward confirmation bias. The term was coined by internet activist Eli Pariser to describe the intellectual isolation that can occur when websites use algorithms to predict and present information a user would want to see. 7

The more we use particular websites and content networks, the more likely we are to encounter content we prefer, while algorithms exclude content that runs contrary to our preferences. We normally prefer content that confirms our beliefs because it requires less critical reflection. So filter bubbles may favor information that confirms your existing opinions and exclude disconfirming evidence from your online experience.

In his seminal book, “The Filter Bubble: What the Internet Is Hiding from You,” Pariser uses the example of internet searches during an oil spill to show the filter bubble effect:

"In the spring of 2010, while the remains of the Deepwater Horizon oil rig were spewing crude oil into the Gulf of Mexico, I asked two friends to search for the term ‘BP’. They’re pretty similar — educated, white, left-leaning women who live in the Northeast. But the results they saw were quite different. One of my friends saw investment information about BP. The other saw the news. For one, the first page results contained links about the oil spill; for the other, there was nothing about it except for a promotional ad from BP." 7

If this had been these women’s only source of information, they would surely have formed very different conceptions of the BP oil spill. The search engine tailored each set of results to the preferences suggested by each user’s past activity, picking the results it predicted they would want to see. Unbeknownst to them, it facilitated confirmation bias.

While the implications of this particular filter bubble may have been harmless, filter bubbles on social media platforms have been shown to influence elections by tailoring the content of campaign messages and political news to different subsets of voters. This could have a fragmenting effect that inhibits constructive democratic discussion, as different voter demographics become increasingly entrenched in their political views as a result of a curated stream of evidence that supports them.
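A toy ranking loop makes the mechanism concrete. Everything below (the articles, stances, and click counts) is invented; the sketch only illustrates how an engagement-driven score, with no editorial intent, is enough to bury disconfirming content.

```python
# A toy sketch of the feedback loop behind a filter bubble: items are
# ranked by similarity to what the user has already clicked, so agreeable
# content rises and disconfirming content sinks. All data is hypothetical.

click_history = {"pro_policy": 9, "anti_policy": 1}  # past clicks by stance

articles = [
    {"title": "Ten reasons the policy works",  "stance": "pro_policy"},
    {"title": "New data questions the policy", "stance": "anti_policy"},
    {"title": "Policy success stories",        "stance": "pro_policy"},
]

def score(article: dict) -> int:
    # Engagement-style scoring: weight by how often the user clicked
    # similar items before. The feedback loop alone narrows what is shown.
    return click_history.get(article["stance"], 0)

for article in sorted(articles, key=score, reverse=True):
    print(f"{score(article):>2}  {article['title']}")
# The disconfirming article ranks last, and each new click on the top
# items widens the gap on the next refresh.
```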


Related TDL articles

Can overcoming implicit gender bias boost a company’s bottom line?

This article argues that gender diversity in a firm is associated with higher firm performance. By addressing and drawing on confirmation bias (among other relevant psychological principles), firms may be able to increase diversity and thereby increase performance.

Learning Within Limits: How Curated Content Affects Education 

This article argues that the use of ‘trigger warnings’, modern preference algorithms, and other such cues creates a highly curated stream of information that facilitates cognitive biases such as confirmation bias. The author notes that this can prevent us from empathizing with others and consolidating our opinions in light of differing ones.

  • Healy, P. (2016, August 18). Confirmation bias: How it affects your organization and how to overcome it. Business Insights Blog. https://online.hbs.edu/blog/post/confirmation-bias-how-it-affects-your-organization-and-how-to-overcome-it 
  • Ling, R. (2020). Confirmation bias in the era of mobile news consumption: The social and psychological dimensions. Digital Journalism, 8(5), 596–604. https://doi.org/10.1080/21670811.2020.1766987
  • Hastall, M. R. (2020). Selective exposure, perception, and retention. The SAGE International Encyclopedia of Mass Media and Society, 1–5, 1537–1539. https://doi.org/10.4135/9781483375519  
  • Schlosser, J. A. (2013). “Hope, danger’s comforter”: Thucydides, hope, politics. The Journal of Politics, 75(1), 169–182. https://doi.org/10.1017/s0022381612000941 
  • Badcock, C. (2012, May 5). Making sense of wason. Psychology Today. https://www.psychologytoday.com/ca/blog/the-imprinted-brain/201205/making-sense-wason 
  • Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109. https://doi.org/10.1037/0022-3514.37.11.2098 
  • Pariser, E. (2012). The filter bubble: What the internet is hiding from you. Penguin Books. 



11: Confirmation Bias and Filter Bubbles


What is Being Filtered Out of Your Search?

The idea of filter bubbles was introduced nearly ten years ago by Eli Pariser: information providers track your online activity in order to target what they have determined to be your information needs.

Even almost ten years later, many people still haven't heard of filter bubbles. If you haven't, you might find the information a bit unnerving. This is Pariser's TED Talk from 2011.

[Video: Eli Pariser, “Beware online filter bubbles” (TED, 2011); linked in Further Reading below.]

Where Are We Now?

A few years ago, Eli Pariser was interviewed by Wired magazine to discuss how Pariser's warning of filter bubbles had evolved over time.

Reading one, from the Wired website: “Eli Pariser Predicted the Future. Now He Can’t Escape It” by Jesse Hempel.

How We Confirm Our Own Beliefs 

Filter bubbles are outside forces that affect the information we take in. But there is also a lot going on inside our own brains that influences the way we take in and interpret information. Chief among these internal influences is confirmation bias.

The next reading from Scientific American explores how people can be exposed to scientific evidence, but still have doubts. It's a good introduction to confirmation bias in this context.

Wikipedia also has an extensive entry on confirmation bias that is well researched and has a lot of suggested readings if you want to explore this concept further. I included a link to it at the bottom of the page in further reading.

[NOTE TO USERS OF THIS TEXTBOOK: The following reading is not freely available online. The link goes to the Los Rios Libraries MASTERfile database. You will need to see if your databases include access to this article and if not, find an alternative.]

Reading two, from Scientific American: “The Science of Antiscience Thinking: Convincing people who doubt the validity of climate change and evolution to change their beliefs requires overcoming a set of ingrained cognitive biases” by Douglas T. Kenrick, Adam B. Cohen, Steven L. Neuberg, and Robert B. Cialdini.

Bias in News

I am a former journalist. My bachelor's degree is in journalism and I worked as a television news producer for nearly ten years before switching careers. I have been stunned to see how much the journalistic landscape has shifted in the last twenty years. Journalists used to be highly respected and objectivity was paramount.

Now, many news outlets openly discuss and tout their political leanings. It has created an environment that makes understanding our confirmation biases even more difficult.

There is a lot of information out there about media bias. One website I've been particularly impressed with is AllSides.com. I often encourage students to seek out the same story from several different news outlets to see how it has been covered. AllSides.com does that for you, showing the same story and its coverage from left-, center-, and right-leaning news sources.

I encourage you to check out the website and click around. Read a story that interests you to see how it has been covered in the three areas. We'll be exploring this more next week.

Further Reading

Wikipedia. (2019, March 22). Confirmation bias. https://en.wikipedia.org/wiki/Confirmation_bias

Pariser, E. (2011). Beware online “filter bubbles” [Video file]. TED. https://www.ted.com/talks/eli_pariser_beware_online_filter_bubbles

Shahram Heshmat Ph.D.

What Is Confirmation Bias?

People are prone to believe what they want to believe.

Posted April 23, 2015 | Reviewed by Lybi Ma

  • When people would like a certain concept to be true, they believe it to be true. This is confirmation bias.
  • Confirmation bias can be found in anxious individuals, who view the world as dangerous.
  • Wishful thinking, or false optimism, can lead to confirmation bias.
  • Overcoming confirmation bias begins with setting out one's hypothesis and looking for ways to prove it wrong.

Imagine that you have tried to reach a friend with whom you have an ambivalent relationship by phone or email, leaving messages yet receiving no call in return. In a situation like this, it is easy to jump intuitively to the conclusion that your friend wants to avoid you. The danger, of course, is that you leave this belief unchecked and start to act as though it were true.

Confirmation bias occurs from the direct influence of desire on beliefs. When people would like a certain idea or concept to be true, they end up believing it to be true. They are motivated by wishful thinking. This error leads the individual to stop gathering information when the evidence gathered so far confirms the views or prejudices one would like to be true.

Once we have formed a view, we embrace information that confirms that view while ignoring, or rejecting, information that casts doubt on it. Confirmation bias suggests that we don’t perceive circumstances objectively. We pick out those bits of data that make us feel good because they confirm our prejudices. Thus, we may become prisoners of our assumptions. For example, some people will have a very strong inclination to dismiss any claims that marijuana may cause harm as nothing more than old-fashioned reefer madness. Some social conservatives will downplay any evidence that marijuana does not cause harm.

Confirmation bias, anxiety, and self-deception

Confirmation bias can also be found in anxious individuals, who view the world as dangerous. For example, a person with low self-esteem is highly sensitive to being ignored by other people, and they constantly monitor for signs that people might not like them. Thus, if you are worried that someone is annoyed with you, you are biased toward all the negative information about how that person acts toward you. You interpret neutral behavior as indicative of something negative.

Wishful thinking is a form of self-deception, such as false optimism. We often deceive ourselves with statements like: just this one; it’s not that fattening; I’ll stop smoking tomorrow. Or someone “under the influence” feels confident that he can drive safely even after three or more drinks.

Self-deception can be like a drug, numbing you from harsh reality or turning a blind eye to the tough work of gathering evidence and thinking. As Voltaire commented long ago, “Illusion is the first of all pleasure.” In some cases, self-deception is good for us: positive thinking may actually be beneficial when dealing with certain illnesses, such as cancer, though not others, such as diabetes or ulcers. There is limited evidence that believing you will recover helps reduce your level of stress hormones, giving the immune system and modern medicine a better chance to do their work.

In sum, people are prone to believe what they want to believe. Seeking to confirm our beliefs comes naturally, while looking for evidence that contradicts them feels strange and counterintuitive. This explains why opinions survive and spread. Yet disconfirming instances are far more powerful in establishing the truth, and finding them requires actively looking for evidence that disproves a belief.

How to minimize confirmation bias

The take-home lesson here is to set your hypothesis and look for instances to prove that you are wrong. This is perhaps a true definition of self-confidence: the ability to look at the world without the need to look for instances that please your ego.

For group decision-making, it is crucial to obtain information from each member independently. For example, to derive the most reliable information from multiple witnesses to a crime, police procedure prevents witnesses from discussing the event before giving their testimony; the goal is to keep witnesses from biasing one another. It is known that Abraham Lincoln intentionally filled his cabinet with rival politicians who had extremely different ideologies, and when making decisions, Lincoln always encouraged vigorous debate and discussion. A simple sketch of independent elicitation follows below.
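As a small illustration of that independence principle, the sketch below pools privately collected estimates before any discussion takes place; the names and numbers are hypothetical.

```python
# A toy sketch of independent elicitation for group decisions: gather
# each member's estimate privately, then aggregate, so no early voice
# can anchor the rest. Names and values are hypothetical.

estimates = {"ana": 120, "ben": 95, "chen": 150, "dee": 110}

pooled = sum(estimates.values()) / len(estimates)
print(f"independent pooled estimate: {pooled:.0f}")
# Contrast with sequential sharing, where the first number stated tends
# to anchor everyone who speaks after it.
```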


Shahram Heshmat, Ph.D., is an associate professor emeritus of health economics of addiction at the University of Illinois at Springfield.


Confirmation Bias: What is it?

Encyclopaedia Britannica defines confirmation bias as “the tendency to process information by looking for, or interpreting, information that is consistent with one’s existing beliefs.”

Casad, B. J. (2019). Confirmation bias. Encyclopaedia Britannica. Retrieved July 23, 2020, from https://www.britannica.com/science/confirmation-bias

Confirmation Bias: How to Avoid It, Part 1

One way to avoid confirmation bias is to disprove what you believe instead of only looking for things that prove your belief.  Learn more about that by watching the short video below.

Confirmation Bias: How to Avoid It, Part 2

  • Avoiding Confirmation Bias in Searches: read through this brief eBook chapter to learn more about avoiding confirmation bias in searches.


What is Bias?


Biases also play a role in how you approach all information. The short video below provides definitions of 12 types of cognitive biases.

There are two forms of bias of particular importance given today's information-laden landscape: implicit bias and confirmation bias.

Implicit Bias & Confirmation Bias

Implicit / Unconscious Bias 

"Original definition (neutral) - Any personal preference, attitude, or expectation that unconsciously affects a person's outlook or behaviour.

Current definition (negative) - Unconscious favouritism towards or prejudice against people of a particular race, gender, or group that influences one's actions or perceptions; an instance of this."

"unconscious bias, n." OED Online, Oxford University Press, December 2020, www.oed.com/view/Entry/88686003 .

"Thoughts and feelings are “implicit” if we are unaware of them or mistaken about their nature. We have a bias when, rather than being neutral, we have a preference for (or aversion to) a person or group of people. Thus, we use the term “implicit bias” to describe when we have attitudes towards people or associate stereotypes with them without our conscious knowledge." 

https://perception.org/research/implicit-bias/

Confirmation Bias – "Originating in the field of psychology; the tendency to seek or favour new information which supports one’s existing theories or beliefs, while avoiding or rejecting that which disrupts them." 

This definition was added to the Oxford English Dictionary in 2019.

"confirmation, n." OED Online, Oxford University Press, December 2020, www.oed.com/view/Entry/38852.

Simply put, confirmation bias is the tendency to seek out and/or interpret new information as confirmation of one's existing beliefs or theories, and to exclude contradictory or opposing information and points of view.

Put Bias in Check!


Now that you are aware of bias, both your personal biases and the bias found in sources of information, you can put it in check. Approach information objectively and neutrally, and evaluate it critically. Numerous tools included in this course can help you do this, like the critical thinking cheat sheet in the previous module.



13 Types of Common Cognitive Biases That Might Be Impairing Your Judgment

Which of these sway your thinking the most?

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Amy Morin, LCSW, is a psychotherapist and international bestselling author. Her books, including "13 Things Mentally Strong People Don't Do," have been translated into more than 40 languages. Her TEDx talk,  "The Secret of Becoming Mentally Strong," is one of the most viewed talks of all time.


On This Page:

  • The Confirmation Bias
  • The Hindsight Bias
  • The Anchoring Bias
  • The Misinformation Effect
  • The Actor-Observer Bias
  • The False Consensus Effect
  • The Halo Effect
  • The Self-Serving Bias
  • The Availability Heuristic
  • The Optimism Bias
  • Other Kinds of Cognitive Bias

Although we like to believe that we're rational and logical, the fact is that we are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.

Sometimes, cognitive biases are fairly obvious. You might even find that you recognize these tendencies in yourself or others. In other cases, these biases are so subtle that they are almost impossible to notice.

At a Glance

Attention is a limited resource. This means we can't possibly evaluate every possible detail and event when forming thoughts and opinions. Because of this, we often rely on mental shortcuts that speed up our ability to make judgments, but this can sometimes lead to bias. There are many types of biases—including the confirmation bias, the hindsight bias, and the anchoring bias, just to name a few—that can influence our beliefs and actions daily.

The following are just a few types of cognitive biases that have a powerful influence on how you think, how you feel, and how you behave.


The confirmation bias is the tendency to listen more often to information that confirms our existing beliefs. Through this bias, people tend to favor information that reinforces the things they already think or believe.

Examples include:

  • Only paying attention to information that confirms your beliefs about issues such as gun control and global warming
  • Only following people on social media who share your viewpoints
  • Choosing news sources that present stories that support your views
  • Refusing to listen to the opposing side
  • Not considering all of the facts in a logical and rational manner

There are a few reasons why this happens. One is that only seeking to confirm existing opinions helps limit mental resources we need to use to make decisions. It also helps protect self-esteem by making people feel that their beliefs are accurate.

People on two sides of an issue can listen to the same story and walk away with different interpretations that they feel validate their existing point of view. This often indicates that the confirmation bias is working to "bias" their opinions.

The problem with this is that it can lead to poor choices, an inability to listen to opposing views, or even contribute to othering people who hold different opinions.

Things we can do to help reduce the impact of confirmation bias include being open to hearing others' opinions, specifically looking for and researching opposing views, reading full articles (and not just headlines), questioning the source, and doing the research ourselves to see whether the source is reliable.

The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are. It's also commonly referred to as the "I knew it all along" phenomenon.

Some examples of the hindsight bias include:

  • Insisting that you knew who was going to win a football game once the event is over
  • Believing that you knew all along that one political candidate was going to win an election
  • Saying that you knew you weren't going to win after losing a coin flip with a friend
  • Looking back on an exam and thinking that you knew the answers to the questions you missed
  • Believing you could have predicted which stocks would become profitable

Classic Research

In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court.

Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.  

The hindsight bias occurs for a combination of reasons, including our ability to "misremember" previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.

The effect of this bias is that it causes us to overestimate our ability to predict events. This can sometimes lead people to take unwise risks.

The anchoring bias is the tendency to be overly influenced by the first piece of information that we hear. Some examples of how this works:

  • The first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based.
  • Hearing a random number can influence estimates on completely unrelated topics.
  • Doctors can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments.

While the existence of the anchoring bias is well documented, its causes are still not fully understood. Some research suggests that the source of the anchor information may play a role. Other factors such as priming and mood also appear to have an influence.

Like other cognitive biases, anchoring can have an effect on the decisions you make each day. For instance, it can influence how much you are willing to pay for your home. However, it can sometimes lead to poor choices and make it more difficult for people to consider other factors that might also be important.

The misinformation effect is the tendency for memories to be heavily influenced by things that happened after the actual event itself. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.

For example:

  • Research has shown that simply asking questions about an event can change someone's memories of what happened.
  • Watching television coverage may change how people remember the event.
  • Hearing other people talk about a memory from their perspective may change your memory of what transpired.

Classic Memory Research

In one classic experiment by memory expert Elizabeth Loftus , people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”  

When the witnesses were then questioned a week later whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.

There are a few factors that may play a role in this phenomenon. New information may get blended with older memories.   In other cases, new information may be used to fill in "gaps" in memory.

The effects of misinformation can range from the trivial to much more serious. It might cause you to misremember something you thought happened at work, or it might lead to someone incorrectly identifying the wrong suspect in a criminal case.

The actor-observer bias is the tendency to attribute our actions to external influences and other people's actions to internal ones. The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation.

When it comes to our own actions, we are often far too likely to attribute things to external influences. For example:

  • You might complain that you botched an important meeting because you had jet lag.
  • You might say you failed an exam because the teacher posed too many trick questions.

When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. For example:

  • A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag).
  • A fellow student bombed a test because they lack diligence and intelligence (and not because they took the same test as you with all those trick questions).

While there are many factors that may play a role, perspective plays a key role. When we are the actors in a situation, we are able to observe our own thoughts and behaviors. When it comes to other people, however, we cannot see what they are thinking. This means we focus on situational forces for ourselves, but guess at the internal characteristics that cause other people's actions.

The problem with this is that it often leads to misunderstandings. Each side of a situation is essentially blaming the other side rather than thinking about all of the variables that might be playing a role.

The false consensus effect is the tendency people have to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values. For example:

  • Thinking that other people share your opinion on controversial topics
  • Overestimating the number of people who are similar to you
  • Believing that the majority of people share your preferences

Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we start to think that this way of thinking is the majority opinion even when we are with people who are not among our group of family and friends.

Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem. It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.

This can lead people not only to incorrectly think that everyone else agrees with them—it can sometimes lead them to overvalue their own opinions. It also means that we sometimes don't consider how other people might feel when making choices.

The halo effect is the tendency for an initial impression of a person to influence what we think of them overall. Also known as the "physical attractiveness stereotype" or the "what is beautiful is good" principle, the halo effect influences us, or is used by us to influence others, almost every day. For example:

  • Thinking people who are good-looking are also smarter, kinder, and funnier than less attractive people
  • Believing that products marketed by attractive people are also more valuable
  • Thinking that a political candidate who is confident must also be intelligent and competent

One factor that may influence the halo effect is our tendency to want to be correct. If our initial impression of someone was positive, we want to look for proof that our assessment was accurate. It also helps people avoid experiencing cognitive dissonance, which involves holding contradictory beliefs.

This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.

The self-serving bias is the tendency for people to give themselves credit for successes but lay the blame for failures on outside causes. When you do well on a project, you probably assume that it's because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck.

Some examples of this:

  • Attributing good grades to being smart or studying hard
  • Believing your athletic performance is due to practice and hard work
  • Thinking you got the job because of your merits

The self-serving bias can be influenced by a variety of factors. Age and sex have been shown to play a part. Older people are more likely to take credit for their successes, while men are more likely to pin their failures on outside forces.  

This bias does serve an important role in protecting self-esteem. However, it can often also lead to faulty attributions such as blaming others for our own shortcomings.

The availability heuristic is the tendency to estimate the probability of something happening based on how many examples readily come to mind. Some examples of this:

  • After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are.
  • You might believe that plane crashes are more common than they really are because you can easily think of several examples.

It is essentially a mental shortcut designed to save us time when we are trying to determine risk. The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions.

Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking. In contrast, if you have two sisters and five neighbors who have had breast cancer, you might believe it is even more common than statistics suggest.

The optimism bias is a tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good.

For example, we may assume that negative events simply won't affect us.

The optimism bias has roots in the availability heuristic: because you can probably think of examples of bad things happening to other people, it seems more likely that others, rather than you, will be affected by negative events.

This bias can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt. The bad news is that research has found that this optimism bias is incredibly difficult to reduce.

There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals.

Other Kinds of Cognitive Bias

Many other cognitive biases can distort how we perceive the world. Just a partial list:

  • Status quo bias reflects a desire to keep things as they are.
  • Apophenia is the tendency to perceive patterns in random occurrences.
  • Framing is presenting a situation in a way that gives a certain impression.

Keep in Mind

The cognitive biases above are common, but this is only a sampling of the many biases that can affect your thinking. These biases collectively influence much of our thoughts and ultimately, decision making.

Many of these biases are inevitable. We simply don't have the time to evaluate every thought in every decision for the presence of any bias. Understanding these biases is very helpful in learning how they can lead us to poor decisions in life.

Dietrich D, Olson M. A demonstration of hindsight bias using the Thomas confirmation vote. Psychol Rep. 1993;72(2):377-378. doi:10.2466/pr0.1993.72.2.377

Lee KK.  An indirect debiasing method: Priming a target attribute reduces judgmental biases in likelihood estimations .  PLoS ONE . 2019;14(3):e0212609. doi:10.1371/journal.pone.0212609

Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: A systematic review .  BMC Med Inform Decis Mak . 2016;16(1):138. doi:10.1186/s12911-016-0377-1

Furnham A., Boo HC. A literature review of anchoring bias .  The Journal of Socio-Economics.  2011;40(1):35-42. doi:10.1016/j.socec.2010.10.008

Loftus EF.  Leading questions and the eyewitness report .  Cognitive Psychology . 1975;7(4):560-572. doi:10.1016/0010-0285(75)90023-7

Challies DM, Hunt M, Garry M, Harper DN. Whatever gave you that idea? False memories following equivalence training: a behavioral account of the misinformation effect .  J Exp Anal Behav . 2011;96(3):343-362. doi:10.1901/jeab.2011.96-343

Miyamoto R, Kikuchi Y.  Gender differences of brain activity in the conflicts based on implicit self-esteem .  PLoS ONE . 2012;7(5):e37901. doi:10.1371/journal.pone.0037901

Weinstein ND, Klein WM.  Resistance of personal risk perceptions to debiasing interventions .  Health Psychol . 1995;14(2):132–140. doi:10.1037//0278-6133.14.2.132

Gratton G, Cooper P, Fabiani M, Carter CS, Karayanidis F. Dynamics of cognitive control: theoretical bases, paradigms, and a view for the future . Psychophysiology . 2018;55(3). doi:10.1111/psyp.13016

By Kendra Cherry, MSEd. Kendra Cherry is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Overcoming Confirmation Bias


Of the many cognitive biases that cloud our decision making, one that business leaders need to be particularly vigilant against is confirmation bias.

Confirmation bias refers to our tendency to interpret information in a way that confirms what we already believe to be true. This bias is widespread and has significant implications for business leaders who need to make sound and objective decisions based on available data.

“Confirmation bias occurs from the direct influence of desire on beliefs,” says managerial economics expert Dr. Shahram Heshmat. “When people would like a certain idea or concept to be true, they end up believing it to be true. They are motivated by wishful thinking. This error leads the individual to stop gathering information when the evidence gathered so far confirms the views or prejudices one would like to be true.”

Confirmation bias is something all business leaders need to guard against.

When leaders only seek information that confirms their existing beliefs, they may ignore or dismiss information that contradicts those beliefs. This can lead to decisions based on incomplete or biased information, which can result in poor outcomes for the organization. For example, a business leader who believes their product is superior to their competitors' may seek only feedback that confirms this belief, ignoring negative feedback that could help them improve the product.

Another danger of confirmation bias in business leadership is that it can create an echo chamber.

When leaders only seek information that confirms their existing beliefs, they are unlikely to consider alternative viewpoints. This can lead to a lack of diversity in the organization's decision-making process, which can limit innovation and creativity. Additionally, an echo chamber can create a culture where dissenting views are discouraged or dismissed, which can lead to a lack of accountability and transparency.


Confirmation bias can also lead to overconfidence and arrogance in business leaders.

When leaders believe that their existing beliefs are always correct, they may become complacent and fail to consider alternative viewpoints or feedback. This can create a false sense of security that can lead to poor decision-making and missed opportunities. For example, a business leader who is overconfident in their company's market position may fail to consider new competitors or emerging technologies that could disrupt their business.

To avoid the dangers of confirmation bias, business leaders must actively seek out diverse viewpoints and opinions. This can be achieved through a variety of methods, such as seeking feedback from customers, engaging with stakeholders, and encouraging dissenting views within the organization. Additionally, leaders should seek out information that contradicts their existing beliefs to ensure that they are making decisions based on all available data.

It is also essential for business leaders to cultivate a culture of transparency and accountability. By encouraging open and honest communication within the organization, leaders can create an environment where alternative viewpoints are valued, and dissenting views are encouraged. This can help to avoid the creation of an echo chamber and promote a diverse and inclusive decision-making process.

Finally, business leaders must be willing to admit when they are wrong and make course corrections when necessary. By acknowledging mistakes and learning from them, leaders can demonstrate a willingness to consider alternative viewpoints and adapt to changing circumstances. This can help to avoid overconfidence and arrogance and promote a culture of humility and continuous improvement.

“Confirmation bias is twisting the facts to fit your beliefs. Critical thinking is bending your beliefs to fit the facts,” says organizational psychologist and author Adam Grant. “Seeking the truth is not about validating the story in your head. It's about rigorously vetting and accepting the story that matches the reality in the world.”

Bryce Hoffman


These 2 internal biases cause us to fall for misinformation - here's why


Everyone is subject to internal biases. But we are not powerless to stop them. (Image: Timon Studler on Unsplash)

Alex Edmans


  • Confirmation bias is the temptation to accept evidence uncritically if it confirms what one would like to be true.
  • Black-and-white thinking is another form of bias that entails viewing the world in binary terms.
  • We can overcome these biases by asking simple questions and thinking critically.

“Check the facts.” “Examine the evidence.” “Correlation is not causation.”

We’ve heard these phrases enough times that they should be in our DNA. If they were, misinformation would never get out of the starting blocks. Yet examples abound of misinformation spreading like wildfire.

This is because our internal, often subconscious, biases cause us to accept incorrect statements at face value. Nobel Laureate Daniel Kahneman refers to our rational, slow thought process — which has mastered the above three phrases — as System 2, and our impulsive, fast thought process — distorted by our biases — as System 1. In the cold light of day, we know that we shouldn’t take claims at face value, but when our System 1 is in overdrive, the red mist of anger clouds our vision.

Confirmation bias

One culprit is confirmation bias – the temptation to accept evidence uncritically if it confirms what we’d like to be true, and to reject a claim out of hand if it clashes with our worldview. Importantly, these biases can be subtle; they’re not limited to topics such as immigration or gun control where emotions run high. It’s widely claimed that breastfeeding increases child IQ, but the evidence is merely correlational: parental factors drive both breastfeeding and IQ. Yet because many of us would trust natural breastmilk over the artificial formula of a giant corporation, we lap the claim up.
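
A minimal sketch makes the confounding argument concrete. In the simulation below (all numbers are invented for illustration), a hidden "parental factors" variable raises both the chance of breastfeeding and child IQ; there is no direct causal link between the two, yet they end up clearly correlated:

    import random

    random.seed(1)

    def corr(xs, ys):
        """Pearson correlation, computed from scratch."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
        sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
        sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
        return cov / (sx * sy)

    # Hidden confounder: parental factors (resources, education), scaled 0-1.
    parents = [random.random() for _ in range(10_000)]

    # Breastfeeding depends only on parental factors; IQ depends only on
    # parental factors plus noise. No causal arrow runs between the two.
    breastfed = [1 if random.random() < p else 0 for p in parents]
    iq = [100 + 15 * (p - 0.5) + random.gauss(0, 5) for p in parents]

    print(f"corr(breastfed, iq) = {corr(breastfed, iq):.2f}")  # clearly positive

Comparing only children with similar parental factors would shrink this spurious association toward zero, which is exactly what "controlling for confounders" means.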

Confirmation bias is hard to shake. In a study, three neuroscientists took students with liberal political views and hooked them up to a functional magnetic resonance imaging scanner. The researchers read out statements the participants previously said they agreed with, then gave contradictory evidence and measured the students’ brain activity. There was no effect when non-political claims were challenged, but countering political positions triggered their amygdala. That’s the same part of the brain that’s activated when a tiger attacks you, inducing a ‘fight-or-flight’ response. The amygdala drives our System 1, and drowns out the prefrontal cortex which operates our System 2.

Confirmation bias looms large for issues where we have a pre-existing opinion. But for many topics, we have no prior view. If there’s nothing to confirm, there’s no confirmation bias, so we’d hope we can approach these issues with a clear head.

Black-and-white thinking

Unfortunately, another bias can kick in: black-and-white thinking. This bias means that we view the world in binary terms. Something is either always good or always bad, with no shades of grey.

To pen a bestseller, Atkins didn’t need to be right. He just needed to be extreme.

The bestselling weight-loss book in history, Dr Atkins’ New Diet Revolution, benefited from this bias. Before Atkins, people may not have had strong views on whether carbs were good or bad. But as long as they think it has to be one or the other, with no middle ground, they’ll latch onto a one-way recommendation. That’s what the Atkins diet did. It had one rule: avoid all carbs. Not just refined sugar, not just simple carbs, but all carbs. You can decide whether to eat something by looking at the “Carbohydrate” line on the nutrition label, without worrying whether the carbs are complex or simple, natural or processed. This simple rule played into black-and-white thinking and made the diet easy to follow.
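
The pull of such a rule is that it collapses a graded judgment into a single comparison. The hypothetical sketch below (made-up foods and thresholds, not dietary advice) contrasts the one-line binary rule with a rule that lets carb quality matter; note how the binary rule rejects lentils but waves through butter:

    # Hypothetical nutrition records; fields and values are invented.
    foods = [
        {"name": "lentils", "carbs_g": 20, "fiber_g": 8, "processed": False},
        {"name": "soda",    "carbs_g": 39, "fiber_g": 0, "processed": True},
        {"name": "butter",  "carbs_g": 0,  "fiber_g": 0, "processed": True},
    ]

    def binary_rule(food):
        """Black-and-white thinking: one lookup, no shades of grey."""
        return food["carbs_g"] == 0

    def graded_rule(food):
        """A (still simplistic) rule in which carb *quality* matters."""
        return not food["processed"] or food["carbs_g"] - 2 * food["fiber_g"] < 10

    for f in foods:
        print(f'{f["name"]:8} binary: {binary_rule(f)!s:5}  graded: {graded_rule(f)}')

The binary rule is trivially easy to apply, which is precisely why an extreme recommendation spreads more easily than a nuanced one.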

Overcoming our biases

So, what do we do about it? The first step is to recognize our own biases. If a statement sparks our emotions and we’re raring to share or trash it, or if it’s extreme and gives a one-size-fits-all prescription, we need to proceed with caution.

The second step is to ask questions, particularly if it’s a claim we’re eager to accept. One is to “consider the opposite”. If a study had reached the opposite conclusion, what holes would you poke in it? Then, ask yourself whether these concerns still apply even though it gives you the results you want.

Take the plethora of studies claiming that sustainability improves company performance. What if a paper had found that sustainability worsens performance? Sustainability supporters would throw up a host of objections. First, how did the researchers actually measure sustainability? Was it a company’s sustainability claims rather than its actual delivery? Second, how large a sample did they analyze? If it was a handful of firms over just one year, the underperformance could be due to randomness; there’s not enough data to draw strong conclusions. Third, is it causation or just correlation? Perhaps high sustainability doesn’t cause low performance, but something else, such as heavy regulation, drives both. Now that you’ve opened your eyes to potential problems, ask yourself whether they plague the study you’re eager to trumpet.
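
To see why "a handful of firms over just one year" proves little, here is a quick simulation (all numbers are invented assumptions): firm returns are pure noise with no sustainability effect at all, yet small samples routinely produce gaps large enough to look like a finding:

    import random

    random.seed(7)

    def annual_return():
        # Assumed: every firm returns ~5% a year with 20% volatility,
        # regardless of its sustainability rating (i.e., no true effect).
        return random.gauss(0.05, 0.20)

    def observed_gap(n_firms):
        """Return gap between a 'sustainable' and a 'control' group
        when, by construction, sustainability changes nothing."""
        sustainable = [annual_return() for _ in range(n_firms)]
        control = [annual_return() for _ in range(n_firms)]
        return sum(sustainable) / n_firms - sum(control) / n_firms

    for n in (5, 50, 500):
        gaps = [abs(observed_gap(n)) for _ in range(1_000)]
        print(f"n = {n:>3}: typical spurious gap = {sum(gaps) / len(gaps):.1%}")

With five firms per group, a roughly ten-percentage-point "effect" appears out of pure noise; only at larger sample sizes does the spurious gap collapse.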

A second question is to “consider the authors”. Think about who wrote the study and what their incentives are to make the claim that they did. Many reports are produced by organizations whose goal is advocacy rather than scientific inquiry. Ask “would the authors have published the paper if it had found the opposite result?” — if not, they may have cherry-picked their data or methodology.

In addition to bias, another key attribute is the authors’ expertise in conducting scientific research. Leading CEOs and investors have substantial experience, and there’s nobody more qualified to write an account of the companies they’ve run or the investments they’ve made. However, some move beyond telling war stories to proclaiming a universal set of rules for success – but without scientific research we don’t know whether these principles work in general. A simple question is “If the same study were written by the same authors, with the same credentials, but found the opposite results, would you still believe it?”

Today, anyone can make a claim, start a conspiracy theory or post a statistic. If people want it to be true it will go viral. But we have the tools to combat it. We know how to show discernment, ask questions and conduct due diligence if we don’t like a finding. The trick is to tame our biases and exercise the same scrutiny when we see something we’re raring to accept.

This article is adapted from May Contain Lies: How Stories, Statistics, and Studies Exploit Our Biases – and What We Can Do About It (Penguin Random House, 2024).




Joam Gonzalez Rodriguez’s Blog

Lehigh Valley Social Impact Fellowship

Blog Post #9

1. What are some personal goals within the members of your team, and how can you leverage those goals for collaboration?

  • Our team has several personal goals that we are passionate about. First and foremost, we all share a deep passion for promoting and supporting creative industries within our community. Second, we are motivated by a desire to contribute to the growth and revitalization of the Lehigh Valley’s local economy. We are interested in fostering entrepreneurial ecosystems, particularly within the creative sector, as it aligns with our personal goals and values. Lastly, we are all committed to the process of research and innovation. We value the process of research and discovery and believe it is essential in proposing effective solutions. To leverage these personal goals for collaboration, we can encourage open discussions to align individual motivations with collective project objectives. We’ll emphasize the importance of each member’s unique perspective and expertise in achieving shared goals, fostering a sense of purpose and ownership within the team.

2. What are the common project goals within the members of your team, and how can you leverage those goals to make progress?

  • Identifying Barriers and Opportunities: Understanding the challenges and gaps in current support systems for creative industries (CIs) in the Lehigh Valley.
  • Developing Innovative Solutions: Creating actionable strategies to enhance support structures and ecosystems for creative entrepreneurs.
  • Implementing Effective AI Integration: Exploring how AI can benefit and impact entrepreneurs in creative industries.
  • Advancing Local Economic Growth: Contributing to the transformation of the Lehigh Valley into a thriving hub for creative entrepreneurs.

To leverage these shared project goals, we’ll establish clear milestones and objectives that align with each member’s strengths. We’ll also foster a collaborative environment where team members can openly discuss ideas, contribute insights, and collectively work towards achieving project milestones.

3. What are some biases that might become a barrier to your project goals?

Addressing biases is a crucial aspect of ensuring effective project outcomes. Status quo bias can manifest as resistance to changing existing structures, impeding innovation; it must be actively countered by promoting a culture of openness to new ideas and approaches. Confirmation bias, the tendency of individuals to prefer information that aligns with their preconceived notions, can be mitigated by encouraging critical thinking and welcoming diverse perspectives within the team. Cultural bias, which may cause us to overlook the unique needs of different creative sectors in the community, should be addressed by fostering an inclusive environment that values and respects diverse viewpoints and experiences. Lastly, technology bias refers to unchecked assumptions about AI solutions and can be managed by promoting thorough reflection and evaluation of proposed technologies. By cultivating a culture of critical thinking and open-mindedness, we can effectively challenge these biases, leading to more informed decision-making and innovative problem-solving.

4. What type of decision-making system will you use and why?

To ensure inclusivity and diverse perspectives in shaping project strategies, it is important to utilize a participative decision-making system that values input from all team members. This can be achieved through techniques such as brainstorming sessions, consensus-building discussions, and regular feedback loops that enhance collaboration and empower team members to contribute meaningfully to project decisions. In addition, we will emphasize the use of data-driven insights and evidence-based approaches to validate proposed solutions, which will help mitigate the impact of biases on project outcomes.



