Anchoring Bias occurs when a person's expectation about one thing is affected by something mostly or entirely irrelevant they saw, heard, or thought before, such as an irrelevant number. In other words, it occurs when a person's beliefs or behaviors are influenced by a specific piece of information far more than they should be given how much evidence that information actually provides.
Attention Bias occurs when some information or evidence holds a disproportionate amount of a person's attention because of that person's environment or history, or because of people's natural instincts.
The Availability Bias occurs when someone's prediction about an event's frequency or probability is unduly influenced by how easily they can recall examples of that event. We have a whole mini-course about combating availability bias.
A Bias Blind Spot is a tendency to see oneself as being less biased or less susceptible to biases (such as those listed in this article) than others in the population.
: "In fact, viewing yourself as rational can backfire. The more objective you think you are, the more you trust your own intuitions and opinions as accurate representations of reality, and the less inclined you are to question them. 'I'm an objective person, so my views on gun control must be correct, unlike the views of all those irrational people who disagree with me,' we think." - |
Choice-Supportive Bias is a cognitive bias whereby someone who has chosen between different options later remembers the option that they chose as having more positive attributes than it did at the time (while they remember options they did not choose as having more negative attributes than they'd had at the time).
Confirmation Bias refers to a tendency for people to seek out, favor, or give more weight to information that confirms their preconceptions or hypotheses (even if the information isn't true) than information that contradicts their prior beliefs.
The Denomination Effect is a cognitive bias whereby people tend to be more likely to spend a given amount of money if it is composed of smaller individual sums than if it is composed of larger individual sums.
Hindsight Bias refers to a tendency to perceive past events as being more predictable than they were before they took place.
Optimism Bias is the tendency to be unduly optimistic about future events, overestimating the probability of positive ones while underestimating the probability of negative ones.
Motivated reasoning occurs when you are disposed to interpret new evidence in ways that support your existing beliefs, or that lead to the outcome you wish was true, even when that evidence doesn't truly support your beliefs.
What are the types of bias?
There are three main types of bias.
1. Explicit biases are prejudiced beliefs regarding a group of people or ways of living. Racism, sexism, religious intolerance, and LGBTQ-phobias are examples of explicit biases. If you think that all people of group X are inferior, then you have an explicit bias against people of group X.
2. Implicit biases are unconscious beliefs that lead people to form opinions or judgments, often without being fully aware they hold the unconscious beliefs. If you subtly distrust people of group X without even realizing you're doing it, then you have an implicit bias against people of group X.
3. Cognitive biases differ from explicit and implicit biases: they are a group of systematic patterns in how our beliefs, judgments, and actions differ from what they would be if we were completely rational. If most people systematically misjudge certain types of information in such a way that they come to false conclusions, then people have a cognitive bias related to that type of information.
There is no consensus among academics regarding how many cognitive biases exist. Some have found ~40, others find >100, and Wikipedia lists over 180.
As we’ve seen above, cognitive biases often appear when one is faced with a decision and has limited resources (such as time, understanding, and cognitive capacity).
For instance, when buying a banana, you can't consider every single possible other use of that money to determine whether a banana is truly the single best use. You are limited in both how much time you have to think and how much total cognitive capacity you have.
Using fast heuristics or relying on our intuition is often an effective way of coming to conclusions in these situations because such approaches require fewer resources than careful thinking. While our intuition is often reliable, there are certain cases where our intuitions systematically produce inaccurate beliefs and unhelpful behaviors - these are what we refer to as "cognitive biases".
Even when we have plenty of time to think and aren't hitting a limit on our cognitive resources, people can still be prone to cognitive biases. For instance, there are certain automatic rules of thumb that our minds evolved to use since they worked quite well for the survival of our ancestors. Unfortunately, these rules of thumb can sometimes lead us to false conclusions and unhelpful behaviors in the modern world.
Cognitive biases are not good or bad in themselves. They are an unavoidable effect of not having infinite intelligence and infinite time to think, and hence the need to rely on heuristics and intuition. We call a tendency a cognitive bias when it leads to systematic inaccuracies in our beliefs or unhelpful behaviors. In that sense, by definition, cognitive biases cause systematic problems.
However, cognitive biases do not always lead to negative outcomes in every instance. For instance, overconfidence may cause a person to try something very difficult that they ultimately succeed at. On the other hand, for every one person who succeeds due to overconfidence, there may be multiple others who try something unrealistic due to overconfidence and end up failing.
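One way to make that trade-off concrete is as a simple expected-value calculation. Here is a minimal Python sketch with hypothetical numbers (ours, not from the article): whether overconfidence "pays" in aggregate depends on how large the gain from a rare success is relative to the accumulated losses from the more frequent failures.

```python
def net_expected_outcome(p_success: float, gain: float, loss: float) -> float:
    """Average payoff per overconfident attempt under a toy model."""
    return p_success * gain - (1 - p_success) * loss

# Hypothetical numbers: 1 in 4 overconfident attempts succeeds.
print(net_expected_outcome(0.25, gain=100, loss=20))  # +10.0: positive on average
print(net_expected_outcome(0.25, gain=100, loss=60))  # -20.0: negative on average
```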
Knowing about specific cognitive biases is a great first step toward identifying them in yourself, but knowledge of the biases alone is often not sufficient. It also helps to get to know the most common cognitive biases (such as the ones presented above) well enough that you can actively look out for them in your own thinking.
Yes and no. It is possible to reduce the influence of cognitive biases on your thinking (and this can be very beneficial!). So you may be able to avoid a cognitive bias in many particular instances. But it's not possible to completely remove all of your cognitive biases.
Unfortunately, it's impossible to overcome all of your cognitive biases completely. However, that doesn't mean you can't do anything. A good first step on the path to getting your cognitive biases under control is familiarizing yourself with them.
Here are a few of our interactive tools that might help:
The Planning Fallacy
The Sunk Cost Fallacy
Improve Your Frequency Predictions
Political Bias Test
Rhetorical Fallacies
Are You Overconfident?
Calibrate Your Judgement
How Rational Are You, Really?
Mental Traps
However, just knowing about your cognitive biases isn't enough. You need to take action! Here are some practical steps we recommend:
Biases such as overconfidence, confirmation bias, and the illusion of control can be reduced or avoided by seeking out multiple points of view. Surrounding yourself with and listening to people with diverse experiences, systems of belief, and expertise reduces the chances of falling into one of these biases. The same is true for sources of information: you are less likely to fall into a cognitive bias if you look for additional data sources, including ones that conflict with each other.
Actively seeking evidence against your current point of view (on important decisions) can be a helpful way to combat biases like overconfidence, confirmation bias, and motivated reasoning.
Another strategy, recommended by researchers who studied cognitive biases in physicians, is to consciously consider the options you dismissed at first, so you can reach a more considered answer.
Emotional biases can be considered a subcategory of cognitive biases. What separates them from other cognitive biases is that they are based on emotions such as anger, disgust, fear, happiness, sadness, and surprise. When we're experiencing emotions, we may act in a biased way that is concordant with that emotion. For instance, anxiety may cause us to overestimate the chance of something being dangerous.
Emotional biases are linked to emotional dispositions (commonly known as ‘temperament’). Different emotional dispositions may even lead to different emotional reactions to the same occurrence of events.
Emotional biases may help us explain optimism and pessimism biases.
Cognitive biases interfere with impartiality, and they can negatively impact critical thinking in a myriad of different ways. Here are several:
Motivated reasoning leads us to underestimate the arguments for conclusions we don’t believe in and overestimate the arguments for conclusions we want to believe;
Availability bias messes with our critical thinking because it leads us to assess risk by how readily examples come to mind, rather than considering all of the relevant examples;
We are also prone to blind spot bias, meaning that we are less likely to identify biases in our own judgment than in other people's.
Cognitive biases affect decision-making in at least two ways: they help decision-making by speeding it up and cutting necessary corners when we have limited time or cognitive power, but they also hinder decision-making by causing us to come to false conclusions or take unhelpful actions in certain cases.
Research has shown some correlation between gender or sex and specific biases. For instance, researchers found that male investors tend to show greater overconfidence and optimism biases, while female investors tend to exhibit more anchoring and hindsight biases. The research makes no claims about what causes such gendered differences - e.g., socialization or biology or a mix of both.
Gender stereotypes are explicit biases, which means they are not cognitive biases. However, there are many cognitive biases that involve gender stereotypes. For example, masculine bias is the tendency to assume, based on stereotypes, that a person is male after hearing gender-neutral information about them, and the tendency to mention gender as a descriptor only when describing women.
Gender stereotypes are also a sign of binary thinking.
Research has shown some cognitive biases are correlated with depression. This has been found to be the case for negative interpretation bias (the tendency to interpret ambiguous scenarios as negative) and pessimistic biases, which lead people to predict future situations as unrealistically negative.
Cognitive behavioral therapy is based on the assumption that individuals with depression have distorted negative beliefs about themselves or the world (known in CBT as "cognitive distortions").
Yes. They have been studied since the early 1970s by cognitive psychologists, sociologists, and behavioral economists.
Just like every other human being, scientists can exhibit cognitive biases. They may exhibit overconfidence bias or fall prey to selection biases, for example. This has been researched as it relates to the replication crisis social psychology faces today.
There is even research on the presence of cognitive biases in scientific contexts and occurring within academic publications. Nobody, not even scientists, is immune to cognitive biases!
Both. We are born with a tendency for some cognitive biases, but we can also learn specific aspects of these biases. Our brains have evolved to be prone to all sorts of cognitive biases because those biases have been helpful in the survival of our ancestors in the environment (and under the constraints) in which they lived.
But the details of some specific cognitive biases are learned as we move through the world. For example, humans have evolved a tendency to engage in motivated reasoning, but which conclusions motivate your reasoning isn't something you're born with; it is shaped by your experiences and learning.
Want to understand cognitive biases on a deeper level? Learn about a few of the mind's mistakes with our interactive introduction to cognitive biases!
Diane F. Halpern and Dana S. Dunn
1 Department of Psychology, Claremont McKenna College, Emerita, Altadena, CA 91001, USA
2 Department of Psychology, Moravian College, Bethlehem, PA 18018, USA; dunn@moravian.edu
Most theories of intelligence do not directly address the question of whether people with high intelligence can successfully solve real world problems. A high IQ is correlated with many important outcomes (e.g., academic prominence, reduced crime), but it does not protect against cognitive biases, partisan thinking, reactance, or confirmation bias, among others. There are several newer theories that directly address the question about solving real-world problems. Prominent among them is Sternberg’s adaptive intelligence with “adaptation to the environment” as the central premise, a construct that does not exist on standardized IQ tests. Similarly, some scholars argue that standardized tests of intelligence are not measures of rational thought—the sort of skill/ability that would be needed to address complex real-world problems. Other investigators advocate for critical thinking as a model of intelligence specifically designed for addressing real-world problems. Yes, intelligence (i.e., critical thinking) can be enhanced and used for solving a real-world problem such as COVID-19, which we use as an example of contemporary problems that need a new approach.
The editors of this Special Issue asked authors to respond to a deceptively simple statement: “How Intelligence Can Be a Solution to Consequential World Problems.” This statement holds many complexities, including how intelligence is defined and which theories are designed to address real-world problems.
For the most part, we identify high intelligence as having a high score on a standardized test of intelligence. Like any test score, IQ can only reflect what is on the given test. Most contemporary standardized measures of intelligence include vocabulary, working memory, spatial skills, analogies, processing speed, and puzzle-like elements (e.g., Wechsler Adult Intelligence Scale, Fourth Edition; see Drozdick et al. 2012). Measures of IQ correlate with many important outcomes, including academic performance (Kretzschmar et al. 2016), job-related skills (Hunter and Schmidt 1996), reduced likelihood of criminal behavior (Burhan et al. 2014), and for those with exceptionally high IQs, obtaining a doctorate and publishing scholarly articles (McCabe et al. 2020). Gottfredson (1997, p. 81) summarized these effects when she said the "predictive validity of g is ubiquitous." More recent research using longitudinal data found that general mental abilities and specific abilities are good predictors of several work variables, including job prestige and income (Lang and Kell 2020). Although assessments of IQ are useful in many contexts, having a high IQ does not protect against falling for common cognitive fallacies (e.g., blind spot bias, reactance, anecdotal reasoning), relying on biased and blatantly one-sided information sources, failing to consider information that does not conform to one's preferred view of reality (confirmation bias), or succumbing to pressure to think and act in a certain way, among others. This point was clearly articulated by Stanovich (2009, p. 3) when he stated that "IQ tests measure only a small set of the thinking abilities that people need."
Most theories of intelligence do not directly address the question of whether people with high intelligence can successfully solve real world problems. For example, Grossmann et al. (2013) cite many studies in which IQ scores have not predicted well-being, including life satisfaction and longevity. Using a stratified random sample of Americans, these investigators found that wise reasoning is associated with life satisfaction, and that "there was no association between intelligence and well-being" (p. 944). (Critical thinking [CT] is often referred to as "wise reasoning" or "rational thinking.") Similar results were reported by Wirthwein and Rost (2011), who compared life satisfaction in several domains for gifted adults and adults of average intelligence. There were no differences in any of the measures of subjective well-being, except for leisure, which was significantly lower for the gifted adults. Additional research in a series of experiments by Stanovich and West (2008) found that participants with high cognitive ability were as likely as others to endorse positions that are consistent with their biases, and they were equally likely to prefer one-sided arguments over those that provided a balanced argument. There are several newer theories that directly address the question about solving real-world problems. Prominent among them is Sternberg's adaptive intelligence with "adaptation to the environment" as the central premise, a construct that does not exist on standardized IQ tests (e.g., Sternberg 2019). Similarly, Stanovich and West (2014) argue that standardized tests of intelligence are not measures of rational thought—the sort of skill/ability that would be needed to address complex real-world problems. Halpern and Butler (2020) advocate for CT as a useful model of intelligence for addressing real-world problems because it was designed for this purpose. Although there is much overlap among these more recent theories, often using different terms for similar concepts, we use Halpern and Butler's conceptualization to make our point: Yes, intelligence (i.e., CT) can be enhanced and used for solving a real-world problem like COVID-19.
One definition of intelligence that directly addresses the question about intelligence and real-world problem solving comes from Nickerson (2020, p. 205): "the ability to learn, to reason well, to solve novel problems, and to deal effectively with novel problems—often unpredictable—that confront one in daily life." Using this definition, the question of whether intelligent thinking can solve a world problem like the novel coronavirus is a resounding "yes" because solutions to real-world novel problems are part of his definition. This is a popular idea in the general public. For example, over 1000 business managers and hiring executives said that they want employees who can think critically based on the belief that CT skills will help them solve work-related problems (Hart Research Associates 2018).
We define CT as the use of those cognitive skills or strategies that increase the probability of a desirable outcome. It is used to describe thinking that is purposeful, reasoned, and goal-directed—the kind of thinking involved in solving problems, formulating inferences, calculating likelihoods, and making decisions, when the thinker is using skills that are thoughtful and effective for the particular context and type of thinking task. International surveys conducted by the OECD (2019, p. 16) established "key information-processing competencies" that are "highly transferable, in that they are relevant to many social contexts and work situations; and 'learnable' and therefore subject to the influence of policy." One of these skills is problem solving, which is one subset of CT skills.
The CT model of intelligence is comprised of two components: (1) understanding information at a deep, meaningful level and (2) appropriate use of CT skills. The underlying idea is that CT skills can be identified, taught, and learned, and when they are recognized and applied in novel settings, the individual is demonstrating intelligent thought. CT skills include judging the credibility of an information source, making cost–benefit calculations, recognizing regression to the mean, understanding the limits of extrapolation, muting reactance responses, using analogical reasoning, rating the strength of reasons that support and fail to support a conclusion, and recognizing hindsight bias or confirmation bias, among others. Critical thinkers use these skills appropriately, without prompting, and usually with conscious intent in a variety of settings.
One of the key concepts in this model is that CT skills transfer in appropriate situations. Thus, assessments using situational judgments are needed to assess whether particular skills have transferred to a novel situation where it is appropriate. In an assessment created by the first author (Halpern 2018), short paragraphs provide information about 20 different everyday scenarios (e.g., A speaker at the meeting of your local school board reported that when drug use rises, grades decline; so schools need to enforce a "war on drugs" to improve student grades); participants provide two response formats for every scenario: (a) constructed responses where they respond with short written responses, followed by (b) forced choice responses (e.g., multiple choice, rating or ranking of alternatives) for the same situations.
There is a large and growing empirical literature to support the assertion that CT skills can be learned and will transfer (when taught for transfer). See for example, Holmes et al. (2015), who wrote in the prestigious Proceedings of the National Academy of Sciences that there was "significant and sustained improvement in students' critical thinking behavior" (p. 11,199) for students who received CT instruction. Abrami et al. (2015, para. 1) concluded from a meta-analysis that "there are effective strategies for teaching CT skills, both generic and content specific, and CT dispositions, at all educational levels and across all disciplinary areas." Abrami et al. (2008, para. 1) included 341 effect sizes in a meta-analysis. They wrote: "findings make it clear that improvement in students' CT skills and dispositions cannot be a matter of implicit expectation." A strong test of whether CT skills can be used for real-word problems comes from research by Butler et al. (2017). Community adults and college students (N = 244) completed several scales including an assessment of CT, an intelligence test, and an inventory of real-life events. Both CT scores and intelligence scores predicted individual outcomes on the inventory of real-life events, but CT was a stronger predictor.
Heijltjes et al. (2015, p. 487) randomly assigned participants to either a CT instruction group or one of six other control conditions. They found that "only participants assigned to CT instruction improved their reasoning skills." Similarly, when Halpern et al. (2012) used random assignment of participants to either a learning group where they were taught scientific reasoning skills using a game format or a control condition (which also used computerized learning and was similar in length), participants in the scientific skills learning group showed higher proportional learning gains than students who did not play the game. As the body of additional supportive research is too large to report here, interested readers can find additional lists of CT skills and support for the assertion that these skills can be learned and will transfer in Halpern and Dunn (Forthcoming). There is a clear need for more high-quality research on the application and transfer of CT and its relationship to IQ.
A pandemic occurs when a disease runs rampant over an entire country or even the world. Pandemics have occurred throughout history: At the time of writing this article, COVID-19 is a world-wide pandemic whose actual death rate is unknown but estimated with projections of several million over the course of 2021 and beyond (Mega 2020). Although vaccines are available, it will take some time to inoculate most or much of the world's population. Since March 2020, national and international health agencies have created a list of actions that can slow and hopefully stop the spread of COVID (e.g., wearing face masks, practicing social distancing, avoiding group gatherings), yet many people in the United States and other countries have resisted their advice.
Could instruction in CT encourage more people to accept and comply with simple life-saving measures? There are many possible reasons to believe that by increasing citizens' CT abilities, this problematic trend can be reversed for, at least, some unknown percentage of the population. We recognize the long history of social and cognitive research showing that changing attitudes and behaviors is difficult, and it would be unrealistic to expect that individuals with extreme beliefs supported by their social group and consistent with their political ideologies are likely to change. For example, an Iranian cleric and an orthodox rabbi both claimed (separately) that the COVID-19 vaccine can make people gay (Marr 2021). These unfounded opinions are based on deeply held prejudicial beliefs that we expect to be resistant to CT. We are targeting those individuals whose beliefs are less extreme and may be based on reasonable reservations, such as concern about the hasty development of the vaccine and the lack of long-term data on its effects. There should be some unknown proportion of individuals who can change their COVID-19-related beliefs and actions with appropriate instruction in CT. CT can be a (partial) antidote for the chaos of the modern world with armies of bots creating content on social media, political and other forces deliberately attempting to confuse issues, and almost all media labeled "fake news" by social influencers (i.e., people with followers that sometimes run to millions on various social media). Here are some CT skills that could be helpful in getting more people to think more critically about pandemic-related issues.
Early communications about the ability of masks to prevent the spread of COVID from national health agencies were not consistent. In many regions of the world, the benefits of wearing masks incited prolonged and acrimonious debates (Tang 2020). However, after the initial confusion, virtually all of the global and national health organizations (e.g., WHO, National Health Service in the U.K., U.S. Centers for Disease Control and Prevention) endorse masks as a way to slow the spread of COVID (Cheng et al. 2020; Chu et al. 2020). However, as we know, some people do not trust governmental agencies and often cite the conflicting information that was originally given as a reason for not wearing a mask. There are varied reasons for refusing to wear a mask, but the one most often cited is that it is against civil liberties (Smith 2020). Reasoning by analogy is an appropriate CT skill for evaluating this belief (and a key skill in legal thinking). It might be useful to cite some of the many laws that already regulate our behavior such as, requiring health inspections for restaurants, setting speed limits, mandating seat belts when riding in a car, and establishing the age at which someone can consume alcohol. Individuals would be asked to consider how the mandate to wear a mask compares to these and other regulatory laws.
Another reason why some people resist the measures suggested by virtually every health agency concerns questions about whom to believe. Could training in CT change the beliefs and actions of even a small percentage of those opposed to wearing masks? Such training would include considering the following questions with practice across a wide domain of knowledge: (a) Does the source have sufficient expertise? (b) Is the expertise recent and relevant? (c) Is there a potential for gain by the information source, such as financial gain? (d) What would the ideal information source be and how close is the current source to the ideal? (e) Does the information source offer evidence that what they are recommending is likely to be correct? (f) Have you traced URLs to determine if the information in front of you really came from the alleged source? And so on. Of course, not everyone will respond in the same way to each question, so there is little likelihood that we would all think alike, but these questions provide a framework for evaluating credibility. Donovan et al. (2015) were successful using a similar approach to improve dynamic decision-making by asking participants to reflect on questions that relate to the decision. Imagine the effect of rigorous large-scale education in CT from elementary through secondary schools, as well as at the university level. As stated above, empirical evidence has shown that people can become better thinkers with appropriate instruction in CT. With training, could we encourage some portion of the population to become more astute at judging the credibility of a source of information? It is an experiment worth trying.
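To see how such a framework might be operationalized, here is a minimal Python sketch (our illustration, not anything from the cited studies) that encodes the six questions above as a simple pass/fail checklist. The equal weighting and the example answers are illustrative assumptions, not validated values.

```python
# A toy credibility checklist based on the six questions above.
# Equal weighting is an illustrative assumption, not a validated scheme.

CREDIBILITY_QUESTIONS = [
    "Does the source have sufficient expertise?",
    "Is the expertise recent and relevant?",
    "Is the source free of potential gain (e.g., financial)?",
    "Is the source close to the ideal information source?",
    "Does the source offer evidence for its recommendation?",
    "Does the URL trace back to the alleged source?",
]

def credibility_score(answers: list[bool]) -> float:
    """Return the fraction of credibility checks that pass."""
    if len(answers) != len(CREDIBILITY_QUESTIONS):
        raise ValueError("Answer every question.")
    return sum(answers) / len(answers)

# Hypothetical example: expertise and evidence, but a financial
# stake and an unverified URL.
answers = [True, True, False, True, True, False]
print(f"Credibility checks passed: {credibility_score(answers):.0%}")  # 67%
```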
Historical records show that refusal to wear a mask during a pandemic is not a new reaction. The influenza pandemic of 1918 also included mandates to wear masks, which drew public backlash. Then, as now, many people refused, even when they were told that it was a symbol of "wartime patriotism" because the 1918 pandemic occurred during World War I (Lovelace 2020). CT instruction would include instruction in why and how to compute cost–benefit analyses. Estimates of "lives saved" by wearing a mask can be made meaningful with graphical displays that allow more people to understand large numbers. Gigerenzer (2020) found that people can understand risk ratios in medicine when the numbers are presented as frequencies instead of probabilities. If this information were used when presenting the likelihood of illness and death from COVID-19, could we increase the numbers of people who understand the severity of this disease? Small scale studies by Gigerenzer have shown that it is possible.
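Gigerenzer's frequency format is easy to illustrate. Below is a minimal Python sketch of the idea (ours, not Gigerenzer's); the 0.4% figure is a made-up placeholder, not an actual COVID-19 statistic.

```python
def as_natural_frequency(probability: float, population: int = 10_000) -> str:
    """Re-express a probability as 'N out of M people', the format
    Gigerenzer found people grasp more readily than percentages."""
    affected = round(probability * population)
    return f"{affected:,} out of {population:,} people"

# Hypothetical illustration only -- not a real fatality rate.
risk = 0.004
print(f"A {risk:.1%} risk")        # "A 0.4% risk"
print(as_natural_frequency(risk))  # "40 out of 10,000 people"
```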
The process of analyzing arguments requires that individuals rate the strength of support for and against a conclusion. By engaging in this practice, they must consider evidence and reasoning that may run counter to a preferred outcome. Kozyreva et al. ( 2020 ) call the deliberate failure to consider both supporting and conflicting data “deliberate ignorance”—avoiding or failing to consider information that could be useful in decision-making because it may collide with an existing belief. When applied to COVID-19, people would have to decide if the evidence for and against wearing a face mask is a reasonable way to stop the spread of this disease, and if they conclude that it is not, what are the costs and benefits of not wearing masks at a time when governmental health organizations are making them mandatory in public spaces? Again, we wonder if rigorous and systematic instruction in argument analysis would result in more positive attitudes and behaviors that relate to wearing a mask or other real-world problems. We believe that it is an experiment worth doing.
We believe that teaching CT is a worthwhile approach for educating the general public in order to improve reasoning and motivate actions to address, avert, or ameliorate real-world problems like the COVID-19 pandemic. Evidence suggests that CT can guide intelligent responses to societal and global problems. We are NOT claiming that CT skills will be a universal solution for the many real-world problems that we confront in contemporary society, or that everyone will substitute CT for other decision-making practices, but we do believe that systematic education in CT can help many people become better thinkers, and we believe that this is an important step toward creating a society that values and practices routine CT. The challenges are great, but the tools to tackle them are available, if we are willing to use them.
Make sure that the decisions that matter are not made based on bias.
Though the concept of illusory superiority arguably dates back to Confucius and Socrates, it may come as a shock that its discussion in the form of the Dunning-Kruger Effect is almost 20 years old; and though it may simply be a result of an echo chamber created through my own social media, it seems to be popping up quite frequently in the news and posts that I've been reading lately—even through memes. For those of you unfamiliar with the phenomenon, the Dunning-Kruger Effect refers to a cognitive bias in which individuals with a low level of knowledge in a particular subject mistakenly assess their knowledge or ability as greater than it is. Similarly, it also refers to experts underestimating their own level of knowledge or ability.
But, then again, maybe it's not my echo chamber—maybe it is part and parcel of our new knowledge economy (Dwyer, 2017; Dwyer, Hogan & Stewart, 2014) and the manner in which we quickly and effortlessly process information (right or wrong) with the help of the internet. In any case, given the frequency with which I seem to have encountered mention of this cognitive bias lately, coupled with the interest in my previous blog post "18 Common Logical Fallacies and Persuasion Techniques," I decided it might be interesting to compile a similar list—this time, one of cognitive biases.
A cognitive bias refers to a "systematic error" in the thinking process. Such biases are often connected to a heuristic, which is essentially a mental shortcut—heuristics allow one to make an inference without extensive deliberation and/or reflective judgment, given that they are essentially schemas for such solutions (West, Toplak, & Stanovich, 2008). Though there are many interesting heuristics out there, the following list deals exclusively with cognitive biases. Furthermore, these are not the only cognitive biases out there (e.g., there's also the halo effect and the just world phenomenon); rather, they are 12 common biases that affect how we make everyday decisions, from my experience.
In addition to the explanation of this effect above, experts are often aware of what they don’t know and (hopefully) engage their intellectual honesty and humility in this fashion. In this sense, the more you know, the less confident you're likely to be—not out of lacking knowledge, but due to caution. On the other hand, if you know only a little about something, you see it simplistically—biasing you to believe that the concept is easier to comprehend than it may actually be.
Just because I put the Dunning-Kruger Effect in the number one spot does not mean I consider it the most commonly engaged bias—it is an interesting effect, sure; but in my critical thinking classes, the confirmation bias is the one I constantly warn students about. We all favour ideas that confirm our existing beliefs and what we think we know. Likewise, when we conduct research, we all suffer from trying to find sources that justify what we believe about the subject. This bias brings to light the importance of, as I discussed in my previous post on "5 Tips for Critical Thinking," playing devil's advocate. That is, we must overcome confirmation bias and consider both sides (or, if there are more than two, all sides) of the story. Remember, we are cognitively lazy—we don't like changing our knowledge (schema) structures and how we think about things.
Ever fail an exam because your teacher hates you? Ever go in the following week and ace the next one because you studied extra hard despite that teacher? Congratulations, you’ve engaged the self-serving bias. We attribute successes and positive outcomes to our doing, basking in our own glory when things go right; but, when we face failure and negative outcomes, we tend to attribute these events to other people or contextual factors outside ourselves.
Similar in ways to the availability heuristic (Tversky & Kahneman, 1974) and, to some extent, the false consensus effect, once you (truly) understand a new piece of information, that piece of information is now available to you and often becomes seemingly obvious. It might be easy to forget that there was ever a time you didn't know this information and so, you assume that others, like yourself, also know this information: the curse of knowledge. However, it is often an unfair assumption that others share the same knowledge. The hindsight bias is similar to the curse of knowledge in that once we have information about an event, it then seems obvious that it was going to happen all along. I should have seen it coming!
As you probably guessed from the name, we have a tendency to overestimate the likelihood of positive outcomes, particularly if we are in good humour, and to overestimate the likelihood of negative outcomes if we are feeling down or have a pessimistic attitude. In either the case of optimism or pessimism, be aware that emotions can make thinking irrational. Remember one of my "5 Tips for Critical Thinking": Leave emotion at the door.
Though labeled a fallacy, I see "sunk cost" as just as much in tune with bias as faulty thinking, given the manner in which we think in terms of winning, losing, and breaking even. For example, we generally believe that when we put something in, we should get something out—whether it's effort, time, or money. With that, sometimes we lose… and that's it—we get nothing in return. A sunk cost refers to something lost that cannot be recovered. Our aversion to losing (Kahneman, 2011) makes us irrationally cling to the idea of regaining even though it has already been lost (known in gambling as chasing the pot—when we make a bet and chase after it, perhaps making another bet to recoup the original [and hopefully more] even though, rationally, we should consider the initial bet as out-and-out lost). The appropriate advice of cutting your losses is applicable here.
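The rational rule underneath that advice is that only future costs and payoffs should drive a decision. A minimal Python sketch with made-up numbers (mine, not from any cited source): the expected value of the next bet is identical whether you have already lost $0 or $500, so letting the sunk amount change the decision is exactly the bias at work.

```python
def expected_value(p_win: float, payout: float, stake: float) -> float:
    """Expected value of one future bet; sunk losses never enter the formula."""
    return p_win * payout - (1 - p_win) * stake

# Hypothetical bet: a 40% chance to win $100, risking a $100 stake.
ev = expected_value(p_win=0.40, payout=100, stake=100)

# The bet is worth -$20 no matter how much has already been lost.
# "Chasing the pot" means letting the sunk amount change the decision anyway.
for already_lost in (0, 500):
    print(f"Sunk so far: ${already_lost}; EV of next bet: {ev:+.2f} USD")
```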
Negativity bias is not totally separate from pessimism bias, but it is subtly and importantly distinct. In fact, it works according to similar mechanics as the sunk cost fallacy in that it reflects our profound aversion to losing. We like to win, but we hate to lose even more. So, when we make a decision, we generally think in terms of outcomes—either positive or negative. The bias comes into play when we irrationally weigh the potential for a negative outcome as more important than that of a positive outcome.
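This asymmetric weighting has a standard formalization in prospect theory's value function, where losses are multiplied by a coefficient greater than one (Kahneman, 2011, discusses this as loss aversion). A minimal sketch using commonly cited parameter estimates (λ ≈ 2.25, α ≈ 0.88, from Tversky and Kahneman's later work, not from this post):

```python
def prospect_value(x: float, alpha: float = 0.88, lam: float = 2.25) -> float:
    """Prospect-theory value function: losses loom larger than gains.
    alpha < 1 gives diminishing sensitivity; lam > 1 encodes loss aversion."""
    if x >= 0:
        return x ** alpha
    return -lam * (-x) ** alpha

# A $100 loss feels roughly 2.25x as bad as a $100 gain feels good.
print(prospect_value(100))   # ~57.5
print(prospect_value(-100))  # ~-129.4
```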
You may have heard the complaint that the internet will be the downfall of information dissemination; but, Socrates reportedly said the same thing about the written word. Declinism refers to a bias in favour of the past over and above "how things are going." Similarly, you might know a member of an older generation who prefaces grievances with, "Well, back in my day" before following up with how things are supposedly getting worse. The decline bias may result from something I've mentioned repeatedly in my posts—we don't like change. People like their worlds to make sense, they like things wrapped up in nice, neat little packages. Our world is easier to engage in when things make sense to us. When things change, so must the way in which we think about them; and because we are cognitively lazy (Kahneman, 2011; Simon, 1957), we try our best to avoid changing our thought processes.
The backfire effect refers to the strengthening of a belief even after it has been challenged. Cook and Lewandowsky (2011) explain it very well in the context of changing people's minds in their Debunking Handbook. The backfire effect may work based on the same foundation as Declinism, in that we do not like change. It is also similar to negativity bias, in that we wish to avoid losing and other negative outcomes—in this case, one's idea is being challenged or rejected (i.e., perceived as being made out to be "wrong") and thus, they may hold on tighter to the idea than they had before. However, there are caveats to the backfire effect—for example, we also tend to abandon a belief if there's enough evidence against it with regard to specific facts.
The fundamental attribution error is similar to the self-serving bias, in that we look for contextual excuses for our failures, but generally blame other people or their characteristics for their failures. It also may stem from the availability heuristic in that we make judgments based only on the information we have available at hand.
One of the best textbook examples of this integrates stereotyping: Imagine you are driving behind another car. The other driver is swerving a bit and unpredictably starts speeding up and slowing down. You decide to overtake them (so as to no longer be stuck behind such a dangerous driver) and as you look over, you see a female behind the wheel. The fundamental attribution error kicks in when you make the judgment that their driving is poor because they’re a woman (also tying on to an unfounded stereotype). But what you probably don’t know is that the other driver has three children yelling and goofing around in the backseat, while she’s trying to get one to soccer, one to dance, and the other to a piano lesson. She’s had a particularly tough day and now she’s running late with all of the kids because she couldn’t leave work at the normal time. If we were that driver, we’d judge ourselves as driving poorly because of these reasons, not because of who we are. Tangentially, my wife is a much better driver than I am.
As we have seen through consideration of the self-serving bias and the fundamental attribution error, we have a tendency to be relatively kind when making judgments about ourselves. Simply, in-group bias refers to the unfair favouring of someone from one's own group. You might think that you're unbiased, impartial, and fair, but we all succumb to this bias, having evolved to be this way. That is, from an evolutionary perspective, this bias can be considered an advantage—favouring and protecting those similar to you, particularly with respect to kinship and the promotion of one's own line.
As in the case of Declinism, to better understand the Forer effect (commonly known as the Barnum Effect), it's helpful to acknowledge that people like their world to make sense. If it didn't, we would have no pre-existing routine to fall back on and we'd have to think harder to contextualise new information. With that, if there are gaps in our thinking of how we understand things, we will try to fill those gaps in with what we intuitively think makes sense, subsequently reinforcing our existing schema(s). As our minds make such connections to consolidate our own personal understanding of the world, it is easy to see how people can tend to process vague information and interpret it in a manner that makes it seem personal and specific to them. Given our egocentric nature (along with our desire for nice, neat little packages and patterns), when we process vague information, we hold on to what we deem meaningful to us and discard what is not. Simply, we better process information we think is specifically tailored to us, regardless of ambiguity. Specifically, the Forer effect refers to the tendency for people to accept vague and general personality descriptions as uniquely applicable to themselves without realizing that the same description could be applied to just about everyone else (Forer, 1949). For example, when people read their horoscope, even vague, general information can seem like it's advising something relevant and specific to them.
While heuristics are generally useful for making inferences by providing us with cognitive shortcuts that help us stave off decision fatigue, some forms of heuristics can make our judgments irrational. Though various cognitive biases were covered in this post, these are by no means the only biases out there—just the most commonly engaged, in my experience, with respect to everyday decision-making. If you're interested in learning more about these and other cognitive biases, I recommend checking out yourbias.is. Remember, we make thousands of decisions every day, some more important than others. Make sure that the ones that do matter are not made based on bias, but rather on reflective judgment and critical thinking.
Cook, J. & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland. Retrieved from http://www.skepticalscience.com/docs/Debunking_Handbook.pdf
Dwyer, C.P. (2017). Critical thinking: Conceptual perspectives and practical guidelines. Cambridge, UK: Cambridge University Press; with foreword by former APA President, Dr. Diane F. Halpern.
Dwyer, C. P., Hogan, M. J., & Stewart, I. (2014). An integrated critical thinking framework for the 21st century. Thinking Skills & Creativity, 12, 43–52.
Forer, B. R. (1949). The fallacy of personal validation: A classroom demonstration of gullibility. Journal of Abnormal Psychology, 44, 118–121.
Kahneman, D. (2011). Thinking, fast and slow. Great Britain: Penguin.
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77, 6, 1121–1134.
Simon, H. A. (1957). Models of man. New York: Wiley.
Tversky, A. & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185, 4157, 1124–1131.
West, R. F., Toplak, M. E., & Stanovich, K. E. (2008). Heuristics and biases as measures of critical thinking: Associations with cognitive ability and thinking dispositions. Journal of Educational Psychology, 100, 4, 930–941.
Christopher Dwyer, Ph.D., is a lecturer at the Technological University of the Shannon in Athlone, Ireland.
Which of these sway your thinking the most?
The hindsight bias, the anchoring bias, the misinformation effect, the actor-observer bias, the false consensus effect, the halo effect, the self-serving bias, the availability heuristic, the optimism bias.
Although we like to believe that we're rational and logical, the fact is that we are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.
Sometimes, cognitive biases are fairly obvious. You might even find that you recognize these tendencies in yourself or others. In other cases, these biases are so subtle that they are almost impossible to notice.
Attention is a limited resource. This means we can't possibly evaluate every possible detail and event when forming thoughts and opinions. Because of this, we often rely on mental shortcuts that speed up our ability to make judgments, but this can sometimes lead to bias. There are many types of biases—including the confirmation bias, the hindsight bias, and the anchoring bias, just to name a few—that can influence our beliefs and actions daily.
The following are just a few types of cognitive biases that have a powerful influence on how you think, how you feel, and how you behave.
The confirmation bias is the tendency to listen more often to information that confirms our existing beliefs. Through this bias, people tend to favor information that reinforces the things they already think or believe.
There are a few reasons why this happens. One is that only seeking to confirm existing opinions helps limit mental resources we need to use to make decisions. It also helps protect self-esteem by making people feel that their beliefs are accurate.
People on two sides of an issue can listen to the same story and walk away with different interpretations that they feel validates their existing point of view. This is often indicative that the confirmation bias is working to "bias" their opinions.
The problem with this is that it can lead to poor choices, an inability to listen to opposing views, or even contribute to othering people who hold different opinions.
Things that we can do to help reduce the impact of confirmation bias include being open to hearing others' opinions, specifically looking for and researching opposing views, reading full articles (and not just headlines), questioning the source, and doing the research yourself to see if the source is reliable.
The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are. It's also commonly referred to as the "I knew it all along" phenomenon.
Some examples of the hindsight bias include:
In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court.
Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.
The hindsight bias occurs for a combination of reasons, including our ability to "misremember" previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.
The effect of this bias is that it causes us to overestimate our ability to predict events. This can sometimes lead people to take unwise risks.
The anchoring bias is the tendency to be overly influenced by the first piece of information that we hear.
While the existence of the anchoring bias is well documented, its causes are still not fully understood. Some research suggests that the source of the anchor information may play a role. Other factors such as priming and mood also appear to have an influence.
Like other cognitive biases, anchoring can have an effect on the decisions you make each day. For instance, it can influence how much you are willing to pay for your home. However, it can sometimes lead to poor choices and make it more difficult for people to consider other factors that might also be important.
The misinformation effect is the tendency for memories to be heavily influenced by things that happened after the actual event itself. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.
For example:
In one classic experiment by memory expert Elizabeth Loftus , people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”
When the witnesses were then questioned a week later whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.
There are a few factors that may play a role in this phenomenon. New information may get blended with older memories. In other cases, new information may be used to fill in "gaps" in memory.
The effects of misinformation can range from the trivial to much more serious. It might cause you to misremember something you thought happened at work, or it might lead to someone incorrectly identifying the wrong suspect in a criminal case.
The actor-observer bias is the tendency to attribute our actions to external influences and other people's actions to internal ones. The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation.
When it comes to our own actions, we are often far too likely to attribute things to external influences.
When it comes to explaining other people's actions, however, we are far more likely to attribute their behaviors to internal causes.
While there are many factors that may play a role, perspective plays a key role. When we are the actors in a situation, we are able to observe our own thoughts and behaviors. When it comes to other people, however, we cannot see what they are thinking. This means we focus on situational forces for ourselves, but guess at the internal characteristics that cause other people's actions.
The problem with this is that it often leads to misunderstandings. Each side of a situation is essentially blaming the other side rather than thinking about all of the variables that might be playing a role.
The false consensus effect is the tendency people have to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values.
Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we start to think that this way of thinking is the majority opinion even when we are with people who are not among our group of family and friends.
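That first reason lends itself to a small simulation: if your friends disproportionately share your belief, estimating the population's views from your social circle systematically inflates perceived consensus. A toy Python sketch with made-up numbers (the 30% base rate and 0.7 homophily value are illustrative assumptions):

```python
import random

random.seed(42)  # for reproducibility

POP_RATE = 0.30  # hypothetical: 30% of the population holds belief B

def sample_friend(my_belief: bool, homophily: float = 0.7) -> bool:
    """Friends aren't a random sample: with probability `homophily`
    a friend shares your belief; otherwise they're drawn at random."""
    if random.random() < homophily:
        return my_belief
    return random.random() < POP_RATE

me = True  # I hold belief B
friends = [sample_friend(me) for _ in range(100)]
estimate = sum(friends) / len(friends)

print(f"Actual population agreement: {POP_RATE:.0%}")
print(f"My estimate from my friends: {estimate:.0%}")  # ~79% in expectation
```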
Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem . It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.
This can lead people not only to incorrectly think that everyone else agrees with them—it can sometimes lead them to overvalue their own opinions. It also means that we sometimes don't consider how other people might feel when making choices.
The halo effect is the tendency for an initial impression of a person to influence what we think of them overall. Also known as the "physical attractiveness stereotype" or the "what is beautiful is good" principle, the halo effect shapes our judgments of others almost every day, whether we are influenced by it or use it to influence others.
One factor that may influence the halo effect is our tendency to want to be correct. If our initial impression of someone was positive, we want to look for proof that our assessment was accurate. It also helps people avoid experiencing cognitive dissonance , which involves holding contradictory beliefs.
This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.
The self-serving bias is a tendency for people to give themselves credit for successes but lay the blame for failures on outside causes. When you do well on a project, you probably assume that it's because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck.
The self-serving bias can be influenced by a variety of factors. Age and sex have been shown to play a part. Older people are more likely to take credit for their successes, while men are more likely to pin their failures on outside forces.
This bias does serve an important role in protecting self-esteem. However, it can often also lead to faulty attributions such as blaming others for our own shortcomings.
The availability heuristic is the tendency to estimate the probability of something happening based on how many examples readily come to mind.
It is essentially a mental shortcut designed to save us time when we are trying to determine risk. The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions.
Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking. In contrast, if you have two sisters and five neighbors who have had breast cancer, you might believe it is even more common than statistics suggest.
The optimism bias is a tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good, assuming that negative events simply won't affect us.
The optimism bias has roots in the availability heuristic. Because you can probably think of examples of bad things happening to other people, it seems more likely that others will be affected by negative events.
This bias can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt. The bad news is that research has found that this optimism bias is incredibly difficult to reduce.
There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals.
Many other cognitive biases can also distort how we perceive the world.
The cognitive biases above are common, but they are only a sampling of the many biases that can affect your thinking. Collectively, these biases influence much of our thinking and, ultimately, our decision-making.
Many of these biases are inevitable. We simply don't have the time to evaluate every thought in every decision for the presence of any bias. Understanding these biases, however, helps us learn how they can lead us to poor decisions in life.
Dietrich D, Olson M. A demonstration of hindsight bias using the Thomas confirmation vote. Psychol Rep. 1993;72(2):377-378. doi:10.2466/pr0.1993.72.2.377
Lee KK. An indirect debiasing method: Priming a target attribute reduces judgmental biases in likelihood estimations. PLoS ONE. 2019;14(3):e0212609. doi:10.1371/journal.pone.0212609
Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: A systematic review. BMC Med Inform Decis Mak. 2016;16(1):138. doi:10.1186/s12911-016-0377-1
Furnham A, Boo HC. A literature review of anchoring bias. The Journal of Socio-Economics. 2011;40(1):35-42. doi:10.1016/j.socec.2010.10.008
Loftus EF. Leading questions and the eyewitness report. Cognitive Psychology. 1975;7(4):560-572. doi:10.1016/0010-0285(75)90023-7
Challies DM, Hunt M, Garry M, Harper DN. Whatever gave you that idea? False memories following equivalence training: A behavioral account of the misinformation effect. J Exp Anal Behav. 2011;96(3):343-362. doi:10.1901/jeab.2011.96-343
Miyamoto R, Kikuchi Y. Gender differences of brain activity in the conflicts based on implicit self-esteem. PLoS ONE. 2012;7(5):e37901. doi:10.1371/journal.pone.0037901
Weinstein ND, Klein WM. Resistance of personal risk perceptions to debiasing interventions. Health Psychol. 1995;14(2):132-140. doi:10.1037/0278-6133.14.2.132
Gratton G, Cooper P, Fabiani M, Carter CS, Karayanidis F. Dynamics of cognitive control: Theoretical bases, paradigms, and a view for the future. Psychophysiology. 2018;55(3). doi:10.1111/psyp.13016
By Kendra Cherry, MSEd. Kendra Cherry is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."
by Terry Heick
Cognitive biases are a kind of ongoing cognitive ‘condition’–tendencies to selectively search for and interpret data in a way that confirms one’s existing beliefs.
A cognitive bias is an inherent thinking ‘blind spot’ that reduces thinking accuracy and results in inaccurate, and often irrational, conclusions.
Much like logical fallacies, cognitive biases can be viewed as either causes or effects, but both can generally be reduced to broken thinking. Not all ‘broken thinking,’ blind spots, and failures of thought are labeled, of course. But some are so common that they are given names, and once named, they’re easier to identify, emphasize, analyze, and ultimately avoid.
See also The Difference Between Logical Fallacies And Cognitive Biases
And that’s where this list comes in.
Cognitive Bias –> Confirmation Bias
For example, consider confirmation bias.
In What Is Confirmation Bias? we looked at this very common thinking mistake: the tendency to overvalue data and observation that fits with our existing beliefs.
The pattern is to form a theory (often based on emotion) supported with insufficient data, and then to restrict critical thinking and ongoing analysis, which is, of course, irrational. Instead, you look for data that fits your theory.
While it seems obvious enough to avoid, confirmation bias is a particularly sinister cognitive bias, affecting not just intellectual debates but also relationships, personal finances, and even your physical and mental health. Racism and sexism, for example, can both be deepened by confirmation bias. If you have an opinion on gender roles, it can be tempting to look for ‘data’ from your daily life that reinforces your opinion on those roles.
This is, of course, all much more complex than the above thumbnail. The larger point, however, is that a failure of rational and critical thinking is not just ‘wrong’ but erosive, even toxic, in academia and at every other level of society.
See also Complete List Of Logical Fallacies With Examples
The Cognitive Bias Codex: A Visual Of 180+ Cognitive Biases
And that’s why a graphic like this is so extraordinary. In a single image, it delineates dozens and dozens of these ‘bad cognitive patterns’ and, as a visual, underscores how commonly our thinking fails us and, as a result, where we might begin to improve. Why and how to accomplish this in a modern circumstance is at the core of TeachThought’s mission.
The graphic is structured as a circle, with four quadrants grouping the cognitive biases into four categories:
1. Too Much Information
2. Not Enough Meaning
3. Need To Act Fast
4. What Should We Remember?
We’ve listed each bias below, moving clockwise from ‘Too Much Information’ to ‘What Should We Remember?’ Obviously, this list isn’t exhaustive, and there are even subjectivities and cultural biases embedded within (down to some of the biases themselves, the ‘IKEA effect,’ for example). The premise, though, remains intact: What are our most common failures of rational and critical thinking, and how can we avoid them in pursuit of academic and sociocultural progress?
So take a look and let me know what you think. There’s even an updated version of this graphic with all of the definitions for each of the biases, which I personally love but which is difficult to read.
Image description: Wikipedia’s complete (as of 2021) list of cognitive biases, arranged and designed by John Manoogian III. Categories and descriptions originally by Buster Benson.
Too Much Information
We notice things already primed in memory or repeated often
Availability heuristic
Attentional bias
Illusory truth effect
Mere exposure effect
Context effect
Cue-dependent forgetting
Mood-congruent memory bias
Frequency illusion
Baader-Meinhof Phenomenon
Empathy gap
Omission bias
Base rate fallacy
Bizarre, funny, visually-striking, or anthropomorphic things stick out more than non-bizarre/unfunny things
Bizarreness effect
Humor effect
Von Restorff effect
Picture superiority effect
Self-relevance effect
Negativity bias
We notice when something has changed
Conservatism
Contrast effect
Distinction effect
Focusing effect
Framing effect
Money illusion
Weber-Fechner law
We are drawn to details that confirm our own existing beliefs
Confirmation bias
Congruence bias
Post-purchase rationalization
Choice-supportive bias
Selective perception
Observer-expectancy effect
Experimenter’s bias
Observer effect
Expectation bias
Ostrich effect
Subjective validation
Continued influence effect
Semmelweis reflex
We notice flaws in others more easily than we notice flaws in ourselves
Bias blind spot
Naive cynicism
Naive realism
Not Enough Meaning
We tend to find stories and patterns even when looking at sparse data
Confabulation
Clustering illusion
Insensitivity to sample size
Neglect of probability
Anecdotal fallacy
Illusion of validity
Masked man fallacy
Recency illusion
Gambler’s fallacy
Illusory correlation
Anthropomorphism
We fill in characteristics from stereotypes, generalities, and prior histories
Group attribution error
Ultimate attribution error
Stereotyping
Essentialism
Functional fixedness
Moral credential effect
Just-world hypothesis
Argument from fallacy
Authority bias
Automation bias
Bandwagon effect
Placebo effect
We imagine things and people we’re familiar with or fond of as better
Out-group homogeneity bias
Cross-race effect
In-group bias
Halo effect
Cheerleader effect
Positivity effect
Not invented here
Reactive devaluation
Well-traveled road effect
We simplify probabilities and numbers to make them easier to think about
Mental accounting
Appeal to probability fallacy
Normalcy bias
Murphy’s Law
Zero-sum bias
Survivorship bias
Subadditivity effect
Denomination effect
Magic number 7±2
We think we know what other people are thinking
Illusion of transparency
Curse of knowledge
Spotlight effect
Extrinsic incentive error
Illusion of external agency
Illusion of asymmetric insight
We project our current mindset and assumptions onto the past and future
Self-consistency bias
Restraint bias
Projection bias
Pro-innovation bias
Time-saving bias
Planning fallacy
Pessimism bias
Impact bias
Outcome bias
Hindsight bias
Rosy retrospection
Telescoping effect
Need To Act Fast
We favor simple-looking options and complete information over complex, ambiguous options
Less-is-better effect
Occam’s razor
Conjunction fallacy
Delmore effect
Law of Triviality
Bike-shedding effect
Rhyme as reason effect
Belief bias
Information bias
Ambiguity bias
To avoid mistakes, we aim to preserve autonomy and group status and avoid irreversible decisions
Status quo bias
Social comparison bias
Decoy effect
Reverse psychology
System justification
To get things done, we tend to complete things we’ve invested time and energy in
Backfire effect
Endowment effect
Processing difficulty effect
Pseudocertainty effect
Disposition effect
Zero-risk bias
IKEA effect
Loss aversion
Generation effect
Escalation of commitment
Irrational escalation
Sunk cost fallacy
To stay focused, we favor the immediate, relatable thing in front of us
Identifiable victim effect
Appeal to novelty
Hyperbolic discounting
To act, we must be confident we can make an impact and feel what we do is important
Peltzman effect
Risk compensation
Effort justification
Trait ascription bias
Defensive attribution hypothesis
Fundamental attribution error
Illusory superiority
Illusion of control
Actor-observer bias
Self-serving bias
Barnum effect
Forer effect
Optimism bias
Egocentric bias
Dunning-Kruger effect
Lake Wobegon effect
Hard-easy effect
False consensus effect
Third-person effect
Social desirability bias
Overconfidence effect
What Should We Remember?
We store memories differently based on how they are experienced
Tip of the tongue phenomenon
Google effect
Next-in-line effect
Testing effect
Absent-mindedness
Levels of processing effect
We reduce events and lists to their key elements
Suffix effect
Serial position effect
Part-list cueing effect
Recency effect
Primacy effect
Memory inhibition
Modality effect
Duration neglect
List-length effect
Serial recall effect
Misinformation effect
Leveling and sharpening
Peak-end rule
We discard specifics to form generalities
Fading affect bias
Stereotypical bias
Implicit stereotypes
Implicit association
We edit and reinforce some memories after the fact
Spacing effect
Suggestibility
False memory
Cryptomnesia
Source confusion
Misattribution of memory
Creative Commons Attribution: Share-Alike
Founder & Director of TeachThought
Learning objectives.
By the end of this section, you will be able to:
Classify and describe cognitive biases.
Apply critical reflection strategies to resist cognitive biases.
To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.
To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. Additionally, the critical aspect engages in a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.
This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.
To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.
Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use a checklist of habits like these to encourage metacognition when you study.
In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.
See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.
Confirmation bias.
One of the most common cognitive biases is confirmation bias , which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.
Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.
Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 8. Subjects were told to generate examples to test their hypothesis. What he found is that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
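The logic of Wason's task can be made concrete in a few lines of code. The sketch below is an illustrative model, not Wason's actual procedure: the hidden rule accepts any strictly ascending triple, while the subject's hypothesis is the narrower "each number doubles." Only a deliberately disconfirming test can tell the two apart.

```python
def hidden_rule(a: int, b: int, c: int) -> bool:
    """The experimenter's actual rule: any strictly ascending triple."""
    return a < b < c

def my_hypothesis(a: int, b: int, c: int) -> bool:
    """A typical subject's guess after seeing 2, 4, 8: each number doubles."""
    return b == 2 * a and c == 2 * b

# Confirming tests: every triple fits the guess, so both functions return True
# and the subject learns nothing that separates the guess from the rule.
for triple in [(1, 2, 4), (3, 6, 12), (5, 10, 20)]:
    print(triple, "rule:", hidden_rule(*triple), "guess:", my_hypothesis(*triple))

# A falsifying test: it breaks the guess but satisfies the rule,
# revealing that the hypothesis is too narrow.
print((1, 2, 3), "rule:", hidden_rule(1, 2, 3), "guess:", my_hypothesis(1, 2, 3))
```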
In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.
Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let’s say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.
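One way to picture the result is as a weighted average in which the reported estimate is pulled partway toward the anchor. The sketch below is a toy model, not Tversky and Kahneman's analysis; the pull weight and the unanchored guess are purely illustrative assumptions.

```python
def anchored_estimate(anchor: float, unanchored_guess: float, pull: float = 0.5) -> float:
    """Toy model: the reported estimate is a weighted average of the anchor
    and what the subject would have guessed anyway. The `pull` weight is
    purely illustrative, not a parameter from the original paper."""
    return pull * anchor + (1 - pull) * unanchored_guess

UNANCHORED_GUESS = 40  # hypothetical estimate a subject might give with no anchor

print("Anchor 10 ->", anchored_estimate(10, UNANCHORED_GUESS))  # pulled low
print("Anchor 65 ->", anchored_estimate(65, UNANCHORED_GUESS))  # pulled high
```

With these made-up parameters, a low anchor drags the estimate down and a high anchor drags it up, matching the direction of the median estimates reported in the study.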
In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.
The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.
Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.
Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true.
Sunk cost fallacy.
Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is thinking that attaches a value to things in which you have already invested resources that is greater than the value those things have today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad” by continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don’t want to lose their investment even when they are clearly headed for failure and ought to cut their losses.
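A rational decision rule makes the point sharply: whether to continue should depend only on prospective costs and benefits. In this hypothetical sketch (all figures invented for illustration), the money already spent never enters the comparison.

```python
def should_continue(future_benefit: float, future_cost: float) -> bool:
    """Rational rule: continue only if the venture looks net-positive from
    here forward. Money already spent never appears in this comparison,
    because it is gone no matter what we decide."""
    return future_benefit > future_cost

# Hypothetical failing business
already_spent = 200_000    # sunk: deliberately unused in the decision
future_benefit = 50_000    # expected benefit of one more year of operation
future_cost = 80_000       # expected cost of one more year of operation

print("Keep investing?", should_continue(future_benefit, future_cost))  # False
```

The sunk cost fallacy amounts to smuggling `already_spent` into that comparison, making a net-negative future look worth pursuing.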
A similar type of faulty reasoning leads to the gambler’s fallacy , in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
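A quick simulation confirms the statistics. In this minimal sketch, we scan a long sequence of fair coin flips for runs of five heads and check the very next flip; tails comes up about half the time, no more.

```python
import random

random.seed(42)

flips = ["H" if random.random() < 0.5 else "T" for _ in range(200_000)]

# Collect the outcome immediately following every run of five heads.
after_streak = [flips[i + 5] for i in range(len(flips) - 5)
                if flips[i:i + 5] == ["H"] * 5]

tails_share = after_streak.count("T") / len(after_streak)
print(f"P(tails after five heads) ~ {tails_share:.3f}")  # about 0.5, not higher
```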
There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.
In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.
Table 2.1 summarizes these common cognitive biases.
Bias | Description | Example |
---|---|---|
Confirmation bias | The tendency to search for, interpret, favor, and recall information that confirms or supports prior beliefs | As part of their morning routine, a person scans news headlines on the internet and chooses to read only those stories that confirm views they already hold. |
Anchoring bias | The tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something | When supplied with a random number and then asked to provide a number estimate in response to a question, people supply a number close to the random number they were initially given. |
Availability heuristic | The tendency to evaluate new information based on the most recent or most easily recalled examples | People in the United States overestimate the probability of dying in a criminal attack, since these types of stories are easy to vividly recall. |
Tribalism | The tendency for human beings to align themselves with groups with whom they share values and practices | People with a strong commitment to one political party often struggle to objectively evaluate the political positions of those who are members of the opposing party. |
Bandwagon fallacy | The tendency to do something or believe something because many other people do or believe the same thing | Advertisers often rely on the bandwagon fallacy, attempting to create the impression that “everyone” is buying a new product, in order to inspire others to buy it. |
Sunk cost fallacy | The tendency to attach a value to things in which resources have been invested that is greater than the value those things actually have | A business person continues to invest money in a failing venture, “throwing good money after bad.” |
Gambler’s fallacy | The tendency to reason that future chance events will be more likely if they have not happened recently | Someone who regularly buys lottery tickets reasons that they are “due to win,” since they haven’t won once in twenty years. |
As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the five cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?
Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?
Want to cite, share, or modify this book? This book uses the Creative Commons Attribution License and you must attribute OpenStax.
Access for free at https://openstax.org/books/introduction-philosophy/pages/1-introduction
© Mar 1, 2024 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution License . The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.
As we go through life, we tend to think that we make decisions or judgments based on objective information and events that have happened.
We also tend to imagine that our brains work like tiny but powerful supercomputers. We imagine they take in facts and make rational judgments.
But the truth is a little more complicated than that.
While we think we're objectively taking in information, the truth is otherwise. There are several cognitive biases that can alter how we make decisions. These unconscious biases can lead us to make inaccurate judgments and behave in irrational ways.
Let’s take a look at some common forms of cognitive bias and how we can overcome them.
At their core, cognitive biases are our brain’s attempt to be efficient and make decisions quickly. They serve as mental shortcuts so that our brains can speed up information processing. They help us more quickly make sense of what we’re seeing and move on to make a decision. In this sense, they're considered an “adaptive tool.”
These mental shortcuts exist to make our brains more efficient. But instead, they can create systematic errors in our way of thinking. This is because they rely on our perceptions, observations, and experiences and not on actual facts.
In reality, we see the world through our own set of filters and make decisions based on them. It’s important to acknowledge that these filters aren't “factual.” Instead, they reflect our own particular perceptions and experiences.
These biases can lead us to avoid information that we don’t like or don’t want to see. Additionally, they can cause us to see patterns that don’t exist.
While many of the cognitive biases we experience are unconscious , we can take steps to avoid them. Before we can delve into avoiding cognitive biases, we need to understand these biases and where they're most likely to show up.
There are some sure signs we can tune into that help us determine if a cognitive bias is interfering with our decisions.
Although these signs might be easier to spot in others, we can work to recognize them in ourselves.
Amos Tversky and Daniel Kahneman first introduced the idea of cognitive bias in 1972 . They demonstrated that people often made judgments and decisions that weren't rational.
There is now research across the fields of social psychology and behavioral economics confirming dozens of cognitive biases . Usually, these cognitive bias examples are due to ignoring relevant information. Or they’re due to giving weight to an unimportant but salient feature of the situation.
Let’s look at some common biases that receive the most research attention and have the greatest impact on how we navigate the world:
You've probably met someone who thought they could do anything. That's an overconfidence bias. This is when you hold a false idea about your level of talent, intellect, or skills. It can be quite a dangerous bias that can have disastrous consequences in some scenarios.
It could be overestimating your driving abilities on the highway, or overestimating your knowledge of the stock market while investing. This bias is closely related to the optimism bias, which is thinking that you’re less likely to experience a negative event. An overconfidence bias is pretty much the exact opposite of imposter syndrome.
There are a number of reasons we should strive to eliminate cognitive biases and biased thinking.
At its core, biased thinking makes it difficult for us to exchange accurate information. It can lead us to avoid information that we don’t like and fail to recognize information that could lead to a more accurate outcome.
Biases distort our critical thinking and can cause us to make irrational decisions. And finally, they can harm our relationships. Biases can cause us to make inaccurate judgments about others and then treat them accordingly.
While cognitive biases can be unconscious , there are a number of things we can do to reduce their likelihood.
The first tip to overcome these biases is to acknowledge that they exist. When we know there are factors that can alter the way we see things, we're more likely to be careful as we form judgments or make decisions.
Is there anything in the current situation that could lead you to feel overconfident in your convictions? Or cause you to ignore certain information? Make sure not to fall victim to the bandwagon effect, or adopt attitudes simply because others are.
Look for patterns in how you've perceived prior situations and where you might have made mistakes. If, for example, you see that you tend to ignore facts or overemphasize intuition, lean into opportunities to further explore the data presented to you.
Being curious can help us avoid cognitive biases. Curiosity can help us pause long enough to ask questions. It stops us from assuming we're right.
People with growth mindsets believe that cognitive ability can be developed and tend to learn from criticism. Rather than covering up mistakes, they see them as an opportunity to learn. They don’t believe that factors are “fixed” or unchangeable. Cognitive bias modification is possible with some work and effort. A growth mindset is one of many heuristics that can help move you in the right direction.
Are there people or situations that rub you the wrong way? Ask yourself what makes you respond this way and whether you could have a bias that's impacting your perspective.
Trying to understand an issue from both sides can make you a stronger critical thinker and help you see the world with more empathy . Push yourself to believe the opposite of your initial reaction and pay attention to what happens.
Solicit feedback and perspectives from others. Asking others for their input can help us find potential blind spots and stop us from being overconfident.
Go out of your way to seek out information that runs counter to your existing belief.
Intellectual humility is about remaining open to the idea that you might be wrong. Rather than blindly standing by our convictions, it’s about asking, “What am I missing here?”
We all have cognitive biases. But there are proactive steps we can take to reduce their negative impact on our judgment. Doing so will help us improve our relationships and make better decisions.
BetterUp can help your team create a more inclusive culture with less cognitive bias. See how BetterUp works by requesting a customized demo .
Bethany Klynn, PhD, is a BetterUp Fellow Coach with a PhD in Industrial/Organizational Psychology and more than 20 years of experience in leadership development, coaching, team development, and shaping organizational cultures.
Flach JM, Hale CR, Catrambone R, Whitaker ET, Hoffman RR, Klein G, Veinott B. Approaches to cognitive bias in serious games for critical thinking. In: Proceedings of the Human Factors and Ergonomics Society 56th Annual Meeting, HFES 2012; October 22-26, 2012; Boston, MA. p. 272-276. doi:10.1177/1071181312561064