Confirmation Bias In Psychology: Definition & Examples

Julia Simkus

Editor at Simply Psychology

BA (Hons) Psychology, Princeton University

Julia Simkus is a graduate of Princeton University with a Bachelor of Arts in Psychology. She began a Master's Degree in Counseling for Mental Health and Wellness in September 2023. Julia's research has been published in peer-reviewed journals.

Learn about our Editorial Process

Saul Mcleod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul Mcleod, PhD, is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.


Confirmation Bias is the tendency to look for information that supports, rather than rejects, one’s preconceptions, typically by interpreting evidence to confirm existing beliefs while rejecting or ignoring any conflicting data (American Psychological Association).

One of the earliest demonstrations of confirmation bias appeared in an experiment by Peter Wason (1960), in which subjects had to discover the experimenter's rule for sequencing numbers.

The results showed that subjects chose tests that supported their hypotheses while rejecting contradictory evidence, and they quickly became confident in those hypotheses even though they were incorrect (Gray, 2010, p. 356).

Though such evidence of confirmation bias has appeared in psychological literature throughout history, the term ‘confirmation bias’ was first used in a 1977 paper detailing an experimental study on the topic (Mynatt, Doherty, & Tweney, 1977).


Biased Search for Information

This type of confirmation bias describes people's tendency to search for evidence in a one-sided way that supports their hypotheses or theories.

Experiments have shown that people pose tests and questions designed to yield "yes" if their favored hypothesis is true, while ignoring alternative hypotheses that would produce the same result.

This is also known as the congruence heuristic (Baron, 2000, pp. 162–164). Though a preference for affirmative questions is not in itself necessarily biased, experiments have shown that congruence bias does exist.

For Example:

If you search "Are cats better than dogs?" on Google, the results will mostly be sites listing reasons why cats are better.

However, if you instead search "Are dogs better than cats?", Google will mostly return sites arguing that dogs are better than cats.

This shows that phrasing a question in a one-sided (i.e., affirmative) way tends to surface evidence consistent with your hypothesis.

Biased Interpretation

This type of bias describes how people interpret evidence in light of their existing beliefs, evaluating confirming evidence differently from evidence that challenges their preconceptions.

Various experiments have shown that people tend not to change their beliefs on complex issues even after being provided with research because of the way they interpret the evidence.

Additionally, people accept “confirming” evidence more easily and critically evaluate the “disconfirming” evidence (this is known as disconfirmation bias) (Taber & Lodge, 2006).

When provided with the same evidence, people’s interpretations could still be biased.

For example:

Biased interpretation was demonstrated in a Stanford University experiment on capital punishment, whose participants included both supporters and opponents of the death penalty.

All subjects were provided with the same two studies.

After reading detailed descriptions of the studies, participants retained their initial beliefs, citing "confirming" evidence from the studies while rejecting contradictory evidence or deeming it inferior to the "confirming" evidence (Lord, Ross, & Lepper, 1979).

Biased Memory

To confirm their current beliefs, people may remember/recall information selectively. Psychological theories vary in defining memory bias.

Some theories state that information confirming prior beliefs is stored in memory while contradictory evidence is not (e.g., schema theory). Others claim that striking information is remembered best (e.g., the humor effect).

Memory confirmation bias also serves a role in stereotype maintenance. Experiments have shown that the mental association between expectancy-confirming information and the group label strongly affects recall and recognition memory.

Though a certain stereotype about a social group might not be true for an individual, people tend to remember the stereotype-consistent information better than any disconfirming evidence (Fyock & Stangor, 1994).

In one experimental study, participants were asked to read a woman's profile (detailing her extroverted and introverted skills) and assess her suitability for a job as either a librarian or a real-estate salesperson.

Those assessing her for the salesperson role better recalled her extroverted traits, while the librarian group recalled more examples of introversion (Snyder & Cantor, 1979).

These experiments, along with others, have offered insight into selective memory and provided evidence for biased memory, showing that people search for and better remember confirming evidence.


Social Media

Information we are presented with on social media reflects not only what users want to see but also the designers' beliefs and values. Today, people are exposed to an overwhelming number of news sources, each varying in credibility.

To form conclusions, people tend to read news that aligns with their perspectives. For instance, news channels present information (even the same story) differently from one another on complex issues (e.g., racism, political parties), with some using sensational headlines, pictures, and one-sided information.

Because of this biased coverage, people rely on only certain channels or sites for their information and draw biased conclusions.

Religious Faith

People also tend to search for and interpret evidence with respect to their religious beliefs (if any).

For instance, on topics such as abortion and transgender rights, people whose religions oppose them will interpret the same information differently from others and will look for evidence that validates what they already believe.

Similarly, those who reject the theory of evolution on religious grounds will either gather information that appears to disprove evolution or hold no official stance on the topic.

Also, irreligious people might perceive events that religious people consider "miracles" or "tests of faith" as reinforcement of their own lack of faith.

When Does Confirmation Bias Occur?

There are several explanations for why humans possess confirmation bias, including that the tendency is an efficient way to process information, protects self-esteem, and minimizes cognitive dissonance.

Information Processing

Confirmation bias serves as an efficient way to process the near-limitless information humans are exposed to.

To form an unbiased decision, one would have to critically evaluate every piece of available information, which is unfeasible. Therefore, people tend to look only for the information needed to form their conclusions (Casad, 2019).

Protect Self-esteem

People are susceptible to confirmation bias because it protects their self-esteem (by reassuring them that their beliefs are accurate).

To make themselves feel confident, they tend to look for information that supports their existing beliefs (Casad, 2019).

Minimize Cognitive Dissonance

Cognitive dissonance also explains why confirmation bias is adaptive.

Cognitive dissonance is the mental conflict that occurs when a person holds two contradictory beliefs, causing psychological stress or unease.

To minimize this dissonance, people exhibit confirmation bias, avoiding information that contradicts their views and seeking evidence that confirms their beliefs.

Challenge avoidance and reinforcement seeking affect people's thoughts and reactions differently: exposure to disconfirming information produces negative emotions, whereas seeking reinforcing evidence does not ("The Confirmation Bias: Why People See What They Want to See").

Implications

Confirmation bias consistently shapes how we look for and interpret information, influencing decisions everywhere from our homes to global platforms. This bias prevents people from gathering information objectively.

During election campaigns, people tend to look for information confirming their perspectives on the candidates while ignoring information that contradicts their views.

This subjective way of gathering information can lead to overconfidence in a candidate and to misinterpreting or overlooking important information, thus influencing voting decisions and, eventually, the country's leadership (Cherry, 2020).

Recruitment and Selection

Confirmation bias also affects employment diversity because preconceived ideas about different social groups can introduce discrimination (though it might be unconscious) and impact the recruitment process (Agarwal, 2018).

The existing belief that certain groups are more competent than others is why particular races and genders remain overrepresented in companies today. This bias can hamper a company's attempts to diversify its workforce.

Mitigating Confirmation Bias

Change in intrapersonal thought:

To avoid being susceptible to confirmation bias, start questioning your research methods and the sources you use to obtain information.

Expanding the range of sources you consult can reveal different aspects of a topic and help you assess each source's credibility.

  • Read entire articles rather than forming conclusions based on the headlines and pictures.
  • Search for credible evidence presented in the article.
  • Analyze whether the statements being asserted are backed by trustworthy evidence (tracking the source of evidence can establish its credibility).
  • Encourage yourself and others to gather information in a conscious manner.

Alternative hypothesis:

Confirmation bias occurs when people tend to look for information that confirms their beliefs or hypotheses, but this bias can be reduced by taking into account alternative hypotheses and their consequences.

Considering the possibility of beliefs/hypotheses other than one’s own could help you gather information in a more dynamic manner (rather than a one-sided way).

Related Cognitive Biases

Many cognitive biases can be characterized as subtypes of confirmation bias. Two of these subtypes are described below:

Backfire Effect

The backfire effect occurs when people’s preexisting beliefs strengthen when challenged by contradictory evidence (Silverman, 2011).

Attempting to disprove a misconception can therefore actually strengthen a person's belief in that misconception.

A single piece of disconfirming evidence rarely changes people's views, but a steady flow of credible refutations can correct misinformation and misconceptions.

This effect is considered a subtype of confirmation bias because it explains people’s reactions to new information based on their preexisting hypotheses.

A study by Brendan Nyhan and Jason Reifler (two researchers on political misinformation) explored the effects of different types of statements on people’s beliefs.

Comparing the two statements "I am not a Muslim, Obama says." and "I am a Christian, Obama says," they concluded that the latter was more persuasive and led people to change their beliefs, showing that affirmative statements are more effective at correcting false views (Silverman, 2011).

Halo Effect

The halo effect occurs when people use impressions from a single trait to form conclusions about other unrelated attributes. It is heavily influenced by the first impression.

Research on this effect was pioneered by American psychologist Edward Thorndike who, in 1920, described ways officers rated their soldiers on different traits based on first impressions (Neugaard, 2019).

Experiments have shown that when positive attributes are presented first, a person is judged more favorably than when negative traits are shown first. This is a subtype of confirmation bias because it shows how we structure our thinking about other information based only on initial evidence.

Learning Check

When does confirmation bias occur?

  • A. When an individual only researches information that is consistent with personal beliefs.
  • B. When an individual only makes a decision after all perspectives have been evaluated.
  • C. When an individual becomes more confident in one's judgments after researching alternative perspectives.
  • D. When an individual believes that the odds of an event occurring increase if the event hasn't occurred recently.

The correct answer is A. Confirmation bias occurs when an individual only researches information consistent with personal beliefs. This bias leads people to favor information that confirms their preconceptions or hypotheses, regardless of whether the information is true.

Take-home Messages

  • Confirmation bias is the tendency of people to favor information that confirms their existing beliefs or hypotheses.
  • Confirmation bias happens when a person gives more weight to evidence that confirms their beliefs and undervalues evidence that could disprove it.
  • People display this bias when they gather or recall information selectively or when they interpret it in a biased way.
  • The effect is stronger for emotionally charged issues and for deeply entrenched beliefs.

Agarwal, P. (2018, October 19). Here is how bias can affect recruitment in your organisation. Forbes. https://www.forbes.com/sites/pragyaagarwaleurope/2018/10/19/how-can-bias-during-interviewsaffect-recruitment-in-your-organisation

American Psychological Association. (n.d.). APA Dictionary of Psychology. https://dictionary.apa.org/confirmation-bias

Baron, J. (2000). Thinking and Deciding (Third ed.). Cambridge University Press.

Casad, B. (2019, October 9). Confirmation bias. Encyclopedia Britannica. https://www.britannica.com/science/confirmation-bias

Cherry, K. (2020, February 19). Why Do We Favor Information That Confirms Our Existing Beliefs? https://www.verywellmind.com/what-is-a-confirmation-bias-2795024

Fyock, J., & Stangor, C. (1994). The role of memory biases in stereotype maintenance. The British journal of social psychology, 33 (3), 331–343.

Gray, P. O. (2010). Psychology. New York: Worth Publishers.

Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37 (11), 2098–2109.

Mynatt, C. R., Doherty, M. E., & Tweney, R. D. (1977). Confirmation bias in a simulated research environment: An experimental study of scientific inference. Quarterly Journal of Experimental Psychology, 29 (1), 85-95.

Neugaard, B. (2019, October 9). Halo effect. Encyclopedia Britannica. https://www.britannica.com/science/halo-effect

Silverman, C. (2011, June 17). The backfire effect. Columbia Journalism Review. https://archives.cjr.org/behind_the_news/the_backfire_effect.php

Snyder, M., & Cantor, N. (1979). Testing hypotheses about other people: The use of historical knowledge. Journal of Experimental Social Psychology, 15 (4), 330–342.

Further Information

  • What Is Confirmation Bias and When Do People Actually Have It?
  • Confirmation Bias: A Ubiquitous Phenomenon in Many Guises
  • The importance of making assumptions: why confirmation is not necessarily a bias
  • Decision Making Is Caused By Information Processing And Emotion: A Synthesis Of Two Approaches To Explain The Phenomenon Of Confirmation Bias

Confirmation bias occurs when individuals selectively collect, interpret, or remember information that confirms their existing beliefs or ideas, while ignoring or discounting evidence that contradicts these beliefs.

This bias can happen unconsciously and can influence decision-making and reasoning in various contexts, such as research, politics, or everyday decision-making.

What is confirmation bias in psychology?

Confirmation bias in psychology is the tendency to favor information that confirms existing beliefs or values. People exhibiting this bias are likely to seek out, interpret, remember, and give more weight to evidence that supports their views, while ignoring, dismissing, or undervaluing the relevance of evidence that contradicts them.

This can lead to faulty decision-making because one-sided information doesn’t provide a full picture.



Confirmation Bias: Seeing What We Want to Believe


Confirmation bias is a widely recognized phenomenon and refers to our tendency to seek out evidence in line with our current beliefs and stick to ideas even when the data contradicts them (Lidén, 2023).

Evolutionary and cognitive psychologists agree that we naturally tend to be selective and look for information we already know (Buss, 2016).

This article explores this tendency, how it happens, why it matters, and what we can do to get better at recognizing it and reducing its impact.

Before you continue, we thought you might like to download our three Positive CBT Exercises for free . These science-based exercises will provide you with detailed insight into positive Cognitive-Behavioral Therapy (CBT) and give you the tools to apply it in your therapy or coaching.

This Article Contains:

  • Understanding Confirmation Bias
  • Fascinating Confirmation Bias Examples
  • 10 Reasons We Fall for It
  • 10 Steps to Recognizing and Reducing Confirmation Bias
  • How Confirmation Bias Impacts Research
  • Can Confirmation Bias Be Good?
  • Resources From PositivePsychology.com
  • A Take-Home Message

We can understand the confirmation bias definition as the human tendency “to seek out, to interpret, to favor, and to selectively recall information that confirms beliefs they already hold, while avoiding or ignoring information that disconfirms these beliefs” (Gabriel & O’Connor, 2024, p. 1).

While it has been known and accepted since at least the 17th century that humans are inclined to form and hold on to ideas and beliefs — often tenaciously — even when faced with contradictory evidence, the term “confirmation bias” only became popular in the 1960s with the work of cognitive psychologist Peter Cathcart Wason (Lidén, 2023).

Wason’s (1960) famous 2–4–6 experiment was devised to investigate the nature of hypothesis testing.

Participants were given the numbers 2, 4, and 6 and told the numbers adhered to a rule.

They were then asked to arrive at a hypothesis explaining the sequence and try a new three-number series to test their rule (Wason, 1960; Lidén, 2023).

For example, if a participant thought the second number was twice that of the first and the third number was three times greater, they might suggest the numbers 10, 20, and 30.

However, if another participant thought it was a simple series increasing by two each time, they might suggest 13, 15, and 17 (Wason, 1960; Lidén, 2023).

The actual rule is more straightforward; the numbers are in ascending order. That’s all.

As we typically offer tests that confirm our initial beliefs, both example hypotheses appear to work, even if they are not the answer (Wason, 1960; Lidén, 2023).
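The logic of the 2–4–6 task can be sketched in a few lines of Python. This is a toy illustration, not part of the original study: the function names are ours, but the rule and both example hypotheses follow the description above.

```python
# Toy model of Wason's (1960) 2-4-6 task. The experimenter's actual rule
# is simply "any strictly ascending triple".
def true_rule(a, b, c):
    return a < b < c

# Participant guesses from the examples above (both wrong, both plausible).
def doubles_and_triples(a, b, c):   # second = 2x first, third = 3x first
    return b == 2 * a and c == 3 * a

def increases_by_two(a, b, c):      # each number is two more than the last
    return b == a + 2 and c == b + 2

# Positive tests: triples each participant generates FROM their own guess.
# Both pass the true rule, so both participants hear "yes" and grow
# confident in hypotheses that are still wrong.
print(true_rule(10, 20, 30))  # True
print(true_rule(13, 15, 17))  # True

# A triple chosen to VIOLATE the guesses is far more informative:
# (1, 2, 4) fails both hypotheses yet still fits the true rule.
print(doubles_and_triples(1, 2, 4), increases_by_two(1, 2, 4), true_rule(1, 2, 4))
# False False True
```

The disconfirming test exposes what the positive tests never could: "yes" answers to hypothesis-confirming triples are consistent with many rules, so only attempts at falsification narrow them down.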

The experiment demonstrates our confirmation bias; we seek information confirming our existing beliefs or hypotheses rather than challenging or disproving them (Lidén, 2023).

In the decades since, and with developments in cognitive science, we have come to understand that people don’t typically have everything they need, “and even if they did, they would not be able to use all the information due to constraints in the environment, attention, or memory” (Lidén, 2023, p. 8).

Instead, we rely on heuristics. Such “rules of thumb” are easy to apply and fairly accurate, yet they can potentially result in systematic and serious biases and errors in judgment (Lidén, 2023; Eysenck & Keane, 2015).

Confirmation bias in context

Confirmation bias is one of several cognitive biases (Lidén, 2023).

They are important because researchers have recognized that "vulnerability to clinical anxiety and depression depends in part on various cognitive biases" and that mental health treatments such as CBT should aim to reduce them (Eysenck & Keane, 2015, p. 668).

Cognitive biases include (Eysenck & Keane, 2015):

  • Attentional bias: attending to threat-related stimuli more than neutral stimuli
  • Interpretive bias: interpreting ambiguous stimuli, situations, and events as threatening
  • Explicit memory bias: the likelihood of retrieving mostly unpleasant thoughts rather than positive ones
  • Implicit memory bias: the tendency to perform better for negative or threatening information on memory tests

Individuals possessing all four biases focus too much on environmental threats, interpret most incidents as concerning, and identify themselves as having experienced mostly unpleasant past events (Eysenck & Keane, 2015).

Similarly, confirmation bias means that individuals give too much weight to evidence that confirms their preconceptions or hypotheses, even incorrect and unhelpful ones. It can lead to poor decision-making because it limits their ability to consider alternative viewpoints or evidence that contradicts their beliefs (Lidén, 2023).

Unsurprisingly, such a negative outlook or bias will lead to unhealthy outcomes, including anxiety and depression (Eysenck & Keane, 2015).


Confirmation bias is commonplace and typically has a low impact, yet there are times when it is significant and newsworthy (Eysenck & Keane, 2015; Lidén, 2023).

Limits of information

In 2005, terrorists detonated four bombs in London (three on the London Underground and one on a bus), killing 52 people and injuring 700. In the chaotic weeks that followed, another attacker failed to detonate a suicide bomb and got away (Lidén, 2023).

Unsurprisingly, a mass hunt was launched to capture the escaped bomber, and many suspects came under surveillance. Yet, the security services made several significant mistakes.

On July 22, 2005, a man living in the same house as two suspects and bearing a resemblance to one of them was shot dead on an Underground train by officers.

“The context with the previous bombings, the available intelligence, and the pre-operation briefings, created expectations that the surveillance team would spot a suicide bomber leaving the doorway” (Lidén, 2023, p. 37).

The wrong man died because the officers involved failed to see the limits of the information available to them at the time.

Witness identification

In 1976, factory worker John Demjanjuk from Cleveland, Ohio, was identified as a Nazi war criminal known as Ivan the Terrible, perpetrator of many killings within prison camps in the Second World War (Lidén, 2023).

Due to the individual’s denial and limited evidence, the case rested on proof of identity via a photo line-up. However, it became known that “Ivan the Terrible” had a round face and was bald.

As the defendant was the only individual who matched the description, he was chosen by all the witnesses (Lidén, 2023).

Whether or not the witnesses were genuinely able to identify the factory worker as the criminal became irrelevant. The case centered around the unfairness of the line-up and the confirmation bias that resulted from the information they had been given (Lidén, 2023).

Years later, in 2012, following continuing challenges to his identity, John Demjanjuk died while appealing his conviction in a German court. His identity remained unclear, and the confirmation bias persisted ("Ivan the Terrible," 2024).


Confirmation bias can significantly impact our own and others’ lives (Lidén, 2023; Kappes et al., 2020).

For that reason, it is helpful to understand why it happens and the psychological factors involved. Research confirms that people (Lidén, 2023; Kappes et al., 2020; Eysenck & Keane, 2015):

  • Don’t like to let go of their initial hypothesis
  • Prefer to use as much information as is initially available, often resulting in an overly specific hypothesis
  • Show more confirmation bias toward their own hypotheses than toward those of others
  • Are more likely to adopt a confirmation bias when under high cognitive load
  • With a lower degree of intelligence are more likely to engage in confirmation bias (most likely due to being less able to manage higher cognitive loads and see the overall picture)
  • With cognitive impairments are more impacted by confirmation bias
  • Are often unable to actively consider and understand all relevant information to challenge the existing hypothesis or make a new one
  • Are influenced by their emotions and motivations and potentially “blinded” to the facts
  • Are biased by existing thoughts and beliefs (sometimes cultural), even if incorrect
  • Are influenced by the beliefs and arguments of those around them

Recognize confirmation bias

  • Recognize that confirmation bias exists and understand its impact on decision-making and how you interpret information. ​
  • Actively seek out and consider different viewpoints, opinions, and sources of information that challenge your existing beliefs and hypotheses. ​
  • Develop critical thinking skills that evaluate evidence and arguments objectively without favoring preconceived notions or desired outcomes.
  • Be aware of your biases and open to questioning your beliefs and assumptions.
  • Explore alternative explanations or hypotheses that may contradict your initial beliefs or interpretations.
  • Welcome feedback and criticism from others, even if they challenge your ideas; recognize it as an opportunity to learn and grow.
  • Apply systematic and rigorous methods to gather and analyze data, ensuring your conclusions are evidence-based rather than a result of personal biases.
  • Engage in collaborative discussions and debates with individuals with different perspectives to help see other viewpoints and challenge your biases.
  • Continuously seek new information and update your knowledge base to avoid becoming entrenched and support more-informed decision-making.
  • Practice analytical thinking, questioning assumptions, evaluating evidence objectively, and considering alternate explanations.

As far back as 1968, Karl Popper recognized that falsifiability (being able to prove that something can be incorrect or false) is crucial to all scientific inquiry, impacting researchers’ behavior and experimental outcomes.

As scientists, Popper argued, we should focus on looking for examples of why a theory does not work instead of seeking confirmation of its correctness. More recently, researchers have also considered that when findings suggest a theory is false, it may be due to issues with the experimental design or data accuracy (Eysenck & Keane, 2015).

Yet, confirmation bias has been an issue for a long time in scientific discovery and remains a challenge.

When researchers looked back at the work of Alexander Graham Bell in developing the telephone, they found that, due to confirmation bias, he ignored promising new approaches in favor of his tried-and-tested ones. It ultimately led to Thomas Edison being the first to develop the forerunner of today’s telephone (Eysenck & Keane, 2015).

More recently, a study showed that 88% of professional scientists working on issues in molecular biology responded to unexpected and inconsistent findings by blaming their experimental methods; they ignored the suggestion that they may need to modify, or even replace, their theories (Eysenck & Keane, 2015).

However, when those same scientists changed their approach yet obtained similarly inconsistent results, 61% revisited their theoretical assumptions (Eysenck & Keane, 2015).

Failure to report null research findings is also a problem. It is known as the “file drawer problem” because data remains unseen in the bottom drawer as the researcher does not attempt to get findings published or because journals show no interest in them (Lidén, 2023).

Positive confirmation bias

Researchers have recognized several potential benefits that arise from our natural inclination to seek out confirmation that we are right, including (Peters, 2022; Gabriel & O’Connor, 2024; Bergerot et al., 2023):

  • Assisting in the personal development of individuals by reinforcing their positive self-conceptions and traits
  • Helping individuals shape social structures by persuading others to adopt their viewpoints
  • Supporting increased confidence by reinforcing individuals’ beliefs and ignoring contradictory evidence
  • Contributing to social conformity and stability by reinforcing shared beliefs and values within a group, potentially boosting cooperation and coordination
  • Encouraging decision-making by removing uncertainty and doubt
  • Increasing the knowledge-producing capacity of a group by supporting a deeper exploration of individual members’ perspectives

It’s vital to note that the possible benefits also have their limitations. They potentially favor the individual at the cost of others’ needs while potentially distorting and hindering the formation of well-founded beliefs (Peters, 2022).


We have many resources for coaches and therapists to help individuals and groups understand and manage their biases.

Why not download our free 3 Positive CBT Exercises Pack and try out the powerful tools contained within? Some examples include the following:

  • Re-Framing Critical Self-Talk: Self-criticism typically involves judgment and self-blame regarding our shortcomings (real or imagined), such as our inability to accomplish personal goals and meet others' expectations. In this exercise, we use self-talk to help us reduce self-criticism and cultivate a kinder, more compassionate relationship with ourselves.
  • Solution-Focused Guided Imagery: Solution-focused therapy assumes we have the resources required to resolve our issues. Here, we learn how to connect with our strengths and overcome the challenges we face.

Other free resources include:

  • The What-If Bias: We often get caught up in our negative biases, thinking about potentially dire outcomes rather than adopting rational beliefs. This exercise helps us regain a more realistic and balanced perspective.
  • Becoming Aware of Assumptions: We all bring biases into our daily lives, particularly conversations. In this helpful exercise, we picture how things might be in five years to put them into context.

More extensive versions of the following tools are available with a subscription to the Positive Psychology Toolkit©, but they are described briefly below.

  • Increasing Awareness of Cognitive Distortions

Cognitive distortions refer to our biased thinking about ourselves and our environment. This tool helps reduce the effect of the distortions by dismantling them.

  • Step one – Begin by exploring cognitive distortions, such as all-or-nothing thinking, jumping to conclusions, and catastrophizing.
  • Step two – Next, identify the cognitive distortions relevant to your situation.
  • Step three – Reflect on your thinking patterns, how they could harm you, and how you interact with others.
  • Finding Silver Linings

We tend to dwell on the things that go wrong in our lives. We may even begin to think our days are filled with mishaps and disappointments.

Rather than solely focusing on things that have gone wrong, it can help to look on the bright side. Try the following:

  • Step one – Create a list of things that make you feel life is worthwhile, enjoyable, and meaningful.
  • Step two – Think of a time when things didn’t go how you wanted them to.
  • Step three – Reflect on what this difficulty cost you.
  • Step four – Finally, consider what you may have gained from the experience. Write down three positives.

If you’re looking for more science-based ways to help others through CBT, check out this collection of 17 validated positive CBT tools for practitioners. Use them to help others overcome unhelpful thoughts and feelings and develop more positive behaviors.

We can’t always trust what we hear or see because our beliefs and expectations influence so much of how we interact with the world.

Confirmation bias refers to our natural inclination to seek out and focus on what confirms our beliefs, often ignoring anything that contradicts them.

While we have known of its effect for over 200 years, it still receives considerable research focus because of its impact on us individually and as a society, often causing us to make poor decisions and leading to damaging outcomes.

Confirmation bias has several sources and triggers, including our unwillingness to relinquish our initial beliefs (even when incorrect), preference for personal hypotheses, cognitive load, and cognitive impairments.

However, most of us can reduce confirmation bias with practice and training. We can become more aware of such inclinations and seek out challenges or alternate explanations for our beliefs.

It matters because confirmation bias can influence how we work, the research we base decisions on, and how our clients manage their relationships with others and their environments.

We hope you enjoyed reading this article. For more information, don’t forget to download our three Positive CBT Exercises for free.

  • Bergerot, C., Barfuss, W., & Romanczuk, P. (2023). Moderate confirmation bias enhances collective decision-making. bioRxiv. https://www.biorxiv.org/content/10.1101/2023.11.21.568073v1.full
  • Buss, D. M. (2016). Evolutionary psychology: The new science of the mind. Routledge.
  • Eysenck, M. W., & Keane, M. T. (2015). Cognitive psychology: A student’s handbook. Psychology Press.
  • Gabriel, N., & O’Connor, C. (2024). Can confirmation bias improve group learning? PhilSci Archive. https://philsci-archive.pitt.edu/20528/
  • Ivan the Terrible (Treblinka guard). (2024). In Wikipedia. https://en.wikipedia.org/wiki/Ivan_the_Terrible_(Treblinka_guard)
  • Kappes, A., Harvey, A. H., Lohrenz, T., Montague, P. R., & Sharot, T. (2020). Confirmation bias in the utilization of others’ opinion strength. Nature Neuroscience, 23(1), 130–137.
  • Lidén, M. (2023). Confirmation bias in criminal cases. Oxford University Press.
  • Peters, U. (2022). What is the function of confirmation bias? Erkenntnis, 87, 1351–1376.
  • Popper, K. R. (1968). The logic of scientific discovery. Hutchinson.
  • Rist, T. (2023). Confirmation bias studies: Towards a scientific theory in the humanities. SN Social Sciences, 3(8).
  • Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12(3), 129–140.





Confirmation Bias (Examples + Definition)


I want to tell you a story about a person named John. John is passionate about gun rights. He believes everyone should have the freedom to own as many guns as they wish and that they should be allowed in all public places.

Sure, his views may seem extreme to some. But this story isn't solely about gun rights. It's about understanding the underlying psychology behind how individuals, like John or even you, might support or reinforce their beliefs. In this context, let's explore John's beliefs about guns.

Imagine John scrolling through Facebook. He spots a post from his aunt, sharing a story about a “good guy with a gun” preventing a mass shooting somewhere in the country. Without delving deeper into the post, John instantly shares the story. However, right below his aunt’s post, another headline catches his eye: it’s about Australia’s success with stringent gun control measures.

What do you think is John's next move? Will he share that story too, contemplating its implications for America's safety, or scroll past, maybe even outright dismissing it?

This tendency to favor information that aligns with our pre-existing beliefs, as in John's case, is a cognitive phenomenon known as confirmation bias. It's pervasive, evident daily on our social media feeds and even in the news. Perhaps you've experienced it during heated debates with colleagues, where, despite presenting concrete facts, you felt they were utterly unreceptive.

This video delves into the intricate world of confirmation bias, shedding light on how it shapes our perceptions, governs our interactions, and influences our decisions. It's an unbiased force, affecting gun enthusiasts and critics alike, supporters of Black Lives Matter and Blue Lives Matter. Essentially, it's a universal trait, and yes, it affects you too.

What is the Confirmation Bias?

Confirmation bias leads us to search for evidence that supports our current beliefs and oppose information that goes against our current beliefs. Even when facts are presented, our brain will likely dismiss the ones that challenge what we already “know” about the world.

Confirmation bias was “discovered” in 1960 by a psychologist named Peter Wason. He demonstrated it with a simple experiment. He gave participants three numbers and asked them to figure out the “rule” behind them. The example he gave was “2-4-6.”

The rule behind his set of three numbers was simply that they had to be in ascending order. 3-6-9, 45-89-100, and 1-2-9 would all have been acceptable answers. But more than half of the participants couldn’t figure out the rule.

Wason looked at the triples that the participants chose to test their theory. Some believed that the set of three numbers all had to be even, like 6-8-10 or 80-90-100. They tested sets of triplets with only even numbers to confirm their belief.

What they didn’t test were triplets that would go against their hypothesis. They only sought out information that would confirm what they already believed to be true.
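Wason’s task can be sketched in a few lines of code. The sketch below is purely illustrative (the function names and the probe triple are assumptions for demonstration, not from Wason’s paper): it shows why positive tests alone can never distinguish a participant’s narrower “all even” hypothesis from the experimenter’s broader “ascending” rule.

```python
# Hypothetical sketch of Wason's 2-4-6 task (illustrative only).

def hidden_rule(triple):
    """The experimenter's actual rule: any ascending sequence."""
    a, b, c = triple
    return a < b < c

def all_even_hypothesis(triple):
    """A typical participant hypothesis: all three numbers are even."""
    return all(n % 2 == 0 for n in triple)

# Positive tests: triples the participant EXPECTS to fit their hypothesis.
positive_tests = [(6, 8, 10), (80, 90, 100)]

# Every positive test "confirms" the hypothesis, because both rules agree:
for t in positive_tests:
    assert hidden_rule(t) and all_even_hypothesis(t)

# A disconfirming test, e.g. one containing odd numbers, exposes the mismatch:
probe = (1, 2, 9)
print(hidden_rule(probe))          # True: it fits the real rule
print(all_even_hypothesis(probe))  # False: the hypothesis was wrong
```

Only a triple chosen to falsify the hypothesis reveals the difference, which is exactly the test Wason’s participants tended not to run.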

This shows why our fictional “John” from the beginning of the video would likely scroll past the news article that went against his views on guns.

venn diagram describing how confirmation bias is the intersection of what we believe and objective facts

Confirmation Bias and Fake News

Don’t believe me? Here’s another study showing how confirmation bias can block us from accepting information that goes against our beliefs.

In 1979, researchers at Stanford gathered a group of participants with specific opinions on capital punishment. One group believed that capital punishment was a good thing. The other group opposed capital punishment.

The researchers made up two studies that provided statistics on capital punishment. When I say “made up,” I mean made up. The statistics were completely fake. They were designed to be equally compelling. One study supported that capital punishment benefited society and lowered crime rates. The other study supported the opposite claim.

Both studies were presented to the participants. Can you guess what happened next?

The students favoring capital punishment were likelier to say that the study confirming their beliefs was credible. The students against capital punishment were more likely to say that the same study was not to be believed and possibly “fake.” (Or dare I say, “fake news?”)

What surprised researchers even more was that participants tended to feel more adamant about their views after reading both studies. Those initially in favor of capital punishment came away feeling even more strongly in favor of it, while those initially against it felt even more strongly about its drawbacks.

May I remind you that the studies presented to both sides were completely made up?

How it works

Confirmation bias can have harmful effects, but it also has some benefits. Let me explain.

Psychologists have been able to explain where confirmation bias comes from. Our brains have a lot of information to process, especially nowadays. We don’t have time to evaluate everything we encounter as if we were seeing and judging it for the first time. So we use “shortcuts” called heuristics to help us make decisions quickly.

Understanding Heuristics: The Brain's Mental Shortcuts

While confirmation bias is a cognitive phenomenon many are familiar with, it's essential to understand another critical concept that governs our decision-making process: heuristics. Often referred to as 'mental shortcuts' or 'rules of thumb,' heuristics are cognitive strategies our brains use to simplify complex problems and speed up the decision-making process. These shortcuts are not always accurate, but they're efficient.

Imagine you're trying to decide which of two restaurants to dine at. Instead of painstakingly analyzing every review for both establishments, you might rely on the most recent feedback or the overall star rating—a heuristic approach. This method may not guarantee the best meal of your life, but it significantly reduces the time and cognitive effort required to decide.

There are various heuristics that psychologists have identified, including:

  • Availability Heuristic: This involves relying on the immediate examples that come to mind when evaluating a topic or decision. For instance, if a friend recently had a car accident, you might overestimate the danger of driving and opt for another mode of transportation, even though, statistically, driving is reasonably safe.
  • Representativeness Heuristic: Here, people judge probabilities based on resemblance. For example, if someone is shy and introverted, you might believe they’re more likely to be a librarian than a salesperson, even though there are far more salespeople than librarians.
  • Anchoring Heuristic: This is where individuals rely heavily on the first piece of information they encounter (the “anchor”) when making decisions. If you see a shirt originally priced at $100 but now on sale for $60, you might consider it a bargain, even if the shirt was never worth $100 in the first place.

Heuristics, like confirmation bias, have evolved because they're beneficial in many situations, offering a quick way to navigate our complex world. They allow us to act fast, especially when a rapid response is required. However, these mental shortcuts can also lead to systematic errors or biases in our thinking. Awareness of these can help us make more informed and rational decisions, even when our brain is looking for the quickest route.

Quick decisions have many benefits. If we see a big, furry animal in the distance, our minds will likely say, “BEAR!” and tell our feet to move in the other direction. A brain that took in every detail of every hair on the bear’s back and deliberated over whether the bear was dangerous would take a lot of time, and in that time, we might get eaten.

Other Biases Related to Confirmation Bias

So our minds help move things along with biases and shortcuts. Confirmation bias is far from the only bias that helps us make quick decisions. The self-serving bias tells us to give ourselves credit for the good things that happen to us. The anchoring bias gives us an “anchor” against which to compare new information. The optimism bias tells us that good things are likely to happen to us, even more so than to others.

Once we have formed a belief, it’s also hard to switch gears. Holding two or more opposing beliefs is very uncomfortable for the mind; it puts us in a state of cognitive dissonance. Our brains don’t like discomfort and actively try to avoid it. So when faced with two opposing ideas, we are more likely to look to and confirm the one we already believe in, rejecting the other by dismissing it as fake news or a weaker argument.

Cognitive biases aren’t inherently bad. They are a natural part of how the brain works and makes decisions. But, as you might have seen in the examples I showed earlier, confirmation bias can have a serious effect on the way we come to a decision about issues and limit our ability to solve problems.

Confirmation bias and outrage

Confirmation bias can put serious limits on our ability to think. But in an age where we have to process and make decisions on information faster than ever, it’s extremely important to be aware of confirmation bias and how we form opinions on political and social issues. Spreading “fake news” isn’t just a problem that affects one side of the political spectrum or the other.

Take the story about the “gender-neutral Santa” that came out a few months ago. Headlines filled social media platforms claiming that people had demanded a gender-neutral Santa and that as many as a third of people wanted Santa to be gender-neutral or even a woman.

People commented on the piece, reinforcing the idea that people were demanding too much political correctness and that things were getting out of control. This study reached the desks of the BBC and CBS.

Well, the facts behind the story don’t match the outrage and assumptions people brought to it. No one asked or demanded anything. A graphics company behind the “study” surveyed people in an online poll asking for ways to modernize Santa. Among the suggested updates, more people proposed that Santa could use Amazon Prime, wear sneakers, or use an iPhone. The survey was not a scientific one: it fed users answers (it suggested that Santa be gender-neutral, rather than allowing participants to make that suggestion themselves) and was not designed to reflect the entire population of the US and the UK.

Less than a third suggested that Santa should be gender-neutral or a woman. No one demanded anything. There was no evidence to suggest that most people even wanted to change Santa. But people who already felt outraged by “PC culture” still took the biased study and ran with it.

This lapse in judgment about the study’s validity is one way we allow confirmation bias to confirm things we already believe to be true. People who think that liberals demand extreme measures when it comes to gender rights and breaking with tradition were likely to let the headline confirm their concerns.

Example 1: Holding Onto Stereotypes About Others

There are many hurtful stereotypes in the world about people from different countries or of different religions or races. The confirmation bias keeps those stereotypes alive.

Let’s say you hold the belief that law enforcement officers are evil or racist. You are more likely to seek out, pay attention to, or share stories of officers who murdered minorities without justification. Or maybe you hold a different opinion: you believe that not all cops are bad, or that the people whom the police have mistreated deserved what they got. You might brush off the stories people share about cops killing unarmed minorities, or believe they are isolated incidents. You might pay more attention to stories of people who were pulled over and interacted pleasantly with the police, or turn up the volume on stories of police doing good for the community and cutting down on crime.

This probably sounds like someone you know or on your Facebook friends list. Anytime someone clings to a stereotype, they let the confirmation bias lead them astray.

Example 2: Speculation in Courtrooms

Confirmation bias can have serious effects outside of the digital world. Have you ever wondered why selecting a jury is such a long process? The judges, prosecutors, and defense attorneys are all trying to eliminate cases of confirmation bias. They want a jury that can make a fair and impartial decision about whether or not the person committed a crime.

Unfortunately, that doesn’t always happen. Look at the OJ Simpson trial. I could talk all day about the different biases that led to the “not guilty” verdict. But one is the confirmation bias. Some jurors walked into the courtroom with the belief that OJ Simpson did or did not commit the crime. They had a lot of information to take in during the trial - it lasted for months. However, the confirmation bias made the information that confirmed the jurors’ beliefs stick out most in their minds.


Example 3: Facebook Algorithms As An Echo Chamber

Have you ever heard of an echo chamber before? It’s a term used to describe spaces where people only encounter views and beliefs that mirror their own. You only hear “echoes” of the same ideas repeatedly.

Echo chambers exist online and offline. Maybe you only see news on your social media feed that confirms your beliefs. Maybe you only go to events at college with people who you know have the same political or spiritual beliefs. You might block people who get into arguments with you on Facebook. The more you isolate yourself from people who think differently than you, the farther you go into the echo chamber.

Knowledge of confirmation biases or echo chambers is not new - and it has been recognized and manipulated for years. Facebook’s algorithm encourages it. After all, it wants us to stay on the social media platform and share content. If we see more content from websites that we trust or people on our friends list who share our same beliefs, we are more likely to stay on the site and engage with the content.

Example 4: Fueling Ideas About Conspiracy Theories

Conspiracy theories like the Flat Earth Theory show how easy it is to ignore facts and hold onto your beliefs. People who believe these conspiracy theories look for any small piece of evidence that confirms what they believe. There could be dozens of explanations for the evidence they use, but conspiracy theorists don’t see it or won’t have it. They are so entrenched in their beliefs that basic science can no longer convince them.

Facebook groups and confirmation bias go hand in hand. Members of Flat Earther or other conspiracy theorists' Facebook groups share articles, photos, and videos that continue to confirm their beliefs. The more time people spend in these groups, the deeper their beliefs become. They pay attention only to the “facts” in these groups rather than seeing the larger picture.

Example 5: Confirmation Bias in The Workplace

Be on the lookout for confirmation bias at work. It can save you time and money to look at the facts before you, even if they disprove what you currently think about a project.

Let’s say you work in advertising. You believe that the target audience for your group project is moms who stay at home and don’t have a college degree. When your team brings you research on your competitors and the current market, however, the numbers show that your target audience isn’t making the purchasing decisions for their household.

Confirmation bias may lead your thinking astray here. Rather than going back to the drawing board and crafting a message tailored to your actual audience, you stick with your original beliefs about who is buying your product and look for other research that supports what you believe about the market.

It’s easy to see how confirmation bias can skew your direction on a project and lead you astray.

Example 6: Warren Buffett and Confirmation Bias

If you’re one of the richest men in the world, it can be easy to fall prey to confirmation bias. After all, your fortune seems to confirm that you’ve made great decisions in business and finance. But that’s not a trap Warren Buffett wants to fall into. Investors can easily succumb to confirmation bias: they put all of their money into one company and ignore any signs that it might perform poorly. An investor led by confirmation bias can stick with an investment for far too long.

So Warren Buffett took specific steps to fight confirmation bias. He invites vocal critics of his investment strategy to meetings and sits down with his direct competitors. This way, he can challenge his current beliefs about his strategy and possibly find a new, better way to invest.

Example 7: Relationship Woes

We all have that one friend who has a habit of dating the wrong people. Often, this is caused by low self-esteem. They believe that they are not good enough to date someone who is kind or wants to treat them like royalty. When someone does treat them well, they get suspicious and run away. It seems like that friend tends to seek out the worst possible partners they can find, only to confirm their beliefs that they’re not worthy of a healthy, loving relationship.

This can be frustrating for the friend’s loved ones. Maybe it’s time to talk to that friend about the confirmation bias and how it may affect who they choose to date or spend their time with.

Example 8: Failing to Listen

One of the most powerful slogans of the #MeToo movement is “Believe Women.” The slogan is necessary because of confirmation bias. When women tell their stories of sexual harassment or misconduct, they often feel that the men in their lives are not listening. Their accounts are brushed off, or the men who hear them downplay their validity. This is confirmation bias at work. It’s not easy for some men to hear that a peer, an idol, or someone who looks just like them is capable of doing harmful or evil things. It’s not easy for some men to hear that they are not listening to or supporting women. So they don’t listen to evidence that contradicts their existing beliefs.

Confirmation Bias and Trustworthiness

Confirmation bias is the subject of many studies, especially as we are bombarded with more information and “fake news.” In one study, researchers examined how trustworthy people judged various news sources to be. It’s no surprise that people who lean to the left are more likely to consider outlets like Vox or The New York Times trustworthy and to rate Fox News as less trustworthy. People who lean to the right are likelier to feel the opposite way.

In one study, researchers gave participants content from these websites. Some people saw the news source, and others were only given the text from the website to read. People who leaned to the left were more likely to rank content from Fox News as trustworthy when they didn’t know it was coming from Fox News. People who leaned to the right were more likely to rank content from The New York Times as trustworthy when they didn’t know it was coming from The New York Times. Simply seeing the name of a website can skew our interpretation of information, no matter who we are.

The research also revealed something about extreme opinions and bias: people who leaned far to the right or the left were likelier to hold biases against news sites. This includes people who consider themselves “very liberal” or “very conservative.”

The Importance of Being Aware of Confirmation Bias

Heuristics and biases help us make decisions quickly. We don’t have to spend excessive time processing the news and information we see on social media. And while these shortcuts are often harmless, they can seriously skew our opinions.

Be aware of how you view information and headlines as they come in. Do you read the full article? Do you read it with an open mind? How likely are you to dismiss information because it goes against what you know? How likely are you to fact-check or seek opposing information that could expand your perspective?

Keep these questions in mind and continue to challenge your biases.

Did you learn something new about confirmation bias? Prove it! I have a quick quiz to test what you’ve learned in this video.

Question 1:

True or False: The confirmation bias leads people to reject, refute, or ignore information confirming their beliefs.

False! It’s likely to reject, refute, or completely ignore information that opposes current beliefs.

Question 2:

_______ is a discomfort we feel when faced with opposing thoughts or beliefs.

Cognitive dissonance! It’s one idea that supports confirmation bias.

Question 3:

What are the “shortcuts” our brain uses to make quick decisions?

Heuristics!




Illustrating Confirmation Bias in Psychology: Real-Life Examples and Effects


Confirmation bias is a common psychological phenomenon that affects the way we think and make decisions. This article will explore what confirmation bias is, how it impacts our thinking, and provide real-life examples of its effects.

From political beliefs to superstitions, we will examine how confirmation bias can influence our perceptions. We will discuss how confirmation bias can impact decision-making processes, as well as strategies to overcome this cognitive bias.

Join us as we delve into the fascinating world of confirmation bias and its implications in our daily lives.

  • Confirmation bias is the tendency to seek out information that confirms our existing beliefs, and ignore evidence that contradicts them.
  • This bias can lead to limited information gathering, biased interpretation of information, and reinforcing existing beliefs.
  • To overcome confirmation bias, actively seek out different perspectives, challenge our own beliefs, and consider all available evidence.

What Is Confirmation Bias?

Confirmation bias refers to the tendency of individuals to seek out, interpret, and recall information in a way that confirms or supports their preexisting beliefs and hypotheses.

This cognitive bias plays a significant role in how people process information, filtering incoming data through the lens of their existing opinions. It can lead to a distorted perception of reality, where individuals tend to ignore evidence that contradicts their views while actively seeking out and giving more weight to information that aligns with what they already believe.

In various studies on decision-making, confirmation bias consistently emerges as a key factor influencing the choices people make. It can lead to flawed reasoning and hinder the ability to critically evaluate alternative perspectives, ultimately impacting the quality of decisions in both personal and professional contexts.

How Does Confirmation Bias Affect Our Thinking?

Confirmation bias significantly influences our thinking by shaping how we process information, reinforcing our existing beliefs, and affecting our decision-making processes.

When confirmation bias takes hold, individuals tend to selectively seek out information that aligns with their preconceived notions, dismissing contradictory evidence as anomalies or errors. This tendency to cherry-pick data can lead to overlooking important facts and drawing inaccurate conclusions. Confirmation bias can also impact how we interpret the world, distorting our perceptions and creating a warped reality shaped by our biases. By reinforcing our beliefs, this cognitive bias can create an echo chamber effect, where we surround ourselves with like-minded individuals and sources, further entrenching our existing views.

Real-Life Examples of Confirmation Bias

Real-life examples vividly demonstrate how confirmation bias operates in various contexts, affecting people’s perceptions, judgments, and behaviors.

For example, in the realm of politics, individuals tend to seek out news sources and social media platforms that align with their existing beliefs, reinforcing their preconceived notions and overlooking contradictory information. This selective exposure perpetuates confirmation bias, strengthening ideological divides and hindering open-minded discourse.

In finance, investors may overlook warning signs or expert advice that contradicts their bullish outlook on a particular stock, leading to potentially risky decisions driven by the desire to confirm their positive expectations. Such biased financial decisions can result in significant losses and missed opportunities.

Political Beliefs

Political beliefs are often subject to confirmation bias, where individuals selectively perceive and accept information that aligns with their political views, leading to polarization and resistance to contradictory evidence.

For example, individuals with a liberal ideology may be more inclined to trust and share news articles from left-leaning sources, while dismissing or ignoring those from conservative outlets. This bias not only affects how people interpret information but also influences their policy preferences. Someone biased towards a particular political party may be more likely to support policies associated with that party, even if evidence suggests they may not be the most effective solutions.

Conspiracy Theories

Confirmation bias plays a significant role in the propagation and persistence of conspiracy theories, as individuals tend to selectively recall and emphasize information that supports their conspiratorial beliefs while avoiding contradictory evidence.

This psychological phenomenon influences how people interpret and process information, leading them to seek out data that confirms their preexisting notions, even if such data lacks credibility or is distorted.

For instance, in the case of the moon landing conspiracy theory, proponents often focus on anomalies or inconsistencies in the footage, while disregarding overwhelming scientific evidence that supports the authenticity of the moon landing.

To counteract confirmation bias and combat the proliferation of conspiracy theories, fostering critical thinking skills and encouraging open-mindedness are essential.

Stereotypes

Stereotypes are often perpetuated by confirmation bias, where individuals rely on selective information that confirms existing stereotypes while dismissing or discounting evidence that challenges their validity.

This cognitive bias poses a significant challenge in dismantling stereotypes, as individuals tend to seek out information that aligns with their preconceived notions, reinforcing biased beliefs. This not only affects social perceptions of different groups but also influences interpersonal interactions, leading to misunderstandings and prejudices. In group dynamics, confirmation bias can lead to polarization and conflict, creating barriers to collaboration and empathy.

Superstitions

Superstitions often thrive on confirmation bias, with individuals attributing positive outcomes to superstitious beliefs due to selective recall of confirming instances, leading to self-fulfilling prophecies and reinforced irrational beliefs.

When individuals engage in confirmation bias, they tend to seek out information that aligns with their preconceived beliefs or expectations, while disregarding contradictory evidence. In the context of superstitions, this means that people are more likely to remember instances where their beliefs ‘came true,’ reinforcing the idea that the superstition was responsible for the outcome.

This phenomenon creates a cycle where the individual’s belief in the superstition grows stronger with each confirming instance, further solidifying the irrational connection between the superstition and positive outcomes.

How Does Confirmation Bias Impact Decision Making?

Confirmation bias exerts a profound impact on decision-making processes by limiting information gathering, favoring data that confirms existing beliefs, and distorting the evaluation of alternative options.

When individuals fall prey to confirmation bias, they tend to seek out information that aligns with their preconceived notions, reinforcing their initial standpoint. This cognitive bias can blur the line between objective analysis and subjective interpretation, leading to skewed perceptions of risks and rewards.

Moreover, confirmation bias can inadvertently diminish the diversity of viewpoints considered during decision-making, narrowing the scope of available solutions and potentially overlooking innovative or unconventional approaches that could yield better outcomes.

Limited Information Gathering

Confirmation bias leads to limited information gathering in decision-making, as individuals tend to selectively seek out data that aligns with their preconceptions, overlooking contradictory evidence that could enhance the decision-making process.

This tendency can significantly impact the quality of research findings and the robustness of decisions made based on those findings. When individuals focus solely on confirming their existing beliefs, they may miss crucial insights and alternative perspectives that could lead to more informed choices.

Confirmation bias can distort risk assessments, leading to potentially flawed conclusions and outcomes. To counteract this bias, it is essential to actively seek out diverse sources of information, challenge one’s own assumptions, and encourage a culture of open-mindedness and critical thinking within decision-making processes.

Biased Interpretation of Information

Confirmation bias distorts the interpretation of information by favoring explanations that reinforce existing beliefs, leading to skewed perceptions, faulty reasoning, and suboptimal decision outcomes.

When individuals face data or opinions that align with what they already believe, they tend to gravitate towards those, often ignoring conflicting evidence or alternative viewpoints. This biased filtering affects not only our comprehension but also our ability to engage in rational debates and problem-solving exercises. Subconsciously cherry-picking information to confirm our preconceptions forms a feedback loop that strengthens these beliefs, hindering the capacity for growth and adaptability.

Ultimately, such cognitive tendencies could hinder effective communication, dampen critical thinking skills, and impede objective decision-making processes, impacting personal and professional spheres alike.

Reinforcing Existing Beliefs

Confirmation bias reinforces existing beliefs by selectively processing information that supports those beliefs while discounting or dismissing contradictory evidence, creating a cycle of reinforcement that can be challenging to overcome.

Confirmation bias can lead individuals to seek out information that aligns with what they already believe, unknowingly ignoring anything that contradicts their preconceptions. This tendency to cherry-pick data not only strengthens one’s current beliefs but also hinders the ability to consider alternative viewpoints. Overcoming confirmation bias requires a conscious effort to actively engage with diverse perspectives and information sources.

How Can We Overcome Confirmation Bias?

Overcoming confirmation bias requires conscious effort and an awareness of its influence on our thinking, decision-making, and beliefs.

By actively seeking out contradictory evidence to our preconceived notions, we can challenge and reevaluate our beliefs objectively. Engaging in discussions with individuals who hold differing viewpoints can also help broaden our perspective and reveal blind spots in our thinking. It’s crucial to cultivate a habit of fact-checking sources and verifying information before drawing conclusions. Developing critical thinking skills enables us to spot biases in our own reasoning and assess the validity of arguments more effectively. Ultimately, breaking free from confirmation bias allows for more informed decision-making, fosters personal growth, and enhances our ability to process information accurately.

Actively Seek Out Different Perspectives

Actively seeking out different perspectives is a key strategy to counter confirmation bias, as it exposes individuals to diverse viewpoints, challenges preexisting beliefs, and broadens their understanding of complex issues.

By actively engaging with a variety of perspectives, individuals can enhance their decision-making skills by considering a wider range of possibilities and potential outcomes. This cognitive flexibility enables them to adapt more effectively to changing circumstances and make more informed choices. Embracing diverse viewpoints fosters interpersonal relationships by promoting understanding, empathy, and respect for others’ opinions.

From a personal growth standpoint, welcoming different perspectives contributes significantly to intellectual development, encouraging critical thinking, creativity, and problem-solving abilities. It allows individuals to question their own assumptions, see beyond their inherent biases, and approach challenges with a more open mind. This journey of exploration not only enriches one’s knowledge and worldview but also cultivates a broader sense of empathy and cultural competence.

Challenge Our Own Beliefs

Challenging our own beliefs is a proactive strategy to combat confirmation bias, encouraging individuals to question their assumptions, critically evaluate evidence, and consider alternative viewpoints with an open mind.

By engaging in self-reflection and actively seeking out information that may challenge preconceived notions, individuals can cultivate intellectual humility and a willingness to revise their beliefs in light of new evidence. This process of introspection and belief examination fosters intellectual growth and promotes a deeper understanding of complex issues.

Adopting a mindset of openness to revision can lead to more nuanced perspectives and improved decision-making. It allows individuals to break free from the constraints of confirmation bias and approach information with a critical eye, rather than seeking out only that which reinforces existing beliefs.

Consider All Available Evidence

Considering all available evidence is essential in combating confirmation bias, as it promotes thorough evaluation, fact-based decision-making, and a more nuanced understanding of complex issues.

When individuals gather empirical data from various credible sources, they equip themselves with the tools necessary to challenge preconceived notions and biases that may cloud judgment. Through critical analysis and synthesis of information, one can sift through the noise to uncover the most valid and reliable evidence. This evidence-based approach not only enhances problem-solving skills but also upholds research integrity by grounding conclusions in sound data. Such rigorous practices foster a culture of intellectual honesty and drive better-informed decision-making across various domains.

Confirmation bias is a pervasive cognitive phenomenon that profoundly influences how we perceive, process, and interpret information, underscoring the importance of critical thinking, evidence evaluation, and intellectual humility in navigating the complexities of belief formation and decision-making.

Confirmation bias can lead individuals to seek out information that confirms their existing beliefs while disregarding contradictory evidence, creating an echo chamber of biased opinions.

To combat this, individuals need to cultivate self-awareness and actively challenge their preconceptions by engaging with divergent viewpoints, embracing skepticism, and practicing reflective thinking.

Fostering a culture of evidence-based reasoning in society and institutions can help mitigate the detrimental effects of confirmation bias on public discourse and policy-making.

Frequently Asked Questions

What is confirmation bias in psychology?

Confirmation bias is a psychological phenomenon where individuals tend to seek out and interpret information in a way that supports their preexisting beliefs or hypotheses, while ignoring or dismissing contradictory evidence.

Can you provide a real-life example of confirmation bias?

Yes, a common example of confirmation bias in everyday life can be seen in political discussions, where individuals often only pay attention to news sources that align with their political beliefs and reject or discredit information from opposing viewpoints.

How does confirmation bias affect decision-making?

Confirmation bias can greatly impact decision-making by leading individuals to make choices that are not based on objective evidence, but rather on their own biased perceptions and beliefs. This can result in poor decision-making and missed opportunities.

Can confirmation bias be harmful?

Yes, confirmation bias can be harmful in many ways, such as reinforcing stereotypes and prejudices, hindering personal growth and development, and causing individuals to make decisions that are not in their best interest.

Is confirmation bias a conscious or unconscious process?

Confirmation bias can be both a conscious and unconscious process. While some individuals may deliberately seek out information that supports their beliefs, others may do so without even realizing it due to their ingrained biases.

How can one overcome confirmation bias?

Overcoming confirmation bias requires self-awareness and a willingness to challenge one’s own beliefs. It also involves seeking out diverse perspectives and considering all available evidence before making a decision. Seeking feedback from others can also help to reduce the effects of confirmation bias.


Dr. Sofia Alvarez is a clinical psychologist with over a decade of experience in counseling and psychotherapy. Specializing in anxiety disorders and mindfulness practices, she has contributed to numerous mental health initiatives and workshops. Dr. Alvarez combines her clinical expertise with a passion for writing to demystify psychology and promote mental wellness. She believes in the power of therapeutic storytelling and advocates for mental health awareness in various online platforms and community forums.


Why do we favor our existing beliefs?

What is confirmation bias?

The confirmation bias describes our underlying tendency to notice, focus on, and give greater credence to evidence that fits with our existing beliefs.

Confirmation bias illustration

Where this bias occurs


Consider the following hypothetical situation: Jane is the manager of a local coffee shop. She is a firm believer in the motto, “hard work equals success.” The coffee shop, however, has seen a slump in sales over the past few months. Since Jane strongly believes that “hard work” is a means to success, she concludes that the dip in the coffee shop’s sales is because her staff is not working hard enough. To account for this, Jane puts several measures in place to ensure that her staff is working consistently. Consequently, she ends up spending more money by having a greater number of employees staffed on a shift, exceeding the shop’s budget and thus contributing to overall losses.

Consulting with other business owners in her area, Jane is able to identify her store’s new, less visible location as the primary cause of her sales slump. Her belief in hard work as the most important metric of success led her to mistakenly identify employees’ lack of effort as the reason for the store’s falling revenue while ignoring evidence that pointed to the true cause: the shop’s poor location. Jane has fallen victim to confirmation bias, which caused her to notice and give greater credence to evidence that fits with her pre-existing beliefs.

As this example illustrates, our personal beliefs can weigh us down when conflicting information is present. Not only does it stop us from finding a solution, but we also may not even be able to identify the problem to begin with.  

Individual effects

Confirmation bias can lead to poor decision-making as it distorts the reality from which we draw evidence. When observed under experimental conditions, assigned decision-makers have a tendency to actively seek and assign greater value to information that confirms their existing beliefs rather than evidence that entertains new ideas. 


Confirmation bias can also have implications for our interpersonal relationships: first impressions cause us to selectively attend to our peers’ subsequent behavior. Once we have an expectation about a person, we try to reinforce this belief through our later interactions with them. In doing so, we can appear “closed-minded” or, conversely, remain in relationships that do not serve us.

Systemic effects

Considering the bigger picture, confirmation bias can have troubling implications. Major social divides and stalled policy-making may begin with our tendency to favor information that confirms our existing beliefs and to ignore evidence that does not. The more entrenched we become in our preconceptions, the greater influence confirmation bias has on our behavior and, consequently, on the people we choose to surround ourselves with. We can trap ourselves in a kind of echo chamber, and without being challenged, the biased thoughts prevail. This is especially concerning for socio-political cooperation and unity among the population.

Confirmation bias can exacerbate social exclusion and tensions. In-group bias is the tendency to favor those with whom you identify, in doing so, assigning them positive characteristics. That same inclination is not present for the out-group, which consists of individuals who you feel you share less in common with. Combined with confirmation bias, there is a lot of opportunity for prejudgment and stereotyping. Confirmation bias may lead us to look for favorable traits in our in-group and avoid any of our shortcomings. It may also cause us to be wary of the out-group and interpret their behavior through the lens of what we already assume.

Confirmation bias is particularly present in the consumption of news and media. The ever-evolving ease of access has allowed the population to personally curate what they consume. While it is evident that people cling to sources that support their political orientation, confirmation bias can also influence how news is reported. Journalists and media outlets are not immune to bias, they too are selective with their sources, what they choose to present, and how that information is conveyed. 2 Zooming out, these outlets and their leanings can have a strong influence on consumers’ knowledge, beliefs, and even voting patterns.  

How it affects product

Marketing and reviews are where we can see the largest influence of confirmation bias as it pertains to products. Most consumers rely on product reviews and advertisements to advise them on the benefits of various items. For example, influencers and celebrities are a great way to promote products. This can expose new people to the brand and broaden the customer demographic. However, it is important to be careful about who you allow to be part of your promotional campaign. By using controversial figures to recommend your product, you may be damaging the brand’s reputation. If any of your clients think poorly of the individual you endorsed, their first impression of your company will be a negative one. This is confirmation bias at work: if we dislike a celebrity who endorses a product, we are more likely to attend to information that suggests that we will also dislike the product.

Consumers will often consult reviews before buying a product, since these give them a good idea of whether the item will be useful and valuable. If they are primed with an abundance of positive reviews, they may then seek out confirming evidence when using the product themselves.

Confirmation Bias and AI


When using artificial intelligence, we are in control of how we prompt the system. While these tools are meant to produce unbiased and objective information, the individual using them may steer the response in a direction that coincides with their preexisting beliefs. For example, if you are using AI software to research different political candidates, the manner in which you ask the question matters. Depending on the tool you use, “Why should I vote for X instead of Y?” and “What are the strengths of candidate X and candidate Y?” will turn up very different results. Depending on what we “want to hear,” we may unconsciously prompt the system to reinforce our initial thought pattern.

As mentioned, though we like to think of AI as unbiased, the reality may be a little murkier. Artificial intelligence uses large data sets to inform itself on various topics. Due to the size and comprehensiveness of these data sets, they may reflect the biases that are present in the world around us. While it may be harmless in certain situations, it can also perpetuate negative stereotypes, or push a certain narrative as a result of the data used to program it. 

Why it happens

Confirmation bias is a cognitive shortcut we use when gathering and interpreting information. Evaluating evidence takes time and energy, and so our brain looks for shortcuts to make the process more efficient.

Confirmation bias is aided by several processes that all act on different stages to protect the individual from cognitive dissonance or the discomfort associated with the violation of one’s beliefs. These processes include:

  • Selective exposure refers to the filtering of information: the individual avoids challenging or contradictory information altogether.
  • Selective perception occurs when the individual is exposed to information that conflicts with their standing beliefs, yet distorts that information to affirm their existing views.
  • Selective retention, a major principle in marketing, holds that individuals are more likely to remember information that is consistent with what they already believe to be true. 3

Our brains use shortcuts

Heuristics are the mental shortcuts that we use for efficient, though sometimes inaccurate, decision-making. Though it is debated whether confirmation bias can be categorized as a heuristic, it is certainly a cognitive strategy. Specifically, it helps us avoid cognitive dissonance by searching for and attending to information that confirms what we already believe.

It makes sense that we do this. Oftentimes, humans need to make sense of information quickly; however, forming new explanations or beliefs takes time and effort. We have adapted to take the path of least resistance, sometimes out of necessity.

Imagine our ancestors hunting. An intimidating animal is charging toward them, and they only have a few seconds to decide whether to hold their ground or run. There is no time to consider all the different variables involved in a fully informed decision. Past experience and instinct might cause them to look at the size of the animal and run. However, the presence of other hunters now tilts the chances of successful conflict in their favor. Evolutionary psychologists believe that the modern use of mental shortcuts for in-the-moment decision-making is based on past survival instincts. 1  

It makes us feel good about ourselves

 No one likes to be proven wrong, and when information is presented that violates our beliefs, it is only natural to push back. Deeply held views often form our identities, so disproving them can be uncomfortable. We might even believe that being wrong suggests that we lack intelligence. As a result, we often look for information that supports rather than refutes our existing beliefs.

We can also see the effects of confirmation bias in group settings. Clinical psychologist Jennifer Lerner in collaboration with political psychologist Phillip Tetlock proposed that through our interactions with others, we update our beliefs to conform to the group norm.  The psychologists distinguished between confirmatory thought, which seeks to rationalize a certain belief, and exploratory thought, which takes into consideration many viewpoints before deciding where you stand. 

Confirmatory thought in interpersonal settings can produce groupthink, in which the desire for conformity results in dysfunctional decision-making. So, while confirmation bias is often an individual phenomenon, it can also take place in groups of people.

Why it is important

As mentioned above, confirmation bias can be expressed individually or in a group context. Both can be problematic and deserve careful attention.

At the individual level, confirmation bias affects our decision-making. Our choices cannot be fully informed if we are only focusing on evidence that confirms our assumptions. Confirmation bias causes us to overlook pivotal information both in our careers and in everyday life. A poorly informed decision is likely to produce suboptimal results because not all of the potential alternatives have been explored.  

A voter might stand by a candidate while dismissing emerging facts about the candidate’s poor behavior. A business executive might fail to investigate a new opportunity because of a negative experience with similar ideas in the past. An individual who sustains this sort of thinking may be labeled “close-minded.” Because confirmation bias can cause us to miss out on opportunities and make less informed choices, it is important to approach situations, and the decisions they call for, with an open mind.

At a group level, it can produce and sustain the groupthink phenomenon. In a culture of groupthink, decision-making can be hindered by the assumption that harmony and group coherence are the values most crucial to success. This reduces the likelihood of disagreement within the group.

Imagine if an employee at a technology firm did not disclose a revolutionary discovery she made for fear of reorienting the firm’s direction. Likewise, this bias can prevent people from becoming informed on differing views and, by extension, from engaging in the constructive discussion that many democracies are built on.

How to avoid it

Confirmation bias is likely to occur when we are gathering information for decision-making. It occurs subconsciously, meaning that we are unaware of its influence on our decision-making.

As such, the first step to avoiding confirmation bias is being aware that it is a problem. By understanding its effects and how it works, we are more likely to identify it in our decision-making. Psychology professor and author Robert Cialdini suggests several approaches to recognizing when these biases are influencing our decision-making:

First, listen to your gut feeling. We often have a physical reaction to uncomfortable stimuli, like when a salesperson is pushing us too far. Even if we have complied with similar requests in the past, we should not use that precedent as a reference point. Recall past actions and ask yourself: “Knowing what I know now, if I could go back in time, would I make the same commitment?”

Second, because the bias is most likely to occur early in the decision-making process, we should focus on starting with a neutral fact base. This can be achieved by diversifying where we get our information from, and having multiple sources. Though it is difficult to find objective reporting, reaching for reputable, neutral outlets can allow us to have more agency in our beliefs. 

Third, when hypotheses are being drawn from assembled data, decision-makers should also consider having interpersonal discussions that explicitly aim at identifying individual cognitive bias in the hypothesis selection and evaluation. Engaging in debate is a productive way to challenge our views and expose ourselves to information we may have otherwise avoided. 

While it is likely impossible to eliminate confirmation bias completely, these measures may help manage cognitive bias and make better decisions in light of it.

How it all started


Confirmation bias was known to the ancient Greeks. It was described by the classical historian Thucydides in his text The History of the Peloponnesian War. He wrote: “It is a habit of mankind to entrust to careless hope what they long for and to use sovereign reason to thrust aside what they do not want.” 4

In the 1960s, Peter Wason first described this phenomenon as confirmation bias. In what’s known as Wason’s Selection Task, he conducted an experiment in which participants were presented with four cards. Each card had a number on one side and a color, red or brown, on the other. Two cards lay number-side up, showing a 3 and an 8, while the other two lay color-side up, one red and one brown. Participants were told that if the number on a card was even, the opposite side would be red. They were then tasked with figuring out whether this rule was true by flipping over two cards of their choosing.

Many of the participants chose to turn over the card with the number 8 as well as the red card, as this was consistent with the rule they were given. In reality, this does little to actually test the rule. Indeed, turning over the “8” card will confirm what the experimenter said, but one also needs to turn over the brown card to verify that it is an odd number.
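The logic of the correct choice can be sketched in a few lines of code. The card contents mirror the example above; the function name and data layout are my own illustration, not part of Wason's procedure.

```python
# Rule under test: "if a card's number is even, its color side is red."
# Visible faces of the four cards (the other side is hidden):
cards = {
    "3": ("number", 3),
    "8": ("number", 8),
    "red": ("color", "red"),
    "brown": ("color", "brown"),
}

def can_falsify(face_type, value):
    """A flip is informative only if it could reveal the one
    counterexample the rule forbids: an even number paired with brown."""
    if face_type == "number":
        return value % 2 == 0   # only an even card could hide brown illegally
    return value == "brown"     # only the brown card could hide an even number

informative = {name for name, (t, v) in cards.items() if can_falsify(t, v)}
print(sorted(informative))  # ['8', 'brown'] -- not the intuitive '8' and 'red'
```

Flipping the red card can only ever confirm the rule (an odd number on its back violates nothing), which is exactly why choosing it feels right and tests nothing.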

This experiment demonstrates confirmation bias in action: we seek to confirm what we believe to be true while disregarding information that could potentially falsify it. 5

Example 1 – Blindness to our own faults

A major study carried out by researchers at Stanford University in 1979 explored the psychological dynamics of confirmation bias. The participants were undergraduate students who held opposing viewpoints on the topic of capital punishment. They were asked to evaluate two studies on the topic; unbeknownst to them, both studies were fictitious.

One of the fabricated studies provided data supporting the argument that capital punishment deters crime, while the other supported the opposing view: that capital punishment has no appreciable effect on overall criminality in the population.

While both studies were entirely fabricated by the Stanford researchers, they were designed to present “equally compelling” objective statistics. The researchers discovered that responses to the studies were heavily influenced by participants’ pre-existing opinions:

  • The participants who initially supported the deterrence argument in favor of capital punishment considered the anti-deterrence data unconvincing and thought the data in support of their position was credible;
  • Participants who held the opposing view at the beginning of the study reported the same but in support of their stance against capital punishment.

So, after being confronted both with evidence that supported capital punishment and evidence that refuted it, both groups reported feeling more committed to their original stance. The net effect of having their position challenged was a re-entrenchment of their existing beliefs. 6

Example 2 – Effects of the internet

The “filter bubble effect” is an example of technology amplifying and facilitating our cognitive tendency toward confirmation bias. The term was coined by internet activist Eli Pariser to describe the intellectual isolation that can occur when websites use algorithms to predict and present information a user would want to see. 7

This means that the more we use particular websites and content networks, the more likely we are to encounter content that we prefer, while algorithms exclude content that runs contrary to our preferences. We normally prefer content that confirms our beliefs because it requires less critical reflection. So, filter bubbles tend to favor information that confirms your existing opinions and to exclude disconfirming evidence from your online experience.

In his seminal book, "The Filter Bubble: What the Internet Is Hiding from You," Pariser uses the example of internet searches for an oil spill to illustrate the filter bubble effect:

"In the spring of 2010, while the remains of the Deepwater Horizon oil rig were spewing crude oil into the Gulf of Mexico, I asked two friends to search for the term ‘BP’. They’re pretty similar — educated, white, left-leaning women who live in the Northeast. But the results they saw were quite different. One of my friends saw investment information about BP. The other saw the news. For one, the first page results contained links about the oil spill; for the other, there was nothing about it except for a promotional ad from BP." 7

If this had been the only source of information that these women were exposed to, they would surely have formed very different conceptions of the BP oil spill. The search engine tailored results to the beliefs suggested by their past searches, picking results it predicted would fit their interests. Unbeknownst to them, it facilitated confirmation bias.

While the implications of this particular filter bubble may have been harmless, filter bubbles on social media platforms have been shown to influence elections by tailoring the content of campaign messages and political news to different subsets of voters. This could have a fragmenting effect that inhibits constructive democratic discussion, as different voter demographics become increasingly entrenched in their political views as a result of a curated stream of evidence that supports them.

Confirmation bias describes our underlying tendency to notice, focus on, and give greater credence to evidence that fits with our existing beliefs.

Confirmation bias is a cognitive shortcut we use when gathering and interpreting information. Evaluating evidence takes time and energy, so our brain looks for shortcuts to make the process more efficient. We look for evidence that best supports what we already hold to be true, because the most readily available hypotheses are the ones we already have. Another reason we show confirmation bias is that it protects our self-esteem. No one likes feeling bad about themselves, and realizing that a belief we valued is false can have this effect. As a result, we often look for information that supports rather than disproves our existing beliefs.

Example #1 - Blindness to our own faults

A 1979 study by researchers at Stanford found that after being confronted with equally compelling evidence in support of capital punishment and evidence that refuted it, subjects reported feeling more committed to their original stance on the issue. The net effect of having their position challenged was a re-entrenchment of their existing beliefs.

Example #2 - Establishing personalized networks online

Modern preference algorithms have a “filter bubble effect,” which is an example of technology amplifying and facilitating our tendency toward confirmation bias. Websites use algorithms to predict the information and content that a user wants to see. We normally prefer media that confirms our beliefs because it requires less critical reflection. So, filter bubbles might exclude information that clashes with your existing opinions as informed by your online activity. 

Confirmation bias is likely to occur when we are gathering the information needed to make decisions. It is also subconscious; we are unaware of its influence on our decision-making. As such, the first step to avoiding confirmation bias is making ourselves aware of it. Because confirmation bias is most likely to occur early in the decision-making process, we should focus on starting with a neutral fact base. This can be achieved by having multiple objective sources of information.

Related TDL articles

Can Overcoming Implicit Gender Bias Boost a Company's Bottom Line?

This article argues that gender diversity in a firm is associated with higher firm performance. By addressing and drawing on confirmation bias (among other relevant psychological principles), firms may be able to increase diversity and thereby increase performance.

Learning Within Limits: How Curated Content Affects Education 

This article argues that the use of ‘trigger warnings’, modern preferences algorithms, and other such cues create a highly curated stream of information that facilitates cognitive biases such as confirmation bias. The author notes that this can prevent us from empathizing with others and consolidating our opinions in light of differing ones.

  1. Healy, P. (2016, August 18). Confirmation bias: How it affects your organization and how to overcome it. Business Insights Blog. https://online.hbs.edu/blog/post/confirmation-bias-how-it-affects-your-organization-and-how-to-overcome-it
  2. Ling, R. (2020). Confirmation bias in the era of mobile news consumption: The social and psychological dimensions. Digital Journalism, 8(5), 596–604. https://doi.org/10.1080/21670811.2020.1766987
  3. Hastall, M. R. (2020). Selective exposure, perception, and retention. The SAGE International Encyclopedia of Mass Media and Society, 1537–1539. https://doi.org/10.4135/9781483375519
  4. Schlosser, J. A. (2013). “Hope, danger’s comforter”: Thucydides, hope, politics. The Journal of Politics, 75(1), 169–182. https://doi.org/10.1017/s0022381612000941
  5. Badcock, C. (2012, May 5). Making sense of Wason. Psychology Today. https://www.psychologytoday.com/ca/blog/the-imprinted-brain/201205/making-sense-wason
  6. Lord, C. G., Ross, L., & Lepper, M. R. (1979). Biased assimilation and attitude polarization: The effects of prior theories on subsequently considered evidence. Journal of Personality and Social Psychology, 37(11), 2098–2109. https://doi.org/10.1037/0022-3514.37.11.2098
  7. Pariser, E. (2012). The filter bubble: What the internet is hiding from you. Penguin Books.

About the Authors

Dan Pilat

Dan is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. Dan has a background in organizational decision making, with a BComm in Decision & Information Systems from McGill University. He has worked on enterprise-level behavioral architecture at TD Securities and BMO Capital Markets, where he advised management on the implementation of systems processing billions of dollars per week. Driven by an appetite for the latest in technology, Dan created a course on business intelligence and lectured at McGill University, and has applied behavioral science to topics such as augmented and virtual reality.


Dr. Sekoul Krastev

Sekoul is a Co-Founder and Managing Director at The Decision Lab. He is a bestselling author of Intention - a book he wrote with Wiley on the mindful application of behavioral science in organizations. A decision scientist with a PhD in Decision Neuroscience from McGill University, Sekoul's work has been featured in peer-reviewed journals and has been presented at conferences around the world. Sekoul previously advised management on innovation and engagement strategy at The Boston Consulting Group as well as on online media strategy at Google. He has a deep interest in the applications of behavioral science to new technology and has published on these topics in places such as the Huffington Post and Strategy & Business.



Humanities LibreTexts

2.2: Overcoming Cognitive Biases and Engaging in Critical Reflection


  • Nathan Smith et al.


Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. Additionally, the critical aspect engages in a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.

CONNECTIONS

See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.


Confirmation Bias

One of the most common cognitive biases is confirmation bias, which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 8. Subjects were told to generate examples to test their hypothesis. What he found is that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
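Wason's 2-4-6 setup can be sketched in Python (the "successive doubling" hypothesis is one typical subject guess reported in discussions of the study; the function names are my own). The sketch shows why confirming tests never expose a wrong hypothesis, while a single falsifying test does:

```python
def true_rule(t):
    """The experimenter's actual rule: any strictly ascending triple."""
    return t[0] < t[1] < t[2]

def doubling_hypothesis(t):
    """A typical subject's guess after seeing 2, 4, 8: each number doubles."""
    return t[1] == 2 * t[0] and t[2] == 2 * t[1]

# Confirmation strategy: only test triples that fit your own hypothesis.
# Every one passes, so the (wrong) hypothesis feels more and more certain.
confirming_tests = [(1, 2, 4), (3, 6, 12), (5, 10, 20)]
print(all(true_rule(t) for t in confirming_tests))  # True -> "confirmed"

# Falsification strategy: test a triple your hypothesis says should FAIL.
probe = (1, 2, 3)
print(doubling_hypothesis(probe))  # False: my rule predicts "no"
print(true_rule(probe))            # True: experimenter says "yes" -> hypothesis refuted
```

Only the second strategy can reveal that the real rule is broader than the guessed one, which is the sense in which falsification guards against confirmation bias.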

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let’s say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true.

Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is thinking that attaches a value to things in which you have already invested resources that is greater than the value those things have today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad” by continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don’t want to lose their investment even when they are clearly headed for failure and ought to cut their losses.

A similar type of faulty reasoning leads to the gambler’s fallacy , in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
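The statistical independence behind the gambler's fallacy is easy to check with a small simulation (a minimal sketch; the streak length of three and the sample size are arbitrary choices of mine):

```python
import random

random.seed(0)

# Simulate many fair coin flips and look at what follows three heads in a row.
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

after_streak = [flips[i + 3]
                for i in range(len(flips) - 3)
                if flips[i] and flips[i + 1] and flips[i + 2]]

# If the gambler's fallacy were right, tails would be "due" after a streak
# and this frequency would dip well below 0.5. Independence keeps it at ~0.5.
print(round(sum(after_streak) / len(after_streak), 3))
```

However long the preceding run of heads, the observed frequency of heads on the next flip stays at roughly one half.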

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.

Table 2.1 summarizes these common cognitive biases.

Table 2.1 Common Cognitive Biases

Think Like A Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the five cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?


17 Confirmation Bias Examples


A confirmation bias is when we look for information that supports our preexisting opinion. We start with a view of a particular issue and then search for information that upholds that view.

Although it is a bias, it is not usually intentional, meaning it's a type of implicit bias. It reflects the mind's natural tendency to rely on shortcuts when processing information; doing so is simply easier than subjecting our views to contradictory information, which is discomforting.

We yield to the confirmation bias when reading about politics, encountering people of particular demographic profiles, or selecting articles to read on the internet.

Definition of Confirmation Bias

Of all the heuristics and biases that psychologists have identified, none play a greater role in science than the confirmation bias. Science is supposed to be objective and without bias.

In the words of Wason (1960), one of the earliest to investigate the confirmation bias:

“…scientific inferences are based on the principle of eliminating hypotheses, while provisionally accepting only those which remain. Methodologically, such eliminative induction implies adequate controls so that both positive and negative experimental results give information about the possible determinants of a phenomenon” (p. 129).

Searching for information that disconfirms our theory is at the heart and soul of scientific research; the exact opposite of the confirmation bias.

Examples of Confirmation Bias

1. Optimistic People

Being optimistic is good for a person’s mental health, to some extent. Seeing the positive side of everything can keep us in a good mood. But optimists also seem to have a talent for ignoring negative or unpleasant information.

Being pessimistic is just the opposite. Always seeing the negative side of a situation can make us feel depressed and lose a sense of hope.

Both sides of this coin are good examples of confirmation bias. The optimist only looks for positive information and the pessimist only looks for negative information. The glass will always be half full or half empty, depending on your personal outlook.

Of course, if you are a scientist then you must be neutral and objective. Therefore, the glass is neither half full nor half empty: it’s at 50% capacity.

2. Refs Making Bad Calls

We tend to think a referee made a good call when it is beneficial for our team, but if it goes against our team, there’s a good chance we will think the referee made a bad call.

Watching your favorite team lose a game because of bad officiating can be very frustrating. For some, it might seem like the end of the world. Fans on the other side of the field, however, will have a completely opposite opinion.

Discussions after the game can offer plenty of examples of the confirmation bias. Fans on the losing team can rattle off a number of calls that went against their team. They can cite details of the play that support their view, and maybe even reference a few pages from the official rulebook. Fans on the opposing side can do the exact same thing in support of their team’s victory.

In most cases, as in this one, the confirmation bias applies to both sides of the same coin.  

3. News Reporting  

Today, many news reporters are expected to curate media that supports the political perspective of the news organization’s owners.

Reporters are supposed to be neutral and objective. At least, in theory that is how it is supposed to be. In modern times however, reality is a bit different.

In some Western countries, it is easy to see a clear and strong political bias in various news agencies. In some cases, nearly every news story will have a political tilt that favors a particular political ideology.

This is evident in the selection of stories covered, the angle presented, the facts cited, and even the types of guests interviewed.

Although the confirmation bias is usually considered unintentional, that is not always the case, especially when it comes to politics.

4. Believing a Horoscope  

Horoscopes tend to be highly interpretive, allowing people to believe them no matter what: you simply find the interpretation of the horoscope that supports your own perspective.

Reading one’s horoscope can be entertaining. Who doesn’t want to know what will happen next week or discover their destiny? Maybe true love is just around the corner.

Horoscopes are deliberately written in a slightly vague way. If you analyze the statements carefully, you will discover that each one is open to a lot of interpretation. Events that happen afterwards can be construed so that they seem very consistent with what the horoscope predicted.

True believers will always find a way to fit what happened to them with what their horoscope predicted. This is the magic of horoscopes and the power of the confirmation bias.

5. Criminal Investigations

Detectives often believe they can recognize patterns across their investigations. When they come up with a theory, they go looking for evidence that confirms it.

Detectives are people too. When working a crime scene, they must look at all the evidence objectively and seek additional data to solve the crime. Maybe they have seen similar situations so many times before in their career that they develop a “working theory” about what happened.

That is where the trouble starts. Since they have a theory already, they may begin to search for evidence that is consistent with that theory. They may interview witnesses that fit a certain stereotype and ask specific questions that are also a little biased.

During the process of the investigation, the detectives accumulate more and more evidence that fits their expectations. Eventually, an arrest will occur. Hopefully, the legal system will work and the right suspect will be tried and convicted.  

6. Conducting a Research Literature Review

When conducting a literature review of the available research, students should read all of the studies objectively. However, they may begin their project with a preliminary theory of the phenomenon being studied, which clouds their objectivity.

This is where the confirmation bias begins. Since they already favor a particular theory, they may input search terms that are consistent with that theory. That means the results will display studies that match the theoretical view they favor.

Instead of intentionally seeking out information that is inconsistent with their preconceived views, like a good scientist is supposed to do, they only read studies that confirm their favored theory.

During the process of writing the literature review, they describe a great deal of research that supports that theory. Eventually the paper is turned in to their research advisor. Hopefully, the professor’s review will be thorough and the paper will be returned to the student for revision.

7. Stereotype Reinforcement    

Stereotypes often take on a life of their own. Once formed they seem to be very hard to break. This inability to break a stereotype is referred to as belief perseverance.

When we observe situations that involve people from specific backgrounds or demographics, everything we see will be filtered through the lens of our stereotypes.

Unfortunately, even information that is blatantly contradictory to those stereotypes can go completely unnoticed. It’s as if those actions were invisible. When this happens, it makes it impossible for the stereotype to be broken.

When recounting the situation to others, a person may only include descriptions of another person’s behavior that fit the stereotype they had for them. Thus, the stereotype is perpetuated by way of confirmation bias.

8. Forming Friendships with People that Agree with You  

“Birds of a feather flock together.” This is an old saying, and it is very true. People automatically gravitate toward others who are similar to themselves because they find them agreeable. It is human nature.

We tend to form friendships with people who are similar to us in terms of demographics such as age, race, ethnicity, and socioeconomic status (SES). We should include socio-political attitudes in that list as well.

This creates a confirmation bias by means of self-selection. By selecting similar others, we are also by default selecting to be exposed to information and views that are consistent with our own.

Although it is comforting to have our views confirmed by others, it is also a bit unhealthy and keeps us from growing as human beings.

9. Phrasing Questions in a Survey    

Survey questions can influence people into giving certain answers. When this happens, the study becomes invalid due to confirmation bias.

Conducting a survey is a great way to gather the opinions of a large number of people in a relatively short period of time. In most situations, the goal is to obtain an objective insight into an issue or collect the opinions of others.

Phrasing the questions can be a bit tricky, however. The researchers may have already formed an opinion regarding the issue, which can lead to an unconscious bias in how they word the questions.

For example: “Why do you think the government should do more to help struggling families?” Or: “In what ways should employers give their staff more influence in marketing decisions?” Of course, these two questions are obviously skewed in a certain direction. Each one suggests an opinion and then asks the respondent to support it. That’s not exactly the neutral phrasing that researchers are supposed to take when collecting data.

10. Placebo Effects

A placebo is an intervention – often a medication – that doesn’t actually have any benefit. For example, it might just be a sugar pill. Nevertheless, people often perceive placebos to be helpful because they’re looking for signs that confirm the placebo has worked.

Essential oils are one example. They are concentrated liquids made from various plants. The oils can be heated and evaporated using a special device or applied to the skin. Many essential oils should not be ingested orally and can be very dangerous if done so. Common essential oils include lavender, peppermint, and tea tree.

The expectation that something will work can cloud our judgement. For example, if we apply a particular oil to our skin that is supposed to make it firmer, then when we look in the mirror, we may interpret what we see as firmer skin.

The confirmation bias is at work again. In this example, it literally affects what we see.

11. Internet Algorithms

Internet algorithms help confirm our own biases because they learn about our preferences and present information to us that we’re most likely to enjoy and click on.

The use of algorithms impacts everything we do on the internet. Algorithms track the search terms we input, the ads we click on, and which news stories we select to read.

Over time, an algorithm can build a surprisingly detailed profile of our personality characteristics and socio-political views.

That profile then influences the type of information we are exposed to because the algorithms will feed us a particular type of news story or advertisement. Our profile data might be given to numerous corporations that then send us tailored content as well.

Although we may not realize it, the algorithms are creating a kind of “bubble of self-confirmation bias.” By being sent content that is already consistent with our profile, our preferences and views are being reinforced.

This is an example of a passive form of confirmation bias of which we are completely unaware.
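The feedback loop described above can be sketched as a toy simulation. This is purely illustrative and not any real platform's algorithm; the topics and the "recommend the most-clicked topic" rule are assumptions made for the example:

```python
# Toy illustration of a self-confirming feed (not a real recommender system):
# the feed serves whichever topic the user has clicked most, so a single
# early preference gets reinforced into a "bubble".
from collections import Counter

def recommend(click_counts, topics):
    """Serve the topic with the most recorded clicks (ties broken by list order)."""
    return max(topics, key=lambda t: click_counts[t])

topics = ["politics_left", "politics_right", "sports"]
clicks = Counter({"politics_left": 1})  # one early click on one topic

feed = []
for _ in range(5):
    item = recommend(clicks, topics)
    feed.append(item)
    clicks[item] += 1  # the user engages with whatever they are shown

# The single early click snowballs: every recommendation is the same topic,
# and the other topics never get a chance to appear.
```

Even this crude rule reproduces the bubble: because engagement with the recommended item feeds back into the profile, the gap between the favored topic and everything else only widens over time.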

12. Political TV Channels  

In a country with a free press, TV channels can often be categorized by their political leanings. In the US, for example, there are two major networks that are biased in polar opposite directions.

One channel, MSNBC, is overwhelmingly liberal. The newscasters, guests, and stories have a clearly discernible agenda. The other major channel, FOX News, is skewed in the conservative direction. The newscasters, guests, and stories also have a clear political orientation.

In terms of the confirmation bias, the viewers that have a preference for one channel or the other are proactively selecting a side. By doing so, they are consciously making a choice to be exposed to information that confirms their already existing, well-ingrained perspective.

Although the confirmation bias is usually considered to be an unconscious mental shortcut, in the case of selecting a particular news channel, it seems to take on a very conscious form.

13. Biased Consumer Research

Conscious researcher bias occurs when a researcher is conducting research purely to confirm their own perspective.

Many large companies with a vested interest in research that supports their product will pay or donate money to researchers in order to have them produce findings that portray the product favorably.

For example, you may have heard of oil companies that supported research denying climate change. Such situations are clear examples of unethical behavior in the academic and scientific fields.

14. Judge and Jury Bias

Judges can also be highly biased in their findings. For example, in the United States, the Supreme Court is commonly described as divided between ‘liberal’ and ‘conservative’ justices.

This sort of bias is rife in the legal system. Depending on the judge you get, you may get a strict or lenient perspective.

It becomes confirmation bias when a judge gives you a harsh sentence due to their stereotypical perspectives of the defendant. For example, a judge might see a single mother and think “single mothers are terribly irresponsible.” This becomes the anchor for their thought process (as shown in the anchoring bias heuristic).

Suddenly, this judge is going to want to confirm their bias against the defendant and see all evidence through that biased lens, and you may get a tougher sentence as a result.

15. Hindsight Bias

Hindsight bias is a sub-type of confirmation bias. It refers to situations when people reflect on a situation and say “well it’s obvious it would have turned out that way!” Here, we’re using the benefit of hindsight to show off our brilliance: “see, it confirmed my perspective!”.

For example, people might have seen a car crash and said “Well, it was obvious that person was going to get into a crash. They’re a terrible driver.”

However, this is hindsight bias because the situation was probably far more complex in the moment. Nevertheless, with the benefit of hindsight, we use past events to paint a black-and-white picture of something that was “obvious” and confirms that our beliefs were correct. By doing this, we create a simplified vision of the world in a way that supports our own biases.

16. The Halo Effect

The halo effect is a type of bias where you see that someone is excellent at one thing and so assume they’re excellent at everything.

For example, a company might hire a new employee who did a really great job at one project in the first week of the job. From there on, this employee becomes the ‘golden boy’ and the boss thinks he can’t do anything wrong.

The other employees may look at them and think “wow, he did one thing right, but he hasn’t outperformed since then. Why does the boss think he’s so great?”

What’s happened here is the boss has passed an initial judgement of the new employee and is now looking for examples to confirm their first impression. They’re biased toward confirming their initial belief that this is an excellent employee.

17. The Horns Effect

The horns effect is the opposite of the halo effect. It occurs when you believe someone is bad, so you see everything they do negatively.

This may happen, for example, when you’re against a politician on the other political team. Suddenly, everything he does is bad. Not because you’re objective, but because you’re seeking confirmation. You suddenly think small mishaps mean the person isn’t worthy of standing for office, and you disagree with anything they do, no matter what.

At the same time, it’s likely that you’ll apply the halo effect to the politician you support and want to defend their every move to confirm your original perspective: that they’re the best politician!

Confirmation vs Belief Bias

Belief bias and confirmation bias are similar concepts in psychology; however, they differ in a subtle way.

Here’s the difference.

  • Confirmation bias refers to the tendency to seek out information that supports our desired conclusions.
  • Belief bias refers to the tendency to make judgments about the validity of an argument based on the believability of its conclusions. It often leads us to believe an argument that is illogical simply because it got us to the conclusion we desire.

The key difference is that belief bias involves making judgments about the validity of the logic based on our expected conclusions, whereas confirmation bias involves making judgments about the conclusion itself to suit our desires. Often, when engaging in confirmation bias, we may employ belief bias as a way to support our desired conclusion.

See Also: A Full List of The Different Types of Bias

Confirmation bias is a very self-serving type of bias. It can take many shapes and affect our thinking processes across a wide range of contexts. Sometimes it clouds our perceptions of not-so-significant events, such as the outcome of a game or the effectiveness of essential oils.

On the other hand, it can also have a tremendous impact on quite significant events. For example, how detectives approach solving a homicide or the political leanings of where we get our news. Even the algorithms that are such an integral part of the internet are feeding us information that confirms our already existing opinions and preferences.

The confirmation bias generates a vicious cycle that perpetuates our views of the world around us and leads to the formation of a thick-layered bubble within which we live.

How to burst that bubble is a question we all must consider if we hope to grow as individuals.

Moskowitz, G., & Carter, D. (2018). Confirmation bias and the stereotype of the black athlete. Psychology of Sport and Exercise, 36, 139-146. https://doi.org/10.1016/j.psychsport.2018.02.010

Mynatt, C., Doherty, M., & Tweney, R. (1977). Confirmation bias in a simulated research environment: An experimental study of scientific inference. Quarterly Journal of Experimental Psychology, 29, 85-95. https://doi.org/10.1080/00335557743000053

Rajsic, J., Wilson, D. E., & Pratt, J. (2015). Confirmation bias in visual search. Journal of Experimental Psychology: Human Perception and Performance, 41(5), 1353–1364. https://doi.org/10.1037/xhp0000090

Rassin, E., Eerland, A., & Kuijpers, I. (2010). Let’s find the evidence: An analogue study of confirmation bias in criminal investigations. Journal of Investigative Psychology and Offender Profiling, 7, 231-246. https://doi.org/10.1002/jip.126

Wason, P. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12, 129-140. https://doi.org/10.1080/17470216008416717

Westerwick, A., Johnson, B., & Knobloch-Westerwick, S. (2017). Confirmation biases in selective exposure to political online information: Source bias vs. content bias. Communication Monographs, 84(3), 343-364. https://doi.org/10.1080/03637751.2016.1272761

Dave Cornell (PhD)

Dr. Cornell has worked in education for more than 20 years. His work has involved designing teacher certification for Trinity College in London and in-service training for state governments in the United States. He has trained kindergarten teachers in 8 countries and helped businessmen and women open baby centers and kindergartens in 3 countries.


Chris Drew (PhD)

This article was peer-reviewed and edited by Chris Drew (PhD). The review process on Helpful Professor involves having a PhD level expert fact check, edit, and contribute to articles. Reviewers ensure all content reflects expert academic consensus and is backed up with reference to academic studies. Dr. Drew has published over 20 academic articles in scholarly journals. He is the former editor of the Journal of Learning Development in Higher Education and holds a PhD in Education from ACU.



Effectiviology

The Confirmation Bias: Why People See What They Want to See

The Confirmation Bias

The confirmation bias is a cognitive bias that causes people to search for, favor, interpret, and recall information in a way that confirms their preexisting beliefs. For example, if someone is presented with a lot of information on a certain topic, the confirmation bias can cause them to only remember the bits of information that confirm what they already thought.

The confirmation bias influences people’s judgment and decision-making in many areas of life, so it’s important to understand it. As such, in the following article you will first learn more about the confirmation bias, and then see how you can reduce its influence, both in other people’s thought process as well as in your own.

How the confirmation bias affects people

The confirmation bias promotes various problematic patterns of thinking, such as people’s tendency to ignore information that contradicts their beliefs. It does so through several types of biased cognitive processes:

  • Biased search for information. This means that the confirmation bias causes people to search for information that confirms their preexisting beliefs, and to avoid information that contradicts them.
  • Biased favoring of information. This means that the confirmation bias causes people to give more weight to information that supports their beliefs, and less weight to information that contradicts them.
  • Biased interpretation of information. This means that the confirmation bias causes people to interpret information in a way that confirms their beliefs, even if the information could be interpreted in a way that contradicts them.
  • Biased recall of information. This means that the confirmation bias causes people to remember information that supports their beliefs and to forget information that contradicts them, or to remember supporting information as having been more supporting than it really was, or to incorrectly remember contradictory information as having supported their beliefs.

Note: one closely related phenomenon is cherry picking. It involves focusing only on evidence that supports one’s stance, while ignoring evidence that contradicts it. People often engage in cherry picking due to the confirmation bias, though it’s possible to engage in cherry picking even if a person is fully aware of what they’re doing, and is unaffected by the bias.

Examples of the confirmation bias

One example of the confirmation bias is someone who searches online to supposedly check whether a belief that they have is correct, but ignores or dismisses all the sources that state that it’s wrong. Similarly, another example of the confirmation bias is someone who forms an initial impression of a person, and then interprets everything that this person does in a way that confirms this initial impression.

Furthermore, other examples of the confirmation bias appear in various domains. For instance, the confirmation bias can affect:

  • How people view political information. For example, people generally prefer to spend more time looking at information that supports their political stance and less time looking at information that contradicts it.
  • How people assess pseudoscientific beliefs. For example, people who believe in pseudoscientific theories tend to ignore information that disproves those theories.
  • How people invest money. For example, investors give more weight to information that confirms their preexisting beliefs regarding the value of certain stocks.
  • How scientists conduct research. For example, scientists often display the confirmation bias when they selectively analyze and interpret data in a way that confirms their preferred hypothesis.
  • How medical professionals diagnose patients. For example, doctors often search for new information in a selective manner that will allow them to confirm their initial diagnosis of a patient, while ignoring signs that this diagnosis could be wrong.

In addition, an example of how the confirmation bias can influence people appears in the following quote, which references the prevalent misinterpretation of evidence during witch trials in the 17th century:

“When men wish to construct or support a theory, how they torture facts into their service!” — From “Extraordinary Popular Delusions and the Madness of Crowds”

Similarly, another example of how people display the confirmation bias is the following:

“… If the new information is consonant with our beliefs, we think it is well founded and useful: ‘Just what I always said!’ But if the new information is dissonant, then we consider it biased or foolish: ‘What a dumb argument!’ So powerful is the need for consonance that when people are forced to look at disconfirming evidence, they will find a way to criticize, distort, or dismiss it so that they can maintain or even strengthen their existing belief.” — From “Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts”

Overall, examples of the confirmation bias appear in various domains. These examples illustrate the various different ways in which it can affect people, and show that this bias is highly prevalent, including among trained professionals who are often assumed to assess information in a purely rational manner.

Psychology and causes of the confirmation bias

The confirmation bias can be attributed to two main cognitive mechanisms:

  • Challenge avoidance, which is the desire to avoid finding out that you’re wrong.
  • Reinforcement seeking, which is the desire to find out that you’re right.

These forms of motivated reasoning can be attributed to people’s underlying desire to minimize their cognitive dissonance, which is the psychological distress that occurs when people hold two or more contradictory beliefs simultaneously. Challenge avoidance reduces dissonance by reducing engagement with information that contradicts preexisting beliefs. Similarly, reinforcement seeking reduces dissonance by increasing engagement with information that affirms people’s sense of correctness, even if they later encounter contradictory information.

Furthermore, the confirmation bias also occurs due to flaws in the way we test hypotheses. For example, when people try to find an explanation for a certain phenomenon, they tend to focus on only one hypothesis at a time and disregard alternatives, even in cases where they’re not emotionally incentivized to confirm their initial hypothesis. This can cause people to simply try to prove that their initial hypothesis is true, instead of actually checking whether it’s true, which leads them to ignore the possibility that the information they encounter could disprove that hypothesis or support alternative ones.

An example of this is a doctor who forms an initial diagnosis of a patient, and who then focuses solely on trying to prove that this diagnosis is right, instead of trying to actively determine whether alternative diagnoses could make more sense.

This explains why people can experience unmotivated confirmation bias in situations where they have no emotional reason to favor a specific hypothesis over others. This is contrasted with a motivated confirmation bias, which occurs when the person displaying the bias is motivated by some emotional consideration.

Finally, the confirmation bias can also be attributed to a number of additional causes. For example, in the case of the motivated confirmation bias, one additional reason why people experience the bias is that the brain sometimes suppresses neural activity in areas associated with emotional regulation and emotionally neutral reasoning. This causes people to process information based on how their emotions guide them, rather than on how their logic would.

Overall, people experience the confirmation bias primarily because they want to minimize psychological distress, and specifically due to challenge avoidance, which is the desire to avoid finding out that they’re wrong, and reinforcement seeking, which is the desire to find out that they’re right. Furthermore, people can also experience the confirmation bias due to other causes, such as the flawed way they test hypotheses, as when people fixate on confirming a single hypothesis while ignoring alternatives.

Note: Some of the behaviors that people engage in due to the confirmation bias can be viewed as a form of selective exposure. This involves people choosing to engage only with information that supports their preexisting beliefs and decisions, while ignoring information that contradicts them.

How to reduce the confirmation bias

Reducing other people’s confirmation bias.

There are various things that you can do to reduce the influence that the confirmation bias has on people. These methods generally revolve around trying to counteract the cognitive mechanisms that promote the confirmation bias in the first place.

As such, these methods generally involve trying to get people to overcome their tendency to focus on and prefer confirmatory information, or their tendency to avoid and reject challenging information, while also encouraging them to conduct a valid reasoning process.

Specifically, the following are some of the most notable techniques that you can use to reduce the confirmation bias in people:

  • Explain what the confirmation bias is, why we experience it, how it affects us, and why it can be a problem, potentially using relevant examples. Understanding this phenomenon better can motivate people to avoid it, and can help them deal with it more effectively, by helping them recognize when and how it affects them. Note that in some cases, it may be beneficial to point out the exact way in which a person is displaying the confirmation bias.
  • Make it so that the goal is to find the right answer, rather than defend an existing belief. For example, consider a situation where you’re discussing a controversial topic with someone, and you know for certain that they’re wrong. If you argue hard against them, that might cause them to get defensive and feel that they must stick by their initial stance regardless of whatever evidence you show them. Conversely, if you state that you’re just trying to figure out what the right answer is, and discuss the topic with them in a friendly manner, that can make them more open to considering the challenging evidence that you present. In this case, your goal is to frame your debate as a journey that you go on together in search of the truth, rather than a battle where you fight each other to prove the other wrong. The key here is that, when it comes to a joint journey, both of you can be “winners”, while in the case of a battle, only one of you can, and the other person will often experience the confirmation bias to avoid feeling that they were the “loser”.
  • Minimize the unpleasantness and issues associated with finding out that they’re wrong. In general, the more unpleasant and problematic being wrong is, the more a person will use the confirmation bias to stick by their initial stance. There are various ways in which you can make the experience of being wrong less unpleasant or problematic, such as by emphasizing the value of learning new things, and by avoiding mocking people for having held incorrect beliefs.
  • Encourage people to avoid letting their emotional response dictate their actions. Specifically, explain that while it’s natural to want to avoid challenges and seek reinforcement, letting these feelings dictate how you process information and make decisions is problematic. This means, for example, that if you feel that you want to avoid a certain piece of information, because it might show that you’re wrong, then you should realize this, but choose to see that information anyway.
  • Encourage people to give information sufficient consideration. When it comes to avoiding the confirmation bias, it often helps to engage with information in a deep and meaningful way, since shallow engagement can lead people to rely on biased intuitions, rather than on proper analytical reasoning. There are various things that people can do to ensure that they give information sufficient consideration , such as spending a substantial amount of time considering it, or interacting with it in an environment that has no distractions.
  • Encourage people to avoid forming a hypothesis too early. Once people have a specific hypothesis in mind, they often try to confirm it, instead of trying to formulate and test other possible hypotheses. As such, it can often help to encourage people to process as much information as possible before forming their initial hypothesis.
  • Ask people to explain their reasoning. For example, you can ask them to clearly state what their stance is, and what evidence has caused them to support that stance. This can help people identify potential issues in their reasoning, such as that their stance is unsupported.
  • Ask people to think about various reasons why their preferred hypothesis might be wrong. This can help them test their preferred hypothesis in ways that they might not otherwise, and can make them more likely to accept and internalize challenging information.
  • Ask people to think about alternative hypotheses, and why those hypotheses might be right. Similarly to asking people to think about reasons why their preferred hypothesis might be wrong, this can encourage people to engage in a proper reasoning process, which they might not do otherwise. Note that, when doing this, it is generally better to focus on a small number of alternative hypotheses, rather than a large number of them.

Different techniques will be more effective for reducing the confirmation bias in different situations, and it is generally most effective to use a combination of techniques, while taking into account relevant situational and personal factors.

Furthermore, in addition to the above techniques, which are aimed at reducing the confirmation bias in particular, there are general debiasing techniques that you can use to help people overcome it. These include, for example, getting people to slow down their reasoning process, creating favorable conditions for optimal decision-making, and standardizing the decision-making process.

Overall, to reduce the confirmation bias in others, you can use various techniques that revolve around counteracting the cognitive mechanisms that promote the bias in the first place. These include, for example, making people aware of this bias, framing discussions around finding the right answer instead of defending an existing belief, minimizing the unpleasantness associated with being wrong, encouraging people to give information sufficient consideration, and asking people to think about why their preferred hypothesis might be wrong or why competing hypotheses could be right.

Reducing your own confirmation bias

To mitigate the confirmation bias in yourself, you can use similar techniques to those that you would use to mitigate it in others. Specifically, you can do the following:

  • Identify when and how you’re likely to experience the bias.
  • Maintain awareness of the bias in relevant situations, and even actively ask yourself whether you’re experiencing it.
  • Figure out what kind of negative outcomes the bias can cause for you.
  • Focus on trying to find the right answer, rather than on proving that your initial belief was right.
  • Avoid feeling bad if you find out that you’re wrong; for example, try to focus on having learned something new that you can use in the future.
  • Don’t let your emotions dictate how you process information, particularly when it comes to seeking confirmation or avoiding challenges to your beliefs.
  • Dedicate sufficient time and mental effort when processing relevant information.
  • Avoid forming a hypothesis too early, before you’ve had a chance to analyze sufficient information.
  • Clearly outline your reasoning, for example by identifying your stance and the evidence that you’re basing it on.
  • Think of reasons why your preferred hypothesis might be wrong.
  • Come up with alternative hypotheses, as well as reasons why those hypotheses might be right.

An added benefit of many of these techniques is that they can help you understand opposing views better, which is important when it comes to explaining your own stance and communicating with others on the topic.

In addition, you can also use general debiasing techniques, such as standardizing your decision-making process and creating favorable conditions for assessing information.

Furthermore, keep in mind that, as is the case with reducing the confirmation bias in others, some techniques will be more effective than others, both in general and in particular circumstances. You should take this into account, and try to find the approach that works best for you in any given situation.

Finally, note that in some ways, debiasing yourself can be easier than debiasing others, since other people are often not as open to your debiasing attempts as you yourself are. At the same time, however, debiasing yourself is also more difficult in some ways, since we often struggle to notice our own blind spots, and to identify areas where we are affected by cognitive biases in general, and the confirmation bias in particular.

Overall, to reduce the confirmation bias in yourself, you can use similar techniques to those that you would use to reduce it in others. This includes, for example, maintaining awareness of this bias, focusing on trying to find the right answer rather than proving that you were right, dedicating sufficient time and effort to analyzing information, clearly outlining your reasoning, thinking of reasons why your preferred hypothesis might be wrong, and coming up with alternative hypotheses.

Additional information

Related cognitive biases

There are many cognitive biases that are closely associated with the confirmation bias, either because they involve a similar pattern of reasoning, or because they occur, at least partly, due to an underlying confirmation bias.

For example, there is the backfire effect, which is a cognitive bias that causes people who encounter evidence that challenges their beliefs to reject that evidence, and to strengthen their support of their original stance. This bias can, for instance, cause people to increase their support for a political candidate after they encounter negative information about that candidate, or to strengthen their belief in a scientific misconception after they encounter evidence that highlights the issues with that misconception. The backfire effect is closely associated with the confirmation bias, since it involves the rejection of challenging evidence, with the goal of confirming one’s original beliefs.

Another example of a cognitive bias that is closely related to the confirmation bias is the halo effect, which is a cognitive bias that causes people’s impression of someone or something in one domain to influence their impression of them in other domains. This bias can, for instance, cause people to assume that if someone is physically attractive, then they must also have an interesting personality, or it can cause people to give higher ratings to an essay if they believe that it was written by an attractive author. The halo effect is closely associated with the confirmation bias, since it can be attributed in some cases to people’s tendency to confirm their initial impression of someone, by forming later impressions of them in a biased manner.

The origin and history of the confirmation bias

The term ‘confirmation bias’ was first used in a 1977 paper titled “Confirmation bias in a simulated research environment: An experimental study of scientific inference”, published by Clifford R. Mynatt, Michael E. Doherty, and Ryan D. Tweney in the Quarterly Journal of Experimental Psychology (Volume 29, Issue 1, pp. 85–95). However, as the authors themselves note, evidence of the confirmation bias can be found earlier in the psychological literature.

Specifically, the following passage is the abstract of the paper that coined the term. It outlines the work presented in the paper, and also notes the existence of prior work on the topic:

“Numerous authors (e.g., Popper, 1959) argue that scientists should try to falsify rather than confirm theories. However, recent empirical work (Wason and Johnson-Laird, 1972) suggests the existence of a confirmation bias, at least on abstract problems. Using a more realistic, computer controlled environment modeled after a real research setting, subjects in this study first formulated hypotheses about the laws governing events occurring in the environment. They then chose between pairs of environments in which they could: (1) make observations which would probably confirm these hypotheses, or (2) test alternative hypotheses. Strong evidence for a confirmation bias involving failure to choose environments allowing tests of alternative hypotheses was found. However, when subjects did obtain explicit falsifying information, they used this information to reject incorrect hypotheses.”

In addition, a number of other past studies are discussed in the paper:

“Examples abound of scientists clinging to pet theories and refusing to seek alternatives in the face of large amounts of contradictory data (see Kuhn, 1970). Objective evidence, however, is scant. Wason (1968a) has conducted several experiments on inferential reasoning in which subjects were given conditional rules of the form ‘If P then Q’, where P was a statement about one side of a stimulus card and Q a statement about the other side. Four stimulus cards, corresponding to P, not-P, Q, and not-Q were provided. The subjects’ task was to indicate those cards—and only those cards—which had to be turned over in order to determine if the rule was true or false. Most subjects chose only P, or P and Q. The only cards which can falsify the rule, however, are P and not-Q. Since the not-Q card is almost never selected, the results indicate a strong tendency to seek confirmatory rather than disconfirmatory evidence. This bias for selecting confirmatory evidence has proved remarkably difficult to eradicate (see Wason and Johnson-Laird, 1972, pp. 171-201). In another set of experiments, Wason (1960, 1968b, 1971) also found evidence of failure to consider alternative hypotheses. Subjects were given the task of recovering an experimenter defined rule for generating numerical sequences. The correct rule was a very general one and, consequently, many incorrect specific rules could generate sequences which were compatible with the correct rule. Most subjects produced a few sequences based upon a single, specific rule, received positive feedback, and announced mistakenly that they had discovered the correct rule. With some notable exceptions, what subjects did not do was to generate and eliminate alternative rules in a systematic fashion. Somewhat similar results have been reported by Miller (1967). Finally, Mitroff (1974), in a large-scale non-experimental study of NASA scientists, reports that a strong confirmation bias existed among many members of this group. He cites numerous examples of these scientists’ verbalizations of their own and other scientists’ obduracy in the face of data as evidence for this conclusion.”

Summary and conclusions

  • The confirmation bias is a cognitive bias that causes people to search for, favor, interpret, and recall information in a way that confirms their preexisting beliefs.
  • The confirmation bias affects people in every area of life; for example, it can cause people to disregard negative information about a political candidate that they support, or to only pay attention to news articles that support what they already think.
  • People experience the confirmation bias due to various reasons, including challenge avoidance (the desire to avoid finding out that they’re wrong), reinforcement seeking (the desire to find out that they’re right), and flawed testing of hypotheses (e.g., fixating on a single explanation from the start).
  • To reduce the confirmation bias in yourself and in others, you can use various techniques that revolve around trying to counteract the cognitive mechanisms that promote the confirmation bias in the first place.
  • Relevant debiasing techniques you can use include maintaining awareness of this bias, focusing on trying to find the right answer rather than being proven right, dedicating sufficient time and effort to analyzing relevant information, clearly outlining the reasoning process, thinking of reasons why a preferred hypothesis might be wrong, and coming up with alternative hypotheses and reasons why those hypotheses might be right.

Other articles you may find interesting:

  • The Backfire Effect: Why Facts Don't Always Change Minds
  • Cherry Picking: When People Ignore Evidence that They Dislike
  • Belief Bias: When People Rely on Beliefs Rather Than Logic

What Is the Function of Confirmation Bias?

  • Original Research
  • Open access
  • Published: 20 April 2020
  • Volume 87, pages 1351–1376 (2022)


  • Uwe Peters


Confirmation bias is one of the most widely discussed epistemically problematic cognitions, challenging reliable belief formation and the correction of inaccurate views. Given its problematic nature, it remains unclear why the bias evolved and is still with us today. To offer an explanation, several philosophers and scientists have argued that the bias is in fact adaptive. I critically discuss three recent proposals of this kind before developing a novel alternative, what I call the ‘reality-matching account’. According to the account, confirmation bias evolved because it helps us influence people and social structures so that they come to match our beliefs about them. This can result in significant developmental and epistemic benefits for us and other people, ensuring that over time we don’t become epistemically disconnected from social reality but can navigate it more easily. While that might not be the only evolved function of confirmation bias, it is an important one that has so far been neglected in the theorizing on the bias.


In recent years, confirmation bias (or ‘myside bias’), that is, people’s tendency to search for information that supports their beliefs and ignore or distort data contradicting them (Nickerson 1998; Myers and DeWall 2015: 357), has frequently been discussed in the media, the sciences, and philosophy. The bias has, for example, been mentioned in debates on the spread of “fake news” (Stibel 2018), on the “replication crisis” in the sciences (Ball 2017; Lilienfeld 2017), the impact of cognitive diversity in philosophy (Peters 2019a; Peters et al. forthcoming; Draper and Nichols 2013; De Cruz and De Smedt 2016), the role of values in inquiry (Steel 2018; Peters 2018), and the evolution of human reasoning (Norman 2016; Mercier and Sperber 2017; Sterelny 2018; Dutilh Novaes 2018).

Confirmation bias is typically viewed as an epistemically pernicious tendency. For instance, Mercier and Sperber (2017: 215) maintain that the bias impedes the formation of well-founded beliefs, reduces people’s ability to correct their mistaken views, and makes them, when they reason on their own, “become overconfident” (Mercier 2016: 110). In the same vein, Steel (2018) holds that the bias involves an “epistemic distortion [that] consists of unjustifiably favoring supporting evidence for [one’s] belief, which can result in the belief becoming unreasonably confident or extreme” (897). Similarly, Peters (2018) writes that confirmation bias “leads to partial, and therewith for the individual less reliable, information processing” (15).

The bias is not only taken to be epistemically problematic, but also thought to be a “ubiquitous” (Nickerson 1998: 208), “built-in feature of the mind” (Haidt 2012: 105), found in both everyday and abstract reasoning tasks (Evans 1996), independently of subjects’ intelligence, cognitive ability, or motivation to avoid it (Stanovich et al. 2013; Lord et al. 1984). Given its seemingly dysfunctional character, the apparent pervasiveness of confirmation bias raises a puzzle: If the bias is indeed epistemically problematic, why is it still with us today? By definition, dysfunctional traits should be more prone to extinction than functional ones (Nickerson 1998). Might confirmation bias be or have been adaptive?

Some philosophers are optimistic, arguing that the bias in fact has significant advantages for the individual, groups, or both (Mercier and Sperber 2017; Norman 2016; Smart 2018; Peters 2018). Others are pessimistic. For instance, Dutilh Novaes (2018) maintains that confirmation bias makes subjects less able to anticipate other people’s viewpoints, and so, “given the importance of being able to appreciate one’s interlocutor’s perspective for social interaction”, is “best not seen as an adaptation” (520).

In the following, I discuss three recent proposals of the adaptationist kind, mention reservations about them, and develop a novel account of the evolution of confirmation bias that challenges a key assumption underlying current research on the bias, namely that the bias thwarts reliable belief formation and truth tracking. The account holds that while searching for information supporting one’s pre-existing beliefs and ignoring contradictory data is disadvantageous when what one takes to be reality is and stays different from what one believes it to be, it is beneficial when, as the result of one’s processing information in that way, that reality is changed so that it matches one’s beliefs. I call this process reality matching and contend that it frequently occurs when the beliefs at issue are about people and social structures (i.e., relationships between individuals, groups, and socio-political institutions). In these situations, confirmation bias is highly effective for us to be confident about our beliefs even when there is insufficient evidence or subjective motivation available to us to support them. This helps us influence and ‘mould’ people and social structures so that they fit our beliefs, which is an adaptive property of confirmation bias. It can result in significant developmental and epistemic benefits for us and other people, ensuring that over time we don’t become epistemically disconnected from social reality but can navigate it more easily.

I shall not argue that the adaptive function of confirmation bias that this reality-matching account highlights is the only evolved function of the bias. Rather, I propose that it is one important function that has so far been neglected in the theorizing on the bias.

In Sects. 1 and 2, I distinguish confirmation bias from related cognitions before briefly introducing some recent empirical evidence supporting the existence of the bias. In Sect. 3, I motivate the search for an evolutionary explanation of confirmation bias and critically discuss three recent proposals. In Sects. 4 and 5, I then develop and support the reality-matching account as an alternative.

1 Confirmation Bias and Friends

The term ‘confirmation bias’ has been used to refer to various distinct ways in which beliefs and expectations can influence the selection, retention, and evaluation of evidence (Klayman 1995; Nickerson 1998). Hahn and Harris (2014) offer a list of them including four types of cognitions: (1) hypothesis-determined information seeking and interpretation, (2) failures to pursue a falsificationist strategy in contexts of conditional reasoning, (3) a resistance to change a belief or opinion once formed, and (4) overconfidence or an illusion of validity of one’s own view.

Hahn and Harris note that while all of these cognitions have been labeled ‘confirmation bias’, (1)–(4) are also sometimes viewed as components of ‘motivated reasoning’ (or ‘wishful thinking’) (ibid: 45), i.e., information processing that leads people to arrive at the conclusions they favor (Kunda 1990). In fact, as Nickerson (1998: 176) notes, confirmation bias comes in two different flavors: “motivated” and “unmotivated” confirmation bias. And the operation of the former can be understood as motivated reasoning itself, because it too involves partial information processing to buttress a view that one wants to be true (ibid). Unmotivated confirmation bias, however, operates when people process data in one-sided, partial ways that support their predetermined views no matter whether they favor them. So confirmation bias is also importantly different from motivated reasoning, as it can take effect in the absence of a preferred view and might lead one to support even beliefs that one wants to be false (e.g., when one believes the catastrophic effects of climate change are unavoidable; Steel 2018).

Despite overlapping with motivated reasoning, confirmation bias can thus plausibly be (and typically is) construed as a distinctive cognition. It is thought to be a subject’s largely automatic and unconscious tendency to (i) seek support for her pre-existing, favored or not favored beliefs and (ii) ignore or distort information compromising them (Klayman 1995: 406; Nickerson 1998: 175; Myers and DeWall 2015: 357; Palminteri et al. 2017: 14). I here endorse this standard, functional concept of confirmation bias.

2 Is Confirmation Bias Real?

Many psychologists hold that the bias is a “pervasive” (Nickerson 1998: 175; Palminteri et al. 2017: 14), “ineradicable” feature of human reasoning (Haidt 2012: 105). Such strong claims are problematic, however. For there is evidence that, for instance, disrupting the fluency in information processing (Hernandez and Preston 2013) or priming subjects for distrust (Mayo et al. 2014) reduces the bias. Moreover, some researchers have recently re-examined the relevant studies and found that confirmation bias is in fact less common and the evidence of it less robust than often assumed (Mercier 2016; Whittlestone 2017). These researchers grant, however, the weaker claim that the bias is real and often, in some domains more than in others, operative in human cognition (Mercier 2016: 100, 108; Whittlestone 2017: 199, 207). I shall only rely on this modest view here. To motivate it a bit more, consider the following two studies.

Hall et al. (2012) gave their participants (N = 160) a questionnaire, asking them about their opinion on moral principles such as ‘Even if an action might harm the innocent, it can still be morally permissible to perform it’. After the subjects had indicated their view using a scale ranging from ‘completely disagree’ to ‘completely agree’, the experimenter performed a sleight of hand, inverting the meaning of some of the statements so that the question then read, for instance, ‘If an action might harm the innocent, then it is not morally permissible to perform it’. The answer scales, however, were not altered. So if a subject had agreed with the first claim, she then agreed with the opposite one. Surprisingly, 69% of the study participants failed to detect at least one of the changes. Moreover, they subsequently tended to justify positions they thought they held despite just having chosen the opposite. Presumably, subjects accepted that they favored a particular position, didn’t know the reasons, and so were now looking for support that would justify their position. They displayed a confirmation bias.

Using a similar experimental set-up, Trouche et al. (2016) found that subjects also tend to exhibit a selective ‘laziness’ in their critical thinking: they are more likely to avoid raising objections to their own positions than to other people’s. Trouche et al. first asked their test participants to produce arguments in response to a set of simple reasoning problems. Directly afterwards, they had them assess other subjects’ arguments concerning the same problems. About half of the participants didn’t notice that by the experimenter’s intervention, in some trials, they were in fact presented with their own arguments again; the arguments appeared to these participants as if they were someone else’s. Furthermore, more than half of the subjects who believed they were assessing someone else’s arguments now rejected those that were in fact their own, and were more likely to do so for invalid than for valid ones. This suggests that subjects are less critical of their own arguments than of other people’s, indicating that confirmation bias is real and perhaps often operative when we are considering our own claims and arguments.

3 Evolutionary Accounts of the Bias

Confirmation bias is typically taken to be epistemically problematic, as it leads to partial and therewith for the individual less reliable information processing and contributes to failures in, for instance, perspective-taking, with clear costs for social and other types of cognition (Mercier and Sperber 2017: 215; Steel 2018; Peters 2018; Dutilh Novaes 2018). Prima facie, the bias thus seems maladaptive.

But then why does it still exist? Granted, even if the bias isn’t an adaptation, we might still be able to explain why it is with us today. We might, for instance, argue that it is a “spandrel”, a by-product of the evolution of another trait that is an adaptation (Gould and Lewontin 1979). Or we may abandon the evolutionary approach to the bias altogether and hold that it emerged by chance.

However, evolutionary explanations of psychological traits are often fruitful. They can create new perspectives on these traits that may allow developing means to reduce the traits’ potential negative effects (Roberts et al. 2012; Johnson et al. 2013). Evolutionary explanations might also stimulate novel, testable predictions that researchers who aren’t evolutionarily minded would overlook (Ketelaar and Ellis 2000; De Bruine 2009). Moreover, they typically involve integrating diverse data from different disciplines (e.g., psychology, biology, anthropology etc.), and thereby contribute to the development of a more complete understanding of the traits at play and human cognition, in general (Tooby and Cosmides 2015). These points equally apply when it comes to considering the origin of confirmation bias. They provide good reasons for searching for an evolutionary account of the bias.

Different proposals can be discerned in the literature. I will discuss three recent ones, what I shall call (1) the argumentative-function account, (2) the group-cognition account, and (3) the intention–alignment account. I won’t offer conclusive arguments against them here. The aim is just to introduce some reservations about these proposals to motivate the exploration of an alternative.

3.1 The Argumentative-Function Account

Mercier and Sperber (2011, 2017) hold that human reasoning didn’t evolve for truth tracking but for making us better at convincing other people and evaluating their arguments so as to be convinced only when their points are compelling. In this context, when persuasion is paramount, the tendency to look for material supporting our preconceptions and to discount contradictory data allows us to accumulate argumentative ammunition, which strengthens our argumentative skill, Mercier and Sperber maintain. They suggest that confirmation bias thus evolved to “serve the goal of convincing others” (2011: 63).

Mercier and Sperber acknowledge that the bias also hinders us in anticipating objections, which should make it more difficult for us to develop strong, objection-resistant arguments (2017: 225f). But they add that it is much less cognitively demanding to react to objections than to anticipate them, because objections might depend on particular features of one’s opponents’ preferences or on information that only they have access to. It is thus more efficient to be ‘lazy’ in anticipating criticism and let the audience make the moves, Mercier and Sperber claim.

There is reason to be sceptical about their proposal, however. For instance, an anticipated objection is likely to be answered more convincingly than an immediate response from one’s audience. After all, “forewarned is forearmed”; it gives a tactical advantage (e.g., more time to develop a reply) (Sterelny 2018: 4). And even if it is granted that objections depend on private information, they also often derive from obvious interests and public knowledge, making an anticipation of them easy (ibid). Moreover, as Dutilh Novaes (2018: 519) notes, there is a risk of “looking daft” when producing poor arguments, say, due to laziness in scrutinizing one’s thoughts. Since individuals within their social groups depend on their reputation so as to find collaborators, anticipating one’s audience’s responses should be, and have been, more adaptive than having a confirmation bias (ibid). If human reasoning emerged for argumentative purposes, the existence of the bias remains puzzling.

3.2 The Group-Cognition Account

Even if confirmation bias is maladaptive for individuals, it might still be adaptive for groups. For instance, Smart (2018) and Peters (2018) hold that in groups with a sufficient degree of cognitive diversity at the outset of solving a particular problem, each individual’s confirmation bias might help the group as a whole conduct a more in-depth analysis of the problem space than otherwise. When each subject is biased towards a different particular proposal on how to solve the problem, the bias will push them to invest greater effort in defending their favored proposals and might, in the light of counterevidence, motivate them to consider rejecting auxiliary assumptions rather than the proposals themselves. This contributes to a thorough exploration of them that is less likely with less committed thinkers. Additionally, since individuals appear to have a particular strength in detecting flaws in others’ arguments (Trouche et al. 2016), open social criticism within the group should ensure that the group’s conclusions remain reliable even if some, or at times most, of its members are led astray by their confirmation bias (Smart 2018: 4190; Peters 2018: 20).

Mercier and Sperber (2011: 65) themselves already float the idea of such a social “division of cognitive labor”. They don’t yet take its group-level benefits to explain why confirmation bias evolved, however (Dutilh Novaes 2018: 518f). Smart (2018) and Peters (2018) also don’t introduce their views as accounts of the evolved function of the bias. But Dutilh Novaes (2018: 519) and Levy (2019: 317) gesture toward, and Smith and Wald (2019) make the case for, an evolutionary proposal along these lines, arguing that the bias was selected for making a group’s inquiry more thorough, effective, and reliable.

While I have sympathies with this proposal, several researchers have noted that the concept of ‘group selection’ is problematic (West et al. 2007; Pinker 2012). One of the issues is that since individuals reproduce faster than groups, a trait T that is an adaptation that is good for groups but bad for an individual’s fitness won’t spread, because the rate of proliferation of groups is undermined by the evolutionary disadvantage of T within groups (Pinker 2012). The point equally applies to the proposal that confirmation bias was selected for its group-level benefits.

Moreover, a group arguably only benefits from each individual’s confirmation bias if there is a diversity of viewpoints in the group and members express their views, as otherwise “group polarization” is likely to arise (Myers and Lamm 1976): arguments for shared positions will accumulate without being criticized, making the group’s average opinion more extreme and less reliable, which is maladaptive. Crucially, ancestral ‘hunter-gatherer’ groups are perhaps unlikely to have displayed a diversity of viewpoints. After all, their members traveled less, interacted less with strangers, and were less economically dependent on other groups (Simpson and Beckes 2010: 37). This should have homogenized them with respect to race, culture, and background (Schuck 2001: 1915). Even today groups often display such homogeneity, as calls for diversity in academia, companies etc. indicate. These points provide reasons to doubt that ancestral groups provided the kind of conditions in which confirmation bias could have produced the benefits that the group-cognition account highlights rather than maladaptive effects tied to group polarization.

3.3 The Intention–Alignment Account

Turning to a third and here final extant proposal on the evolution of confirmation bias, Norman (2016) argues that human reasoning evolved for facilitating an “intention alignment” between individuals: in social interactions, reasons typically ‘overwrite’ nonaligned mental states (e.g., people’s divergent intentions or beliefs) with aligned ones by showing the need for changing them. Norman holds that human reasoning was selected for this purpose because it makes cooperation easier. He adds that, in this context, “confirmation bias would have facilitated intention alignment, for a tribe of hunter-gatherers prone to [the bias] would more easily form and maintain the kind of shared outlook needed for mutualistic collaboration. The mythologies and ideologies taught to the young would accrue confirming evidence and tend to stick, thereby cementing group solidarity” (2016: 700). Norman takes his view to be supported by the “fact that confirmation bias is especially pronounced when a group’s ideological preconceptions are at stake” (ibid).

However, the proposal seems at odds with the finding that the bias inclines subjects to ignore or misconstrue their opponents’ objections. In fueling one-sided information processing to support one’s own view, the bias makes people less able to anticipate and adequately respond to their interlocutor’s point of view (Dutilh Novaes 2018: 520). Due to that effect, the bias arguably makes an intention alignment with others (especially with one’s opponents) harder, not easier. Moreover, since our ancestral groups are (as noted above) likely to have been largely homogeneous in viewpoint, in supporting intention alignment in these social environments, confirmation bias would again have facilitated group polarization, which is prima facie evolutionarily disadvantageous.

All three proposals about the adaptive role of confirmation bias considered so far thus raise questions. While the points mentioned aren’t meant to be fatal to the proposals and might be answerable within their frameworks, they do provide a motivation to explore an alternative.

4 Towards an Alternative

The key idea that I want to develop is the following. Confirmation bias is typically taken to work against an individual’s truth tracking (Mercier and Sperber 2017: 215; Peters 2018: 15), and indeed, searching for information supporting one’s beliefs and ignoring contradictory data is epistemically disadvantageous when what one takes to be reality is and stays different from what one believes it to be. However, reality doesn’t always remain unchanged when we form beliefs about it. Consider social beliefs, that is, beliefs about people (oneself, others, and groups) and social structures (i.e., relationships between individuals, groups, and socio-political institutions). I shall contend that a confirmation bias pertaining to social beliefs reinforces our confidence in these beliefs, thereby strengthening our tendency to behave in ways that change reality so that it corresponds to the beliefs, turning them (when they are initially inaccurate) into self-fulfilling prophecies (SFPs) (Merton 1948; Biggs 2009). Due to its role in helping us make social reality match our beliefs, confirmation bias is adaptive, or so I will argue. I first introduce examples of SFPs of social beliefs. Then I explore the relevance of these beliefs in our species, before making explicit the adaptive role of confirmation bias in facilitating SFPs.

4.1 Social Beliefs and SFPs

Social beliefs often lead to SFPs with beneficial outcomes. Here are four examples.

S (falsely) believes he is highly intelligent. His self-view motivates him to engage with intellectuals, read books, attend academic talks, etc. This makes him increasingly more intelligent, gradually confirming his initially inaccurate self-concept (for relevant empirical data, see Swann 2012).

Without a communicative intention, a baby boy looking at a kitten produces a certain noise: ‘ma-ma’. His mother is thrilled, believing (falsely) that he is beginning to talk and wants to call her. She responds accordingly, rushing to him, attending to him, and indicating excitement. This leads the boy to associate ‘ma-ma’ with the arrival and attention of his mother. And so he gradually begins using the sounds to call her, confirming her initially false belief about his communicative intention (for relevant empirical data, see Mameli 2001).

A father believes his adolescent daughter doesn’t regularly drink alcohol, but she does. He acts in line with his belief and expresses it in communication with other people. His daughter notices and likes his positive view of her, which motivates her to increasingly resist drinks, gradually fulfilling her father’s optimistic belief about her (for relevant empirical data, see Willard et al. 2008).

A teacher (falsely) believes that a student’s current academic performance is above average. She thus gives him challenging material, encourages him, and communicates high expectations. This leads the student to increase his efforts, which gradually results in above-average academic performance (for relevant evidence, see Madon et al. 1997).

SFPs of initially false positive trait ascriptions emerge in many other situations too. They also occurred, for instance, when adults ascribed to children traits such as being tidy (Miller et al. 1975), charitable (Jensen and Moore 1977), or cooperative (Grusec et al. 1978). Similarly, in adults, attributions of, for example, kindness (Murray et al. 1996), eco-friendliness (Cornelissen et al. 2007), military competence (Davidson and Eden 2000), athletic ability (Solomon 2016), and even physiological changes (Turnwald et al. 2018) have all had self-fulfilling effects. Moreover, these effects don’t necessarily take much time to unfold but can happen swiftly, in a single interaction (e.g., in interview settings; Word et al. 1974), right after the ascription (Turnwald et al. 2018: 49).

SFPs are, however, neither pervasive nor all-powerful (Jussim 2012), and there are various conditions for them to occur (Snyder and Klein 2007). For instance, they tend to occur only when targets are able to change in accordance with the trait ascriptions, when the latter are believable rather than unrealistic (Alfano 2013: 91f), and when the ascriber holds more power than the ascribee (Copeland 1994: 264f). But comprehensive literature reviews confirm that SFPs are “real, reliable, and occasionally quite powerful” (Jussim 2017: 8; Willard and Madon 2016).

4.2 The Distribution of Social Beliefs and Role of Prosociality in Humans

Importantly, SFPs can be pernicious when the beliefs at their center capture negative social conceptions, for instance, stereotypes, anxious expectations, fear, or hostility (Darley and Gross 1983; Downey et al. 1998; Madon et al. 2018). In these cases, SFPs would be maladaptive. Given this, what do we know about the distribution of social beliefs in general, and positive ones in particular, in ancestral human groups?

Many researchers hold that our evolutionary success as a species relies on our being “ultra-social” and “ultra-cooperative” animals (e.g., Tomasello 2014: 187; Henrich 2016). Human sociality is “spectacularly elaborate, and of profound biological importance” because “our social groups are characterized by extensive cooperation and division of labour” (Sterelny 2007: 720). Since we live in an almost continuous flow of interactions with conspecifics, “solving problems of coordination with our fellows is [one of] our most pressing ecological tasks” (Zawidzki 2008: 198). A significant amount of our beliefs are thus likely to be social ones (Tomasello 2014: 190f).

Moreover, when it comes to oneself, to group or “tribe” members, and to collaborators, these beliefs often capture positive to overly optimistic ascriptions of traits (e.g., communicativeness, skills, etc.; Simpson and Beckes 2010). This is well established for one’s beliefs about oneself (about 70% of the general population has a positive self-conception; Talaifar and Swann 2017: 4) and one’s family members (Wenger and Fowers 2008). The assumption that the point also holds for ‘tribe’ members and collaborators more generally receives support from the “tribal-instincts hypothesis” (Richerson and Boyd 2001), which holds that humans tend to harbor “ethnocentric attitudes in favor of [their] own tribe along with its members, customs, values and norms”, as this facilitates social predictability and cooperation (Kelly 2013: 507). For instance, in the past as much as today, humans “talk differently about their in-groups than their out-groups, such that they describe the in-group and its members [but not out-groups] as having broadly positive traits” (Stangor 2011: 568). In subjects with such ‘tribal instincts’, judgments about out-group members might easily be negative. But within these subjects’ groups, among in-group members, overly optimistic, cooperation-enhancing conceptions of others should have been dominant, particularly in “intergroup conflict, [which] is undeniably pervasive across human societies” (McDonald et al. 2012: 670). Indeed, such conflicts are known to fuel in-group “glorification” (Leidner et al. 2010; Golec De Zavala 2011).

Given these points, in ‘ultra-cooperative’ social environments in which ‘tribe’ members held predominantly positive social conceptions and expectations about in-group subjects, positive SFPs should have been overall more frequent and stronger than negative ones. Indeed, there is evidence that even today, positive SFPs in individual, dyadic interactions are more likely and more pronounced than negative ones. For instance, focusing on mothers’ beliefs about their sons’ alcohol consumption, Willard et al. (2008) found that children “were more susceptible to their mothers’ positive than negative self-fulfilling effects” (499): “mothers’ false beliefs buffered their adolescents against increased alcohol use rather than putting them at greater risk” (Willard and Madon 2016: 133). Similarly, studies found that “teachers’ false beliefs raised students’ achievement more than they lowered it” (Willard and Madon 2016: 118): teacher overestimates “increase[d] achievement more than teacher underestimates tended to decrease achievement among students” (Madon et al. 1997: 806). Experiments with stigmatized subjects corroborate these results further (ibid), leading Jussim (2017) to conclude in his literature review that high teacher expectations help students “more than low expectations harm achievement” (8).

One common explanation of this asymmetry is that SFPs typically depend on whether the targets of the trait ascriptions involved accept the expectations imposed on them via the ascriptions (Snyder and Klein 2007). And since subjects tend to strive to think well of themselves (Talaifar and Swann 2017), they respond more to positive than to negative expectations (Madon et al. 1997: 792). If we combine these considerations with the assumption that in ancestral groups of heavily interdependent subjects, positive social beliefs about in-group members (in-group favoritism) are likely to have been more prevalent than negative ones, then there is reason to hold that the SFPs of the social conceptions in the groups at issue were more often than not adaptive. With these points in mind, it is time to return to confirmation bias.

4.3 From SFPs to Confirmation Bias

Notice that SFPs depend on trait or mental-state ascriptions that are ‘ahead’ of their own truth: they are formed when an objective assessment of the available evidence doesn’t yet support their truth. Assuming direct doxastic voluntarism is false (Matheson and Vitz 2014), how can they nonetheless be formed and confidently maintained?

I suggest that confirmation bias plays an important role: it allows subjects to become and remain convinced about their social beliefs (e.g., trait ascriptions) when the available evidence doesn’t yet support their truth. This makes SFPs of these beliefs more likely than if the ascriber merely verbally attributed the traits without committing to the truth of the ascriptions, or believed in them but readily revised the beliefs. I shall argue that this is in fact adaptive not only when it comes to positive trait ascriptions, but also to negative ones. I will illustrate the point first with respect to positive trait ascriptions.

4.3.1 Motivated Confirmation Bias and Positive Trait Ascriptions

Suppose that you ascribe a positive property T to a subject A, who is your ward, but (unbeknownst to you) the available evidence doesn’t yet fully support that ascription. The more convinced you are about your view of A even in the light of counterevidence, the better you are at conveying your conviction to A because, generally, “people are more influenced [by others] when [these] others express judgments with high confidence than low confidence” (Kappes et al. 2020: 1; von Hippel and Trivers 2011). Additionally, the better you are at conveying to A your conviction that he has T, the more confident he himself will be that he has that trait (assuming he trusts you) (Sniezek and Van Swol 2001). Crucially, if A too is confident that he has T, he will be more likely to conform to the corresponding expectations than if he doesn’t believe the ascription, say, because he notices that you only say but don’t believe that he has T. Relatedly, the more convinced you are about your trait ascription to A, the clearer your signaling of the corresponding expectations to A in your behavior (Tormala 2016) and the higher the normative impetus on him, as a cooperative subject, to conform so as to avoid disrupting interactions with you.

Returning to confirmation bias, given what we know about the cognitive effect of the bias, the more affected you are by it, the stronger your belief in your trait ascriptions to A (Rabin and Schrag 1999), and so the lower the likelihood that you will reveal in your behavior a lack of conviction about them that could undermine SFPs. Thus, the more affected you are by the bias, the higher the likelihood of SFPs of the ascriptions, because conviction about the ascriptions plays a key facilitative role for SFPs. This is also supported experimentally: several studies found that SFPs of trait ascriptions occurred only when ascribers were certain of the ascriptions, not when they were less confident (Swann and Ely 1984; Pelham and Swann 1994; Swann 2012: 30). If we add that, in developmental and educational contexts in ancestral tribal groups, SFPs of trait ascriptions were more often beneficial for the targets than not, then there is a basis for holding that confirmation bias might in fact have been selected for sustaining SFPs.
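The claim that the bias strengthens belief in one’s ascriptions can be illustrated with a minimal simulation in the spirit of Rabin and Schrag’s (1999) model of biased belief updating (this is my own toy sketch, not their actual model; the signal reliability of 0.75 and the misreading probability are purely illustrative assumptions). An agent who sometimes misreads contradicting evidence as confirming becomes nearly certain of her initial hypothesis even when the evidence is in fact evenly split:

```python
import random

def final_belief(misread_prob, n_signals=200, seed=0):
    """Posterior probability that hypothesis H is true for an agent who,
    with probability `misread_prob`, misreads a contradicting signal as a
    confirming one. The world is evenly split: signals genuinely support H
    only half the time, and each perceived signal is treated as 75% reliable
    in a standard Bayesian update."""
    rng = random.Random(seed)
    p = 0.5            # prior credence in H
    reliability = 0.75  # assumed diagnosticity of each perceived signal
    for _ in range(n_signals):
        supports = rng.random() < 0.5  # actual evidence: 50/50
        if not supports and rng.random() < misread_prob:
            supports = True            # biased (mis)perception of counterevidence
        if supports:
            p = p * reliability / (p * reliability + (1 - p) * (1 - reliability))
        else:
            p = p * (1 - reliability) / (p * (1 - reliability) + (1 - p) * reliability)
    return p

def share_confident(misread_prob, runs=200):
    """Fraction of simulated agents who end up highly confident in H."""
    return sum(final_belief(misread_prob, seed=s) > 0.9 for s in range(runs)) / runs
```

Without misreading, an agent’s confidence drifts randomly, ending up for H no more often than against it; with misreading, virtually every simulated agent ends up convinced of her initial hypothesis.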

Notice that the argument so far equally applies to motivated reasoning. This is to be expected because, as mentioned above, motivated confirmation bias is an instance of motivated reasoning (Nickerson 1998). To pertain specifically to confirmation bias, however, the evolutionary proposal that the bias was selected for facilitating SFPs of social conceptions also has to hold for unmotivated confirmation bias. Is this the case?

4.3.2 Unmotivated Confirmation Bias and Negative Trait Ascriptions

Notice that when we automatically reinforce any of our views, no matter whether we favor them, our preferences are neither required for nor able to undermine the reinforcement process and the SFPs it promotes. This means that such a general tendency, i.e., a confirmation bias, can fulfil the function of facilitating SFPs more frequently than motivated cognition can, namely whenever the subject has acquired a social conception (e.g., as the result of upbringing, learning, or testimony). This is adaptive for at least three reasons.

First, suppose that as a parent, caretaker, or teacher, you (unknowingly) wishfully believe that A, who is your ward, has a positive trait T. You tell another subject (B) that A has T, and, on your testimony, B subsequently believes this too. But suppose that unlike you, B has no preference as to whether A has T. Yet, as it happens, she still has a confirmation bias toward her beliefs. Just like you, B will now process information so that it strengthens her view about A. This increases her conviction in, and so the probability of an SFP of, the trait ascription to A, because now both you and B are more likely to act toward A in ways indicating ascription-related expectations. As a general tendency to support any of one’s beliefs rather than only favored ones, the bias thus enables a social ‘ripple’ effect in the process of making trait ascriptions match reality. Since this process is in ultra-social and ultra-cooperative groups more often than not adaptive (e.g., boosting the development of a positive trait in A), in facilitating a social extension of it, confirmation bias is adaptive too.

Secondly, in ancestral groups, many of the social conceptions (e.g., beliefs about social roles, gender norms, stereotypes, etc.) that subjects unreflectively acquired during their upbringing and socialization will have been geared toward preserving the group’s function and status quo and aligning individuals with them (Sterelny 2006: 148). Since it can operate independently of a subject’s preferences, a confirmation bias in each member of the group would have helped the group enlist each of its members for reproducing social identities, social structures, traits, and roles in the image of the group’s conceptions even when these individuals disfavored them. In sustaining SFPs of these conceptions, which might have included various stereotypes or ethnocentric, prejudicial attitudes that we today consider offensive negative trait ascriptions (e.g., gender or racist stereotypes) (Whitaker et al. 2018), confirmation bias would have been adaptive in the past. For, as Richerson and Boyd (2005: 121f) note too, in ancestral groups, selection pressure favored social conformity, predictability, and stability. That confirmation bias might have evolved for facilitating SFPs that serve the ‘tribal’ collective, possibly even against the preference, autonomy, and better judgment of the individual, is in line with recent research suggesting that many uniquely human features of cognition evolved through pressures selecting for the ability to conform to other people and to facilitate social projects (Henrich 2016). It is thought that these features may work against common ideals associated with self-reliance or “achieving basic personal autonomy, because the main purpose of [them] is to allow us to fluidly mesh with others, making us effective nodes in larger networks” (Kelly and Hoburg 2017: 10).
I suggest that confirmation bias too was selected for making us effective ‘nodes’ in social networks by inclining us to create social reality that corresponds to these networks’ conceptions even when we dislike them or they are harmful to others (e.g., out-group members).

Thirdly, in helping us make social affairs match our beliefs about them even when we don’t favor them, confirmation bias also provides us with significant epistemic benefits in social cognition. Consider Jack and Jill. Both have just seen an agent A act ambiguously, and both have formed a first impression of A according to which A is acting the way he is because he has trait T. Suppose neither Jack nor Jill has any preference as to whether A has that trait, but they subsequently process information in two different ways. Jack does not have a confirmation bias but impartially assesses the evidence and swiftly revises his beliefs when encountering contradictory data. As it happens, A’s behavior soon does provide him with just such evidence, leading him to abandon his first impression of A and reopen the search for an explanation of A’s action. In contrast, Jill does have a confirmation bias with respect to her beliefs and interprets the available evidence so that it supports her beliefs. Jill too sees A act in a way that contradicts her first impression of him. But unlike Jack, she doesn’t abandon her view. Rather, she reinterprets A’s action so that it bolsters her view. Whose information processing might be more adaptive? For Jack, encountering data challenging his view removes certainty and initiates a new cycle of computations about A, which requires him to postpone a possible collaboration with A. For Jill, however, the new evidence strengthens her view, leading her to keep the issue of explaining A’s action settled and be ready to collaborate with him. Jack’s approach might still seem better for attaining an accurate view of A and predicting what he’ll do next. But suppose Jill confidently signals to A her view of him in her behavior. Since people have a general inclination to fulfil others’ expectations (especially positive ones) out of an interest in coordinating and getting along with them (Dardenne and Leyens 1995; Bacharach et al. 2007), when A notices Jill’s conviction that he displays T, he too is likely to conform, which provides Jill with a correct view of what he will do next. Jill’s biased processing is thus more adaptive than Jack’s approach: a confirmation bias provides her with certainty and simpler information processing that simultaneously facilitates accurate predictions (via contributing to SFPs). Generalizing from Jill: in everyday social interactions, we all form swift first impressions of others without having any particular preference with respect to these impressions either way. Assuming that confirmation bias operates on them nonetheless, the bias will frequently be adaptive in the ways just mentioned.
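The contrast between Jack’s and Jill’s strategies can be rendered as a toy simulation (a sketch under assumed parameters: the conformity probability of 0.8 and the coin-flip model of ambiguous behavior are illustrative, not empirical estimates). Jill, who confidently signals a standing expectation that the target then tends to fulfil, predicts the target’s behavior far better than Jack, whose impartial revisions track behavior that remains ambiguous:

```python
import random

def prediction_accuracy(strategy, rounds=1000, conform_prob=0.8, seed=1):
    """Fraction of correct predictions of target A's next action.
    'jill' keeps her first impression (A has trait T) and confidently
    signals the matching expectation, which A fulfils with probability
    `conform_prob`; 'jack' holds no settled view and predicts by copying
    A's most recent action, while A's unsignaled behavior stays ambiguous."""
    rng = random.Random(seed)
    correct = 0
    last_action = rng.random() < 0.5  # A's first, ambiguous action
    for _ in range(rounds):
        if strategy == "jill":
            prediction = True                     # her standing view: A will display T
            action = rng.random() < conform_prob  # A tends to fulfil the expectation
        else:
            prediction = last_action              # revise to whatever A did last
            action = rng.random() < 0.5           # unsignaled behavior remains ambiguous
        correct += prediction == action
        last_action = action
    return correct / rounds
```

On this toy model, Jill’s accuracy approaches the conformity rate, while Jack’s hovers around chance; her advantage comes entirely from the self-fulfilling effect of her confidently signaled expectation, not from better evidence processing.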

4.3.3 Summing Up: The Reality-Matching Account

By helping subjects make social reality match their beliefs about it, regardless of whether they favor these beliefs or whether the beliefs are sufficiently supported by evidence, confirmation bias is adaptive: when the bias targets positive social beliefs and trait ascriptions, it serves both the subject and the group by producing effects that (1) assist them in their development (to become, e.g., more communicative, cooperative, or knowledgeable) and (2) make social cognition more tractable (by increasing social conformity and predictability). To be sure, when it targets negative trait ascriptions (pernicious stereotypes, etc.), the bias can have ethically problematic SFP effects. But, as noted, especially in ancestral ‘tribal’ groups, it would perhaps still have contributed to social conformity, predictability, and sustaining the status quo, which would have been adaptive in these groups (Richerson and Boyd 2005), inter alia by facilitating social cognition. Taken together, these considerations provide a basis for holding that confirmation bias was selected for promoting SFPs. I shall call the proposal introduced in this section the reality-matching (RM) account of the function of confirmation bias.

5 Supporting the RM Account

Before offering empirical support for the RM account and highlighting its explanatory benefits, it is useful to disarm an objection: if confirmation bias was selected for its SFP-related effects, then people should not also display the bias with respect to beliefs that can’t produce SFPs (e.g., beliefs about physics, climate change, religion, etc.). But they do (Nickerson 1998).

5.1 From Social to Non-social Beliefs

In response to the objection just mentioned, two points should be noted. First, the RM account is compatible with the view that confirmation bias was also selected for adaptive effects related to non-social beliefs. It only claims that facilitating the alignment of social reality with social beliefs (i.e., reality matching) is one important, so far neglected, adaptive feature for which the bias was selected.

Second, it doesn’t follow from the fact that confirmation bias also affects beliefs that can’t initiate SFPs that it could not have been selected for affecting beliefs that can and do initiate them. The literature offers many examples of biological features or cognitive traits that were selected for fulfilling a certain function despite rarely doing so or even having maladaptive effects (Millikan 1984; Haselton and Nettle 2006). Consider the “baby-face overgeneralization” bias (Zebrowitz and Montepare 2008). Studies suggest that people have a strong readiness to respond favorably to babies’ distinctive facial features. And this tendency is overgeneralized such that even adults are viewed more favorably and treated as likeable (but also as physically weak and naïve) when they display babyface features. While this overgeneralization tendency often leads to errors, it is thought to have evolved because failures to respond favorably to babies (i.e., false negatives) are evolutionarily more costly than overgeneralizing (i.e., false positives) (ibid).

Might our domain-general tendency to confirm our own beliefs be similarly less evolutionarily costly than not having such a general tendency? It is not implausible to assume so because, as noted, we are ultra-social and ultra-cooperative, and our beliefs about people’s social standing, knowledge, intentions, abilities, etc. are critical for our flourishing (Sterelny 2007: 720; Tomasello 2014: 190f; Henrich 2016). Importantly, these beliefs, unlike beliefs about the non-social world, are able to and frequently do initiate SFPs contributing to the outlined evolutionary benefits. This matters because if social beliefs are pervasive and SFPs of them significant for our flourishing, then a domain-general tendency to confirm any of our beliefs ensures that we don’t miss opportunities to align social reality with our conceptions and to reap the related developmental and epistemic benefits. Granted, this tendency overgeneralizes, which creates clear costs. But given the special role of social beliefs in our species and our dependence on social learning and social cognition, which are facilitated by SFPs, it is worth taking seriously the possibility that these benefits can often outweigh the costs.
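The underlying error-management logic can be made explicit with a one-line expected-payoff comparison (a sketch; the parameter names and all numbers are mine and purely illustrative, not estimates from the literature): a domain-general bias pays off when social beliefs are common enough, and their SFP-related benefits large enough, relative to the cost of distorting non-social beliefs.

```python
def net_benefit(social_share, sfp_benefit, nonsocial_cost):
    """Expected net payoff of a domain-general confirmation bias:
    the fraction of beliefs that are social (`social_share`) yields
    SFP-related benefits, while the bias's spill-over to the remaining
    non-social beliefs incurs a cost. All values are illustrative."""
    return social_share * sfp_benefit - (1 - social_share) * nonsocial_cost

# If most beliefs are social and SFP benefits are large, the bias pays off
# despite overgeneralizing; if few beliefs are social, it does not.
favorable = net_benefit(0.7, 10.0, 4.0)    # positive: bias is adaptive
unfavorable = net_benefit(0.2, 10.0, 4.0)  # negative: bias is maladaptive
```

The same structure underlies the baby-face case above: selection tolerates frequent small errors (overgeneralization) when the alternative error type is rarer but costlier.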

While this thought doesn’t yet show that the RM account is correct, it does help disarm the above objection. For it explains why the fact that confirmation bias also affects beliefs that cannot initiate SFPs doesn’t disprove the view that the bias was selected for reality matching: the special role of social beliefs in our species (compared to other species) lends plausibility to the assumption that the costs of the bias’ overgeneralizing might be lower than the costs of its failing to generalize. I now turn to the positive support for the RM account.

5.2 Empirical Data

If, as the RM account proposes, confirmation bias was selected for facilitating the process of making reality match our beliefs, then the bias should be common and pronounced when (1) it comes to social beliefs, that is, beliefs (a) about oneself, (b) about other people, and (c) about social structures that the subject can determine, and when (2) social conditions are conducive to reality matching. While there are no systematic comparative studies on whether the bias is more frequent or stronger with respect to some beliefs but not others (e.g., social vs. non-social beliefs), there is related empirical research that does provide some support for these predictions.

Self-Related Beliefs

In a number of studies, Swann and colleagues (Swann 1983; Swann et al. 1992; for an overview, see Swann 2012) found that the selective information processing characteristic of confirmation bias is “especially pronounced with regards to self-concepts” and so self-related beliefs (Müller-Pinzler et al. 2019: 9). Interestingly, and counterintuitively, the data show that “just as people with positive self-views preferentially seek positive evaluations, those with negative self-views preferentially seek negative evaluations” (Talaifar and Swann 2017: 3). For instance, those “who see themselves as likable seek out and embrace others who evaluate them positively, whereas those who see themselves as dislikeable seek out and embrace others who evaluate them negatively” (ibid). Much in line with the RM account, Swann (2012) notes that this confirmatory tendency “would have been advantageous” in “hunter-gatherer groups”: once “people used input from the social environment to form self-views, self-verification strivings would have stabilized their identities and behavior, which in turn would make each individual more predictable to other group members” (26).

Similarly, in a study in which subjects received feedback about aspects of their self that can be relatively easily changed (e.g., their ability to estimate the weights of animals), Müller-Pinzler et al. (2019) found that “prior beliefs about the self modulate self-related belief-formation” in that subjects updated their performance estimates “in line with a confirmation bias”: individuals with prior negative self-related beliefs (e.g., low self-esteem) showed increased biases towards factoring in negative (vs. positive) feedback, and, interestingly, this tendency was “modulated by the social context and only present when participants were exposed to a potentially judging audience” (ibid: 9–10). This coheres with the view that confirmation bias might serve the ‘collective’ to bring subjects into accordance with its social conceptions (positive or negative).

Other-Related Beliefs

If confirmation bias was selected for sustaining social beliefs for the sake of reality matching, then the bias should also be particularly pronounced when it comes to beliefs about other people, especially in situations conducive to reality matching. For instance, powerful individuals have been found to be more likely to prompt subordinates to behaviorally confirm their social conceptions than relatively powerless subjects (Copeland 1994; Leyens et al. 1999). That is, interactions between powerful and powerless individuals are conducive to reality matching of the powerful individuals’ social beliefs. According to the RM account, powerful individuals should therefore display a stronger confirmation bias with respect to the relevant social beliefs. Goodwin et al. (2000) found just that: powerful people, in particular, tend to fail to take into account data that may contradict their social beliefs (capturing, e.g., stereotypes) about subordinates and attend more closely to information that supports their expectations. Relative to the powerless, powerful people displayed a stronger confirmation bias in their thinking about subordinates (ibid: 239f).

Similarly, if confirmation bias serves to facilitate social interaction by contributing to a match between beliefs and social reality, then the bias should be increased with respect to trait attributions to other people in subjects who care about social interactions compared to other subjects. Dardenne and Leyens (1995) reasoned that when testing a hypothesis about the personality of another individual (e.g., their being introverted or extroverted), a preference for questions that match the hypothesis (e.g., that the subject is introverted) indicates social skill, conveying a feeling of being understood to the individual and contributing to a smooth conversation. Socially skilled people (‘high self-monitors’) should thus prefer ‘matching questions’. In an interview setting, for instance, when testing the introvert hypothesis, an interviewer could ask questions that a typical introvert would answer with ‘yes’ (e.g., ‘Do you like to stay alone?’), confirming the presence of the hypothesized trait (ibid). Dardenne and Leyens did find that matching questions pertaining to an introvert or an extrovert hypothesis were selected most by high self-monitors: socially skilled subjects displayed a stronger confirmatory tendency than less socially skilled subjects (ibid).

Finally, there is also evidence that confirmation bias is more pronounced with respect to social beliefs than with respect to non-social beliefs. For instance, Marsh and Hanlon (2007) gave one group of behavioral ecologists a specific set of expectations with respect to sex differences in salamander behavior, while a second group was given the opposite set of expectations. In one experiment, subjects collected data on variable sets of live salamanders, while in the other experiment, observers collected data from identical videotaped trials. Across experiments and observed behaviors, the expectations of the observers biased their observations “only to a small or moderate degree”, Marsh and Hanlon note, concluding that these “results are largely optimistic with respect to confirmation bias in behavioral ecology” (2007: 1089). This insignificant confirmation bias with respect to beliefs about non-social matters contrasts with findings of a significant confirmation bias with respect to beliefs about people (Talaifar and Swann 2017; Goodwin et al. 2000; Marks and Fraley 2006; Darley and Gross 1983) and, as I shall argue now, about social affairs whose reality the subject can determine.

Non-personal, Social Beliefs

One important kind of social belief is political belief, which concerns social states of affairs pertaining to politics. Political beliefs are especially interesting in the context of the RM account because they are very closely related to reality matching. This is not only because subjects can often directly influence political affairs via voting, running as a candidate, campaigning, etc. It is also because subjects who are highly confident about their political beliefs are more likely to be able to convince other people of them too (Kappes et al. 2020). And the more widespread a political conviction in a population, the higher the probability that the population will adopt political structures that shape reality in line with it (Jost et al. 2003; Ordabayeva and Fernandes 2018).

If, as the RM account proposes, confirmation bias was selected for sustaining social beliefs for the sake of reality matching, then the bias should be particularly strong when it comes to beliefs about political states of affairs. And indeed, Taber and Lodge (2006) did find that “motivated [confirmation] biases come to the fore in the processing of political arguments”, in particular, and, crucially, that subjects “with weak […] [political] attitudes show less [confirmation] bias in processing political arguments” (767). In fact, in psychology, attitude strength, especially in politically relevant domains of thinking, has long been and still is widely accepted to increase the kind of selective exposure constitutive of confirmation bias (Knobloch-Westerwick et al. 2015: 173). For instance, Brannon et al. (2007) found that stronger, more extreme political attitudes are correlated with higher ratings of interest in attitude-consistent versus attitude-discrepant political articles. Similarly, Knobloch-Westerwick et al. (2015) found that people online who attach high importance to particular political topics spent more time on attitude-consistent messages than users who attached low importance to the topics, and that “[a]ttitude-consistent messages […] were preferred”, reinforcing the attitudes further (171). While this can contribute to political group polarization, such polarization also boosts the group-wide reality-matching endeavour and can thus be adaptive itself (Johnson and Fowler 2011: 317).

In short, then, while there are currently no systematic comparative studies on whether confirmation bias is more frequent or stronger with respect to social beliefs, related empirical studies do suggest that when it comes to (positive or negative) social beliefs about oneself, other people, and social states of affairs that the subject can determine (e.g., political beliefs), confirmation bias is both particularly common and pronounced. Empirical data thus corroborate some of the predictions of the RM account.

5.3 Explanatory Benefits

The theoretical and empirical considerations from the preceding sections offer support for the RM account. Before concluding, it is worth mentioning three further reasons for taking the account seriously. First, it has greater explanatory power than the three alternative views outlined above. Second, it is consistent with, and provides new contributions to, different areas of evolutionary theorizing on human cognition. Third, it casts new light on the epistemic character of confirmation bias. I’ll now support these three points.

For instance, the argumentative-function account holds that confirmation bias is adaptive in making us better arguers. This was problematic because the bias hinders us in anticipating people’s objections, which weakens our argumentative skill and increases the risk of our appearing incompetent in argumentative exchanges. The RM account avoids these problems: if confirmation bias was selected for reinforcing our preconceptions about people to promote SFPs, then, since in one’s own reasoning one only needs to justify one’s beliefs to oneself, the first point one finds acceptable will suffice. To convince others, one would perhaps need to anticipate objections. But if the bias functions primarily to boost one’s own conviction about particular beliefs so as to facilitate SFPs, then ‘laziness’ in critical thinking about one’s own positions (Trouche et al. 2016) shouldn’t be surprising.

Turning to the group-cognition account, the proposal was that confirmation bias is adaptive in and was selected for making group-level inquiries more thorough, reliable, and efficient. In response, I noted that the concept of ‘group selection’ is problematic when it comes to traits threatening an individual’s fitness (West et al. 2007; Pinker 2012), and that confirmation bias would arguably only lead to the group-level benefits at issue in groups with viewpoint diversity. Yet it is doubtful that ancestral groups met this condition. The RM account is preferable to the group-cognition view because it doesn’t rely on a notion of group selection but concerns primarily individual-level benefits, and it doesn’t tie the adaptive effects of the bias to conditions of viewpoint diversity. It proposes instead that the adaptive SFP-related effects of the bias increase individuals’ fitness (e.g., by facilitating their navigation of the social world, aligning them and others with their group’s conceptions, etc.) and can emerge whenever people hold beliefs about each other, interact, and fulfill social expectations. This condition is satisfied even in groups with viewpoint homogeneity.

The RM account also differs from the intention–alignment view, which holds that confirmation bias evolved for allowing us to synchronize intentions with others. One problem with this view was that the bias seems to hinder an intention alignment of individuals by weakening their perspective-taking capacity, and inclining them to ignore or distort people’s objections. The RM account avoids this problem because it suggests that by disregarding objections or counterevidence to one’s beliefs, one can remain convinced about them, which helps align social reality (not only, e.g., people’s intentions) with them, producing the adaptive outcomes outlined above. The account can also explain why confirmation bias is particularly strong in groups in which shared ideologies are at stake (Taber and Lodge 2006; Gerken 2019). For subjects have a keen interest in reality corresponding to their ideological conceptions. Since the latter are shaping social reality via their impact on behavior and are more effective in doing so the more convinced people are about them (Kappes et al. 2020), it is to be expected that when it comes to ideological propositions in like-minded groups, confirmation bias is more pronounced. And, as noted, the resulting group polarization itself can then be adaptive in strengthening the reality-matching process.

Moving beyond extant work on the evolution of confirmation bias, the RM account also contributes to and raises new questions for other areas of research in different disciplines. It yields, for instance, predictions that psychologists can experimentally explore in comparative studies, such as the prediction that confirmation bias is more common and stronger when targeting social versus non-social beliefs, or when conditions are conducive to reality matching as opposed to when they are not. The account also adds a new perspective to research on SFPs and on how social conceptions interact with their targets (Hacking 1995; Snyder and Klein 2007; Jussim 2017). Relatedly, the RM account contributes to recent philosophical work on folk psychology, i.e., our ability to ascribe mental states to agents to make sense of their behavior. In that work, some philosophers argue that folk psychology serves “mindshaping”, that is, the moulding of people’s behavior and minds so that they fit our conceptions, making people more predictable and cooperation with them easier (Mameli 2001; Zawidzki 2013; Peters 2019b). There are clear connections between the mindshaping view of folk psychology and the RM account, but also important differences. For instance, the RM account pertains to the function of confirmation bias, not folk psychology. Moreover, advocates of the mindshaping view have so far left unexplored the conditions for effective mindshaping via folk-psychological ascriptions and the possible role of confirmation bias in it. The RM account begins to fill this gap in the research and in doing so adds to work on the question of how epistemic (or ‘mindreading’) and non-epistemic (or ‘mindshaping’, e.g., motivational) processes are related in folk psychology (Peters 2019b: 545f; Westra 2020; Fernández-Castro and Martínez-Manrique 2020).

In addition to offering contributions to a range of different areas of research, the RM account also casts new light on the epistemic character of confirmation bias. Capturing the currently common view on the matter, Mercier (2016) writes that “piling up reasons that support our preconceived views is not the best way to correct them. […] [It] stop[s] people from fixing mistaken beliefs” (110). The RM account offers a different perspective, suggesting that when it is directed at beliefs about social affairs, confirmation bias does often help subjects correct their mistaken conceptions to the extent that it contributes to SFPs of them. Similarly, Dutilh Novaes (2018) holds that the bias involves or contributes to a failure of perspective taking, and so, “given the importance of being able to appreciate one’s interlocutor’s perspective for social interaction”, is “best not seen as an adaptation” (520). The RM account, on the other hand, proposes that the bias often facilitates social understanding: in making us less sensitive to our interlocutor’s opposing perspective, it helps us remain confident about our social beliefs, which increases the probability of SFPs that in turn make people more predictable and mindreadable.

6 Conclusion

After outlining limitations of three recent proposals on the evolution of confirmation bias, I developed and supported a novel alternative, the reality-matching (RM) account, which holds that one of the adaptive features for which the bias evolved is that it helps us bring social reality into alignment with our beliefs. When the bias targets positive social beliefs, this serves both the subject and the group, assisting them in their development (to become, e.g., more communicative or knowledgeable) while also making their social cognition more effective and tractable. When it targets negative social beliefs, in promoting reality matching, the bias might contribute to ethically problematic outcomes, but it can then still support social conformity and predictability, which were perhaps adaptive especially in ancestral tribal groups. While the socially constructive aspect of confirmation bias highlighted here may not be the main or only feature of the bias that led to its evolution, it is one that has so far been overlooked in evolutionary theorizing on confirmation bias. If we attend to it, an account of the function of confirmation bias becomes available that coheres with data from across the psychological sciences, avoids many of the shortcomings of competitor views, and has explanatory benefits that help advance research on the function, nature, and epistemic character of the bias.

Mercier and Sperber (2017) and others prefer the term ‘myside bias’ to ‘confirmation bias’ because people don’t have a general tendency to confirm any hypothesis that comes to their mind but only ones that are on ‘their side’ of a debate. I shall here use the term ‘confirmation bias’ because it is more common and in any case typically understood in the way just mentioned.

Researchers working on folk psychology might be reminded of the ‘mindshaping’ view of folk psychology (Mameli 2001; Zawidzki 2013). I will come back to this view and demarcate it from my account of confirmation bias in Sect. 5.

It might be proposed that when participants in the experiment seek reasons for their judgments, perhaps they take themselves already to have formed the judgments for good reasons and then wonder what these reasons might have been. Why would they seek reasons against a view that they have formed (by their own lights) for good reasons? However, we might equally well ask why they would take themselves to have formed a judgment for good reasons in the first place when they don’t know any of those reasons. If there is a general default tendency to assume that any view one holds rests on good reasons, then that would again suggest the presence of a confirmation bias. For a general tendency to think that one’s views rest on good reasons even when one doesn’t know them is a tendency to favor and confirm these views while resisting balanced scrutiny of their basis.

SFPs can also accumulate when they occur across different interactions, and in contemporary societies, overall accumulative SFP effects of negative social beliefs capturing, e.g., stereotypes might be stronger than those of positive social beliefs in individual dyadic interactions (Madon et al. 2018). However, in ancestral, ‘tribal’ groups of highly interdependent subjects, even accumulative SFPs of, e.g., stereotypes would perhaps still have contributed to conformity and social stability. I shall return to the possible SFP-related benefits of nowadays highly negative social conceptions, i.e., stereotypes, ethnocentrism etc. below.

Relatedly, neuroscientific data show that a positive view of one’s own traits tends to correlate with a reduced activation of the right inferior prefrontal gyrus, the area of the brain processing self-related content, when the subject receives negative self-related information (Sharot et al. 2011). That is, optimists about themselves display a diminished sensitivity to negative information that is in tension with self-related trait optimism (ibid).

Alfano, M. (2013). Character as moral fiction . Cambridge: CUP.

Bacharach, M., Guerra, G., & Zizzo, D. J. (2007). The self-fulfilling property of trust: An experimental study. Theory and Decision, 63, 349–388.

Ball, P. (2017). The trouble with scientists. How one psychologist is tackling human biases in science. Nautilus . Retrieved May 2, 2019 from http://nautil.us/issue/54/the-unspoken/the-trouble-with-scientists-rp .

Biggs, M. (2009). Self-fulfilling prophecies. In P. Bearman & P. Hedstrom (Eds.), The Oxford handbook of analytical sociology (pp. 294–314). Oxford: OUP.

Brannon, L. A., Tagler, M. J., & Eagly, A. H. (2007). The moderating role of attitude strength in selective exposure to information. Journal of Experimental Social Psychology, 43, 611–617.

Copeland, J. (1994). Prophecies of power: Motivational implications of social power for behavioral confirmation. Journal of Personality and Social Psychology, 67, 264–277.

Cornelissen, G., Dewitte, S., & Warlop, L. (2007). Whatever people say I am that’s what I am: Social labeling as a social marketing tool. International Journal of Research in Marketing, 24 (4), 278–288.

Dardenne, B., & Leyens, J. (1995). Confirmation bias as a social skill. Personality and Social Psychology Bulletin, 21 (11), 1229–1239.

Darley, J. M., & Gross, P. H. (1983). A hypothesis-confirming bias in labeling effects. Journal of Personality and Social Psychology, 44, 20–33.

Davidson, O. B., & Eden, D. (2000). Remedial self-fulfilling prophecy: Two field experiments to prevent Golem effects among disadvantaged women. Journal of Applied Psychology, 85 (3), 386–398.

De Bruine, L. M. (2009). Beyond ‘just-so stories’: How evolutionary theories led to predictions that non-evolution-minded researchers would never dream of. Psychologist, 22 (11), 930–933.

De Cruz, H., & De Smedt, J. (2016). How do philosophers evaluate natural theological arguments? An experimental philosophical investigation. In H. De Cruz & R. Nichols (Eds.), Advances in religion, cognitive science, and experimental philosophy (pp. 119–142). New York: Bloomsbury.

Downey, G., Freitas, A. L., Michaelis, B., & Khouri, H. (1998). The self-fulfilling prophecy in close relationships: Rejection sensitivity and rejection by romantic partners. Journal of Personality and Social Psychology, 75, 545–560.

Draper, P., & Nichols, R. (2013). Diagnosing bias in philosophy of religion. The Monist, 96, 420–446.

Dutilh Novaes, C. (2018). The enduring enigma of reason. Mind and Language, 33, 513–524.

Evans, J. (1996). Deciding before you think: Relevance and reasoning in the selection task. British Journal of Psychology, 87, 223–240.

Fernández-Castro, V., & Martínez-Manrique, F. (2020). Shaping your own mind: The self-mindshaping view on metacognition. Phenomenology and the Cognitive Sciences . https://doi.org/10.1007/s11097-020-09658-2 .

Gerken, M. (2019). Public scientific testimony in the scientific image. Studies in History and Philosophy of Science Part A . https://doi.org/10.1016/j.shpsa.2019.05.006 .

Golec de Zavala, A. (2011). Collective narcissism and intergroup hostility: The dark side of ‘in-group love’. Social and Personality Psychology Compass, 5, 309–320.

Goodwin, S., Gubin, A., Fiske, S., & Yzerbyt, V. (2000). Power can bias impression formation: Stereotyping subordinates by default and by design. Group Processes and Intergroup Relations, 3, 227–256.

Gould, S. J., & Lewontin, R. C. (1979). The spandrels of San Marco and the Panglossian paradigm: A critique of the adaptationist programme. Proceedings of the Royal Society of London. Series B, 205 (1161), 581–598.

Grusec, J., Kuczynski, L., Rushton, J., & Simutis, Z. (1978). Modeling, direct instruction, and attributions: Effects on altruism. Developmental Psychology, 14, 51–57.

Hacking, I. (1995). The looping effects of human kinds. In D. Sperber, et al. (Eds.), Causal cognition (pp. 351–383). New York: Clarendon Press.

Hahn, U., & Harris, A. J. L. (2014). What does it mean to be biased: Motivated reasoning and rationality. In H. R. Brian (Ed.), Psychology of learning and motivation (pp. 41–102). New York: Academic Press.

Haidt, J. (2012). The righteous mind . New York: Pantheon.

Hall, L., Johansson, P., & Strandberg, T. (2012). Lifting the veil of morality: Choice blindness and attitude reversals on a self-transforming survey. PLoS ONE, 7 (9), e45457.

Haselton, M. G., & Nettle, D. (2006). The paranoid optimist: An integrative evolutionary model of cognitive biases. Personality and Social Psychology Review, 10, 47–66.

Henrich, J. (2016). The secret of our success . Princeton, NJ: Princeton University Press.

Hernandez, I., & Preston, J. L. (2013). Disfluency disrupts the confirmation bias. Journal of Experimental Social Psychology, 49 (1), 178–182.

Jensen, R. E., & Moore, S. G. (1977). The effect of attribute statements on cooperativeness and competitiveness in school-age boys. Child Development, 48 (1), 305–307.

Johnson, D. D. P., Blumstein, D. T., Fowler, J. H., & Haselton, M. G. (2013). The evolution of error: Error management, cognitive constraints, and adaptive decision-making biases. Trends in Ecology & Evolution, 28, 474–481.

Johnson, D. D. P., & Fowler, J. H. (2011). The evolution of overconfidence. Nature, 477, 317–320.

Jost, J. T., Glaser, J., Kruglanski, A. W., & Sulloway, F. J. (2003). Political conservatism as motivated social cognition. Psychological Bulletin, 129 (3), 339–375.

Jussim, L. (2012). Social perception and social reality . Oxford: OUP.

Jussim, L. (2017). Précis of social perception and social reality: Why accuracy dominates bias and self-fulfilling prophecy. Behavioral and Brain Sciences, 40, 1–20.

Kappes, A., Harvey, A. H., Lohrenz, T., et al. (2020). Confirmation bias in the utilization of others’ opinion strength. Nature Neuroscience, 23, 130–137.

Kelly, D. (2013). Moral disgust and the tribal instincts hypothesis. In K. Sterelny, R. Joyce, B. Calcott, & B. Fraser (Eds.), Cooperation and its evolution (pp. 503–524). Cambridge, MA: The MIT Press.

Kelly, D., & Hoburg, P. (2017). A tale of two processes: On Joseph Henrich’s the secret of our success: How culture is driving human evolution, domesticating our species, and making us smarter. Philosophical Psychology, 30 (6), 832–848.

Ketelaar, T., & Ellis, B. J. (2000). Are evolutionary explanations unfalsifiable? Evolutionary psychology and the Lakatosian philosophy of science. Psychological Inquiry, 11 (1), 1–21.

Klayman, J. (1995). Varieties of confirmation bias. Psychology of Learning and Motivation, 32, 385–418.

Knobloch-Westerwick, S., Johnson, B. K., & Westerwick, A. (2015). Confirmation bias in online searches: Impacts of selective exposure before an election on political attitude strength and shifts. Journal of Computer-Mediated Communication, 20, 171–187.

Kunda, Z. (1990). The case for motivated reasoning. Psychological Bulletin, 108 (3), 480–498.

Leidner, B., Castano, E., Zaiser, E., & Giner-Sorolla, R. (2010). Ingroup glorification, moral disengagement, and justice in the context of collective violence. Personality and Social Psychology Bulletin, 36 (8), 1115–1129.

Levy, N. (2019). Due deference to denialism: Explaining ordinary people’s rejection of established scientific findings. Synthese, 196 (1), 313–327.

Leyens, J., Dardenne, B., Yzerbyt, V., Scaillet, N., & Snyder, M. (1999). Confirmation and disconfirmation: Their social advantages. European Review of Social Psychology, 10 (1), 199–230.

Lilienfeld, S. O. (2017). Psychology’s replication crisis and the grant culture: Righting the ship. Perspectives on Psychological Science, 12 (4), 660–664.

Lord, C., Lepper, M., & Preston, E. (1984). Considering the opposite: A corrective strategy for social judgment. Journal of Personality and Social Psychology, 47, 1231–1243.

Madon, S., Jussim, L., & Eccles, J. (1997). In search of the powerful self-fulfilling prophecy. Journal of Personality and Social Psychology, 72, 791–809.

Madon, S., Jussim, L., Guyll, M., Nofziger, H., Salib, E. R., Willard, J., et al. (2018). The accumulation of stereotype-based self-fulfilling prophecies. Journal of Personality and Social Psychology, 115 (5), 825–844.

Mameli, M. (2001). Mindreading, mindshaping, and evolution. Biology and Philosophy, 16, 597–628.

Marks, M. J., & Fraley, R. C. (2006). Confirmation bias and the sexual double standard. Sex Roles: A Journal of Research, 54 (1–2), 19–26.

Marsh, D. M., & Hanlon, T. J. (2007). Seeing what we want to see: Confirmation bias in animal behavior research. Ethology, 113, 1089–1098.

Matheson, J., & Vitz, R. (Eds.). (2014). The ethics of belief: Individual and social . Oxford: OUP.

Mayo, R., Alfasi, D., & Schwarz, N. (2014). Distrust and the positive test heuristic: Dispositional and situated social distrust improves performance on the Wason Rule Discovery Task. Journal of Experimental Psychology: General, 143 (3), 985–990.

McDonald, M. M., Navarrete, C. D., & van Vugt, M. (2012). Evolution and the psychology of intergroup conflict: The male warrior hypothesis. Philosophical Transactions of the Royal Society, B, 367, 670–679.

Mercier, H. (2016). Confirmation (or myside) bias. In R. Pohl (Ed.), Cognitive illusions (pp. 99–114). London: Psychology Press.

Mercier, H., & Sperber, D. (2011). Why do humans reason? Arguments for an argumentative theory. Behavioral and Brain Sciences, 34 (2), 57–111.

Mercier, H., & Sperber, D. (2017). The enigma of reason . Cambridge, MA: Harvard University Press.

Merton, R. (1948). The self-fulfilling prophecy. The Antioch Review, 8 (2), 193–210.

Miller, R., Brickman, P., & Bolen, D. (1975). Attribution versus persuasion as a means for modifying behavior. Journal of Personality and Social Psychology, 31 (3), 430–441.

Millikan, R. G. (1984). Language thought and other biological categories . Cambridge, MA: MIT Press.

Müller-Pinzler, L., Czekalla, N., Mayer, A. V., et al. (2019). Negativity-bias in forming beliefs about own abilities. Scientific Reports, 9, 14416. https://doi.org/10.1038/s41598-019-50821-w .

Murray, S. L., Holmes, J. G., & Griffin, D. W. (1996). The self-fulfilling nature of positive illusions in romantic relationships: Love is not blind, but prescient. Journal of Personality and Social Psychology, 71, 1155–1180.

Myers, D., & DeWall, N. (2015). Psychology . New York: Worth Publishers.

Myers, D. G., & Lamm, H. (1976). The group polarization phenomenon. Psychological Bulletin, 83, 602–627.

Nickerson, R. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2, 175–220.

Norman, A. (2016). Why we reason: Intention–alignment and the genesis of human rationality. Biology and Philosophy, 31, 685–704.

Ordabayeva, N., & Fernandes, D. (2018). Better or different? How political ideology shapes preferences for differentiation in the social hierarchy. Journal of Consumer Research, 45 (2), 227–250.

Palminteri, S., Lefebvre, G., Kilford, E. J., & Blakemore, S. J. (2017). Confirmation bias in human reinforcement learning: Evidence from counterfactual feedback processing. PLoS Computational Biology, 13 (8), e1005684.

Pelham, B. W., & Swann, W. B. (1994). The juncture of intrapersonal and interpersonal knowledge: Self-certainty and interpersonal congruence. Personality and Social Psychology Bulletin, 20 (4), 349–357.

Peters, U. (2018). Illegitimate values, confirmation bias, and mandevillian cognition in science. British Journal for the Philosophy of Science. https://doi.org/10.1093/bjps/axy079.

Peters, U. (2019a). Implicit bias, ideological bias, and epistemic risks in philosophy. Mind & Language, 34, 393–419. https://doi.org/10.1111/mila.12194.

Peters, U. (2019b). The complementarity of mindshaping and mindreading. Phenomenology and the Cognitive Sciences , 18 (3), 533–549.

Peters, U., Honeycutt, N., De Block, A., & Jussim, L. (forthcoming). Ideological diversity, hostility, and discrimination in philosophy. Philosophical Psychology . Available online: https://philpapers.org/archive/PETIDH-2.pdf .

Pinker, S. (2012). The false allure of group selection. Retrieved July 20, 2012 from http://edge.org/conversation/the-false-allure-of-group-selection .

Rabin, M., & Schrag, J. L. (1999). First impressions matter: A model of confirmatory bias. Quarterly Journal of Economics, 114 (1), 37–82.

Richerson, P., & Boyd, R. (2001). The evolution of subjective commitment to groups: A tribal instincts hypothesis. In R. M. Nesse (Ed.), Evolution and the capacity for commitment (pp. 186–202). New York: Russell Sage Found.

Richerson, P., & Boyd, R. (2005). Not by genes alone: How culture transformed human evolution . Chicago: University of Chicago Press.

Roberts, S. C., van Vugt, M., & Dunbar, R. I. M. (2012). Evolutionary psychology in the modern world: Applications, perspectives, and strategies. Evolutionary Psychology, 10, 762–769.

Schuck, P. H. (2001). The perceived values of diversity, then and now. Cardozo Law Review, 22, 1915–1960.

Sharot, T., Korn, C. W., & Dolan, R. J. (2011). How unrealistic optimism is maintained in the face of reality. Nature Neuroscience, 14, 1475–1479.

Simpson, J. A., & Beckes, L. (2010). Evolutionary perspectives on prosocial behavior. In M. Mikulincer & P. Shaver (Eds.), Prosocial motives, emotions, and behavior: The better angels of our nature (pp. 35–53). Washington, DC: American Psychological Association.

Smart, P. (2018). Mandevillian intelligence. Synthese, 195, 4169–4200.

Smith, J. J., & Wald, B. (2019). Collectivized intellectualism. Res Philosophica, 96 (2), 199–227.

Sniezek, J. A., & Van Swol, L. M. (2001). Trust, confidence, and expertise in a judge–advisor system. Organizational Behavior and Human Decision Processes, 84, 288–307.

Snyder, M., & Klein, O. (2007). Construing and constructing others: On the reality and the generality of the behavioral confirmation scenario. In P. Hauf & F. Forsterling (Eds.), Making minds (pp. 47–60). Amsterdam/Philadelphia: John Benjamins.

Solomon, G. B. (2016). Improving performance by means of action–cognition coupling in athletes and coaches. In M. Raab, B. Lobinger, S. Hoffman, A. Pizzera, & S. Laborde (Eds.), Performance psychology: Perception, action, cognition, and emotion (pp. 88–101). London, England: Elsevier Academic Press.

Stangor, C. (2011). Principles of social psychology . Victoria, BC: BCcampus.

Stanovich, K., West, R., & Toplak, M. (2013). Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science, 22, 259–264.

Steel, D. (2018). Wishful thinking and values in science: Bias and beliefs about injustice. Philosophy of Science . https://doi.org/10.1086/699714 .

Sterelny, K. (2006). Memes revisited. British Journal for the Philosophy of Science, 57, 145–165.

Sterelny, K. (2007). Social intelligence, human intelligence and niche construction. Philosophical Transactions of the Royal Society B, 362, 719–730.

Sterelny, K. (2018). Why reason? Hugo Mercier’s and Dan Sperber’s the enigma of reason: A new theory of human understanding. Mind and Language, 33 (5), 502–512.

Stibel, J. (2018). Fake news: How our brains lead us into echo chambers that promote racism and sexism. USA Today . Retrieved October 8, 2018 from https://eu.usatoday.com/story/money/columnist/2018/05/15/fake-news-social-media-confirmation-bias-echo-chambers/533857002/ .

Swann, W. B. (1983). Self-verification: Bringing social reality into harmony with the self. In J. Suls & A. G. Greenwald (Eds.), Social psychological perspectives on the self (Vol. 2, pp. 33–66). London: Erlbaum.

Swann, W. B., Jr. (2012). Self-verification theory. In P. A. M. Van Lange, A. W. Kruglanski, & E. T. Higgins (Eds.), Handbook of theories of social psychology (pp. 23–42). Beverley Hills, CA: Sage Publications Ltd.

Swann, W., & Ely, R. (1984). A battle of wills: Self-verification versus behavioral confirmation. Journal of Personality and Social Psychology, 46, 1287–1302.

Swann, W. B., Jr., Stein-Seroussi, A., & Giesler, B. (1992). Why people self-verify. Journal of Personality and Social Psychology, 62, 392–406.

Taber, C., & Lodge, M. (2006). Motivated skepticism in the evaluation of political beliefs. American Journal of Political Science, 50, 755–769.

Talaifar, S., & Swann, W. B. (2017). Self-verification theory. In L. Goossens, M. Maes, S. Danneel, J. Vanhalst, & S. Nelemans (Eds.), Encyclopedia of personality and individual differences (pp. 1–9). Berlin: Springer.

Tomasello, M. (2014). The ultra-social animal. European Journal of Social Psychology, 44, 187–194.

Tooby, J., & Cosmides, L. (2015). The theoretical foundations of evolutionary psychology. In D. M. Buss (Ed.), The handbook of evolutionary psychology (pp. 3–87). Hoboken, NJ: Wiley.

Tormala, Z. L. (2016). The role of certainty (and uncertainty) in attitudes and persuasion. Current Opinion in Psychology, 10, 6–11.

Trouche, E., et al. (2016). The selective laziness of reasoning. Cognitive Science, 40, 2122–2136.

Turnwald, B., et al. (2018). Learning one’s genetic risk changes physiology independent of actual genetic risk. Nature Human Behaviour . https://doi.org/10.1038/s41562-018-0483-4 .

von Hippel, W., & Trivers, R. (2011). The evolution and psychology of self-deception. Behavioral and Brain Sciences, 34 (1), 1–16.

Wenger, A., & Fowers, B. J. (2008). Positive illusions in parenting: Every child is above average. Journal of Applied Social Psychology, 38 (3), 611–634.

West, S. A., Griffin, A. S., & Gardner, A. (2007). Social semantics: How useful has group selection been? Journal of Evolutionary Biology, 21, 374–385.

Westra, E. (2020). Folk personality psychology: Mindreading and mindshaping in trait attribution. Synthese . https://doi.org/10.1007/s11229-020-02566-7 .

Whitaker, R. M., Colombo, G. B., & Rand, D. G. (2018). Indirect reciprocity and the evolution of prejudicial groups. Scientific Reports, 8 (1), 13247. https://doi.org/10.1038/s41598-018-31363-z.

Whittlestone, J. (2017). The importance of making assumptions: Why confirmation is not necessarily a bias . Ph.D. Thesis. Coventry: University of Warwick.

Willard, J., & Madon, S. (2016). Understanding the connections between self-fulfilling prophecies and social problems. In S. Trusz & P. Przemysław Bąbel (Eds.), Interpersonal and intrapersonal expectancies (pp. 117–125). London: Routledge.

Willard, J., Madon, S., Guyll, M., Spoth, R., & Jussim, L. (2008). Self-efficacy as a moderator of negative and positive self-fulfilling prophecy effects: Mothers’ beliefs and children’s alcohol use. European Journal of Social Psychology, 38, 499–520.

Word, C. O., Zanna, M. P., & Cooper, J. (1974). The nonverbal mediation of self-fulfilling prophecies in interracial interaction. Journal of Experimental Social Psychology, 10, 109–120.

Zawidzki, T. (2008). The function of folk psychology: Mind reading or mind shaping? Philosophical Explorations, 11 (3), 193–210.

Zawidzki, T. (2013). Mindshaping: A new framework for understanding human social cognition. Cambridge: MIT Press.

Zebrowitz, L. A., & Montepare, J. M. (2008). Social psychological face perception: Why appearance matters. Social and Personality Psychology Compass, 2, 1497–1517.

Download references

Acknowledgements

Many thanks to Andreas De Block, Mikkel Gerken, and Alex Krauss for comments on earlier drafts. The research for this paper was partly funded by the Danmarks Frie Forskningsfond Grant no: 8018-00053B allocated to Mikkel Gerken.

Author information

Authors and Affiliations

Department of Philosophy, University of Southern Denmark, Odense, Denmark

Department of Psychology, King’s College London, De Crespigny Park, Camberwell, London, SE5 8AB, UK


Corresponding author

Correspondence to Uwe Peters .



About this article

Peters, U. What Is the Function of Confirmation Bias? Erkenntnis, 87, 1351–1376 (2022). https://doi.org/10.1007/s10670-020-00252-1


Received: 07 May 2019

Accepted: 27 March 2020

Published: 20 April 2020

Issue Date: June 2022

DOI: https://doi.org/10.1007/s10670-020-00252-1



Confirmation Bias – Meaning, Definition And Examples


Lipika was the only person at her organization with a degree from an Ivy League school. Not only her manager but also her coworkers were in awe of her and had high expectations of her. Within a month of her joining, her manager asked her to take over an important project. Lipika was unable to meet her deadlines, which surprised everybody at the office.

Do you think it was Lipika’s inability that prevented her from meeting the project’s requirements? Or did her manager’s high expectations lead her to bite off more than she could chew?

Not only Lipika’s manager but the entire organization fell victim to confirmation bias. Their first impression of Lipika was based primarily on her academic background, which heightened their expectations.

Confirmation bias needs to be addressed and challenged as it has the potential to affect our thoughts and decisions. Let’s see what it means and effective ways of tackling it.

On This Page:

  • Meaning Of Confirmation Bias
  • How Does Confirmation Bias Affect Us?
  • Examples Of Confirmation Bias
  • Why Is It Difficult To Challenge Confirmation Bias?
  • How To Overcome Confirmation Bias
  • Confirmation Biases Befall Organizations

Meaning Of Confirmation Bias

Why are humans constantly reminded to think critically and reason logically? Why are we asked to separate opinions from facts when judging different situations? It’s because we have a tendency to process information in an illogical, biased manner. Our minds take shortcuts that distort our thought processes, affecting decision-making and information processing.

Mental shortcuts are a result of the constant supply of information we face. The human mind is often too impatient to carefully process every piece of information it receives. This gives rise to the various cognitive biases we navigate in our everyday lives, consciously or unconsciously. One such bias, commonly studied in cognitive psychology, is confirmation bias: the tendency to process information in a way that supports one’s existing beliefs while rejecting or overlooking contradictory information.

Confirmation bias psychology suggests that we don’t perceive information objectively. We tend to pick out the bits of information that confirm our existing beliefs or prejudices. This bias is motivated by wishful thinking: we form beliefs that confirm the views we already hold. It also prevents us from gathering more information once the evidence collected so far confirms our preexisting beliefs.

How Does Confirmation Bias Affect Us?

Confirmation bias not only impacts how we gather information but also influences the way we interpret and recall it. Sometimes, we remember details in a way that reinforces our attitudes. Confirmation bias manifests itself in four ways:

How We Seek Information

Imagine that you’re a parent who has recently watched a movie about the rising number of child abduction cases in India. When you search for more information, you gravitate toward evidence confirming that India is unsafe for children and that you should take additional measures to protect your child. Confirmation bias affects the way we seek information, i.e., the way we collect and analyze data.

How We Interpret Information

Sometimes, we see the things that we want to see. Confirmation bias affects the way we consume and process information because it favors our existing beliefs. For instance, you purchase a pair of shoes that your favorite social media influencer has been promoting for the past couple of weeks. Once the shoes are delivered, you realize they aren’t as attractive as your influencer made them look on their profile.

How We Remember Things

Confirmation bias has the power to affect our memory. It influences the way we store and interpret information in our minds. Personal views can change memories as well. For instance, if you had a terrible experience at your previous organization because of an uncooperative manager, you’re likely to discredit the entire organization everywhere you go. One bad experience has the power to affect your overall perspective. 

How We Favor Information

Confirmation bias encourages us to favor, or give more weight to, information that supports our beliefs. Similarly, we give less weight to information that contradicts them. For instance, if your friends ask you to cut down on caffeine, you may not listen to them. But if the same advice comes from a friend who is also a doctor, you’re more likely to act on it.

Examples Of Confirmation Bias

Confirmation bias is most notorious for affecting our decision-making abilities. It’s closely related to cherry-picking: selecting whatever evidence it takes to win an argument. Here are some examples of confirmation bias that highlight its setbacks.

Example 01: News And Media

You’ve probably come across WhatsApp forwards that are fake news in disguise. Sensationalist headlines and false claims often spread because of confirmation bias among readers: preexisting notions against something or someone are an easy catalyst for false news. People rarely check the credibility of these sources, so misinformation spreads easily.

Example 02: Research And Analysis

Students need to submit a year-long thesis during the final year of college. Everyone comes up with a hypothesis and works toward collecting and analyzing data to confirm it. In doing so, students gather data that is easy to interpret in a biased manner: they lean toward data that confirms their beliefs. If they want to conduct an impartial and objective study, they should instead try to prove their hypothesis wrong.
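The cherry-picking effect described above can be made concrete with a small simulation (a hypothetical sketch, not a real study): if a researcher records only the trials that support a positive effect, the "evidence" looks strong even when the true effect is zero.

```python
import random

random.seed(42)

# Hypothetical simulation: 1,000 trial outcomes drawn from a
# zero-mean distribution, i.e., the treatment has no real effect.
trials = [random.gauss(0, 1) for _ in range(1000)]

# A biased researcher keeps only the trials that "confirm" a
# positive effect, discarding everything that contradicts it.
confirming = [t for t in trials if t > 0]

full_mean = sum(trials) / len(trials)
biased_mean = sum(confirming) / len(confirming)

print(f"mean of all trials:       {full_mean:+.3f}")    # near zero
print(f"mean of kept trials only: {biased_mean:+.3f}")  # spuriously large
```

The full-sample mean hovers near zero while the cherry-picked mean suggests a sizeable positive effect, which is exactly why the advice above is to try to prove the hypothesis wrong rather than confirm it.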

Example 03: Employee Relations

Confirmation bias can pose a huge problem at the workplace, especially when it comes to navigating professional relationships. If managers or team leaders feel a certain way toward an employee, they are likely to act or behave differently with them. One employee may be treated better than others because the manager has the same alma mater as the employee. 

Why Is It Difficult To Challenge Confirmation Bias?

We show confirmation bias because it protects our self-esteem. It helps us seek out information that validates our beliefs and worldviews. Everyone wants to feel good about themselves, and confirmation bias acts as a catalyst. This is why we gather information in a biased manner, which makes confirmation bias extremely challenging to overcome.

There are two forces driving confirmation bias psychology:

Challenge Avoidance: 

We try our best to avoid finding information that may prove us wrong or go against our previously-held beliefs. We ignore or avoid the information that is contradictory to our opinions and favor information that aligns with our perspectives.

Reinforcement Seeking: 

We tend to confirm our beliefs by seeking out information that proves us right. We find evidence to support existing claims. For example, when a stereotype appears to be confirmed, we treat it as further proof and continue to act on it.

When people hold two or more contradictory beliefs, it leads to cognitive dissonance—a form of psychological distress. Confirmation bias helps minimize cognitive dissonance through challenge avoidance and reinforcement seeking. 
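The two forces can be sketched as an asymmetric belief-updating loop (an illustrative toy model, not an established psychological formula): confirming evidence is taken at full strength, while disconfirming evidence is heavily discounted, so a belief drifts upward even under perfectly balanced evidence.

```python
# Illustrative only: belief is a confidence level in [0, 1]; each
# piece of evidence either supports (+1) or contradicts (-1) it.
def update(belief, evidence, discount=0.2):
    if evidence > 0:
        step = 0.1             # reinforcement seeking: full weight
    else:
        step = 0.1 * discount  # challenge avoidance: heavily discounted
    belief += evidence * step
    return max(0.0, min(1.0, belief))  # clamp to [0, 1]

belief = 0.5
mixed_evidence = [+1, -1] * 10  # equal support and contradiction

for e in mixed_evidence:
    belief = update(belief, e)

print(f"belief after balanced evidence: {belief:.2f}")
```

Even though half of the evidence contradicts the belief, the asymmetric weighting drives confidence close to certainty, mirroring how challenge avoidance and reinforcement seeking reduce cognitive dissonance.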

How To Overcome Confirmation Bias

We fall victim to confirmation bias because we tend to jump to conclusions. To become less susceptible to this bias, we need to change the way we gather and process information. Here are a few helpful tips to get you started.

They say a little knowledge is a dangerous thing. If you want to process information more objectively, read the whole story. Understand multiple perspectives before you form opinions about something.

Always look for credible sources to back up information, and try to prove your hypothesis wrong instead of confirming it. The more contradictory information you engage with, the more objective your research will be.

Be mindful when gathering and analyzing data. You don’t want to miss out on crucial aspects and generate false claims.

Confirmation Biases Befall Organizations

Confirmation bias is extremely challenging to overcome in workplace settings because people are afraid of vulnerability and transparency. Nobody likes to admit they’re wrong, especially a team leader. Here are some ways to combat confirmation bias in the workplace and increase overall effectiveness:

Ask Neutral Questions

When conducting surveys, make sure you ask objective questions rather than leading questions (questions that steer the respondent toward a particular answer). Craft unbiased questions and have someone vet them before you circulate them.

Pay Attention To The Hiring Process

Confirmation bias often impacts the recruitment process, as hiring managers prefer like-minded candidates who will immediately fit into the organizational culture. While that seems like a good thing, it often produces echo chambers with minimal diversity of thought and ideas. Rethink your hiring process and make sure you ask objective, unbiased questions.

Employ A Devil’s Advocate

Whether you’re making decisions independently or as a team, ask a third person to play devil’s advocate. By doing so, you get an outsider’s perspective and are likely to gain deeper insight. Contradictory opinions will make you rethink and re-evaluate your project ideas or plans.

Moreover, when someone from your own team offers a contradictory viewpoint, keep an open mind and listen to them. Find ways to manage differing viewpoints and give everybody the opportunity to speak up.

No matter how well you learn to identify and tackle confirmation bias, you won’t be able to get rid of it entirely. Accept its inevitability and make peace with the fact that some cognitive biases will always influence your thoughts. You can, however, minimize its effects and keep a check on yourself from time to time. Practice critical thinking and analyze situations from multiple perspectives before jumping to conclusions.

Harappa Education’s Thinking Critically course will help you connect the dots through a careful selection of data, insightful observations and detailed evaluation. The Ladder Of Inference will teach you how to process information more carefully. The Mental Models framework will guide you in thinking through situations and simplifying information to evaluate it more efficiently. The CAFE Framework will help you ask relevant questions to gain deeper insights. Don’t let your biases overpower you again!

Explore Harappa Diaries to learn more about topics related to the THINK Habit such as What Is Critical Thinking, List of Cognitive Skills, Halo Effect Psychology and Analytical Thinking to think clearly and rationally.



Confirmation Bias


Confirmation bias skews our perception of reality, leading us to favor information that aligns with our preexisting beliefs. This psychological phenomenon influences decision-making, impacts critical thinking, and perpetuates stereotypes. By understanding confirmation bias, we can take steps to recognize it in our daily lives, make more informed decisions, and foster open-mindedness. This article delves into the mechanisms behind confirmation bias, explores its effects, and offers strategies to mitigate its impact.

Confirmation Bias Examples

  • Political Beliefs : A person who strongly supports a particular political party tends to consume news from sources that align with their beliefs and dismisses reports from opposing viewpoints as biased or inaccurate.
  • Health and Diet : Someone who believes that a specific diet is the best will seek out information that supports their view and ignore studies or expert opinions that suggest otherwise.
  • Sports Teams : Fans of a sports team often believe that referees are biased against their team, interpreting calls and decisions in a way that supports this belief.
  • Academic Research : A researcher might unintentionally favor data that supports their hypothesis while disregarding data that contradicts it, leading to skewed results.
  • Personal Relationships : If someone believes a friend is unreliable, they are more likely to notice and remember instances when that friend cancels plans, while overlooking times when the friend was dependable.
  • Investment Choices : An investor with a strong belief in a particular stock or market trend will seek out positive news and analysis, ignoring or downplaying negative information.
  • Consumer Preferences : A loyal customer of a specific brand will focus on positive reviews and experiences with that brand while dismissing negative reviews as anomalies or unfair.
  • Religious Beliefs : Individuals may interpret ambiguous or neutral events as signs that reinforce their religious beliefs while ignoring or rationalizing events that contradict them.
  • Hiring Decisions : An employer who believes a candidate is a perfect fit for a job might overlook or downplay any negative feedback or red flags in the candidate’s background.
  • Social Media Interactions : Users tend to follow and engage with content that aligns with their views, creating echo chambers where they are exposed primarily to information that confirms their beliefs.

Confirmation Bias Examples in Real Life

  • People often follow news sources and social media accounts that align with their beliefs, creating an echo chamber where they only encounter information that confirms their existing views. For instance, someone with strong political beliefs may only read articles from news outlets that support their perspective, ignoring or dismissing opposing viewpoints.
  • Individuals seeking information about health treatments might favor studies and articles that support their preferred treatment method while disregarding research that contradicts it. For example, a person who believes in the benefits of a particular diet may focus on success stories and positive testimonials, ignoring scientific studies that show mixed or negative results.
  • In relationships, people might notice behaviors that confirm their expectations of others while overlooking contradictory actions. If someone believes their partner is untrustworthy, they may pay more attention to moments of secrecy or dishonesty and ignore signs of reliability and transparency.
  • Managers might favor information that supports their initial hiring decisions or project plans, dismissing feedback or data that suggests the need for changes. A manager who believes a specific strategy will work might highlight successful case studies while downplaying instances where the strategy failed.
  • Students might focus on information that confirms their existing knowledge or opinions, neglecting new or challenging concepts. A student who believes a particular historical event had a specific cause might only read sources that support that cause, ignoring evidence that presents a different perspective.
  • Investors might seek out and give more weight to information that supports their investment choices, leading to overconfidence and potentially poor financial decisions. An investor convinced that a particular stock will perform well might ignore negative news or critical analysis of the company.
  • Investigators and jurors might focus on evidence that supports their initial impressions of a suspect’s guilt or innocence, potentially leading to wrongful convictions or acquittals. If a juror initially thinks a defendant looks guilty, they might give more weight to incriminating evidence and downplay exonerating evidence.
  • Consumers may look for positive reviews and testimonials that confirm their decision to buy a product while ignoring negative reviews. A person convinced that a particular brand is superior might disregard reports of defects or customer complaints.

Example of Confirmation bias in Relationships

  • Scenario : Jane believes that her partner, Tom, is always thoughtful and considerate.
  • Confirmation Bias : Jane notices and remembers all the times Tom brings her flowers or does something nice for her, reinforcing her belief. However, she overlooks or dismisses instances when Tom forgets important dates or is inconsiderate.
  • Scenario : Mike thinks his partner, Sarah, is always critical of him.
  • Confirmation Bias : Mike pays extra attention to and recalls instances where Sarah criticizes his actions, but he ignores or downplays the moments when she is supportive and complimentary.
  • Scenario : Anna believes her partner, Jack, is likely to cheat on her.
  • Confirmation Bias : Anna focuses on and remembers any suspicious behavior, like Jack talking to another woman, while ignoring his frequent expressions of love and loyalty.
  • Scenario : Chris believes that men are generally less emotionally expressive than women.
  • Confirmation Bias : Chris notices and remembers when his male partner, Alex, is not open about his feelings but disregards the times when Alex shares his emotions.
  • Scenario : Lisa thinks that arguments in a relationship are a sign of incompatibility.
  • Confirmation Bias : Lisa focuses on every disagreement as proof that she and her partner, Mark, are not meant to be together, while ignoring the resolution of conflicts and positive interactions.

Examples of Confirmation Bias in School

Teachers

  • Example : A teacher who believes that certain students are high achievers may unconsciously give them higher grades or more positive feedback, even if their work is not significantly better than that of other students.
  • Example : If a teacher expects a student to perform poorly because of past behavior or stereotypes, they might interpret ambiguous answers or minor mistakes as evidence of the student’s lack of ability.
  • Example : Teachers might call on students they perceive as more knowledgeable or interested, reinforcing their belief that these students are more engaged or intelligent.

Students

  • Example : A student who believes they are bad at math might focus on their mistakes and forget or downplay their successes, reinforcing their belief that they are poor in the subject.
  • Example : Students might form study groups with peers who share their views on certain subjects, leading to discussions that reinforce their existing beliefs rather than challenging them.
  • Example : When working on research projects, students might seek out sources that support their thesis and ignore those that contradict it, resulting in a biased presentation of information.

Administrators

  • Example : School administrators might favor policies that align with their personal beliefs or past experiences, such as standardized testing, despite evidence suggesting alternative methods could be more effective.
  • Example : An administrator who believes that veteran teachers are more effective might rate them more favorably in evaluations, ignoring evidence of innovative and effective teaching practices by newer teachers.
Parents

  • Example : Parents might focus on feedback that aligns with their belief about their child’s abilities or behavior, either overly positive or negative, and discount information that contradicts their views.
  • Example : Parents might choose schools based on reputations that match their preconceptions about what makes a good school, such as sports programs or academic rigor, potentially overlooking other important factors like school culture or student support services.

Overall School Environment

  • Example : A school might celebrate certain cultural events more prominently based on the predominant beliefs of the community, reinforcing a sense of validation for those beliefs and potentially marginalizing other cultures.
  • Example : The curriculum might be designed in a way that emphasizes particular viewpoints or interpretations of history, literature, or science that align with the beliefs of the curriculum developers, rather than presenting a balanced perspective.

Confirmation Bias Examples in Workplace

  • Resumé Screening : A recruiter believes that candidates from certain universities or companies are superior. They may unconsciously give more weight to candidates from these backgrounds, overlooking potentially better-suited candidates from other institutions.
  • Interview Process : If an interviewer has a positive initial impression of a candidate, they might focus on information that supports their favorable view and ignore signs that the candidate may not be a good fit.
  • Preconceived Notions : A manager who believes a particular employee is high-performing may interpret ambiguous behaviors or results as positive, while the same behaviors might be viewed negatively in an employee perceived as a poor performer.
  • Selective Feedback : Managers may give more positive feedback to employees they believe are strong performers and more critical feedback to those they view less favorably, regardless of the actual performance.
  • Project Assignments : A team leader might consistently assign challenging tasks to employees they believe are more capable, based on past performance or personal bias, without considering the potential of other team members.
  • Conflict Resolution : In resolving conflicts, a manager may side with an employee they favor, discounting the perspective of the other party and potentially worsening team tensions.
  • Data Interpretation : When analyzing data for decision-making, individuals may seek out or give more weight to information that confirms their preexisting beliefs, ignoring data that contradicts their views.
  • Strategy Development : Leaders might favor strategies that align with their previous successes or beliefs, disregarding innovative ideas that challenge the status quo.
  • Resistance to New Ideas : Employees or leaders who are convinced that a particular way of doing things is best might resist new ideas or changes, focusing only on information that supports their current practices.
  • Echo Chambers : Work environments where similar views are continuously reinforced can lead to a lack of diversity in thought and resistance to change, as individuals only seek out and confirm ideas that align with their existing beliefs.

Confirmation Bias Examples in Social Media

  • Example : Social media platforms like Facebook and Twitter use algorithms to show users content that aligns with their past behavior, such as likes, shares, and clicks. This can result in a feed filled with news articles and opinions that reinforce the user’s existing beliefs.
  • Example : Users are more likely to share news stories and articles that support their views, leading to a dissemination of biased information within their social network, reinforcing the beliefs of their followers.
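The feedback loop described in these examples can be sketched as a toy model (purely illustrative; this is not any real platform's ranking code): an engagement-weighted feed locks onto one viewpoint after a single early click.

```python
from collections import Counter

# Balanced pool of posts from two viewpoints (labels are made up).
posts = ["viewpoint_a", "viewpoint_b"] * 5

# The user has clicked exactly one "viewpoint_a" post so far.
clicks = Counter({"viewpoint_a": 1})

def rank(post):
    # Engagement-based score: how often the user clicked this viewpoint.
    return clicks[post]

for _ in range(20):
    feed = sorted(posts, key=rank, reverse=True)  # most-engaged first
    clicks[feed[0]] += 1  # the user clicks the top of the feed

print(dict(clicks))  # engagement concentrates on one viewpoint
```

Each click raises the score of the clicked viewpoint, which raises its rank, which earns it the next click; the opposing viewpoint never surfaces, so the feed becomes an echo chamber with no malicious intent anywhere in the loop.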

Discussion and Debate

  • Example : On platforms like Reddit or Facebook groups, users often join communities that share their interests and viewpoints. This can lead to discussions where dissenting opinions are rare or discouraged, reinforcing the group’s collective beliefs.
  • Example : In comment sections, users might only respond positively to comments that align with their views while ignoring or attacking those that offer a different perspective. This behavior can create a skewed perception that most people agree with their stance.

Content Creation

  • Example : Influencers and content creators often produce content that resonates with their audience’s beliefs to maintain engagement and grow their following. This can result in a one-sided presentation of information.
  • Example : Creators might present data or anecdotes that support their viewpoint while omitting contradictory information. For instance, a health blogger might highlight the benefits of a particular diet using selective studies while ignoring studies that show negative effects.

Social Validation

  • Example : Users may feel validated in their beliefs when posts they agree with receive a high number of likes and shares, reinforcing the idea that their viewpoint is widely accepted and correct.
  • Example : Users might follow and interact primarily with accounts that share their opinions, receiving positive reinforcement and social validation, which strengthens their existing beliefs.

Misinformation and Fake News

  • Example : False or misleading information that aligns with users’ preconceptions can spread rapidly. Users are more likely to believe and share such content without critical evaluation if it supports their views.
  • Example : Organized efforts to spread false information often target specific beliefs and biases, exploiting confirmation bias to manipulate public opinion. For instance, during elections, disinformation campaigns might target voters with false stories that align with their political views.

Advertisements and Marketing

  • Example : Advertisers use data to target users with ads that align with their interests and beliefs. For instance, someone who frequently engages with eco-friendly content might see more ads for sustainable products, reinforcing their belief in the importance of environmentalism.
  • Example : Users might seek out reviews and testimonials that confirm their initial positive impression of a product, ignoring negative reviews. This can lead to a biased perception of the product’s quality and value.

How to Avoid Confirmation Bias

  • Seek Disconfirming Evidence : Actively look for information or evidence that contradicts your current beliefs or hypotheses. Consider why your assumptions might be wrong.
  • Ask for Peer Reviews : Get feedback from others, especially those with different perspectives. They may notice biases you have overlooked.
  • Use Structured Decision-Making : Employ systematic methods like decision matrices, pros and cons lists, or SWOT analyses to ensure all aspects are considered objectively.
  • Be Aware of Your Biases : Acknowledge that confirmation bias exists and that you are susceptible to it. Awareness is the first step toward mitigation.
  • Diversify Information Sources : Consult a variety of sources, especially those that challenge your views. This includes reading different news outlets, listening to various experts, and considering alternative viewpoints.
  • Slow Down Your Thinking : Take your time to make decisions. Quick decisions are more prone to bias. Reflect on why you believe something and whether you might be ignoring contrary evidence.
  • Question Your Assumptions : Regularly question the foundations of your beliefs and consider how you arrived at them. This can help uncover hidden biases.
  • Use Critical Thinking Techniques : Employ strategies such as Socratic questioning, where you ask and answer questions to stimulate critical thinking and illuminate ideas.
  • Engage in Debates : Participate in discussions and debates with people who have opposing viewpoints. This can help you see the strengths and weaknesses of your arguments and theirs.
  • Keep a Bias Journal : Document instances when you notice confirmation bias in your thinking. Reflecting on these entries over time can help you recognize patterns and improve your objectivity.
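As one concrete way to apply the "structured decision-making" tip above, here is a minimal weighted decision matrix (the criteria, weights, and option names are all invented for illustration):

```python
# Weights encode how much each criterion should matter, decided
# *before* looking at the options, so no single impression dominates.
criteria = {"cost": 0.5, "quality": 0.3, "speed": 0.2}

# Each option is rated on every criterion, including unflattering ones.
options = {
    "option_a": {"cost": 7, "quality": 9, "speed": 6},
    "option_b": {"cost": 9, "quality": 6, "speed": 8},
}

def score(ratings):
    # Weighted sum across all criteria, not just the ones we like.
    return sum(criteria[c] * ratings[c] for c in criteria)

for name in sorted(options):
    print(f"{name}: {score(options[name]):.2f}")

best = max(options, key=lambda name: score(options[name]))
print("highest-scoring option:", best)
```

Because every criterion is rated for every option before totals are computed, a decision-maker cannot quietly ignore the dimensions on which their favored option performs poorly.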

Confirmation Bias Psychology

Confirmation bias in psychology refers to the tendency of individuals to favor information that aligns with their existing beliefs and expectations while disregarding or undervaluing information that contradicts them. This cognitive bias affects how people gather, interpret, and recall information, often leading to skewed perceptions and flawed decision-making. For instance, in a therapeutic setting, a therapist might focus on details that support their initial diagnosis while ignoring symptoms that suggest alternative explanations. Recognizing and mitigating confirmation bias is crucial for promoting objective thinking and making well-rounded, informed decisions in both personal and professional contexts.

Confirmation Bias Psychology Example

  • Example : A therapist who strongly believes that a patient’s issues stem from childhood trauma may focus primarily on exploring the patient’s past, even when current life events or other factors might be more relevant to the patient’s problems.
  • Impact : This can lead to an incomplete understanding of the patient’s issues and potentially ineffective treatment.
  • Example : A person who smokes may seek out information that downplays the health risks of smoking or highlights the benefits of smoking (such as stress relief) while ignoring the overwhelming evidence of its dangers.
  • Impact : This allows the person to reduce the cognitive dissonance between their behavior (smoking) and the knowledge that it is harmful, making it easier to continue the behavior without guilt.
  • Example : In studies on stereotypes, researchers might find that participants remember stereotype-consistent information about a group more readily than stereotype-inconsistent information. For instance, if people hold a stereotype that engineers are introverted, they might recall information about an engineer being quiet at a party but forget details about the same engineer being outgoing in another context.
  • Impact : This can perpetuate stereotypes and reinforce biased views of social groups.
  • Example : When conducting research, psychologists might design experiments or interpret data in ways that confirm their hypotheses. For example, a researcher who believes that stress leads to poor academic performance might focus on data that shows a correlation between high stress levels and lower grades, while overlooking data that shows some students perform well under stress.
  • Impact : This can lead to biased research conclusions and hinder scientific progress.
  • Example : When interpreting the results of personality tests or other psychological assessments, practitioners might give more weight to results that fit their expectations about a client or patient. If they expect a client to be introverted based on initial impressions, they might emphasize test items that support this view while downplaying contradictory results.
  • Impact : This can lead to inaccurate assessments and inappropriate interventions.
  • Example : When individuals with opposing views on a controversial issue (such as gun control) are exposed to mixed evidence, they tend to interpret the evidence in a way that strengthens their original stance. For instance, a person who is pro-gun control may focus on studies showing the benefits of gun regulation, while a person who is anti-gun control might focus on studies highlighting the importance of gun ownership for self-defense.
  • Impact : This can lead to increased polarization and entrenchment of beliefs, making it harder to reach a consensus or compromise.

Confirmation Bias in Research

Confirmation bias in research refers to the tendency of researchers to favor information or data that supports their pre-existing beliefs, hypotheses, or theories while disregarding or minimizing evidence that contradicts them. This cognitive bias can lead to skewed results, flawed methodologies, and invalid conclusions, ultimately compromising the integrity and reliability of the research. Confirmation bias can manifest in various stages of the research process, including the formulation of hypotheses, the design of experiments, the collection and interpretation of data, and the reporting of results. Researchers must be aware of this bias and take steps to mitigate its effects, such as using blinded study designs, pre-registering studies, and actively seeking disconfirming evidence.

Signs of Confirmation Bias

  • Selective Attention : Focusing on evidence that supports one’s beliefs while ignoring or undervaluing evidence that contradicts them.
  • Cherry-Picking Data : Highlighting data points that confirm a hypothesis while disregarding those that do not.
  • Overconfidence : Exhibiting an unwarranted high level of certainty in one’s beliefs or hypotheses despite contradictory evidence.
  • Ignoring Disconfirming Evidence : Disregarding or downplaying studies, data, or information that challenge one’s beliefs.
  • Biased Interpretation : Interpreting ambiguous evidence as supporting one’s beliefs.
  • Preference for Confirmatory Information Sources : Seeking out and valuing information from sources that align with one’s pre-existing views.
  • Misremembering Information : Recalling information in a way that reinforces one’s existing beliefs.
  • Resistance to Change : Being reluctant to revise beliefs in light of new, contradictory evidence.
  • Framing Effects : Presenting information in a way that emphasizes supportive evidence and downplays contradictory data.

Facts about Confirmation Bias

  • Experiments by Peter Wason : Peter Wason’s rule-discovery (2-4-6) task in 1960, followed by his selection task, provided early evidence of confirmation bias, showing that participants favored information that confirmed their preexisting hypotheses.
  • Information Processing : People with confirmation bias tend to process information by giving greater weight to data that supports their existing beliefs while dismissing or undervaluing information that contradicts them.
  • Echo Chambers : Confirmation bias contributes to the formation of echo chambers, especially in online environments, where individuals are surrounded by opinions and information that reinforce their beliefs.
  • Impact on Legal Decisions : Studies have shown that confirmation bias can affect legal professionals, such as judges and lawyers, who might favor evidence that supports their initial impressions of a case.
  • Health Decisions : In the medical field, confirmation bias can lead patients and even healthcare providers to favor treatments or diagnoses that align with their initial expectations, potentially leading to misdiagnosis.
  • Financial Decisions : Investors often fall prey to confirmation bias by seeking out information that supports their investment decisions, ignoring signs that they might need to reconsider their strategy.
  • Social Interactions : In social contexts, confirmation bias can cause people to associate primarily with others who share their views, reinforcing their beliefs and potentially leading to greater polarization.
  • Political Polarization : Confirmation bias plays a significant role in political polarization, as individuals tend to consume news and opinions that align with their political views, ignoring contrary evidence.
  • Impact on Education : Students and educators may exhibit confirmation bias by focusing on information that confirms their understanding of a subject, potentially hindering learning and critical thinking.
  • Mitigation Techniques : Strategies to counteract confirmation bias include actively seeking out opposing viewpoints, engaging in critical thinking exercises, and being aware of one’s biases to make more balanced decisions.

Why is confirmation bias important?

Confirmation bias is crucial as it affects decision-making by favoring information that aligns with existing beliefs, leading to flawed judgments.

Which statement best describes the confirmation bias?

Confirmation bias is the tendency to search for, interpret, and remember information that confirms preexisting beliefs.

What is confirmation bias in human factors?

In human factors, confirmation bias affects safety and efficiency by causing individuals to overlook errors that contradict expectations.

What is the confirmation bias which describes the human tendency?

It describes the human tendency to favor information that confirms existing beliefs while disregarding contradictory evidence.

What is the concept of confirmation bias specifically?

The concept of confirmation bias involves favoring information that aligns with one’s existing beliefs and dismissing opposing data.

What is confirmation bias in unconscious bias?

In unconscious bias, confirmation bias affects automatic judgments by reinforcing preexisting beliefs without conscious awareness.

What is confirmation bias in education?

In education, confirmation bias affects learning and teaching by reinforcing stereotypes and limiting exposure to diverse viewpoints.

What is confirmation bias explanation for kids?

For kids, confirmation bias means favoring information that matches what they already think and ignoring different opinions.

What is the difference between confirmation bias and belief perseverance?

Confirmation bias is seeking evidence to confirm beliefs; belief perseverance is sticking to beliefs even after they are discredited.

What is an example of bias for students?

For students, bias might be favoring information in research that supports their hypothesis while ignoring contradictory evidence.

2.2 Overcoming Cognitive Biases and Engaging in Critical Reflection

Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. Additionally, the critical aspect engages in a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.

Connections

See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.


Confirmation Bias

One of the most common cognitive biases is confirmation bias, which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 8. Subjects were told to generate examples to test their hypothesis. What he found is that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let’s say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.
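As a back-of-the-envelope check (my own illustration, not an analysis from the paper), the two reported medians can be modeled as a weighted average of the anchor and a common unanchored baseline. Solving for the weight suggests the arbitrary anchor accounted for roughly a third of each group's estimate:

```python
# Model: estimate = w * anchor + (1 - w) * baseline.
# Two reported data points from Tversky and Kahneman (1974):
a1, e1 = 10, 25   # anchor 10 -> median estimate 25
a2, e2 = 65, 45   # anchor 65 -> median estimate 45

# Solve the two equations for the anchor weight w and the implied baseline.
w = (e2 - e1) / (a2 - a1)            # ≈ 0.364: weight given to the random anchor
baseline = (e1 - w * a1) / (1 - w)   # ≈ 33.6: the implied unanchored estimate

print(round(w, 3), round(baseline, 1))  # 0.364 33.6
```

The model is deliberately crude, but it makes the size of the pull concrete: a number everyone knew was random still moved the median answers by dozens of nations.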

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true.

Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is thinking that attaches a value to things in which you have already invested resources that is greater than the value those things have today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad” by continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don’t want to lose their investment even when they are clearly headed for failure and ought to cut their losses.

A similar type of faulty reasoning leads to the gambler’s fallacy, in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
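Statistical independence is easy to verify by simulation: the chance of tails immediately after a run of heads stays at about one half. A quick sketch:

```python
import random

random.seed(0)  # reproducible run
flips = [random.choice("HT") for _ in range(200_000)]

# Collect every flip that immediately follows three heads in a row.
after_streak = [flips[i] for i in range(3, len(flips))
                if flips[i - 3:i] == ["H", "H", "H"]]

p_tails = after_streak.count("T") / len(after_streak)
print(round(p_tails, 2))  # ≈ 0.5: a streak of heads does not make tails "due"
```

Roughly one flip in eight follows such a streak, so the estimate is based on about 25,000 samples, comfortably enough to see that the coin has no memory.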

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.

Table 2.1 summarizes these common cognitive biases.

Think Like a Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the five cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?


Want to cite, share, or modify this book? This book uses the Creative Commons Attribution License and you must attribute OpenStax.

Access for free at https://openstax.org/books/introduction-philosophy/pages/1-introduction
  • Authors: Nathan Smith
  • Publisher/website: OpenStax
  • Book title: Introduction to Philosophy
  • Publication date: Jun 15, 2022
  • Location: Houston, Texas
  • Book URL: https://openstax.org/books/introduction-philosophy/pages/1-introduction
  • Section URL: https://openstax.org/books/introduction-philosophy/pages/2-2-overcoming-cognitive-biases-and-engaging-in-critical-reflection

© Dec 19, 2023 OpenStax. Textbook content produced by OpenStax is licensed under a Creative Commons Attribution License. The OpenStax name, OpenStax logo, OpenStax book covers, OpenStax CNX name, and OpenStax CNX logo are not subject to the Creative Commons license and may not be reproduced without the prior and express written consent of Rice University.


Confirmation bias


Why logic and evidence thoroughly fail to change opinions and beliefs

Today, supporters and opponents of Mr. Donald Trump in the USA and Mr. Modi in India are highly polarized. Supporters and detractors try very hard to provide the other side with evidence supporting their own views. But contrary evidence only seems to harden and polarize opinions further. Why is neither side able to convince the other? Why do discussion and evidence result in further polarization? Why do logic and evidence fail to convince?

Psychologists have studied this phenomenon and discovered that once we have formed a belief about something, we tend to seek out information that confirms our belief and filter out or reject any information that contradicts it. Psychologists initially dubbed this cognitive bias the confirmation bias, and later researchers (notably Gary Klein) have termed it 'fixation'. For this article, given its objective, I will treat fixation and confirmation bias as the same.

Death penalty study

Lord, Ross, and Lepper (1979) conducted a study to see if people change their opinions in the face of contrary evidence. They recruited a group of 48 students. Half of them believed that the death penalty was a good deterrent to future crime, while the other half were against the death penalty and doubted its effectiveness as a deterrent.

Two carefully prepared studies were given to both groups. One study provided data that supported the view that the death penalty was a deterrent, while the other study supported the view that the death penalty was not a deterrent. The participants were also provided with critiques of each study listing out their deficiencies.

So a supporter of the death penalty saw one study supporting it and another opposing it, along with the critiques of both, and an opponent likewise saw both studies and both critiques.

Given that participants were presented with evidence that opposed their viewpoints, one might expect them to soften their views and move toward a more central position. Instead, the studies resulted in greater polarization of views.

Those who initially supported the death penalty ended up with even greater conviction in their position. The group opposing the death penalty likewise ended up with greater conviction in its position that the death penalty was not a deterrent.
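A toy model (my own illustration, not the authors' analysis, with illustrative numbers) shows how the same mixed evidence can push the two sides apart: if each reader rates the congenial study as higher quality, every round of reading widens the gap.

```python
def read_mixed_evidence(belief, k=0.1):
    """belief = confidence (0-1) that the death penalty deters crime.
    The reader sees one pro-deterrence and one anti-deterrence study, but
    the perceived quality of each study depends on the reader's prior."""
    quality_pro = belief        # congenial evidence looks rigorous...
    quality_con = 1 - belief    # ...uncongenial evidence looks flawed
    shift = k * (quality_pro - quality_con)
    return min(1.0, max(0.0, belief + shift))

supporter, opponent = 0.7, 0.3
for _ in range(3):  # both sides read the same mixed evidence three times
    supporter = read_mixed_evidence(supporter)
    opponent = read_mixed_evidence(opponent)

print(round(supporter, 2), round(opponent, 2))  # 0.85 0.15 -- the gap widens
```

Identical inputs, diverging outputs: the asymmetric quality ratings, not the evidence itself, drive the polarization.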

This tendency to look only for information that supports one's own belief while rejecting or downplaying conflicting information has been termed 'confirmation bias'. This cognitive bias results in many errors in decision making and also creates sharp divides amongst people. Today's social media platforms aren't helping either.

Confirmation bias combined with artificial-intelligence-based social media engines is fuelling and strengthening differences between sections of society. If I believe that Trump is a great leader, social media will flood me with a constant stream of news and articles that confirm this view. A Trump detractor, on the other hand, will be fed similar volumes of news and articles that are not complimentary to Trump. The constant stream of confirming information only strengthens our particular beliefs and creates an unbridgeable divide. I believe this cognitive bias could be a good explanation for why we as a society are more sharply divided today, even over trivial issues, than we were before the advent of social media.

Confirmation bias in testing hypotheses

Let's take a hypothetical situation where I make up a rule for a sequence of numbers. I write down this rule on a piece of paper and tell you that the numbers 2-4-6 satisfy the rule. You are then asked to guess the rule I am applying to generate the sequences. You have to figure out what the rule is by asking me to confirm or deny sequences of numbers you generate based on your hypothesis of what the rule is. Let's say that you give me a sequence x-y-z, and I give you feedback each time: 'Yes, conforms to the rule' or 'No, does not conform to the rule.' How would you go about trying to confirm your hypothesis of what the rule is?

You may want to pause here, and look at your alternate hypotheses and then read on.

This was an actual experiment conducted by Wason (1960) to understand how people proceed in testing hypotheses. He was trying to confirm Popper's belief that the general mistake people make in testing hypotheses is that they tend to try to confirm a hypothesis rather than to falsify it. Participants in Wason's experiment followed this predicted path in testing their hypothesis of the number sequence.

They framed a hypothesis to start with and tried to propose more sets of three numbers to satisfy this rule. In this case, they tried numbers such as 4-8-10, 6-8-12, 20-22-24. The feedback was positive that the number sequences satisfied the rule. After several rounds of testing, the participants were satisfied that their hypothesis was correct and stopped testing.

However, when they stated the rule they had in mind (even numbers), they were told that the rule was wrong. The rule was simply 'increasing numbers'. The sequences hypothesized by the participants were a subset of all possible sequences that could satisfy the rule, and hence they got feedback each time that their sequence satisfied the rule. But their conclusion about the rule itself turned out to be wrong.

So what went wrong? According to Wason, once the participants had formed a hypothesis, they never tried to find sequences of numbers that would falsify it; they worked only towards confirming it. This bias towards seeking only confirming evidence has also been called the 'confirming-evidence trap', and yet another name for it is the 'positive testing' strategy.
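Wason's setup can be sketched as a short simulation. This is a minimal illustration, not the original protocol: the 'even numbers' hypothesis and the specific test triples are assumptions standing in for a typical participant's behaviour.

```python
def hidden_rule(triple):
    """The experimenter's actual rule: any three increasing numbers."""
    a, b, c = triple
    return a < b < c

def even_numbers_hypothesis(triple):
    """An illustrative participant hypothesis: all numbers are even."""
    return all(n % 2 == 0 for n in triple)

# Positive testing: propose only triples the hypothesis already predicts.
# Every one passes, so the wrong hypothesis is never challenged.
for t in [(4, 8, 10), (6, 8, 12), (20, 22, 24)]:
    assert hidden_rule(t) and even_numbers_hypothesis(t)

# Falsification: propose a triple the hypothesis says should FAIL.
# (1, 3, 5) fits the real rule but not the hypothesis, exposing the error.
print(hidden_rule((1, 3, 5)), even_numbers_hypothesis((1, 3, 5)))  # True False
```

The point of the sketch: every 'positive' test is consistent with both the real rule and the wrong hypothesis, so only a deliberately disconfirming test can tell them apart.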

This tendency to look only for confirming evidence is discussed by John S. Hammond, Ralph L. Keeney and Howard Raiffa in an article called 'The Hidden Traps in Decision Making', published in the Harvard Business Review (Sep-Oct 1998).

In their article, they take the hypothetical example of the president of a large manufacturing operation wondering whether to go ahead with a plant expansion or call it off. Specifically, the president is concerned that the firm won't be able to maintain the rapid growth of its exports: he fears that the US dollar will strengthen, making the firm's goods more expensive for overseas consumers and dampening demand. Before deciding, he calls the chief executive of a smaller firm that has dropped similar expansion plans to ask why. The CEO cites as her reason the apprehension that the US dollar will strengthen against other currencies, an answer that echoes the president's own fears.

The authors warn that the president should not treat this conversation as the deciding factor and drop his expansion plans. If he decides on the basis of this one conversation, he falls victim to what psychologists call the 'confirming-evidence trap'. This bias leads us to seek out only evidence that supports our existing instinct or point of view, while avoiding information that contradicts it.

The confirming-evidence bias affects not only where we go looking for evidence but also the weight we give to the evidence we have: we give far too much weight to evidence that supports our viewpoint and far too little to evidence that conflicts with it.
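A toy calculation (all numbers invented for illustration) shows how unequal weighting alone, without any change in the evidence itself, can flip a conclusion:

```python
# Each piece of evidence is (supports_belief, strength); values are made up.
evidence = [(True, 0.4), (True, 0.3), (False, 0.9), (False, 0.8)]

def verdict(items, confirm_weight=1.0, disconfirm_weight=1.0):
    """Sum the evidence, scaling confirming and conflicting items differently."""
    score = sum(s * (confirm_weight if ok else -disconfirm_weight)
                for ok, s in items)
    return "belief upheld" if score > 0 else "belief rejected"

# Impartial weighting: the stronger conflicting evidence wins.
print(verdict(evidence))                                             # belief rejected
# Biased weighting: overcount support, discount conflict, belief survives.
print(verdict(evidence, confirm_weight=2.0, disconfirm_weight=0.3))  # belief upheld
```

Same four facts both times; only the weights change, which is exactly the mechanism the bias exploits.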

In their article, the authors also suggest some ways we can protect ourselves from this bias:

  • Check whether you are examining all the evidence with equal rigour
  • Do not accept confirming evidence without question
  • Get someone whose opinion you value to play devil's advocate
  • Build your own counter-arguments
  • Consider the strongest reason to do something else
  • Be honest with yourself about your motives: are you just seeking confirming evidence, or do you really want to make an impartial, good choice?
  • When seeking the advice of others, don't ask leading questions

The confirmation bias also helps explain why two people can hold diametrically opposite views of the same employee, or why a superior may not see an employee as a good performer even when hard evidence shows excellent performance. It may be wiser to rely on hard evidence than on memory when evaluating people, since memory tends to retrieve only confirming instances.

About the author

Prasad Aryasomayajula


13 Types of Common Cognitive Biases That Might Be Impairing Your Judgment

Which of these sway your thinking the most?

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


Amy Morin, LCSW, is a psychotherapist and international bestselling author. Her books, including "13 Things Mentally Strong People Don't Do," have been translated into more than 40 languages. Her TEDx talk,  "The Secret of Becoming Mentally Strong," is one of the most viewed talks of all time.


Although we like to believe that we're rational and logical, the fact is that we are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.

Sometimes, cognitive biases are fairly obvious. You might even find that you recognize these tendencies in yourself or others. In other cases, these biases are so subtle that they are almost impossible to notice.

At a Glance

Attention is a limited resource. This means we can't possibly evaluate every possible detail and event when forming thoughts and opinions. Because of this, we often rely on mental shortcuts that speed up our ability to make judgments, but this can sometimes lead to bias. There are many types of biases—including the confirmation bias, the hindsight bias, and the anchoring bias, just to name a few—that can influence our beliefs and actions daily.

The following are just a few types of cognitive biases that have a powerful influence on how you think, how you feel, and how you behave.

The Confirmation Bias

The confirmation bias is the tendency to listen more often to information that confirms our existing beliefs. Through this bias, people tend to favor information that reinforces the things they already think or believe.

Examples include:

  • Only paying attention to information that confirms your beliefs about issues such as gun control and global warming
  • Only following people on social media who share your viewpoints
  • Choosing news sources that present stories that support your views
  • Refusing to listen to the opposing side
  • Not considering all of the facts in a logical and rational manner

There are a few reasons why this happens. One is that only seeking to confirm existing opinions helps limit mental resources we need to use to make decisions. It also helps protect self-esteem by making people feel that their beliefs are accurate.

People on two sides of an issue can listen to the same story and walk away with different interpretations, each feeling that it validates their existing point of view. This is often a sign that the confirmation bias is at work, "biasing" their opinions.

The problem with this is that it can lead to poor choices, an inability to listen to opposing views, or even contribute to othering people who hold different opinions.

Things we can do to reduce the impact of confirmation bias include being open to hearing others' opinions, actively seeking out opposing views, reading full articles rather than just headlines, questioning sources, and doing the research ourselves to check that a source is reliable.

The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are. It's also commonly referred to as the "I knew it all along" phenomenon.

Some examples of the hindsight bias include:

  • Insisting that you knew who was going to win a football game once the event is over
  • Believing that you knew all along that one political candidate was going to win an election
  • Saying that you knew you weren't going to win after losing a coin flip with a friend
  • Looking back on an exam and thinking that you knew the answers to the questions you missed
  • Believing you could have predicted which stocks would become profitable

Classic Research

In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court.

Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.  

The hindsight bias occurs for a combination of reasons, including our ability to "misremember" previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.

The effect of this bias is that it causes us to overestimate our ability to predict events. This can sometimes lead people to take unwise risks.

The anchoring bias is the tendency to be overly influenced by the first piece of information that we hear. Some examples of how this works:

  • The first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based.
  • Hearing a random number can influence estimates on completely unrelated topics.
  • Doctors can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments.

While the existence of the anchoring bias is well documented, its causes are still not fully understood. Some research suggests that the source of the anchor information may play a role. Other factors such as priming and mood also appear to have an influence.

Like other cognitive biases, anchoring can have an effect on the decisions you make each day. For instance, it can influence how much you are willing to pay for your home. However, it can sometimes lead to poor choices and make it more difficult for people to consider other factors that might also be important.

The misinformation effect is the tendency for memories to be heavily influenced by things that happened after the actual event itself. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.

For example:

  • Research has shown that simply asking questions about an event can change someone's memories of what happened.
  • Watching television coverage may change how people remember the event.
  • Hearing other people talk about a memory from their perspective may change your memory of what transpired.

Classic Memory Research

In one classic experiment by memory expert Elizabeth Loftus, people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”

When the witnesses were then questioned a week later whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.

There are a few factors that may play a role in this phenomenon. New information may get blended with older memories.   In other cases, new information may be used to fill in "gaps" in memory.

The effects of misinformation can range from the trivial to the much more serious. It might cause you to misremember something you thought happened at work, or it might lead a witness to identify the wrong suspect in a criminal case.

The actor-observer bias is the tendency to attribute our actions to external influences and other people's actions to internal ones. The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation.

When it comes to our own actions, we are often far too likely to attribute things to external influences. For example:

  • You might complain that you botched an important meeting because you had jet lag.
  • You might say you failed an exam because the teacher posed too many trick questions.

When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. For example:

  • A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag).
  • A fellow student bombed a test because they lack diligence and intelligence (and not because they took the same test as you with all those trick questions).

While there are many factors that may play a role, perspective plays a key role. When we are the actors in a situation, we are able to observe our own thoughts and behaviors. When it comes to other people, however, we cannot see what they are thinking. This means we focus on situational forces for ourselves, but guess at the internal characteristics that cause other people's actions.

The problem with this is that it often leads to misunderstandings. Each side of a situation is essentially blaming the other side rather than thinking about all of the variables that might be playing a role.

The false consensus effect is the tendency people have to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values. For example:

  • Thinking that other people share your opinion on controversial topics
  • Overestimating the number of people who are similar to you
  • Believing that the majority of people share your preferences

Researchers believe the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we come to regard our way of thinking as the majority opinion, even among people outside our circle of family and friends.

Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem . It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.

This can lead people not only to incorrectly think that everyone else agrees with them—it can sometimes lead them to overvalue their own opinions. It also means that we sometimes don't consider how other people might feel when making choices.

The halo effect is the tendency for an initial impression of a person to influence what we think of them overall. Also known as the "physical attractiveness stereotype" or the "what is beautiful is good" principle, the halo effect influences our judgments of others, and is used to influence them, almost every day. For example:

  • Thinking people who are good-looking are also smarter, kinder, and funnier than less attractive people
  • Believing that products marketed by attractive people are also more valuable
  • Thinking that a political candidate who is confident must also be intelligent and competent

One factor that may influence the halo effect is our tendency to want to be correct. If our initial impression of someone was positive, we want to look for proof that our assessment was accurate. It also helps people avoid experiencing cognitive dissonance , which involves holding contradictory beliefs.

This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.

The self-serving bias is the tendency for people to give themselves credit for successes while laying the blame for failures on outside causes. When you do well on a project, you probably assume that it's because you worked hard. But when things turn out badly, you are more likely to blame circumstances or bad luck.

Some examples of this:

  • Attributing good grades to being smart or studying hard
  • Believing your athletic performance is due to practice and hard work
  • Thinking you got the job because of your merits

The self-serving bias can be influenced by a variety of factors. Age and sex have been shown to play a part. Older people are more likely to take credit for their successes, while men are more likely to pin their failures on outside forces.  

This bias does serve an important role in protecting self-esteem. However, it can often also lead to faulty attributions such as blaming others for our own shortcomings.

The availability heuristic is the tendency to estimate the probability of something happening based on how many examples readily come to mind. Some examples of this:

  • After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are.
  • You might believe that plane crashes are more common than they really are because you can easily think of several examples.

It is essentially a mental shortcut designed to save us time when we are trying to determine risk. The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions.

Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking. In contrast, if you have two sisters and five neighbors who have had breast cancer, you might believe it is even more common than statistics suggest.

The optimism bias is a tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good.

For example, we may assume that negative events simply won't happen to us.

The optimism bias has roots in the availability heuristic. Because you can probably think of examples of bad things happening to other people, it seems more likely that others, rather than you, will be affected by negative events.

This bias can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt. The bad news is that research has found that this optimism bias is incredibly difficult to reduce.

There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals.

Other Kinds of Cognitive Bias

Many other cognitive biases can distort how we perceive the world. Just a partial list:

  • Status quo bias reflects a desire to keep things as they are.
  • Apophenia is the tendency to perceive patterns in random occurrences.
  • Framing is presenting a situation in a way that gives a certain impression.

Keep in Mind

The cognitive biases above are common, but this is only a sampling of the many biases that can affect your thinking. These biases collectively influence much of our thoughts and ultimately, decision making.

Many of these biases are inevitable. We simply don't have the time to evaluate every thought in every decision for the presence of any bias. Understanding these biases is very helpful in learning how they can lead us to poor decisions in life.

Dietrich D, Olson M. A demonstration of hindsight bias using the Thomas confirmation vote. Psychol Rep. 1993;72(2):377-378. doi:10.2466/pr0.1993.72.2.377

Lee KK. An indirect debiasing method: Priming a target attribute reduces judgmental biases in likelihood estimations. PLoS ONE. 2019;14(3):e0212609. doi:10.1371/journal.pone.0212609

Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: A systematic review. BMC Med Inform Decis Mak. 2016;16(1):138. doi:10.1186/s12911-016-0377-1

Furnham A, Boo HC. A literature review of anchoring bias. The Journal of Socio-Economics. 2011;40(1):35-42. doi:10.1016/j.socec.2010.10.008

Loftus EF. Leading questions and the eyewitness report. Cognitive Psychology. 1975;7(4):560-572. doi:10.1016/0010-0285(75)90023-7

Challies DM, Hunt M, Garry M, Harper DN. Whatever gave you that idea? False memories following equivalence training: a behavioral account of the misinformation effect. J Exp Anal Behav. 2011;96(3):343-362. doi:10.1901/jeab.2011.96-343

Miyamoto R, Kikuchi Y. Gender differences of brain activity in the conflicts based on implicit self-esteem. PLoS ONE. 2012;7(5):e37901. doi:10.1371/journal.pone.0037901

Weinstein ND, Klein WM. Resistance of personal risk perceptions to debiasing interventions. Health Psychol. 1995;14(2):132-140. doi:10.1037//0278-6133.14.2.132

Gratton G, Cooper P, Fabiani M, Carter CS, Karayanidis F. Dynamics of cognitive control: theoretical bases, paradigms, and a view for the future. Psychophysiology. 2018;55(3). doi:10.1111/psyp.13016


Richard H. Smith Ph.D.

One Way to Avoid the Confirmation Bias

Commit to generating and testing alternative narratives.

Posted May 23, 2024 | Reviewed by Jessica Schrader

  • Few people are immune to the confirmation bias.
  • Confirmation bias accounts for many instances of faulty thinking.
  • Our favored stories can shape our thinking more than the facts themselves.


The confirmation bias is the tendency to favor information confirming our attitudes and values.

Contemporary politics is a prime example, where the use of alternative “facts,” selective media exposure, and the balkanization of social interactions, etc., ensure that many of us easily fortify the apparent validity of our political leanings.

No wonder politics suffers from so much entrenched polarization.

An equal-opportunity bias, it cuts across domains of life and types of people.

I came across a particularly instructive example of this bias in a recent memoir by Robert Lefkowitz, the Nobel Prize-winning chemist and physician. He describes a transformative experience from his third year in medical school.

During clinical rounds, one of his fellow trainees presented what appeared to be a convincing diagnosis of a serious case of pulmonary fibrosis. The trainee had assembled the facts of the case into such a sensible narrative that Lefkowitz was “dumbfounded” when the attending physician, Mortimer Bader, said, “OK, good job. Now, Lefkowitz, I want you to use the same facts of the case, but tell me a different story.” (Lefkowitz, 2021, p. 19)

Lefkowitz had found the narrative highly convincing. Put on the spot, he could come up with no alternative story. None of the other trainees offered one either.

Bader then proceeded to tell a very different story. He used the same details of the case but wove an even more compelling narrative suggesting a much less life-threatening diagnosis: chronic asthma.

Interestingly, he did not alter the facts of the case. However, he gave some facts more weight and ordered the facts differently.

Bader summed up the lesson in a way that left Lefkowitz “awestruck”: “Many people think data tell a story, but nothing could be further from the truth. Data are just data. A story is something you impose on the data.” (Lefkowitz, 2021, p. 19)

What was the ultimate diagnosis? Additional tests were necessary to solve the problem.

Lefkowitz took this lesson to heart and rewired his approach to becoming a physician. He developed the conscious habit of generating and welcoming alternative narratives to apply to the facts of his clinical cases, and then examining carefully how the facts fit each of these narratives. He found this approach even more important in his research career, because the sheer amount of data available made it easier for any story to appear to fit the data.

What does Lefkowitz’s experience tell us?

We think that the facts drive explanations for the events happening around us. However, the favored stories we bring to the event, or those that quickly arise as we learn the facts, powerfully shape how we weight, order, and further select the facts. Perhaps even from the very start, we begin veering away from true understanding.

A solution is to make a conscious pledge to construct multiple provisional narratives. We make permanent friends with a personal devil’s advocate.

Then, as honest brokers, we see how the facts align with these alternative narratives.

The story we settle on will be closer to the truth we honor and seek.

Lefkowitz, R., (2021). A funny thing happened on the way to Stockholm . New York, NY: Pegasus Books.


Richard H. Smith, Ph.D. , a social psychologist and a writer of nonfiction and fiction, taught at the University of Kentucky.

IMAGES

  1. 17 Confirmation Bias Examples (2023)

    example of confirmation bias critical thinking

  2. Confirmation Bias: Definition, Signs, Overcoming

    example of confirmation bias critical thinking

  3. Understanding Confirmation Bias When Interviewing

    example of confirmation bias critical thinking

  4. How Can Confirmation Bias Be Avoided

    example of confirmation bias critical thinking

  5. Confirmation Bias

    example of confirmation bias critical thinking

  6. This infographic explains the 5 effective strategies to combat

    example of confirmation bias critical thinking

VIDEO

  1. Breaking the Bias: Understanding Confirmation Bias in Decision-Making

  2. Debunking Psychic Tricks: The Truth Behind Astrologers and Psychics

  3. Slippery Slope Fallacy Explained: Slippery Slope Fallacy Examples & Applications [+Studies, Stories]

  4. 02 Logical Hazards

  5. Confirmation Bias and Politics

  6. Every Cognitive Bias Explained in 6 minutes!

COMMENTS

  1. Confirmation Bias In Psychology: Definition & Examples

    Confirmation Bias is the tendency to look for information that supports, rather than rejects, one's preconceptions, typically by interpreting evidence to confirm existing beliefs while rejecting or ignoring any conflicting data (American Psychological Association). One of the early demonstrations of confirmation bias appeared in an experiment ...

  2. What Is Confirmation Bias?

    Confirmation bias is the tendency to seek out and prefer information that supports our preexisting beliefs. As a result, we tend to ignore any information that contradicts those beliefs. Confirmation bias is often unintentional but can still lead to poor decision-making in (psychology) research and in legal or real-life contexts.

  3. Confirmation Bias: Seeing What We Want to Believe

    Fascinating Confirmation Bias Examples. Confirmation bias is commonplace and typically has a low impact, yet there are times when it is significant and newsworthy (Eysenck & Keane, 2015; Lidén, 2023). ... Develop critical thinking skills that evaluate evidence and arguments objectively without favoring preconceived notions or desired outcomes.

  4. Confirmation Bias (Examples + Definition)

    Confirmation bias was "discovered" in 1960 by a psychologist named Peter Wason. He confirmed his theory with a simple experiment. He gave participants three numbers and asked them to figure out the "rule" for the three numbers. The example he gave was "2-4-6.". The rule behind his set of three numbers is that they had to be chosen ...

  5. Illustrating Confirmation Bias in Psychology: Real-Life Examples and

    Confirmation bias is a pervasive cognitive phenomenon that profoundly influences how we perceive, process, and interpret information, underscoring the importance of critical thinking, evidence evaluation, and intellectual humility in navigating the complexities of belief formation and decision-making.

  6. Confirmation Bias: Definition, Signs, Overcoming

    Confirmation bias is a type of cognitive bias that favors information that confirms your previously existing beliefs or biases. For example, imagine that Mary believes left-handed people are more creative than right-handed people. Whenever Mary encounters a left-handed, creative person, she will place greater importance on this "evidence ...

  7. Critical thinking

    Teaching bias and critical thinking skills. By following this step-by-step process, I believe we can talk about bias with our students and increase the chances of them incorporating critical thinking skills into their lives. 1) Choose a bias. Search for a list of biases and read the basic definitions. 2) Learn about it.

  8. Confirmation Bias

    Confirmation bias is likely to occur when we are gathering information for decision-making. It occurs subconsciously, meaning that we are unaware of its influence on our decision-making. As such, the first step to avoiding confirmation bias is being aware that it is a problem.

  9. Confirmation bias

    confirmation bias, people's tendency to process information by looking for, or interpreting, information that is consistent with their existing beliefs. This biased approach to decision making is largely unintentional, and it results in a person ignoring information that is inconsistent with their beliefs. These beliefs can include a person ...

  10. 2.2: Overcoming Cognitive Biases and Engaging in Critical Reflection

    Confirmation Bias. One of the most common cognitive biases is confirmation bias, which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs.Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality.

  11. Confirmation bias

    Confirmation bias is insuperable for most people, but they can manage it, for example, by education and training in critical thinking skills. Biased search for information, biased interpretation of this information, and biased memory recall, have been invoked to explain four specific effects:

  12. 17 Confirmation Bias Examples (2024)

    Examples of Confirmation Bias. 1. Optimistic People. Being optimistic is good for a person's mental health, to some extent. Seeing the positive side of everything can keep us in a good mood. But optimists also seem to have a talent for ignoring negative or unpleasant information. Being pessimistic is just the opposite.

  13. The Confirmation Bias: Why People See What They Want to See

    How the confirmation bias affects people. The confirmation bias promotes various problematic patterns of thinking, such as people's tendency to ignore information that contradicts their beliefs.It does so through several types of biased cognitive processes: Biased search for information.This means that the confirmation bias causes people to search for information that confirms their ...

  14. Confirmation Bias Definition and Examples

    Confirmation bias is the tendency to seek information confirming preexisting beliefs while ignoring information contradicting them. This bias can be particularly problematic when making important decisions, leading to flawed reasoning and inaccurate conclusions. It is a type of cognitive bias. Confirmation bias not only affects how we gather ...

  15. What Is the Function of Confirmation Bias?

    Confirmation bias is one of the most widely discussed epistemically problematic cognitions, challenging reliable belief formation and the correction of inaccurate views. Given its problematic nature, it remains unclear why the bias evolved and is still with us today. To offer an explanation, several philosophers and scientists have argued that the bias is in fact adaptive; the author critically discusses these arguments.

  16. Cognitive Bias Is the Loose Screw in Critical Thinking

    People cannot think critically unless they are aware of their cognitive biases, which can alter their perception of reality. Cognitive biases are mental shortcuts people take in order to process information more quickly.

  17. Confirmation Bias

    Here are some examples of confirmation bias that highlight its setbacks. Example 01: News and Media. You've probably come across WhatsApp forwards that are fake news in disguise. Sensationalist headlines and false claims often spread because of confirmation bias among readers. To counter this, practice critical thinking and analyze claims before accepting or sharing them.

  18. Confirmation Bias

    Definition. Confirmation bias is a cognitive bias where individuals favor information that confirms their preexisting beliefs or hypotheses while giving less consideration to alternative viewpoints. This bias manifests through selective search for evidence, interpretation of evidence, and memory recall, all skewed to support existing beliefs.

  20. Confirmation bias

    Prasad Aryasomayajula. Confirmation bias is the tendency to selectively search for and interpret information in a way that conforms with one's pre-existing beliefs or hypotheses. In other words, you interpret new information so that it becomes compatible with your existing views.

  21. Cognitive Bias List: 13 Common Types of Bias

    Although we like to believe that we're rational and logical, the fact is that we are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.

  22. 12 Common Biases That Affect How We Make Everyday Decisions

    This bias brings to light the importance of playing devil's advocate, as discussed in the previous post, "5 Tips for Critical Thinking." That is, we must overcome confirmation bias by deliberately entertaining viewpoints that oppose our own.

  23. One Way to Avoid the Confirmation Bias

    The confirmation bias is the tendency to favor information confirming our attitudes and values. Contemporary politics is a prime example, where alternative "facts" and selective media consumption reinforce pre-existing views.
