Belief bias refers to the tendency to assess information in a way that aligns with our existing views.
So, for example, when we come across a piece of political information that suits our ideology, we are more likely to believe that it is true. Likewise, we are prone to reject information that contradicts our beliefs.
Belief bias can take place in many contexts, from formal reasoning to everyday generalizations. It is affected by factors such as time and the nature of information, which we will discuss later. First, let us talk about the concept in more detail and look at some examples.
Definition of Belief Bias
Evans, Barston, and Pollard (1983) define belief bias as:
“the tendency to judge the validity of an argument based on the believability of its conclusion rather than the quality of its reasoning”.
In other words, we overlook the logic behind the argument and simply judge it based on its believability: if it goes with our beliefs, we accept it; if it does not, then we reject it.
Belief Bias and the Dual-Process Theory
Many psychologists explain belief bias through the dual-process theory. The theory states that humans have two cognitive systems for reasoning. The first (System 1) is an automatic response system, which is “intuitive” and “rapid”. The second (System 2) is a controlled response system, which is “analytic” and “slow” (Schneider & Shiffrin, 1977). Belief bias occurs when we rely on the intuitive judgment provided by System 1 instead of engaging System 2. Deeply held beliefs, whether political or religious, can further promote belief bias.
Belief Bias vs Confirmation Bias
Belief bias and confirmation bias are similar but not exactly the same concepts. In fact, they tend to occur at the same time because belief bias supports confirmation bias.
Here’s the difference.
- Confirmation bias refers to the tendency to filter out information that doesn’t support our desired conclusions.
- Belief bias refers to the tendency to make judgments about the validity of logical syllogisms (arguments) based on the believability of their conclusions rather than the logical structure of the argument itself.
The key difference is that belief bias involves judging the validity of the logic based on our expected conclusions, whereas confirmation bias involves judging the conclusion itself to suit our desires. Often, when engaging in confirmation bias, we may employ belief bias as a way to support our desired conclusion.
Examples of Belief Bias
- Politics: Supporters of a particular candidate may dismiss or downplay controversies surrounding that candidate because they expect that people from their political party wouldn’t behave that way. In other words, they decide that valid arguments are invalid because they simply don’t believe the conclusion is possible (e.g., refusing to accept that your candidate is corrupt even though the logic suggests they are).
- Religion: We tend to judge the validity of a theological argument based upon our expectation that logical arguments will confirm our pre-existing religious convictions. So, if a piece of evidence emerges that contradicts the writings of our holy text, we will try to twist the facts to make it fit, or, poke holes in the evidence until we can find a rationale that is consistent with our desire for our religion (or, for that matter, atheism) to be true.
- Syllogisms: Belief bias shows up most clearly in syllogisms, which are used to test logical reasoning. For example: “All birds can fly; crows can fly; therefore, crows are birds.” We know that crows are birds, but that belief makes us overlook the fact that the argument is logically unsound. The conclusion does not follow from the premises: just because both birds and crows can fly doesn’t mean that crows must be birds (insects fly too). Moreover, the first premise itself is false, since not all birds can fly (kiwis and penguins, for instance).
- Wason’s Selection Task: The Wason Selection Task is a logic puzzle created by Peter Cathcart Wason (1968), and it is a good demonstration of belief bias. It involves four cards, each of which has a number on one side and a color on the other. Participants are asked which cards they must turn over to test a rule: if a card shows an even number on one side, then its opposite side is blue. Most participants simply turn over the cards mentioned in the rule (the even number and the blue card). However, turning the blue card cannot falsify the rule and therefore provides no information; the informative choices are the even-number card and the card that is not blue.
- Evaluating Research: Belief bias can also exist in academic contexts, so even professional thinkers are not free from it. Kaplan (2009) conducted a qualitative study of undergraduate students who had taken an introductory statistics class. She found that students were more likely to question a study’s experimental design when they did not believe its conclusions. Therefore, Kaplan argued, teachers should be mindful of the example problems used in class and have discussions about believability.
- THOG Problem: This is another logic puzzle created by P. C. Wason, and it demonstrates how prone human thinking is to belief bias. The participant is shown four symbols: a black square, a white square, a black circle, and a white circle. The experimenter explains that he has secretly picked one color and one shape, and that a symbol possessing exactly one of those two properties is called a THOG. The black square, he adds, is a THOG. The participant then has to classify each of the other symbols: definitely a THOG, definitely not a THOG, or undecidable. (The correct answer is that the white circle is definitely a THOG, while the white square and black circle definitely are not.) The belief-driven errors people make here have been attributed to “memory cueing” (Evans, Newstead, & Byrne, 1993).
- Evans’ Study: Evans, Barston, and Pollard conducted a series of experiments in 1983 to see how participants evaluate logical validity. They found that people demonstrate belief bias: they tend to reject valid arguments with unbelievable conclusions while accepting invalid arguments with believable conclusions. Overall, believable conclusions were accepted far more often (around 80%) than unbelievable ones (around 33%), regardless of logical validity. The effect was strongest for invalid arguments: believability made the biggest difference precisely when the logic did not support the conclusion.
- Making Generalizations: Belief bias plays a big role in our tendency to generalize. Generalization is when we make inferences about a large group of things based on our understanding of a few of them. For example, if we have a few bad experiences with people from a country, we might conclude that its entire population is rude. Belief bias makes us generalize in a way that is consistent with our beliefs: we are likely to accept overgeneralizations based on social class or religion that align with our beliefs and reject those that go against them.
- Judging Extremeness: How we assess the extremeness of any statement is affected by belief bias. Lynn and Williams (1990) tested this with technical college students in two highly unionized communities. They found that when people encounter information that aligns with their views, they tend to see it as less extreme, even if it (objectively) makes very strong or controversial claims. At the same time, anything that contradicts their beliefs is seen as much more extreme than it is, irrespective of the actual arguments. This lets people easily dismiss anything that challenges their worldview by making such information seem far-fetched.
- Giving Social Attribution to Actions: In their research, Lynn and Williams (1990) also studied how belief bias impacts the social attribution of actions. Social attribution refers to the process of explaining the causes of another person’s action. When we observe someone, we try to figure out the internal (such as personality) or external (such as environment) causes motivating them. Lynn and Williams, in their study of labor unions, found that the highest level of belief bias occurs in giving social attribution.
- Ethical Questions: Belief bias can also be seen in ethical questions. For example, vegetarians and non-vegetarians will usually accept only the information that aligns with their views. Non-vegetarians are likely to insist that eating meat is natural while discounting the environmental impact of meat production and the suffering of animals. Vegetarians, in contrast, may refuse to accept that eating meat could ever be justified, even in life-or-death situations.
- Political Views: The most apparent manifestation of belief bias takes place in the political landscape. People are always more likely to believe political information that aligns with their existing beliefs, and at the same time, more likely to distrust or reject anything that goes against their beliefs. It is quite a significant problem because it makes people misjudge political events and policies, which may ultimately influence their vote. Moreover, belief bias, along with various contemporary phenomena (such as the filter bubble on social media) has been increasing political polarization (Adee, 2016).
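The syllogism example above can be checked mechanically. This is a minimal sketch (not from any of the cited studies): it searches tiny hypothetical "worlds" for a counterexample, an assignment of the properties bird, flies, and crow in which the premises hold but the conclusion fails. If such a world exists, the argument form is invalid no matter how believable its conclusion is.

```python
from itertools import product

# Tiny "worlds" of three entities, each with three boolean
# properties: (is_bird, can_fly, is_crow).
ENTITIES = range(3)

def valid(premises, conclusion):
    """An argument is valid iff no world makes every premise true
    while making the conclusion false."""
    for world in product([False, True], repeat=3 * len(ENTITIES)):
        props = [world[3 * i:3 * i + 3] for i in ENTITIES]
        if all(p(props) for p in premises) and not conclusion(props):
            return False  # counterexample found: the form is invalid
    return True

all_birds_fly = lambda ps: all(fly for bird, fly, crow in ps if bird)
all_crows_fly = lambda ps: all(fly for bird, fly, crow in ps if crow)
all_crows_are_birds = lambda ps: all(bird for bird, fly, crow in ps if crow)

# "All birds fly; crows fly; therefore crows are birds": invalid form
print(valid([all_birds_fly, all_crows_fly], all_crows_are_birds))  # False
# "All birds fly; crows are birds; therefore crows fly": valid form
print(valid([all_birds_fly, all_crows_are_birds], all_crows_fly))  # True
```

The counterexample the search finds is something like a flying non-bird crow-labeled entity: both premises stay true, yet the conclusion fails, which is exactly the gap that belief bias makes us overlook.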
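The selection task can likewise be brute-forced. In this sketch (the specific card faces are illustrative assumptions, not Wason's originals), a card is worth turning only if some hidden side could falsify the rule "if one side shows an even number, the other side is blue":

```python
# Four cards; one face visible, the other hidden.
# Visible faces: an even number, an odd number, a blue back, a red back.
visible = ["8", "3", "blue", "red"]

def can_falsify(card):
    """A card is informative only if some hidden side would violate
    the rule: 'if one side is even, the other side is blue'."""
    if card in ("8", "3"):                # hidden side is a color
        number = int(card)
        return any(number % 2 == 0 and color != "blue"
                   for color in ["blue", "red"])
    else:                                  # hidden side is a number
        return any(n % 2 == 0 and card != "blue"
                   for n in [2, 7])        # one even, one odd possibility

informative = [c for c in visible if can_falsify(c)]
print(informative)  # ['8', 'red']
```

The even card and the non-blue card are the only informative choices; the blue card, which people intuitively reach for, can never falsify the rule.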
Factors Influencing Belief Bias
Researchers have found that three factors mainly influence belief bias: time, the nature of the content, and age.
- Time: In 2005, Evans and Curtis-Holmes conducted research to study the impact of time on belief bias. They gave the same set of reasoning questions to two groups: one was allowed as much time as it wanted, while the other was given only seconds to respond. The time-pressured group showed far more belief bias; the lack of time pushed participants from logical to belief-based thinking.
- Content: The nature of the information that we are evaluating also affects belief bias. Goel and Vartanian (2011) found that belief bias was less likely in syllogisms with negative emotional content. They concluded that negative content makes us more careful in our judgments and prompts more detailed reasoning.
- Age: Our age also impacts belief bias. Children are more prone to rely on beliefs than young adults when it comes to reasoning. Older people are also affected more by belief bias than young adults (De Neys and Van Gelder, 2009).
Effects of Belief Bias
- Impaired Decision Making: If we have a bias only toward information that confirms our perspectives, we will end up making suboptimal decisions. We will end up ignoring equally or more valid evidence or more effective pathways to success. This can lead to poor choices and impaired decision-making in a variety of contexts, including personal, professional, and political decisions.
- Polarization of Viewpoints: Over time, belief bias can lead to the polarization of viewpoints. People will only consume and believe information that aligns with their existing views, which can move us further into extremes, social media bubbles, and radicalization. Contemplating alternative perspectives with an open mind and constantly challenging our biases are necessary to get closer to the truth and wisdom.
- Belief Perseverance: Belief bias can hinder learning and personal growth as it discourages individuals from considering alternative viewpoints or information. This can lead to a lack of understanding or knowledge in various areas and limit an individual’s ability to adapt and evolve their beliefs in the face of new information.
Belief bias refers to our tendency to evaluate information in line with our existing beliefs.
Instead of logic, we reason according to our beliefs, accepting whatever fits with our worldview. Belief bias can occur in all contexts, from everyday situations to academic studies. It is affected by factors like time, the nature of content, and age.
Adee, S. (2016). Burst the filter bubble. New Scientist.
De Neys, W., & Van Gelder, E. (2009). Logic and belief across the lifespan: the rise and fall of belief inhibition during syllogistic reasoning. Developmental science, 12(1), 123-130. doi: https://doi.org/10.1111/j.1467-7687.2008.00746.x
Evans, J. S. B., Barston, J. L., & Pollard, P. (1983). On the conflict between logic and belief in syllogistic reasoning. Memory & cognition, 11(3), 295-306. doi: https://doi.org/10.3758/BF03196976
Evans, J. S. B., Newstead, S. E., & Byrne, R. M. (1993). Human reasoning: The psychology of deduction. New York: Psychology Press.
Goel, V., & Vartanian, O. (2011). Negative emotions can attenuate the influence of beliefs on logical reasoning. Cognition and Emotion, 25(1), 121-131. doi: https://doi.org/10.1080/02699931003593942
Kaplan, J. J. (2009). Effect of belief bias on the development of undergraduate students’ reasoning about inference. Journal of Statistics Education, 17(1). doi: https://doi.org/10.1080/10691898.2009.11889501
Lynn, M. L., & Williams, R. N. (1990). Belief‐bias and labor unions: The effect of strong attitudes on reasoning. Journal of Organizational Behavior, 11(5), 335-343. doi: https://doi.org/10.1002/job.4030110502
Schneider, W., & Shiffrin, R. M. (1977). Controlled and automatic human information processing: I. Detection, search, and attention. Psychological review, 84(1), 1. doi: https://doi.org/10.1037/0033-295X.84.1.1
Wason, P. C. (1968). Reasoning about a rule. Quarterly journal of experimental psychology, 20(3), 273-281. doi: https://doi.org/10.1080/14640746808400161