Confirmation Bias Is Influencing Every Decision You Make—Including Who You Will Vote For
With the U.S. Presidential election quickly approaching, confirmation bias is playing a starring role in how we process information, perceive candidates, and ultimately decide whom to support. This cognitive bias not only influences individual voting choices but also impacts the broader democratic process. In this article, we delve into confirmation bias—its benefits, how to spot it in our own thinking, and actionable strategies to mitigate its negative effects.
While confirmation bias is just one of many cognitive biases hosting a wild party in our brains, its pervasive and often subtle influence makes it a critical focus for informed and thoughtful voting. Fear not! We’ll tackle other cognitive culprits in future installments.
Understanding Confirmation Bias
Confirmation bias is the tendency to search for, interpret, favor, and recall information that confirms our preexisting beliefs or hypotheses. This sneaky bias leads individuals to give more weight to supporting evidence while ignoring contradictory information, often resulting in lopsided judgments and decisions.
Perhaps sharing a personal example of confirmation bias at play will highlight how critical identifying this thinking trap can be.
Once, when I was living in the Philippines as a missionary, I was offered Soup #5. Soup #5 is made from bull penis or testes, often along with all the other spare parts of the cow that the butcher can’t sell. It typically includes eyeballs, anus, tongue, and hooves as well.
My stomach turned a bit when I learned what sat in front of me. I eat some wild and disgusting things, but this soup is likely the most repugnant dish I’ve ever been served in good faith.
My preconditioned experience led me to fully believe that these parts of the cow were only edible when processed into hot dogs. My confirmation bias reared up and urged me to avoid the dish, but local culture dictated that I eat the whole bowl; failing to finish would insult the host.
I definitely didn’t want to offend, so I started choking it down. The flavor wasn’t too bad … just beef stew, right? Until you actually get a chunk of the beef. With each bite, my mind was screaming at me to stop, but logic told me everything in my bowl was perfectly edible.
I eventually got through the whole bowl and can tell you most of it actually wasn’t that bad, except for the eyeball. DO NOT EAT COW EYEBALL!
I share this story because it’s a subtle example of confirmation bias. My cultural background trained me to think these cow parts aren’t food, but they are perfectly good food, and I’m sure that if I had eaten them from childhood on, I’d quite enjoy them.
The Evolutionary Advantages of Confirmation Bias
From an evolutionary perspective, confirmation bias acted as a mental shortcut, enabling our ancestors to make swift decisions crucial for survival in complex—often perilous—environments. Here’s how it helped back in the day:
- Cognitive Efficiency and Speed: Early humans needed to make quick decisions, usually without the luxury of exhaustive information processing or empirical evidence. By filtering out irrelevant or contradictory data, confirmation bias allowed for rapid judgments, even with limited inputs. It was like the original fast food for thought.
- Survival through Consistency: Consistent beliefs provided reliable patterns for interacting with both the environment and other people. Favoring known information helped reduce exposure to uncertain, potentially harmful situations.
- Social Cohesion and Group Dynamics: Shared beliefs strengthened group cohesion, essential for cooperative survival strategies. Reinforcing group norms minimized internal conflicts, fostering harmonious (enough) relationships.
- Emotional Comfort and Mental Health: Maintaining consistent beliefs reduced mental discomfort from contradictory information and cognitive dissonance. Consistency was king in managing anxiety and stress in unpredictable environments.
- Reinforcement Learning: Confirmation bias reinforced successful strategies by validating effective behaviors. Building upon established knowledge frameworks helped incorporate new, aligned information more rapidly.
Why We Rely on Confirmation Bias Today
Despite living in environments far removed from those of our ancestors (thank goodness for central heating and Wi-Fi), confirmation bias remains prevalent due to several factors:
- Information Overload in the Modern World: The sheer volume of news from media outlets and social networks is overwhelming. Prioritizing information that aligns with our beliefs is like choosing our comfort food in a buffet of chaos.
- Social Media and Echo Chambers: Algorithms curate content tailored to our preferences, reinforcing confirmation bias by limiting exposure to diverse perspectives. Our feed’s feedback loop might be more comforting than coffee on Monday mornings.
- Complex Decision-Making Scenarios: Modern life is riddled with complex decisions, encouraging the use of heuristics like confirmation bias. Without such shortcuts, we might find ourselves paralyzed, unable to choose the right cereal at the grocery store, let alone a presidential candidate.
- Cultural and Educational Systems: Educational and cultural institutions often reinforce prevailing beliefs, making confirmation bias a natural extension of societal learning processes.
Mitigating Confirmation Bias: Strategies for Better Informed Voting
Knowledge is power—and so is a good strategy. By becoming aware of confirmation bias, voters can take deliberate steps to mitigate its effects. Here are actionable strategies and guiding principles for each approach:
1. Seek Diverse Perspectives
The primary challenge in overcoming confirmation bias is getting stuck in limited perspectives. To overcome this, we need to expand our horizons, approach everything with curiosity, and avoid false certainty. We can be confident in our opinions as long as we recognize they’re only opinions, and that other people hold equally valid and meaningful opinions of their own.
Actionable Steps:
- Diversify Media Consumption: Subscribe to news outlets with different political leanings and follow commentators from various backgrounds. The Interactive Media Bias Chart from Ad Fontes Media can help you find less-biased or balanced sources of news.
- Engage in Cross-Political Conversations: Discuss opinions with individuals holding different political views and participate in diverse forums. Be sure to approach these conversations with the objective of unveiling your biased thinking patterns, not with the objective of winning an argument or convincing someone else that your biased thinking is correct.
- Attend Public Debates and Forums: Watch debates between candidates from different parties to gain a broader understanding of issues. Don’t look for ideas that confirm what you already think, but instead look for ideas that you need to spend more time researching and understanding.
- Read Widely on Policy Issues: Explore materials presenting multiple sides of key election topics. If you read an article in favor of a policy, deliberately seek out another article with a counterperspective.
2. Practice Critical Thinking
Critical thinking isn’t nearly as common as some may expect. Even people trained in reasoning and analysis fall into emotional and reactive thinking. It’s human nature to do so. This means we have to consciously bypass these emotional reactions. The main method for doing so is methodically processing information with a focus on facts, evidence, and a healthy dose of skepticism.
Actionable Steps:
- Question the Source: Assess the credibility and potential biases of information sources. Just because a source purports data as fact doesn’t make it so. Verifying the veracity of a data source helps us avoid falling prey to false claims.
- Evaluate Evidence: Scrutinize the quality and reliability of evidence supporting claims. Even after verifying the credibility of a data source, data can easily be manipulated. Evaluating the data critically allows you to see where someone may be trying to mislead readers or misrepresent the truth.
- Identify Logical Fallacies: Learn to recognize and avoid flawed reasoning. Circular arguments, red herrings, straw man arguments, or false dichotomies are commonly used to mislead gullible readers.
- Use Analytical Frameworks: Apply methods like SWOT analysis to evaluate policies and candidates comprehensively. Frameworks provide balanced models to compare evidence between two candidates or perspectives.
3. Question Assumptions
We all make assumptions, and we all know what assumptions can lead to. When we tell ourselves a story, we fill in the blanks where we don’t know the facts. This is functional and okay, but we have to recognize it for what it is and stay humble and open to new information that may force us to change our entire story. Being rigid in our assumptions feeds confirmation bias.
Actionable Steps:
- Self-Reflection Practices: Keep a journal to analyze the thoughts and underlying beliefs surrounding important decisions. Being honest about the emotions and motives tied to these decisions helps us assess the stories we tell ourselves.
- Identify Personal Biases: Acknowledge and reflect on personal biases affecting judgment. Most of us have some type of implicit bias that impacts our judgment. We may not recognize that we’re biased toward people like us and against people unlike us, but research shows that to be true.
- Devil’s Advocate Approach: Argue against your own beliefs to test their robustness. Sometimes being antagonistic to our conclusions helps us recognize the underlying motive and belief driving a decision.
- Mind Mapping: Visually represent beliefs and assumptions to better understand and challenge them. For visual thinkers, mapping can make it easier to see the forest for the trees.
4. Encourage Open Dialogue
Discussing difficult topics with people who have different views than us is a skill. Improving your ability to engage in these conversations can yield huge rewards in your ability to understand and connect with others.
Actionable Steps:
- Create Safe Spaces for Discussion: Establish environments where diverse opinions can be expressed without fear. If people are defensive or don’t feel safe discussing a topic, they won’t bring their authentic feelings or be willing to listen. They’ll likely dig into a defensive posture, which isn’t constructive.
- Practice Active Listening: Focus on understanding others’ viewpoints rather than preparing to counter them. Listening is the first step toward understanding and breaking through our own flawed or limited perspectives.
- Facilitate Constructive Debates: Organize structured debates to explore multiple sides of issues. Helping everyone involved arrive prepared and on equal footing will help everyone feel safe, valued, and respected.
- Promote Empathy: Strive to understand the emotions and motivations behind others’ beliefs. Asking probing questions can reveal motives at layers deeper than the initial surface level.
5. Use Structured Decision-Making Processes
Using tools to visualize a decision and the criteria most important to it can help alleviate unconscious bias, and sometimes such tools clearly reveal where our biases lie. The danger is that these exercises won’t always resolve your biases on their own, so you still have to take deliberate steps to reduce them.
Actionable Steps:
- Pros and Cons Lists: List the advantages and disadvantages of different candidates or policies. Writing out a list like this is prone to confirmation bias if you’re not objective, so be thoughtful as you list items.
- Decision Matrices: Use weighted criteria to evaluate options objectively. Again, take precautions against unfairly weighting in favor of your bias.
- Checklists: Develop checklists of key issues and questions to address when evaluating candidates or decisions.
- Scenario Analysis: Explore various future scenarios based on different voting choices. This one is tricky with elections because candidates almost always make promises they won’t be able to fully keep, so look more generally at their preferences and judge accordingly.
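A decision matrix like the one described above can be sketched in a few lines of code. This is a minimal illustration, not a real scoring of any candidate: the criteria, weights, and scores below are invented placeholders you would replace with your own.

```python
# Minimal sketch of a weighted decision matrix.
# All criteria, weights, and scores are hypothetical placeholders.

def decision_matrix(options, weights):
    """Return each option's weighted total, given per-criterion scores."""
    totals = {}
    for name, scores in options.items():
        totals[name] = sum(weights[c] * s for c, s in scores.items())
    return totals

# Weights should sum to 1; scores here use a 1-10 scale.
weights = {"economy": 0.40, "healthcare": 0.35, "environment": 0.25}
options = {
    "Candidate A": {"economy": 7, "healthcare": 5, "environment": 8},
    "Candidate B": {"economy": 6, "healthcare": 8, "environment": 6},
}

totals = decision_matrix(options, weights)
for name, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {total:.2f}")
```

One useful exercise against bias: fill in the scores first, then set the weights separately (or have someone else set them), so you aren’t tuning the weights until your preferred option wins.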
6. Implement Reflective Practices
Cultivate the habit of continually staying mindful and aware of cognitive biases. Mindfulness practices that create space between you and your thoughts (you are not your thoughts) help us break free from bias.
Actionable Steps:
- Mindfulness Meditation: Practice mindfulness to increase awareness and reduce biased reactions.
- Journaling: Document voting decisions, thoughts, and feelings to uncover biases.
- Self-Assessment Tools: Use assessments to identify personal cognitive biases.
- Feedback Loops: Seek feedback from trusted individuals on your decision-making processes.
7. Educate Yourself Continuously
Without continual learning, we tend to rely on outdated, misinterpreted, and poorly remembered details. Memory is filtered through many biases and has been shown to be highly unreliable. Continually educating ourselves on relevant topics keeps our understanding fresh.
Actionable Steps:
- Enroll in Workshops and Courses: Participate in courses on critical thinking and cognitive psychology or other areas that interest you.
- Read Widely and Deeply: Engage with materials on cognitive biases and fallacious thinking.
- Stay Updated on Current Events: Follow multiple reputable news sources for a well-rounded perspective on relevant topics such as the presidential election.
- Engage in Lifelong Learning: Continuously seek to expand knowledge beyond comfort zones.
Now go vote without bias
Understanding and addressing confirmation bias isn’t just an academic exercise—it has real-world implications that affect the very foundation of our democratic processes. By committing to recognize and mitigate confirmation bias, we empower ourselves to vote more thoughtfully, engage more meaningfully in political discourse, and contribute to a more resilient and unified society. Let us strive to navigate the complexities of the electoral landscape with clarity, empathy, and an unwavering commitment to truth and fairness.
And beyond voting, mitigating confirmation bias will also help us make all our decisions more effectively, including picking out cereal.
References
- Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140. DOI:10.1016/0749-5978(85)90049-4
- Dunning, D., & Kruger, J. (1999). Unskilled and unaware of it: How difficulties in recognizing one’s own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121–1134. DOI:10.1037/0022-3514.77.6.1121
- Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford University Press.
- Fischhoff, B. (1975). Hindsight ≠ foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1(3), 288–299. DOI:10.1037/0096-1523.1.3.288
- Janis, I. L. (1972). Victims of Groupthink. Houghton Mifflin.
- Mezulis, A. H., Abramson, L. Y., Hyde, J. S., & Hankin, B. L. (2004). Are positive illusions healthy? A test of the self-serving bias and relationship with psychological well-being. Journal of Personality and Social Psychology, 86(2), 308–328. DOI:10.1037/0022-3514.86.2.308
- Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2(2), 175–220. DOI:10.1037/1089-2680.2.2.175
- Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. Advances in Experimental Social Psychology, 10, 173–220. DOI:10.1016/S0065-2601(08)60051-9
- Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232. DOI:10.1016/0010-0285(73)90033-9
- Tversky, A., & Kahneman, D. (1974). Judgment under Uncertainty: Heuristics and Biases. Science, 185(4157), 1124–1131. DOI:10.1126/science.185.4157.1124