Confirmation bias isn’t the same as lying – but it’s what makes you vulnerable to believing lies
We’ve spent five essays now building up this picture of how your brain really works. System 1 is fast and automatic but constantly wrong. System 2 is slow and accurate but lazy as hell. Your mind takes shortcuts to save energy. It builds coherent stories from incomplete information. It gets manipulated by subtle cues you don’t even notice.
But here’s where it gets really uncomfortable.
All of those flaws we’ve been discussing? They’re nothing compared to what happens when your ego gets involved.
Stuart Hanscomb’s “Critical Thinking: The Basics” lays this out with brutal clarity. The single biggest obstacle to clear thinking isn’t that your brain is lazy or that it takes shortcuts. It’s that your brain is desperately committed to proving it was right all along.
This is confirmation bias. And it’s probably sabotaging you right now.
Here’s how it works: When you encounter new information, your brain doesn’t evaluate it objectively. It evaluates it based on whether it confirms what you already believe. Information that supports your existing views gets accepted quickly, often without much scrutiny. Information that contradicts your views gets subjected to intense skepticism – or just ignored entirely.
You’ve seen this play out – someone shares a study that confirms their position, and suddenly they’re all about “trusting the science.” The same person sees a study that contradicts their position, and suddenly they’re pointing out methodological flaws and funding sources.
But here’s the thing – you don’t notice when you’re doing it.
Hanscomb explains that confirmation bias operates below conscious awareness. You’re not deliberately seeking out only the information that supports you. You genuinely believe you’re being objective. Your brain is just quietly filtering reality so that you only see the parts that confirm you were right.
Let’s say you believe renewable energy is the solution to climate change. When you encounter evidence supporting this view – studies showing solar costs dropping, data on job creation in green energy – your brain processes it smoothly. It feels true. It gets filed away as confirmation of what you already knew.
But when you encounter evidence complicating this view – data on grid reliability challenges, information about rare earth mining for batteries, studies on intermittency problems – your brain immediately starts working overtime. Are these studies reliable? Who funded them? What’s the methodology? Is there a political bias?
And look – those are actually good questions to ask about any study. The problem is that you’re only asking them about the studies that disagree with you.
This is what makes confirmation bias so insidious. You can be intelligent, educated, committed to truth – and still completely trapped inside your own perspective. Your System 1 brain has already decided what’s true based on your existing beliefs.
System 2 isn’t checking System 1’s work. It’s building elaborate justifications for why System 1 was right.
Hanscomb points out something crucial here. Confirmation bias isn’t just about what information you seek out. It’s about how you interpret information once you have it. Two people can look at the exact same data and come away with completely opposite conclusions – both absolutely convinced that the data supports their position.
Remember that coherence trap we talked about in Essay 4?
How your brain constructs the most coherent story possible from available information? Confirmation bias is what happens when your brain decides that the most coherent story must be the one where you were right all along.
This shows up everywhere, not just in politics. You think your relationship is solid, so you interpret your partner’s late nights at work as dedication to their career, not as a warning sign. You think your relationship is failing, so you interpret the same late nights as evidence of growing distance. Same behavior. Completely different interpretations based on what you already believe.
Here’s what makes this particularly brutal. Confirmation bias gets stronger when you’re smart.
Hanscomb cites research showing that people with higher cognitive abilities are actually better at confirmation bias. They’re better at constructing elaborate justifications for their existing beliefs. They’re better at poking holes in contradictory evidence. Intelligence doesn’t protect you from bias – it gives you better tools to defend your biases.
Now here’s where I need to be really clear about something.
Confirmation bias affects everyone. Liberal, conservative, doesn’t matter. Your brain structure doesn’t change based on your politics. We’re all vulnerable to filtering information through the lens of what we already believe.
But – and this is critical – confirmation bias is not the same thing as deliberate lying.
When someone genuinely evaluates evidence but interprets it through their existing worldview, that’s confirmation bias. When someone fabricates evidence, deliberately misrepresents data, or spreads information they know to be false, that’s something else entirely. That’s bad faith.
The Trump administration has repeatedly made claims that are objectively, verifiably false. Not “interpreted differently” or “seen through a different lens” – just factually wrong. Claiming immigrants are eating pets when that never happened. Denying election results that were verified by dozens of courts and recounts. Spreading conspiracy theories with zero supporting evidence.
That’s not confirmation bias. That’s lying.
And here’s why this distinction matters for critical thinking. Confirmation bias makes you vulnerable to believing lies that confirm your worldview. If you already distrust immigrants, you’re more likely to believe false stories about immigrant crime. If you already think elections are rigged, you’re more likely to accept baseless claims of fraud.
Your confirmation bias creates the opening. Bad faith actors exploit it.
This is where all those earlier essays come together. Your lazy System 2 doesn’t want to do the hard work of fact-checking claims that feel right (Essay 2). So it uses heuristics to quickly categorize information as trustworthy or suspicious based on whether it confirms your beliefs (Essay 3). Information that confirms your worldview creates cognitive ease – it feels true (Essay 4). You’ve been primed by your media environment to accept certain narratives and reject others (Essay 5). And now confirmation bias ties it all together into a neat package where questioning your own side feels like betrayal.
This is exactly how propaganda works. It doesn’t need to convince you of new ideas. It just needs to confirm what you already suspect. Once confirmation bias kicks in, your brain does the rest of the work.
The thing is, you can’t eliminate confirmation bias. It’s too deeply embedded in how your brain works. But you can recognize it. You can develop habits that work against it.
Hanscomb suggests a simple practice: actively seek out the strongest arguments against your position. Not the strawman versions. Not the weakest defenders. The actual best case the other side can make.
This is hard. It requires engaging System 2 when every instinct is telling you to dismiss contradictory information. It requires intellectual humility – the willingness to consider that you might be wrong. It requires treating your own beliefs with the same skepticism you direct at opposing views.
Most people won’t do this. It’s too uncomfortable. It requires too much mental effort. It threatens the coherent worldview System 1 has constructed.
But here’s what I’m asking you to consider.
If you never seriously engage with the best arguments against your position, how do you know you’re right? If you only consume media that confirms what you believe, how are you different from the people you think are brainwashed? If you dismiss every contradictory piece of evidence without genuine examination, are you thinking critically or just protecting your ego?
Critical thinking requires you to be harder on your own beliefs than you are on anyone else’s. It requires recognizing that the information that feels most obviously true might be the information you most need to question.
Because confirmation bias doesn’t just trap you in your existing beliefs. It makes you an easy mark for anyone who knows how to tell you what you want to hear.
The politicians who lie to you aren’t stupid. They understand confirmation bias. They know that if they tell you something that confirms your fears or validates your anger, your brain will accept it without scrutiny. They know that once they’ve got you nodding along, you’ll defend their lies more vigorously than they will.
Your confirmation bias becomes their weapon.
So yes, cognitive biases affect everyone. We’re all vulnerable to filtering information through our existing beliefs. But that doesn’t mean all claims are equally valid. It doesn’t mean truth is just a matter of perspective. It means we all need to work twice as hard to fact-check information that confirms what we already think.
Because that’s where we’re most vulnerable. That’s where the lies slip through.
The uncomfortable truth is this: you are not naturally objective. Your brain is designed to protect your existing beliefs, not to pursue truth. And until you accept that, until you start questioning yourself as aggressively as you question everyone else, you’re just another person in the crowd believing whatever feels right and wondering why everyone else is so damn wrong.
They’re not more biased than you. They’re just confirming different beliefs.
The question is whether you’re willing to do the work to break out of your own echo chamber. Or whether you’re going to keep pretending that your particular bubble just happens to be the one that’s got everything right.
Okay – we’ve gotten to the point now where the majority of people who were interested in this series have fallen off. Because really digging in and thinking is hard work. Who has time for this shit?
But let me make the argument that this is like putting on your own air mask before tending to others. Everyone has an opinion about what other people should work on to solve the cultural and political problems we face as a country and as a world. But the only person you actually have any control over is yourself. So if you’re serious about change, you start there.
And yes – I get the irony of that statement.
Okay – good talk.