Why your brain’s quest for easy answers is making you believe lies
You know that feeling when something just clicks? When an explanation sounds right, feels right, fits perfectly with what you already think you know? That warm glow of recognition, that sense of “ah yes, obviously”?
Don’t trust it.
Here’s what I mean. We’ve already established that your brain runs two systems. System 1 is fast, automatic, always on. System 2 is slow, effortful, and lazy as hell. We’ve talked about how System 2 avoids work whenever possible, and how System 1 uses mental shortcuts to make snap judgments. But there’s a missing piece to this puzzle, and it’s the reason why smart people believe stupid things.
It’s called cognitive ease.
Timothy Wilson explores this in “Strangers to Ourselves,” his deep dive into how the adaptive unconscious – basically System 1 – runs most of our mental life without us having any clue what it’s doing. The book lays out a fundamental problem. We think we know why we believe what we believe. We think we have access to our own reasoning. But it turns out we actually don’t.
Cognitive ease is the feeling of familiarity, of fluency, of things making sense. When information comes to you easily, when it fits smoothly into your existing worldview, when the words flow and the logic seems sound, your brain interprets that ease as truth.
Here’s the trap. System 1 is designed to construct the most coherent story possible from whatever information it has available. It’s not interested in truth. It’s interested in coherence. It takes the facts at hand, fills in the gaps with assumptions, and delivers you a neat, tidy narrative that feels complete.
Wilson calls this the “adaptive unconscious” and here’s what makes his work different. He’s not just saying System 1 is fast. He’s saying it’s a storyteller. It takes whatever scraps of information are available and weaves them into a narrative that feels complete. The coherence isn’t coming from the quality of the evidence. It’s coming from how your unconscious brain arranges the pieces.
Think about it this way. When you meet someone new and instantly decide you like them or don’t, your conscious mind immediately starts generating reasons. They remind you of someone. They have a trustworthy face. They said something funny. But Wilson’s research shows those aren’t the real reasons – they’re the story your brain invented after the fact to explain a judgment that already happened unconsciously.
Your brain likes to close loops and it takes shortcuts to expend as little energy as possible. When you decide you like someone it’s easier for your brain to come up with the reason “she was nice” rather than the more complex reason “she smelled like vanilla, spoke at a cadence that matched my resting heart rate, and gestured in a way that my unconscious mind catalogued as non-threatening based on 10,000 prior micro-interactions I can’t remember.”
That example may not seem like a big deal. But cognitive ease becomes dangerous when you encounter information that contradicts your existing beliefs. That contradiction creates cognitive strain. Processing the information requires effort. It doesn’t fit smoothly into your worldview. It feels wrong, even if it’s true. Your brain has to work harder, and remember, System 2 hates working hard.
So what does your lazy System 2 do?
It lets System 1 handle it. And System 1 does what it does best. It rejects the uncomfortable information, sticks with the coherent story it’s already built, and floods you with that warm feeling of – wait for it – cognitive ease. Everything still makes sense. No need to think harder.
Wilson points out something crucial about this situation. We’re strangers to ourselves because we can’t actually observe our own unconscious processes. You can’t watch System 1 working. You only get the output. It presents you with a feeling, a judgment, an intuition, and you mistake that for reasoning.
You think you carefully considered the evidence. You didn’t. Your unconscious did its thing, and now you’re just confabulating an explanation after the fact.
Here’s where this connects to what I’ve previously written: The Law of Least Effort from Essay 2 explains why your brain prefers cognitive ease. Thinking is expensive. Your brain is looking for any excuse to avoid it. The heuristics from Essay 3 are the shortcuts System 1 uses to create that coherent story quickly. And cognitive ease is the reward system that keeps you trapped in this cycle.
When something FEELS true, when it flows easily, when it confirms what you already believe, you stop questioning. You stop thinking. System 2 stays on the couch. Meanwhile, System 1 is building a coherent narrative that might be completely disconnected from reality.
In the book, Wilson describes studies where people make decisions based on unconscious factors, then confidently explain their reasoning in ways that have nothing to do with what actually influenced them. You ask someone why they chose a particular product. They give you rational reasons. But the real reason was something they never noticed. Like where it was positioned on a shelf. Or what music was playing in the store.
I see people do this constantly. They construct explanations for their beliefs that sound reasonable, but those explanations never touch the factors that actually produced the belief.
This is the coherence trap. Your brain is so good at creating stories that make sense, it’s hard to tell the difference between a true story and a merely coherent one. Cognitive ease feels the same whether the information is accurate or not.
Think about how this plays out in politics. You hear a claim that fits your existing political worldview. It feels right. It’s easy to process. It confirms what you already believe. Cognitive ease kicks in, and boom, you accept it as true. Meanwhile, contradictory evidence feels wrong, requires effort to process, creates cognitive strain. So you reject it.
The scariest part is you think you’re being rational. You think you carefully evaluated the evidence. You can give reasons for your beliefs. But Wilson’s research shows those reasons are often just stories your conscious mind invented to explain what your unconscious mind already decided.
We’re walking around with this incredibly powerful unconscious system making snap judgments, constructing narratives, deciding what feels true. And we have almost no direct access to how it works. We’re strangers to our own minds.
This is why critical thinking is so hard. You’re not just fighting external misinformation. You’re fighting your own brain’s preference for coherent stories over true ones. You’re fighting the seductive feeling of cognitive ease. You’re fighting an unconscious system that’s been running the show since before you could talk.
Now pay attention to this part.
The first step is recognizing that the feeling of certainty means nothing. That warm glow of “this makes sense” is not evidence. Cognitive ease is not truth. Your brain’s ability to construct a coherent narrative does not mean that narrative is accurate.
Wilson argues we need intellectual humility precisely because we’re strangers to ourselves. We need to question our own reasoning, especially when it feels effortless. We need to be suspicious of information that’s too easy to process, too comfortable, too confirming.
Because here’s the thing. System 1 is going to keep doing its job. It’s going to keep seeking coherence, keep creating stories, keep flooding you with feelings of ease when things fit your worldview. You can’t turn it off.
What you can do is recognize when it’s happening. When something feels obviously true, that’s your cue to think harder, not less. When information slides smoothly into your existing beliefs, that’s when you need to engage System 2.
The coherence trap is always there. Your brain is always preferring easy stories over hard truths. But once you know the trap exists, you at least have a chance of avoiding it.
But how?
Here’s some practical advice: When you feel certain, that’s when you should be most skeptical.
Start by noticing the feeling itself. That warm glow when a headline confirms what you already think. That instant sense of “obviously” when someone explains why your political opponents are wrong. That comfortable click when new information slots perfectly into your existing worldview. These feelings aren’t insights. They’re warning signs.
Ask yourself: Why does this feel so right? Is it because the evidence is actually strong, or because it’s easy to process? Would I believe this if it contradicted what I wanted to be true? Can I even imagine evidence that would change my mind on this?
Wilson suggests treating your own certainty like a friend’s bad relationship. When your friend is absolutely sure their terrible partner is actually great, you don’t trust that certainty. You look at the evidence. Do the same with your own beliefs. The more obvious something feels, the harder you need to look at why.
This doesn’t mean becoming paralyzed by doubt. It means recognizing that cognitive ease is a feature of your brain’s architecture, not a truth detector. When System 1 hands you a neat, comfortable story, that’s your cue to wake up System 2. Make it do some actual work.
For me, when someone claims something is “common sense,” I automatically start questioning it. Because there’s really no such thing as common sense.
Common sense is just cognitive ease wearing a disguise.
When something feels like “common sense,” what you’re really experiencing is information that matches your existing beliefs, your cultural assumptions, and the patterns your brain has already learned. It feels universal because everyone in your bubble processed the same patterns. But common sense in rural Montana isn’t common sense in downtown Brooklyn. Common sense in 1950 isn’t common sense now. There’s no universal logic everyone naturally shares – there’s just coherent stories we mistake for objective truth because they feel so obvious to us.
Okay. Good talk.