Mental Shortcuts: How Heuristics Shape Our Worldview (Essay #3)

Your brain’s efficiency hacks are systematically distorting your political judgment

Okay, so remember from the last essay, where we talked about how your brain is basically lazy? Using System 2 – the actual thinking system – burns calories, takes effort, and your mind would rather do literally anything else. So here’s what happens: your brain develops workarounds.

These workarounds are called heuristics. They’re mental shortcuts that let System 1 make quick judgments without dragging your lazy System 2 in to actually think things through. Most of the time, these shortcuts work fine. You don’t need to deliberate about whether that thing moving toward you is a car. You just get out of the way.

The problem is that heuristics don’t just help you avoid getting hit by cars. They shape how you understand politics, evaluate evidence, and form opinions about complex issues that actually require thought.

Let me show you how this works in practice.

Rolf Dobelli catalogs these mental shortcuts in The Art of Thinking Clearly, and one pattern emerges over and over: our brains trade accuracy for speed. One of the most powerful shortcuts is called the availability heuristic. The basic formula is simple: your brain judges how likely or important something is based on how easily examples come to mind.

If you can quickly think of instances of something happening, your brain assumes it happens a lot. If you struggle to think of examples, your brain assumes it’s rare.

Sounds reasonable, right?

Here’s the problem: what comes to mind easily isn’t a reliable guide to what’s actually common or important. What comes to mind depends on what’s memorable, dramatic, or recent. And that’s where things go sideways.

Think about how people perceive risk. After a plane crash gets wall-to-wall news coverage, people get nervous about flying. Meanwhile, they’ll hop in their cars without a second thought, even though you’re way more likely to die driving to the airport than on the actual flight. But car accidents don’t make national news unless something truly bizarre happens. They’re not memorable. So your availability heuristic tells you cars are safe and planes are dangerous, which is the exact opposite of reality.

This isn’t just about planes and cars. The availability heuristic shapes political opinions constantly. You probably know someone who’s convinced that violent crime is skyrocketing because they can easily recall news stories about shootings or assaults. Never mind that violent crime has actually dropped significantly over the past few decades in most American cities. The stories are vivid, they’re repeated, and they’re easy to remember. So the heuristic kicks in and tells people the world is more dangerous than it actually is.

Or consider how people evaluate political corruption. If one party’s scandals get more media attention, those examples become more available in memory. People then conclude that party must be more corrupt, even if both parties have similar rates of ethical violations. The availability heuristic doesn’t care about base rates or statistical reality. It cares about what your memory can grab quickly.

Here’s where this connects back to the Law of Least Effort from the last essay. Your brain uses these shortcuts precisely because thinking is expensive. It would take real work to look up crime statistics, compare corruption rates across parties, or calculate actual risk probabilities. System 2 could do that analysis, but System 2 is lazy and doesn’t want to work that hard.

So System 1 steps in with its heuristic: “Can I easily think of examples? Yes? Then it must be common.” Done. No effort required.
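If you want to see how badly that rule can misfire, here’s a minimal sketch in Python, using made-up numbers purely for illustration (the event counts and coverage rates below are assumptions, not real statistics). It just models what happens when dramatic events almost always make the news, common ones almost never do, and System 1 estimates frequency from “examples I can recall.”

```python
# A toy model of the availability heuristic (illustrative numbers only;
# the event counts and coverage rates are assumptions, not real statistics).

# Hypothetical deaths per year for two kinds of events
actual_events = {
    "car crash": 40_000,   # common, but almost never national news
    "plane crash": 300,    # rare, but covered wall-to-wall
}

# Hypothetical fraction of each event type that becomes a memorable news story
coverage_rate = {
    "car crash": 0.001,
    "plane crash": 1.0,
}

# System 1's shortcut: estimate frequency from how many examples you can
# recall, which here means how many made it into memorable coverage.
recalled = {event: count * coverage_rate[event]
            for event, count in actual_events.items()}

total_actual = sum(actual_events.values())
total_recalled = sum(recalled.values())

for event in actual_events:
    print(f"{event}: {actual_events[event] / total_actual:.1%} of actual deaths, "
          f"{recalled[event] / total_recalled:.1%} of recallable examples")
```

With those toy numbers, the recall-based estimate gets the picture roughly backwards: the rare, heavily covered event dominates memory even though the common, uncovered one dominates reality. That’s the plane-versus-car inversion from earlier, in miniature.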

Dobelli identifies dozens of these thinking errors, each one revealing another way your mental shortcuts lead you astray. There’s the representativeness heuristic, where you judge probability based on how much something resembles your stereotype. There’s the affect heuristic, where your emotional reaction to something determines how you evaluate its risks and benefits. Each one is another shortcut that lets you skip the hard work of actually thinking.

The truly insidious part is that these heuristics feel right. When you use the availability heuristic to conclude that plane crashes are common, you don’t feel like you’re making an error. You feel like you’re being reasonable. You can think of examples. That seems like good evidence.

This is what makes these errors systematic rather than random. They’re predictable patterns that emerge directly from how your mental shortcuts work. Everyone’s brain uses the same heuristics, so everyone makes the same kinds of mistakes in the same situations.

You can see this playing out in political discourse constantly.

Someone shares an anecdote about welfare fraud, and suddenly people conclude welfare fraud is rampant, even though it represents a tiny fraction of welfare spending. Someone else shares a story about an immigrant committing a crime, and people conclude immigrants are dangerous, even though immigrants commit crimes at lower rates than native-born citizens.

The stories are available. The statistics are not. The heuristic points you toward the story.

And knowing about heuristics doesn’t automatically stop you from using them. Your System 1 is always going to reach for shortcuts because that’s literally its job. The question is whether you’re going to catch yourself doing it and whether you’re willing to put in the effort to override those initial judgments with actual analysis.

Most people aren’t. Remember, System 2 is lazy. It would rather let the heuristic stand than do the work of checking whether the mental shortcut led you somewhere accurate or somewhere completely wrong.

That’s the real cost of these mental shortcuts. They don’t just occasionally lead you astray. They systematically distort your understanding of the world in predictable ways, and they do it while making you feel confident that you’re seeing things clearly.