The Malleable Past: Why We Can’t Trust Our Memories (Essay #9)

Your brain is rewriting history to make you feel right about being wrong

You remember exactly where you were when you heard about 9/11, right? You can picture the room, the TV, who told you, what you felt. The memory is vivid. Crystal clear. Absolutely certain.

Here’s the weird part. There’s a decent chance significant details of that memory are wrong.

Not because you’re lying. Not because you’re confused. But because memory doesn’t work the way you think it does.

We’ve spent eight essays now dismantling how your brain processes new information. System 1 runs on autopilot and makes constant errors. System 2 is lazy. You rely on mental shortcuts that introduce bias. You construct coherent stories from incomplete data. You get manipulated by subtle cues. Confirmation bias filters everything through what you already believe. Cognitive dissonance makes you justify your mistakes. The halo effect lets first impressions override evidence.

But all of those biases operate in the present tense. What about the past? Surely your memories of what actually happened to you are reliable, right?

Elizabeth Loftus spent decades proving that wrong, and her work is frankly disturbing.

Here’s what she discovered. Memory is not a recording. It’s a reconstruction. Every time you remember something, you’re not playing back a video file. You’re rebuilding the memory from fragments, and those fragments get contaminated by everything you’ve experienced since.

The original experiments were super simple. Loftus would show people footage of a car accident. Then she’d ask them questions about what they saw. But she’d vary the wording. Some people got asked “How fast were the cars going when they hit each other?” Others got asked “How fast were the cars going when they smashed into each other?”

That single word change – hit versus smashed – altered people’s memories. The “smashed” group remembered the cars going faster. A week later, when asked if they saw broken glass in the footage, the “smashed” group was more likely to say yes.

There was no broken glass in the video. But the word “smashed” planted the suggestion, and people’s brains filled in the details. They weren’t lying about seeing glass. They genuinely remembered seeing it. The memory felt real because it was real – their brain had constructed it to fit the narrative.

This is called the “misinformation effect,” and it gets worse.

Loftus showed you can implant entirely false memories with the right combination of suggestion and social pressure. In one study, she convinced about 25% of participants they’d been lost in a mall as a child – an event that never happened. She did it by having a trusted family member “remind” them of the incident with some specific but fake details. Once the seed was planted, people started adding their own elaborations. They remembered being scared. They remembered what the mall looked like. They remembered how it felt.

Their brains were doing exactly what System 1 always does. Taking fragments of information – real memories of malls, real experiences of fear, real childhood emotions – and weaving them into a coherent narrative. The fact that the narrative was based on a lie didn’t matter. It felt true, so it became true.

Connect this back to Essay 7. Remember cognitive dissonance? Your brain cannot tolerate contradictions. When your current beliefs conflict with past events, something has to give. And it turns out your brain is perfectly willing to rewrite history to resolve the conflict.

Think about political arguments. How many times have you heard someone insist they “always knew” something would happen, when you distinctly remember them saying the opposite? They’re not necessarily lying. Their memory has been reconstructed to align with their current position. The cognitive dissonance of being wrong got resolved by changing the past.

Or take relationships. When couples split up, they often remember the relationship completely differently. Not just interpretations – actual events. Because the brain is revising the story to make sense of the ending. If you broke up because the other person was terrible, then retroactively, they were always terrible. The memory of good times gets edited or deleted entirely.

This connects to confirmation bias too. You don’t just filter new information through your existing beliefs. You filter old information. Your memory of what happened gets shaped by what you currently believe should have happened.

Here’s where this gets dangerous for critical thinking. We treat personal experience as the gold standard of evidence. “I saw it with my own eyes” feels unassailable. Eyewitness testimony can send someone to prison for life. We trust our memories of events more than we trust statistics or expert analysis.

But Loftus’s work shows that eyewitness testimony is shockingly unreliable. People confidently misidentify suspects. They remember details that weren’t there. They’re influenced by leading questions from police, by photos shown during lineups, by media coverage after the fact. The memory feels certain, but certainty and accuracy are completely different things.

And the scariest part? You have no way to tell which of your memories are accurate and which have been contaminated. The false memories feel exactly as real as the true ones. The confidence you feel about a memory has almost no correlation with whether it actually happened that way.

This is why I always sigh when people offer anecdotal evidence to prove a point. “Well, in my experience…” or “I know someone who…” or “I remember when this exact thing happened and…”

It’s literally worthless.

“Wait, what?”

Now stick with me here – and maybe read the next part of this essay twice – because this is very hard for some people to get.

Anecdotal evidence is worthless, not because people are lying, but because personal experience runs through every single cognitive bias we’ve discussed. Your memory of the experience has been filtered through confirmation bias. It’s been reconstructed to resolve cognitive dissonance. It’s been shaped by priming and the halo effect. It’s been edited to create a coherent narrative that makes you the reasonable person in the story.

And you have no idea which parts are accurate and which parts your brain made up to fill the gaps.

When someone says “I saw it with my own eyes,” what they mean is “my brain constructed a memory that feels certain.” These are not the same thing. Personal experience is the lowest form of evidence because it’s the most contaminated by the exact biases we’re trying to overcome.

So what do you do with this information?

Here’s the hard part. You need to stop basing your judgments about the world on your personal experiences. You need to learn to think objectively instead of emotionally.

I know this is one of the hardest concepts for people to really grasp. Because personal experience feels so real, so valid, so meaningful. Something happened to you or to someone you know, and it touched you personally. That emotional connection feels like it should matter more than cold statistics or abstract analysis.

But that’s exactly the problem.

Let me give you a concrete example. Not long ago, everyone and their mother had an opinion on SNAP benefits. I saw so many comments from people saying that they knew someone who gets SNAP while working under the table, or who sells their benefits for cash to pay for manicures or alcohol. There were people complaining that you could use SNAP to buy lobster. Therefore, SNAP is bad. The whole program is corrupt. We should cut it.

That’s small-minded thinking based entirely on personal anecdote.

Here’s what objective thinking looks like. You step back and look at the bigger picture. How many people does SNAP actually help? What happens to food insecurity rates when SNAP is available versus when it’s cut? What percentage of recipients are abusing the system versus using it as intended? What does “shrinkage” – loss due to fraud or waste – look like in this program compared to other government spending or private sector systems?

When you look at the data, you find that SNAP has extremely low fraud rates compared to most government programs. You find that it dramatically reduces childhood hunger and improves health outcomes. You find that the vast majority of recipients are working families, elderly people, or individuals with disabilities. You find that every dollar spent on SNAP generates about $1.50 in economic activity.

The person getting manicures exists. But using that one person to judge a program that feeds 41 million Americans is absurd. Every large system has shrinkage. Every program has some people who abuse it. That’s not an argument against the program. That’s an argument for comparing the program’s overall outcomes to its costs and making a rational decision about whether the benefits outweigh the problems.

This is the difference between thinking like someone who’s been personally affected by something and thinking like someone trying to create good policy. Both perspectives have value. But only one should drive your actual positions on issues.

Being upset because something bad happened to someone you know is natural. Making calculations and judgments based solely on that personal connection is lazy thinking. It’s letting System 1 run the show while System 2 takes a nap. It’s allowing cognitive ease and emotional coherence to override objective analysis.

Personal stories are incredibly powerful for manipulating people emotionally. That’s exactly why politicians and advocacy groups use them constantly. They know a single crying mother or a sympathetic victim or a scary anecdote will override mountains of evidence in most people’s minds. They’re weaponizing your cognitive biases against you.

When you see someone relying heavily on personal anecdotes to make their argument, that should be a red flag. It means their objective case is probably weak. It means they’re trying to get you to make decisions based on emotion rather than evidence. It means they’re counting on you not to engage System 2 and actually think about whether their personal story scales up to good policy.

And here’s the uncomfortable truth: when you make arguments based primarily on your own personal experience, you’re doing the exact same thing. You might have good intentions. You might believe you’re right. But you’re still using emotional manipulation instead of building a solid case.

This doesn’t mean personal experience is irrelevant. It means personal experience needs to be connected to broader evidence. If something happened to you, and you can show that your experience reflects a larger pattern backed by data, then you have something. If something happened to you, and you can demonstrate why your specific case reveals a systematic problem, then your anecdote becomes a useful illustration rather than the entire argument.

But if your evidence begins and ends with “this happened to me” or “I know someone who,” your argument is built on sand. You’re asking people to make decisions about complex systems based on your contaminated, biased, reconstructed memory of a single data point.

Start treating your own memories with the same skepticism you’d apply to someone else’s story. When you find yourself saying “I remember exactly what happened,” pause. Ask yourself what might have influenced that memory since. What do you currently believe that might be shaping how you remember the past?

More importantly, ask yourself: is my personal experience actually representative? Or am I trying to extrapolate from one case to an entire population? Am I thinking about what actually works for the most people, or am I just reacting emotionally to what happened in my immediate vicinity?

This is especially important in arguments. When you’re certain you remember someone saying or doing something, consider that your memory might have been revised to support your current position. When someone else remembers an event differently, maybe don’t immediately assume they’re lying or stupid. Their brain did the same reconstruction yours did.

The goal isn’t to become some emotionless robot. The goal is to recognize that personal experience – your own and everyone else’s – is one of the least reliable forms of evidence because it’s been processed through every cognitive bias we’ve spent nine essays discussing. Your memory is not a camera. It’s a storyteller. And storytellers reshape narratives based on what the audience – your current self – needs to hear.

The past isn’t fixed. It’s malleable. And your brain is editing it constantly, usually without asking permission. Which means making good decisions about the present and future requires looking beyond what you remember and what feels true. It requires engaging with objective evidence even when it contradicts your personal experience.

That’s hard. But it’s necessary if you actually want to understand the world rather than just feeling right about your place in it.