People who watched the same presidential debate rated their preferred candidate as the winner 75% of the time, regardless of which candidate actually performed better (Munro et al., 2002). Same event. Same words. Opposite conclusions.
Your Brain Has a Bouncer
Imagine a nightclub. At the door, there’s a bouncer. But this bouncer has a strange rule: he lets in everyone who agrees with the owner, and turns away everyone who doesn’t.
That’s your brain. Every piece of information you encounter goes through a filter:
- Evidence that confirms what you already believe? Passes right through. Feels true. Feels obvious.
- Evidence that challenges your beliefs? Stopped at the door. Scrutinized, picked apart, or simply forgotten.
This isn’t something you choose to do. It happens before you’re even aware of it.
Confirmation bias is the tendency to search for, interpret, and remember information in a way that confirms your existing beliefs.
It Works on Three Levels
Confirmation bias isn’t just one trick. It’s a system with three layers, each reinforcing the others.
1. Selective Search
You don’t search for the truth. You search for agreement.
Consider how people use Google:
| What you believe | What you search |
|---|---|
| Coffee is healthy | “why coffee is good for you” |
| Coffee is harmful | “coffee health risks” |
| Genuinely curious | “is coffee healthy or not” |
Almost nobody is in the third row. The question is framed to confirm the belief before a single result loads. And search engines, trained to give you what you want, happily oblige.
This isn’t limited to Google. It’s how you choose which news to read, which friends to ask for advice, and which books to pick up.
You curate your own echo chamber without realizing it.
2. Biased Interpretation
Even when two people see the exact same evidence, they draw opposite conclusions.
In a classic study by Lord et al. (1979), researchers gave pro-death-penalty and anti-death-penalty participants the same two studies on whether capital punishment deters crime. One study supported deterrence, one opposed it.
The result?
- Each group rated the study supporting their view as “well-conducted” and “convincing”
- Each group rated the study opposing their view as “methodologically flawed” and “unconvincing”
- Both groups left the experiment more polarized than when they entered
Same data. More division. Because each side interpreted ambiguous evidence as confirmation.
When you already believe something, evidence feels different. Supporting evidence feels like “obviously true.” Contradicting evidence feels like “but wait, let me find the flaw.”
3. Selective Memory
Your memory isn’t a neutral recording. It’s an editor with an agenda.
| What happened | What you remember |
|---|---|
| Your horoscope was vague 300 times | The one time it was eerily accurate |
| Your gut feeling was wrong a dozen times | The one time it was right |
| You saw 10 news stories challenging your view | The one that confirmed it |
This is why stereotypes persist. You notice and remember the one time someone from group X did the expected thing. You don’t encode the thousand times they didn’t.
Over time, your memory becomes a highlight reel of confirmation. Not a record of reality.
Smart People Are Worse at This
Here’s the uncomfortable part.
You might think intelligence protects you. Surely if you’re smart enough, educated enough, analytical enough, you can think your way past confirmation bias?
The opposite is true.
Kahan et al. (2012) measured scientific literacy and numeracy, then compared them with opinions on politically charged topics like climate change. People with higher scientific literacy were more polarized, not less. They weren’t using their analytical skills to find the truth. They were using them to build better arguments for what they already believed.
Higher IQ doesn’t give you better judgment. It gives you:
- More sophisticated rationalizations
- A larger toolkit for poking holes in opposing evidence
- Greater confidence that you’ve “thought it through”
- Faster access to supporting evidence
Intelligence doesn’t fight confirmation bias. It arms it. The smarter you are, the better you are at defending a wrong position without realizing you’re doing it.
Confirmation Bias in the Wild
This isn’t an abstract lab finding. It silently shapes decisions that matter:
| Domain | How it shows up |
|---|---|
| Medicine | A doctor forms an early diagnosis and interprets all subsequent symptoms as supporting it, missing signs of the actual condition |
| Hiring | An interviewer forms a first impression in 10 seconds, then spends the rest of the interview confirming it |
| Investing | You buy a stock, then only read news that says it will go up, ignoring every warning sign |
| Relationships | You decide someone is trustworthy, then explain away every red flag |
| Criminal justice | Once police have a suspect, they search for evidence of guilt, not evidence of innocence |
The Wason 2-4-6 Task
The psychologist Peter Wason designed an experiment in 1960 that made confirmation bias visible.
Participants were told: a secret rule generates number sequences. They saw one example: 2, 4, 6. Their job was to figure out the rule by proposing their own sequences and being told “yes” or “no.”
Most people assumed the rule was “add 2” and tested:
| Sequence tested | Result | What they learned |
|---|---|---|
| 8, 10, 12 | “Yes” | Nothing new, already believed this |
| 20, 22, 24 | “Yes” | Still nothing new |
| 100, 102, 104 | “Yes” | More confident in a wrong theory |
Almost nobody tested sequences that might break their theory. They rarely tried:
- 1, 5, 100 (which would also fit, revealing the rule is broader)
- 6, 4, 2 (which would fail, disproving the “add 2” theory)
The actual rule was simply “any ascending sequence.” But participants couldn’t discover it because they only tested confirmations, never disconfirmations.
To actually learn the truth, you have to try to prove yourself wrong. That’s exactly what confirmation bias prevents you from doing.
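The trap in the 2-4-6 task is easy to see in code. Here is a minimal sketch (hypothetical code, not Wason’s materials) where the hidden rule is “any ascending sequence” and the tester believes “add 2”: every confirming test returns “yes” and teaches nothing, while a single disconfirming test would expose the theory.

```python
def hidden_rule(seq):
    """The experimenter's secret rule: True if the sequence strictly ascends."""
    return all(a < b for a, b in zip(seq, seq[1:]))

# Confirmation-only strategy: every test already fits the "add 2" theory.
for seq in [(8, 10, 12), (20, 22, 24), (100, 102, 104)]:
    print(seq, "->", "yes" if hidden_rule(seq) else "no")   # all "yes"

# Disconfirming tests the participants skipped:
print((1, 5, 100), "->", "yes" if hidden_rule((1, 5, 100)) else "no")  # "yes": rule is broader than "add 2"
print((6, 4, 2), "->", "yes" if hidden_rule((6, 4, 2)) else "no")      # "no": falsifies "any triple of even numbers"
```

The confirming tests can run forever without ever distinguishing “add 2” from the true rule; only a sequence that *should fail* under your theory can tell you whether the theory is wrong.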
What Can You Do About It?
Knowing about confirmation bias doesn’t make you immune. But two strategies genuinely help:
1. Actively seek disconfirmation
Before deciding, ask yourself one question:
“What would change my mind?”
If you can’t answer that, your belief isn’t based on evidence. It’s based on identity.
When researching something, deliberately search for the strongest argument against your position. Not the strawman version. The steelman. The version a smart, honest person on the other side would actually make.
2. Consider the opposite
Lord et al. (1984) found that simply asking people to “consider the opposite” before evaluating evidence significantly reduced bias. Not eliminated. Reduced.
Before reaching a conclusion, force yourself to genuinely argue the other side: “If I believed the opposite, what evidence would I point to?”
The goal isn’t to have no beliefs. It’s to hold them loosely enough that new evidence can actually change them.