Survivorship Bias

During WWII, the U.S. military studied bullet holes on returning bombers and planned to armor the most damaged areas. Statistician Abraham Wald told them they had it backwards: armor the spots with no holes, because planes hit there never came back (Wald, 1943).

The military was about to make a critical decision based only on survivors.


The Invisible Losers

Survivorship bias is what happens when you draw conclusions from the winners while forgetting that the losers exist.

It sounds obvious when you state it directly. But it’s hard to spot in practice, because the losers are invisible:

  • They don’t write books
  • They don’t give interviews
  • They don’t show up in your feed
  • They don’t show up anywhere

They just quietly disappear. The data you’re working with has been pre-filtered by failure, and you don’t notice the filter.


Wald’s Planes

The WWII bomber story isn’t just a nice anecdote. It’s a perfect model for how survivorship bias works.

The military examined planes that returned from missions and mapped the bullet damage:

  • Wings: riddled with holes
  • Fuselage: heavy damage
  • Tail: shot up badly
  • Engines: clean. Almost no damage.

The natural conclusion: reinforce the wings, fuselage, and tail.

Wald saw what they couldn’t. The sample was biased. These planes survived despite being hit in those spots. The spots with no damage on returning planes weren’t “safe” spots. They were fatal spots.

Planes hit in the engines never made it back to be studied.

The military was looking at survivors and asking “where are they hurt?” Wald asked the better question: “where are the dead ones hurt?”

The answer was hiding in what was missing from the data.
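
Wald’s reasoning is easy to reproduce. Here is a minimal simulation — all the numbers are invented for illustration: hits land uniformly across four sections, and only engine hits are usually fatal. Counting damage only on returning planes makes the engines look nearly untouched, even though they’re hit just as often.

```python
import random

random.seed(0)

SECTIONS = ["wings", "fuselage", "tail", "engine"]
# Assumed fatality rates per hit (invented for illustration):
# only engine hits are usually fatal.
FATAL_PROB = {"wings": 0.05, "fuselage": 0.05, "tail": 0.05, "engine": 0.8}

def fly_mission():
    """One plane takes three hits; it survives only if no hit is fatal."""
    hits = [random.choice(SECTIONS) for _ in range(3)]
    survived = all(random.random() > FATAL_PROB[h] for h in hits)
    return hits, survived

observed = {s: 0 for s in SECTIONS}  # damage counted on RETURNING planes
actual = {s: 0 for s in SECTIONS}    # damage across ALL planes, lost ones included

for _ in range(10_000):
    hits, survived = fly_mission()
    for h in hits:
        actual[h] += 1
        if survived:
            observed[h] += 1

for s in SECTIONS:
    print(f"{s:8s} actual hits: {actual[s]:5d}   seen on survivors: {observed[s]:5d}")
```

The engines take as many hits as the wings; they just don’t show up in the sample that comes home. The military’s map was a map of survivable damage.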


You Do This Every Day

Wald’s planes are dramatic, but the same error runs through everyday thinking:


“Follow Your Passion”

Every successful musician, actor, and athlete has a story about persevering against impossible odds. Their advice is always the same: “Never give up. Follow your dreams.”

What you don’t see: the thousands who followed the same passion, worked just as hard, sacrificed just as much, and failed.

  • They’re not on podcasts
  • They’re not writing memoirs
  • They’re working jobs they didn’t plan for
  • And nobody asks them for advice

The advice “never give up” comes exclusively from people for whom not giving up happened to work. The ones for whom it didn’t work are invisible.


“Drop Out Like Steve Jobs”

Steve Jobs, Mark Zuckerberg, Bill Gates. Three college dropouts who became billionaires. The implication: maybe formal education is overrated.

But the startup failure rate is roughly 90%. For every dropout billionaire, there are millions of dropouts who struggled. They just don’t make headlines.

  What you see                              What you don’t see
  Jobs dropped out, became a billionaire    Millions dropped out, became unemployed
  3 famous examples                         Hundreds of millions of counterexamples
  A compelling narrative                    The actual base rate

You’re comparing yourself to a sample that’s been filtered by extreme success. That’s not data. That’s a highlight reel.


“My Grandmother Smoked Until 95”

You’ve heard this one. Someone’s grandmother smoked her whole life and lived to 95, therefore smoking can’t be that bad.

The problem: you’re only hearing from the survivor.

The grandmothers (and grandfathers) who smoked and died at 55, 60, 65? They aren’t around to tell their story. Your sample has been filtered by death itself.

You’re using a single data point that survived, and ignoring the millions that didn’t.


“They Built Things Better in the Past”

Old buildings look amazing. Modern buildings look cheap. They just don’t build them like they used to.

Actually, they built plenty of garbage in the past. It fell down. It rotted. It was demolished. What’s left standing is the best of the best, the top 1% that survived centuries of weather, war, and neglect.

You’re comparing the average modern building to the survivors of 500 years of filtering. This applies to everything:

  • Old music seems better, but you only hear the classics, not the thousands of forgotten songs
  • Old books seem wiser, but only the best were reprinted for centuries
  • Past eras seem more elegant, but the ugliness was discarded long ago

The past looks golden because time is a filter. Only the best survived. You’re comparing curated history to unfiltered present.


Mutual Fund Performance

This one costs people real money.

Mutual funds that perform badly get quietly shut down or merged into better-performing funds. When you look at historical fund performance, the bad funds are gone from the data. Only the survivors remain.

Elton et al. (1996) found that ignoring dead funds overstated average returns by about 1% per year. Over a decade, that’s the difference between a smart investment and a mediocre one masquerading as smart.

Every time you see a chart showing “average fund performance over 20 years,” ask: “Does this include the funds that died along the way?” If not, the number is a lie.
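
The effect is easy to demonstrate with a toy model. In the sketch below (every parameter is invented for illustration), each fund’s yearly return is noise around a 5% mean, and any fund whose value falls below 0.7 is shut down. Averaging only the survivors’ returns overstates what a typical fund actually earned.

```python
import random

random.seed(1)

N_FUNDS, YEARS = 1000, 10

all_returns = []       # every fund-year return, dead funds included
survivor_returns = []  # only the returns of funds still open at the end

for _ in range(N_FUNDS):
    value, returns, dead = 1.0, [], False
    for _ in range(YEARS):
        r = random.gauss(0.05, 0.15)  # assumed return model, for illustration
        returns.append(r)
        value *= 1 + r
        if value < 0.7:               # assumed shutdown rule: fund is closed
            dead = True
            break
    all_returns.extend(returns)
    if not dead:
        survivor_returns.extend(returns)

true_avg = sum(all_returns) / len(all_returns)
survivor_avg = sum(survivor_returns) / len(survivor_returns)
print(f"average return, all funds:      {true_avg:.1%}")
print(f"average return, survivors only: {survivor_avg:.1%}")
```

The survivors-only average is higher for no reason other than the filter: the funds that crashed took their bad years out of the sample with them.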


The Core Mechanism

Every case follows the same pattern:

  1. A filter removes failures from the data (death, bankruptcy, obscurity, shutdown)
  2. You see only what survived the filter
  3. You draw conclusions from this biased sample
  4. Your conclusions are wrong because the missing data is the important data
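
The four steps above reduce to a one-line statistical error. In this minimal sketch (the distribution and cutoff are arbitrary, chosen only for illustration), a filter removes everything below a threshold, and the average you observe is inflated relative to the truth:

```python
import random

random.seed(2)

# A population of outcomes; the filter removes the failures.
population = [random.gauss(0, 1) for _ in range(100_000)]
survivors = [x for x in population if x > -0.5]  # arbitrary cutoff, for illustration

true_mean = sum(population) / len(population)
seen_mean = sum(survivors) / len(survivors)
print(f"true mean: {true_mean:+.2f}   mean you see: {seen_mean:+.2f}")
```

Nothing about the survivors changed; the sample did. Any average computed after the filter answers a different question than the one you think you’re asking.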

The Fix

One question protects you:

“What am I not seeing?”

Apply it everywhere:

  • Someone tells you a success story. Ask: “How many tried the same thing and failed?”
  • You see convincing data. Ask: “What was removed before I saw this?”
  • An old thing seems better than a new thing. Ask: “Is it better, or is it just the one that survived?”
  • A strategy seems to always work. Ask: “Am I only hearing from the people it worked for?”

The survivors will always mislead you. Not because they’re lying, but because they’re all that’s left.