Failure Is Your Friend (Ep. 169)

Listen now:

This week’s episode is called “Failure Is Your Friend.” (You can subscribe to the podcast at iTunes, get the RSS feed, or listen via the media player above. You can also read the transcript, which includes credits for the music you’ll hear in the episode.)

This is a natural follow-up to last week’s episode, “The Upside of Quitting.” Why are so many people so reluctant to quit projects or jobs or relationships that have soured? One reason, Stephen Dubner argues, is that we tend to equate quitting with failure, and there’s a huge stigma attached to failure. But … should there be? In their new book Think Like a Freak, Dubner and Steven Levitt argue that perhaps we’re not thinking clearly about failure. Failure, they say, can be your friend:

LEVITT: I always tell my students — fail quickly. The quicker you fail the more chances you have to fail at something else before you eventually maybe find the thing that you don’t fail at.

When failure is stigmatized, people will do everything they can to avoid it, often at great cost. Levitt tells the story of a large multinational retailer that was opening its first store in China — and how the company’s executives couldn’t express their misgivings to a bullish boss. Then we hear a story in which the boss’s “go fever” had far more tragic ramifications: the 1986 launch of the space shuttle Challenger. Allan McDonald, an engineer on the shuttle project and author of the book Truth, Lies, and O-Rings, tells us how his attempts to delay the launch were overruled:

McDONALD: What really happened was typical I think in large bureaucratic organizations, and any big organization where you’re frankly trying to be a hero in doing your job. And NASA had two strikes against it from the start, which one of those is they were too successful. They had gotten by for a quarter of a century now and had never lost a single person going into space, which was considered a very hazardous thing to do. And they had rescued the Apollo 13 halfway to the moon when part of the vehicle blew up. Seemed like it was an impossible task, but they did it. … So it gives you a little bit of arrogance you shouldn’t have. And a huge amount of money [was] involved. But they hadn’t stumbled yet and they just pressed on. So you really had to quote “prove that it would fail” and nobody could do that.

You might think that it would be rare that someone involved in a project could, like McDonald, foresee exactly how it might fail. But is it? And might there be a way to look around the corner and find out how you might fail before you go to the trouble of doing so?

Gary Klein has one suggestion. He is the author of Seeing What Others Don’t: The Remarkable Ways We Gain Insights and a proponent of what he calls the “pre-mortem.” While many institutions conduct a post-mortem to examine why a given project has failed, Klein walks us through an exercise that can spot potential failures before things have gone wrong.

So get out there and start failing, people — failing well, failing fast, and failing productively — so that today’s failure can make way for tomorrow’s triumph.


I think failure can be made more palatable with good consolation prizes as incentives, which is a little different from celebrating failure. For example, when I was in high school, I had the chance to participate in the Westinghouse Social Science competition. Our teacher told us, "If you win, you won't have to apply to college. They'll come looking for you."

"But what if I don't win?" I thought to myself. The competition was bound to be fierce. So I decided that instead of spending my summer doing extra schoolwork, I'd be better off working. I became a corporate clerk in Macy's. It was my first office job.

It's one of the big regrets of my life, though not the biggest. I'm still working in an office, but I dream of a career in academia, doing writing and social research. I wish that after my teacher told us about being a Westinghouse winner, he'd told us what was in it for us just for participating. "Even if you don't win, it will still set your college application apart from all the non-participants." I needed an incentive to be an also-ran, which is what I was most likely to be. Also-rans may not be winners, but "loser" gives the completely wrong impression.

I still work in an office today, and I'm bored as heck. I dream of a career as a writer or academic, and that competition could have been my first step along that path. The topic I'd been considering was to survey my peers on their opinions of the possibility of nuclear war. (This was 1985, close to the end of the Cold War.) Because I lived in Jamaica, Queens, I had access to kids from many classes and races within New York. My teacher tried to prepare me for the possibility that some of the students would have no opinion about that and would care about something else. That would have been an interesting lesson for me, too, and Mr. Weiner was probably the perfect person to have guided me through it - my first disproven thesis.

This was all brought back to me from your first book, by the way. When I read about Sudhir Venkatesh's venture into the projects with his survey, I thought, "That could have been me," except I was a female, white high school student and would have brought my survey only to much tougher schools and crowds than my own.

I love Freakonomics. I can hardly wait to get hold of your next book!



Please explain to me why Gus Grissom, Ed White and Roger Chaffee burning to death on the launch pad during a test launch in 1967 (Apollo 1) don't count toward NASA's safety record. 25 years without losing a man? Except for those 3, because it was only a test launch, not an actual launch, so NASA was allowed to stay cocky and pretend it never happened?
It's a little tough to accept the Challenger as a story of productive failure when, first of all, NASA killed three astronauts 20 years earlier, which ALSO is often cited as a productive failure in case studies. Then there was the Titan rocket that exploded 4 months before the Challenger, another Titan rocket that exploded 3 months after the Challenger, and a Delta rocket that exploded 4 months after the Challenger. But I guess since no humans were on those rockets, there was no need to learn from those mistakes?

Roland Grit

Don't forget Space Shuttle Columbia, not many years later. A lesson not learned is often repeated.


Wow, this podcast hit me upside the head. I am about to finish a data project and need a way to conduct a review to get good and useful feedback from the business unit. So now I will be conducting a “pre-mortem” review. I have listened to this podcast several times and made some notes, but would like to know if Gary Klein has a “How To” guide or some pointers on conducting a pre-mortem?

Pre-mortem - seems this has been kind of UK Health & Safety law for years. For every new project, even a school trip, a leader is required to write a list of things that could go wrong, the idea being that this puts them in the correct frame of mind to anticipate problems before they happen.
- commonly it is regarded as faceless bureaucracy .. But it has been proven to work and bring down the accident rate.


Not to discredit Mr. Klein's suggestion, but in the automotive industry this type of thinking has been used for decades. We call it DFMEA (Design Failure Modes and Effects Analysis). We take every possible failure mode and assign a score to it, then subsequently derive improvements to reduce the risk of the failure modes. The goal is to lower the risk score.
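[Ed. note: the scoring step described above can be sketched in a few lines. This assumes the common FMEA convention of 1–10 ratings for severity, occurrence, and detectability, whose product is the Risk Priority Number (RPN); the failure modes listed are hypothetical examples, not from any real design review.]

```python
# Minimal sketch of DFMEA-style risk scoring. Each failure mode is
# rated 1-10 for severity, occurrence, and detectability; the Risk
# Priority Number (RPN) is their product. High-RPN modes get design
# improvements first, then are re-scored to confirm the risk dropped.

def rpn(severity, occurrence, detection):
    """Risk Priority Number: product of three 1-10 ratings."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMEA ratings must be between 1 and 10")
    return severity * occurrence * detection

# Hypothetical failure modes, for illustration only.
modes = {
    "seal degrades in cold": rpn(9, 4, 6),   # severe, hard to detect
    "connector corrodes":    rpn(6, 3, 2),   # easy to catch in test
    "sensor drifts":         rpn(4, 5, 5),
}

# Work the highest-risk modes first.
for name, score in sorted(modes.items(), key=lambda kv: -kv[1]):
    flag = "MITIGATE" if score >= 100 else "monitor"
    print(f"{score:4d}  {flag:8s}  {name}")
```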

dr gla

Same church, different pew. Kathryn Schulz' "Being Wrong: Adventures in the Margin of Error" is a terrific book, which may also act in the service of failing brilliantly and thinking about the world in very different ways. Harvard's President, Drew Gilpin Faust, said she'd like to see every incoming freshman read it. But there's plenty of grist for older minds, too. Maybe more, since the hardening process of thinking that sometimes comes with age and habit often needs more than a nudge.

Peter Cheimets

I had a couple of comments regarding the Space shuttle portion of the Fail Quickly podcast:

First, the space industry now has an approach to failure that is a little like the method that was described toward the end of the podcast (pre-mortem). It is called risk mitigation. Everyone sits around thinking up possible risks. Then the risks are rated by their likelihood and severity. Those with a high combined score are then mitigated per a strict schedule.

Second, the people who made the decision to launch the Space shuttle overrode the best technical advice available at the time. And in retrospect, that advice was correct. I have seen any number of presentations on this event, mostly portraying the problem as one of persuasiveness.

A couple of things need to be noted however: James Beggs had been NASA administrator up until a couple of months prior to the disastrous launch. He was suspended and replaced by an acting administrator, William Graham. Graham had been deputy administrator, a position which is far more politically attuned than the Administrator (our person at the top, so to speak). Reagan wanted to make some comment about the flying shuttle during his upcoming State of the Union speech, and that is where the pressure to launch was coming from. Graham proposed a decision-making process for that launch that was more in line with a PR exercise than a process surrounding a large, life-threatening activity like a manned space launch. With Beggs gone, and the head of the Agency pushing absolutely for launch, there was no countermeasure to a go for launch. It was pretty much preordained. I doubt much was going to stop that launch, no matter how hard the technical folks tried to stop it.

The only lesson (vis a vis failure) I draw from this is that when you make decisions like this, there can only be one consideration, the ability to carry out your objective within your pre-defined safety parameters. If you are considering anything else, you have failed already.



Fail as quickly as you can, because you learn a lot from each mistake, and maybe how not to repeat the mistakes so much. After taking action, the evaluation phase is key to figuring out what's working and what changes need to be made as to direction, intensity, effort and timing.


The Challenger debacle is the classic example of GROUP THINK. Communications students study the Challenger O-rings and the communication breakdown behind the decision to launch. I'm not sure what "go fever" is.

John Ager

In response to your opening question to the conversation with Gary Klein: “What is a way to learn how you might fail without going to the trouble of failing?”
In 1965 Drs. Kepner and Tregoe published The Rational Manager, a guide to best practices for Clear (or Freaky) Thinking, based on observations and interviews with CEOs and managers. One of the four thinking patterns they identified, which they called PPA (Potential Problem Analysis), is similar to FMEA and looks at the downside of unpredictable change. They also identified a parallel process, POA (Potential Opportunity Analysis), which looks at the upside of unpredictable change.
These approaches help people explore the question "What could change?". In the case of Potential Problems, the goal is to identify Preventive Actions we can take in advance to reduce the probability of a particular unexpected change, Contingent Actions we can take after the fact to reduce the seriousness of the unexpected change, and Triggers to monitor if the change has occurred. A subsequent book, The Rational Project Manager, 2005, describes how to apply PPA and POA to projects.
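[Ed. note: the three PPA elements described above (preventive actions, contingent actions, triggers) map naturally onto a simple risk-register record. The sketch below is an illustration only; the field names and the combined-score formula are my own, not prescribed by the book.]

```python
# Sketch of a Potential Problem Analysis (PPA) entry: preventive
# actions reduce the probability of an unwanted change, contingent
# actions reduce its seriousness after the fact, and a trigger is
# the signal that the change has actually occurred.

from dataclasses import dataclass, field

@dataclass
class PotentialProblem:
    description: str
    probability: float                     # 0-1, chance the change occurs
    seriousness: int                       # 1-10 impact if it does
    preventive_actions: list = field(default_factory=list)
    contingent_actions: list = field(default_factory=list)
    trigger: str = ""                      # how we know it has happened

    def exposure(self) -> float:
        """Simple combined score for ranking problems."""
        return self.probability * self.seriousness

p = PotentialProblem(
    description="key supplier misses delivery",
    probability=0.3,
    seriousness=8,
    preventive_actions=["qualify a second supplier"],
    contingent_actions=["hold two weeks of buffer stock"],
    trigger="shipment not confirmed 10 days before the build date",
)
print(f"{p.description}: exposure {p.exposure():.1f}")
```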
In response to your last question, “Why hasn’t it taken over the world?”, I don’t know.



Levitt published macroeconomics papers while a grad student at MIT? What a failure!!

What was this podcast about again?

-Struggling grad student.


Firstly I enjoy the podcast and have listened to it for quite a while now.

I have a question relating to the Pre-mortem exercise as described, the process sounds exactly like a known industry practice which is called a risk review.

In business and project management, risk analysis is a process that involves "gathering data and synthesizing information to develop an understanding of the risk of a particular enterprise."

This is a common practice in the construction industry.

I am just concerned someone is reinventing the wheel or just renaming an existing process.

Keep up the good work.

Rosaleen Anne Lynch

Neil Gaiman similarly advocates getting out there and making mistakes..."Make glorious, amazing mistakes" was his advice in his New Year Wish, "because if you are making mistakes, then you are making new things, trying new things, learning, living, pushing yourself, changing yourself, changing your world. You're doing things you've never done before, and more importantly, you're Doing Something."


You cannot navigate this website without the ad for the how-to-rob-a-bank book popping up. Seriously, I don't mind you promoting your book, but when the ad comes up EVERY time you click something it ruins the entire site. Drop me a line if you ever get this fixed, and I will come back. Until then, forget it.