Episode Transcript
In 1968 — 50 years ago — the governor of New York State, Nelson Rockefeller, received a proposal he’d commissioned. It addressed the mass-transit needs of the New York City area. One centerpiece of the plan was a new subway line that would run from lower Manhattan, up the East Side, and into the Bronx. It was called the Second Avenue Subway. Four years later, Rockefeller and New York City mayor John Lindsay held a ground-breaking ceremony for the Second Avenue Subway. But not long afterward, the project was shelved because of a fiscal crisis. Years later, a new governor, Mario Cuomo, tried to restart it. But once again, the budget would not allow — and back it went on the shelf. By now, the Second Avenue Subway had become a punchline. A New Yorker would promise to pay back a loan “once the Second Avenue Subway was built.” It came to be known as “the most famous thing that’s never been built in New York City.” But then, along came a man named Michael — here, I’m going to let him say it.
HORODNICEANU: Michael Horodniceanu. If you look to see, people on Second Avenue would recognize me as Dr. H. No one is really willing to pronounce my last name.
Okay, let’s go with “Dr. H.” He is a longtime transportation scholar and executive. In 2008, he became president of Capital Construction for the Metropolitan Transportation Authority. And one of the first things he did was restart the Second Avenue Subway. By now it was 40 years since Governor Rockefeller’s original proposal. Dr. H. updated the budgets and estimates and finally got construction started. In 2010, a massive tunnel-boring machine began its work underneath 92nd Street.
HORODNICEANU: So they are about to start and they probed ahead into the rock, then suddenly the realization is that the quality of the rock was poor, in effect there was water going through there. So the decision was made that we have to freeze about — close to two blocks. It took us four months to actually freeze the ground. And it’s costly! On Second Avenue, we spent $10 million to do it.
Ten million dollars and four months just to freeze the ground just to start building the tunnel! Would New York City ever get its Second Avenue Subway? The reason this story became so famous is that it is such a grotesque example of a blown deadline. But surely you can identify. Surely you’ve been involved in something — maybe a work project, or a home renovation, even writing a paper — that was also grotesquely late? And painful? And expensive? Why are we so, so, so bad at finishing projects on time? And what are we supposed to do about it?
* * *
This episode begins, as so many good things do, in Canada.
BUEHLER: All right, well, I’m Roger Buehler and I’m a professor of psychology at Wilfrid Laurier University.
That’s in Waterloo, Ontario.
BUEHLER: I study social cognition as well as judgment and decision-making.
Buehler has long wanted to know why we’re so bad at managing projects. His interest began during grad school, with a personal puzzle.
BUEHLER: Every night as I’d leave the office I would pack up my briefcase with jobs to do at home and more often than not I’d come back to the office the next day with all of it untouched. But every night as I packed up that briefcase, I was sure that my plans were realistic. So that was the puzzle. Why wouldn’t I learn from experience and get more realistic in my estimates?
DUBNER: In the beginning, did you think, “It must just be me, I must be the one who’s failing here”? Or did you recognize it as a generalizable phenomenon?
BUEHLER: Well, partly it seemed odd, but then I noticed it in people around me, too. In fact, I remember some particular colleagues who were forever promising things and never coming through, and noticing that, well, they seemed to believe it when they’re saying it. It seems real. So it did start to seem like a more generalizable thing.
Buehler and his colleagues were, of course, not unique. The phenomenon even had a name, courtesy of the psychologists Danny Kahneman and Amos Tversky. They had called it “the planning fallacy.”
BUEHLER: So the planning fallacy is a tendency to underestimate the time it will take to complete a project while knowing that similar projects have typically taken longer in the past. So it’s a combination of optimistic prediction about a particular case in the face of more general knowledge that would suggest otherwise.
You can imagine the value of being able to diagnose, and treat, the planning fallacy. We all plan for the future — whether it’s a huge infrastructure project or a research paper that’s due in three weeks. Wouldn’t it be nice to be able to plan better? So, first, Roger Buehler and some colleagues set out to measure the planning fallacy. Their first experiment used honors students who were working on their thesis projects. The researchers asked each student to predict when they’d submit their thesis.
BUEHLER: I know psychologists make use of students too much. But for this research, they really do have a lot of activities on the go and they’re kind of concrete activities often with clear deadlines, making them a nice population to actually study for this topic.
These students predicted, on average, that their theses would take 33.9 days to finish. How long did it actually take? Fifty-five-point-five days. That’s a 64 percent overage. Buehler and other researchers found similar evidence of the planning fallacy among stockbrokers and electrical engineers and doctors; they also found it in everyday activities like Christmas shopping, doing taxes, even waiting in line for gas.
DUBNER: So let me ask you the largest and most impossible question: What’s wrong with us? Why is there such a gap between intention and behavior?
BUEHLER: Right. Well, my thinking on this has been guided by a distinction drawn by Kahneman and Tversky between two types of thinking, one being a kind of inside approach and the other being an outside approach. And the inside approach involves really focusing on the case at hand and trying to work out the details of that unique case. It’s like you’re developing a mental scenario or mental simulation of how you think that project will unfold. But the problem is that mental simulations often don’t provide a thorough and comprehensive representation of how things will go. They often tend to be kind of idealized. Oversimplified. And when people get into that frame of thought, they don’t entertain alternative ways in which things may go. They kind of get locked into one scenario. But then I would couple that with people’s wishes and desires. So, generally when you’re planning something out, you’re planning to succeed. You’re not planning to fail.
This second tendency that Buehler is talking about — seeing the future in rosy terms — there’s a name for that, too. It’s called the “optimism bias.”
SHAROT: I think it’s a wonderful thing.
Tali Sharot is a cognitive neuroscientist at University College London.
SHAROT: There are so many positive aspects to having an optimism bias. In fact, in our research we see that the people without an optimism bias tend, in most cases, to be slightly depressed at least, with severe depression being related to a pessimistic bias where people expect the future to be worse than it ends up being. So I mean it’s a good thing because it kind of drives us forward. It gives us motivation. It makes us explore different things. It’s related to better health — both physical and mental health — because if you expect positive things, then stress and anxiety is reduced. So that’s very good for both physical and mental health.
Sharot believes in the optimism bias, and she believes it is rooted in neuroscience. Her experiments have repeatedly shown that the brain tends to process positive information about the future more readily than negative information. It’s easy to see how that could feed the planning fallacy.
SHAROT: So, for example, if I tell you, “You know, you’re going to get many more listeners this week than usual.” You’d say, “Oh, okay.” And you’d kind of change your estimate and think it’s going to be 10 million. But if I tell you, “You’re going to get many less listeners this week,” then I expect you’d say, “Well, she doesn’t know what she’s talking about.”
DUBNER: That’s exactly what I was saying in my mind, just so you know. Do you believe the optimism bias was, is baked in for legitimate or positive evolutionary purposes then? Or do you think it’s more of, you know, a little bit of a design flaw that we’ve learned can have some benefit?
SHAROT: So, I think it’s not a design flaw. And it’s a two-part answer. So the first part of the answer is, you know, it’s probably there because of all the positive aspects that I just mentioned. I mean we know that people who are optimists live longer. So we survive more. We’re more likely to find a partner. We’re more likely to have kids and all of that. So there is a clear survival benefit and a benefit to just progressing. However, as you imply, there are also these negative consequences, right. If we think everything’s going to be OK or better than what we anticipate, we might not take precautionary action. We might, you know, smoke when we shouldn’t and that kind of thing. So there are the negative aspects to it. But what our research shows is that it’s even better than what I just explained, because the optimism bias is, in fact, flexible. So it changes in response to the environment. It can disappear in certain environments, in a way that may be optimal.
Here’s what Sharot means by that. She and her colleagues run experiments in which they ask different kinds of people — firefighters, for instance — to assess the likelihood of bad things happening to them: getting divorced or being in a car crash or getting diagnosed with cancer. These are basically their wild guesses.
SHAROT: And then we give them information about the average likelihood of having these events for someone like them and then we ask them again.
Okay, so the firefighters would have their baseline guess and then they’d guess again after getting some statistical context. But there was another twist: they were also asked the question in two different environments: on days when they’d been fighting fires and during down time, when they weren’t.
SHAROT: And what we found was that when the firefighters were under stress, they learned more from this negative information. The more stressed they were, the more anxious they were — the more likely they were to take in any kind of negative information that we gave them, you know whether it’s about cancer or divorce or being in a car accident and so on.
Based on these results, Sharot argues that human optimism is both adaptive and mutable. So that’s good to know: the optimism bias may be a sort of evolutionary insurance policy against hopelessness and depression. And maybe it’s played a role in some of the astounding progress that humankind has made over the millennia — you’d have to be pretty optimistic to come up with space travel and aspirin and French cuisine, n’est-ce pas? But still, wouldn’t it be nice to also figure out how to get projects done on time and on budget? That is something that Katherine Milkman has been thinking about for years.
MILKMAN: I’m an associate professor at the Wharton School of the University of Pennsylvania.
Milkman’s Ph.D. is in computer science and business, but as an undergrad she studied operations research. Which means what?
MILKMAN: You try to figure out how to be more efficient about everything, using math.
To that end, Milkman has spent a lot of time studying, and teaching, the planning fallacy. And yet: this does not personally inoculate her against it.
MILKMAN: Well, I will tell you that it took longer to prepare to talk to you about this than I expected.
DUBNER: Like how long?
MILKMAN: I don’t know, like an hour.
DUBNER: Wow, yeah, that’s a lot.
MILKMAN: Yeah, but it’s planning fallacy. I was sure it would take, like, ten minutes.
In her research, Milkman has found that when groups work together on a project, a number of factors collude to produce the planning fallacy.
MILKMAN: So one is overconfidence, or the tendency we have to think we’ll do things better than we will. We are overconfident for many, many reasons. One is that it makes us feel better about ourselves; we’re often rewarded for it. Imagine two people walked into an interview and one of them says, “I’m going to be great at this job, I’m great at everything I do.” And the other person says, “I hope to be great at this job, I try to be great at everything I do, but sometimes I fail.” I think most of us would respond more positively to the person who says, “I’m going to be great.” And that is a lifetime of feedback we give people, where we’re rewarding them for overconfidence constantly.
So there are individual biases like overconfidence. But with large projects, there’s also what’s called —
MILKMAN: Coordination neglect.
And coordination neglect is… ?
MILKMAN: The failure to think about how hard it is to put stuff together when other people are involved. And so that can make the planning fallacy bigger and badder when there are teams of people trying to finish work on time.
DUBNER: From an economic standpoint, this sounds backwards. You would think that larger size, theoretically, creates more specialization of labor, and ultimately higher productivity. Why doesn’t it?
MILKMAN: So when you staff a bigger team on a project, you focus on all the benefits associated with specialization, what you just mentioned. And what you neglect is to think about how challenging it is to get that work all back together into a single whole. So this engineer now has to talk to that engineer about how to combine their outputs into one integrated system.
And there’s one more component of the planning fallacy — a blatantly obvious one.
MILKMAN: It also relates to procrastination. Because of self-control failures, we put it off, and then that can make the planning fallacy a bigger issue. Because if you don’t start the project on time, because you keep putting it off, how in the world are you going to finish it on time?
DUBNER: And let me ask you this: what is, if we have any idea, the primary root cause of procrastination?
MILKMAN: Oh god, what is the primary root cause of procrastination? Like, get to the heart of everything I’ve thought about for the last 15 years in one question. I think the primary root cause of procrastination is impulse control. The fact that we tend to want to do what’s more instantly gratifying in the moment than what is better for us. And so we put off doing the things we know we should do, in favor of what’s instantly gratifying.
There is reason to believe that our impulse control is being tested today more than ever. With the revolution in digital communication has come a blizzard of notifications, alerts, messages, and more. While there are obvious upsides to the speed and magnitude of this communication, there are also costs: information overload is thought to cost the U.S. economy at least $1 trillion a year in lost productivity. What to do about all that wonderful, terrible digital distraction? We brought that question to Justin Rosenstein.
ROSENSTEIN: I’m the co-founder and head of product at Asana.
DUBNER: Okay. Asana — for those who don’t know what it is — let’s start with that.
ROSENSTEIN: Asana is software that enables teams to be able to work together more easily.
All right, but let’s back up. Rosenstein’s first job was at Google. Here’s how he envisioned that job before it began.
ROSENSTEIN: I’m going to be spending all my time working with these world-class engineers, world-class designers, figuring out great things that we can do to be able to make people’s lives better.
And he did work on some exciting projects.
ROSENSTEIN: I was the original product manager for Google Drive and helped co-invent a lot of that. I co-invented G-mail chat.
But the reality of working at Google didn’t match his vision of working at Google.
ROSENSTEIN: Literally the majority of the time was spent not doing work, not writing code, but doing the work about work. It was making sure that the left hand knew what the right hand was doing. It was sitting in status meetings and preparing status updates.
At first, he thought he must have been doing something wrong.
ROSENSTEIN: There’s no way that this could be what everyone tolerates when they go to work and work in companies. And so I thought maybe this was something that was wrong with Google. Talked to people who worked at other companies and discovered to my horror, that no, Google was actually very advanced. Most companies were far less organized.
He looked around for software solutions. There were a lot — but none that did what he wanted.
ROSENSTEIN: What I really wanted was just a single place that I could go to see what is everyone on my team working on? And who’s responsible for which things? And what’s the sequence, and what are the dependencies between those things?
So, nights and weekends, he started hacking together some software to accomplish that.
ROSENSTEIN: And I just built it as something for me and a few dozen people that I worked with to use, but it grew virally within Google. And that really startled me, because Google has access to the best tools in the world. It was insane to me that there were so many people who were excited about what I had built.
But not long after, Rosenstein left Google — for Facebook.
ROSENSTEIN: At some point, I got a Facebook friend request from Dustin Moskovitz, who’s the co-founder of Facebook. And so eventually I was persuaded that, yeah, it would be exciting to be a part of that. Actually one of the sad things about leaving Google was that I had to leave that tool behind that I had built inside Google.
DUBNER: Because it belonged to Google?
ROSENSTEIN: Because, yeah, it was the intellectual property of Google.
Rosenstein asked Moskovitz if Facebook had the same project-management problems he’d seen at Google.
ROSENSTEIN: And he was like, “You have no idea, I am tearing my hair out. By the time I get information about what the people in my own company are working on, the information is so old that it’s wrong.”
By day, Rosenstein was helping invent Facebook’s “like” button. At night and on weekends, he and Moskovitz started talking about a software solution to help manage their workflow.
ROSENSTEIN: And at some point we started building it — we started actually implementing that solution. And this internal tool that we built at Facebook took off.
And they realized their solution wasn’t unique to Facebook, or Google, or just tech companies. What they’d developed, he believed…
ROSENSTEIN: We had developed a general solution that would enable any team to be able to work together more effectively.
DUBNER: So, this time did you own the I.P.?
ROSENSTEIN: That is Facebook’s I.P. and the work that we did there — actually Facebook still uses it to this day.
Rosenstein and Moskovitz eventually left Facebook to start up Asana.
ROSENSTEIN: What really made us decide that we wanted to leave Facebook, was really feeling like this was itself a Facebook-sized opportunity.
DUBNER: So I understand that you and Dustin, I guess, had an estimate for how long it would take to get the company up and running. I’d love you to talk about that and how long it actually took.
ROSENSTEIN: Yeah, we were hopeful, especially given that by this point, I had built some version of this twice. And we were like, maybe in about a year, we’ll be able to launch the first version of the product. And it took three years.
By now, Asana has a large and prestigious customer list. They’re one of dozens of companies that build productivity software; the market is estimated at $1.2 billion. Rosenstein, like Katy Milkman, believes that a key to success here lies in mastering impulse control and in fighting distraction.
ROSENSTEIN: “Continuous partial attention” is this term for this state that it’s easy to get into if you’re not careful, where you’re never quite focused on any one thing. And I think humans in general tend to distract themselves. But we’ve entered a world in which distraction has become more and more the norm. Your cell phone buzzing, some chat message coming in. And there’s research from the University of California at Irvine that every time you get interrupted, it takes 23 minutes to fully recover.
DUBNER: It is an irony, however, isn’t it — in that you’re creating software to solve the problem that’s been created essentially by software?
ROSENSTEIN: Software has been a big contributor to that problem. And as I mentioned I co-invented the Like button —
DUBNER: — including the firms you used to work for.
ROSENSTEIN: I’ve thought about this issue quite a bit.
DUBNER: So this is all atonement, in other words, on some level at least, yeah?
ROSENSTEIN: Now, you point out this really important point that there are unintended consequences. And things that I’ve worked on in the past have had this property where even if you have the intention that they’ll be used entirely in a good way, sometimes there are negative things you couldn’t foresee. I think that the answer to that is neither to shrug and just say, “Well, whatever happens happens,” nor is it to become a Luddite and say, “Well, we have no idea what the unintended consequence of anything is going to be, so we shouldn’t even try building things.” The middle way is to accept there will always be unintended consequences, but we can notice those unintended consequences and then design through them.
* * *
There are a lot of reasons why that project you planned can take way longer than you anticipated, and cost way more. Outright fraud, for instance — the lying, cheating, and stealing familiar to just about anyone who’s ever had, say, a home renovation, especially in New York City. And yes, I speak from personal experience. There’s also downright incompetence; that’s hard to plan for. But today we’re talking about the planning fallacy, which was formally described a few decades ago by the psychologists Danny Kahneman and Amos Tversky. When they started theorizing about how to correct for the planning fallacy, they identified what they thought was a key factor. When people estimate how long a project will take, they focus too much on the individual quirks of that project and not enough on how long similar projects took. That second approach, basing estimates on how long similar projects have taken, is called reference-class forecasting.
GRUSHKA-COCKAYNE: The reference-class forecast says, actually if you’re planning project X that you’re about to start — ignore Project X.
That’s Yael Grushka-Cockayne. She teaches project management and decision-making at the University of Virginia’s Darden School of Business.
GRUSHKA-COCKAYNE: Don’t think about it too much. Look back. Look back at all the projects you’ve done, all the projects that are similar to this new project X, and look historically at how well those projects performed in terms of their plan versus their actual. See how accurate you were, and then use that shift or use that uplift to adjust your new project that you’re about to start.
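To make that concrete, here is a minimal sketch of the kind of adjustment she is describing, assuming you have records of past projects with planned and actual durations. The project records, field names, and numbers are hypothetical placeholders, not figures from the episode.

```python
# Minimal sketch of a reference-class adjustment: derive an "uplift" from how
# past, similar projects performed (plan vs. actual), then apply it to a new plan.
# The records below are hypothetical placeholders.
past_projects = [
    {"planned_days": 30, "actual_days": 50},
    {"planned_days": 90, "actual_days": 140},
    {"planned_days": 60, "actual_days": 95},
]

# Average ratio of actual to planned duration across the reference class.
uplift = sum(p["actual_days"] / p["planned_days"] for p in past_projects) / len(past_projects)

def adjusted_estimate(planned_days: float) -> float:
    """Shift a new project's plan by the historical overrun ratio."""
    return planned_days * uplift

print(f"historical uplift: {uplift:.2f}x")
print(f"a 40-day plan becomes roughly {adjusted_estimate(40):.0f} days")
```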
As we’ve seen with Justin Rosenstein and Katy Milkman, Grushka-Cockayne does not always practice what she preaches.
GRUSHKA-COCKAYNE: I’m a pretty decent planner but I’m not as organized as I would probably recommend other people be. I like to improvise. I’m also— I’m Israeli and Israelis are notorious for being pretty spontaneous by nature. So we hate planning. And I’m married to a Brit who is, you know, the exact opposite. You know the Brits plan what they want to eat for lunch in about six months’ time. And I’m like, “I don’t know if I want custard or pudding — I don’t know, leave me alone.”
Grushka-Cockayne has been studying the planning fallacy in governments as well as in private firms. And she likes the trend line:
GRUSHKA-COCKAYNE: I will say that more and more companies these days are improving their performance overall.
She believes that further improvement lies in a stronger embrace of — no surprise here — data.
GRUSHKA-COCKAYNE: More and more government bodies are publishing some planned and some actual deadlines and budgets. And tracking performance and tracking historical plans and actuals is the fundamental first step in overcoming the planning fallacy. So the main broad insight is: you should track your performance, because if you just start with that, let alone anything more sophisticated, you would raise the profile of the issue as a performance issue within the organization, and you will improve.
One profession that exemplifies this improvement? Meteorologists, believe it or not.
GRUSHKA-COCKAYNE: While we like to give them a hard time, we all have weather apps on our phone and we trust them quite a great deal because you know — putting it all together at the end of the day — they’re pretty accurate with their predictions. And, yes, they have sophisticated systems that decompose what they’re trying to predict, but they also track, and they score themselves, and they keep record of how accurate they were. And only by doing that will you stand a chance to improve.
Tracking and scoring the difference between forecasts and outcomes — that trend owes a lot to this man:
FLYVBJERG: My name is Bent Flyvbjerg and I’m a professor at Oxford University’s Saïd Business School.
Flyvbjerg is an economic geographer who years ago became fascinated with infrastructure megaprojects. It began in his native Denmark.
FLYVBJERG: So Denmark at one stage decided to start doing megaprojects and the first megaproject they did was a connection between east and west Denmark and it went terribly wrong. And I got curious and wondered, is this just bad luck, or is this common?
It was, he found out, very common. First, let’s get a sense of the magnitude of the problem. Flyvbjerg has estimated that infrastructure projects with a budget above $1 billion add up to between $6 and $9 trillion a year globally. That’s about 8 percent of global G.D.P.
FLYVBJERG: So we’ve studied estimated costs and actual out-turn costs for hundreds of projects around the world and it turns out that 80 to 90 percent of all projects have cost overruns. We did the same for schedules. So comparing estimated schedule, how long would it take, with actual schedule, how long did it actually take, and found the same thing, that 80 to 90 percent of projects have schedule overrun. So it takes longer — many times years longer — than originally planned.
These data led Flyvbjerg to establish what he calls “the iron law of megaprojects.”
FLYVBJERG: Over budget, over time, under benefits, over and over again.
And it’s a long-standing trend.
FLYVBJERG: So our data now go back 100 years. And we find that it’s a very constant situation. It doesn’t matter which part of the period you look at, you have a constant cost overrun. You have constant schedule overrun, and you have constant benefits shortfalls.
Okay, so why does this happen? The first theory Flyvbjerg embraced is called “strategic misrepresentation.” Which is essentially a fancy way of saying that you lie in order to get what you want.
FLYVBJERG: We’d actually interviewed planners who said that they did this deliberately, that they actually were incentivized to misrepresent the business cases for the projects in their benefit-cost analysis. And they wanted their projects to look good on paper, to increase their chances of getting funded and getting approval for their projects. And they said, “We do this by underestimating the cost and overestimating the benefits, because that gives us a nice high benefit-cost ratio so that we actually get chosen.”
This may strike you as intellectually dishonest, at the very least. But this strategy is endorsed by no less an authority than Danny Kahneman himself, who won a Nobel Prize for economics:
KAHNEMAN: If you realistically present to people what can be achieved in solving a problem, they will find that completely uninteresting. You can’t get anywhere without some degree of over-promising.
So between strategic misrepresentation and the optimism bias, what are you supposed to do if you’re on the commissioning end of a megaproject? After all, the stakes are quite high.
FLYVBJERG: So, if you have a lot of megaprojects going wrong, your whole national accounting and funding system, you know, where you plan the budget for next year, becomes very unreliable.
Consider what the British government did with its official Green Book, which tracks public spending.
FLYVBJERG: So the U.K. government has a Green Book about estimating projects, which was developed quite a while ago, because it turned out that all these projects going wrong all the time actually made it very difficult for the government to produce reliable budgets. So the U.K. decided to do something about that, led by the Treasury.
Flyvbjerg worked with the U.K. Treasury and the Department for Transport in order to get this right.
FLYVBJERG: …to get this right for infrastructure and transport projects, and we developed a methodology which has, in the meantime, become mandatory in the U.K. for large projects. And other countries have studied this, including Denmark, and Denmark has also made this method mandatory.
And what is this mandatory method? Basically, it’s strategic misrepresentation in the opposite direction.
FLYVBJERG: Let’s say you’re doing an urban rail project, you’re having a subway extension. You will do all the usual conventional stuff, you’ll get your cost estimate and your schedule estimate and then on the basis of empirical evidence, you maintain a database documenting how much are the budgets usually underestimated for this type of project? How much is the schedule usually underestimated for this type of project? And then you’re using the numbers from previous projects to adjust the numbers for the new project that you’re doing. So let’s say that on average, projects go 40 percent over budget. You’d add 40 percent to the budget for your planned project. And then you would have a much more accurate budget.
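As a back-of-the-envelope version of that 40 percent example, here is a short sketch; the base budget figure below is a made-up placeholder, not a number from the episode.

```python
# Sketch of applying an empirical uplift to a single new estimate, using the
# 40 percent overrun figure from the example above. The base budget is hypothetical.
base_budget = 1_000_000_000      # hypothetical $1 billion cost estimate
historical_overrun = 0.40        # this project type has averaged 40 percent over budget
adjusted_budget = base_budget * (1 + historical_overrun)
print(f"adjusted budget: ${adjusted_budget:,.0f}")   # $1,400,000,000
```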
But what about perverse incentives? If the contractor knows you’re expecting them to come in 40 percent over budget — and over deadline, too — where’s their incentive to work hard?
FLYVBJERG: Yeah, that’s very important, and we actually don’t recommend using this methodology unless you incentivize contractors, because you could make the situation much worse. And it is something that we didn’t ignore when we developed this, with the U.K. government and the Danish government. So at the same time as this methodology was made mandatory, it was also made mandatory that the people involved in delivering these projects would actually have skin in the game, as we call it. So you need to write your contract, for instance, in a manner where the contractor will gain additional profit if they actually meet your targets, but they will also be punished by having to pay for it and they will make less profit if they don’t meet your target.
So how well has this worked? It’s hard to be too definitive; this system has been in place only since 2004, and big infrastructure projects have long timelines. But a preliminary analysis done by outside researchers has found the projections to be reasonably accurate and the cost overruns to be reasonably small — about 7 percent from the planning stages of a transportation project to completion. All of which suggests that pricing in the optimism bias and using reference-class forecasting are truly useful tools to fight the planning fallacy. Katy Milkman, as I learned, has one more suggestion.
DUBNER: So let me ask you this: considering that the planning fallacy is as wide and as large as it is, and considering all the costs, I mean, you can just imagine in construction or in medicine, all these costs that can rack up, what should be done about it?
MILKMAN: Algorithms.
DUBNER: That was easy. See you later.
MILKMAN: There’s such a clear answer to this one. And it is frustratingly hard to convince people, for reasons — I don’t know if you’ve covered algorithm aversion on your podcast, but people are very averse to using algorithms, for all sorts of reasons that I think are a little crazy. Anyway, algorithms are the answer.
DUBNER: So when I think of people who don’t have a planning fallacy, I think of Amazon. So, like, I go on Amazon and I find something I want, and they tell me when I’m going to have it, and then by then, or even before then, I have it, probably 95 percent of the time. So that doesn’t seem so hard. What’s the problem with everybody else?
MILKMAN: I love that example, because Amazon relies on an algorithm to make that forecast, and they have lots and lots of data going into that algorithm, because they’ve literally solved this problem probably billions of times before. That is exactly how we cure the planning fallacy. We use data instead of human judgment to make forecasts, and then we don’t have this problem anymore.
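To picture what a data-driven forecast might look like in miniature, here is a sketch of the general idea only, not Amazon’s actual system, with made-up numbers: instead of asking a person for an optimistic point estimate, quote a conservative quantile of how long similar jobs have actually taken.

```python
import math

# Hypothetical record of how many days similar past jobs actually took.
historical_days = [3, 4, 4, 5, 5, 6, 6, 7, 9, 14]

def quoted_estimate(history, quantile=0.9):
    """Nearest-rank quantile of past completion times: an estimate most past jobs beat."""
    ordered = sorted(history)
    rank = max(1, math.ceil(quantile * len(ordered)))
    return ordered[rank - 1]

print(quoted_estimate(historical_days))  # 9: ninety percent of similar past jobs finished within 9 days
```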
Okay, so that sounds perfect, and foolproof. But of course it isn’t.
GRUSHKA-COCKAYNE: We don’t always think mathematically, and we don’t have crazy statistical models in our mind that allow us to come up with exact accurate predictions.
That, again, is Yael Grushka-Cockayne, from the University of Virginia.
GRUSHKA-COCKAYNE: And you need the data. And you need to apply it in a sensible way, you need to identify those similar projects. What is my reference class? You know it’s not going to be just a bunch of projects with very different characteristics. So going and finding a similar reference class is by definition not always going to be easy, because projects are different.
And the difficulty of truly solving the planning fallacy is perhaps best exemplified by our Canadian friend Roger Buehler.
DUBNER: So you began by telling us that you got interested in the planning fallacy out of personal experience. You’d bring this briefcase home full of work and never get it done. Have you improved over time?
BUEHLER: Unfortunately, no. I’m still carrying around this overweight briefcase. I do think I’ve learned, and that I’m more open to using my past experiences, and probably more likely, having studied it this many years, to pause and ask myself, “Okay, what’s happened in the past?” And I should take that in. But even so it’s not the natural way to do it. It still takes effort to make your predictions in that manner. It’s easier to fall into an optimistic plan for the case at hand.
Was there ever a more optimistic plan than New York Governor Nelson Rockefeller’s plan to build the Second Avenue Subway in New York City? As you’ll recall, the proposal was hatched in 1968; it was finally passed along to the Metropolitan Transportation Authority’s Dr. H. in 2008.
HORODNICEANU: I cannot tell you that I have one big thing that created a problem, because there was not that. You do have the little things that constantly interfere.
Those little things tend to add up. In 2007, the M.T.A. pushed back the completion date to 2014. A year later, it was pushed back another year. Same routine the following year. It took a work-acceleration agreement and an infusion of $66 million above the planned budget to finally bring the project home on the last day of 2016.
HORODNICEANU: The governor took a celebratory ride the night of December 31st. We had a big party in the station at 96th Street.
CUOMO [in a recorded clip]: Well, thank you very much. Good evening to all of you. What a great night, huh?
That governor, by the way, was Andrew Cuomo, the son of Mario Cuomo, one of the previous governors who had tried to build the Second Avenue Subway. To date, the project has cost $4.5 billion, making it, per mile, one of the most expensive mass-transit projects in history. Oh yeah: it also came in about $700 million over budget. And here’s the best part: all that money and time went into building just two miles of tunnel and three new stations — not the 8.5 miles and 15 stations in the original plan. Those are still to come. As Dr. H. sees it, a deadline this badly missed comes from some unexpected developments, to be sure. But also: deliberate deception.
HORODNICEANU: Quite frankly, most of these megaprojects are being started by people that never end them. I’m talking about elected officials, because they are the ones that make the decisions. They will have aggressive schedules and optimistic budgets. Why? Because they will have to show their constituents that this is done. So, now if you start a project today, chances are that someone else will finish it. Right? So I’m going to take the view that I’m on the job now. I want to get to be re-elected for the next four years. We’re going to be efficient and do all of these things. And then you’re gone. So now comes the next guy or woman and says, “Holy, it’s going to cost another, I don’t know, half a billion dollars to complete it.” So a lot of these projects, and this is the unfortunate truth, would never have actually happened unless people were presented a more optimistic view than what actually would be.
The second phase of the Second Avenue subway line, another 1.5 miles, is now under construction. It’s set to cost $6 billion and open between 2027 and 2029. How much would Dr. H. be willing to bet on a timely, on-budget completion?
HORODNICEANU: I’m not a betting man. I only bet when I am sure that I can win, right? I wouldn’t bet on that at this point.
And how about his chances of someday riding the entire completed line, all the way from downtown Manhattan up into the Bronx?
HORODNICEANU: Me? No. I have no expectation or desire to live that long.
* * *
Freakonomics Radio is produced by WNYC Studios and Dubner Productions. This episode was produced by Alvin Melathe. Our staff also includes Alison Hockenberry, Merritt Jacob, Greg Rosalsky, Stephanie Tam, Max Miller, Harry Huggins and Brian Gutierrez. The music throughout the episode was composed by Luis Guerra. You can subscribe to Freakonomics Radio on Apple Podcasts, or wherever you get your podcasts. You can also find us on Twitter, Facebook, or via email at radio@freakonomics.com.
Sources
- Roger Buehler, professor of psychology at Wilfrid Laurier University.
- Bent Flyvbjerg, professor at Oxford University’s Saïd Business School.
- Yael Grushka-Cockayne, professor of project management and decision-making at the University of Virginia’s Darden School of Business.
- Michael Horodniceanu, former president of M.T.A. Capital Construction.
- Danny Kahneman, professor of psychology at Princeton University.
- Katherine Milkman, professor of operations, information and decisions at the Wharton School of the University of Pennsylvania.
- Justin Rosenstein, co-founder of Asana.
- Tali Sharot, cognitive neuroscientist at University College London.
Resources
- “Exploring the ‘Planning Fallacy’: Why People Underestimate Their Task Completion Times,” by Roger Buehler, Dale Griffin, and Michael Ross (Journal of Personality and Social Psychology, 1994).
- “The Role of Motivated Reasoning in Optimistic Time Predictions,” by Roger Buehler, Dale Griffin, and Heather MacDonald (Personality and Social Psychology Bulletin, 1997).
- “Planning, Personality, and Prediction: The Role of Future Focus in Optimistic Time Predictions,” by Roger Buehler and Dale Griffin (Organizational Behavior and Human Decision Processes, 2003).
- “What You Should Know About Megaprojects, and Why: An Overview,” by Bent Flyvbjerg (Project Management Journal, 2014).
Extras
- “Information Overload Now $997 Billion: What Has Changed?” by Jonathan B. Spira (Basex, 2010).
- “Worker, Interrupted: The Cost of Task Switching,” by Kermit Pattison (Fast Company, 2008).
- The U.K.’s “Green Book.”
- “Procedures for Dealing with Optimism Bias in Transport Planning,” by the British Department for Transport (2004).
- “Optimism Bias Study: Recommended Adjustments to Optimism Bias Uplifts,” by the British Department for Transport (2017).