Episode Transcript

DUBNER: This is what we’re going to do. It’s going to work out well, because I said so.

*      *      *

DUCKWORTH: I’m Angela Duckworth.

DUBNER: I’m Stephen Dubner.

DUCKWORTH + DUBNER: And you’re listening to No Stupid Questions.

Today on the show: What is the secret to making a great prediction?

DUBNER: My prediction would have been right if these things had happened.

Also: Why do even the most successful people have bad days?

DUCKWORTH: The eggs were overcooked. 

*      *      * 

DUBNER: Angela Duckworth.

DUCKWORTH: Stephen Dubner.

DUBNER: There is at least one way that I would very much like to be more like my son, Solomon, who is a college sophomore. He seems capable of making predictions in a way that is totally divorced from emotion, even when he has a stake in the thing he’s predicting. So, as an example, he works in politics. And he follows things closely, and he has some decent information, so he often has a pretty good sense of who’s going to win a campaign — whether it’s his candidate or the other. And even if it’s the opponent, he’s pretty realistic about not letting his fan interests get in the way. The same thing for sports. And he’ll even say, “I have a lot of confidence in this particular projection.” Even though he has never studied with the forecasting guru Phil Tetlock, who, you know, he’s probably never heard of Phil Tetlock, he’s able to really assess a probability about something that he cares about, and then follow it in a really unemotional way. And when I watch him do that, I think, I would like to have some of that — not just predictions, and not just betting, and not just sports and politics, but how can I, and maybe other people, learn to make decisions that are less influenced by emotion, and then, relatedly, not get swept up by the emotional response while monitoring the outcome? 

DUCKWORTH: Well, that is really remarkable, actually, that Solomon is able to make predictions about things that people have emotions about in particular, right? You know, “Is my candidate going to win? Will my team win the Super Bowl?” And not to get wrapped up in that. It sounds like he is what the venerable Phil Tetlock would call a “superforecaster.” What Phil would need to say that Solomon was indeed a superforecaster is to actually have Solomon make a series of predictions: How is the stock market going to do? Are we going to invade this country or not? Will the United States make this decision or that decision? And then, he would actually see how accurate Solomon is. That’s how Phil and his colleagues identified a group that consistently performed in the top two percent of forecasters in a tournament that extended over years. 

DUBNER: And we should say, one reason he did that superforecaster tournament was because in many, many years of work before that, he studied pundits, essentially — people who, in their profession, make predictions. So, these are: people in academia, political scientists, let’s say, and people in the stock market. So, he looked at all these different domains. And what he found, which is both surprising, and yet not surprising at all, is that experts are really bad at predicting the future, because predicting the future is really hard — but experts may have an added dilemma in that they tend to have a lot of confidence, including overconfidence, which Tetlock identified as a particularly powerful trait in our ability to not be good at predicting. 

DUCKWORTH: Yeah, this study that he did, which is really one of the big studies to happen in the last 20 or so years, found that experts’ predictions are really only very slightly better than throwing darts at a dartboard. So, why is it that they’re so confident? It may be that, first of all, many experts are making, quote unquote, “predictions” in hindsight. So, they are the Monday morning commentary on what happened at the Sunday football game, and we turn to the experts for an explanation but, of course, in hindsight, everything is 20/20. The other reason, I think, confidence ends up becoming a hallmark of commentators whom we assume to be experts at what they do is because, if it were not for that confidence, we wouldn’t be listening to them. And there might be a weak positive correlation between the confidence in your judgment and its accuracy. So, when we, for example, go and ask a waiter, “What’s the best dish on this menu?” If the waiter very confidently says, “You must get the chicken salad on a croissant — hands down, it’s fantastic!” Most people, including me, would be more inclined to order it than if the waiter were like, “Eh, I don’t know, but maybe the chicken salad?” 

DUBNER: See, when I hear him pushing the chicken salad on the croissant, I think, “Oh, they made four gallons of it Thursday, and they can’t get rid of it.” 

DUCKWORTH: It’s about to go bad! But, in general, I think there could be a weak positive correlation between the accuracy of a forecast and our confidence in it, and that might lead the audience to conflate confidence with accuracy. So, you said of Solomon, not only does he have this uncanny ability to be rational and deliberate, and therefore maybe more accurate than you are about the same kinds of forecasts, you said he was very confident in his forecasts.

DUBNER: Well, he will say, occasionally, “This is a forecast that I have a lot of confidence in.” Let’s say it’s electoral outcomes. So, during the most recent election, he worked on a couple of campaigns, and I think both his candidates lost. And he told me, long before the outcome, that one of them was definitely going to lose, even though I thought that person was a favorite. The other one was a bit of a dark horse, but was getting a lot of headlines, and so everyone assumed that that person was going to win. The coverage certainly said it. Some of the polling said it. But Solomon told me, like, “Nah. It’s not going to work out that way.”

DUCKWORTH: The interesting thing just to underscore, though, on confidence, is that superforecasters, as a group, would tend to actually have less confidence in their predictions overall. One of the mistakes that most people make when they are making predictions is to have too narrow a confidence interval, as the statistical term goes; in lay terms, it would mean being overconfident about how precise you are in, you know, guessing how many electoral votes one candidate is going to get versus the other. So, this idea that, to be a superforecaster, you would not only try to divorce the personal bias or emotion from things, but also that you would be willing to say, “Hey, you know what, there’s actually a large range of possibilities and I think this is going to happen, but, to some extent, who knows?” 

DUBNER: Can I ask you a question that’s always confused me about the prediction literature? I know that Phil Tetlock and his colleagues argue that one component of a good predictor — or a better-than-bad predictor, at least — is that they tend to employ what’s called “an outside view.” I believe that phrase may have come from Danny Kahneman, not from Phil Tetlock, right?

DUCKWORTH: Right. When Danny talks about “the outside view” rather than the inside view, he’s talking about looking at the base rate of similar phenomena in the world that are not in this special case. For example, if a company says, “You know, we’re really having a problem with our inventory management, could you come and work with us? We’re going to tell you about our inventory. We’re going to talk about our org chart. We’re going to show you our numbers.” That’s all the inside view, because that’s the particulars of that company. But really, the outside view is also helpful — which is, in general, when people have inventory management problems, what is the typical culprit? And if you then think about things like hiring — I make this mistake all the time, that when I have a candidate that I start to root for, like, “I think I’m going to hire this person,” I feel like all I can take is the inside view. I just reread their CV again and convince myself again that they’re perfect. Now, the outside view, Danny might say, would be this: “Angela, forget all about that. At a base rate, how many people are successful in this job, in that they go on to, you know, Ph.D. programs, etc.?” Then I would say, “Well, Danny, that’s kind of sobering, because, you know, not many.” You use the inside information to calibrate, but the starting point is the base rate. 
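Angela’s hiring example can be sketched numerically. The outside view supplies the prior (the base rate), and the inside view then adjusts it via Bayes’ rule. This is a minimal illustration with entirely made-up numbers, not anything measured in the episode:

```python
# A toy sketch of Kahneman's "outside view": start from the base rate
# (the prior), then let inside-view evidence adjust it with Bayes' rule.
# All numbers below are hypothetical, for illustration only.

def posterior(base_rate: float, hit_rate: float, false_alarm_rate: float) -> float:
    """P(success | strong interview), given the base rate of success,
    P(strong interview | success), and P(strong interview | failure)."""
    evidence = base_rate * hit_rate + (1 - base_rate) * false_alarm_rate
    return base_rate * hit_rate / evidence

# Outside view: historically, say only 20% of hires thrive in this role.
# Inside view: a strong interview shows up in 60% of eventual successes,
# but also in 30% of eventual failures.
p = posterior(base_rate=0.20, hit_rate=0.60, false_alarm_rate=0.30)
print(round(p, 2))  # 0.33
```

Even glowing inside-view evidence moves a 20 percent base rate only to about 33 percent, which is the sobering point Danny makes: the base rate is the starting point, and the particulars only calibrate it.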

DUBNER: Right. Especially when you talk about hiring or project completion. I do remember seeing this research about the usefulness of what are called “prediction markets.” A prediction market is essentially creating a market for prediction so that people have some skin in the game. And we’ve seen that prediction markets are better at predicting than pundits because they represent a diversity of views, and a pundit will often have a rooting interest but not that much at stake. So, it’s fun to go on TV or write an op-ed for the Times or the Journal and make a big, bold prediction without necessarily being right. And then, they have all different ways to explain after the fact, “Well, my prediction would have been right if these things had happened.” Well, yeah, that’s why predicting is hard. But what a prediction market will do is bring together all the information. One area where I’ve read about it applied in a particularly fruitful way is within companies that are trying to, let’s say, open up business on a new continent or start a new product. What happens is, you have this internal prediction market, and it may be anonymous. You ask people, “What is your level of expectation that this project will be completed on time? If it’s not, what will be the problems?” And what this tends to do is float up, especially from the bottom of the organization, information that’s much more realistic rather than the project manager, or the C.E.O., or the C.F.O. predicting: “This is what we’re going to do, it’s going to work out well because I said so, and because we have a plan.” Leaders of this sort often experience what’s called “go fever.” I think the phrase came from NASA, where once you put the rocket on the launchpad, you really want to launch it. Firms and institutions all over the world do this all the time without really knowing how well that project launch will go. 
So, if you can have this anonymous internal prediction market, you can probably glean much more useful information, because you’re hearing from the people who have better information than the kind of “fans,” or people who are rooting at the top.

DUCKWORTH: Rooting for an outcome and allowing that to cloud your judgment of whether the outcome is going to happen is indeed a dangerous thing. We want something to be so, and therefore we predict that it will be so. And superforecasters, as a rule, tend to do that less, and that, indeed, makes them better forecasters. I know that Phil has spent much of his life studying how what people want to believe and what people do believe end up blending. Our ideology clouds our judgment. And I do think that, if you run a large organization, rather than relying on the four people whose vested interest is in promoting this new project and getting their part of the budget allocated for the next fiscal year, you’re better off taking everyone’s judgment and weighing it. The obvious problem with that is, say you’re in this prediction market and you get an email that says, “Hey, you know, we’re thinking about starting this new branch. We’re wondering, what do you think the likelihood of success is? What’s your lower bound and what’s your upper bound?” What you don’t have is a lot of information. One of the things that Phil Tetlock and Barbara Mellers — his wife and collaborator — and their colleagues would say is that, it’s not that all opinions are equal. Like, the wisdom of crowds idea, which is that collectively we may come to a better answer, that is true. But it doesn’t mean that all people’s answers are equally good or bad. 

DUBNER: You know, there were two other attributes that I take real comfort in when Philip Tetlock talks about people who are better than average at predicting. And you touched on both of them briefly. One is: you can’t be dogmatic. If you believe your prediction will come true because it is what you believe in, then it’s likely to be a pretty poor prediction. And another one is just humility, essentially, but it’s really acknowledging not only what you don’t know, but what is unknowable. And I think that’s something that people have a really hard time with, even with something as low-stakes and binary as sports. It’s remarkable when you hear sports pundits who are really bad, on average, at predicting outcomes. And these are people who have a lot of experience, a lot of information, but their prediction records are often terrible. And I think the reason why is, even though they might be a former player or a former coach, and they may know much more about the game than the average person, there is so much more in the way the world works and the way a particular athlete or team will perform on a given day. And then there’s chance, and how do you measure chance? We’re not very good at those things either. So, it kind of makes you want to throw up your hands and say, “Well, the world’s unknowable. I know it will be colder in winter than it is in summer. But other than that —”

DUCKWORTH: Well, okay, we can take some lessons from the superforecasting research that Phil and colleagues did. They studied the ability of this top two percent of their whole prediction tournament — and they did this year-on-year, so, at the end, they actually had a reasonably large number of superforecasters they could study, and they could look for systematic features of these superforecasters that we might emulate. One is that superforecasters systematically gather evidence from a variety of sources. Now, what this looks like for us is, like, don’t just read The New York Times, right? Like, yes, love your New York Times, but also occasionally read The Wall Street Journal, or The Economist, or tune into Fox News just to see what’s going on. So, don’t narrow yourself to a very small number of informational sources that are starting to become redundant with each other. The second thing is that superforecasters tend to think probabilistically. This doesn’t mean that they were mathematicians. Actually, it’s not the case that the superforecasters necessarily had statistics background, or higher-level mathematics background. They simply thought probabilistically. For example, if you tell me, “I think it’s going to rain tomorrow,” what a superforecaster would immediately do is think that there’s some probability that it’s going to rain, that there is an upper bound and a lower bound. They’re not thinking it’ll rain or not rain. And Phil has done intervention research to show that you can actually help people think a little bit more probabilistically. Third is, they work in teams, and they tend to benefit a lot from other people’s opinions. And then, the last thing, I think, is most important, because it might explain why sports commentators and political pundits are not as expert as they ought to be, which is that you have to basically keep score of how you’re doing. 
When your prediction turns out not to be true, then you need to examine what happened, and you have to admit that you’re wrong and then update your beliefs. And I think that’s the thing that pundits and commentators are not incentivized to do. 
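The “keeping score” step has a standard quantitative form: Tetlock’s forecasting tournaments ranked participants by Brier score, the mean squared error between a probability forecast and what actually happened. Here is a minimal sketch; the two track records are hypothetical, invented only to contrast overconfidence with calibration:

```python
# Minimal sketch of "keeping score" with the Brier score, which Tetlock's
# tournaments used to rank forecasters. Lower is better; always hedging
# at 50/50 on yes/no questions scores 0.25.

def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts (0..1)
    and binary outcomes (0 or 1)."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical track records: three predictions each, same three outcomes.
confident_pundit = brier_score([0.95, 0.90, 0.99], [1, 0, 1])
careful_forecaster = brier_score([0.70, 0.40, 0.80], [1, 0, 1])
print(round(confident_pundit, 3))    # 0.271 -- one confident miss is costly
print(round(careful_forecaster, 3))  # 0.097 -- humbler, but better calibrated
```

The design point is the one Angela makes about pundits: a score like this only disciplines you if you record the forecast before the outcome and then actually go back and look.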

DUBNER: You know, in terms of thinking probabilistically and explaining your predictions in probabilistic terms, I think, actually, that meteorologists have done an amazing job. If you get the five-day weather forecast, there’s a 20 percent chance of rain next Sunday morning. That means it’s probably not going to rain, but if it does, I’ve got plausible deniability — 20 percent. It’s very rare — at least in this part of the country where I live, in the Northeast — that you see 100 percent chance of anything. I actually think that’s brilliant. What I would like [is] to have all my pundits talk to me that way, like, “There’s a 70 percent chance that this very likely thing will happen, but let’s not deny the fact that 30 percent is not nothing.”

DUCKWORTH: Right. “There’s a 60 percent chance that this healthcare reform is going to be good — like, net for everyone. And there’s a 40 percent chance it’s going to be bad. We hope you’ll vote for it.” I like that. It makes the Weather Channel come out as one of the great elevators of the human experience. And, I actually, in all seriousness, wonder whether more people think probabilistically because of the Weather Channel. 

DUBNER: On the other hand, the Weather Channel is really good at taking a squall in some state a thousand miles away and making it into the end of the world, so — 

DUCKWORTH: Stephen, they have 24 hours of programming to get through every day. You got to make the most of a squall. 

Still to come on No Stupid Questions: Stephen and Angela discuss ways to get over a bad day. 

DUBNER: It’s why there’s a thing called cocktail hour. 

*      *      * 

DUCKWORTH: Stephen, I have a question from Tyler Thorstrom. Here it is, very briefly: Why do successful people have bad days? 

DUBNER: Why do successful people have bad days? We may have to change the name of the show.

DUCKWORTH: Oh, you think it’s a— Don’t say it!

DUBNER: That is a borderline—

DUCKWORTH: Tyler, cover your ears. 

DUBNER: I mean, my answer would be: Why wouldn’t successful people have bad days? But maybe Tyler meant to ask something beneath the surface that I’m not seeing. I’m guessing that’s what it is. Maybe he means to say, “Why do even successful people have bad days?” 

DUCKWORTH: I think this question actually might be — and Tyler, forgive me if I have this wrong — that, say you really are a productive, effective, wonderful human being.

DUBNER: A successful person, in other words. 

DUCKWORTH: Yes. Now, you have all these wonderful competencies, why wouldn’t you only have good days? In some ways, it makes a lot of sense. Like, why is there variability in your life after you’ve achieved a certain capability, right?

DUBNER: I don’t know. I think that feels absolutist and utopian in a weird way. But look, I kid, Tyler. I think it’s a really interesting observation and good question — in terms of why, quote, “even” successful people have bad days. And you could argue, depending on how you want to define a successful person, that people who are, quote, “successful” probably try more difficult things than people who are not successful, and therefore, you might argue that they would have more bad days than unsuccessful people. 

DUCKWORTH: I was just reading this book.

DUBNER: I read that one, too.

DUCKWORTH: Did you? You’re always doing the same thing. There was a quote there — this is now fourth-hand, or something — I read the book, and then the author said that the founder of Spanx had a father who said, “What have you failed today?” when she was a little girl growing up. And that taught her that, look, if you’re going to do hard things, you’re going to fail.

DUBNER: Is there any evidence from the psych literature showing that there are some humans who’ve never had a, quote, “bad” day? 

DUCKWORTH: Well, there are so many studies that track people’s experience over time, and so much evidence that there’s variability in mood over time, and in relationships over time, in productivity — that just, on that evidence, you could say that everybody has, at least relative to themselves, a bad day. Let me ask, channeling my inner Tyler Thorstrom: “Stephen, do you have bad days?” Let’s assume that you would count as a successful person. What are they like and why do you have them? 

DUBNER: I appreciate you counting me as a, quote, “successful” person. I’m flattered. Do I have bad days? Absolutely. But I don’t think of them as bad days. I think of them as a bad thing, or three, happened on this day. 

DUCKWORTH: Okay. Give me some examples. 

DUBNER: Well, when bad things happen, that’s not what I’m talking about when I think of a bad day. Because terrible things happen all the time. Someone gets killed, or there’s a terrible tragedy. Like, the world is full of bad things happening. If you’re the person to whom that happens, that’s the very definition of an infinitely bad day.

DUCKWORTH: You’re right. We should acknowledge that. But I don’t think that’s what Tyler meant.

DUBNER: Right. And there’s a spectrum of bad. For me, when a bad thing happens that touches me somehow, but it’s not of my doing, I don’t have that big of a problem with that. I feel it’s unfortunate. If there’s a failure with our work—

DUCKWORTH: Like, the power goes out or a file gets deleted. 

DUBNER: Yeah. It’s unfortunate, and you deal with it. To me, what feels like a bad day, or bad incidents in a day, is when I did something that I regret.

DUCKWORTH: So, you had to have some volition.

DUBNER: Some agency, yeah. For me, a bad day feels like when I’ve made a series of decisions that led to a suboptimal outcome and maybe a bad emotional response — maybe hurt feelings or frustration. That’s what feels like a bad day to me. So, I’ll give an example. Let’s say we’re working on a particular piece — a podcast or a book chapter or something like that. And you’re constantly making decisions about what’s an idea worth pursuing. Is it new and interesting? Is it demonstrably true? What’s the data or stories to explain this idea to a listener or reader? And there will be a decision along the way. Like, we think this person would be a good contributor to that. And in the pit of my stomach, I may think, “What I know of their work, it doesn’t look that good. It doesn’t look that interesting. It doesn’t look that current. It doesn’t look that empirical.” And then I’ll either talk myself into it or let myself get talked into it. And then you start to prepare for hours and hours. And then the interview comes, and in the back of my mind I’m thinking, this might have just been a massive waste of time, and then it is. Like, within five minutes, I can tell that this person is just not the right person for what we’re trying to do. They don’t have a command of the literature or the history that we’re trying to tell. They are not an interesting talker, whatever it is. And then, afterwards, I am ticked. That’s a bad day. That’s something that I’m going to feel bad about for a while and be frustrated at. 

DUCKWORTH: And then what happens? 

DUBNER: And then I think, okay, let’s post-mortem the crap out of that sucker. Why did it happen exactly? Where did our assumption go wrong? Maybe it’s just chance. Maybe I just didn’t do a good job. Maybe the energy just wasn’t right. Maybe their kid was home with the flu and they were distracted and worried. But it does lead you to constantly change your protocol of how to figure out how to spend your time more productively. Because, as my friend Angela Duckworth has told me, time and again, life is short. And even an hour that you could spend doing something interesting, productive, fruitful is worth doing interestingly, and productively, and fruitfully. And yet, I still fail at that kind of thing, probably as much now as I did ten years ago. So I’m not really getting very good at figuring out how to solve that sort of problem retrospectively.

DUCKWORTH: Doing pretty well, though. You’ve got this curious attitude about what you can harvest from this crop of failure. I’m feeling like you’re pretty evolved. I’ve been reading this book from, gosh, I don’t know, the 70s or something, called The Erroneous Zones, or Your Erroneous Zones, have you ever heard of it?

DUBNER: So, this is a play on Your Erogenous Zones, correct? 

DUCKWORTH: It’s a play on your erogenous zones. It was, like, one of these early self-help books. It was a runaway bestseller. The reason I’m reading it is because my literary agent, Richard Pine, told me it’s a book that changed his life. His dad was also a literary agent, and apparently, this manuscript comes in — and I don’t know, maybe nobody in the office had any interest in it. And his dad gives it to Richard, and he says, “Take a look at this. Tell me what you think.” Richard reads every word. And then his dad says, “What should we do with it?” And Richard says, “It’s the best book I’ve ever read. It changed my life. I’ll never be the same. What this book says is that I have choices, and I am the sum total of those choices. I have agency.” The book is titled Your Erroneous Zones because you have mistakes that you are likely to make, traps that you’re likely to get in — being the human that you are — and it’s a guide to making better choices. So, I was listening to the audio version of this book; I think the author is Wayne Dyer. And Wayne says, in Your Erroneous Zones, that at an earlier point in his life, he didn’t think there would be such a thing as a bad day if you did everything right. And part of growing up is just realizing that no matter how successful you become, and even if you do make all the, quote unquote, “best choices,” that you should expect that you will have some days that don’t feel good, and that should be okay. You shouldn’t run away from it or labor under the illusion that it is anything but that way.

DUBNER: So, these are really nice pieces of advice, and it reminds me of something I heard recently that just so tickled me. I have a friend who is a doctor, he’s maybe 60, early 60s, and he was telling me about a friend and mentor of his. I believe he’s a rheumatologist, the older gentleman. He’s in his 80s now, and he’s still practicing medicine, still teaching. And my friend, whose name is David, says that every time he sees this eighty-year-old friend-mentor, he says, you know, “How are you doing?” And the answer is always the same, “Never been better!” Which, somehow when you’re in your 80s, it seems wrong. You’ve never been better? Come on, what about when you were 21? And he says, “If you say it every day, to yourself and other people, it’s not like it’s a magic trick and it becomes true, but a little part of you believes a little part of it and makes it better.” But if the inclination is to eliminate bad days, I don’t think that is a good inclination. Because then you’ll take no risks, you’ll reach for nothing beyond your comfortable reach.

DUCKWORTH: Well, not that I’m Wayne Dyer, and not like I wrote Your Erroneous Zones, but in this 1970s self-help book, he does say that one of the pieces of advice that he would like to impart is that what’s done is done, the past is in the past, and not to relive that bad day for many more days. Make it at least one bad day and not an infinite number of bad days by revisiting it, chewing on it, ruminating. 

DUBNER: One solution that I’ve come up with is, if I had what you might call a, quote, “bad day,” I try to focus on what can I do between now and tomorrow to be in a good frame of mind for tomorrow? Because I don’t want it to carry over. And part of what I do then, which is, I think, what a lot of people do — it’s why there’s a thing called cocktail hour. Now, everybody has a different form of cocktail hour. But whatever it is, it’s to try to put an ending to even a bad day that feels like a better ending. This goes back to this colonoscopy paper that we talked about a while ago on the show. This was back when colonoscopies were painful to have administered. They’re not anymore. But the notion that if the end of the procedure is less painful than the rest of it, then your impression was that the whole thing wasn’t so bad. Now, you could say, “It’s just lipstick on a pig. The bad day was the bad day.” Or you could look at it with a little more optimism. My mom used to say, “A little powder and paint makes a girl what she ain’t.” The bad day recedes as the happy ending takes over, and I don’t know if a little self-delusion isn’t, in fact, quite helpful in order to get you on a better track so that tomorrow, at least you have a belief that you’ll have a better day, which I think is important, honestly.

DUCKWORTH: And I guess there’s a reason why happy hour is not at 10:00 in the morning and it’s not at noon. You know, let’s end our day with some happiness. 

DUBNER: That’s one reason. I think there may be some other reasons for that. So, let me ask you this, you hang out with a lot of people who really studied how to change the mindset with which you approach a task or a day. And you’ve done this yourself for years. This is what you do. So, let’s say that Tyler or someone comes and says, “I had a really bad day and I’m determined to have a better day tomorrow — even though the circumstances that produced the bad day today may still exist.” So, what are some ideas to at least approach it with a little bit more optimism — any kind of reset, or fresh start, or other good mind tricks? 

DUCKWORTH: Well, you just named one of them. And that’s the idea from Katy Milkman of having a fresh start. And that is actually more mental than physical, that you would just frame the new day as the beginning of a new streak, and that things are actually going to be different. And that could happen at the beginning of the month, the beginning of the week, your birthday, January one. But you could also just mentally create your own fresh start, like, “Today is a new day. I turn a new page.” I am very fond of a math teacher in New York City named Jeff Lee, and he lives by the mantra, “Forgive yourself every night, recommit every morning.” When I hear that, I almost can picture a river where you baptize yourself every night, and you forgive yourself, and you begin fresh and new with commitment to go do the good work that you’re doing. That’s, again, something of a mental fresh start — reframing. I have an undergraduate in my class who says that he keeps a journal — very simple, because so many of us want to keep a journal, we never got around to it, we buy journals and we don’t fill them. But his journal entries are so simple that he’s been able to make it a daily habit. He rates his day, every day, on a three-point scale: negative one — bad day, zero — eh, plus one — good day. Very simple. And then he says one thing that he did that was completely factual. Like, “I had two eggs, sunny-side-up, for breakfast.” Completely mundane.

DUBNER: That would make my whole day good right there. Just the two sunny-side eggs.

DUCKWORTH: Especially if they’re just right. And then, finally, something that he learned. And I think, in a way, that’s the best attitude — for any day, honestly, but especially the bad ones. Because you can imagine if you said “minus one, bad day, the eggs were overcooked. That’s what I had for breakfast.” And then, “What I learned is I should take the pan off the stove one minute earlier if I want my eggs to be sunny-side-up.” There you go. And I think that was good advice. 

DUBNER: That’s good advice. I hope Tyler enjoys this advice too.

DUCKWORTH: I hope Tyler’s not vegan. 

*      *      * 

No Stupid Questions is part of the Freakonomics Radio Network, which also includes Freakonomics Radio, People I (Mostly) Admire, and Sudhir Breaks the Internet. This episode was produced by me, Rebecca Lee Douglas. And now here is a fact-check of today’s conversations. 

Stephen recalls that the phrase “go fever” originated with NASA. This is correct, but more specifically, the term was initially used to describe the 1967 Apollo 1 disaster — three astronauts died when their command module caught on fire during pre-flight testing, two weeks before their scheduled departure. A review board determined that a number of design oversights led to the fire — the cabin had been pressurized with pure oxygen and there were many combustible materials in the capsule in addition to, quote, “vulnerable” wiring and plumbing. Critics blamed “go fever” as a group-think phenomenon where protocols were rushed in order to meet President Kennedy’s challenge of landing a man on the moon before 1970. 

Later, Stephen and Angela share their thoughts on how to recover from a bad day. Stephen tells the story of an 80-year-old rheumatologist who says he’s, quote, “never been better” when anyone asks him how he’s doing. This is actually a form of self-affirmation! In psychotherapy, a self-affirmation is a positive statement about the self that a person repeats on a regular basis. Mental health professionals may recommend this tool as part of a treatment program for depression. Regular affirmations have been shown to improve a person’s relationships, reduce their negative thoughts, and encourage behavior change. 

Angela suggests additional tools to help move on from a bad day including: committing to a fresh start, journaling, and trying not to ruminate. Stephen recommends wrapping your day with a metaphorical “cocktail hour,” whatever that means to you. Cognitive behavioral therapists might also recommend listening to music, spending time outside, talking to loved ones, practicing mindfulness, and getting good sleep. While a literal cocktail with friends may be the solution for some, for others, alcohol can result in increased feelings of next-day anxiety — or “hangxiety.”

That’s it for the fact-check.

No Stupid Questions is produced by Freakonomics Radio and Stitcher; our staff includes Alison Craiglow, Greg Rippin, Mark McClusky, James Foster, Joel Meyer, Tricia Bobeda, Zach Lapinski, Mary Diduch, Brent Katz, Morgan Levey, Emma Tyrell, Lyric Bowdich, Jasmin Klinger and Jacob Clemente. Our theme song is “And She Was” by Talking Heads — special thanks to David Byrne and Warner Chappell Music. If you’d like to listen to the show ad-free, subscribe to Stitcher Premium. You can also follow us on Twitter at NSQ_Show and on Facebook @NSQShow. If you have a question for a future episode, please email it to nsq@freakonomics.com. And if you heard Stephen or Angela reference a study, an expert or a book that you’d like to learn more about, you can check out Freakonomics.com/NSQ, where we link to all of the major references that you heard about here today. Thanks for listening! 

DUCKWORTH: Okay. I’m going to put my retainer back in. Hold on a sec. I have to, like, suck all the food out of my teeth. 

Sources

  • Philip Tetlock, professor of psychology at the University of Pennsylvania.
  • Daniel Kahneman, professor of psychology at Princeton University.
  • Barbara Mellers, professor of psychology at the University of Pennsylvania.
  • Richard Pine, Angela’s literary agent and co-founder of Inkwell Management.
  • Katy Milkman, professor of behavioral economics at the University of Pennsylvania.
