Episode Transcript
Stephen DUBNER: This is B.S.
* * *
Angela DUCKWORTH: I’m Angela Duckworth.
DUBNER: I’m Stephen Dubner.
DUCKWORTH + DUBNER: And you’re listening to No Stupid Questions.
Today on the show: What is the cost of admitting you’re wrong?
DUCKWORTH: Oh, look, I just updated a belief! Ding.
* * *
DUBNER: Angela, I recently had an embarrassing moment that I thought I could turn into a question for you. It will not be embarrassing for you.
DUCKWORTH: Hm? This is like my favorite thing. It’s going to be embarrassing for you.
DUBNER: Exactly. So, I was having a conversation with a neighbor of mine, an elderly Japanese fellow. And the shoes I was wearing happened to be this brand of Japanese sneaker that I love. I’ve been wearing them since high school.
DUCKWORTH: Wait, what’s the brand?
DUBNER: They’re called Onitsuka Tiger, or so I thought.
DUCKWORTH: Oh, I think I know what these look like. They have stripes that go down instead of up, like the Nike swoosh.
DUBNER: Yeah. They’re kind of minimalist shoes. They’re comfortable. So, anyway, I wear these a lot. And I was wearing them, and he looked down and pointed at them and just said something, you know, like, “Oh, yeah.” And I said, “Yeah, Onit-SU-ka.” And he said, “O-NIT-suka.” And I said to him, “No, no, no, it’s Onit-SU-ka.”
DUCKWORTH: Zero self-awareness, like, none.
DUBNER: And in that split second, I realized, “Wait a minute, that is a Japanese brand. And he is Japanese, and I am not.” So, I stood corrected. It was a little embarrassing. It did make me think back to this incident years earlier, when a friend of mine, he was a guy in a band — we were all in bands back then. And he was trashing this other guy in another band. And he said, “Yeah, that guy, he’s the ‘epit-tome’ of stupidity.” I thought to myself, “Wait a minute, you sure he’s the ‘epit-tome’ of stupidity?”
DUCKWORTH: I love that.
DUBNER: So, you know, that’s a case where there’s a word that you’ve seen written — you don’t know how it’s pronounced. My son, when he started to read, he would have all these words that he sort of knew what they meant, but he didn’t know how to say them. So, anyone from Norway was, according to him, a “Norwiggian,” which, you know, makes a lot of sense on page. So, anyway, there are these things that, you learn over time, you’re wrong about, and then you correct yourself, and it’s pretty costless. But it made me want to ask you a much larger and more serious question, which is: Have you had a similar experience — not on something small and silly, like a pronunciation — but on an idea? In other words, can you tell me something that you believed for a long time to be true and didn’t question it at all, but later learned that you were in fact wrong?
DUCKWORTH: I can think of a couple of examples, one small and one big. So, small example is: I’m having dinner with my mother-in-law and my husband recently, and they’re talking about Scandinavia, and I’m trying to join in the conversation, as if I know anything. It comes to light that I clearly think Scandinavia is a country. And then gently — ever so gently — my mother-in-law points out that Scandinavia is not a single country, but a region. And I was like, “Why would they go and name a region something that just sounds so much like a country? And doesn’t it have a flag?” And she was like, “Scandinavia doesn’t actually have a flag, but I can see how you would get confused, because many of the flags of these countries, they’re so similar.” And my husband says, like, “No, they’re not.”
DUBNER: And did she ask you to produce, on the spot, your actual Ph.D. certificate to prove that you—
DUCKWORTH: So that she could shred it? So she could rescind my credentials? No, my mother-in-law’s very gentle and tactful, and it’s not the first time that she’s encountered the oceans of my ignorance. So, I think she’s pretty good at finding a way out of the situation that allows me to save face. But it is sort of shocking, actually, to spend your entire adult life thinking that a region is a country.
DUBNER: By the way, the “Norwiggian” people live in Scandinavia — if that’s any consolation.
DUCKWORTH: So, that’s a small example. But let me give you a bigger one. I, for a very long time, subscribed to this theory of self-control called “ego depletion.” Have you heard of it?
DUBNER: I have, I think, from you. And it’s one of those phrases that is so compelling as a name.
DUCKWORTH: Yeah, it’s incredibly compelling. It’s a theory, and it’s also a metaphor, and it goes like this: Roy Baumeister, among others — but chiefly Roy Baumeister, brilliant psychologist — was interested in what’s really going on with self-control, and why is self-control so hard? And why is it that, after a long day, we end up maybe drinking too much, or eating too much, or losing our temper? His theory of self-control was that the self, “the ego,” is a limited resource. And the way this works is that if you “use up” your self-control, as it were — if you do one, two, three, four hard things in a row: you have to concentrate on a boring proofreading task for your work, and then you have to control your temper, and then you have to do something else — that your self-control, bit by bit, goes down to the point where you don’t have enough left. And that’s why you end up acting impulsively at the end of a long day, or after doing a series of hard things.
DUBNER: I’m still waiting for the sword to drop, though, because to me it sounds like it should be true.
DUCKWORTH: Well, there was a study where you have to do some tasks that require self-control — stuff that takes a lot of concentration, is very tedious and un-fun, and therefore requires some amount of impulse control, because you have to do it and not let your mind wander. And in one condition, the research subjects drank some Splenda-sweetened lemonade. And in another condition, they drank sugar-sweetened lemonade. And the idea is, if self-control is a limited resource — if the ego is a limited resource — maybe it runs on sugar, because we know the brain, essentially, does run on glucose. The results of the study were striking. Only the subjects who drank the sugar-sweetened lemonade ended up having this, kind of, bounce-back in their ability to exert self-control on a subsequent task. And so, not only did we have this compelling metaphor — self-control is a limited resource — but now we have what seems like brain science. The limited resource is not just a metaphor, it’s a thing.
DUBNER: So, you’re just persuading me more, and more, and more, and more that ego depletion is real.
DUCKWORTH: Yeah. And that’s where I was. It was so provocative. And here’s the sword dropping: So, in quiet ways, one Ph.D. student after another was trying to actually do these ego-depletion experiments. And they were failing to find the ego-depletion effect. Then, it turns out that the lemonade study that I just told you about does not replicate. When you do it again with bigger samples, you don’t get the effect. What the field now believes — and this is a big update — is that when you choose an effortful, not-so-fun option over a really fun, immediately gratifying one, it’s not that you run out of self-control, not that your brain uses up more glucose or energy, but really that you’re getting a signal — the sensation that, “Hmm, there is a cost to what you’re doing.” And that feeling of effort, or that feeling of displeasure, is just, like, a counter. Like, “Hey, keep in mind that what you’re doing here is coming at a cost, even an opportunity cost — things that you could otherwise use your attention to do.”
DUBNER: What did it cost you to give up your belief in that? I’m guessing not very much. Were you heavily invested, yourself, in the theory of ego depletion?
DUCKWORTH: I’m sure I had written papers on self-control that alluded to ego depletion. I don’t think — and this is, maybe, one reason why I was able to update my beliefs more readily than otherwise — I don’t think that I had my reputation to lose. It wasn’t my theory.
DUBNER: It was Roy Baumeister’s theory. So, what did Roy Baumeister lose from having his ego-depletion theory picked apart?
DUCKWORTH: Well, I’ll tell you. It’s kind of an ongoing saga. I think Roy Baumeister stands by his original proposition, maybe with some slight modifications. He’s a giant of a social psychologist who, you know — he should be pretty comfortable walking away from that one theory.
DUBNER: Right. Because he’s got enough.
DUCKWORTH: He has a half dozen. You know: the nature of consciousness, why we have self-esteem, the list goes on. But I do think that, when you talk about grit, for example, like, what if everything I have claimed about grit is simply untrue? How easy will it be for me to turn around, make a 180, and say, “You know what? I just need to announce to everyone that success doesn’t come from passion and perseverance.”
DUBNER: So, let’s have that thought experiment for a minute.
DUCKWORTH: Oh, it’s so uncomfortable. I don’t really want to do it. No, no, wait, hold on. I’m going to put on my grown-up pants. Okay. Go.
DUBNER: We can talk about ice cream sundaes, if you’d prefer — if that’s more comfortable.
DUCKWORTH: No, self-control. Let’s do the better, more gratifying thing in the long run.
DUBNER: So, you’re an academic who’s built a reputation in the academic arena, and in the public arena, really, around this one idea of grit. And there are people who’ve written about grit not being the great explanatory factor that you have claimed it is. So, how do you process those claims? And how do you allow yourself to take them in without your ego getting in the way — to say, “Well, let me take a look at what’s being said here, and let me re-examine my research and my argument to make sure that I’m on the right track”?
DUCKWORTH: So, we all know what the right thing to do is. We all know that when somebody criticizes you and your work—
DUBNER: It’s to hire a private investigator to trash your critics?
DUCKWORTH: Yes. Exactly. We all know that! No, we all know that we’re supposed to be intellectually humble. Let me give you a little quiz, Stephen. And I’m not avoiding the question. I’m going to answer these questions for myself. They’re intellectual humility questions. And then let’s see how I do in terms of grit.
DUBNER: Can I just say: If this is a quiz to measure intellectual humility, I have a lot.
DUCKWORTH: Let’s see how you do. Actually, I’m just going to read them all to you. Do you want to just keep a tally?
DUBNER: Okay.
DUCKWORTH: First question, yes or no: “I question my own opinions, positions, and viewpoints, because they could be wrong.” “I reconsider my opinions when presented with new evidence.”
DUBNER: Isn’t that kind of the same as number one?
DUCKWORTH: They’re all kind of the same. This is the way scales work. “I recognize the value in opinions that are different from my own.” “I accept that my beliefs and attitudes may be wrong.” “In the face of conflicting evidence, I am open to changing my opinions.” “I like finding out new information that differs from what I already think is true.”
DUBNER: Okay.
DUCKWORTH: So, how’d you do?
DUBNER: I hate these kinds of things, because “yes or no” is hard for me. Even though I ridiculed the first two as being very, very similar, there were a lot of distinctions as we went. And so, if I had had a scale of, let’s say, zero to five, I think I would have been pretty variegated. But I answered “yes” to four of the six: numbers one, two, three, and five, and “no” to four and six.
DUCKWORTH: So, you answered “no” to “I accept that my beliefs and attitudes may be wrong,” and “I like finding out new information that differs from what I already think is true.”
DUBNER: Yeah, I think that’s painful. Even if it’s not a public thing, even if it’s not a costly thing, I feel a little bit like a dummy. Like, “Wait a minute, why was I so convinced that was true? And now I see evidence and, plainly, it’s not.”
DUCKWORTH: And then you feel worse, not better.
DUBNER: Correct.
DUCKWORTH: And I’m with you. I think that when I’m criticized about grit, you know, “Grit is not nearly as predictive as Angela Duckworth makes it out to be,” or, “Grit is essentially the same thing as five, six, or seven other things that psychologists had been studying for centuries,” it stings! I kind of want to, first, show that I’m right before I take any steps towards intellectual humility. And, of course, those are at odds with each other. But let me tell you, over the last year or two, I’ve started to work more closely with Danny Kahneman — our friend, great psychologist, thinker. And I have watched his facial expressions very carefully when we are working through a problem, and he makes a correction of himself — we’ll be talking about something, and he will say, “I’m wrong.” And what’s interesting is that he’s simultaneously smiling.
DUBNER: He lights up, yeah!
DUCKWORTH: Yeah, he really does. Right?
DUBNER: Well, I know he’s argued that the best thing about being wrong is: you have a new set of information to play with.
DUCKWORTH: You’re learning.
DUBNER: Which is an extremely healthy attitude, but I think it’s taken him a lot, a lot, a lot of years to get there.
DUCKWORTH: I really have tried to imitate that. I was like, “Wow, what would it be to feel like I’m wrong with a smiley face?” And I think it’s working. I try it with my own students. I’m, like, arguing with them, and then I realize that they’ve made a point, and I say, “I’m wrong.” I just smile at the same time. And I think it helps to say it completely straightforwardly — not, “You have a good point there,” but literally, “I’m wrong. I made a mistake.”
DUBNER: So, Adam Grant, your colleague, has written this book called “Think Again.” And I know that one argument about why people have such a hard time changing their minds is because a change of mind is essentially a threat to your identity. Your opinions, your beliefs, your research, the stories you tell, et cetera, they are who you are. We all feel that we are the kind of people who hold a certain set of beliefs or ideas, or we think certain facts or scenarios to be true. And then, if we’re presented with evidence that, perhaps, they’re not true — then, am I the “me” who I thought I was? And even if it’s not costly in a reputational way, I think it can shake people up enough to dissuade them from wanting to examine themselves in that way.
DUCKWORTH: The only, like, Jedi-master trick out of that is to rest your identity on intellectual humility, or call it open-mindedness, or thinking of yourself as a curious person — whatever it is that you can build into your identity that gives you an escape route out of this. I’ve been thinking about this in the context of something that Lee Ross called the “fundamental attribution error.” This is one of the great social psychologists ever. Lee Ross did pass away recently. He was famous for the fundamental attribution error — that is, overly attributing the behavior of people to their personalities and underweighting situational factors. You know, somebody comes late to a meeting and you’re like, “Oh, they’re lazy and unconscientious.” And then, ignoring the fact that they may have gotten tied up in traffic or had situational constraints. It became this bedrock principle in all of psychology to look for the situational explanations for what people do. But one of the last things he wrote was called, literally, “From the Fundamental Attribution Error to the Truly Fundamental Attribution Error and Beyond: My Research Journey.” And the fundamental attribution error that he thinks is truly the problem is: the illusion of personal objectivity — the illusion that you see the world as it is, not as you interpret it to be, but just as it is. And this illusion that we have clear-eyed objectivity gets us into trouble, especially when it comes to politics and war, but, you know, our marriages and work. So, if I feel like I have a handle on the way the world really is, as a psychologist, I’m on thin ice.
DUBNER: Are you talking about the notion that some people believe there is an objective truth in all cases, versus subjective views? Or is it something different?
DUCKWORTH: I’m not talking about, like, relativism. It’s not that epistemologically deep. I don’t think what he wanted to say is, like, we’re all living in a dream. It’s just to say that the human mind is like a meaning-making machine, and we’re constantly making inferences. We’re leaping to conclusions. There is this universal flaw of human cognition where it feels so real. Like, if you have strong views about what’s going on politically in this country, the idea that you could be just as wrong as your mortal enemy across the aisle is very hard to truly appreciate.
* * *
Still to come on No Stupid Questions: Stephen and Angela discuss how the desire for consistency can result in a change of beliefs.
DUCKWORTH: Hey, you just gave me cognitive dissonance.
* * *
Before we return to Stephen and Angela’s conversation about the challenge of changing your mind, let’s hear some of your thoughts on the topic. We asked listeners to tell us about a moment when they realized they were wrong.
@ThatWMD writes, “I used to be for the death penalty. I remember arguing with classmates in law school that it’s only fair if you commit a serious enough crime (murder), you pay with your life. Oddly enough, the movie The Green Mile really showed me the issue in a way that made me question my beliefs.”
@1976BullDawg says, “When we had forced integration into our ‘white’ schools in south Mississippi in 1970. The guys from ‘the other side of town’ were no different from us. That realization changed my life completely. I honestly felt stupid for not realizing it before.”
@MostlyBitter writes, “I once mixed up the meanings of ‘anthropologist’ and ‘philanthropist’ and ended up arguing about it with a friend for much longer than I should have. Still painful to think about.”
@MostlyBitter, I have to say, that seems pretty on-brand for your username.
If you’d like your thoughts to appear on an upcoming show, make sure to follow our Twitter account, @NSQShow. Now, back to Stephen and Angela’s conversation about the psychology of changing your mind.
* * *
DUBNER: We’ve been spending time thinking about the areas and ways in which it can be really hard to change your mind, or change your position. When I think about changing your mind, I think about incentives and costs. And, it seems, one area in which changing your mind is really costly is politics. Politicians are routinely punished if they change their positions.
DUCKWORTH: Yeah, because you’re a flip-flopper, and you’re not trustworthy, and you’re disloyal.
DUBNER: And also, you’ve built a constituency around a certain belief, and now the only chance you have — on that position, at least — is to go get a different constituency, which is really — especially in a two-party system — really hard. So, I understand why politicians very rarely will, quote, “change their mind.” But then, I try to think about areas where people really do change their minds routinely. You know, science, theoretically — when practiced the way science, I think, is meant to be practiced — I should say, when I talk about “academia” and “science,” I see them as a little bit different, because in academia, everybody’s got a reputation, but the goal of science is to find stuff out, and you don’t do that except by coming up with a lot of hypotheses, many of which turn out to be not true. I think two other areas are sports and investing. Those are both areas where people will sour on winners very quickly.
DUCKWORTH: What do you mean by that?
DUBNER: Well, let’s say I’m a — whatever — sports fan or an investor, anything where there’s a performance that’s measurable. The minute I see some weakness, I think, “Okay. That asset is no longer so valuable to me.”
DUCKWORTH: Oh, you mean they can be enthusiastic proponents one day, and cynical opponents the next.
DUBNER: Yeah. They sour on winners quickly — and, maybe a little bit less commonly, they go all-in on former losers. And the difference, as opposed to, let’s say, politics, is that the data are very clear. In sports and in the stock markets, performance is very measurable. A lot of the hot-button debates we have where people don’t change their minds, the circumstances are much more—
DUCKWORTH: Ambiguous.
DUBNER: Yeah. And the smarter you are, the better you are at defending a position that might not be right anymore, because of confirmation bias. We know that smart people tend to be quite intransigent, because they’re really good at seeking out information that confirms their underlying beliefs, and really good at denying what might be contradictory evidence. So, if this is all true, is there a way to bring to those complex issues some of the clarity of the stock market and the N.F.L.? Do you think that might be fruitful?
DUCKWORTH: The point you’re making, Stephen, is really profound, and yet I don’t know how to apply it. And I think it’s that when you have precise and unambiguous feedback that corrects your beliefs, you are less likely to sustain incorrect beliefs. But then I’m thinking, “Okay, let’s take politics in this country.” Like, what should we do about affirmative action? What should our immigration policy be? Is the stimulus package too small or too big? Or, like, what’s going to happen in the economy, and what should happen in the economy? I am not able to come up with a feedback system that is going to allow me to correct people’s beliefs in the same way that I thought those Steelers were going to win by seven, and it turns out they lost by 12.
DUBNER: Hey, if you’re going to give a football example with the Steelers, don’t make it where you thought they were going to win and they lost. Make the other way around, please. Thank you very much.
DUCKWORTH: You can flip that. You know, sustain your bias.
DUBNER: Edit! Edit! Win equals lose. Lose equals win. So, the Steelers just won the Super Bowl, and then what happened?
DUCKWORTH: And then they had a party. It was awesome. Did you go? You’re right, though. I think it’s a good lead. I’m not sure exactly how to apply it to real-world politics, but Danny Kahneman came up with, as you know, the “adversarial-collaboration” idea. Say you have two people who really believe opposite things. You know: “I believe in ego depletion.” “I believe that ego depletion is a myth.” What you do is: you get these two adversaries together, and they collaborate. They agree to run an experiment, or to do a study. And I think that idea that two adversaries could collaborate on a project and say, “Hey, if it turns out this way, I concede to you. If it turns out this way, you concede to me.” Danny has been involved in them himself. And it gives you some idea: Could you get the Democrats and Republicans to say something like, “Look, let’s agree that if such and such happens, I’m right. If such and such happens, you’re right.” I don’t know, but it’s an interesting direction.
DUBNER: It also makes me think of some of the legalizations that have happened in the last, let’s say, 10 years — that if you were looking at them from 20 years ago, you might be very surprised. The legalization of gay marriage kind of came out of nowhere, all of a sudden, even though those who’d been working on it had been working on it for a long time. But it seemed as though the country was against it, including President Obama, and then the country was for it.
DUCKWORTH: President Obama was against gay marriage?
DUBNER: He was. Yeah.
DUCKWORTH: I did not know that. Oh, look, I just updated a belief! Ding.
DUBNER: When I think about some of the other widespread legalization going on right now — marijuana, in this country, where for a long, long, long time, there was a lot of momentum against it — and sports gambling in the last couple of years has just been legalized at the Supreme Court, and now it’s being rolled out state-by-state. And when you think about how seemingly rapidly a populace — and the politicians and the policymakers that are working theoretically in collaboration with that populace — can make what looks like a sea change, and all the hullabaloo from the opponents kind of dies away.
DUCKWORTH: Forgotten.
DUBNER: You know, I’ve tried to explore this in the past on a couple of Freakonomics Radio episodes. We did an episode a few years back about the power of incrementalism, and how very often we want radical change, and people who promote radical changes often get a lot of attention, but that, often, real change comes about incrementally. And it takes a certain kind of temperament to stay in there long enough to make that change happen.
DUCKWORTH: What would be a good word to describe that? “Grit,” for example!
DUBNER: I think about how gay marriage became accepted, and there’s a story that I’d read that I think is maybe apropos to this larger conversation we’re having. Senator Rob Portman, who’s a Republican — Ohio — was anti-gay marriage, and he turned around on the topic, and it was considered very surprising because of his established political position. But he said the change was because of a personal reflection that he’d begun a couple of years after his own son told him that he was gay. So it made me just think: what does it take for a given person to have a position that they believe so strongly that they think they’ll never change their mind about it, and then all of a sudden they do? In this case, if you’re going to reduce it to one word, it would be “love.” You love your kid. And then you say, “Wait a minute, how can I be opposed to a thing that is embraced by someone I love?” I don’t mean to go mushy on you, Angela, but I do wonder if — and this may be from having watched a little bit too much of the Beatles documentary lately — but I do wonder if love, maybe, is the answer, on some dimension?
DUCKWORTH: You know, it could be love that leads to dissonance. And I mean it in the following way: How do I live one identity but believe mutually exclusive things — that my son is good, and my son is bad. That’s dissonance. And one could argue that learning comes from the resolution of dissonance, where it’s like: “Well, I’m going to update this belief that I have about gay marriage.” And I do think growth comes from this inner conflict. And I think what people like Danny Kahneman are able to do is: to enjoy this conflict and enjoy the resolution. Maybe he’s had it happen in a way that turns out well for him over and over again. Like, the experience of being wrong and, in a way, being rewarded for it — not being punished for it, not being ridiculed for it, not being “canceled” for it, but he’s been rewarded for being wrong, for pointing out his own errors. So, there’s some hope here. If we could begin to reward people for telling stories of how they used to believe X, but now they believe Y — I have to say, in today’s climate, I wonder how comfortable it is for people to mention that they used to have outdated beliefs.
DUBNER: Yeah.
DUCKWORTH: I remember this one particular retreat my family went to. My dad was a member of this professional fraternity. I don’t even know what that means, but it wasn’t associated with his college. It was just this professional fraternity of like Asian — maybe Chinese, specifically — engineers. And we went to this retreat in the Poconos. And there was this one workshop on diversity, and I remember being very bored, and they were telling us how diversity was good. And we had to, like, respect people who had different points of view than we did.
DUBNER: And you’re how old at this point?
DUCKWORTH: I think I was in seventh grade. Mostly I was bored off my ass, just watching the clock. But for the rest of my life, I’ll remember the last part, because it was so dissonant. And they were like, “Oh, except for Black people.” And then they were like, “Okay, great. Lunch is at 12 in the lodge.” And I was like, “What? Wait, that doesn’t make sense. Those things come into conflict.” I mean, I didn’t say, “Hey, you just gave me cognitive dissonance.” But they were really incongruent to me. And I do think that kind of dissonance, and the discomfort that comes from dissonance, is probably the seed of learning. In this case, it made me think that the Asian engineers my dad was hanging out with maybe were not the most enlightened people. I think that was my update. I was like, “Hmm.”
DUBNER: And did it change your view of your dad?
DUCKWORTH: It did make me think that, maybe, I came from a culture that had some deep-seated racist beliefs. I think that’s true of every culture. And to our point about “Think Again,” I think about the beliefs that my parents had about race when I was growing up, compared to what my dad believed at the very end of his life. And I think maybe if we could — I don’t know — think of people more as, like, stories where the characters change and they develop, maybe that would also help us have some intellectual humility, to not think of ourselves as having to be always right, but just that we’re these characters in a Netflix special who are going to evolve by episode eight.
DUBNER: It sounds like you may have changed your view of your father, maybe a few times, over the course of your life. I mean, obviously, there’s maturation from child to adult and so on, but you have these reckonings along the way.
DUCKWORTH: I think I changed my mind about my dad, and I think he changed. The world changes, and our view of the world changes. So, maybe the moral of the story — from the truly fundamental attribution error — is that we can be wrong, because our views may not be at all what the reality is. And the reality is also changing.
DUBNER: In that spirit, let me ask you a question. What would it take for me to persuade you that rum-raisin ice cream is, indeed, the best ice cream?
DUCKWORTH: Oh, God, what would it take you to persuade me that rum-raisin was far superior to, for example, mint-chip — or espresso-chip, which I’ve recently come to love?
DUBNER: Rum-raisin can go up against any of them and come out on top, in my opinion. But that’s not your opinion. So, what’s it going to take?
DUCKWORTH: Rum-raisin was my dad’s favorite ice cream flavor.
DUBNER: You’ve never told me that! You’ve just made fun of me for liking rum-raisin.
DUCKWORTH: It’s such an old-person flavor.
DUBNER: Wait, you give me a hard time for liking rum-raisin only because your dad liked it.
DUCKWORTH: I don’t think it’s Freudian, but I still don’t like it. I will say this: If I can learn to be open-minded about rum-raisin, you know, consider the possibilities for humanity.
* * *
No Stupid Questions is produced by me, Rebecca Lee Douglas. And now here is a fact-check of today’s conversation.
During her story about Scandinavia, Angela admits that she thought the region had its own flag. She says her mother-in-law told her that she could see why Angela would think that, because Scandinavian countries have such similar flags, but her husband disagreed and said they weren’t similar at all. I would say that the evidence supports Angela’s mother-in-law here. The Scandinavian Cross, or Nordic Cross — a symbol of Christianity — appears on the flags of the Scandinavian countries Denmark, Sweden, and Norway, as well as on the flags of additional Nordic countries like Iceland and Finland. The flags all share the same basic design, with individualized colors and minor distinguishing details.
Later, when Stephen says that he’s “waiting for the sword to drop,” he appears to be mixing metaphors. There’s “waiting for the other shoe to drop,” and then there’s the “sword of Damocles.” “Wait for the other shoe to drop” supposedly originated in the 20th century in New York City tenements. A resident would hear someone in the apartment above them drop one shoe on the floor and anticipate that the other shoe would make more noise soon after. The “Sword of Damocles” is a metaphor from an Ancient Greek parable where Damocles, a courtier of Dionysius, sits below a sword suspended by a single horsehair, but in the story, the sword never drops.
Finally, Angela tells the story of discriminatory behavior in her father’s professional fraternity, but she says she doesn’t really know what it means to be a part of a professional fraternity in the first place. Most professional fraternities are associated with accredited colleges, universities, or professional schools, and, unlike social fraternities, their members share an interest in a particular field of study. But Angela says that her father’s group wasn’t connected to a college. There are certain fraternal organizations, like the Freemasons and the Loyal Order of the Moose, that are completely separate from educational institutions. Many of these groups — like most longstanding organizations in the United States — also have a history of racial discrimination.
That’s it for the fact-check.
* * *
Coming up next week on No Stupid Questions: What’s the psychology behind our fascination with coincidences?
DUBNER: Lincoln was elected in 1860. Kennedy, 1960. Both were shot in the head, from behind, and in the presence of their wives. The names of the assassins, John Wilkes Booth and Lee Harvey Oswald, each contain 15 letters. Lincoln and Kennedy were each succeeded by southerners named Johnson.
DUCKWORTH: Okay. That is eerie.
That’s next week on No Stupid Questions.
* * *
No Stupid Questions is produced by Stitcher and Renbud Radio and is part of the Freakonomics Radio Network, which also includes Freakonomics Radio, People I (Mostly) Admire, and Freakonomics, M.D. This show was mixed by Eleanor Osborne. Our staff also includes Alison Craiglow, Greg Rippin, Morgan Levey, Zack Lapinski, Mary Diduch, Ryan Kelley, Jasmin Klinger, Emma Tyrell, Lyric Bowdich, and Jacob Clemente. Our theme song is “And She Was” by Talking Heads — special thanks to David Byrne and Warner Chappell Music. If you’d like to listen to the show ad-free, subscribe to Stitcher Premium. You can follow us on Twitter at NSQ_Show and on Facebook @NSQShow. If you have a question for a future episode, please email it to nsq@freakonomics.com. To learn more, or to read episode transcripts, visit Freakonomics.com/NSQ. Thanks for listening!
DUBNER: What do you find funny?
DUCKWORTH: Calvin and Hobbes is sometimes funny.
DUBNER: What about Calvin Klein?
DUCKWORTH: Mmm — not funny.
DUBNER: What about “Eine kleine Nachtmusik”?
DUCKWORTH: Pretty funny.
Sources
- Roy Baumeister, professor of psychology at the University of Queensland.
- Daniel Kahneman, professor of psychology and public affairs, emeritus, at Princeton University.
- Adam Grant, professor of management and psychology at the Wharton School of the University of Pennsylvania.
- Lee Ross, professor of psychology at Stanford University.
Resources
- “Wanna Bet? Explaining Where All 50 States Stand on Legalizing Sports Gambling,” by Pete Blackburn, Chris Bengel, and Shanna McCarriston (CBS Sports, 2022).
- Think Again: The Power of Knowing What You Don’t Know, by Adam Grant (2021).
- “What is Cognitive Dissonance?” by Kendra Cherry (Verywell Mind, 2020).
- “From the Fundamental Attribution Error to the Truly Fundamental Attribution Error and Beyond: My Research Journey,” by Lee Ross (Perspectives on Psychological Science, 2018).
- “Have We Been Thinking About Willpower the Wrong Way for 30 Years?” by Nir Eyal (Harvard Business Review, 2016).
- “See Obama’s 20-Year Evolution on LGBT Rights,” by Katy Steinmetz (TIME, 2015).
- “Gay Son Leads Rob Portman to Embrace Same-Sex Marriage,” by Deirdre Shesgreen (The Cincinnati Enquirer, 2013).
- “Beliefs About Willpower Determine the Impact of Glucose on Self-Control,” by Veronika Job, Gregory M. Walton, Katharine Bernecker, and Carol S. Dweck (Proceedings of the National Academy of Sciences, 2013).
- “Toward a Physiology of Dual-Process Reasoning and Judgment: Lemonade, Willpower, and Expensive Rule-Based Analysis,” by E. J. Masicampo and Roy Baumeister (Psychological Science, 2008).
- “Experiences of Collaborative Research,” by Daniel Kahneman (American Psychologist, 2003).
- “Intellectual Humility Playbook,” by Tenelle Porter (Character Lab).
Extras
- “In Praise of Incrementalism,” by Freakonomics Radio (2016).
- Onitsuka Tiger, Japanese sports shoes brand.