Episode Transcript
DUCKWORTH: I’m Angela Duckworth.
DUBNER: I’m Stephen Dubner.
DUCKWORTH + DUBNER: And you’re listening to No Stupid Questions.
Today on the show: Why do humans tend to choose simplicity over complexity?
DUCKWORTH: I kind of feel like we need the opposite of Occam’s Razor. Maybe we could call it the Dubner/Duckworth Razor?
* * *
DUBNER: Angie, I have a question for you that is both a straightforward question and, I suspect, a bit of a trick question.
DUCKWORTH: Ooh.
DUBNER: Are you intrigued?
DUCKWORTH: I am. What is it?
DUBNER: Is grit the key to success? Or maybe I should say: Is grit THE key to success? How do you answer that?
DUCKWORTH: You know, my TED Talk used to be called “The Key to Success,” and some very unhappy person pointed out that’s overselling and unlikely to be true. I do not think, Stephen, that grit is the key to success, in the sense that I don’t think grit is the only determinant of success. I didn’t choose the name of my TED Talk. People may not know this, but when you write an op-ed — say, for example, for The New York Times — you don’t get to pick the title.
DUBNER: If you’re lucky, they’ll run it past you and say, “Can you live with this?”
DUCKWORTH: But, likewise, I just gave a talk and they named it “The Key to Success.” I emailed the powers that be at TED, and I said, “Titling this ‘The Key to Success’ is really not an accurate title.” And they just changed it to my definition of grit: “Passion and Perseverance for Long-Term Goals.” And I can tell you the epilogue of what happened after that interaction: I’ve been thinking about how the human mind likes for there to be the key—
DUBNER: Yes!
DUCKWORTH: —to weight loss, romantic satisfaction, a happy life, fill-in-the-blank. We love magic bullets. So, I think the idea of “overselling” is a really interesting one from a psychological point of view — like, why do we oversell things? Why are we so eager to be oversold?
DUBNER: That’s what I really wanted to talk about — this embrace of the “magic bullet” way of thinking about the world, because I see it kind of everywhere I look. A lot of the work that I do with Freakonomics Radio — or at least what I try to do — is to cut against that, to show that, while there is a virtue to simplicity, the world is complex. Moreover, even if a simple solution works, it probably won’t work forever.
DUCKWORTH: Or for everyone. I was just reading Max Bazerman’s new book. I don’t think it’s out yet. But you know Max Bazerman, yes?
DUBNER: I do.
DUCKWORTH: He’s a friend of ours. He’s a professor at Harvard Business School.
DUBNER: “A friend of ours” — not in the Cosa Nostra way that “a friend of ours” might imply.
DUCKWORTH: Wait. What is “Cosa Nostra”? Is that like an Italian restaurant?
DUBNER: It’s kind of the mafia.
DUCKWORTH: Oh. Is that what it really is?
DUBNER: In the lingo — at least in the mob movies that I’ve seen, because I’ve never been in the mob — when you say, “He’s a friend of ours,” that means he’s one of us.
DUCKWORTH: Well, I can’t think of somebody who is less likely to be in the mob. Can you think of anyone who is less likely to be in organized crime than Max Bazerman?
DUBNER: I will say this, though: His name is the perfect name for the Jewish guy that’s in the mob, because there’s usually one or two Jewish guys — like Hesh from The Sopranos. So, Max Bazerman definitely sounds like a badass. I said that in an admiring way. But anyway, you were saying: this non-badass Max Bazerman.
DUCKWORTH: Professor Max Bazerman at Harvard Business School —
DUBNER: Again, “Professor” is the perfect nickname for a mobster.
DUCKWORTH: They call him “the professor.” And he has this new book called Complicit: How We Enable the Unethical and How to Stop. But there was this one passage — it’s so related to this magic bullet psychology. I’m going to read you what he said about teaching a group of executives. He says, “I was teaching this group of executives online. I asked the class, ‘What caused the massive fraud at Theranos?’ Each executive was asked to enter their answer in the chat function. The time allowed was generous. Clearly, there were multiple causes of fraud at Theranos, but when I looked at the class’s answers, I found they tended to be simple and singular. 62 of the 70 responses offered a single cause. 56 of those 62 were a simple description of Elizabeth Holmes, such as her ego or lack of integrity.”
DUBNER: We should say: Theranos is this company that turned out to be mostly fraudulent, that was supposed to do really quick, and easy, and cheap blood testing to give all kinds of diagnostics.
DUCKWORTH: Like, a single drop of blood, and we can tell you everything you need to know. The point here is really less about Theranos and more about the psychology of these executives who thought of a cause, a reason — not multiple reasons. And here is his conclusion: “In other words, the vast majority of the sophisticated executives in my class who knew all aspects of the story blamed a single source.” And then, Max goes on to talk about the single-cause bias. We have a propensity to stop thinking after we see or understand one cause of an effect. When, if you just pause to think about it, almost everything has multiple causes — like: Why did Angela just get a little mini can of Diet Coke out of the refrigerator? Is there one cause, or are there multiple causes? And it’s hard to even think of a single phenomenon or human behavior that doesn’t have many causes.
DUBNER: It’s so interesting, because what’s pushing against his word of caution is a long history of embracing simplicity as a means to figure out a very complex world. Occam’s Razor is a famous example: When there’s a complicated explanation and a simple explanation, let’s assume the simple explanation is right. Aristotle, and Kant, and many, many, many people have praised the virtue of simplicity. So, how would you suggest that we think about the relationship between simplicity and truth?
DUCKWORTH: It’s an interesting question of where Occam’s Razor comes from. This might be apocryphal, but wasn’t Occam a friar? Or a monk, right? I think it was “Brother Occam.”
DUBNER: I don’t know.
DUCKWORTH: Parsimony was supposed to be not only preferred, aesthetically, but more likely to be true. That was the idea — that nature is simple. Which, again, if you ponder almost any aspect of human behavior, you’re like, “Wait a second, the complex explanation ought to be right.” But there really is a bias toward simplicity.
DUBNER: I think what this is really about is just understanding how cause and effect works. Maybe it’s our fault for using that phrase “cause and effect” as opposed to “causes and effects.”
DUCKWORTH: Both plural.
DUBNER: I’ll give you an example of where I first really began to wrestle with this. So, years, and years, and years ago, when I first met Steve Levitt, he had written this paper along with the legal scholar and economist John Donohue, which argued that, surprisingly, one of the biggest drivers of the drop in crime in the U.S. starting in the 1990s was the legalization of abortion by the U.S. Supreme Court, in Roe v. Wade, in 1973. And their argument — which is probably familiar to many people who listen to this show — was that what the legalization of abortion did was provide an avenue for parents, particularly women, who decided that now was not a good time to have a baby. And it turns out that, if you don’t want a baby at a given time, that’s a pretty good signal that baby will not have the greatest odds for success in life. They basically argued that “unwantedness” is an undesirable component for a child. Fewer unwanted children meant that the small share of those children who might have become criminals was declining even more. And so that’s what led to a drop in crime. Now, let’s just put aside the fact that it’s a fairly nuanced argument and a fairly controversial one. When Levitt and I wrote about that argument in Freakonomics, we were very careful to write it in a way that tried to make clear that this was not an argument about the upsides or the downsides of abortion, per se. We tried to write it as an argument about how it can be that policy leads to behaviors that lead to downstream unintended consequences. We also went out of our way to say: Look, if your goal is to lower crime, then abortion is a terrible way to do it. Putting aside any moral or ethical questions about it, it’s extraordinarily inefficient. So, this was not a utilitarian argument.
DUCKWORTH: You’re not making a policy recommendation. You’re trying to explain what accounted for a pretty remarkable drop in crime.
DUBNER: We wrote about this in Freakonomics in the context of all the other factors that either did or did not decrease crime. And, in fact, several years after the abortion paper, Levitt wrote another paper called “Understanding Why Crime Fell in the 1990s: Four Factors That Explain the Decline and Six That Do Not.” This was one of the reasons I really fell in love with economic research, and with Levitt’s in particular. It really acknowledges, like, those are 10 factors right there.
DUCKWORTH: Were there three other than abortion?
DUBNER: There were three others besides abortion, and that’s really my point here.
DUCKWORTH: Wait. I need to know what they are before we continue this conversation. I can’t not know.
DUBNER: One factor that did cause a drop in crime was a big increase in imprisonment. In other words, if you put a lot of people in prison, which we did in the ’80s and then ’90s, that is definitely going to decrease the crime rate because — this is a basic tenet of criminology — it’s a relatively small group of people who do the majority of crimes. So, if you’re very, very tough on them, you convict them, and you put them in prison for quite some time, then crime will go down.
DUCKWORTH: And I think I found the other two.
DUBNER: One, I want to say, is more policing.
DUCKWORTH: Yes. Increases in the number of police. I’m not a genius. I’m just using Google.
DUBNER: All right. Tell me, what’s the fourth that matters?
DUCKWORTH: The fourth one is the waning of the crack epidemic.
DUBNER: Crack was a really interesting drug and market. Crack created a lot of violence. It was so profitable that the gangsters who were selling it fought over the rights, and they killed a lot of people. The really interesting question is: Why did crack mostly go away? It’s hard to argue this empirically, but one good answer seems to be that crack is such a terrible drug. It’s so damaging to a person, both physically and mentally. If my uncle is a crackhead, or my older brother is a crackhead, and I see what’s going on with them, I’m like, “No way.” I might use heroin. I might smoke weed. I might drink a lot.
DUCKWORTH: I’m not doing crack.
DUBNER: Even if someone reads Freakonomics, where we actually walk through this paper of Levitt’s and say: here is evidence that there were four pretty major contributors to the drop in crime and six factors that you might think had contributed but didn’t — those include: a stronger economy, innovative policing methods, changing demographics, gun-control laws, the carrying of concealed weapons, and the use of capital punishment. Those were some that Levitt empirically argued didn’t decrease crime, for a variety of reasons. It is astonishing to me how even someone who’s read that fairly carefully seems to gravitate toward the magic bullet — or single-cause explanation — and say, “Oh, it was abortion.”
DUCKWORTH: By the time they get to their next dinner party, it’s the only thing that’s left. They’re like, “Oh, did you know that crime’s down because of abortion?”
DUBNER: Exactly.
DUCKWORTH: One of my favorite papers that got published in the last couple of years is by Matt Salganik at Princeton, who took a dataset that had a massive amount of data about children and their families. They were tracked from birth all the way through age 15. And they ran this tournament, giving this, like, huge team of scientists almost all of the data that was there, but they withheld some of the outcome data from age 15. And the question was: How well could they predict those outcomes from everything else in the dataset? And you might think, “Pretty well. We have really powerful computers, and they have all these theories, and they understand human nature.” But the conclusion of the paper — and I’ll just read you verbatim, because it’s so compelling: “Despite using a rich dataset and applying machine-learning methods optimized for prediction, the best predictions were not very accurate. Overall, these results suggest practical limits to the predictability of life outcomes.” And I think that is very similar — I wrote a paper predicting outcomes at West Point. In the study, I do show you can predict West Point graduation from grit. But here’s the last line of my paper: “There may be severe limits to how well any set of personal attributes can forecast an individual’s destiny. People change, contexts change, life trajectories are shaped by the whims of chance and path dependency.” In other words, it’s complicated. Occam’s Razor was not right. I kind of feel like we need the opposite of Occam’s Razor. Maybe we could call it the Dubner/Duckworth Razor — that, given a simple explanation and a complex one, you should choose the complex one.
DUBNER: But the fact of the matter is that sometimes simplicity is really valuable. Einstein was really, really, really good at taking an extraordinarily complex situation and understanding it so that the important pieces were reduced to something simple. It’s the mark of a good mind to acknowledge the power of simple cause and effect, while also acknowledging that complexity is real. It makes me think — did you ever hear the story of NASA trying to come up with a pen that would write in space?
DUCKWORTH: No, I don’t think so. I am imagining myself in a gift shop in some, like, science museum, buying a pen that writes upside down, or something.
DUBNER: I’ve bought that pen, too.
DUCKWORTH: Right next to the, like, astronaut ice cream.
DUBNER: Ah, yeah, that’s the worst food ever, but it’s really fun to eat.
DUCKWORTH: Horrible!
DUBNER: So, here’s the way the story goes — and I will emphasize: the way the story goes, because I don’t want you to take it as fact.
DUCKWORTH: Got it.
DUBNER: So, as the story goes, as NASA was booting up and starting to do more and more ambitious things, and obviously, we were competing with the U.S.S.R. — there was the space race — there was the realization that a pen that works well here wouldn’t work in zero gravity.
DUCKWORTH: Because you need gravity to pull the ink down.
DUBNER: It’s interesting. That’s one of the exercises used in the “illusion of explanatory depth” research by Steve Sloman, right? Which is: explain how a pen works. And most people would tell you, “Oh yeah, I know how a pen works.” I’m just saying, right here: I have no idea how a pen actually works.
DUCKWORTH: You’re owning it.
DUBNER: I am owning it. So, NASA was — theoretically, according to the story — spending, let’s call it “billions” of dollars. Along with all the other rocket stuff, and space-suit stuff, and training stuff, they were also having to come up with a pen that would work in space. And the way the story goes is, “Well, what about the Soviets? What did they do? Did they have billions of dollars to spend to develop a gravity-proof pen?” And the answer was, “No, they used a pencil.” This was a story that was told for years and years, which turns out to be, I’m pretty sure, almost entirely false.
DUCKWORTH: Really? It’s such a good story, though.
DUBNER: I think the true story was that they had these kind of fancy, mechanical pencils that cost much more than a regular pencil. And when the budget item was noticed in the NASA budget, someone flipped out and said, “Whoa, that’s a lot to spend for mechanical pencils!” And so that gave rise to this story about creating an even more expensive pen. But the moral of the story is: The Soviets were smarter in this case. They came up with the simple solution — a pencil, much simpler. But to me, there’s a second layer of a lesson here, which is that a pencil seems to be a simple technology, but in fact there’s a very famous essay called “I, Pencil.” It’s a first-person essay written, theoretically, by a pencil — it wasn’t actually written by a pencil — explaining that no one person knows how to make a pencil. That it’s actually an incredibly complicated process that’s been perfected by a lot of different people, in a lot of different fields and industries, over many, many, many years. You have to figure out how to get the lead inside, how that’s mined, how it’s crafted, all these different supply chains, and things like that. So, even the simple thing turns out to not be very simple. And so, simplicity is awesome. And when it’s very simple for me to use a good, or a service, or an interaction, I love it. But it’s a mistake to think that it didn’t take many, many, many, many hours, and many, many, many dollars, and a whole lot of brain cells for a lot of people to actually come up with something so simple. And that, in fact, complexity is real and that we shouldn’t pooh-pooh it. So, that’s my takeaway from the story of the pencil.
DUCKWORTH: That is a great takeaway. I think it is a good thing to think about when we try to say, you know, “I know what’s going to happen to this person.” Like, pencils are complicated. And so is human life. And if that is true, there are many, many reasons why we’re going to do something, like take the Diet Coke out of the fridge, or stand up, or raise our hand —
DUBNER: You really want that Diet Coke, don’t you?
DUCKWORTH: I already took it out of the fridge. Now you have to guess the 95 reasons why I did that.
Still to come on No Stupid Questions: Stephen and Angela explore the human desire for magic-bullet treatments — and reflect on how it played out during the pandemic.
DUCKWORTH: These recommendations were so complicated, and I kept thinking, like, “Well, can’t you make it simpler?”
DUBNER: Just give me the pill!
* * *
Now, back to Stephen and Angela’s conversation about the human tendency to oversimplify.
DUBNER: So, here’s my question for you. I understand that it’s convenient to come up with a single-cause explanation. I understand that it’s easy. I understand that our brains aren’t really great at memory — most of us, generally. Like, if I give you a list of 10 items to remember until the next day, and you can still recall two or three, that’s probably pretty good.
DUCKWORTH: Or even the next minute. I’m pretty sure I would not be able to remember 10 things.
DUBNER: That’s true. So, I get that there are cognitive limitations. I see why it’s attractive. But it’s often just the tip of the iceberg. So, why do you think that we so want to believe the simplest versions?
DUCKWORTH: You really do have to ask yourself, “Why?” Because evolution should work against systematic mistake-making. Why have we evolved to look for single causes, as opposed to multiple causes, if the world really does work in a multiple-cause way? And I am thinking about — get out your shot glass — Danny Kahneman. His research on system-one thinking versus system-two thinking — these very quick heuristics versus more deliberate “weighing all the evidence” — one of the things I have learned from Danny is that the human mind likes to substitute a simpler problem for a complex one. And it’s true that, given two competing explanations that are equally descriptive of what’s going on — all things being equal — you should probably choose the simpler one. It’s certainly going to be more actionable. It takes a lot less of your cognitive capacity to understand. But I think the thing that we’re saying is that things are not equal.
DUBNER: Right.
DUCKWORTH: The simple explanation is not always as true as the complex one. And so, the reason why the human mind gravitates towards simple explanations is that, when we have that simple explanation, we’re not feeling, at the moment, that our understanding is incomplete. We’re feeling like we chose the one that is more tractable. Also, just the fact that we can more fully understand the simple explanation draws us to it. Like, “Grit, it’s the secret to success! Oh, I got it. I pick that.” As opposed to what I would say, which is like, “Grit is one thing among many. And then, of course, there is such a thing as the rate at which you learn. And, by the way, there are very complex situational factors that interact in nonlinear ways with all of the above.” Nobody wants to hear that.
DUBNER: But then, I guess, in your case, in the story of grit, you become known as “the grit lady.” And so, all that anyone wants to ask you about —
DUCKWORTH: I am still known as “the grit lady”! This happened to me yesterday. I was asking somebody about something entirely different and then they kept thinking that I was asking them about grit. It was a little bit frustrating.
DUBNER: That’s the point I was about to make, which is: Once an idea is known, that’s the focus of the follow-up inquiries. And it sort of encourages you to give a response that’s a little bit un-self-interested, which is to say, “Yeah. I wrote a book called Grit. I’ve studied grit, but let’s talk about all the other things that actually go into this.” You could argue that your incentives to do that are not very strong.
DUCKWORTH: I mean, I really don’t think that grit is the most important thing. Like, you know in Disney movies, when there are fairies that grant wishes? If you only had one, would you make your child gritty? Like, oh my gosh, wait a second! Before you make that choice, what about kindness and integrity? What about happiness?
DUBNER: Height.
DUCKWORTH: Height, obviously. The incentive for me is enormous to introduce to the conversation the many things that any parent would want their kids to grow up to be. But I think the force that pushes against the complexifying of anything is this desire for single causes. Like, what was the reason why Theranos collapsed? What is the secret to success? What is the one food that I need to eat to stay healthy? So, I’m not the only one. That means that anybody who’s trying to talk about things in their full complexity is pushing against a very strong force for simplicity.
DUBNER: And I think people who work in all different realms have come up against this. Think about cancer research over the past 40 or 50 years. We’ve heard about all these different sorts of “magic bullet” treatments, which have turned out to be somewhat successful on some cancers, but mostly not successful on many. You know, the phrase “magic bullet” goes back to Paul Ehrlich from — gosh — probably over a hundred years ago. He was talking at the time about a drug that could, in theory, attack a disease in the human body without doing anything bad to the rest of the body.
DUCKWORTH: The magic bullet would, like, hit its target, but not hit anything else.
DUBNER: Exactly. I mean, we’re still wrestling with this in medicine, in many areas — not just oncology, but think about antibiotics. You want antibiotics that target the bacteria that are doing the bad stuff, but don’t do anything to anything else in the body. And, I mean, if you want to talk about a complex, dynamic system, it’s the human body. In a way, the human body can make public society seem kind of simple. Look at all those different interacting systems. So, I think that the more sophisticated a medical researcher gets — or researchers in many, many, many realms — the more you realize: Yes, simple ideas can have massive leverage, but you have to appreciate them in the context of the complexity. I do wonder, when it comes to psychology, does this notion — this single-cause bias, as you’ve called it — does this intersect at all with what you psychologists call the “fundamental attribution error”?
DUCKWORTH: The “fundamental attribution error” is one of the big ideas to come out of social psychology. These are the psychologists who typically have focused on how human behavior is influenced by the situations that we’re in. And the fundamental attribution error refers to this bias toward thinking about the person and their influence on their own behavior and under-weighting — ignoring, even — the situational pressures. So, for example, a kid comes into class, doesn’t have their homework. The fundamental attribution error is the teacher thinking, like, “Little Stephen, without his homework, doesn’t care —”
DUBNER: “Lazy, stupid kid.”
DUCKWORTH: And, who knows? Maybe the dog literally did eat the kid’s homework.
DUBNER: Or maybe my mom is undergoing cancer treatments.
DUCKWORTH: You can think of many situational factors. In a lot of the laboratory experiments, the way that the fundamental attribution error was shown to exist is that you would tell people very explicitly that the person who did this thing was under situational pressure to do it. Like, “The reason why they wrote this pro-Castro essay is that they were told to write the pro-Castro essay.” And then, you, as a participant, are asked, “How much do you think this person really is pro-Castro?” And people would still attribute some motive to that person’s own beliefs, even knowing that they were just compelled by some other person to write it. So, that’s what the fundamental attribution error refers to. It’s been around for so long that it’s part of Intro Psych for students everywhere. The more recent argument that I have been reading is, like: Look, in real life, almost everything we do, quite obviously, is the product both of our own personality, our own preferences, and situational forces. So, because we, in general, operate assuming that behavior is a product of the situation and the person, when we are in a lab study, that just kind of comes out. We’re carrying over common sense. In general, anybody who would write an essay is probably doing it in part because of what they have to do, given the situation, and also what they really feel.
DUBNER: You know, there’s one more example — I’m thinking, as you’re speaking about the single-cause bias, or the magic-bullet idea, and a way in which it’s really damaging. If you look at the way the world, but especially Americans, responded to the Covid-19 pandemic, every single element of our response became a binary choice about “This is the thing,” or “This is not the thing.” “Masks will save us all.” “Masks do nothing.” I think it reached its apotheosis with the vaccine. In the beginning, there was this conversation about, “Wow, here’s a terrible virus. It’s killing people. We don’t really know how to treat it. We don’t know the long-term effects. So, the only really responsible thing to do is to come up with a vaccine. Let’s ask some of the best scientists in the world to do it.” And then they did it. They came up with a vaccine that works and that’s somewhere between, let’s call it, 60 and 93 percent effective. “Well, that’s not the magic bullet!” You know what? There are hundreds of thousands, maybe millions of people alive because of that. And yet, we still have a really hard time wrapping our minds around the fact that, “Oh. Someone who took the vaccine still got really sick,” or, “You know, there might be some side effects of this vaccine.”
DUCKWORTH: Or you might need to do the vaccine and certain health precautions. I remember, during the very opening months of the pandemic, these recommendations were so complicated. And I kept thinking like, “Well, can’t you make it simpler?”
DUBNER: Just give me the pill!
DUCKWORTH: But then, there you go! I was falling prey to this kind of single-cause, magic-bullet bias. And I’ll say this, Stephen: It’s not just the pandemic, and it’s not just trying to forecast what’s going to happen with the person that you just hired, or your own life, or your kids. I think, for example, when you say, “Hey, how are you going to fix the achievement gap between rich and poor kids that opened up like the Grand Canyon during the pandemic?” If we can avoid single-cause bias, we will be on much firmer ground. We both read a paper recently about a study of cognitive therapy that was being compared to just giving people money.
DUBNER: Right.
DUCKWORTH: These are economists who were testing interventions in a relatively poor population. I think it was Liberia. And what I remember taking from that paper was that they were smart to not only ask the question “Which is better?” but also to include a group that got both, so they could look at the effect of having a combination of therapy and a small cash grant. And I believe that was the group that, in the long run, actually did the best. And so, it’s not “either/or.” You know, my favorite go-to is “both/and.” And “both/and” is not single-cause, right? “Both/and” says there’s a lot going on here.
DUBNER: The paradox of this conversation is that you and I, but also, I think, everyone who would listen to a show like this, for the most part, is already on the side of “both/and.” So, in a way, we’re preaching to the choir. But that’s okay, because, you know, a choir can then go out and preach to people who aren’t in the choir.
DUCKWORTH: Yeah, we can go sing.
DUBNER: I guess my takeaway is that nothing is 100 percent all the time for everyone except maybe death, and the passage of time, and this podcast.
DUCKWORTH: There are no stupid questions and there really are no simple answers.
No Stupid Questions is produced by me, Rebecca Lee Douglas. And now here is a fact-check of today’s conversation.
In the first half of the show, Angela says that TED changed the title of her talk from “The Key to Success” to “Passion and Perseverance for Long-Term Goals.” She was slightly off here. The TED Talk was actually renamed “Grit: The Power of Passion and Perseverance” — the same title as her best-selling book.
Later, Angela wonders about the identity of Occam, of “Occam’s Razor.” William of Ockham was a 14th-century English Franciscan friar. His “razor,” or heuristic, comes from his 1323 publication Summa Logicae, where he wrote, quote, “It is futile to do with more what can be done with fewer.”
Finally, Stephen shares what he is pretty sure is a mostly fictional story about how Americans spent billions developing a space pen while Soviets simply relied on pencils. He was right to doubt the veracity of this narrative. In the 1960s, Paul C. Fisher of the Fisher Pen Company developed a pen that worked in space, underwater, and in extreme temperatures. This unique writing utensil was used by both American astronauts and Soviet cosmonauts. Wooden pencils were considered a fire hazard in spaceships, where the atmosphere was 100 percent oxygen. And, as Stephen mentioned, NASA was briefly embroiled in a controversy about how much they were spending on mechanical pencils. That may be where the false narrative originated.
That’s it for the fact-check.
Coming up next week on No Stupid Questions: Stephen and Angela answer a listener’s question about teaching children financial responsibility.
DUCKWORTH: I do think that kids working is a good thing. I made, slash, asked my two daughters to work as soon as it was legal in the state of Pennsylvania.
That’s next week on No Stupid Questions. For that episode, we want to hear about your very first job. Were you a cashier? A babysitter? Did you run a killer lemonade stand? What did you take away from that experience? To share your thoughts, send a voice memo to NSQ@Freakonomics.com with the subject line “First Job.” Make sure to record in a quiet, indoor space with your mouth close to the phone, and please keep your thoughts to under a minute.
* * *
No Stupid Questions is part of the Freakonomics Radio Network, which also includes Freakonomics Radio, People I (Mostly) Admire, Freakonomics, M.D., and Off Leash. All our shows are produced by Stitcher and Renbud Radio. This show was mixed by Eleanor Osborne. We had help on this episode from Lyric Bowditch and Jacob Clemente. Our staff also includes Neal Carruth, Gabriel Roth, Greg Rippin, Morgan Levey, Zack Lapinski, Julie Kanfer, Ryan Kelley, Jasmin Klinger, Emma Tyrrell, and Alina Kulman. Our theme song is “And She Was” by Talking Heads — special thanks to David Byrne and Warner Chappell Music. If you’d like to listen to the show ad-free, subscribe to Stitcher Premium. You can follow us on Twitter @NSQ_Show and on Facebook @NSQShow. If you have a question for a future episode, please email it to nsq@freakonomics.com. To learn more, or to read episode transcripts, visit Freakonomics.com/NSQ. Thanks for listening!
DUCKWORTH: Let me just keep it simple for you, Stephen.
DUBNER: Hey, can I just say, I’m not that much of a simpleton.
Sources
- Max Bazerman, professor of business administration at the Harvard Business School.
- John Donohue, professor of law at Stanford Law School.
- Paul Ehrlich, biochemist and professor of medicine at the University of Berlin.
- Daniel Kahneman, professor emeritus of psychology and public affairs at Princeton University.
- Steve Levitt, professor of economics at the University of Chicago.
- Matt Salganik, professor of sociology at Princeton University.
Resources
- Complicit: How We Enable the Unethical and How to Stop, by Max H. Bazerman (2022).
- “Simplicity,” by Alan Baker (The Stanford Encyclopedia of Philosophy, 2022).
- “Cognitive Behavior Therapy Reduces Crime and Violence Over 10 Years: Experimental Evidence,” by Christopher Blattman, Margaret A. Sheridan, Julian C. Jamison, and Sebastian Chaskel (SocArXiv, 2022).
- “Fact Check: NASA Did Not Spend Billions on Space Pens While Russia Used Pencils” (Reuters, 2021).
- “Cognitive and Noncognitive Predictors of Success,” by Angela L. Duckworth, Abigail Quirk, Robert Gallop, Rick H. Hoyle, Dennis R. Kelly, and Michael D. Matthews (PNAS, 2019).
- “The Tyranny of Simple Explanations,” by Philip Ball (The Atlantic, 2016).
- “Political Extremism Is Supported by an Illusion of Understanding,” by Philip M. Fernbach, Todd Rogers, Craig R. Fox, and Steven A. Sloman (Psychological Science, 2013).
- “Grit: The Power of Passion and Perseverance,” by Angela Duckworth (TED, 2013).
- Thinking, Fast and Slow, by Daniel Kahneman (2011).
- “Paul Ehrlich’s Magic Bullet Concept: 100 Years of Progress,” by Klaus Strebhardt and Axel Ullrich (Nature Reviews Cancer, 2008).
- “Understanding Why Crime Fell in the 1990s: Four Factors That Explain the Decline and Six That Do Not,” by Steven D. Levitt (Journal of Economic Perspectives, 2004).
- “The Impact of Legalized Abortion on Crime,” by John J. Donohue and Steven D. Levitt (The Quarterly Journal of Economics, 2001).
- “I, Pencil,” by Leonard Read (1958).
- “Why Do We Underestimate the Influence of the Situation on People’s Behavior?” (The Decision Lab).
Extras
- “Abortion and Crime, Revisited,” by Freakonomics Radio (2019).
- “How to Change Your Mind,” by Freakonomics Radio (2019).