Episode Transcript

DUCKWORTH: Whichever comes first — free will or your head exploding.

*      *      *

DUCKWORTH: I’m Angela Duckworth.

MAUGHAN: I’m Mike Maughan.

DUCKWORTH + MAUGHAN: And you’re listening to No Stupid Questions.

Today on the show: is anyone truly evil?

DUCKWORTH: This is a bad apple. This is a bad person.

*      *      *

DUCKWORTH: Mike, we have a question from Alissa on evil.

MAUGHAN: Oh, gosh. Okay.

DUCKWORTH: Just a light topic.

MAUGHAN: I almost wish I had a malevolent laugh right there, but I, I don’t know that I have one.

DUCKWORTH: Like a Disney villain laugh. All right. Alissa asks, “What exactly do you mean when you use the word ‘evil?’ I’m especially curious because the other day my friend and I got into a surprisingly heated yet curt debate. We were talking about whether my neighbor is mentally ill or evil. I said, ‘You have to be religious or believe in some supernatural force to believe in evil.’ She firmly refuted and had no interest in going deeper. Clearly, I do. Before my friend shut down our conversation, she asked, ‘Was Hitler evil?’ I replied, ‘He was likely mentally ill.’ And she said, ‘That sounds like an excuse.’ Please, Angela and Mike, enlighten me.”

MAUGHAN: Well, what I’m mostly curious about is what happened in her neighborhood, because it sounds like some great neighborhood drama if a neighbor is described as “evil.”

DUCKWORTH: Yeah, like, what’s going on? It is an interesting question. Like, is there evil? And did you read Lolita, by Nabokov?

MAUGHAN: No, I haven’t. 

DUCKWORTH: What!? 

MAUGHAN: I know. There’s so many books in the world.

DUCKWORTH: You know the plot, right? Vladimir Nabokov is writing about this pedophile named Humbert Humbert. And the little girl that he entraps and so forth is named Lolita. It’s told, like, from the interior of his worldview, like, you really mostly see the world through the eyes of Humbert Humbert. You begin the book thinking, like, “This is pure evil.” And at the end of the book you’re disgusted with yourself because you have spent all this time with Humbert Humbert — 

MAUGHAN: In his mind. 

DUCKWORTH: And you feel some sympathy? You know, you see the nuances and so forth. So, I, I think this is a question people — I certainly asked myself when I was reading Lolita, like, “Is there such a thing as evil?” Like, what is evil? And I think actually this is a case in which there is a consensual psychological definition. Let me put it out there for you and see if you agree. I think when most people use the word “evil,” first of all, they mean human evil. Even if, like, say, an animal kills a human, I don’t think we think of the animal as being “evil.” I think we think it’s a uniquely human capacity. I think also the definition includes harm. So, lots of ink has been spilled in philosophy, but also in psychology, about morality. Like, what are our moral instincts? And one of the primary moral instincts, which is true across societies, and is actually also evident very early in children, is just the idea that you do no harm to others.

MAUGHAN: Isn’t that the Hippocratic Oath?

DUCKWORTH: It is! It is the Hippocratic — “first, do no harm.” Jon Haidt, the psychologist, has argued that all societies — and the left and the right — agree that “do no harm” is a moral foundation. He would also argue, I think, that fairness, like, “tit for tat,” or like, you know, people being treated under some rule that is consistent — he would argue that that is universal across certainly the United States, you know, whether you’re on the left or the right. And then he’s kind of famous for arguing that there are three other moral foundations that are less universal. And those are loyalty, respect for authority, and purity. And he wants to argue that those are moral foundations that are not shared consensually and perhaps are more embraced by the right than they are by the left. But getting back to evil, there aren’t a lot of people that I know who would disagree with a definition that is something like intentional harm perpetrated by a human. And I think the emphasis is on intentionality. I think if you accidentally do great harm, say, you accidentally set a fire or you accidentally do something terrible with your car — I, I can’t even name these things because now they’re, like, making me freak out.

MAUGHAN: Well, I mean, if, say, you’re a single parent, you are exhausted, you are driving, and you fall asleep at the wheel, and you plow into a building or a restaurant or something and kill a lot of people. I don’t think anyone’s going to say you’re an evil person. That’s a horrific, tragic, awful situation. But not “evil” because, like you’re saying, intentionality was not there.

DUCKWORTH: And then I think people are like, “Well, what about the responsibility?” But I would agree that there is a pretty consensual definition that it has something to do with intentional harm perpetrated by a person. So, like all questions, you keep asking until you start talking about free will.

MAUGHAN: Or until you — your head explodes because there’s so many iterations.  

DUCKWORTH: Whichever comes first — free will or your head exploding. This is a conversation that we will not tie up with a bow because it’s like free will, evil. But I do want to, you know, tell you about one study that was done that I think actually, just, like, pries open the door on, like, how deep this question really is. So, there is a very famous study. It was published in Science, arguably the most prominent journal in scientific research. And it’s called, “The Double-Edged Sword.” It’s about when judges make sentences about people who have done, like, you know, terrible violent crimes. If, as is often the case, these criminals have a diagnosis of a psychiatric disorder, does that matter? So, these are U.S. state trial judges, 181 of them, and they’re all given a hypothetical case. And the convict in this case is diagnosed with psychopathy. So everybody gets that, but some of the judges are presented evidence that suggests a biomechanical cause: there’s evidence that their brain — like, you know, “We put them in the M.R.I., and we’re, like, seeing these lesions,” or maybe you hear that they have a faulty amygdala. The amygdala is the part of the brain — it actually does a lot of things, but it does fear. A lot of moral psychologists would argue that if you don’t have an amygdala, you don’t properly develop moral instincts. So, if you’re a judge, you have to rule on the sentence, like what is the punishment for this crime? And you’re either a judge who gets, “Okay, this person, they did this terrible violent crime, they’re a psychopath, they have a faulty brain, right?” Or you’re a judge who gets all of the same information without the expert testimony on the biomechanism.

MAUGHAN: Okay. So, just to summarize on my side. Because often a judge will also get mitigating circumstances. You know, “This was this individual’s childhood. They were raised in this tough situation.” Or, “They were exposed to alcohol or abuse” or whatever. You’re not even talking about any of this. This is just pure, biological, their brain doesn’t function, right?

DUCKWORTH: The only thing that varies across the conditions, I believe, is this, you know, is there a biomechanism present? And of course, one of the important things of running experiments, typically you don’t know about the other group. So, you’re just a judge, and you get this scenario and you’re like, “Well, what would you rule?” And what the question was in this article was, like, “How many years do you sentence this person to prison?” And knowing about this biomechanical explanation makes them give less punitive sentences. It makes a difference between a ruling of 12.83 years versus 13.93. So, I guess a little more than a year. So, that study is very important. It’s very highly cited, because it suggests that when we have an explanation — and certainly a biological explanation, but maybe you would argue, like, any explanation — it changes the way we perceive the guilt, the evil, of the person.

MAUGHAN: It’s interesting, because there’s the role of a judge in safety and society, and if they have a damaged amygdala, then are they going to be a threat to society even more in the future? And I, I’m not a lawyer. I’m not a judge. But that’s something that comes to my mind: “Okay, so, do we need to be even more careful upon release, and what happens to this individual?”

DUCKWORTH: I think you’re right that when you just, on the face of it, consider: you’re a judge, you’re getting this evidence, now you get these brain scans and you’re like, “Whoa, this person’s brain is not like other people’s brains,” you could argue that that judge is going to say, “I think they should get a longer sentence.” Because you could argue that, you know, a sentence is primarily for the protection of society, not for the punishment of an act that was wrong — more like a practical thing to, like, keep all of us safe from this person who might do a bad thing again. And so, that’s why this article is called, “The Double-Edged Sword.” And they say, “Look, it could go one way, it could go the other.” What does the evidence say when we randomly assign judges to condition? And they say, “Well, turns out it goes this way.” When they did content analysis of the reasoning that judges gave for their sentence, they found that the proportion of judges that list mitigating factors, meaning, like, “psychopathy should bear on the sentencing,” increased from about 30 percent to about 48 percent when you go from the condition lacking evidence of biology to the one with biology. In other words, it looms large in our minds. I think that classic study is pretty consistent with this research by this guy that — have I ever mentioned Adrian Raine?

MAUGHAN: I do not know the name Adrian Raine.

DUCKWORTH: So, Adrian Raine is like one of the foremost criminologists who has argued a biological basis for violent crime. What he has found is that individuals in our society who do terrible, harmful things to other individuals in our society typically have dysfunctional brains. But I do want to say, on behalf of Adrian — but also on behalf of psychiatrists who study mental illness and violent crime — that the base rate for violent crime in the population that does have mental illness, it’s still really low. Most people who have those diagnoses do not actually do anything violent. It’s just that statistically, they’re more likely than people who have no diagnosis at all. But in general, you should not think that somebody who has a diagnosis — for example of schizophrenia or a personality disorder — is going to do something terrible and violent.  

MAUGHAN: So, what I’m curious about, because so much of what we’re talking about — and so much of the conversation that I see on a lot of this — is the individual versus the situation. One indication that I think is interesting on this topic: if you’ll remember, in October of 2017, there was a horrific mass shooting at an outdoor music festival in Las Vegas, where someone killed 58 people and injured hundreds of others. And this is a massive oversimplification, but in general many conservatives came out and talked about the individual and, and used the word “evil” a lot. So, President Trump, for example, described it as, quote, “an act of pure evil.” Progressives, on the other hand, tended to blame the violence on a lack of regulation, for example; in other words, they blamed it on the situation. “Hey, if we could just control guns,” right, “and do more to get some of these weapons off the street, etc.” And many in the Republican Party came out and said, quote, “You can’t regulate evil.” I think it’s obviously, in my mind, a combination of both. There are things we can do to fix situations, and we need to take into account the individual.

DUCKWORTH: Yeah. I mean, first of all, I think you’re right in the sense that a lot of people want to ask the question, like, “Is it the individual who’s evil?” Like, “This is a bad apple. This is a bad person. Some people are evil. Hitler was evil.” But then there are others who would say, “No, I want to think about the situation.”

MAUGHAN: Like, what happened to create a Hitler?

DUCKWORTH: Exactly. Like, what are the circumstances and so forth? You know, “Is it the individual? Is it the situation? Is it biology? Is it experience?” So, Mike, I think you and I would love to hear what our listeners think about this concept of evil. Is anyone truly all bad? Record a voice memo in a quiet place with your mouth close to the phone and email us at NSQ@freakonomics.com. Maybe we’ll play it on a future episode of the show. Also, here’s an easy way to be a force for good in the world: if you like this show and want to support us, tell a friend and spark a wickedly good conversation. You can also spread the word on social media or leave a review in your favorite podcast app. 

Still to come on No Stupid Questions: can good people do evil things?

MAUGHAN: I don’t know. It just doesn’t compute, it doesn’t make sense to me.

*      *      *

Now, back to Mike and Angela’s conversation about what it means to be evil.

DUCKWORTH: Any social scientist would tell you, it’s extremely complicated, but I do want to try to shine a light on it through a particular psychologist’s view, Phil Zimbardo. So, you’ve heard about Phil Zimbardo’s prison experiment, right? 

MAUGHAN: Of course I’ve heard of Phil Zimbardo.

DUCKWORTH: Did you learn about it in college?

MAUGHAN: Yeah, and we studied it in grad school.

DUCKWORTH: Oh, you did? At the Kennedy School of Government?

MAUGHAN: It was in a class that was about human behavior, leadership development, stuff like that. We read a book about it, uh, his book.

DUCKWORTH: Oh, you read The Lucifer Effect?

MAUGHAN: Yeah, I have it — I have it on the bookshelf right back there.  

DUCKWORTH: So, let me back up and just remind you of what the Stanford Prison Experiment was. It’s been years since it happened, but it really stands as one of the most controversial experiments. So, Phil Zimbardo at Stanford — and I believe this took place in the basement of the psychology building. He decided to create an artificial situation that would be like prison. And he was going to randomly assign people to either be, by flip of the coin, a prisoner or a prison guard. And he wanted to see what would happen when you put people in these roles. How strong are these social roles that we foist on people? And he, by design, tried to actually not only pick a random sample of people — so, you know, he wasn’t looking for psychopaths and violent criminals — but in fact he specifically was looking for the healthiest, you know, physically and mentally, volunteers that he could find. So he puts out this ad that he wants college students for a study of prison life, 75 people answer the ad and volunteer, and he gives them personality tests and then he interviews them. And he picks 24 — what he calls “the most normal, the most healthy.” So, he’s got “good apples,” as he would put it — or he’s tried to find them, and he has no reason to believe that there are any, like, truly evil people in the bunch. Then he randomly assigns them, flip of the coin, prisoner or guard. He actually — I mean, he had a — a flair for the dramatic. So, you volunteer for this experiment, and you know you’re going to be in it, but what you don’t know, for example, is if you’re randomly assigned to be the prisoner, oh, like, people show up at your house and handcuff you and take you off. You remember that detail, right?

MAUGHAN: Yes.  

DUCKWORTH: He wanted to be as realistic as possible. And, you know, what happens in that basement at Stanford is a shock to everyone, because he doesn’t actually give the prisoners or the guards many rules other than the fact that the prisoners have to stay where they’re supposed to, and the guards are in charge. So, he gives the power to the guard. And I think for Zimbardo, his definition of evil is not only doing intentional harm. For him, power is at the heart of evil acts — that there’s always this asymmetry of power and someone who abuses their power to do harm intentionally to others. But, you know, pretty soon you have these guards making up from only their own imaginations these terrible things — like, the guards got the prisoners to clean the toilet bowls with their bare hands. They stripped them naked. These guards sexually taunted the volunteer prisoners. They had them simulate sodomy.  

MAUGHAN: The whole thing is shocking to me still, and I’ve obviously read the entire book, I’ve learned about this many times, but to devolve so quickly into such barbarism. I don’t know. It just doesn’t compute, it doesn’t make sense to me.

DUCKWORTH: Because it’s just so hard to imagine that 24 volunteers — I mean, you could argue that these are probably, like, nicer than average people, right?

MAUGHAN: Yeah, and with no-to-limited instruction, for it to devolve so quickly into such disarray? That’s the massive cognitive dissonance for me.

DUCKWORTH: So, Zimbardo will say it happened. He was there. And by the way, when this was unfolding, right — so there’s day one of the prison, and there’s day two, and I think by day two there were kids who were having nervous breakdowns. Now day three. Now day four, and these, like, sadistic acts are becoming ever more extreme. So, it didn’t make it to day seven. It ended on day six.

MAUGHAN: Right, they stopped it early.

DUCKWORTH: And Zimbardo likes to point out that it wasn’t his conscience that stopped the prison experiment. It was his girlfriend’s. So, his girlfriend, who I believe was a Ph.D. student in psychology at the time, goes down to the basement and sees what’s going on and immediately says, “This has to stop.” And he listened to her. He later married her, and she later became the world’s foremost researcher on burnout. We’ve talked about her, and you’ll recognize her name, Christina Maslach. So, she later herself wrote that witnessing this happening was partly why she became so interested in the power of the situation. So, I think Zimbardo, as he would put it, is more interested in “bad barrels” than “bad apples,” because he thinks that the prisoner scenario that he created — and also Abu Ghraib and also Nazi Germany — like, that there are these evil situations. I don’t think he’s saying that there are no bad people. But I think, you know, when he talks about Rumsfeld coming down to sort out what’s going on in Abu Ghraib, I think he quotes Rumsfeld as actually saying, like, we have to find the “bad apples.” And he wants to say that, like, instead of looking for bad apples, you should see what it is about this barrel that’s making everybody in it so evil.

MAUGHAN: And here’s the thing: I think all of us would like to believe that we are not capable of committing some of these most evil acts — and hopefully we’re not, but I, I don’t know and, you know — 

DUCKWORTH: Like, you don’t know who Mike Maughan would be in the Stanford Prison Experiment, right?

MAUGHAN: I mean, I hope I know who Mike Maughan would be. But that’s what I’m saying. I think none of us want to believe ourselves capable of any of these things, and I think maybe that’s why it’s so disorienting — and why the Stanford Prison Experiment is so horrific is because it’s like, “No, that’s not possible.” But then it’s like, “Well, gosh, I bet those 24 people didn’t think it was possible.” I mean, you know the old parable of the two wolves?

DUCKWORTH: I think it’s supposed to be a Cherokee myth, right? The little boy who’s talking to his grandfather?

MAUGHAN: Yeah, exactly. A little boy talking to his grandfather, and he’s basically saying, “There are two wolves that live within us all — good and evil — and there’s a constant battle between them.” And the little boy says to grandpa, “Well, which wolf wins?” And the grandpa says, “Whichever one you feed, so feed the good.” And I bring it up only to say that, like, it’s easy to sit here and say, “I would never.” And in my heart of hearts, and in my mind, and everything in my core, I would think if I’m in that Stanford Prison Experiment, I would never. But then I have to acknowledge that, I don’t know, is there more to human psychology? Is there more to situational behavior? Is there more to the “barrel,” as you call it, besides the apple? And that’s a horrifying thing to have to ask the question about oneself — or one’s neighbor — or anybody else. 

DUCKWORTH: I think it might be helpful to hear from Zimbardo’s own perspective what these bad barrels are like. So, he calls this “the slippery slope of evil.” So, the first step is so small that it’s not noticeable. And, and Mike, you know, we’ve talked about Milgram and the famous Milgram experiment before, but you really can’t talk about evil without coming back to this classic experiment. You’ve also probably been taught this, like, more than once.

MAUGHAN: Yeah. You’re better at explaining it, though.

DUCKWORTH: So, Stanley Milgram lived, uh, you know, a while ago. He’s not alive anymore. He was, I believe, a professor at Yale. And he actually, in the version of the experiment that’s so famous, was very deliberate about trying to get kind of, like, “everyman.” He basically wasn’t looking for Yale undergraduates, but more “people off the street,” as he would say, like, barbers and clerks. There was an experiment that was advertised about memory, and he tried to recruit these volunteer adults. I think they were paid. And they were really in an experiment that had nothing to do with memory. So, as you’ll recall, you come into the lab, and you’re told that you’re going to be randomly assigned to be either a learner or a teacher. But actually, secretly, it’s all rigged, and the other person who walks in with you is an actor. And everybody in this experiment is actually quote, unquote, “randomly assigned” to be a teacher. So, you sit down in your chair, and then the experimenter gives you your directions. You’re supposed to basically quiz this person in the other room on these words. And when they get it wrong — because you have the answer key — you’re supposed to flip the next switch on electric shock. And I say “the next switch” because from left to right, it goes from low shock, 15 volts, all the way to the right: 450 volts. And it doesn’t say “fatal,” but it says, “XXX.” And at 375, it says “danger, severe shock” — so it kind of implies that these are fatal levels, you know, 450 volts of electricity, that you are inflicting on this other person who you think volunteered for the experiment just like you. And so, when I say the slippery slope of evil starts with a first step that is so small you can’t tell — what Milgram finds is that people, you know, they have no problem, like, you give 15, the guy goes, “Eh.” You know, 30, you know, like, 45. When you start to get, like, in the hundreds, like, you know, he’s, like, “Ow!” And then, as you recall, he’s complaining. He says he has a heart condition in many of the versions of the experiment, because Milgram ran, I think, 16 versions of this experiment. And the thing about the slippery slope of evil — I think Milgram, and Zimbardo, and really most psychologists would argue — is that part of what gets us into trouble is that the first act is not this dramatic change from normal. And I have to wonder — and I don’t know Christina Maslach — but I, I wonder about how she wandered down to the basement on the Stanford campus and saw for the first time this thing that had been gradually evolving over days, and I think for her, it just struck her between the eyes that, like, it has to stop right now.

MAUGHAN: Well, and that’s where, like, I’m actually not surprised at all by the part of the story saying that she comes down to the basement. I’m sure you’re familiar with the old Alexander Pope quote, but he says, “Vice is a monster of so frightful a mien” — a face, right? “That to be hated needs but to be seen. But seen too oft, familiar with her face, we first endure, then pity, then embrace.” And it’s sort of that idea of the gradualism, but she came in not having been a part of it every single day and just saw vice, if you will, of so frightful a face that she’s like, “What in the — is this? Stop this now.” Whereas, if you’re part of the gradualism —

DUCKWORTH: Like, you can imagine these Milgram volunteers — by the way, two out of three of them go all the way to 450. I mean, it’s kind of nuts. Two out of three!

MAUGHAN: While someone’s in the other room saying, “This hurts.”

DUCKWORTH: That they’re having a heart attack, right? And I think there’s a point where they just stop responding altogether, you know, as if they’re, like, slumped over, possibly dead. So, you can imagine, though, if the experiment were like, “Oh, okay, you got randomly assigned to be the teacher. Now you have to shock them at 450” — like, there’s something about the gradualism, like, I think that, you know, maybe we habituate to it. Also, we rationalize.

MAUGHAN: Well, and that’s the appeal to authority. It’s not me making the decision. There’s some experimenter telling me, “Keep going.” So, I’m like, “Well, it’s on them, not me.”

DUCKWORTH: Actually, I also don’t think — I don’t think they address you by name. I think they call you “teacher.” They’ll be like, “Teacher, continue.” Because many of these volunteers were like, “Wait, do you see that guy? Wait, I think that guy needs help.” “Teacher, please continue.” And actually, I think the script also includes that the experimenter will say, like, “I take responsibility for this.” Like, “Teacher, please continue.” And by the way, Milgram was not doing this out of whole cloth. He was actually thinking about, um, what happened in World War II with the Jews and the Nazis. He was trying to have some explanation for what the hell happened in an entire country where you had so many people doing so many evil acts. I actually have to say that I come to a slightly different conclusion at the end of the day than many social psychologists. I don’t think that it’s an excuse that the situation is a bad barrel. You know, you can talk about a circumstance that you can’t change, and certainly that exists. But I think most situations are at least partly under our control. So, my conclusion is, if you have any hint of being in a situation that brings out your worst, then you can choose the wolf by choosing the situation.

MAUGHAN: What I’m fascinated by are the people in those environments who basically bucked the trend and bucked the system. 

DUCKWORTH: Mmm, like, the one in three who did not go to the right-hand panel of shock in Milgram.

MAUGHAN: If I believe that it’s not that I’m good or evil, but there are these two wolves, and it’s which one I feed, then I want to be able to say, “Okay,   what can I do in that environment to ensure that I make the decision to not become” — 

DUCKWORTH: Or how can I not be in that environment? That’s what I want to say. 

MAUGHAN: And that’s fair. I guess maybe “yes/and.” If you’re in Nazi Germany, you could leave Nazi Germany, yes, or you could become an Oskar Schindler. Or if I think about the United States in the midst of McCarthyism and the Red Scare. Everyone is getting caught up in this mass hysteria about communism and people are turning on their neighbors and their friends and all of these things. Well, in the midst of all of this going on with McCarthyism, you have an Edward R. Murrow. Now, granted, he is a person of high status, with a platform — he’s one of the main news broadcasters and journalists, but he was one of the few who stood up to Senator McCarthy in the midst of this. I guess I’m saying if you can’t change the circumstance, you live in the United States in the midst of McCarthyism in the 1950s, what can you do to be the person that stands up?

DUCKWORTH: I want to agree that, like, if you’re in a situation that you can’t control, then I hope you do have that conscience. And by the way, one out of three people didn’t go all the way to 450. And there was a conscientious person in Abu Ghraib, who was the whistleblower. That’s important. But I also want to say this. In the Milgram experiment, in one of the experiments — because he ran different versions of this — it was the case that you got to see how other people behaved and how they decided. And  if you see another person protest and say, like, “No, I’m not doing it,” then you have a 90 percent chance of also protesting. So, if that’s true, if the people around us are a huge part — even just one person around you — it’s not “either/or.” Like, we all know moral people. I mean, I have to say — I know I talk about him too much — but I did marry this incredibly honest person. You know, many of us can choose our friends. We can choose the people that we spend a lot of time with, choose the people that we, you know, spend our lives with. Yeah, there are aspects of your situation that you can’t change, but if it’s that powerful that even seeing one person around you be honest or kind versus evil and sadistic, then, like, that’s an element of the situation you can choose. 

MAUGHAN: I guess what you’re saying is even in a macro environment, we can still surround ourselves in a micro situation, where McCarthyism may be happening, but I can surround myself with the right people. 

DUCKWORTH: What I’m saying is that if you can, you should. I’m not saying that you always can, but I think we can more often than we sometimes imagine. Mike, do you think Hitler was evil?

MAUGHAN: I do. I believe unequivocally Hitler was evil. There were a lot of other people who lived at that time in similar circumstances who didn’t become Hitler.

DUCKWORTH: I want to go on the record agreeing with you. You know, for all of my interest as a psychologist in slippery slope situations and the power of the situation on us that is unconscious and all-encompassing — I do believe that there are bad barrels, as Phil Zimbardo would say. But I also believe there are bad apples. And we can ask how the apples got that way, but I think Hitler was evil. I don’t know if we necessarily will resolve this debate between Alissa and her friend about whether the neighbor is mentally ill or evil, but I think whatever you feel, I will just say, the person who doesn’t feel the way you feel — well, I know how you feel about that other person. I mean, when we talk about these questions of morality and values, I can see how this country is being pulled apart, because it’s a natural human instinct also that when somebody answers the question differently than you, it is not like disagreeing about what ice cream flavor is best. It’s just so visceral that we are appalled by the people who don’t share our view.

MAUGHAN: I mean, even, even Alissa in her question to us said, like, my friend didn’t want to carry on the conversation. But it almost felt like they should have had a moral obligation to carry it on or something.

DUCKWORTH: Right, but I can also understand why it got heated. But, Alissa, it’s a conversation that I hope that more people have, whether you agree or disagree, because this, to me, is the reason why we talk in the first place, right? To wrestle with the nuances.

MAUGHAN: Absolutely. And I will just say my main takeaway from all this is that while there may be bad barrels in the world, follow the example of an Oskar Schindler or an Edward R. Murrow, and find a way to be a good apple.

Coming up after the break: a fact-check of today’s episode and stories from our NSQ listeners.

*      *      *

And now, here’s a fact-check of today’s conversation:

Contrary to popular belief, the phrase “first, do no harm” is not included in either the original or the modern versions of the Hippocratic Oath — a pledge that some, but not all, medical schools ask their graduates to take. And while the oath is often attributed to the ancient Greek physician Hippocrates, many scholars believe that he is not the original author. However, a version of “do no harm” is found in Hippocrates’s book Of the Epidemics, from 400 B.C.E. He writes that the physician must have, quote, “two special objects in view with regard to disease, namely, to do good or to do no harm.”

Also, Angela says that psychologist Philip Zimbardo arranged for the student volunteers who were randomly assigned to be prisoners in the Stanford Prison Experiment to be handcuffed and taken to their cells. It’s worth mentioning that the surprise arrests were performed by real Palo Alto police officers. The students were not only handcuffed, but searched, read their rights, and driven in a squad car to the police station for booking and fingerprinting. And we should note that U.C. Berkeley psychologist Christina Maslach had finished her Ph.D. at the time of the Stanford Prison Experiment and was no longer a student when she insisted that Zimbardo end the study.

Later, Mike and Angela discuss the controversial experiments performed by Stanley Milgram — a psychologist who coincidentally attended James Monroe High School in the Bronx at the same time as Philip Zimbardo. Angela says that she thinks Milgram ran 16 versions of the famous electroshock experiments. The study actually contained 23 conditions in which obedience levels varied enormously.

That’s it for the fact-check.

Before we wrap today’s show, let’s hear some thoughts about our previous episode on adulthood.

Shawn MAYO: Hey Angela and Mike, this is Shawn in New Brunswick and listening to your conversation about when do you become an adult, I actually have a very specific answer. When I was in college, I was having a conversation late at night in the hallways with one of my professors and he and I started talking about how we each lost parents as kids. And I said, you know, I’m becoming an adult and trying to figure stuff out, and what does exactly this mean? And he said, you know, Shawn, you’re not becoming an adult or a man now. You became an adult the day that your father passed away, because life gave you something that you had no choice but to deal with. And sometimes I think, you know, maybe the key thing there is about dealing with things that you have no choice but to deal with. You know, life will happen to us. And we don’t get to really decide on when that is, so some of us experience that sooner and some of us experience that later. Anyways, just wanted to share that, uh, moment of enlightenment from Gavin Buchanan, my Seneca College professor. Thanks a lot.

Rebecca LIZARDE: Hi, Mike and Angela. My name is Rebecca, and I’m from Los Angeles. I just graduated from college this past spring, and like Angela said, it definitely wasn’t one big moment of, “Oh, I’m an adult now.” That moment happened on the cross-country bike tour that I immediately started two days after graduation. On the second day of biking on this trip, two of my teammates and I were driving at the bottom of a mountain that my friends were at the top of biking. And our car broke down with all of our camping gear and food that we needed to camp at the top of the mountain for the night. And this started a very large misadventure of trying to figure out how to get the food and supplies to our friends. And I think this was just my realization that I was responsible for this situation and responsible for other people’s safety and my own safety. And so in that way, it was just a big wake-up call, and I’ve been thinking about it ever since.

Rob TRAISTER: Hi, NSQ. Regarding your episode about when people reach adulthood, I often wonder when people feel like an adult, because personally, at the age of 56, I often still don’t feel like an adult, even though when I look in the mirror, I very clearly see a 56-year-old man looking back at me. I asked my grandfather about this one time, and he told me that the first time he felt like an adult was on the day that I was born. I was his first grandchild, and he said that that definitely told him that he had reached adulthood. 

That was, respectively, Shawn Mayo, Rebecca Lizarde, and Rob Traister. Thanks to them and to everyone who shared their stories with us. And remember, we’d love to hear your thoughts about what it means to be evil. Send a voice memo to NSQ@Freakonomics.com, and you might hear your voice on the show!

Coming up next week on No Stupid Questions: What does it take to restore someone’s image after a scandal?

DUCKWORTH: Monica Lewinsky and I were interns at the White House in the very same summer.

That’s coming up on No Stupid Questions.

*      *      *

No Stupid Questions is part of the Freakonomics Radio Network, which also includes Freakonomics Radio, People I (Mostly) Admire, and The Economics of Everyday Things. All our shows are produced by Stitcher and Renbud Radio. The senior producer of the show is me, Rebecca Lee Douglas, and Lyric Bowditch is our production associate. This episode was mixed by Greg Rippin with help from Jeremy Johnston. We had research assistance from Daniel Moritz-Rabson. Our theme song was composed by Luis Guerra. You can follow us on Twitter @NSQ_Show, and you can watch video clips of Mike and Angela at the Freakonomics Radio Network’s YouTube Shorts Channel or on Freakonomics Radio’s TikTok page. If you have a question for a future episode, please email it to NSQ@Freakonomics.com. To learn more, or to read episode transcripts, visit Freakonomics.com/NSQ. Thanks for listening!

DUCKWORTH: “Ugh, this is so bad!”

Sources

  • Jonathan Haidt, professor of ethical leadership at New York University’s Stern School of Business.
  • Christina Maslach, professor of psychology at the University of California, Berkeley.
  • Stanley Milgram, 20th century professor of psychology at Yale University.
  • Edward R. Murrow, 20th century American broadcast journalist and war correspondent.
  • Alexander Pope, 17th-18th century English poet.
  • Adrian Raine, professor of criminology, psychiatry, and psychology at the University of Pennsylvania.
  • Oskar Schindler, 20th century German businessman.
  • Philip Zimbardo, professor emeritus of psychology at Stanford University.
