Episode Transcript

Stephen GREENSPAN: Yes, life was pretty good until I got a phone call from my broker.

Stephen J. DUBNER: That’s Stephen Greenspan. He’s an emeritus professor of psychology at the University of Connecticut.

GREENSPAN: Hi Katherine.

Katherine WELLS: Hey.

DUBNER: And this is Katherine Wells. She’s one of the producers on our show.

DUBNER: Hi Katherine.

WELLS: Hi Stephen.

DUBNER: So you are here with a story for us, yes?

WELLS: Right. A story about Stephen Greenspan. He has an interesting specialty; he’s an expert in what he calls “social incompetence.”

DUBNER: I have some of that.

WELLS: Which, you know, we all feel. What he means is, he studies why people do dumb things.

DUBNER: Presumably, that means: why smart people do dumb things?

WELLS: Right, that included.

DUBNER: And when he told you there that “life was pretty good,” what did he mean? What was so good exactly?

WELLS: Well, it was December 2008, and he had a book coming out called Annals of Gullibility. The other thing that seemed pretty good was his financial situation. About a year earlier, he had invested in this hedge fund and it was doing pretty well, so he was getting nicely set up for retirement too. So, one day in December, he gets the first pre-release copy of the book — the gullibility book. And two days later, his broker calls.

GREENSPAN: I said, “How are you?” He said, “Terrible, it’s the worst day of my life.” Now this is a man who had lost a son, so when he said it’s the worst day of my life, that got my attention. And I said, “Why?” He said, “Well, Bernard Madoff just admitted that he was running a Ponzi scheme.” And I responded, “Who is Bernard Madoff, and what’s it have to do with me?”

DUBNER: Uh-oh. Katherine, I think we can kind of smell where this is headed.

WELLS: Right. This fantastic hedge fund that Greenspan had invested in turned out to be a feeder for Madoff’s Ponzi scheme. And Greenspan had no idea — he didn’t remember ever even having heard Madoff’s name.

DUBNER: Oh, man. So the gullibility expert has been gulled.

WELLS: Right, gulled in a big, ironic way. He lost $400,000. Now, this was just about a third of his savings, so it wasn’t the total end of the world. And he should get some money back, eventually, from settlements. But he’s 70 now. He has two college-aged kids, and he’d really hoped to be retired by now. And, you know, he certainly didn’t want to be remembered in this way.

GREENSPAN: There was a columnist — a financial columnist in Canada — who in his blog wrote: the first Greenspan, Alan, will be remembered as the economist who didn’t see it coming, while the other Greenspan, Stephen, will be remembered as the psychologist who forgot to read his own book on gullibility.

WELLS: I mean, it’s ironic, because Greenspan’s own research shows how even the smartest people can be duped.

GREENSPAN: I mean, a good example of that would be Sir Isaac Newton, the greatest scientist of all time, who lost over a million dollars — in modern dollars — in the South Sea Bubble. And so he wrote, “I can calculate the orbit of heavenly bodies, but I cannot fathom the madness of men.”

WELLS: In reference to losing the money?

GREENSPAN: In reference to his own foolishness in putting all of his fortune at risk in something that, in spite of his incredible brilliance, he wasn’t really able to understand or adequately calculate the risk of.

WELLS: So, in a way, you joined an elite club of brilliant, informed, educated people who can be fooled.

GREENSPAN: I joined the human race basically. Like Sir Isaac Newton and the South Sea Bubble, I knew nothing about Madoff and just basically went along with the crowd. And that’s powerful. We tend to take our cues from other people, especially in situations where we don’t quite know what to do. 

DUBNER: So it may no longer surprise us to learn that smart people sometimes make dumb decisions.

WELLS: Right, it’s like Greenspan says: it’s the instinct to “go along with the crowd” and to “take our cues from other people.” And that’s really what today’s show is about.

DUBNER: Right. And I want to talk about something else Greenspan mentioned, an even more elemental issue: how we make decisions about a risk that we just aren’t equipped to calculate. But here’s the thing: if it can’t be calculated, then maybe it’s not exactly a risk. About 100 years ago, the economist Frank Knight argued that risk and uncertainty are nearly identical, but for one key difference: risk can be measured; uncertainty, by its nature, cannot. But what happens when you can’t tell the two of them apart?

*      *      *

DUBNER: So Stephen Greenspan, the gullibility expert, loses a third of his life savings in what turns out to be a Ponzi scheme. Now, even if you feel sympathetic toward him, you might say, “Hey, you know, he’s just one person. Bad things happen to people every day. At least the world didn’t end.” But what if we were worried about something that might end the world? No, I’m not talking about an alien attack – not yet, at least. That’ll come later in the program. I’m talking about…climate change. How are people like you and me supposed to calculate the threats from something like climate change? There’s so much complexity, so much uncertainty. So most of us do what Stephen Greenspan did when he was looking to invest: we take our cues from other people.

Al GORE: It’s not a question of debate. It’s like gravity. It exists.

Rush LIMBAUGH: The reason that you know you’re right is that you know things they don’t know. And because they don’t even have that baseline of knowledge to chat with you, they can’t understand where you’re coming from.  And that’s exactly how I feel talking to people who believe this global warming crap.

ABC WORLD NEWS: The science is solid, according to a vast majority of researchers, with hotter temperatures, melting glaciers and rising sea level providing the proof.

Glenn BECK: When the University of Madison Wisconsin comes out with their definitive study, do I believe that? No! Do I believe scientists? No! They’ve lied to us about global warming. Who do you believe?

Who do you believe? That was Glenn Beck, by the way. Before him — from the top — you heard Al Gore, and then Rush Limbaugh, and an ABC World News report. When it comes to something like climate change, as fraught as it is with risk and uncertainty – and emotion! – who do you believe? And, more important, why?

Ellen PETERS: You know, my personal perception is that I don’t know enough about it, believe it or not. This is an issue that I think—.

DUBNER: Wait, could you just say that again so that everyone in the world can hear an honest response? It’s so rare to hear some version of “I’m not quite sure” or “I don’t know.” So, sorry, say it again and then proceed.

PETERS: What I was saying — I’m not sure exactly what I believe on it, in terms of the risk perceptions of climate change. It’s something that I don’t think I am personally educated on enough to have a really firm opinion about that.

That was Ellen Peters. She teaches in the psychology department at Ohio State University. She is part of a research group called the Cultural Cognition Project. They look at how the public feels about certain hot-button issues – like nuclear power and gun control – and then they try to figure out how much those views are shaped by cultural values; that is, not by empirical evidence, but by what they call “cultural cognition.” So, they recently did a study on climate change. How is it, they wanted to know, that the vast majority of scientists think the Earth is getting warmer because of human activity, but only about half the general public thinks the same? Could it be, perhaps, that people just don’t trust scientists? Here’s Dan Kahan. He’s another Cultural Cognition researcher and a professor at Yale Law School.

Dan KAHAN: Well, in fact, the scientists are the most trusted people in our society. The Pew Foundation does research on this, and this has been a consistent finding over time.

DUBNER: OK, so there goes that theory. That explanation won’t work for us then.

KAHAN:  Correct.

All right, so maybe people just don’t understand the science. Surveys have found that fewer than 30 percent of Americans are scientifically literate. Ellen Peters again:

PETERS: People have the belief that the reason that people don’t believe the risks of climate change are high enough is because they’re not smart enough, they’re not educated enough, they don’t understand the facts like the scientists do. And we were really interested in that idea and whether that’s really what was going on, or whether something else might matter.

So Peters and Kahan started out their climate-change study by testing people on their scientific literacy and numeracy — how well they knew math.

PETERS: And the items are things like: it is the father’s gene that decides whether the baby is a boy or a girl, true or false?

DUBNER: True.

PETERS: So fairly simple.

DUBNER: Is it true?

PETERS: You know, I’m actually not even positive on that one. I think it’s the comb— oh, no it has to be the father’s gene.

DUBNER: I’m putting my money on father. True.

PETERS: Father is true there, absolutely. Second question: antibiotics kill viruses as well as bacteria, true or false?

DUBNER: Negative.

PETERS: That one is absolutely false.

You can see why they wanted to know how people did on these questions before asking them about climate change.

PETERS: Numeracy, in general — what it should do is it should help you to better understand information, first of all. And that kind of comprehension is sort of a basic building block for good decisions across a variety of domains.

DUBNER: Right.

PETERS: But numeracy should also do other things. It should also help you just simply process the information more systematically. It should, in general, help you to get to better decisions that are more in line with the facts.

DUBNER: All right, so that makes perfect sense. But you have found something that kind of flies in the face of that haven’t you?

PETERS: We have. It’s the idea that people who are highly numerate and highly scientifically literate — they seem to actually rely on preexisting beliefs, on these sort of underlying cultural cognitions they have about how the world should be structured, more than people who are less scientifically literate, or less numerate.

DUBNER: So, if I wanted to be wildly reductive, I might say the more education a culture gets, the more likely we are to have intense polarization, at least among the educated classes, is that right?

PETERS: Based on our data, that’s what it looks like. It’s so interesting and so disturbing at the same time.

It is interesting, isn’t it? I mean, Peters and Kahan found that high scientific literacy and numeracy were not correlated with a greater fear of climate change. Instead, the more you knew, the more likely you were to hold an extreme view in one direction or the other — that is, to be either very, very worried about the risks of climate change, or to be almost not worried at all. In this case, more knowledge led to more extremism! Why on Earth would that be? Dan Kahan has a theory. He thinks that our individual beliefs on hot-button issues like this have less to do with what we know than with who we know.

KAHAN: My activities as a consumer, my activities as a voter — they’re just not consequential enough to count. But my views on climate change will have an impact on me in my life. If I go out of the studio here over to campus at Yale, and I start telling people that climate change is a hoax – these are colleagues of mine, the people in my community — that’s going to have an impact on me. They’re going to form a certain kind of view of me because of the significance of climate change in our society — probably a negative one. Now, if I live, I don’t know, in Sarah Palin’s Alaska, or something, and I take the position that climate change is real, and I start saying that, I could have the same problem. My life won’t go as well. People who are science literate are even better at figuring that out — even better at finding information that’s going to help them form, maintain a view that’s consistent with the one that’s dominant within their cultural group.  

DUBNER: So you’re saying that if I believe climate change is a very serious issue, it’s actually more important that I align my life with that belief not because of anything I can do, but because it helps me fit in better in my circle — there’s more currency to my belief there. What about you? You’re in New Haven, Connecticut, at Yale. I gather you haven’t walked into a classroom and publicly declared that you believe climate change or global warming is a hoax, have you?

KAHAN: No, I haven’t done that.

This makes sense, doesn’t it? But it’s also humbling. We like to think that we make up our minds about important issues based on our rational, unbiased assessment of the available facts. But the evidence assembled by Kahan and Peters shows that our beliefs —  even about something as scientifically oriented as climate change — are driven by a psychological need to fit in. And so we create strategies for doing this. Here’s my Freakonomics friend and co-author Steve Levitt.

Steve LEVITT: I think one of the issues with information gathering is that when people go to the trouble to learn about a topic, they tend not to learn about a topic in an open-minded way. They tend to seek out exactly those sources which will confirm what they’d like to believe in the first place. And so, the more you learn about a topic, you tend to learn in a very particular way that tends to reinforce what you believe before you ever started.

Aha. So if you’re already scared of something, you tend to read more about how scary it is. And if you’re not worried — then you don’t worry. Right?

LEVITT: So if there’s one thing that human beings are terrible at, it’s assessing risk and knowing what to really fear, versus the things we actually do fear. And the kind of things that tend to scare us are things that evolution has bred into us. So, my wife is terrified of snakes, mice, flies, you know, butterflies — everything small that flies or that runs, she’s terrified of. What are the chances that any of those are going to do her any harm in the modern world? Virtually nothing. I mean, the things that you should be afraid of are french fries, and double cheeseburgers, and getting too much sun for skin cancer. Those are the kinds of things that really end up killing us in the modern world.

Coming up: since we’re so bad at figuring out what’s really dangerous, let’s bring in the professionals, shall we?

Michael SHERMER: I’m Mr. Skeptic. Anything that can be analyzed critically and skeptically, that’s what we do.

And: a cautionary tale about siding with the conspiracy theorists.

Nick POPE: I think somebody actually thought I was an alien myself.

*      *      *

So as Steve Levitt sees it, we seek out information that confirms our preexisting biases, and we are congenitally bad at assessing risk. So how are people supposed to figure out what to be afraid of? Here’s Levitt again.

LEVITT: To know what to be afraid of, you need to go through an in-depth data collection process, you need to be properly informed. And people are too busy — rightfully too busy — leading their lives instead of dwelling on what the exact, almost infinitesimal probability is that any particular thing will kill them. So it’s sensible for people to be uninformed, and it’s sensible to rely on the media. It just turns out that the media is not a very good source of information.

If you really wanted to make sure that every one of your beliefs was worth holding, you’d have to spend so much time gathering primary data that you’d have no time for anything else in life. You’d practically have to become a professional skeptic. And that’s not a job — is it?

SHERMER: Uh, yeah, I’m Mr. Skeptic. Anything that can be analyzed critically and skeptically, that’s what we do. So, anything from U.F.O.s and alien abductions, to Bigfoot and conspiracy theories, all the way up to things like global warming and climate change, and autism and vaccinations. We cover it all.

DUBNER: Michael Shermer, a professor at Claremont Graduate University, has a master’s degree in experimental psychology and a Ph.D. in the history of science. He’s also the publisher of Skeptic magazine, and he writes books. His latest is called The Believing Brain.

DUBNER: Now, you’re a professional skeptic, and I’m guessing a lot of people look at you — or hear about a guy like you, or read a book by you — and think, “Oh man, that’s, like, the dream job.” You know, people think, “Well, I’m a skeptic, I don’t believe anything.” So, what do you have to do to be you, Michael?

SHERMER: Haha. Well, we actually do believe all sorts of things. You have to have all sorts of beliefs just to get out of bed in the morning. And so, the question then becomes: well, which of your host of beliefs are the ones that are really supported by evidence, or are questionable, or are probably not true? Which are those that we base on instinct and intuition, and which are we basing on, you know, solid evidence? And so, that’s where the rubber meets the road. It’s not: do you believe something or not? Of course, we all believe all sorts of things. The question is: are they true? And what’s the evidence? What’s the quality of the evidence?

DUBNER: Talk to me about how we end up believing what we believe in. I was going to say, “how we choose to believe what we believe in,” but it sounds like it’s not really a choice, right?

SHERMER: It isn’t really a choice, no. Our brains are designed by evolution to constantly be forming connections — patterns, learning things about the environment. And all animals do it. You think A is connected to B. And sometimes it is, sometimes it isn’t — but we just assume it is. So my thought experiment is: imagine you’re a hominid on the plains of Africa, three-and-a-half million years ago. Your name is Lucy. And you hear a rustle in the grass. Is it a dangerous predator — or is it just the wind? Well, if you think that the rustle in the grass is a dangerous predator and it turns out it’s just the wind, you’ve made a Type 1 error in cognition – a false positive. You thought A was connected to B, but it wasn’t. But no big deal. That’s a low-cost error to make. You just become a little more cautious and vigilant, but that’s it. On the other hand, if you think the rustle in the grass is just the wind, and it turns out it’s a dangerous predator, you’re lunch. Congratulations, you’ve just been given a Darwin award for taking yourself out of the gene pool before reproducing. So we are the descendants of those who were most likely to find patterns that are real. We tend to just believe all rustles in the grass are dangerous predators, just in case they are. And so, that’s the basis of superstition and magical thinking.

DUBNER: But then we get to something like climate change, which is, theoretically, an arena bounded entirely by science, right?

SHERMER:  You would think so.

DUBNER:  Yeah, you would think so.  So what do we find, actually?

SHERMER:  Either the Earth is getting warmer or it’s not, right?

DUBNER:  Yeah.

SHERMER: I mean, it’s just a data question. But because it also has ideological baggage connected to it — you know, left-wing versus right-wing politics — the data goes out the window. It’s like, I don’t know — “Whatever the data is, I don’t know, but I’m going to be against it.” Now, I can’t just say, “Oh, I’m against it because my party is,” or, “I just do what other people tell me.” Nobody says that. What you do is, you make the decision: “I’m skeptical of that” or “I don’t believe it.” And then you have to have arguments. So then you go in search of the arguments.

DUBNER: It doesn’t sound like it surprises you at all, then, that education — level of education — doesn’t necessarily have a big impact on whether you’re pro or con on something. Correct?

SHERMER: That’s right, it doesn’t. And giving smart people more information doesn’t help. It actually just confuses things. It just gives them more opportunity to pick out the ones that support what they already believe. So being educated and intelligent, you’re even better at picking out the confirming data to support your beliefs after the fact.

DUBNER: Let’s talk now for a bit about conspiracy theories, which we’re nibbling around the edges of. How would you describe, if you can generalize, the type of person who’s most likely to engage in a conspiracy theory that’s not true?

SHERMER: Well, their pattern-seeking module is just wide open. The net, you know, is indiscriminate. They think everything’s a pattern. And if you think everything’s a pattern, then you’re kind of a nut.

POPE: I suppose I’m best known for having had a job at the government where my duties were investigating U.F.O.s.

That’s Nick Pope. Until 2006, he worked for the British Ministry of Defence. And in the early ’90s, he headed up the office that handled reports of U.F.O. sightings.

POPE: Flying saucer sightings, as they were called then.

His job was to figure out if any of these sightings had merit, and if perhaps there were extraterrestrial visitors.

POPE: To satisfy ourselves that there was no threat to the defense of the U.K.

Pope came into the job as a skeptic. But some U.F.O. reports, especially from pilots and police officers, got him wondering if perhaps we were being visited by aliens. Now, mind you, there was no hardcore confirmatory evidence. But Pope started talking — in the media — about the possibilities.

KRQE: You say you believe with 99 percent certainty that we’re not alone. So tell us what you’ve discovered.

POPE: Well I think it’s inconceivable in this infinite universe that we’re alone. And then that begs the question: if we’re not alone, are we being visited? It’s a related question.

POPE: When I started speaking out on this issue, I think some people in the U.F.O. community thought that I might be some sort of standard bearer for them.

DUBNER: Meaning one of them?

POPE: Yes, absolutely, that I could be a spokesperson for the movement. Of course, I had the huge advantage that whilst everyone else had done this as a hobby, I’d done it as a job.

DUBNER: Did that make you a bit of a hero in the U.F.O. community?

POPE: It did, and a lot of people still hold that view. They want me to come out and say, “Yes, it’s all real and yes, I was part of a cover-up.” Their fantasy is what they call Disclosure, with a capital “D,” as if there’s going to be some magical parting of the curtains and a moment where a crashed spaceship is revealed for all the world to see. Because I say, “You know what? I don’t think that spaceship exists.” So, in a sense, I manage to upset everyone. I go too far for a lot of the skeptics by being open to the possibility, but I don’t go far enough for the believers, particularly the conspiracy theorists. And I get called things like “shill,” and that’s one of the more polite things that I’ve been called.

DUBNER: Yeah, I’ve looked at some of the comments on YouTube from a speech you gave. I’ll read you a bit of it — we’ll have to employ our bleeping technician later. “Nick Pope, what a f****** spastic. He, quote, ‘works’ for the government — why else is he constantly on every bloody U.F.O. program on every f****** channel? He talks enough bull**** to keep the U.F.O. nutters happy while never actually saying anything of importance.” Let’s unpack that one a little bit, shall we, Mr. Pope?

POPE: Yes.

DUBNER: It says you, quote, “work for the government.” Do you still work for the government?

POPE: No, I don’t. This is, in itself, one of the great conspiracy theories: that in 2006, I didn’t really leave. I just went under deep cover, and that they’re passing me wads of bank notes in a brown paper bag, or something.

DUBNER: But here’s my favorite. There’s one claim on a U.F.O. blog that you, Nick Pope, have been abducted by aliens yourself and now lie about it.

POPE: Well, yes I’ve heard that one. I’ve even seen one, which I think you might have missed. I think somebody actually thought I was an alien myself.

DUBNER: That would explain a lot wouldn’t it?

Nick Pope discovered a sad truth. The more transparent he tried to be — the more information he released about himself and his work — the more worked up his attackers became. They took facts that would plainly seem to work in his favor and they somehow made these facts fit their conspiracies instead. But before we judge, consider how good we all are at deciding first what we want to believe, and then finding evidence for it. So what’s the solution? What can we do to keep ourselves headed down the road, albeit slowly and clumsily, toward a more rational, reasoned civilization? Here’s Ellen Peters again, from the Cultural Cognition Project.

DUBNER: So, I guess, the depressing conclusion one might reach from hearing you speak is that ideology trumps rationalism?

PETERS: I think that we are seeing some evidence for that in this study, but I don’t think that that has to be the final answer. I think that policy makers — communicators — need to start paying attention to some of these cues that deepen cultural polarization. So, for example, telling the other side that they’re scientifically inept? Probably a bad idea. Probably not the best way to get people coming together on what the basic science really does say. Or coming up only with solutions that are antagonistic to one side — and if you’re listening to them, you know that those are just antagonistic solutions — again, probably not the best idea. It’s a sign, a signal, that maybe we’re not listening as well to beliefs on the other side.

Dan Kahan agrees that, whatever the solution, none of us are able to go it alone.

KAHAN: What’s clear is that our ability to acquire knowledge is linked up with our ability to figure out whom to trust about what. And ordinary people have to do that in making sense of the kinds of challenges that they face. But the amount that we know far exceeds the amount that any one of us is able to establish through our own efforts. You know, the motto of the Royal Society is “Nullius in Verba,” which means “Don’t take anybody’s word for it.” And it’s kind of admirable and charming, but obviously false.

DUBNER: Not very practical, is it?

KAHAN: Can’t be right. I mean, what would I do? I’d say, “You know, don’t tell me what Newton said in the Principia, I’m going to try to figure out how gravity works on my own.”

DUBNER: And speaking of Isaac Newton — remember what Stephen Greenspan told us earlier, how Newton was suckered into this terrible investment? It’s heartening to learn that even Newton, the scientific sage, was able to acknowledge the flaws and shortcomings in his own thinking. And he left behind some advice that might be helpful for us all. He wrote, “To explain all nature is too difficult a task for any one man or even for any one age. ‘Tis much better to do a little with certainty, and leave the rest for others that come after you, than to explain all things by conjecture without making sure of any thing.” In other words: don’t get too cocky.

*      *      *

Freakonomics Radio is produced by WNYC, APM: American Public Media and Dubner Productions. This episode was produced by Katherine Wells. Our staff includes Suzie Lechtenberg, Diana Huynh, Bourree Lam, Collin Campbell and Chris Bannon. Our interns are Ian Chant and Jacob Bastian. David Herman is our engineer. Special thanks to John DeLore. If you want more Freakonomics Radio, you can subscribe to our podcast on iTunes or go to Freakonomics.com where you’ll find lots of radio, a blog, the books and more.
