The Truth Is Out There…Isn’t It? A New Freakonomics Radio Podcast


Our latest Freakonomics Radio podcast is called “The Truth Is Out There…Isn’t It?” (You can download/subscribe at iTunes, get the RSS feed, listen live via the media player above, or read the transcript below.) In it, we try to answer a few fundamental questions: How do we know that what we believe is true? How do we decide which information to trust? And how do we quantify risk — from climate change to personal investments?

The program begins with Stephen Greenspan, a psychologist and an expert on “social incompetence” and gullibility. He knows from personal experience that even the smartest people can be duped into bad risk assessments, especially on the advice of people they trust. You can read more about him here (spoiler alert!).

We also talk with Dan Kahan of Yale Law School and Ellen Peters of Ohio State University, both of whom belong to the Cultural Cognition Project, a scholarly group focused on “how cultural values shape public risk perceptions.” We blogged earlier about their interesting finding: 

Greater scientific literacy and numeracy were associated with greater cultural polarization: respondents predisposed by their values to dismiss climate change evidence became more dismissive, and those predisposed by their values to credit such evidence more concerned, as science literacy and numeracy increased.

The authors hypothesize that people who are more numerate and scientifically literate are better at gathering information that confirms their existing beliefs. Kahan believes this happens, in part, for a pretty basic reason: we just want to fit in with our friends. So we work to maintain viewpoints that fall in line with our social group.

You’ll hear from professional skeptic Michael Shermer, who explains the evolutionary basis of funky risk-assessment practices. It all goes back to our hominid ancestors, he says, who needed to be on high alert to protect against predators.

Steve Levitt also chimes in:

If there’s one thing that human beings are terrible at, it’s assessing risk and knowing what to really fear versus the things we actually do fear. And the kind of things that tend to scare us are the things that evolution has bred into us. So, my wife is terrified of snakes, mice, flies, you know, butterflies, everything small that flies or that runs she’s terrified of. What are the chances that any of those are going to do her any harm in the modern world? Virtually nothing. I mean the things that you should be afraid of are French fries, and double cheeseburgers, and getting too much sun for skin cancer. Those are the kinds of things that really end up killing us in the modern world.

And what happens when our normal fears kick into overdrive? We talk to Nick Pope, formerly of the British Ministry of Defence, who for several years investigated UFO sightings for the government. (Some files from the Ministry’s UFO department have recently been made available at the British National Archives.) Since leaving government, Pope has been accused of being part of an elaborate government cover-up. He talks about the futility of trying to change a conspiracy theorist’s mind.

Climate change, hominid ancestors, UFO cover-ups, and smart people making bad decisions: all that and more in this week’s podcast.

Audio Transcript



Stephen GREENSPAN: Yes, life was pretty good until I got a phone call from my broker.

Stephen J. DUBNER: That’s Stephen Greenspan. He’s an emeritus professor of psychology at the University of Connecticut.

GREENSPAN: Hi Katherine.


DUBNER: And this is Katherine Wells. She’s one of the producers on our show. Hi Katherine.

Katherine WELLS: Hi Stephen.

DUBNER: So you are here with a story for us, yes?

WELLS: Right. A story about Stephen Greenspan. He has an interesting specialty: he’s an expert in what he calls “social incompetence.”

DUBNER: I have some of that.

WELLS: Which, you know, we all feel. What he means is he studies why people do dumb things.

DUBNER: Presumably that means … why smart people do dumb things?

WELLS: Right, that included.

DUBNER: And when he told you there that “life was pretty good,” what did he mean? What was so good exactly?

WELLS: Well, it was December, 2008, and he had a book coming out called Annals of Gullibility. The other thing that seemed pretty good was his financial situation. About a year earlier, he had invested in this hedge fund and it was doing pretty well, so he was getting nicely set up for retirement too. So one day in December he gets the first pre-release copy of the book, the gullibility book. And two days later, his broker calls.

GREENSPAN: I said, how are you? He said, terrible, it’s the worst day of my life. Now this is a man who had lost a son, so when he said it’s the worst day of my life that got my attention. And I said why? He said well Bernard Madoff just admitted that he was running a Ponzi scheme. And I responded, who is Bernard Madoff, and what’s it have to do with me?

DUBNER: Uh-oh. Katherine, I think we can kind of smell where this is headed.

WELLS: Right. This fantastic hedge fund that Greenspan had invested in turned out to be a feeder for Madoff’s Ponzi scheme. And Greenspan had no idea -- he didn’t remember ever even having heard Madoff’s name.

DUBNER: Oh, man. So the gullibility expert has been gulled.

WELLS: Right, gulled in a big, ironic way. He lost four hundred thousand dollars. Now, this was just about a third of his savings, so it wasn’t the total end of the world. And he should get some money back eventually from settlements. But he’s 70 now, he has two college-aged kids, and he’d really hoped to be retired by now. And, you know, he certainly didn’t want to be remembered in this way...

GREENSPAN: There was a columnist, a financial columnist in Canada who in his blog wrote: the first Greenspan, Alan, will be remembered as the economist who didn’t see it coming, while the other Greenspan, Stephen, will be remembered as the psychologist who forgot to read his own book on gullibility.

WELLS: I mean, it’s ironic, because Greenspan’s own research shows how even the smartest people can be duped.

GREENSPAN: I mean, a good example of that would be Sir Isaac Newton, the greatest scientist of all time, who lost over a million dollars -- in modern dollars -- in the South Sea bubble. And so he wrote, “I can calculate the orbit of heavenly bodies, but I cannot fathom the madness of men.”


WELLS: In reference to losing the money?


GREENSPAN: In reference to his own foolishness in putting all of his fortune at risk in something that he wasn’t really, in spite of his incredible brilliance, able to really understand or adequately calculate the risk of.  


WELLS: So in a way, you joined an elite club of brilliant, informed, educated people who can be fooled.


GREENSPAN: I joined the human race basically. Like Sir Isaac Newton and the South Sea Bubble, I knew nothing about Madoff and just basically went along with the crowd. And that’s powerful. We tend to take our cues from other people, especially in situations where we don’t quite know what to do. 

DUBNER: So it may no longer surprise us to learn that smart people sometimes make dumb decisions.

WELLS: Right, it’s like Greenspan says. It’s the instinct to “go along with the crowd” and to “take our cues from other people.” And that’s really what today’s show is about.

DUBNER: Right. And I want to talk about something else Greenspan mentioned, an even more elemental issue: how we make decisions about a risk that we just aren’t equipped to calculate. But here’s the thing: if it can’t be calculated, then maybe it’s not exactly a risk. About 100 years ago, the economist Frank Knight argued that risk and uncertainty are nearly identical but for one key difference. Risk can be measured; uncertainty, by its nature, cannot. But … what happens when you can’t tell the two of them apart?


ANNOUNCER: From WNYC and APM, American Public Media: This is FREAKONOMICS RADIO.  Today: The Truth Is Out There … Isn’t It?  Here’s your host, Stephen Dubner.

DUBNER: So Stephen Greenspan, the gullibility expert, loses a third of his life savings in what turns out to be a Ponzi scheme. Now, even if you feel sympathetic toward him, you might say, Hey, you know, he’s just one person. Bad things happen to people every day. At least the world didn’t end. But what if we were worried about something that might end the world? No, I’m not talking about an attack by alien nations – not yet, at least. That’ll come later in the program. I’m talking about…climate change. How are people like you and me supposed to calculate the threats from something like climate change? There’s so much complexity, so much uncertainty. So most of us do what Stephen Greenspan did when he was looking to invest. We take our cues from other people…

Al GORE: It’s not a question of debate. It’s like gravity. It exists.


Rush LIMBAUGH: The reason that you know you’re right is that you know things they don’t know. And because they don’t even have that baseline of knowledge to chat with you, they can’t understand where you’re coming from.  And that’s exactly how I feel talking to people who believe this global warming crap.


ABC WORLD NEWS: The science is solid, according to a vast majority of researchers, with hotter temperatures, melting glaciers, and rising sea level providing the proof.


Glenn BECK: When the University of Madison Wisconsin comes out with their definitive study, do I believe that? No! Do I believe scientists? No! They’ve lied to us about global warming. Who do you believe?

DUBNER: Who do you believe? That was Glenn Beck, by the way. Before him, from the top, you heard Al Gore and then Rush Limbaugh and an ABC World News report. When it comes to something like climate change, as fraught as it is with risk and uncertainty – and emotion! – who do you believe? And, more important, why?

Ellen PETERS: You know, my personal perception is that I don’t know enough about it, believe it or not. This is an issue that I think…


DUBNER: Wait, could you just say that again so that everyone in the world can hear an honest response? It’s so rare to hear some version of “I’m not quite sure” or “I don’t know.” So, sorry, say it again and then proceed.


PETERS: What I was saying, I’m not sure exactly what I believe on it in terms of the risk perceptions of climate change. It’s something that I don’t think I am personally educated on enough to have a really firm opinion about that.

DUBNER: That was Ellen Peters. She teaches in the psychology department at Ohio State University. She is part of a research group called the Cultural Cognition Project. They look at how the public feels about certain hot-button issues – like nuclear power and gun control – and then they try to figure out how much those views are shaped by cultural values. That is, not by empirical evidence, but by what they call “cultural cognition.” So, they recently did a study on climate change. How was it, they wanted to know, that the vast majority of scientists think the Earth is getting warmer because of human activity, but only about half the general public thinks the same? Could it be, perhaps … that people just don’t trust scientists? Here’s Dan Kahan. He’s another Cultural Cognition researcher and a professor at Yale Law School.

Dan KAHAN: Well, in fact, the scientists are the most trusted people in our society. The Pew Foundation does research on this, and this has been a consistent finding over time.


DUBNER: OK, so there goes that theory. That explanation won’t work for us then.


KAHAN:  Correct.

DUBNER: All right, so maybe people just don’t understand the science. Surveys have found that fewer than 30% of Americans are scientifically literate. Ellen Peters again:

PETERS: People have the belief that the reason that people don’t believe the risks of climate change are high enough is because they’re not smart enough, they’re not educated enough, they don’t understand the facts like the scientists do. And we were really interested in that idea and whether that’s really what was going on, or whether something else might matter.

DUBNER: So Peters and Kahan started out their climate-change study by testing people on their scientific literacy and numeracy, how well they knew math.

PETERS: And the items are things like: it is the father’s gene that decides whether the baby is a boy or a girl, true or false?



PETERS: So fairly simple.


DUBNER: Is it true?


PETERS: You know, I’m actually not even positive on that one. I think it’s the comb…Oh, no it has to be the father’s gene.


DUBNER: I’m putting my money on father, true.


PETERS: Father is true there, absolutely. Second question, antibiotics kill viruses as well as bacteria, true or false?


DUBNER: Negative.


PETERS: That one is absolutely false.

DUBNER: You can see why they wanted to know how people did on these questions before asking them about climate change.

PETERS: Numeracy in general, what it should do is it should help you to better understand information first of all. And that kind of comprehension is sort of a basic building block for good decisions across a variety of domains.


DUBNER: Right.


PETERS: But numeracy should also do other things. It should also help you just simply process the information more systematically. It should, in general, help you to get to better decisions that are more in line with the facts.


DUBNER: All right, so that makes perfect sense, but you have found something that kind of flies in the face of that haven’t you?


PETERS: We have. It’s the idea that people who are highly numerate and highly scientifically literate, they seem to actually rely on preexisting beliefs, on these sort of underlying cultural cognitions they have about how the world should be structured more than people who are less scientifically literate, or less numerate.


DUBNER: So, if I wanted to be wildly reductive, I might say the more education a culture gets, the more likely we are to have intense polarization at least among the educated classes, is that right?


PETERS: Based on our data, that’s what it looks like. It’s so interesting and so disturbing at the same time.

DUBNER: It is interesting, isn’t it? I mean, Peters and Kahan found that high scientific literacy and numeracy were not correlated with a greater fear of climate change. Instead, the more you knew, the more likely you were to hold an extreme view in one direction or the other -- that is, to be either very, very worried about the risks of climate change or to be almost not worried at all. In this case, more knowledge led to … more extremism! Why on earth would that be? Dan Kahan has a theory. He thinks that our individual beliefs on hot-button issues like this have less to do with what we know than with who we know.

KAHAN: My activities as a consumer, my activities as a voter, they’re just not consequential enough to count. But my views on climate change will have an impact on me in my life. If I go out of the studio here over to campus at Yale, and I start telling people that climate change is a hoax – these are colleagues of mine, the people in my community—that’s going to have an impact on me; they’re going to form a certain kind of view of me because of the significance of climate change in our society, probably a negative one. Now, if I live, I don’t know, in Sarah Palin’s Alaska, or something, and I take the position that climate change is real, and I start saying that, I could have the same problem. My life won’t go as well. People who are science literate are even better at figuring that out, even better at finding information that’s going to help them form, maintain a view that’s consistent with the one that’s dominant within their cultural group.  


DUBNER: So you’re saying that if I believe climate change is a very serious issue, it’s important that I align my life with that belief not because of anything I can do about it, but because it helps me fit in better in my circle; there’s more currency to my belief there. What about you? You’re in New Haven, Connecticut, at Yale. I gather you haven’t walked into a classroom and publicly declared that you believe climate change or global warming is a hoax, have you?


KAHAN: No, I haven’t done that.

DUBNER: This makes sense, doesn’t it? But it’s also humbling. We like to think that we make up our minds about important issues based on our rational, unbiased assessment of the available facts. But the evidence assembled by Kahan and Peters shows that our beliefs, even about something as scientifically oriented as climate change, are driven by a psychological need to fit in. And so we create strategies for doing this. Here’s my Freakonomics friend and co-author Steve Levitt.

Steve LEVITT: I think one of the issues with information gathering is that when people go to the trouble to learn about a topic, they tend not to learn about a topic in an open-minded way. They tend to seek out exactly those sources which will confirm what they’d like to believe in the first place. And so the more you learn about a topic, you tend to learn in a very particular way that tends to reinforce what you believe before you ever started.

DUBNER: Aha. So if you’re already scared of something, you tend to read more about how scary it is. And if you’re not worried -- then you don’t worry … right?

LEVITT: So if there’s one thing that human beings are terrible at, it’s assessing risk and knowing what to really fear versus the things we actually do fear. And the kind of things that tend to scare us are things that evolution has bred into us. So, my wife is terrified of snakes, mice, flies, you know, butterflies, everything small that flies or that runs she’s terrified of. What are the chances that any of those are going to do her any harm in the modern world? Virtually nothing. I mean the things that you should be afraid of are French fries, and double cheeseburgers, and getting too much sun for skin cancer. Those are the kinds of things that really end up killing us in the modern world.

DUBNER: Coming up: since we’re so bad at figuring out what’s really dangerous, let’s bring in the professionals, shall we?

Michael SHERMER:  I’m Mr. Skeptic. Anything that can be analyzed critically and skeptically, that’s what we do.

DUBNER: And: a cautionary tale about siding with the conspiracy theorists.

Nick POPE: I think somebody actually thought I was an alien myself.


ANNOUNCER: From WNYC and APM, American Public Media: This is FREAKONOMICS RADIO. Here’s your host, Stephen Dubner.

DUBNER: So as Steve Levitt sees it, we seek out information that confirms our preexisting biases, and we are congenitally bad at assessing risk. So how are people supposed to figure out what to be afraid of? Here’s Levitt again.

LEVITT: To know what to be afraid of, you need to go through an in-depth data collection process, you need to be properly informed. And people are too busy, rightfully too busy, leading their lives instead of dwelling on what the exact, almost infinitesimal probability is that any particular thing will kill them. So it’s sensible for people to be uninformed and it’s sensible to rely on the media. It just turns out that the media is not a very good source of information.

DUBNER: If you really wanted to make sure that every one of your beliefs was worth holding, you’d have to spend so much time gathering primary data that you’d have no time for anything else in life. You’d practically have to become a professional skeptic. And that’s not a job … is it?

SHERMER:  Uh, yeah, I’m Mr. Skeptic. Anything that can be analyzed critically and skeptically, that’s what we do. So, anything from UFOs and alien abductions to Bigfoot and conspiracy theories all the way up to things like, global warming and climate change and autism and vaccinations. We cover it all.

DUBNER: Michael Shermer, a professor at Claremont Graduate University, has a master’s degree in experimental psychology and a Ph.D. in the history of science. He’s also the publisher of Skeptic magazine and he writes books. His latest is called The Believing Brain.

DUBNER:  Now, as a professional skeptic, I’m guessing a lot of people look at you, or hear about a guy like you or read a book by you and think, oh man, that’s, like, the dream job. You know, people think, well, I’m a skeptic, I don’t believe anything. So, what do you have to do to be you, Michael?


SHERMER: Haha. Well, we actually do believe all sorts of things. You have to have all sorts of beliefs just to get out of bed in the morning. So the question becomes: which of your host of beliefs are really supported by evidence, which are questionable or probably not true, and which are based on instinct and intuition rather than solid evidence? That’s where the rubber meets the road. It’s not whether you believe something or not; of course, we all believe all sorts of things. The question is, are they true? And what’s the evidence? What’s the quality of the evidence?


DUBNER: Talk to me about how we end up believing what we believe in. I was going say, how we choose to believe what we believe in, but it sounds like it’s not really a choice, right?


SHERMER:  It isn’t really a choice, no. Our brains are designed by evolution to constantly be forming connections, patterns, learning things about the environment. And all animals do it. You think A is connected to B and sometimes it is, sometimes it isn’t, but we just assume it is. So my thought experiment is, imagine you’re a hominid on the plains of Africa, three and a half million years ago. Your name is Lucy. And you hear a rustle in the grass. Is it a dangerous predator, or is it just the wind? Well, if you think that the rustle in the grass is a dangerous predator and it turns out it’s just the wind, you’ve made a Type 1 error in cognition – a false positive. You thought A was connected to B, but it wasn’t. But no big deal. That’s a low-cost error to make. You just become a little more cautious and vigilant, but that’s it. On the other hand, if you think the rustle in the grass is just the wind, and it turns out it’s a dangerous predator, you’re lunch. Congratulations, you’ve just been given a Darwin award for taking yourself out of the gene pool before reproducing. So we are the descendants of those who were most likely to find patterns that are real. We tend to just believe all rustles in the grass are dangerous predators, just in case they are. And so, that’s the basis of superstition and magical thinking.
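Shermer’s rustle-in-the-grass story is, at bottom, a claim about asymmetric error costs, and the arithmetic is easy to check. Here’s a minimal sketch (the probability and cost figures are made-up numbers for illustration, not anything from the episode) of why “always assume predator” beats “always assume wind” even when predators are rare:

```python
# Illustrative-only numbers: compare the expected cost of two all-or-nothing
# strategies for responding to a rustle in the grass.

P_PREDATOR = 0.01            # assume 1 rustle in 100 is a real predator
COST_FALSE_POSITIVE = 1      # Type 1 error: fled from the wind (cheap)
COST_FALSE_NEGATIVE = 1_000  # Type 2 error: ignored a predator (you're lunch)

# Strategy A: treat every rustle as a predator (only Type 1 errors possible)
expected_cost_always_flee = (1 - P_PREDATOR) * COST_FALSE_POSITIVE

# Strategy B: treat every rustle as wind (only Type 2 errors possible)
expected_cost_never_flee = P_PREDATOR * COST_FALSE_NEGATIVE

print(f"always flee: {expected_cost_always_flee:.2f}")  # always flee: 0.99
print(f"never flee:  {expected_cost_never_flee:.2f}")   # never flee:  10.00
```

With these hypothetical numbers, perpetual jumpiness costs about 0.99 per rustle on average while complacency costs 10, which is Shermer’s point about why evolution favored the jittery pattern-finders.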


DUBNER: But then we get to something like climate change, which is, theoretically, an arena bounded entirely by science, right?


SHERMER:  You would think so.


DUBNER:  Yeah, you would think so.  So what do we find, actually?


SHERMER:  Either the earth is getting warmer or it’s not, right?


DUBNER:  Yeah.


SHERMER:  I mean, it’s just a data question. Well, because it also has ideological baggage connected to it, you know, left wing versus right wing politics, and so the data goes out the window.  It’s like, I don’t know—whatever the data is, I don’t know, but I’m going to be against it.  Now, I can’t just say, oh, I’m against it because my party is, or I just do what other people tell me. Nobody says that. What you do is, you make the decision, “I’m skeptical of that” or “I don’t believe it,” and then you have to have arguments.  So then you go in search of the arguments.


DUBNER: It doesn’t sound like it surprises you at all, then, that education -- level of education -- doesn’t necessarily have a big impact on whether you’re pro or con something. Correct?


SHERMER:  That’s right, it doesn’t. And giving smart people more information doesn’t help. It actually just confuses things. It just gives them more opportunity to pick out the ones that support what they already believe. So, being educated and intelligent, you’re even better at picking out the confirming data to support your beliefs after the fact.


DUBNER:  Let’s talk now for a bit about conspiracy theories, which we’re nibbling around the edges of. How would you describe, if you can generalize, the type of person who’s most likely to engage in a conspiracy theory that’s not true?


SHERMER: Well, their pattern-seeking module is just wide open. Their net, you know, is indiscriminate. They think everything’s a pattern. And if you think everything’s a pattern, then you’re kind of a nut.




POPE: I suppose I’m best known for having had a job at the government where my duties were investigating UFOs.

DUBNER: That’s Nick Pope. Until 2006, he worked for the British Ministry of Defence. And in the early ’90s, he headed up the office that handled reports of UFO sightings.

POPE: Flying saucer sightings, as they were called then.

DUBNER: His job was to figure out if any of these sightings had merit, and if perhaps there were extraterrestrial visitors.

POPE: To satisfy ourselves that there was no threat to the defense of the UK.

DUBNER: Pope came into the job as a skeptic. But some UFO reports, especially from pilots and police officers, got him wondering if perhaps we were being visited by aliens. Now, mind you, there was no hardcore confirmatory evidence. But Pope started talking -- in the media -- about the possibilities.

KRQE: You say you believe with 99% certainty that we’re not alone. So tell us what you’ve discovered.

POPE: Well I think it’s inconceivable in this infinite universe that we’re alone. And then that begs the question, if we’re not alone, are we being visited? It’s a related question.

POPE: When I started speaking out on this issue, I think some people in the UFO community thought that I might be some sort of standard-bearer for them.


DUBNER: Meaning one of them?


POPE: Yes, absolutely, that I could be a spokesperson for the movement. Of course I had the huge advantage that whilst everyone else had done this as a hobby, I’d done it as a job.


DUBNER: Did that make you a bit of a hero in the UFO community?


POPE: It did, and a lot of people still hold that view. They want me to come out and say, yes it’s all real and yes, I was part of a cover up. Their fantasy is what they call Disclosure with a capital “D”, as if there’s going to be some magical parting of the curtains and a moment where a crashed spaceship is revealed for all the world to see. Because I say, you know what, I don’t think that spaceship exists. So, in a sense I manage to upset everyone. I go too far for a lot of the skeptics by being open to the possibility, but I don’t go far enough for the believers, particularly the conspiracy theorists. And I get called things like “shill” and that’s one of the more polite things that I’ve been called.


DUBNER: Yeah, I’ve looked at some of the comments on YouTube from a speech you gave. I’ll read you a bit of it. We’ll have to employ our bleeping technician later. “Nick Pope, what a f****** spastic. He works, he quote, ‘works’ for the government, why else is he constantly on every bloody UFO program on every f****** channel. He talks enough bull**** to keep the UFO nutters happy while never actually saying anything of importance.” Let’s unpack that one a little bit, shall we Mr. Pope?


POPE: Yes.


DUBNER: It says you quote, “work for the government.” Do you still work for the government?


POPE: No, I don’t. This is in itself one of the great conspiracy theories: that in 2006 I didn’t really leave, that I just went under deep cover, and that they’re passing me wads of bank notes in a brown paper bag or something.


DUBNER: But here’s my favorite. There’s one claim on a UFO blog that you, Nick Pope, have been abducted by aliens yourself and now lie about it.


POPE: Well, yes I’ve heard that one. I’ve even seen one, which I think you might have missed. I think somebody actually thought I was an alien myself.


DUBNER: That would explain a lot wouldn’t it?


DUBNER: Nick Pope discovered a sad truth. The more transparent he tried to be -- the more information he released about himself and his work -- the more worked-up his attackers became. They took facts that would plainly seem to work in his favor and they somehow made these facts fit their conspiracies instead. But before we judge, consider how good we all are at deciding first what we want to believe, and then finding evidence for it. So what’s the solution? What can we do to keep ourselves headed down the road, albeit slowly and clumsily, toward a more rational, reasoned civilization? Here’s Ellen Peters again, from the Cultural Cognition Project.

DUBNER: So, I guess, the depressing conclusion one might reach from hearing you speak is that ideology trumps rationalism?


PETERS: I think that we are seeing some evidence for that in this study, but I don’t think that that has to be the final answer. I think that policy makers, communicators need to start paying attention to some of these cues that deepen cultural polarization. So for example, telling the other side that they’re scientifically inept? Probably a bad idea. Probably not the best way to continue people coming together on what the basic science really does say. Or, coming up only with solutions that are antagonistic to one side. And you know it if you’re listening to them that those are just antagonistic solutions -- again, probably not the best idea. It’s a sign or a signal that we’re not listening maybe as well to beliefs on the other side.

DUBNER: Dan Kahan agrees that, whatever the solution, none of us are able to go it alone.

KAHAN: What’s clear is that our ability to acquire knowledge is linked up with our ability to figure out whom to trust about what. And ordinary people have to do that in making sense of the kinds of challenges that they face. But, the amount that we know far exceeds the amount that any one of us is able to establish through our own efforts. Maybe you know that the motto for the Royal Society is Nullius in Verba, which means “Don’t take anybody’s word for it.” And it’s kind of admirable and charming, but obviously false.


DUBNER: Not very practical, is it?


KAHAN: Can’t be right. I mean, what would I do? I’d say you know, don’t tell me what Newton said in the Principia, I’m going to try to figure out how gravity works on my own.


DUBNER: And speaking of Isaac Newton -- remember what Stephen Greenspan told us earlier -- how Newton was suckered into this terrible investment? It’s heartening to learn that even Newton, the scientific sage, was able to acknowledge the flaws in his own thinking. And he left behind some advice that might be helpful for us all. He wrote, “to explain all nature is too difficult a task for any one man or even for any one age. 'Tis much better to do a little with certainty, and leave the rest for others that come after you, than to explain all things by conjecture without making sure of any thing.” In other words, don’t get too cocky.

ANNOUNCER: FREAKONOMICS RADIO is produced by WNYC, APM: American Public Media and Dubner Productions. This episode was produced by Katherine Wells. Our staff includes Suzie Lechtenberg, Diana Huynh, Bourree Lam, Collin Campbell and Chris Bannon. Our interns are Ian Chant and Jacob Bastian. David Herman is our engineer. Special thanks to John DeLore. If you want more Freakonomics Radio, you can subscribe to our podcast on iTunes or go to where you’ll find lots of radio, a blog, the books and more.

Comments
  1. Jay Turpin says:

    I really liked the episode. The only thing that bothered me was that the term “climate change” was used repeatedly. I think you meant “man-made climate change”. I think most people agree that the climate is changing, but the question about our overall impact is far from being settled. Scientists not sharing data to allow others to validate their experiments, attempting to squash competing ideas (such as the effect of solar activity on the climate) and exaggerating the facts are not making matters any better.

  2. Zach S says:

    I couldn’t help but notice the deep irony at the root of this podcast. There is a throwaway line midway through where Levitt notes that the media isn’t such a great way to find truth. Well, why is that? I wish this particular issue had been delved into a little deeper, because I think this very podcast merely adds fuel to the fire, at least with regard to the climate change debate.

    Over the past few years or decade or however long you want to go back, “the science of climate change” has become an on-the-one-hand-but-on-the-other-hand discussion. Skepticism on this particular issue, much like abortion, was born (excuse the pun) a political issue. And the way we address issues we deem to be political in this country (and others to be sure) is to put up a split screen of two paid consultants testing the limits of the volume control on your television set and radio. “The media” — the same world where this podcast lives, incidentally — presents global warming (like abortion) as an issue where there are two sides, one side to be selected. And the more the idea that there are “two sides” is solidified in our consciousness, the more likely we are to see the sides as even.

    We don’t do this with gravity, the existence of bacteria/viruses, or the question of whether the world is going to end in 2013 because “we” have come to reasonable conclusions about those things. The science says, period. To argue otherwise, even earnestly, is to waste the time of listeners. The more we pretend there are two sides, the more likely we are to want to “teach the controversy.”

    We should be clear when we talk about the roots of skepticism, because by pretending not to understand the “we have to choose from these two sides of this issue” trap that the hated “media” seems to fall into (for eyeballs and ad revenue), we are doing ourselves a disservice. Global warming, evolution, and beginning-of-life skepticism are on the rise because there is a media industry and political system devoted to there being skepticism about these particular issues.

    Frankly, I am disappointed that the Freakonomics crew — a team so hell-bent on following the data — did not recognize the irony of presenting an issue like climate change as a mere war between “two sides.” The presentation itself is an unscientific, political interpretation of a wholly scientific question.

    When we are presented with “two sides,” we tend to want to give those sides equal weight. But when we talk about science, we are not talking about flipping a coin. As our UFO friends will — without a hint of irony — tell us, the truth is out there. At least on that point, they are right.

    • James Briggs says:

      Most people (90 percent, I’d say) choose sides. They choose one side or the other and find reasons to support it. There are procedures for finding the truth. They don’t always work, but they tend to work. Most people reject them. (I agree with global warming. If I don’t say it often it drives people crazy.) You saw my debate with the pro-global warming guy. He assumed that because I said his side was less than perfect, I had to be against global warming. I am not. Then he said I knew nothing about science because I wanted people to look at the data before they decide. Being opposed to looking at data is a sign of irrationality. Only a sign, though; we must keep an open mind. The point is that irrationality is the norm.

  3. Dan K. says:

    This was a very thought-provoking piece. The idea that more education in a subject could potentially cause more polarization and that many times, we seek out reaffirmation of our own world view to “fit in” rather than to find the truth, is fascinating.

    However, one loophole is that the authors of this piece, as well as the cited studies used to support this premise, may have the very same bias that they are reporting on. And perhaps this piece reinforces the worldview of people (like me) who are drawn to Freakonomics and are looking for reaffirmation of their own indelible truths. It’s a Catch-22 of sorts.

    • James Briggs says:

      Most education teaches facts. It used to teach thinking. People who read the classics over time tend to develop patterns of thinking that encourage finding the truth. If people are taught the scientific method, not just a particular science, over a long enough period of time, they will learn to use it when it is important. More importantly, they will tend to respect a cogent argument when they hear it.

  4. Sam Morrill says:

    I would like to preface this comment by saying that I am a big fan of Freakonomics. I’ve read your work, I listen to your podcast (somewhat obsessively) and I follow your blog via Twitter. Generally speaking, I love the way you hack away at conventional wisdom and find new angles to approach tired solutions to old problems. However, I take serious issue with your approach to the issue of climate change.

    In your most recent podcast, you cite Pew research showing that scientists consistently rank as the most trusted professionals in the United States. In general, this may be true, but keep in mind that the scientists Americans have the most direct contact with are their physicians. Since physicians are primarily tasked with keeping us alive, we as their patients have a vested interest in trusting them and following their advice. Climatologists and other environmental scientists, on the other hand, do not provide this sort of vital service to us. On the contrary, their advice is often opposed to our immediate interests, since human comfort and convenience often come at the expense of the environment. Therefore, I suspect that if you were to take a closer look at the numbers (rather than just lumping all scientists together), you would probably find significantly less trust of those scientists who do not serve our direct and immediate interests the way physicians do.

    Furthermore, I found it particularly offensive when you interviewed Steven Levitt and had him liken (albeit indirectly) the fear of climate change to some of our less rational fears: spiders, sharks, earthquakes, and so on. This analogy makes a huge omission: whereas our fears of animals and natural disasters are largely rooted in a fear of bodily harm to ourselves, the fear of climate change is largely rooted in a fear of the havoc it may wreak on our children, our children’s children, and society as a whole. In other words, the fear of climate change is more a selfless fear than a selfish one.

    This brings me to my final point: the issue of selfishness versus selflessness. Considering that the credo of Freakonomics is the “study of incentives – how people get what they want or need,” I’m surprised that you spent no time discussing the incentives that drive each side of the climate debate. On one hand, you have climate science deniers who are largely driven by the tacit understanding that any sort of action to combat climate change would likely require changes to their lifestyle. On the other hand, you have those of us who are greatly concerned about climate change, knowing full well that we may need to make sacrifices to offset the threat, but who are willing to because it is the right thing to do. I take no issue with Freakonomics putting climate science under a microscope, but it is truly regrettable to grant (false) equivalency between those who have their heads in the sand and professionals who are dedicating their lives to seeking a real solution to a real problem. You would do us all a better service by dedicating thirty minutes to discussing why some people are able to put aside their immediate self-interests and why others, sadly, are not.

    • Jay Turpin says:

      Wow! Did you really mean to sound so sanctimonious? Skeptics are all lumped into a single, selfish, homogeneous group? Don’t be so close-minded.

      I’m more than willing to make sacrifices to do my part for the environment. I’m a skeptic because open, scientific debate is being suppressed. Data and models used to make predictions should be freely available so other scientists can validate and replicate conclusions. Different views should be analyzed critically, not shut down because they don’t follow the party line, like Henrik Svensmark’s cosmic ray theory.

      I’m not an expert, nor do I claim vast knowledge in this area, but please don’t categorize skeptics as ignorant fools.

      • Sam Morrill says:

        As I said in my post, “I take no issue with Freakonomics throwing climate science under a microscope.”

        It is one thing to have a well-informed debate over the details of climate change. It is another thing to either flat out deny that climate change is occurring or to acknowledge that it is happening and completely reject the notion that we are a driving force behind it.

        The Freakonomics podcast lent equal credence to the extremes on both sides of the debate, which is what I took issue with. I would much rather we overreact to a real threat than deny its existence altogether or, even worse, surrender to it as so many seem to have done.

      • James Briggs says:

        As I see it, if one levels even the slightest criticism of the global warming movement, it is assumed that one flatly denies global warming.

        It seems the global warming people would be right at home with Barry Goldwater’s statement: I would remind you that extremism in the defense of liberty is no vice! And let me remind you also that moderation in the pursuit of justice is no virtue!

    • James Briggs says:

      Anyone who follows Freakonomics should know there are hidden rewards in everything. As for climate change: what do you think will happen to climate researchers as more and more people accept global warming? They will get money, gobs of it. Zillions of dollars. Well, maybe not zillions, but a lot. They will set up institutes and have any number of PhDs working for them. The money connection among supporters of global warming has been well documented. How much money did Al Gore make from his movie? If the worst predictions are true, the whole global economy will change, and the people in charge of that change will rule the global economy and control more money than anyone ever has in history.

  5. Tecumseh says:

    So, “scientifically literate” people (SLP) are MORE polarized on global warming, and your conclusion is that people who should know better are MORE susceptible to social pressure and confirmation bias?!


    Have you tested how polarized SLP are on, say, life on other planets, or evolution? My anecdotal observation is that even those religious practitioners who are also SLP believe in all or most of the tenets of evolution.

    ***So, how polarized are SLP on other issues?***

    Here’s an alternate hypothesis:

    1) One group of people stands to make billions of dollars and amass unprecedented power if their claims can be given force of law.

    2) They are supported by like minds whose livelihood and credibility are based on *narrative*. Facts be damned.

    3) Another group of SLP has no immediate skin in the game. They see:

    3.1) The preponderance of geological, astrophysical, historical, and meteorological evidence is ignored.

    3.2) *ALL* of the solutions deemed acceptable involve a massive sweep of tyranny against the ideals of individual liberty and free economics.

    3.3) Data is fabricated on a massive scale, the very meaning of temperature measurement is redefined, not for accuracy but for narrative. Historical temperature records are “adjusted” so that they no longer match first-hand accounts from the time.

    3.4) Lies and censorship are accepted practice and, when exposed, the whistle-blowers are attacked or ignored.

    3.5) Every single scare-mongering prediction made fails to pan out. Unlike in any other scientific endeavor, the theory is not questioned and the bad predictors are not held accountable.

    3.6) One side is couched in the mind-set and phraseology, not of science, but of religion. There is no such thing as “settled science”, and “scientific consensus” is an oxymoron. Scientists are not priests to be ordained by ideologues, and one schoolgirl, with a new fact, is worth 1000 “scientists”.

    SLP are not more susceptible to unscientific behaviors. No, the subject is not science anymore; it has been corrupted by vast amounts of political meddling.
    It’s not SLP polarized against SLP.
    It’s SLP polarized against wanna-be tyrants, con men, and their useful idiots. (If the latter groups are SLP, it is secondary.)

  6. David says:

    I find the references to Newton interesting. Of course, his 30-year investigation of alchemy, which he pursued because he was convinced it would connect the earthly and heavenly worlds, would also have made a fine example for this program.

  7. Matt Albee says:

    Early in the podcast it is stated that the “general public” is much more divided on the issue of climate change than the scientific community. I have read elsewhere that while this is true of the *American* general public, the same is not true in other parts of the world (notably Europe) where the facts of climate change are not disputed, even by politicians on the right.

    • James Briggs says:

      Just because a European believes something doesn’t make it so. I do believe in global warming, but I am very suspicious of the motives of those who use fallacious arguments and strong-arm tactics to get people to agree with them.

  8. Mark says:

    Actually, I find most people’s minds can be changed if one starts from basic principles and points where people agree. For example, on climate change, people will often readily go from pure skeptic to accepting that humanity has an effect on climate and that it can be significant. But to do that, one has to admit the weaknesses of one’s position and try to find something other than total victory. On climate change, it is very hard for me to argue convincingly that climate models do much more than give us potential scenarios — at least to argue it convincingly enough to change minds. Why? Because of the fluid dynamics involved and the inherently chaotic nature of the phenomena. The average person will ask: how can you predict long-term climate effects when we can’t even predict the weather a week out? And you know what? That is a VERY good question when one takes a step back from viewing the science within the blinders of the “science.” That is, when I step outside of the community of climatologists that views certain tests and models as “accepted,” it is harder to objectively say that the science — the predictive ability of the science — is settled. While I may believe that the better evidence, given what we know today, is that a substantial amount of warming exists and that much of it is caused by human activity, it is hard to say, given the limitations of the science per se, that one can be very precise. Any policy prescriptions are thus inherently “political” for that reason.

    I think that if we were to be more humble in our knowledge and correctness, we might find it easier to convince others.
