The Truth Is Out There…Isn’t It? A New Freakonomics Radio Podcast


Our latest Freakonomics Radio podcast is called “The Truth Is Out There…Isn’t It?” (You can download/subscribe at iTunes, get the RSS feed, listen live via the media player above, or read the transcript below.) In it, we try to answer a few fundamental questions: how do we know that what we believe is true? How do we decide which information to trust? And how do we quantify risk — from climate change to personal investments?

The program begins with Stephen Greenspan, a psychologist and an expert on “social incompetence” and gullibility. He knows from personal experience that even the smartest people can be duped into bad risk assessments, especially on the advice of people they trust. You can read more about him here (spoiler alert!).

We also talk with Dan Kahan of Yale Law School and Ellen Peters of Ohio State University, both of whom belong to the Cultural Cognition Project, a scholarly group focused on “how cultural values shape public risk perceptions.” We blogged earlier about their interesting finding: 

Greater scientific literacy and numeracy were associated with greater cultural polarization: respondents predisposed by their values to dismiss climate change evidence became more dismissive, and those predisposed by their values to credit such evidence more concerned, as science literacy and numeracy increased.

The authors hypothesize that people who are more numerate and scientifically literate are better at gathering information that confirms their existing beliefs. Kahan believes this happens, in part, for a pretty basic reason: we just want to fit in with our friends. So we work to maintain viewpoints that fall in line with our social group.

You’ll hear from professional skeptic Michael Shermer, who explains the evolutionary basis of funky risk-assessment practices. It all goes back to our hominid ancestors, he says, who needed to be on high alert to protect against predators.

Steve Levitt also chimes in:

If there’s one thing that human beings are terrible at, it’s assessing risk and knowing what to really fear versus the things we actually do fear. And the kind of things that tend to scare us are the things that evolution has bred into us. So, my wife is terrified of snakes, mice, flies, you know, butterflies, everything small that flies or that runs she’s terrified of. What are the chances that any of those are going to do her any harm in the modern world? Virtually nothing. I mean the things that you should be afraid of are French fries, and double cheeseburgers, and getting too much sun for skin cancer. Those are the kinds of things that really end up killing us in the modern world.

And what happens when our normal fears kick into overdrive? We talk to Nick Pope, formerly of the British Ministry of Defence, who for several years investigated UFO sightings for the government. (Some files from the Ministry’s UFO department have recently been made available at the British National Archives.) Since leaving government, Pope has been accused of being part of an elaborate government cover-up. He talks about the futility of trying to change a conspiracy theorist’s mind.

Climate change, hominid ancestors, UFO cover-ups, and smart people making bad decisions: all that and more in this week’s podcast.

Audio Transcript



Stephen GREENSPAN: Yes, life was pretty good until I got a phone call from my broker.

Stephen J. DUBNER: That’s Stephen Greenspan. He’s an emeritus professor of psychology at the University of Connecticut.

GREENSPAN: Hi Katherine.


DUBNER: And this is Katherine Wells. She’s one of the producers on our show. Hi Katherine.

Katherine WELLS: Hi Stephen.

DUBNER: So you are here with a story for us, yes?

WELLS: Right. A story about Stephen Greenspan. He has an interesting specialty: he’s an expert in what he calls “social incompetence.”

DUBNER: I have some of that.

WELLS: Which, you know, we all feel. What he means is he studies why people do dumb things.

DUBNER: Presumably that means … why smart people do dumb things?

WELLS: Right, that included.

DUBNER: And when he told you there that “life was pretty good,” what did he mean? What was so good exactly?

WELLS: Well, it was December, 2008, and he had a book coming out called Annals of Gullibility. The other thing that seemed pretty good was his financial situation. About a year earlier, he had invested in this hedge fund and it was doing pretty well, so he was getting nicely set up for retirement too. So one day in December he gets the first pre-release copy of the book, the gullibility book. And two days later, his broker calls.

GREENSPAN: I said, how are you? He said, terrible, it’s the worst day of my life. Now this is a man who had lost a son, so when he said it’s the worst day of my life that got my attention. And I said why? He said well Bernard Madoff just admitted that he was running a Ponzi scheme. And I responded, who is Bernard Madoff, and what’s it have to do with me?

DUBNER: Uh-oh. Katherine, I think we can kind of smell where this is headed.

WELLS: Right. This fantastic hedge fund that Greenspan had invested in turned out to be a feeder for Madoff’s Ponzi scheme. And Greenspan had no idea -- he didn’t remember ever even having heard Madoff’s name.

DUBNER: Oh, man. So the gullibility expert has been gulled.

WELLS: Right, gulled in a big, ironic way. He lost four hundred thousand dollars. Now, this was just about a third of his savings, so it wasn’t the total end of the world. And he should get some money back eventually from settlements. But he’s 70 now, he has two college-aged kids, and he’d really hoped to be retired by now. And, you know, he certainly didn’t want to be remembered in this way...

GREENSPAN: There was a columnist, a financial columnist in Canada who in his blog wrote: the first Greenspan, Alan, will be remembered as the economist who didn’t see it coming, while the other Greenspan, Stephen, will be remembered as the psychologist who forgot to read his own book on gullibility.

WELLS: I mean, it’s ironic, because Greenspan’s own research shows how even the smartest people can be duped.

GREENSPAN: I mean, a good example of that would be Sir Isaac Newton, the greatest scientist of all time, who lost over a million dollars -- in modern dollars -- in the South Sea bubble. And so he wrote, “I can calculate the orbit of heavenly bodies, but I cannot fathom the madness of men.”


WELLS: In reference to losing the money?


GREENSPAN: In reference to his own foolishness in putting all of his fortune at risk in something that he wasn’t really, in spite of his incredible brilliance, able to really understand or adequately calculate the risk of.  


WELLS: So in a way, you joined an elite club of brilliant, informed, educated people who can be fooled.


GREENSPAN: I joined the human race basically. Like Sir Isaac Newton and the South Sea Bubble, I knew nothing about Madoff and just basically went along with the crowd. And that’s powerful. We tend to take our cues from other people, especially in situations where we don’t quite know what to do. 

DUBNER: So it may no longer surprise us to learn that smart people sometimes make dumb decisions.

WELLS: Right, it’s like Greenspan says. It’s the instinct to “go along with the crowd” and to “take our cues from other people.” And that’s really what today’s show is about.

DUBNER: Right. And I want to talk about something else Greenspan mentioned, an even more elemental issue: how we make decisions about a risk that we just aren’t equipped to calculate. But here’s the thing: if it can’t be calculated, then maybe it’s not exactly a risk. About 100 years ago, the economist Frank Knight argued that risk and uncertainty are nearly identical but for one key difference. Risk can be measured; uncertainty, by its nature, cannot. But … what happens when you can’t tell the two of them apart?
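Knight’s distinction can be made concrete with a toy calculation (an illustrative sketch of my own, not anything from the episode; the numbers are made up). Under risk, the probabilities are known, so an expected loss is computable; under uncertainty, the probability itself is unknown, and the answer swings wildly depending on what you assume.

```python
def expected_loss(outcomes):
    """Knightian *risk*: each outcome's probability is known,
    so the expected loss is just a probability-weighted sum."""
    return sum(p * loss for p, loss in outcomes)

# Risk: a bet with known odds -- a 1% chance of losing $10,000.
risky_bet = [(0.01, 10_000), (0.99, 0)]
print(expected_loss(risky_bet))

# Uncertainty: the true probability is unknown. The best we can do
# is bracket it -- and watch the "answer" vary by orders of magnitude.
for p in (0.001, 0.01, 0.5):
    print(p, expected_loss([(p, 10_000), (1 - p, 0)]))
```

The point of the sketch: when the probability is measurable, the calculation is routine; when it isn’t, no amount of arithmetic rescues you, which is exactly the situation Greenspan (and Newton) faced.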


ANNOUNCER: From WNYC and APM, American Public Media: This is FREAKONOMICS RADIO.  Today: The Truth Is Out There … Isn’t It?  Here’s your host, Stephen Dubner.

DUBNER: So Stephen Greenspan, the gullibility expert, loses a third of his life savings in what turns out to be a Ponzi scheme. Now, even if you feel sympathetic toward him, you might say, Hey, you know, he’s just one person. Bad things happen to people every day. At least the world didn’t end. But what if we were worried about something that might end the world? No, I’m not talking about an attack by alien nations – not yet, at least. That’ll come later in the program. I’m talking about…climate change. How are people like you and me supposed to calculate the threats from something like climate change? There’s so much complexity, so much uncertainty. So most of us do what Stephen Greenspan did when he was looking to invest. We take our cues from other people…

Al GORE: It’s not a question of debate. It’s like gravity. It exists.


Rush LIMBAUGH: The reason that you know you’re right is that you know things they don’t know. And because they don’t even have that baseline of knowledge to chat with you, they can’t understand where you’re coming from.  And that’s exactly how I feel talking to people who believe this global warming crap.


ABC WORLD NEWS: The science is solid, according to a vast majority of researchers, with hotter temperatures, melting glaciers, and rising sea level providing the proof.


Glenn BECK: When the University of Madison Wisconsin comes out with their definitive study, do I believe that? No! Do I believe scientists? No! They’ve lied to us about global warming. Who do you believe?

DUBNER: Who do you believe? That was Glenn Beck, by the way. Before him, from the top, you heard Al Gore and then Rush Limbaugh and an ABC World News report. When it comes to something like climate change, as fraught as it is with risk and uncertainty – and emotion! – who do you believe? And, more important, why?

Ellen PETERS: You know, my personal perception is that I don’t know enough about it, believe it or not. This is an issue that I think…


DUBNER: Wait, could you just say that again so that everyone in the world can hear an honest response? It’s so rare for some version of I’m not quite sure or I don’t know. So, sorry, say it again and then proceed.


PETERS: What I was saying, I’m not sure exactly what I believe on it in terms of the risk perceptions of climate change. It’s something that I don’t think I am personally educated on enough to have a really firm opinion about that.

DUBNER: That was Ellen Peters. She teaches in the psychology department at Ohio State University. She is part of a research group called the Cultural Cognition Project. They look at how the public feels about certain hot-button issues – like nuclear power and gun control – and then they try to figure out how much those views are shaped by cultural values. That is, not by empirical evidence, but by what they call “cultural cognition.” So, they recently did a study on climate change. How was it, they wanted to know, that the vast majority of scientists think the Earth is getting warmer because of human activity, but only about half the general public thinks the same? Could it be, perhaps … that people just don’t trust scientists? Here’s Dan Kahan. He’s another Cultural Cognition researcher and a professor at Yale Law School.

Dan KAHAN: Well, in fact, the scientists are the most trusted people in our society. The Pew Foundation does research on this, and this has been a consistent finding over time.


DUBNER: OK, so there goes that theory. That explanation won’t work for us then.


KAHAN:  Correct.

DUBNER: All right, so maybe people just don’t understand the science. Surveys have found that fewer than 30% of Americans are scientifically literate. Ellen Peters again:

PETERS: People have the belief that the reason that people don’t believe the risks of climate change are high enough is because they’re not smart enough, they’re not educated enough, they don’t understand the facts like the scientists do. And we were really interested in that idea and whether that’s really what was going on, or whether something else might matter.

DUBNER: So Peters and Kahan started out their climate-change study by testing people on their scientific literacy and numeracy, how well they knew math.

PETERS: And the items are things like: it is the father’s gene that decides whether the baby is a boy or a girl, true or false?



PETERS: So fairly simple.


DUBNER: Is it true?


PETERS: You know, I’m actually not even positive on that one. I think it’s the comb…Oh, no it has to be the father’s gene.


DUBNER: I’m putting my money on father, true.


PETERS: Father is true there, absolutely. Second question, antibiotics kill viruses as well as bacteria, true or false?


DUBNER: Negative.


PETERS: That one is absolutely false.

DUBNER: You can see why they wanted to know how people did on these questions before asking them about climate change.

PETERS: Numeracy in general, what it should do is it should help you to better understand information first of all. And that kind of comprehension is sort of a basic building block for good decisions across a variety of domains.


DUBNER: Right.


PETERS: But numeracy should also do other things. It should also help you just simply process the information more systematically. It should, in general, help you to get to better decisions that are more in line with the facts.


DUBNER: All right, so that makes perfect sense, but you have found something that kind of flies in the face of that haven’t you?


PETERS: We have. It’s the idea that people who are highly numerate and highly scientifically literate, they seem to actually rely on preexisting beliefs, on these sort of underlying cultural cognitions they have about how the world should be structured more than people who are less scientifically literate, or less numerate.


DUBNER: So, if I wanted to be wildly reductive, I might say the more education a culture gets, the more likely we are to have intense polarization at least among the educated classes, is that right?


PETERS: Based on our data, that’s what it looks like. It’s so interesting and so disturbing at the same time.

DUBNER: It is interesting, isn’t it? I mean, Peters and Kahan found that high scientific literacy and numeracy were not correlated with a greater fear of climate change. Instead, the more you knew, the more likely you were to hold an extreme view in one direction or the other -- that is, to be either very, very worried about the risks of climate change or to be almost not worried at all. In this case, more knowledge led to … more extremism! Why on earth would that be? Dan Kahan has a theory. He thinks that our individual beliefs on hot-button issues like this have less to do with what we know than with who we know.

KAHAN: My activities as a consumer, my activities as a voter, they’re just not consequential enough to count. But my views on climate change will have an impact on me in my life. If I go out of the studio here over to campus at Yale, and I start telling people that climate change is a hoax – these are colleagues of mine, the people in my community—that’s going to have an impact on me; they’re going to form a certain kind of view of me because of the significance of climate change in our society, probably a negative one. Now, if I live, I don’t know, in Sarah Palin’s Alaska, or something, and I take the position that climate change is real, and I start saying that, I could have the same problem. My life won’t go as well. People who are science literate are even better at figuring that out, even better at finding information that’s going to help them form, maintain a view that’s consistent with the one that’s dominant within their cultural group.  


DUBNER: So you’re saying that if I believe that climate change is a very serious issue and I want to align my life with that belief, that it’s actually more important that I align my life with that belief not because of anything I can do, but because it helps me fit in better in my circle, there’s more currency to my belief there. What about you? You’re in New Haven, Connecticut, at Yale. I gather you haven’t walked into a classroom and publicly declared that you believe climate change or global warming is a hoax, have you?


KAHAN: No, I haven’t done that.

DUBNER: This makes sense, doesn’t it? But it’s also humbling. We like to think that we make up our minds about important issues based on our rational, unbiased assessment of the available facts. But the evidence assembled by Kahan and Peters shows that our beliefs, even about something as scientifically oriented as climate change, are driven by a psychological need to fit in. And so we create strategies for doing this. Here’s my Freakonomics friend and co-author Steve Levitt.

Steve LEVITT: I think one of the issues with information gathering is that when people go to the trouble to learn about a topic, they tend not to learn about a topic in an open-minded way. They tend to seek out exactly those sources which will confirm what they’d like to believe in the first place. And so the more you learn about a topic, you tend to learn in a very particular way that tends to reinforce what you believe before you ever started.

DUBNER: Aha. So if you’re already scared of something, you tend to read more about how scary it is. And if you’re not worried -- then you don’t worry … right?

LEVITT: So if there’s one thing that human beings are terrible at, it’s assessing risk and knowing what to really fear versus the things we actually do fear. And the kind of things that tend to scare us are things that evolution has bred into us. So, my wife is terrified of snakes, mice, flies, you know, butterflies, everything small that flies or that runs she’s terrified of. What are the chances that any of those are going to do her any harm in the modern world? Virtually nothing. I mean the things that you should be afraid of are French fries, and double cheeseburgers, and getting too much sun for skin cancer. Those are the kinds of things that really end up killing us in the modern world.

DUBNER: Coming up: since we’re so bad at figuring out what’s really dangerous, let’s bring in the professionals, shall we?

Michael SHERMER:  I’m Mr. Skeptic. Anything that can be analyzed critically and skeptically, that’s what we do.

DUBNER: And: a cautionary tale about siding with the conspiracy theorists.

Nick POPE: I think somebody actually thought I was an alien myself.


ANNOUNCER: From WNYC and APM, American Public Media: This is FREAKONOMICS RADIO. Here’s your host, Stephen Dubner.

DUBNER: So as Steve Levitt sees it, we seek out information that confirms our preexisting biases, and we are congenitally bad at assessing risk. So how are people supposed to figure out what to be afraid of? Here’s Levitt again.

LEVITT: To know what to be afraid of, you need to go through an in-depth data collection process, you need to be properly informed. And people are too busy, rightfully too busy, leading their lives instead of dwelling on what the exact, almost infinitesimal probability is that any particular thing will kill them. So it’s sensible for people to be uninformed and it’s sensible to rely on the media. It just turns out that the media is not a very good source of information.

DUBNER: If you really wanted to make sure that every one of your beliefs was worth holding, you’d have to spend so much time gathering primary data that you’d have no time for anything else in life. You’d practically have to become a professional skeptic. And that’s not a job … is it?

SHERMER:  Uh, yeah, I’m Mr. Skeptic. Anything that can be analyzed critically and skeptically, that’s what we do. So, anything from UFOs and alien abductions to Bigfoot and conspiracy theories all the way up to things like, global warming and climate change and autism and vaccinations. We cover it all.

DUBNER: Michael Shermer, a professor at Claremont Graduate University, has a master’s degree in experimental psychology and a Ph.D. in the history of science. He’s also the publisher of Skeptic magazine and he writes books. His latest is called The Believing Brain.

DUBNER:  Now, as a professional skeptic, I’m guessing a lot of people look at you, or hear about a guy like you or read a book by you and think, oh man, that’s, like, the dream job. You know, people think, well, I’m a skeptic, I don’t believe anything. So, what do you have to do to be you, Michael?


SHERMER:  Haha.  Well, we actually do believe all sorts of things. You have to have all sorts of beliefs just to, just to get out of bed in the morning, and so, the question then becomes, well, which of your host of beliefs are the ones that are really supported by evidence, or are questionable, or are probably not true, and which are those that we base on instinct and intuition, and which are we basing on, you know, solid evidence, and so, that’s where the rubber meets the road, is, is not, do you believe something or not—of course, we all believe all sorts of things.  The question is, are they true?  And what’s the evidence? What’s the quality of the evidence?


DUBNER: Talk to me about how we end up believing what we believe in. I was going say, how we choose to believe what we believe in, but it sounds like it’s not really a choice, right?


SHERMER:  It isn’t really a choice, no. Our brains are designed by evolution to constantly be forming connections, patterns, learning things about the environment. And all animals do it. You think A is connected to B and sometimes it is, sometimes it isn’t, but we just assume it is. So my thought experiment is, imagine you’re a hominid on the plains of Africa, three and a half million years ago. Your name is Lucy. And you hear a rustle in the grass. Is it a dangerous predator, or is it just the wind? Well, if you think that the rustle in the grass is a dangerous predator and it turns out it’s just the wind, you’ve made a Type 1 error in cognition – a false positive. You thought A was connected to B, but it wasn’t. But no big deal. That’s a low-cost error to make. You just become a little more cautious and vigilant, but that’s it. On the other hand, if you think the rustle in the grass is just the wind, and it turns out it’s a dangerous predator, you’re lunch. Congratulations, you’ve just been given a Darwin award for taking yourself out of the gene pool before reproducing. So we are the descendants of those who were most likely to find patterns that are real. We tend to just believe all rustles in the grass are dangerous predators, just in case they are. And so, that’s the basis of superstition and magical thinking.
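Shermer’s rustle-in-the-grass story is really a claim about asymmetric error costs, and it can be checked with a little arithmetic (a toy sketch with assumed numbers, not data from the show): even when predators are rare, fleeing every rustle beats ignoring them, because the cost of one missed predator dwarfs the cost of many false alarms.

```python
def expected_cost(p_predator, cost_false_alarm, cost_missed_predator):
    """Expected cost of two policies when a rustle might be a predator.

    'Always flee' pays the false-alarm cost whenever it's just the wind
    (a Type 1 error); 'ignore it' pays the huge cost whenever it's
    really a predator (a Type 2 error).
    """
    always_flee = (1 - p_predator) * cost_false_alarm
    ignore = p_predator * cost_missed_predator
    return always_flee, ignore

# Assumed toy numbers: predators show up 1% of the time, a false alarm
# costs 1 unit of wasted effort, getting eaten costs 1,000 units.
flee, ignore = expected_cost(0.01, 1, 1000)
print(flee, ignore)  # fleeing every rustle (~0.99) beats ignoring it (~10)
```

Under these assumptions, the jumpy hominid wins by an order of magnitude, which is Shermer’s point: natural selection favors over-detecting patterns, and superstition rides along for free.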


DUBNER: But then we get to something like climate change, which is, theoretically, an arena bounded entirely by science, right?


SHERMER:  You would think so.


DUBNER:  Yeah, you would think so.  So what do we find, actually?


SHERMER:  Either the earth is getting warmer or it’s not, right?


DUBNER:  Yeah.


SHERMER:  I mean, it’s just a data question. Well, because it also has ideological baggage connected to it, you know, left wing versus right wing politics, and so the data goes out the window.  It’s like, I don’t know—whatever the data is, I don’t know, but I’m going to be against it.  Now, I can’t just say, oh, I’m against it because my party is, or I just do what other people tell me. Nobody says that. What you do is, you make the decision, “I’m skeptical of that” or “I don’t believe it,” and then you have to have arguments.  So then you go in search of the arguments.


DUBNER: It doesn’t sound like it surprises you at all, then, that education—level of education -- doesn’t necessarily have a big impact on whether you’re pro- or con-something. Correct?


SHERMER:  That’s right, it doesn’t. And giving smart people more information doesn’t help. It actually just confuses things. It just gives them more opportunity to pick out the ones that support what they already believe. So, being educated and intelligent, you’re even better at picking out the confirming data to support your beliefs after the fact.


DUBNER:  Let’s talk now for a bit about conspiracy theories, which we’re nibbling around the edges of. How would you describe, if you can generalize, the type of person who’s most likely to engage in a conspiracy theory that’s not true?


SHERMER:  Well, their pattern-seeking, their pattern-seeking module is just wide open. The net, you know, is indiscriminate. They think everything’s a pattern. If you think everything’s a pattern, then you’re, you’re kind of a nut.




POPE: I suppose I’m best known for having had a job at the government where my duties were investigating UFOs.

DUBNER: That’s Nick Pope. Until 2006, he worked for the British Ministry of Defence. And in the early 90’s, he headed up the office that handled reports of UFO sightings.

POPE: Flying saucer sightings, as they were called then.

DUBNER: His job was to figure out if any of these sightings had merit, and if perhaps there were extraterrestrial visitors.

POPE: To satisfy ourselves that there was no threat to the defense of the UK.

DUBNER: Pope came into the job as a skeptic. But some UFO reports, especially from pilots and police officers, got him wondering if perhaps we were being visited by aliens. Now, mind you, there was no hardcore confirmatory evidence. But Pope started talking -- in the media -- about the possibilities.

KRQE: You say you believe with 99% certainty that we’re not alone. So tell us what you’ve discovered.

POPE: Well I think it’s inconceivable in this infinite universe that we’re alone. And then that begs the question, if we’re not alone, are we being visited? It’s a related question.

POPE: When I started speaking out on this issue, I think some people in the UFO community thought that I might be some sort of standard-bearer for them.


DUBNER: Meaning one of them?


POPE: Yes, absolutely, that I could be a spokesperson for the movement. Of course I had the huge advantage that whilst everyone else had done this as a hobby, I’d done it as a job.


DUBNER: Did that make you a bit of a hero in the UFO community?


POPE: It did, and a lot of people still hold that view. They want me to come out and say, yes it’s all real and yes, I was part of a cover up. Their fantasy is what they call Disclosure with a capital “D”, as if there’s going to be some magical parting of the curtains and a moment where a crashed spaceship is revealed for all the world to see. Because I say, you know what, I don’t think that spaceship exists. So, in a sense I manage to upset everyone. I go too far for a lot of the skeptics by being open to the possibility, but I don’t go far enough for the believers, particularly the conspiracy theorists. And I get called things like “shill” and that’s one of the more polite things that I’ve been called.


DUBNER: Yeah, I’ve looked at some of the comments on YouTube from a speech you gave. I’ll read you a bit of it. We’ll have to employ our bleeping technician later. “Nick Pope, what a f****** spastic. He works, he quote, ‘works’ for the government, why else is he constantly on every bloody UFO program on every f****** channel. He talks enough bull**** to keep the UFO nutters happy while never actually saying anything of importance.” Let’s unpack that one a little bit, shall we Mr. Pope?


POPE: Yes.


DUBNER: It says you quote, “work for the government.” Do you still work for the government?


POPE: No, I don’t. This is in itself one of the great conspiracy theories that in 2006 I didn’t really leave. I just went under deep cover, and that they’re passing me wads of bank notes in a brown paper bag or something.


DUBNER: But here’s my favorite. There’s one claim on a UFO blog that you, Nick Pope, have been abducted by aliens yourself and now lie about it.


POPE: Well, yes I’ve heard that one. I’ve even seen one, which I think you might have missed. I think somebody actually thought I was an alien myself.


DUBNER: That would explain a lot wouldn’t it?


DUBNER: Nick Pope discovered a sad truth. The more transparent he tried to be -- the more information he released about himself and his work -- the more worked-up his attackers became. They took facts that would plainly seem to work in his favor and they somehow made these facts fit their conspiracies instead. But before we judge, consider how good we all are at deciding first what we want to believe, and then finding evidence for it. So what’s the solution? What can we do to keep ourselves headed down the road, albeit slowly and clumsily, toward a more rational, reasoned civilization? Here’s Ellen Peters again, from the Cultural Cognition Project.

DUBNER: So, I guess, the depressing conclusion one might reach from hearing you speak is that ideology trumps rationalism?


PETERS: I think that we are seeing some evidence for that in this study, but I don’t think that that has to be the final answer. I think that policy makers, communicators need to start paying attention to some of these cues that deepen cultural polarization. So for example, telling the other side that they’re scientifically inept? Probably a bad idea. Probably not the best way to continue people coming together on what the basic science really does say. Or, coming up only with solutions that are antagonistic to one side. And you know it if you’re listening to them that those are just antagonistic solutions -- again, probably not the best idea. It’s a sign or a signal that we’re not listening maybe as well to beliefs on the other side.

DUBNER: Dan Kahan agrees that, whatever the solution, none of us are able to go it alone.

KAHAN: What’s clear is that our ability to acquire knowledge is linked up with our ability to figure out whom to trust about what. And ordinary people have to do that in making sense of the kinds of challenges that they face. But, the amount that we know far exceeds the amount that any one of us is able to establish through our own efforts. Maybe you know that the motto for the Royal Society is Nullius in Verba, which means “Don’t take anybody’s word for it.” And it’s kind of admirable and charming, but obviously false.


DUBNER: Not very practical, is it?


KAHAN: Can’t be right. I mean, what would I do? I’d say you know, don’t tell me what Newton said in the Principia, I’m going to try to figure out how gravity works on my own.


DUBNER: And speaking of Isaac Newton -- remember what Stephen Greenspan told us earlier -- how Newton was suckered into this terrible investment? It’s heartening to learn that even Newton, the scientific sage, was able to acknowledge the flaws and shortcomings in his own thinking. And he left behind some advice that might be helpful for us all. He wrote, “to explain all nature is too difficult a task for any one man or even for any one age. 'Tis much better to do a little with certainty, and leave the rest for others that come after you, than to explain all things by conjecture without making sure of any thing.” In other words, don’t get too cocky.

ANNOUNCER: FREAKONOMICS RADIO is produced by WNYC, APM: American Public Media and Dubner Productions. This episode was produced by Katherine Wells. Our staff includes Suzie Lechtenberg, Diana Huynh, Bourree Lam, Collin Campbell and Chris Bannon. Our interns are Ian Chant and Jacob Bastian. David Herman is our engineer. Special thanks to John DeLore. If you want more Freakonomics Radio, you can subscribe to our podcast on iTunes or go to where you’ll find lots of radio, a blog, the books and more.

Comments
  1. Adriel says:

    It does feel like changing minds on personal issues is futile. When I look at trends on support for things like same-sex marriage, it seems as if the only way you can really impact support is to just wait for the stuck generations to die out.

    • James says:

      I don’t think the numbers support that, as the shift on same-sex marriage has happened quite a bit faster than would be expected from just a die-off effect. There’s also a big bulge in the don’t-really-care-much middle who’ve shifted pretty quickly from “Why would anyone want that?” to “OK, if it makes them happy.”

      • Chizom says:

        I agree with James on this one. The “really don’t care” crowd are technically the only people who matter. Though it is true that changing minds on personal issues is damn near impossible, we do not have to wait for people to die off in order for change to come. Think about the allocation of campaign funds for presidential candidates and the overall objective they are trying to achieve. The main focus is always the independent (swing) vote, because those are the votes that win elections. They put less effort into party loyalists because they know “changing minds on personal issues is futile,” so the key is to persuade people who have not made up their minds. How else do you explain the substantial increase in support for issues like gay marriage within the last 5-10 years? The middle crowd just does not care enough to protest, and certainly not enough naysayers died off in the past 5-10 years.

    • callistio7 says:

      Well, isn’t this convenient. So if someone disagrees or doesn’t believe the constant onslaught by the mainstream media, then they are too stupid to realize their own bias toward the “TRUTH”? And of course those people who are the most analytical and logical come to the wrong conclusion more often than those who are ruled by their emotions and take everything they are told as gospel. HA HA HA HA HA
      WHAT A LOAD OF CRAP! Not saying this doesn’t happen. Just question which personality type is most affected by it.

  2. Eric says:

    @Adriel: I think you are right, and that change occurs demographically, i.e., through generational die-off.

  3. Yves says:

    That is truly interesting. I guess the people who are caught in their mindset of conspiracy theories and new-age thinking ought to be, as they say, open-minded about this. But that would conflict with their set ideologies and their desire to be shocked.

    Too bad for them: they think most people are blind and live in a prison. Yet they, too, are somewhat caught in a prison themselves. At least that’s what I’ve come to understand. Does anybody else think this?

  4. abqhudson says:

    I think the discussion on Climate Change missed the point. No one argues that the climate does not change. The discussion should be about the cause of climate change: ice ages, periods of global warming and so on. That is where the leap of faith is taking place, with no hard, credible science to back it up.
    And why should we believe scientists who have to grovel for funds, when getting the funds is the objective?

    • James says:

      This is pure baloney. The science of CO2-related climate change has been well known for over a century, and in fact is far better established than the amount of current change, which depends on measurements & statistics.

      • James Briggs says:

        What is pure baloney? Your answer didn’t address my point. I never doubted CO2-related climate change. For the sake of argument I will agree that “The science of CO2-related climate change has been well known for over a century, and in fact is far better established than the amount of current change, which depends on measurements & statistics” is a metaphysical certainty. CO2-related climate change is as undoubtable as a three-sided triangle.

        My point is and was: does being right justify falsifying the data? Does it justify saying the snow caps of the Himalayas are melting when they aren’t? Does it justify calling anyone who disagrees with you an idiot?

      • James Briggs says:

        I do believe in global warming (based on a man-made increase in CO2) and have for years. I stated my agreement on the internet in the 1980s. Then, as now, I believed in following certain rules.

        1. “Does it justify saying the snow caps of the Himalayas are melting when they aren’t?”

        But they are?
        The fact that the Himalayan Mountains are not melting has nothing to do with global warming. It is because they are growing at nearly three inches a year.

        “U.N. climate chiefs apologize for glacier error” (January 20, 2010, by Matthew Knight, for CNN): The U.N.’s leading panel on climate change has apologized for misleading data published in a 2007 report that warned Himalayan glaciers could melt by 2035. In a statement released Wednesday, the Intergovernmental Panel on Climate Change (IPCC) said estimates relating to the rate of recession of the Himalayan glaciers in its Fourth Assessment Report were “poorly substantiated,” adding that “well-established standards of evidence were not applied properly.”

        This measurement was determined by Boston Museum of Science professor Brad Washburn. However, this figure is bound to go higher over time, as it is estimated that the Himalayan Mountain range, which Mount Everest is a part of, is growing at least 2.64 inches a year.
        Apparently you believe that making false statements helps your case.

        2. “Does it justify calling anyone who disagrees with you an idiot?”

        “Only when their reasons for disagreement are demonstrably idiotic :-)”
        If you think making a joke out of a serious question is a valid way to answer, then be my guest.

        3. “My point is and was: does being right justify falsifying the data?”

        “Does being afraid of what the actual data shows justify claiming that people have falsified that data, when they haven’t?”
        No, it doesn’t. Simply agreeing with someone when they are right doesn’t hurt my case at all. Might I suggest you do the same?

        4. Your unwillingness to concede anything seems to be a pattern among some who support global warming. Perhaps you are afraid? I support the idea of global warming, and I am willing to be honest. I don’t think it makes me special.

      • Doug says:

        While we’re talking about misperceptions of risk, a kilometer of ice over the Thames scares the *cue sound guy* out of me.

        If global warming becomes a credible threat to life on the planet (the Venus scenario), I’m absolutely certain that Gaia Engineering will solve the problem. Apologies to the polar bears in the meantime, as we pretend we have control over the aspirations of all members of the human race. Fiddling while Rome burns is so much fun.

        In contrast, the big volcano under Yellowstone is a much more challenging global life-ending threat. Trust me, I’m a Doctor (of Engineering) *wink*.

  6. James Briggs says:

    As the author said, humans are bad at predicting the odds. If the author is human, then he is included in that statement. There must be a reason that evolution selected for humans who live in the now. Yes, it can be harmful; drug addiction is a good example. But to dismiss the idea is fraught with peril.

    The author seems to believe that because he knows the odds of things happening in the past, he knows the odds of them happening in the future. Human knowledge may double every 72 hours; at the very least, that makes it impossible to predict the odds. Certain things may be bad for you, but how bad no one knows. The threat of death by French fries seems to be based on the increased chances of death from French fries and nothing else. Something may kill you before the French fries do. If you are in a car heading over a cliff, does having one last fry affect your life span? What if you take such good care of yourself that food shouldn’t kill you? Again, there may be advances in medical science in the future that end French fries as a cause of death. It also assumes that death is the only valid value. These ideas also ignore the idea that people have their own futures and expectations. There may be an investment opportunity where one hundred thousand dollars now may be worth far more than a million dollars two years from now.

    I feel sorry for the author’s wife: she knows that in some cases being bitten by an animal is a virtual certainty, but no one can know the odds of a French fry killing you.

  7. James Briggs says:

    Two things made accepting global warming much harder. It was made political from the start. We were told that industry was evil and had to stop, or we would all die from global warming. The proper way was to establish that global warming was happening, then look at the causes of global warming, and only after all that was done propose a remedy. The other problem was the rejection of the scientific method by proponents of global warming. There was an editorial in Scientific American saying fudging the data was justified because the issue was so important. Fudging the data is never justified. Then there was the name-calling. Anyone who says a physicist can’t understand the data has disqualified themselves from the debate.

    • James says:

      “The proper way was to establish that global warming was happening. Then look at the causes of global warming.”

      Which shows a pretty shaky understanding of the science, or even of common sense. We have science, worked out in considerable detail, with lots of experimental evidence, that shows that adding a lot of CO2 to the atmosphere will cause bad things to happen. Why should we have to wait until those things have happened before trying to stop them from happening? Isn’t that like bailing out of a plane, but not opening your parachute until after you hit the ground?

      • James Briggs says:

        Clearly you are putting words in my mouth. I wrote:

        “The proper way was to establish that global warming was happening. Then look at the causes of global warming.”

        And you replied: “Which shows a pretty shaky understanding of the science, or even of common sense. We have science, worked out in considerable detail, with lots of experimental evidence, that shows that adding a lot of CO2 to the atmosphere will cause bad things to happen. Why should we have to wait until those things have happened before trying to stop them from happening? Isn’t that like bailing out of a plane, but not opening your parachute until after you hit the ground?”

        You are saying that hypotheses should not be tested: you know the truth, to hell with the facts. This is the scientific method, and it has served mankind well:

        1. Use your experience: Consider the problem and try to make sense of it. Look for previous explanations. If this is a new problem to you, then move to step 2.
        2. Form a conjecture: When nothing else is yet known, try to state an explanation, to someone else, or to your notebook.
        3. Deduce a prediction from that explanation: If you assume 2 is true, what consequences follow?
        4. Test: Look for the opposite of each consequence in order to disprove 2. It is a logical error to seek 3 directly as proof of 2. This error is called affirming the consequent.

        You are saying we should not look at the data with regard to an increase in temperature. Do you think that we should just do whatever you tell us to do, without evidence?

        It turns out that there has been an increase in global temperature, and it did make sense to check. Global surface temperatures have increased about 0.74°C (plus or minus 0.18°C) since the late 19th century, and the linear trend for the past 50 years of 0.13°C (plus or minus 0.03°C) per decade is nearly twice that for the past 100 years.

        So the condition has been satisfied.

        Did you believe that the temperature would shoot up suddenly and kill everyone? Did you think it would change as rapidly as hitting the ground?

      • Doug says:

        “Why should we have to wait until those things have happened before trying to stop them from happening?”

        Your cost-vs-benefit analysis is not flawed. It is non-existent.

        Two reasons:
        1. Potential misallocation of resources. Every unit of currency and word spent on global warming could be spent on, for example, education.
        2. It may be ineffective anyway. We do not control the aspirations or actions of anyone on the planet other than ourselves. Given the current global resource allocation, it is my opinion that global warming will not dominate economic policy in the Second and Third World until long after we discover we did not have a parachute. Hubris.

        I perceive significant risk that a great deal of resources will have been applied to a solution that will not work for either scientific or human reasons. With those resources we could have uplifted an entire generation. Alternatively, we could use a fraction of those resources to find a real solution that will work in the real world that contains real people and we could, quite literally, choose our global mean temperature.

  8. Dyske says:

    If I understood this correctly, it means that the more educated you are, the more likely that you would care about getting along with others. This makes sense because approval and respect from others is what leads to “success” in life, and what would be the point of doing/saying the right thing if it does not lead to success? That is, the smarter we become, the more we understand the pointlessness of being right/correct for its own sake. “Success” is ultimately what we are after. This is even true of academics, artists, scientists, philosophers, or economists. Doing/saying the right thing is only a means to achieve success.

    Success is not an absolute measure that exists outside of human values/perceptions/subjectivity. So, we are not striving for anything rational or objective. I’m sure there are millions of people in history who said and did the right things but were never recognized as such, and are now completely forgotten. Who would want to be one of them?

    If you actually believe in truth as its own virtue, picking on “conspiracy theorists” is a cowardly act. Anyone could do that with no risk. We need more people who are skeptical of the mainstream, widely accepted ideas. For instance, we needed more people who were skeptical of the housing bubble. But what happens is that when you become skeptical of widely accepted ideas, you run the risk of being labeled a “conspiracy theorist” yourself.

    • Dyske says:

      I was just thinking that my original comment was a bit confusing/confused. It’s not that people want to “get along” to succeed. We in fact cannot become successful simply by “getting along.” “Getting along” has no power structure. To succeed, you need to belong to a power structure, like Harvard University, New York Times, or even your immediate network of colleagues and friends who are willing to help support your views and values. Once you commit yourself to such a power structure, you lose independence of mind. You have to work within the rules and limitations of that structure. You cannot put truth above everything else; truth only serves to promote your position in the structure, and if it does not, then you discard or distort it. Your ultimate objective is not the truth but power and success.

      This, I do not think, is a result of education. I think people who seek education do so out of their desire to succeed in our society so even before they receive education they are biased towards serving their own success, not the truth. Naturally, this will lead to defending your own success as well as defending the structure to which you belong at any cost. The debate, therefore, is no longer about the truth.

      I believe this is why Wittgenstein refused to belong to any political organizations. He felt that to become a true philosopher, one needs to reserve the right, or the possibility, to change one’s mind. Once you commit yourself to a structure, this becomes difficult to do.

      • James Briggs says:

        Being part of a group affects your choices. You have to support the group if you want the group to support you. Think about the Penn Staters cheering Joe Paterno, and how it looked like they were cheering the molestation of little boys.

        Then there is identity politics. If you think of yourself as an intellectual, doesn’t that mean you have to believe in global warming? If you are a cowboy, how are you going to feel about intellectuals? If you think you are smarter than everyone else, doesn’t that bias you against democracy? Even with me: I tend to feel alienated from all groups, so I tend to pick a position that is different from everyone else’s.

      • Dyske says:

        Good points, James. Being compelled to seek truth is a curse because truth and success are not correlated. (In many cases, they are inversely correlated.) If you keep seeking truth, leaving yourself open to change your mind as soon as you learn any contradicting evidence, you can’t climb any power structure, leaving you poor and unsuccessful. To survive in this world, we must be biased. We can’t just serve truth.

      • James Briggs says:

        It is true that I am poor and unsuccessful. But I seek truth for a selfish reason: it turns out that, for me, sanity and the search for truth are highly correlated.
