The Truth Is Out There…Isn’t It? A New Freakonomics Radio Podcast


Our latest Freakonomics Radio podcast is called “The Truth Is Out There…Isn’t It?” (You can download/subscribe at iTunes, get the RSS feed, listen live via the media player above, or read the transcript below.) In it, we try to answer a few fundamental questions: How do we know that what we believe is true? How do we decide which information to trust? And how do we quantify risk — from climate change to personal investments?

The program begins with Stephen Greenspan, a psychologist and an expert on “social incompetence” and gullibility. He knows from personal experience that even the smartest people can be duped into bad risk assessments, especially on the advice of people they trust. You can read more about him here (spoiler alert!).

We also talk with Dan Kahan of Yale Law School and Ellen Peters of Ohio State University, both of whom belong to the Cultural Cognition Project, a scholarly group focused on “how cultural values shape public risk perceptions.” We blogged earlier about their interesting finding: 

Greater scientific literacy and numeracy were associated with greater cultural polarization: respondents predisposed by their values to dismiss climate change evidence became more dismissive, and those predisposed by their values to credit such evidence more concerned, as science literacy and numeracy increased.

The authors hypothesize that people who are more numerate and scientifically literate are better at gathering information that confirms their existing beliefs. Kahan believes this happens, in part, for a pretty basic reason: we just want to fit in with our friends. So we work to maintain viewpoints that fall in line with our social group.

You’ll hear from professional skeptic Michael Shermer, who explains the evolutionary basis of funky risk-assessment practices. It all goes back to our hominid ancestors, he says, who needed to be on high alert to protect against predators.

Steve Levitt also chimes in:

If there’s one thing that human beings are terrible at, it’s assessing risk and knowing what to really fear versus the things we actually do fear. And the kind of things that tend to scare us are the things that evolution has bred into us. So, my wife is terrified of snakes, mice, flies, you know, butterflies, everything small that flies or that runs she’s terrified of. What are the chances that any of those are going to do her any harm in the modern world? Virtually nothing. I mean the things that you should be afraid of are French fries, and double cheeseburgers, and getting too much sun for skin cancer. Those are the kinds of things that really end up killing us in the modern world.

And what happens when our normal fears kick into overdrive? We talk to Nick Pope, formerly of the British Ministry of Defence, who for several years investigated UFO sightings for the government. (Some files from the Ministry’s UFO department have recently been made available at the British National Archives.) Since leaving government, Pope has been accused of being part of an elaborate government cover-up. He talks about the futility of trying to change a conspiracy theorist’s mind.

Climate change, hominid ancestors, UFO cover-ups, and smart people making bad decisions: all that and more in this week’s podcast.

Audio Transcript

 

 [CASH REGISTER]

Stephen GREENSPAN: Yes, life was pretty good until I got a phone call from my broker.


Stephen J. DUBNER: That’s Stephen Greenspan. He’s an emeritus professor of psychology at the University of Connecticut.

GREENSPAN: Hi Katherine.
   

WELLS: Hey.


DUBNER: And this is Katherine Wells. She’s one of the producers on our show. Hi Katherine.

Katherine WELLS: Hi Stephen.

DUBNER: So you are here with a story for us, yes?

WELLS: Right. A story about Stephen Greenspan. He has an interesting specialty: he’s an expert in what he calls “social incompetence.”

DUBNER: I have some of that.

WELLS: Which, you know, we all feel. What he means is he studies why people do dumb things.

DUBNER: Presumably that means … why smart people do dumb things?

WELLS: Right, that included.

DUBNER: And when he told you there that “life was pretty good,” what did he mean? What was so good exactly?

WELLS: Well, it was December, 2008, and he had a book coming out called Annals of Gullibility. The other thing that seemed pretty good was his financial situation. About a year earlier, he had invested in this hedge fund and it was doing pretty well, so he was getting nicely set up for retirement too. So one day in December he gets the first pre-release copy of the book, the gullibility book. And two days later, his broker calls.

GREENSPAN: I said, how are you? He said, terrible, it’s the worst day of my life. Now this is a man who had lost a son, so when he said it’s the worst day of my life that got my attention. And I said why? He said well Bernard Madoff just admitted that he was running a Ponzi scheme. And I responded, who is Bernard Madoff, and what’s it have to do with me?


DUBNER: Uh-oh. Katherine, I think we can kind of smell where this is headed.

WELLS: Right. This fantastic hedge fund that Greenspan had invested in turned out to be a feeder for Madoff’s Ponzi scheme. And Greenspan had no idea -- he didn’t remember ever even having heard Madoff’s name.

DUBNER: Oh, man. So the gullibility expert has been gulled.

WELLS: Right, gulled in a big, ironic way. He lost four hundred thousand dollars. Now, this was just about a third of his savings, so it wasn’t the total end of the world. And he should get some money back eventually from settlements. But he’s 70 now, he has two college-aged kids, and he’d really hoped to be retired by now. And, you know, he certainly didn’t want to be remembered in this way...

GREENSPAN: There was a columnist, a financial columnist in Canada who in his blog wrote: the first Greenspan, Alan, will be remembered as the economist who didn’t see it coming, while the other Greenspan, Stephen, will be remembered as the psychologist who forgot to read his own book on gullibility.


WELLS: I mean, it’s ironic, because Greenspan’s own research shows how even the smartest people can be duped.

GREENSPAN: I mean, a good example of that would be Sir Isaac Newton, the greatest scientist of all time, who lost over a million dollars -- in modern dollars -- in the South Sea bubble. And so he wrote, “I can calculate the orbit of heavenly bodies, but I cannot fathom the madness of men.”

 

WELLS: In reference to losing the money?

 

GREENSPAN: In reference to his own foolishness in putting all of his fortune at risk in something that he wasn’t really, in spite of his incredible brilliance, able to really understand or adequately calculate the risk of.  

 

WELLS: So in a way, you joined an elite club of brilliant, informed, educated people who can be fooled.

 

GREENSPAN: I joined the human race basically. Like Sir Isaac Newton and the South Sea Bubble, I knew nothing about Madoff and just basically went along with the crowd. And that’s powerful. We tend to take our cues from other people, especially in situations where we don’t quite know what to do. 


DUBNER: So it may no longer surprise us to learn that smart people sometimes make dumb decisions.

WELLS: Right, it’s like Greenspan says. It’s the instinct to “go along with the crowd” and to “take our cues from other people.” And that’s really what today’s show is about.

DUBNER: Right. And I want to talk about something else Greenspan mentioned, an even more elemental issue: how we make decisions about a risk that we just aren’t equipped to calculate. But here’s the thing: if it can’t be calculated, then maybe it’s not exactly a risk. About 100 years ago, the economist Frank Knight argued that risk and uncertainty are nearly identical but for one key difference. Risk can be measured; uncertainty, by its nature, cannot. But … what happens when you can’t tell the two of them apart?
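
A rough way to see Knight’s distinction: when the probabilities are known, an expected loss can actually be computed; when they aren’t, the same calculation has nothing to plug into it. Here is a minimal sketch of that difference in Python -- the dollar figures and probabilities are invented purely for illustration.

```python
# Knight's risk vs. uncertainty, illustrated (all numbers invented).
# Risk: the probability of each outcome is known, so an expected loss
# is a straightforward calculation.
known_outcomes = {        # loss in dollars -> probability
    0: 0.90,              # nothing bad happens
    10_000: 0.09,         # moderate loss
    100_000: 0.01,        # severe loss
}
expected_loss = sum(loss * p for loss, p in known_outcomes.items())
print(f"Expected loss under risk: ${expected_loss:,.0f}")  # $1,900

# Uncertainty: the outcomes may be imaginable, but their probabilities
# are not known, so there is nothing to multiply -- any "expected loss"
# here would just be a guess dressed up as a number.
unknown_outcomes = {0: None, 10_000: None, 100_000: None}
```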

[THEME]

ANNOUNCER: From WNYC and APM, American Public Media: This is FREAKONOMICS RADIO.  Today: The Truth Is Out There … Isn’t It?  Here’s your host, Stephen Dubner.

DUBNER: So Stephen Greenspan, the gullibility expert, loses a third of his life savings in what turns out to be a Ponzi scheme. Now, even if you feel sympathetic toward him, you might say, Hey, you know, he’s just one person. Bad things happen to people every day. At least the world didn’t end. But what if we were worried about something that might end the world? No, I’m not talking about an attack by alien nations – not yet, at least. That’ll come later in the program. I’m talking about…climate change. How are people like you and me supposed to calculate the threats from something like climate change? There’s so much complexity, so much uncertainty. So most of us do what Stephen Greenspan did when he was looking to invest. We take our cues from other people…

Al GORE: It’s not a question of debate. It’s like gravity. It exists.

 

Rush LIMBAUGH: The reason that you know you’re right is that you know things they don’t know. And because they don’t even have that baseline of knowledge to chat with you, they can’t understand where you’re coming from.  And that’s exactly how I feel talking to people who believe this global warming crap.

 

ABC WORLD NEWS: The science is solid, according to a vast majority of researchers, with hotter temperatures, melting glaciers, and rising sea level providing the proof.

 

Glenn BECK: When the University of Madison Wisconsin comes out with their definitive study, do I believe that? No! Do I believe scientists? No! They’ve lied to us about global warming. Who do you believe?


DUBNER: Who do you believe? That was Glenn Beck, by the way. Before him, from the top, you heard Al Gore and then Rush Limbaugh and an ABC World News report. When it comes to something like climate change, as fraught as it is with risk and uncertainty – and emotion! – who do you believe? And, more important, why?

Ellen PETERS: You know, my personal perception is that I don’t know enough about it, believe it or not. This is an issue that I think…

 

DUBNER: Wait, could you just say that again so that everyone in the world can hear an honest response? It’s so rare for some version of I’m not quite sure or I don’t know. So, sorry, say it again and then proceed.

 

PETERS: What I was saying, I’m not sure exactly what I believe on it in terms of the risk perceptions of climate change. It’s something that I don’t think I am personally educated on enough to have a really firm opinion about that.


DUBNER: That was Ellen Peters. She teaches in the psychology department at Ohio State University. She is part of a research group called the Cultural Cognition Project. They look at how the public feels about certain hot-button issues – like nuclear power and gun control – and then they try to figure out how much those views are shaped by cultural values. That is, not by empirical evidence, but by what they call “cultural cognition.” So, they recently did a study on climate change. How was it, they wanted to know, that the vast majority of scientists think the Earth is getting warmer because of human activity, but only about half the general public thinks the same? Could it be, perhaps … that people just don’t trust scientists? Here’s Dan Kahan. He’s another Cultural Cognition researcher and a professor at Yale Law School.

Dan KAHAN: Well, in fact, the scientists are the most trusted people in our society. The Pew Foundation does research on this, and this has been a consistent finding over time.

 

DUBNER: OK, so there goes that theory. That explanation won’t work for us then.

 

KAHAN:  Correct.



DUBNER: All right, so maybe people just don’t understand the science. Surveys have found that fewer than 30% of Americans are scientifically literate. Ellen Peters again:

PETERS: People have the belief that the reason that people don’t believe the risks of climate change are high enough is because they’re not smart enough, they’re not educated enough, they don’t understand the facts like the scientists do. And we were really interested in that idea and whether that’s really what was going on, or whether something else might matter.


DUBNER: So Peters and Kahan started out their climate-change study by testing people on their scientific literacy and numeracy, how well they knew math.

PETERS: And the items are things like: it is the father’s gene that decides whether the baby is a boy or a girl, true or false?

DUBNER: True.

 

PETERS: So fairly simple.

 

DUBNER: Is it true?

 

PETERS: You know, I’m actually not even positive on that one. I think it’s the comb…Oh, no it has to be the father’s gene.

 

DUBNER: I’m putting my money on father, true.

 

PETERS: Father is true there, absolutely. Second question, antibiotics kill viruses as well as bacteria, true or false?

 

DUBNER: Negative.

 

PETERS: That one is absolutely false.


DUBNER: You can see why they wanted to know how people did on these questions before asking them about climate change.

PETERS: Numeracy in general, what it should do is it should help you to better understand information first of all. And that kind of comprehension is sort of a basic building block for good decisions across a variety of domains.

 

DUBNER: Right.

 

PETERS: But numeracy should also do other things. It should also help you just simply process the information more systematically. It should, in general, help you to get to better decisions that are more in line with the facts.

 

DUBNER: All right, so that makes perfect sense, but you have found something that kind of flies in the face of that haven’t you?

 

PETERS: We have. It’s the idea that people who are highly numerate and highly scientifically literate, they seem to actually rely on preexisting beliefs, on these sort of underlying cultural cognitions they have about how the world should be structured more than people who are less scientifically literate, or less numerate.

 

DUBNER: So, if I wanted to be wildly reductive, I might say the more education a culture gets, the more likely we are to have intense polarization at least among the educated classes, is that right?

 

PETERS: Based on our data, that’s what it looks like. It’s so interesting and so disturbing at the same time.


DUBNER: It is interesting, isn’t it? I mean, Peters and Kahan found that high scientific literacy and numeracy were not correlated with a greater fear of climate change. Instead, the more you knew, the more likely you were to hold an extreme view in one direction or the other -- that is, to be either very, very worried about the risks of climate change or to be almost not worried at all. In this case, more knowledge led to … more extremism! Why on earth would that be? Dan Kahan has a theory. He thinks that our individual beliefs on hot-button issues like this have less to do with what we know than with who we know.

KAHAN: My activities as a consumer, my activities as a voter, they’re just not consequential enough to count. But my views on climate change will have an impact on me in my life. If I go out of the studio here over to campus at Yale, and I start telling people that climate change is a hoax – these are colleagues of mine, the people in my community—that’s going to have an impact on me; they’re going to form a certain kind of view of me because of the significance of climate change in our society, probably a negative one. Now, if I live, I don’t know, in Sarah Palin’s Alaska, or something, and I take the position that climate change is real, and I start saying that, I could have the same problem. My life won’t go as well. People who are science literate are even better at figuring that out, even better at finding information that’s going to help them form, maintain a view that’s consistent with the one that’s dominant within their cultural group.  

 

DUBNER: So you’re saying that if I believe that climate change is a very serious issue and I want to align my life with that belief, that it’s actually more important that I align my life with that belief not because of anything I can do, but because it helps me fit in better in my circle, there’s more currency to my belief there. What about you? You’re in New Haven, Connecticut, at Yale. I gather you haven’t walked into a classroom and publicly declared that you believe climate change or global warming is a hoax, have you?

 

KAHAN: No, I haven’t done that.


DUBNER: This makes sense, doesn’t it? But it’s also humbling. We like to think that we make up our minds about important issues based on our rational, unbiased assessment of the available facts. But the evidence assembled by Kahan and Peters shows that our beliefs, even about something as scientifically oriented as climate change, are driven by a psychological need to fit in. And so we create strategies for doing this. Here’s my Freakonomics friend and co-author Steve Levitt.

Steve LEVITT: I think one of the issues with information gathering is that when people go to the trouble to learn about a topic, they tend not to learn about a topic in an open-minded way. They tend to seek out exactly those sources which will confirm what they’d like to believe in the first place. And so the more you learn about a topic, you tend to learn in a very particular way that tends to reinforce what you believe before you ever started.


DUBNER: Aha. So if you’re already scared of something, you tend to read more about how scary it is. And if you’re not worried -- then you don’t worry … right?

LEVITT: So if there’s one thing that human beings are terrible at, it’s assessing risk and knowing what to really fear versus the things we actually do fear. And the kind of things that tend to scare us are things that evolution has bred into us. So, my wife is terrified of snakes, mice, flies, you know, butterflies, everything small that flies or that runs she’s terrified of. What are the chances that any of those are going to do her any harm in the modern world? Virtually nothing. I mean the things that you should be afraid of are French fries, and double cheeseburgers, and getting too much sun for skin cancer. Those are the kinds of things that really end up killing us in the modern world.


DUBNER: Coming up: since we’re so bad at figuring out what’s really dangerous, let’s bring in the professionals, shall we?

Michael SHERMER:  I’m Mr. Skeptic. Anything that can be analyzed critically and skeptically, that’s what we do.


DUBNER: And: a cautionary tale about siding with the conspiracy theorists.

Nick POPE: I think somebody actually thought I was an alien myself.


[CASH REGISTER]

ANNOUNCER: From WNYC and APM, American Public Media: This is FREAKONOMICS RADIO. Here’s your host, Stephen Dubner.

DUBNER: So as Steve Levitt sees it, we seek out information that confirms our preexisting biases, and we are congenitally bad at assessing risk. So how are people supposed to figure out what to be afraid of? Here’s Levitt again.

LEVITT: To know what to be afraid of, you need to go through an in-depth data collection process, you need to be properly informed. And people are too busy, rightfully too busy, leading their lives instead of dwelling on what the exact, almost infinitesimal probability is that any particular thing will kill them. So it’s sensible for people to be uninformed and it’s sensible to rely on the media. It just turns out that the media is not a very good source of information.


DUBNER: If you really wanted to make sure that every one of your beliefs was worth holding, you’d have to spend so much time gathering primary data that you’d have no time for anything else in life. You’d practically have to become a professional skeptic. And that’s not a job … is it?

SHERMER:  Uh, yeah, I’m Mr. Skeptic. Anything that can be analyzed critically and skeptically, that’s what we do. So, anything from UFOs and alien abductions to Bigfoot and conspiracy theories all the way up to things like, global warming and climate change and autism and vaccinations. We cover it all.

 
DUBNER: Michael Shermer, a professor at Claremont University, has a master’s degree in experimental psychology and a Ph.D. in the history of science. He’s also the publisher of Skeptic magazine, and he writes books. His latest is called The Believing Brain.

DUBNER:  Now, as a professional skeptic, I’m guessing a lot of people look at you, or hear about a guy like you or read a book by you and think, oh man, that’s, like, the dream job. You know, people think, well, I’m a skeptic, I don’t believe anything. So, what do you have to do to be you, Michael?

 

SHERMER:  Haha.  Well, we actually do believe all sorts of things. You have to have all sorts of beliefs just to, just to get out of bed in the morning, and so, the question then becomes, well, which of your host of beliefs are the ones that are really supported by evidence, or are questionable, or are probably not true, and which are those that we base on instinct and intuition, and which are we basing on, you know, solid evidence, and so, that’s where the rubber meets the road, is, is not, do you believe something or not—of course, we all believe all sorts of things.  The question is, are they true?  And what’s the evidence? What’s the quality of the evidence?

 

DUBNER: Talk to me about how we end up believing what we believe in. I was going say, how we choose to believe what we believe in, but it sounds like it’s not really a choice, right?

 

SHERMER:  It isn’t really a choice, no. Our brains are designed by evolution to constantly be forming connections, patterns, learning things about the environment. And all animals do it. You think A is connected to B and sometimes it is, sometimes it isn’t, but we just assume it is. So my thought experiment is, imagine you’re a hominid on the plains of Africa, three and a half million years ago. Your name is Lucy. And you hear a rustle in the grass. Is it a dangerous predator, or is it just the wind? Well, if you think that the rustle in the grass is a dangerous predator and it turns out it’s just the wind, you’ve made a Type 1 error in cognition – a false positive. You thought A was connected to B, but it wasn’t. But no big deal. That’s a low-cost error to make. You just become a little more cautious and vigilant, but that’s it. On the other hand, if you think the rustle in the grass is just the wind, and it turns out it’s a dangerous predator, you’re lunch. Congratulations, you’ve just been given a Darwin award for taking yourself out of the gene pool before reproducing. So we are the descendants of those who were most likely to find patterns that are real. We tend to just believe all rustles in the grass are dangerous predators, just in case they are. And so, that’s the basis of superstition and magical thinking.
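
Shermer’s rustle-in-the-grass story is, at bottom, a point about asymmetric error costs: when a miss (ignoring a real predator) is vastly more expensive than a false alarm (fleeing from the wind), always assuming “predator” minimizes expected cost. The short Python sketch below works through that comparison; the probability and cost figures are made up for illustration.

```python
# Asymmetric error costs behind the "rustle in the grass" story.
# All numbers are invented for illustration.
P_PREDATOR = 0.05        # chance the rustle really is a predator
COST_FALSE_ALARM = 1     # flee from the wind: a little wasted energy
COST_MISS = 1_000        # ignore a real predator: you're lunch

# Expected cost if you always assume "predator" and run:
always_run = (1 - P_PREDATOR) * COST_FALSE_ALARM   # 0.95
# Expected cost if you always assume "just the wind" and stay put:
never_run = P_PREDATOR * COST_MISS                  # 50.0

print(always_run, never_run)
# Even when only 5% of rustles are predators, running every time is
# roughly 50x cheaper on average -- which is why selection favors
# brains that over-detect patterns (Type 1 errors over Type 2 errors).
```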

 

DUBNER: But then we get to something like climate change, which is, theoretically, an arena bounded entirely by science, right?

 

SHERMER:  You would think so.

 

DUBNER:  Yeah, you would think so.  So what do we find, actually?

 

SHERMER:  Either the earth is getting warmer or it’s not, right?

 

DUBNER:  Yeah.

 

SHERMER:  I mean, it’s just a data question. Well, because it also has ideological baggage connected to it, you know, left wing versus right wing politics, and so the data goes out the window.  It’s like, I don’t know—whatever the data is, I don’t know, but I’m going to be against it.  Now, I can’t just say, oh, I’m against it because my party is, or I just do what other people tell me. Nobody says that. What you do is, you make the decision, “I’m skeptical of that” or “I don’t believe it,” and then you have to have arguments.  So then you go in search of the arguments.

 

DUBNER: It doesn’t sound like it surprises you at all, then, that education—level of education -- doesn’t necessarily have a big impact on whether you’re pro- or con-something. Correct?

 

SHERMER:  That’s right, it doesn’t. And giving smart people more information doesn’t help. It actually just confuses things. It just gives them more opportunity to pick out the ones that support what they already believe. So, being educated and intelligent, you’re even better at picking out the confirming data to support your beliefs after the fact.

 

DUBNER:  Let’s talk now for a bit about conspiracy theories, which we’re nibbling around the edges of. How would you describe, if you can generalize, the type of person who’s most likely to engage in a conspiracy theory that’s not true?

 

SHERMER:  Well, their pattern-seeking, their pattern-seeking module is just wide open.  The net has, you know, is indiscriminatory.  They think everything’s a pattern. If you think everything’s a pattern, then you’re, you’re kind of a nut.

 

[SOUND EFFECTS]

 

POPE: I suppose I’m best known for having had a job at the government where my duties were investigating UFOs.


DUBNER: That’s Nick Pope. Until 2006, he worked for the British Ministry of Defence. And in the early 90’s, he headed up the office that handled reports of UFO sightings.

POPE: Flying saucer sightings, as they were called then.

DUBNER: His job was to figure out if any of these sightings had merit, and if perhaps there were extraterrestrial visitors.

POPE: To satisfy ourselves that there was no threat to the defense of the UK.


DUBNER: Pope came into the job as a skeptic. But some UFO reports, especially from pilots and police officers, got him wondering if perhaps we were being visited by aliens. Now, mind you, there was no hardcore confirmatory evidence. But Pope started talking -- in the media -- about the possibilities.

KRQE: You say you believe with 99% certainty that we’re not alone. So tell us what you’ve discovered.

POPE: Well I think it’s inconceivable in this infinite universe that we’re alone. And then that begs the question, if we’re not alone, are we being visited? It’s a related question.

POPE: When I started speaking out on this issue, I think some people in the UFO community thought that I might be some sort of standard-bearer for them.

 

DUBNER: Meaning one of them?

 

POPE: Yes, absolutely, that I could be a spokesperson for the movement. Of course I had the huge advantage that whilst everyone else had done this as a hobby, I’d done it as a job.

 

DUBNER: Did that make you a bit of a hero in the UFO community?

 

POPE: It did, and a lot of people still hold that view. They want me to come out and say, yes it’s all real and yes, I was part of a cover up. Their fantasy is what they call Disclosure with a capital “D”, as if there’s going to be some magical parting of the curtains and a moment where a crashed spaceship is revealed for all the world to see. Because I say, you know what, I don’t think that spaceship exists. So, in a sense I manage to upset everyone. I go too far for a lot of the skeptics by being open to the possibility, but I don’t go far enough for the believers, particularly the conspiracy theorists. And I get called things like “shill” and that’s one of the more polite things that I’ve been called.

 

DUBNER: Yeah, I’ve looked at some of the comments on YouTube from a speech you gave. I’ll read you a bit of it. We’ll have to employ our bleeping technician later. “Nick Pope, what a f****** spastic. He works, he quote, ‘works’ for the government, why else is he constantly on every bloody UFO program on every f****** channel. He talks enough bull**** to keep the UFO nutters happy while never actually saying anything of importance.” Let’s unpack that one a little bit, shall we Mr. Pope?

 

POPE: Yes.

 

DUBNER: It says you quote, “work for the government.” Do you still work for the government?

 

POPE: No, I don’t. This is in itself one of the great conspiracy theories that in 2006 I didn’t really leave. I just went under deep cover, and that they’re passing me wads of bank notes in a brown paper bag or something.

 

DUBNER: But here’s my favorite. There’s one claim on a UFO blog that you, Nick Pope, have been abducted by aliens yourself and now lie about it.

 

POPE: Well, yes I’ve heard that one. I’ve even seen one, which I think you might have missed. I think somebody actually thought I was an alien myself.

 

DUBNER: That would explain a lot wouldn’t it?

 

DUBNER: Nick Pope discovered a sad truth. The more transparent he tried to be -- the more information he released about himself and his work -- the more worked-up his attackers became. They took facts that would plainly seem to work in his favor and they somehow made these facts fit their conspiracies instead. But before we judge, consider how good we all are at deciding first what we want to believe, and then finding evidence for it. So what’s the solution? What can we do to keep ourselves headed down the road, albeit slowly and clumsily, toward a more rational, reasoned civilization? Here’s Ellen Peters again, from the Cultural Cognition Project.

DUBNER: So, I guess, the depressing conclusion one might reach from hearing you speak is that ideology trumps rationalism?

 

PETERS: I think that we are seeing some evidence for that in this study, but I don’t think that that has to be the final answer. I think that policy makers, communicators need to start paying attention to some of these cues that deepen cultural polarization. So for example, telling the other side that they’re scientifically inept? Probably a bad idea. Probably not the best way to continue people coming together on what the basic science really does say. Or, coming up only with solutions that are antagonistic to one side. And you know it if you’re listening to them that those are just antagonistic solutions -- again, probably not the best idea. It’s a sign or a signal that we’re not listening maybe as well to beliefs on the other side.


DUBNER: Dan Kahan agrees that, whatever the solution, none of us are able to go it alone.

KAHAN: What’s clear is that our ability to acquire knowledge is linked up with our ability to figure out whom to trust about what. And ordinary people have to do that in making sense of the kinds of challenges that they face. But, the amount that we know far exceeds the amount that any one of us is able to establish through our own efforts. Maybe you know that the motto for the Royal Society is Nullius in Verba, which means “Don’t take anybody’s word for it.” And it’s kind of admirable and charming, but obviously false.

 

DUBNER: Not very practical, is it?

 

KAHAN: Can’t be right. I mean, what would I do? I’d say you know, don’t tell me what Newton said in the Principia, I’m going to try to figure out how gravity works on my own.

 

DUBNER: And speaking of Isaac Newton -- remember what Stephen Greenspan told us earlier -- how Newton was suckered into this terrible investment? It’s heartening to learn that even Newton, the scientific sage, was able to acknowledge the flaws in his own thinking, the shortcomings in his own thinking. And he left behind some advice that might be helpful for us all. He wrote, “to explain all nature is too difficult a task for any one man or even for any one age. 'Tis much better to do a little with certainty, and leave the rest for others that come after you, than to explain all things by conjecture without making sure of any thing.” In other words, don’t get too cocky.

ANNOUNCER: FREAKONOMICS RADIO is produced by WNYC, APM: American Public Media and Dubner Productions. This episode was produced by Katherine Wells. Our staff includes Suzie Lechtenberg, Diana Huynh, Bourree Lam, Collin Campbell and Chris Bannon. Our interns are Ian Chant and Jacob Bastian. David Herman is our engineer. Special thanks to John DeLore. If you want more Freakonomics Radio, you can subscribe to our podcast on iTunes or go to Freakonomics.com where you’ll find lots of radio, a blog, the books and more.

COMMENTS: 43

  1. James Briggs says:

    I think most non-political people would like to minimize greenhouse emissions and do it in as cost-effective a way as possible.

  2. Gigi says:

    Instant theory of knowledge lesson! Thanks from IB teachers everywhere (for this episode, and many others).

  3. Matthew says:

    Katherine Wells:

    Regarding:
    Podcast of “The Truth is Out There… Isn’t It?”

    I wanted to express my thanks for this podcast. The subject matter of Truth, and how it becomes more elusive as data is collected, was very well done. I found myself hanging on every word. Although expressing any opinion seems asinine considering the subject matter, I will attempt a comment.

    There were concepts that were not directly talked about that could be important. Motivation of belief: most climatologists have to come up with new findings supporting global warming for there to be a market for their services. However, energy firms have an economic interest in keeping alternative energy out of the market. Does this not weigh into the validity of the experts on either side of the aisle?

    Again, that was one of the best 17 minutes I have spent, and I very much enjoyed this piece you produced.

  4. nancy says:

    I am a card-carrying Republican, but I don’t think that matters. My 12-year-old child thinks that Martin Luther King Jr. is one of the greatest men that ever lived, because that’s what they teach her. Not Jesus Christ, George Washington, Abraham Lincoln, or even Simon Wiesenthal. Does anyone bother to say that he was a womanizer, or that he had illegitimate children? Or that he was about as racist as a human could possibly be. He had one agenda, and one agenda only, and that was the black agenda. Not Democratic or Republican, just black. He did nothing to unify anyone. And those who think he did need to read the history books before they thought it was a good idea to change them. There are still a lot of us around who remember the real history.

    • James Briggs says:

      I am a sixty-eight-year-old white American male with an Irish background. I was part of organizing the first March on Washington. I was a Conservative during the ’80s who voted for Ronald Reagan and fought against Affirmative Action and for crime victims’ rights.

      1. I am a card-carrying Republican, but I don’t think that matters.

      Of course it matters. You are a member of a party that hates the American government and wishes to destroy it. A member of the party that has brought new scams to defraud innocents with every Republican administration. Anyone who is a member of a party that destroyed the Middle Class and caused two depressions in less than one hundred years is a stranger to the truth.

      2. My 12-year-old child thinks that Martin Luther King Jr. is one of the greatest men that ever lived, because that’s what they teach her.

      The greatest of anything is subjective. Teaching opinion is always wrong. He was a great man; he gave up everything, including his life, for a better nation and equality.

      3. Not Jesus Christ, George Washington, Abraham Lincoln, or even Simon Wiesenthal.

      I think they were all greater than MLK. He wasn’t in the same league with the men you mention.

      4. Does anyone bother to say that he was a womanizer, or that he had illegitimate children?

      You are right, and he also abused women.

      5. Or that he was about as racist as a human could possibly be.

      He was not a racist. In fact, he fought the racism of the Black Muslims. When he was murdered, the racists hijacked the civil rights movement, and his death was the only reason we went through the terrible time of Affirmative Action and an explosion in the crime rate.

      6. He had one agenda, and one agenda only, and that was the black agenda. Not Democratic or Republican, just black. He did nothing to unify anyone.

      He unified me. My best friend is black, and MLK helped make it so.

      7. And those who think he did need to read the history books before they thought it was a good idea to change them. There are still a lot of us around who remember the real history.

      Yes, there are a lot of us around who remember real history; too bad you aren’t one of them.

  5. Stewart Hart says:

    It’s easy to change a conspiracy theorist’s mind. All it takes is transparency. Not answering questions and withholding information creates conspiracy theories. Ask the families of the victims of 9/11 or the JFK researchers. Many theories of conspiracy have proven true: the Church Committee, Watergate, the Pentagon Papers, the many WikiLeaks releases, Snowden’s NSA revelations, etc.
