Season 3, Episode 3
Until not so long ago, chicken feet were essentially waste material. Now they provide enough money to keep U.S. chicken producers in the black — by exporting 300,000 metric tons of chicken “paws” to China and Hong Kong each year. In the first part of this hour-long episode of Freakonomics Radio, host Stephen Dubner explores this and other examples of weird recycling. We hear the story of a Cleveland non-profit called MedWish, which ships unused or outdated hospital equipment to hospitals in poor countries around the world. We also hear Intellectual Ventures founder Nathan Myhrvold describe a new nuclear-power reactor that runs on radioactive waste.
Also in this hour: we look at the strange moments when knowledge is not power. Issues like gun control, nuclear power, vaccinations, and climate change consistently divide the public along ideological lines. Maybe someone just needs to sit down and explain the science better? Or maybe not. Dubner looks into the puzzle of why learning more only makes people more stubborn. And we look into conspiracy theories to see how people form their own version of the truth, even when the data contradict it.
Stephen J. DUBNER: Hi there. Are you Carlos? Very nice to meet you.
Carlos AYALA: Great to meet you as well.
DUBNER: I recently had lunch in Chinatown, the Chinatown in New York City, where I live, with a very pleasant fellow named Carlos Ayala. We met up at a place called the Golden Unicorn, to eat something I’d never had before: chicken feet, also known as chicken paws. Ayala is the guy you want to eat chicken paws with. He works at Perdue Farms, the fourth largest chicken producer in the U.S. He’s the Vice President of International...
Carlos AYALA: ...which means I’m in charge of everything that Perdue does on the food side of the business, so the chicken and turkeys, that’s outside of the United States. So my main focus is on exporting products that are not so desired in the U.S. and sending them overseas.
DUBNER: And you are a fan of the paw. You like to eat the chicken paw. It’s like Hair Club for Men. You are not just a spokesman, right, you love the paw?
AYALA: That’s very true. It’s actually the best part of the chicken as far as I’m concerned. It’s my favorite thing to eat.
ANNOUNCER: From WNYC and APM, American Public Media: This is Freakonomics Radio, the show that explores the hidden side of everything. Here’s your host, Stephen Dubner.
DUBNER: Today on Freakonomics Radio, we’re talking about clever ways to not waste waste—our waste. I went down to Chinatown to have lunch with Carlos Ayala from Perdue Farms. Now, I’m going to go out on a limb here and say that most Americans would vigorously disagree with Ayala that chicken paws are the best part of the bird.
DUBNER: Now, we haven’t seen them yet. We will soon. Just describe what they are going to look like. Cause when I use Mr. Google to help me out and I say chicken feet or chicken paws, it looks to me like a kind of sweaty human hand missing a finger.
AYALA: Yeah, they are kind of brown and wrinkly. And that’s actually one of the reasons I prefer the black bean sauce because they look less like a human hand.
DUBNER: We ordered: some dumplings, a noodle dish, some Chinese broccoli, and a basket of chicken feet for everyone at our table. And then we tucked into those little suckers. They were about the size of my kids’ hands. My kids are nine and eleven. The paws were fleshy and bony at the same time.
AYALA: You have to spit out...
DUBNER: Yes you do.
AYALA: You have to spit out the little bones.
DUBNER: Yeah. I discovered that.
AYALA: So what do you think so far?
DUBNER: So, here’s what I think. I think it’s perfectly fine. It doesn’t, you know, disgust me or repel me in the least.
AYALA: Mmm hmmm.
DUBNER: I can’t say it’s the best thing I’ve eaten in the last even five years. But I love that you love them. So I’m just going to be with the paw for a little while.
DUBNER: Our order caused a stir with the Golden Unicorn’s waitstaff. Here’s Bourree Lam, who works on Freakonomics and came with us to lunch.
Bourree LAM: So they just asked me in Cantonese where Carlos is from because they said that Westerners never ever order chicken paws and they...we have about five baskets on the table right now so all the waitresses have just come and accosted me and asked me who he is, what he does, and where he is from.
DUBNER: Once they find out that Carlos Ayala works for Perdue, which makes him chicken-paw royalty, we get first-class treatment, including a special, off-the-menu, ginger-infused chicken-paw dessert, which I’d rather not relive right now. But you got the feeling that even without the Perdue connection, they would have smiled on our party because, as they said, it’s the rare American who orders the paw. In China, meanwhile, watch out! It isn’t the chicken breast they crave, like we do here. The drumstick? Okay, but what really floats their boat is the chicken paw. Ayala tells me in a typical year, the U.S. exports about 300,000 metric tons of chicken paws to China and Hong Kong. That’s roughly the mass of the Empire State Building!
DUBNER: It’s amazing that you are taking something that for your company in this country is pretty much valueless. In fact it may be negative cost. You normally would just dispose of it, yes?
AYALA: That’s right. So, nothing goes to waste on a chicken, but the chicken paws absent the export market just have no value. I mean it is very minimal.
DUBNER: Where did all of those hundreds of thousands, no, millions of chicken paws go? Right? You produce about twelve million chickens per week, so that’s about half a billion chickens per year that just Perdue is producing.
AYALA: That’s right. And they each have two feet.
DUBNER: So over a billion chicken paws.
AYALA: So over a billion chicken paws.
DUBNER: Ok. So take me back a little bit, however many years we need to go back to the time before there was a robust export market. Where would those billion plus, if there were that many then, feet be going?
AYALA: So, they’d go to rendering. And it might end up in dog food or something like that. But certainly not for human food. And then in 1991 we started harvesting paws in our first plant, and it’s actually a huge upgrade, so, using opportunity costing, where the alternative is almost zero, the upgrade is tremendous. In fact the paws are one of the more profitable items for a chicken company right now.
DUBNER: And what was the general feeling within Perdue about this idea of creating an export market for... I can tell by the look on your face that it wasn’t greeted with open paws.
AYALA: It’s not going to work!
AYALA: Yeah, the idea that we are going to spend a lot of money in infrastructure for the feet? I mean, you know, what are you thinking? But it has turned out to be one of the most profitable items that we have. In fact there’s a lot of chicken companies that would be out of business if it wasn’t for the chicken paws.
DUBNER: Is that right? Perdue included?
AYALA: Um, it would be very difficult for us to survive without chicken paws.
DUBNER: No kidding.
AYALA: Yeah. It’s a critical part of the business. Demand in China is bottomless for chicken paws. If we produced literally twice as many paws, they’d be sold by 9 AM tomorrow.
DUBNER: And why don’t you? Because you don’t have enough demand for the rest of the chicken?
AYALA: That’s exactly right.
DUBNER: So you just need to learn to grow chickens with four feet?
AYALA: I’ve asked our geneticists about it. But no luck yet.
DUBNER: The idea of a four-footed chicken aside – and assuming you have no problem with people eating chicken in the first place, don’t you love this idea? I mean, what’s not to love? One man’s trash is another man’s dinner. I am very attracted to this kind of thing. I grew up on a little farm in a big family without a lot of resources. Anything that could be reused or repurposed was. Big glass mayonnaise jars got turned into milk jugs for our cow; junk mail became scratch paper. The cardboard tube from wire coat hangers? We used them to make fire starters. Recycling wasn’t a political thing; it was a way of life. So I’m always on the lookout for recycling stories, the weirder the better. Which brings us now to Cleveland, Ohio.
Lee PONSKY: My name is Dr. Lee Ponsky and I’m the president and founder of MedWish International, a nonprofit organization that collects unused medical supplies and sends them to Third World countries.
DUBNER: Of the more than two million tons of medical waste generated each year by U.S. hospitals, a lot of it is perfectly good equipment and supplies: stuff like microscopes, ultrasound machines, unused surgical masks. Now, there are plenty of underfunded hospitals and clinics in the U.S., but MedWish cannot ship to them because of liability issues and regulations on disposing of medical waste, even unused medical supplies. So MedWish collects supplies and ships them to more than 90 countries around the world. Lee Ponsky is an oncologist at University Hospitals Case Medical Center in Cleveland. He started MedWish during college, after he volunteered as a surgical assistant in Nigeria in 1991.
PONSKY: When I went to Nigeria and I saw what having nothing really meant, it blew me away. We started our day literally sewing up rubber gloves from the day before. So, we had a little lady who sat in the corner and filled them with water, the surgical gloves, and if there was water dripping from one of the fingers, there was a hole, and she would take a needle and thread and sew it up until there was no water dripping. We would make our own saline: we would add salt to water and sterilize it. We could buy very inexpensive nylon fishing line, and we would cut it up and use it as sutures to sew people up for the day’s surgery. We made our own gauze. And you know what? At the end of the day it worked. But it was amazing to me that there were sometimes surgeries we couldn’t do because we just didn’t have the instruments or the tools, simple stuff often. The doctors there had the training, and they had the capability, but we just didn’t have the instruments and the tools to do certain things that we needed to do. And we literally saw a few people die. And that’s when it blew me away. I said this should just not happen. It doesn’t make sense that there are literally people dying, to see one person right in front of your face dying because you didn’t have a certain tool or instrument that you know you’re throwing away in the U.S. That just doesn’t make sense. And that’s where I came and said we just need to do something about it.
DUBNER: Talk to me for a minute…Talk to me for a minute about why some of these supplies get tossed. Why does, why do a bunch of boxes of gauze, let’s say, get tossed, or tongue depressors, or surgical gloves? Why on earth aren’t they just kept in the closet and being used?
PONSKY: A great story: recently one of the medical supply companies had an error in the way the instructions were printed. The instructions in Spanish were messed up. The product was still good, but they couldn’t sell it in that format. They couldn’t sell it marked inappropriately. So they called us and said, hey, we’ve got five pallets of, I forget what the product was, let’s say they were rubber gloves. But the instructions are misprinted, would you guys put those to good use? And we said absolutely, those are still very usable pieces of equipment for us.
DUBNER: I understand that wooden tongue depressors have an expiration date, is that true?
PONSKY: Isn’t that amazing? I mean, that is, as far as…Again, if our organization points out things like that, that can help improve things here in the U.S., I would love it. Yes, that kind of stuff drives me crazy. And the worry of the manufacturers and the concern about liability is pervasive everywhere. So, all these products have expiration dates, and some of them just don’t make any sense. So even a little product like a tongue depressor, a wooden tongue depressor, will have an expiration date. Now, maybe there are reasons, maybe it splinters after it sits in the wrapper for thirty-five years, but I can’t imagine it wouldn’t be useful six months after its expiration. I’ve always said that if what we’re doing makes the hospitals in the U.S., in our country, more efficient, so efficient that there are no more supplies being thrown away, great, we’ll find something else to do, we’ll go on to the next thing. But until that time, let’s make use of the stuff that’s getting thrown away.
DUBNER: Now, just to be clear, MedWish itself, for legal reasons, does not accept or distribute expired medical supplies. Still, last year, it kept more than 200 tons of so-called medical waste out of landfills. So far we’ve filled our bellies with chicken paws, saved some landfill space, maybe even saved some lives, so what if we could light up the world with waste?
DUBNER: So first say who you are and what you do. The what you do might take three hours, but go ahead.
Nathan MYHRVOLD: Okay, I’m Nathan Myhrvold. I’m CEO of Intellectual Ventures, a company that invents new technology. And I’m also a cookbook author.
DUBNER: All right, there you go. So, this episode is about weird recycling, people who are reusing, recycling, repurposing something that is thought to have little or no value, or even negative value maybe, and turning it into a big positive. And we’re talking chicken paws, we’re talking medical supplies, but I understand that you have your own entry into the weird recycling sweepstakes, perhaps the weirdest entry in the weird recycling sweepstakes.
MYHRVOLD: Nuclear waste, we love it.
DUBNER: Nathan Myhrvold is a physicist by training. He’s also the former Chief Technology Officer at Microsoft. Now, he and Bill Gates and a few others have formed a company called TerraPower, which hopes to generate electricity—lots and lots of electricity—via nuclear power.
DUBNER: And how much money have you raised so far for TerraPower?
MYHRVOLD: You know, I’m not sure we even say. But...
DUBNER: You can tell me.
MYHRVOLD: Yeah, just here on this isolated soundstage no one will hear.
DUBNER: That’s not a microphone, that’s an amaryllis plant.
MYHRVOLD: Tens of millions tending towards ultimately hundreds of millions and billions.
DUBNER: TerraPower would create energy using new technologies, a new reactor called a traveling-wave reactor, and old fuel. Old as in used. As in used by traditional nuclear-power plants. Meaning nuclear waste. What’s known as depleted uranium. You know, the stuff nobody wants in their backyard.
MYHRVOLD: When you concentrate the U-235, you’re left with a mountain of U-238 called depleted uranium. It’s slightly radioactive; it’s classified as nuclear waste. It’s not as radioactive as the spent fuel rods, but there’s a lot more of it, huge amounts of it.
DUBNER: And what’s done with that typically?
MYHRVOLD: It’s sent to Paducah, Kentucky.
DUBNER: And why Paducah?
MYHRVOLD: And not just Paducah.
DUBNER: It’s one of three places…
MYHRVOLD: It’s one of the primary places where there is a government-run storage facility where armed guards patrol this vast…We have these great aerial photos of it that show these thousands and thousands of canisters of U-238 that’s sitting there.
DUBNER: Canisters, what size? Like a natural gas tank canister? Or gigantic, the size of a tractor trailer?
MYHRVOLD: They’re big. Each one is, I think, ten tons of the stuff.
MYHRVOLD: So they’re relatively large.
DUBNER: And how big, how much is there, how many tons of this?
MYHRVOLD: So that takes us to the recycling thing. We have a reactor that can burn that stuff as fuel.
DUBNER: That can burn the leftovers from…
MYHRVOLD: The leftovers. So, if we just take the stuff at Paducah, the depleted uranium, we could take America from its current mix of being about fifteen percent nuclear up to say eighty percent nuclear as France does, and run it for more than a hundred years. Paducah, Kentucky is the Saudi Arabia of this new world.
DUBNER: So you’re talking about a new kind of power plant, a new kind of nuclear power plant that uses a technology that’s fundamentally different in some ways and similar in other ways to existing nuclear plants. And you’re talking about recycling a kind of waste product that nobody wants anyway and that most people consider worth less than zero. It should be made clear, however, that you don’t really know that this would work, do you?
MYHRVOLD: Well, it’s a fascinating question. It will work. I can tell you with complete confidence that it will work. Now, if you asked me to prove that, I would take you to a set of theoretical calculations. And I would then take you to a set of computer simulations. You know, Fukushima was designed in the slide-rule era. Today, using modern computing technology, we can understand the properties of all parts of the reactor vastly better than anything we did before. And the results of those computer codes have been very closely calibrated against experiments for a very long time. Now, that said, no one is actually going to build a power plant just on my say-so and my sunny confidence. So, of course, we will build a pilot plant that will be the first true proof of principle.
DUBNER: The plants TerraPower wants to build would be much smaller than traditional nuclear plants, and buried deep underground, with little need for tending. If all that sounds too good to be true, keep in mind that, for the moment at least, it’s not yet true. The world’s appetite for nuclear power, and its fear of nuclear power, waxes and wanes.
MYHRVOLD: Thirty years ago, the United States decided that we were freaked out about nuclear. The accident at Three Mile Island, which by the way killed zero people, and an amusing thing is there’s three reactors at Three Mile Island, not only are the other two still going, they’ve just been re-licensed for another twenty or thirty years. So everything has continued to work just great there. But the combination of that and a Jane Fonda movie called The China Syndrome…
DUBNER: Which came out twelve days before the accident at Three Mile Island.
MYHRVOLD: Yes. Coincidence, I wonder?
DUBNER: I think not.
MYHRVOLD: So nuclear R and D, the idea of doing exciting, new things in nuclear, the air went out of the balloon. There was no energy around it.
DUBNER: But the thing that I wonder, Nathan, is you’ve had this new plan for a new type of nuclear plant on the board for several years, right, you’ve been working on this, right?
DUBNER: So you’ve got all of this momentum. You’ve got worldwide electricity demand rising, rising, rising, rising, rising. And then last year, when the tsunami happened in Japan and this nuclear power plant started to fall apart, what are you thinking? We’ve got this wonderful nuclear power project here, but is this another Three Mile Island, and Chernobyl, and Jane Fonda rolled into one? Is that what you’re thinking?
MYHRVOLD: Yes, that’s what we were worried about. There was a period of time when I was getting sort of commentary every hour. My BlackBerry was going off with a zillion different messages. And we had people in Japan, and we had this tremendous amount of focus trying to understand what the hell was going on, and it was not the easiest few weeks to be out there promoting a new nuclear project.
DUBNER: So talk about what happened, talk about how bad it was, and talk about what it means for nuclear power.
MYHRVOLD: Well, it turns out that on the stretch of coast where they built the Fukushima Daiichi plant, in the last hundred years there have been a couple of twenty-meter-high tsunamis. So, you build a nuclear plant on that same coast that can only survive maybe a three-and-a-half-meter tsunami. This was not a good decision. Then there was the whole reaction to the accident. Once it occurred, things were okay, or largely okay. Yes, there was a little damage, but all of the really serious radiological leaks that occurred at Fukushima occurred because of human error, because the guys involved really had not done any level of safety drills or planning. You know, if you live on the seashore in Japan you’ve got to do tsunami drills. They really didn’t seem to have done that. They were all left thumbs when it came to their response. You know, it’s easy to sit back and say that in retrospect, but they could have done a much better job.
DUBNER: When I think of using this waste that nobody wants, that’s considered dangerous, that’s costly, and so on, and you have this plan to turn it into fuel that could power the world, the electricity of the world, I have to say it makes me think a little bit of the chicken foot, the humble chicken foot that nobody wanted until American chicken producers realized that China wanted it. So I wonder, you know, where your mind goes when I ask you to compare the depleted fuel stockpiles with the chicken paw?
MYHRVOLD: Well, I love chicken feet, as it turns out. In my new cookbook we have a fantastic recipe for puffed chicken feet.
DUBNER: Puffed chicken feet? Like they need more puffing. You mean, do you blow air in like a Peking duck?
MYHRVOLD: What you do is you cook them sous vide first, which makes them sort of soft. You pull the bones out. Then you dehydrate them a little bit. Then you deep fry them. And if you’ve ever had chicharrones, or fried pork rinds, it puffs up amazingly. Well, you do that with chicken feet, but it’s got this great chicken flavor. And they also look like these weird, twisted, puffed little gloves. Obviously they’re three-fingered gloves, (laughs) but it’s great.
DUBNER: Coming up on Freakonomics Radio, we’ll move from weird recycling to...weird thinking. We’ll talk about how even smart people make very dumb decisions.
Stephen GREENSPAN: He said well Bernard Madoff just admitted that he was running a Ponzi scheme. And I responded, who is Bernard Madoff, and what’s it have to do with me?
DUBNER: Plus, there’s a nasty secret about hot-button topics like global warming: knowledge is not always power.
Ellen PETERS: People have the belief that the reason that people don’t believe the risks of climate change are high enough is because they’re not smart enough, they’re not educated enough, they don’t understand the facts like the scientists do. And we were really interested in that idea and whether that’s really what was going on, or whether something else might matter.
DUBNER: And if you want more Freakonomics Radio, you can subscribe to our free weekly podcast on iTunes. Or visit Freakonomics.com, where all our shows are archived. Thank you!
ANNOUNCER: From WNYC and APM, American Public Media, this is Freakonomics Radio. Here’s your host, Stephen Dubner.
DUBNER: Katherine Wells is one of the producers on our show. And Katherine, you’re here with a story for us, yes?
Katherine WELLS: Yes, I’m with a story about a man named Stephen Greenspan.
GREENSPAN: Hi Katherine.
WELLS: That’s Greenspan. He’s an emeritus professor of psychology at the University of Connecticut. And he has an interesting specialty: he’s an expert in what he calls “social incompetence.”
DUBNER: I have some of that.
WELLS: Which is how we all feel. What he means is he studies why people do dumb things.
DUBNER: Presumably that means why smart people do dumb things?
WELLS: Sure, sure that included. So, I called Greenspan because I wanted to hear about this thing that happened to him a few years ago. It was December 2008, and things were going pretty well for him. He had written a book called Annals of Gullibility that was about to be released. In fact, he’d just gotten the first pre-release copy of the book in the mail, so he was pretty excited about that. He was also in a good spot financially. About a year earlier, he had invested in a hedge fund that was doing really well. So between that investment and the gullibility book, he was getting nicely set up for retirement.
GREENSPAN: Yes, life was pretty good until I got a phone call from my broker. I said, how are you? He said, terrible, it’s the worst day of my life. Now this is a man who had lost a son, so when he said it’s the worst day of my life that got my attention. And I said why? He said well Bernard Madoff just admitted that he was running a Ponzi scheme. And I responded, who is Bernard Madoff, and what’s it have to do with me?
DUBNER: Uh-oh. So, Katherine, I think we can kind of smell where this is headed.
WELLS: Right. This fantastic hedge fund that Greenspan had invested in turned out to be a feeder for Madoff’s Ponzi scheme. And Greenspan had no idea, he didn’t remember ever even having heard Madoff’s name.
DUBNER: Oh, man. So the gullibility expert has been gulled.
WELLS: Right, gulled in a big way. He lost four hundred thousand dollars. Now, this was just about a third of his savings, so it wasn’t the total end of the world. And he should get some money back eventually from settlements. But he’s seventy-one now, he has two kids, one of them is in college, and he’d really hoped to be retired by now. And, of course, there was the gullibility book.
GREENSPAN: There was a columnist, a financial columnist in Canada who in his blog wrote: the first Greenspan, Alan, will be remembered as the economist who didn’t see it coming, while the other Greenspan, Stephen, will be remembered as the psychologist who forgot to read his own book on gullibility.
WELLS: I mean, it’s ironic, because Greenspan’s own research shows how even the smartest people can be duped.
GREENSPAN: I mean, a good example of that would be Sir Isaac Newton, the greatest scientist of all time, who lost over a million dollars (in modern dollars) in the South Sea Bubble. And so he wrote, “I can calculate the orbit of heavenly bodies, but I cannot fathom the madness of men.”
WELLS: In reference to losing the money?
GREENSPAN: In reference to his own foolishness in putting all of his fortune at risk in something that he wasn’t really, in spite of his incredible brilliance, able to really understand or adequately calculate the risk of. Like Sir Isaac Newton and the South Sea Bubble, I knew nothing about Madoff and just basically went along with the crowd. And that’s powerful. We tend to take our cues from other people, especially in situations where we don’t quite know what to do.
WELLS: So in a way, you joined an elite club of brilliant, informed, educated people who can be fooled.
GREENSPAN: I joined the human race basically.
DUBNER: So that’s what we’re talking about for the rest of the hour, how we make complex decisions, and how even the smartest people among us carry around a big sack of biases. How do we decide whom to trust? How do we decide what’s risky, or dangerous, and what’s not? Now, even if you feel sympathetic toward Stephen Greenspan, you might say, hey, you know, he’s just one person. Bad things happen to people every day. At least the world didn’t end. But what if we were worried about something that might end the world? No, I’m not talking about an attack by alien nations. Not yet at least. That’ll come later in the program. I’m talking about climate change. How are people like you and me supposed to calculate the threats from something like climate change? There’s so much complexity, so much uncertainty. So most of us do what Stephen Greenspan did when he was looking to invest. We take our cues from other people.
Al GORE: It’s not a question of debate. It’s like gravity. It exists.
Rush LIMBAUGH: The reason that you know you’re right is that you know things they don’t know. And because they don’t even have that baseline of knowledge to chat with you, they can’t understand where you’re coming from. And that’s exactly how I feel talking to people who believe this global warming crap.
ABC WORLD NEWS: The science is solid, according to a vast majority of researchers, with hotter temperatures, melting glaciers, and rising sea level providing the proof.
Glenn BECK: When the University of Madison Wisconsin comes out with their definitive study, do I believe that? No! Do I believe scientists? No! They’ve lied to us about global warming. Who do you believe?
DUBNER: Who do you believe? That was Glenn Beck, by the way. Before him, from the top, you heard Al Gore and then Rush Limbaugh and an ABC World News report. When it comes to something like climate change, as fraught as it is with risk and uncertainty—and emotion!—who do you believe? And, more important, why?
Ellen PETERS: You know, my personal perception is that I don’t know enough about it, believe it or not. This is an issue that I think…
DUBNER: Wait, could you just say that again so that everyone in the world can hear an honest response? It’s so rare to hear some version of “I’m not quite sure” or “I don’t know.” So, sorry, say it again and then proceed.
PETERS: What I was saying, I’m not sure exactly what I believe on it in terms of the risk perceptions of climate change. It’s something that I don’t think I am personally educated on enough to have a really firm opinion about that.
DUBNER: That was Ellen Peters. She teaches in the psychology department at Ohio State University. She is part of a research group called the Cultural Cognition Project. They look at how the public feels about certain hot-button issues, like nuclear power and gun control, and then they try to figure out how much those views are shaped by people’s cultural values, that is, not by empirical evidence, but by what they call “cultural cognition.” So, they recently did a study on climate change. How was it, they wanted to know, that the vast majority of scientists think the Earth is getting warmer because of human activity, but only about half the general public thinks the same? Could it be, perhaps, that people just don’t trust scientists? Here’s Dan Kahan. He’s another Cultural Cognition researcher and a professor at Yale Law School.
Dan KAHAN: Well, in fact, the scientists are the most trusted people in our society. The Pew Foundation does research on this, and this has been a consistent finding over time.
DUBNER: All right, so maybe people just don’t understand the science. Surveys have found that fewer than thirty percent of Americans are scientifically literate. Ellen Peters again:
PETERS: People have the belief that the reason that people don’t believe the risks of climate change are high enough is because they’re not smart enough, they’re not educated enough, they don’t understand the facts like the scientists do. And we were really interested in that idea and whether that’s really what was going on, or whether something else might matter.
DUBNER: So Peters and Kahan started out their climate-change study by testing people on their scientific literacy and numeracy, how well they knew math.
PETERS: And the items are things like: it is the father’s gene that decides whether the baby is a boy or a girl, true or false?
PETERS: So fairly simple.
DUBNER: Is it true?
PETERS: You know, I’m actually not even positive on that one. I think it’s the comb…Oh, no it has to be the father’s gene.
DUBNER: I’m putting my money on father, true.
PETERS: Father is true there, absolutely. Second question, antibiotics kill viruses as well as bacteria, true or false?
PETERS: That one is absolutely false.
DUBNER: You can see why they wanted to know how people did on these questions before asking them about climate change.
PETERS: Numeracy in general, what it should do is it should help you to better understand information first of all. And that kind of comprehension is sort of a basic building block for good decisions across a variety of domains.
PETERS: But numeracy should also do other things. It should also help you just simply process the information more systematically. It should, in general, help you to get to better decisions that are more in line with the facts.
DUBNER: All right, that makes perfect sense, but you have found something that kind of flies in the face of that haven’t you?
PETERS: We have. It’s the idea that people who are highly numerate and highly scientifically literate, they seem to actually rely on preexisting beliefs, on these sort of underlying cultural cognitions they have about how the world should be structured more than people who are less scientifically literate, or less numerate.
DUBNER: So, if I wanted to be wildly reductive, I might say the more education a culture gets, the more likely we are to have intense polarization, at least among the educated classes. Is that right?
PETERS: Based on our data, that’s what it looks like. It’s so interesting and so disturbing at the same time.
DUBNER: It is interesting, isn’t it? I mean, Peters and Kahan found that high scientific literacy and numeracy were not correlated with a greater fear of climate change. Instead, the more you knew, the more likely you were to hold an extreme view in one direction or the other, that is, to be either very, very worried about the risks of climate change or to be almost not worried at all. In this case, more knowledge led to more extremism! Why on earth would that be? Dan Kahan has a theory. He thinks that our individual beliefs on hot-button issues like this have less to do with what we know than with who we know.
KAHAN: My activities as a consumer, my activities as a voter, they’re just not consequential enough to count. But my views on climate change will have an impact on me in my life. If I go out of the studio here over to campus at Yale, and I start telling people that climate change is a hoax— these are colleagues of mine, the people in my community—that’s going to have an impact on me; they’re going to form a certain kind of view of me because of the significance of climate change in our society, probably a negative one. Now, if I live, I don’t know, in Sarah Palin’s Alaska, or something, and I take the position that climate change is real, and I start saying that, I could have the same problem. My life won’t go as well. People who are science literate are even better at figuring that out, even better at finding information that’s going to help them form, maintain a view that’s consistent with the one that’s dominant within their cultural group.
DUBNER: So you’re saying that if I believe that climate change is a very serious issue and I want to align my life with that belief, that it’s actually more important that I align my life with that belief not because of anything I can do, but because it helps me fit in better in my circle, there’s more currency to my belief there. What about you? You’re in New Haven, Connecticut, at Yale. I gather you haven’t walked into a classroom and publicly declared that you believe climate change or global warming is a hoax, have you?
KAHAN: No, I haven’t done that.
DUBNER: This makes sense, doesn’t it? But it’s also humbling. We like to think that we make up our minds about important issues based on our rational, unbiased assessment of the available facts. But the evidence assembled by Kahan and Peters shows that our beliefs, even about something as scientifically oriented as climate change, are driven by a psychological need to fit in. And so we create strategies for doing that. Here’s Steve Levitt, the University of Chicago economist who’s also my Freakonomics co-author.
Steve LEVITT: I think one of the issues with information gathering is that when people go to the trouble to learn about a topic, they tend not to learn about a topic in an open-minded way. They tend to seek out exactly those sources which will confirm what they’d like to believe in the first place. And so the more you learn about a topic, you tend to learn in a very particular way that tends to reinforce what you believe before you ever started.
DUBNER: Aha. So if you’re already scared of something, you tend to read more about how scary it is. And if you’re not worried then you don’t worry, right?
LEVITT: So if there’s one thing that human beings are terrible at, it’s assessing risk and knowing what to really fear versus the things we actually do fear. And the kind of things that tend to scare us are things that evolution has bred into us. So, my wife is terrified of snakes, mice, flies, you know, butterflies, everything small that flies or that runs she’s terrified of. What are the chances that any of those are going to do her any harm in the modern world? Virtually nothing. I mean the things that you should be afraid of are French fries, and double cheeseburgers, and getting too much sun for skin cancer. Those are the kinds of things that really end up killing us in the modern world.
DUBNER: Coming up on Freakonomics Radio: since we’re so bad at figuring out what’s really dangerous, why don’t we bring in the professionals:
Michael SHERMER: I’m Mr. Skeptic. Anything that can be analyzed critically and skeptically, that’s what we do.
DUBNER: And: a cautionary tale about siding with the conspiracy theorists.
Nick POPE: I think somebody actually thought I was an alien myself.
DUBNER: That’s next on Freakonomics Radio. We love hearing from you, our listeners, so: if you’ve got something to say, or a question to ask, or a strange fact to share, please visit us at Freakonomics.com or drop us a line at firstname.lastname@example.org. And thank you.
ANNOUNCER: From WNYC and APM, American Public Media, this is Freakonomics Radio. Here’s your host, Stephen Dubner.
DUBNER: On today’s show, we’re talking about how we make complex decisions about issues like climate change. As my Freakonomics co-author Steve Levitt sees it, we seek out information that confirms our pre-existing biases, and we are congenitally bad at assessing risk. So how are people supposed to figure out what to be afraid of? Here’s Levitt again:
LEVITT: To know what to be afraid of, you need to go through an in-depth data collection process, you need to be properly informed. And people are too busy, rightfully too busy, leading their lives instead of dwelling on what the exact, almost infinitesimal probability is that any particular thing will kill them. So it’s sensible for people to be uninformed and it’s sensible to rely on the media. It just turns out that the media is not a very good source of information.
DUBNER: If you really wanted to make sure that every one of your beliefs was worth holding, you’d have to spend so much time gathering primary data that you’d have no time for anything else in life. You’d practically have to become a professional skeptic. And that’s not a job is it?
SHERMER: Uh, yeah, I’m Mr. Skeptic. Anything that can be analyzed critically and skeptically, that’s what we do. So, anything from UFOs and alien abductions to Bigfoot and conspiracy theories all the way up to things like, global warming and climate change and autism and vaccinations. We cover it all.
DUBNER: Michael Shermer, a professor at Claremont University, has a masters degree in experimental psychology and a Ph.D. in the history of science. He’s also the publisher of Skeptic magazine and he writes books. His latest is called The Believing Brain.
DUBNER: Now, as a professional skeptic, I’m guessing a lot of people look at you, or hear about a guy like you or read a book by you and think, oh man, that’s, like, the dream job. You know, people think, well, I’m a skeptic; I don’t believe anything. So, what do you have to do to be you, Michael?
SHERMER: Well, we actually do believe all sorts of things. You have to have all sorts of beliefs just to, just to get out of bed in the morning, and so, the question then becomes, well, which of your host of beliefs are the ones that are really supported by evidence, or are questionable, or are probably not true, and which are those that we base on instinct and intuition, and which are we basing on, you know, solid evidence, and so, that’s where the rubber meets the road, is, is not, do you believe something or not—of course, we all believe all sorts of things. The question is, are they true? And what’s the evidence? What’s the quality of the evidence?
DUBNER: Talk to me about how we end up believing what we believe in. I was going say, how we choose to believe what we believe in, but it sounds like it’s not really a choice, right?
SHERMER: It isn’t really a choice, no. Our brains are designed by evolution to constantly be forming connections, patterns, learning things about the environment. And all animals do it. You think A is connected to B and sometimes it is, sometimes it isn’t, but we just assume it is. So my thought experiment is, imagine you’re a hominid on the plains of Africa, three and a half million years ago. Your name is Lucy. And you hear a rustle in the grass. Is it a dangerous predator, or is it just the wind? Well, if you think that the rustle in the grass is a dangerous predator and it turns out it’s just the wind, you’ve made a Type 1 error in cognition— a false positive. You thought A was connected to B, but it wasn’t. But no big deal. That’s a low-cost error to make. You just become a little more cautious and vigilant, but that’s it. On the other hand, if you think the rustle in the grass is just the wind, and it turns out it’s a dangerous predator, you’re lunch. Congratulations, you’ve just been given a Darwin award for taking yourself out of the gene pool before reproducing. So we are the descendants of those who were most likely to find patterns that are real. We tend to just believe all rustles in the grass are dangerous predators, just in case they are. And so, that’s the basis of superstition and magical thinking.
DUBNER: But then we get to something like climate change, which is, theoretically, an arena bounded entirely by science, right?
SHERMER: You would think so.
DUBNER: Yeah, you would think so. So what do we find, actually?
SHERMER: Either the earth is getting warmer or it’s not, right?
SHERMER: I mean, it’s just a data question. Well, because it also has ideological baggage connected to it, you know, left wing versus right wing politics, and so the data goes out the window. It’s like, I don’t know—whatever the data is, I don’t know, but I’m going to be against it. Now, I can’t just say, oh, I’m against it because my party is, or I just do what other people tell me. Nobody says that. What you do is, you make the decision, “I’m skeptical of that” or “I don’t believe it,” and then you have to have arguments. So then you go in search of the arguments.
DUBNER: It doesn’t sound like it surprises you at all, then, that education—level of education— doesn’t necessarily have a big impact on whether you’re pro- or con-something. Correct?
SHERMER: That’s right, it doesn’t. And giving smart people more information doesn’t help. It actually just confuses things. It just gives them more opportunity to pick out the ones that support what they already believe. So, being educated and intelligent, you’re even better at picking out the confirming data to support your beliefs after the fact.
DUBNER: Let’s talk now for a bit about conspiracy theories, which we’re nibbling around the edges of. How would you describe, if you can generalize, the type of person who’s most likely to engage in a conspiracy theory that’s not true?
SHERMER: Well, their pattern-seeking, their pattern-seeking module is just wide open. The net is, you know, indiscriminate. They think everything’s a pattern. If you think everything’s a pattern, then you’re, you’re kind of a nut.
POPE: I suppose I’m best known for having had a job at the government where my duties were investigating UFOs.
DUBNER: That’s Nick Pope. Until 2006, he worked for the British Ministry of Defense. And in the early 90’s, he headed up the office that handled reports of UFO sightings.
POPE: Flying saucer sightings, as they were called then.
DUBNER: His job was to figure out if any of these sightings had merit, and if perhaps there were extraterrestrial visitors.
POPE: To satisfy ourselves that there was no threat to the defense of the UK.
DUBNER: Pope came into the job as a skeptic. But some UFO reports, especially from pilots and police officers, got him wondering if perhaps we were being visited by aliens. Now, mind you, there was no hardcore confirmatory evidence. But Pope started talking in the media about the possibilities.
KQRE: You say you believe with 99% certainty that we’re not alone. So tell us what you’ve discovered.
POPE: Well I think it’s inconceivable in this infinite universe that we’re alone. And then that begs the question, if we’re not alone, are we being visited? It’s a related question.
POPE: When I started speaking out on this issue, I think some people in the UFO community thought that I might be some sort of standard-bearer for them.
DUBNER: Meaning one of them?
POPE: Yes, absolutely, that I could be a spokesperson for the movement. Of course I had the huge advantage that whilst everyone else had done this as a hobby, I’d done it as a job.
DUBNER: Did that make you a bit of a hero in the UFO community?
POPE: It did, and a lot of people still hold that view. They want me to come out and say, yes it’s all real and yes, I was part of a cover up. Their fantasy is what they call Disclosure with a capital “D”, as if there’s going to be some magical parting of the curtains and a moment where a crashed spaceship is revealed for all the world to see. Because I say, you know what, I don’t think that spaceship exists. So, in a sense I manage to upset everyone. I go too far for a lot of the skeptics by being open to the possibility, but I don’t go far enough for the believers, particularly the conspiracy theorists. And I get called things like “shill” and that’s one of the more polite things that I’ve been called.
DUBNER: Yeah, I’ve looked at some of the comments on YouTube from a speech you gave. I’ll read you a bit of it. We’ll have to employ our bleeping technician later. “Nick Pope, what a f****** spastic. He works, he quote, ‘works’ for the government, why else is he constantly on every bloody UFO program on every f****** channel. He talks enough bull**** to keep the UFO nutters happy while never actually saying anything of importance.” Let’s unpack that one a little bit, shall we Mr. Pope?
DUBNER: It says you quote, “work for the government.” Do you still work for the government?
POPE: No, I don’t. This is in itself one of the great conspiracy theories that in 2006 I didn’t really leave. I just went under deep cover, and that they’re passing me wads of banknotes in a brown paper bag or something.
DUBNER: But here’s my favorite. There’s one claim on a UFO blog that you, Nick Pope, have been abducted by aliens yourself and now lie about it.
POPE: Well, yes I’ve heard that one. I’ve even seen one, which I think you might have missed. I think somebody actually thought I was an alien myself.
DUBNER: That would explain a lot wouldn’t it?
DUBNER: Nick Pope discovered a sad truth. The more transparent he tried to be—the more information he released about himself and his work—the more worked-up his attackers became. They took facts that would plainly seem to work in his favor and they somehow made these facts fit their conspiracies instead. But before we judge, consider how good we all are at deciding first what we want to believe, and then finding evidence for it. So what’s the solution? What can we do to keep ourselves headed down the road, albeit slowly and clumsily, toward a more rational, reasoned civilization? Here’s Ellen Peters again, from the Cultural Cognition Project.
DUBNER: So, I guess, the depressing conclusion one might reach from hearing you speak is that ideology trumps rationalism?
PETERS: I think that we are seeing some evidence for that in this study, but I don’t think that that has to be the final answer. I think that policy makers, communicators need to start paying attention to some of these cues that deepen cultural polarization. So for example, telling the other side that they’re scientifically inept? Probably a bad idea. Probably not the best way to keep people coming together on what the basic science really does say. Or, coming up only with solutions that are antagonistic to one side. And you know it if you’re listening to them that those are just antagonistic solutions -- again, probably not the best idea. It’s a sign or a signal that maybe we’re not listening as well to beliefs on the other side.
DUBNER: Dan Kahan agrees that, whatever the solution, none of us are able to go it alone.
KAHAN: What’s clear is that our ability to acquire knowledge is linked up with our ability to figure out whom to trust about what. And ordinary people have to do that in making sense of the kinds of challenges that they face. But, the amount that we know far exceeds the amount that any one of us is able to establish through our own efforts. Maybe you know that the motto for the Royal Society is Nullius in Verba, which means “Don’t take anybody’s word for it.” And it’s kind of admirable and charming, but obviously false.
DUBNER: Not very practical, is it?
KAHAN: Can’t be right. I mean, what would I do? I’d say you know, don’t tell me what Newton said in the Principia, I’m going to try to figure out how gravity works on my own.
DUBNER: And speaking of Isaac Newton, remember what Stephen Greenspan told us earlier, how Newton was suckered into that terrible investment in the South Sea Bubble? It’s heartening to learn that a sage like Newton could acknowledge the flaws in his own thinking. Here’s the advice he left for all of us. He wrote: “[T]o explain all nature is too difficult a task for any one man or even for any one age. 'Tis much better to do a little with certainty, and leave the rest for others that come after you, than to explain all things by conjecture without making sure of anything.”
The fact is that most of us just don’t like to admit we don’t have all the answers. I asked Steve Levitt why we do this:
LEVITT: Well, even I always answer questions I don’t know the answer to. Everyone answers questions. But, as I’ve worked more with businesses, I’ve just come to what I think is a very interesting observation. I’ve been an academic all my life, and in academics we always start from the position that we just don’t know the answer to a question. That’s why we invest a year or two years doing a research project. We don’t know the answer; we want to find out what the answer is. What I’ve found in business is that almost no one will ever admit to not knowing the answer to a question. So even if they absolutely have no idea what the answer is, if it’s within their realm of expertise, faking is just an important part. I really have come to believe teaching MBAs that one of the most important things you learn as an MBA is how to pretend you know the answer to any question even though you have absolutely no idea what you’re talking about. And I’ll give you a very specific example. Whenever I propose that a company run a randomized experiment, almost always there’s tremendous resistance. And the reason is because in order to make a randomized experiment be sensible, it means that you have to start from the premise that we don’t actually know the answer. And the randomized experiment is a way both to test whether what we’ve been doing is correct and also whether there’s another way of doing it better. And people always say, “well why would I run a randomized experiment when I already know the answer?” And consequently the firms never learn anything.
DUBNER: On the Freakonomics blog a while back, one of our readers wrote in with this very question, which I passed on to Levitt.
DUBNER: Here’s a question from Ty Spalding. He writes, “Why do people feel compelled to answer questions that they do not know the answer to?”
LEVITT: I did not feel compelled, Ty, to answer that question.
DUBNER: That’s our show for today. Until next time, maybe you want to try practicing the three hardest words in the English language: I. Don’t. Know.