Our latest Freakonomics Radio episode is called “Why Do We Really Follow the News?” (You can subscribe to the podcast at iTunes or elsewhere, get the RSS feed, or listen via the media player above. You can also read the transcript, which includes credits for the music you’ll hear in the episode.) The gist: there are all kinds of civics-class answers for why we pay attention to the news — but how true are those answers? Could it be that we read about war, politics, and miscellaneous heartbreak simply because it’s (gasp) entertaining?
Season 3, Episode 3
Until not so long ago, chicken feet were essentially waste material. Now they provide enough money to keep U.S. chicken producers in the black — by exporting 300,000 metric tons of chicken “paws” to China and Hong Kong each year. In the first part of this hour-long episode of Freakonomics Radio, host Stephen Dubner explores this and other examples of weird recycling. We hear the story of a Cleveland non-profit called MedWish, which ships unused or outdated hospital equipment to hospitals in poor countries around the world. We also hear Intellectual Ventures founder Nathan Myhrvold describe a new nuclear-power reactor that runs on radioactive waste.
A new study looks at how ideological and political beliefs affect people’s perceptions of the weather. The authors surveyed 8,000 people across the U.S. between 2008 and 2011 and found that while floods and droughts were remembered correctly, temperature changes were a different story. From Ars Technica:
In fact, the actual trends in temperatures had nothing to do with how people perceived them. If you graphed the predictive power of people’s perceptions against the actual temperatures, the resulting line was flat—it showed no trend at all. In the statistical model, the actual weather had little impact on people’s perception of recent temperatures. Education continued to have a positive impact on whether they got it right, but its magnitude was dwarfed by the influences of political affiliation and cultural beliefs.
And those cultural affiliations had about the effect you’d expect. Individualists, who often object to environmental regulations as an infringement on their freedoms, tended to think the temperatures hadn’t gone up in their area, regardless of whether they had. Strong egalitarians, in contrast, tended to believe the temperatures had gone up.
We recently put out a podcast called “The Truth Is Out There … Isn’t It?” about how people decide what to believe about everything from global warming and nuclear risk to UFOs. It was inspired by the research of Dan Kahan and his colleagues at the Cultural Cognition Project, who have found that we systematically screen our beliefs through our personal and political biases. In other words, we allow those biases to influence what we think about theoretically non-ideological issues, but we aren’t aware of that influence.
Our latest Freakonomics Radio podcast is called “The Truth Is Out There…Isn’t It?” (You can download/subscribe at iTunes, get the RSS feed, listen live via the media player above, or read the transcript below.) In it, we try to answer a few fundamental questions: how do we know that what we believe is true? How do we decide which information to trust? And how do we quantify risk — from climate change to personal investments?
The program begins with Stephen Greenspan, a psychologist and an expert on “social incompetence” and gullibility. He knows from personal experience that even the smartest people can be duped into bad risk assessments, especially on the advice of people they trust.
A new study by the Cultural Cognition Project, a team headed up by Yale law professor Dan Kahan, shows that people who are more science- and math-literate tend to be more skeptical about the consequences of climate change. Increased scientific literacy also leads to higher polarization on climate-change issues:
The conventional explanation for controversy over climate change emphasizes impediments to public understanding: Limited popular knowledge of science, the inability of ordinary citizens to assess technical information, and the resulting widespread use of unreliable cognitive heuristics to assess risk. A large survey of U.S. adults (N = 1540) found little support for this account. On the whole, the most scientifically literate and numerate subjects were slightly less likely, not more, to see climate change as a serious threat than the least scientifically literate and numerate ones. More importantly, greater scientific literacy and numeracy were associated with greater cultural polarization: Respondents predisposed by their values to dismiss climate change evidence became more dismissive, and those predisposed by their values to credit such evidence more concerned, as science literacy and numeracy increased. We suggest that this evidence reflects a conflict between two levels of rationality: The individual level, which is characterized by citizens’ effective use of their knowledge and reasoning capacities to form risk perceptions that express their cultural commitments; and the collective level, which is characterized by citizens’ failure to converge on the best available scientific evidence on how to promote their common welfare. Dispelling this “tragedy of the risk-perception commons,” we argue, should be understood as the central aim of the science of science communication.