Risk = Hazard + Outrage: A Conversation with Risk Consultant Peter Sandman

In our recent podcast “The Truth is Out There… Isn’t It?,” we hear from professional skeptics, former UFO investigators, and “social incompetence” experts. One fascinating interview that didn’t make the final cut was with Peter Sandman, a “risk-communication consultant” whose work was also cited in Freakonomics. (Here is how he came to be what he is.)

Sandman breaks his work into three areas: scaring people who are ignoring something that is legitimately dangerous; calming down people who are freaking out over something that’s not risky; and guiding people who are freaking out over something that is legitimately risky. To accomplish all this, Sandman came up with a useful equation: Risk = Hazard + Outrage. Here are some excerpts from Stephen Dubner’s interview with Sandman, which ranges from the perceived risk of WMDs in Iraq to the debate over climate change.

Dubner: All right, so there are the false dangers, the over-hyped dangers, the under-appreciated dangers, and the properly equilibrated dangers… which case provides you the most employment?

Sandman: Well, I intentionally try to divide my time about equally. But scaring people doesn’t pay nearly as well as reassuring people. So even though I divide my time equally, my work with clients who are worried that the public is unduly freaking out subsidizes my work with clients who are trying to alert people to real risks.

Dubner: I see. And so talk to me for a minute about your cornerstone formula… I’m interested in at least two things about it: one, how you came to it; and two, what it means.

Sandman: Okay, the most important truth in risk communication is the exceedingly low correlation between whether a risk is dangerous, and whether it’s upsetting. That correlation hovers around 0.2. You can square a correlation to get the percentage of variance accounted for. You square 0.2 you get 0.04, a glorious four percent of the variance; that is, the risks that kill people and the risks that upset people are completely unrelated. If you know a risk is dangerous, that tells you almost nothing about whether it’s upsetting. If you know a risk is upsetting, that tells you almost nothing about whether it’s dangerous. That low correlation is not my work, but back in the 1980s looking at that low correlation and trying to make sense out of it, I coined the formula that sent my children to college and that has become my sort of signature formula. I said alright look, let’s call whether a risk is dangerous or not, or how dangerous it is, let’s call that hazard. Let’s call how upsetting it is outrage. And I coined the formula risk equals hazard plus outrage, which was my effort to claim that the risk really has two different components. … The technical component is the hazard. The social component, the cultural component, is the outrage. And I argued, and now several decades later still have spent a lot of time arguing, that outrage isn’t just a misperception of risk; it is a part of risk. You know, it’s a misperception of hazard, or at least it leads to a misperception of hazard. But outrage is really part of what people mean by risk, and if you’re going to manage the risk, obviously you have to manage the hazard. You know, you have a moral and legal obligation to manage the hazard. But managing the risk means managing the outrage also, getting people more upset if the hazard is high, getting people less upset if the hazard is low.
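A quick note on Sandman’s arithmetic: squaring a correlation coefficient gives the share of variance one variable accounts for in the other, so a hazard–outrage correlation of 0.2 works out to just four percent. A minimal Python sketch of that calculation:

```python
# Sandman's figure: the correlation between how dangerous a risk is (hazard)
# and how upsetting it is (outrage) hovers around 0.2.
r = 0.2

# Squaring a correlation gives the proportion of variance explained.
explained = r ** 2          # about 0.04, i.e. 4% of the variance
unexplained = 1 - explained

print(f"variance explained: {explained:.0%}")      # 4%
print(f"variance unexplained: {unexplained:.0%}")  # 96%
```

In other words, knowing a risk’s hazard leaves roughly 96 percent of the variation in outrage unaccounted for, which is the statistical basis for treating the two as nearly separate components.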

Dubner: Now, when I look at the formula, which I love, risk equals hazard plus outrage, it makes perfect sense. It’s almost a tautology after a while. That said, when I see the word risk there, I want to stick another word in front of it, which is perceived risk. That’s really what you’re talking about, right? Because the hazard in your formula is the actual risk, correct? And the risk that you’re trying to determine, hazard plus outrage, is really what people seem to think is the risk: you can have a high-hazard situation but low outrage, so the perceived risk is low where the actual risk may be high. Am I right on that?

Sandman: Well, yes and no. Definitions are stipulative; you can define things any way you like. And if you want to define the risk as purely technical, then what we’re talking about is perceived risk. But that’s not in fact what the public does… Let’s say we’re at a public meeting and people are terribly worried about the emissions from a factory that’s in their neighborhood. They know that factory is emitting a chemical called dimethyl-meatloaf, and they think the dimethyl-meatloaf is giving their children cancer. And they’re very, very upset.

Dubner: I hate to interrupt, you are saying dimethyl-meatloaf aren’t you?

Sandman: I am.

Dubner: Okay, very good.

Sandman: That’s my hypothetical chemical.

Dubner: I don’t remember that from chemistry class, but I’m happy to…Okay.

Sandman: So people are very upset about the dimethyl-meatloaf emissions. And let’s assume that you have terrific data that show that dimethyl-meatloaf emissions are not in fact capable of causing the type of cancer that they have in their community, so that they are wrong about the hazard. Well, it’s going to be hard to convince them that they’re wrong about the hazard. But, you know, assume optimal conditions. Assume that you have good data, that you’re a good speaker, that they’re a rational audience.

Dubner: A lot of assumptions, yep, yep.

Sandman: Yeah, but assume optimum conditions, and over a period of several hours you succeed in convincing them that dimethyl-meatloaf can’t cause leukemia, if leukemia is the kind of cancer they’re worried about. So you’ve actually changed their hazard perception. Now, here’s the point to this example. During the several hours during which you were laboriously changing their hazard perception, it’s very unlikely that their outrage has gone down. You know, you’re still the company that is emitting dimethyl-meatloaf without getting their permission. You’re still the company that has been arrogant, unresponsive, and less than honest about your dimethyl-meatloaf emissions. You’re still the factory that they wish wasn’t in their neighborhood. All the sources of outrage are still there, and when you correct the hazard perception, the outrage doesn’t go down. … It may in fact go up. It’s a little bit like Charlie Brown having the football snatched away from him by Lucy. The substance of their ammunition has gone away, but their outrage hasn’t. So, you know, they have an additional source of outrage. The point is that under those circumstances, if you ask people, ‘Now that you know that the dimethyl-meatloaf isn’t causing leukemia in your neighborhood, are you no longer upset about it?’, they’ll tell you no. They’ll say, ‘Well, maybe it causes some other kind of cancer. Or maybe it causes birth defects. Or maybe it causes asthma.’ They’ll come up with new hypotheses that attach their outrage to a new hazard. So you don’t accomplish anything in correcting people’s hazard perception without managing their outrage. … I think you want to talk more about climate change later in this conversation.
But you know, with climate change, early on, if we go back a decade, climate change risk communication was mostly precaution advocacy; it was aimed at people who weren’t aware of the problem, who weren’t taking the problem seriously, who needed to be awakened. Their apathy needed to be pierced. I don’t think that’s true anymore. I mean, obviously there are still plenty of people who are unaware of climate change or who are apathetic about climate change. But I think there’s a much more important, growing audience of people who are outraged at the remedies that are proposed for climate change. They don’t want to lose their SUV. They don’t want to take the economic hit that they think the society will take if we take climate change mitigation to heart. And if you’re talking to a room full of people who hate the set of remedies you have proposed for climate change, and instead of trying to reduce their outrage about the remedies you’re busy trying to increase their outrage about climate change, you’re fighting the wrong fight.

Dubner: So let’s start by just applying your formula to this problem. So, if risk equals hazard plus outrage, talk to me for a moment about how you overlay that formula onto the issue of climate change, and especially how, as you suggest, the measurement of each of those elements in your formula has shifted over the years.

Sandman: Yes, well, you have two fundamental problems, and you have to decide which is the bigger of them, or when one is the bigger one and when the other is. One problem is apathy, people who are insufficiently upset about climate change, and the other is denial, people who are motivated not to be upset about climate change because they dislike something in the climate change message. … They dislike the source of the message; they dislike, as I suggested earlier, the remedies that you’re proposing; they don’t like being blamed; they don’t like how depressing and fatalistic your message is. There’s something about what you’re saying that is provoking denial. And apathy and denial look the same. And I have to stress this, because denial is used in the climate change literature to mean something else entirely: I’m not talking about denial in the political sense. I’m talking about denial in the psychological sense, people who can’t bear to take climate change seriously because something in the message is intolerable to them. If that’s what you’re up against, then you need a set of strategies that are aimed at that.

Dubner: If I were to come to you as, let’s say, the head of the United States government, and I say, you know, I care about climate change a lot, but for all kinds of reasons, social, political, economic, it’s a very hard subject to gain the proper kind of traction on, and Peter, I think the first step is to change the way we communicate, change the way we have a climate change conversation. What do you do, what are your first few steps of action there?

Sandman: All right, here’s a quick list. Number one, express regret for some of the solutions that are going to be unpleasant and costly. Number two, express uncertainty that you’re right, but conviction that the risk is too great to ignore. Number three, give people absolution for having dug this ditch without realizing that they were digging it. Number four, be interested in possible solutions that are less unpleasant. Take the possibility of a technical fix seriously. If we can’t find one, then we can’t find one, but environmentalists don’t want to find one. They’re very hostile to the idea that we might solve this problem without changing everything about our way of life. You know, be open to that. Number five, be more optimistic. Don’t say the world is going to hell in a handbasket, it’s already too late, we’re doomed, therefore use compact fluorescents. Nobody’s going to change their lives if we’re doomed. We have to have a good shot at solving the problem seriously. Last point… well, no, next-to-last point, take adaptation seriously. You know, instead of just trying to reduce the amount of climate change, try to reduce our vulnerability to climate change. Take seriously learning how to live comfortably in a hotter world with wilder weather. And finally, stop trashing the enemy. Be much more respectful of people who think you’re wrong, or people who think the cost is too high, or people who are in any of a variety of ways opposed to the climate change movement. They’re allowed to treat you with contempt. It’s a big mistake to treat them with contempt.

Dubner: So if these are the seven steps or platitudes toward communicating effectively the risk and actions needed to address climate change: express regret, express uncertainty, offer absolution, consider other solutions, express optimism, consider adaptation, and don’t trash the enemy – if those are the seven factors that, if they were executed properly, would, let’s say, total a score of 100 on the Sandman scale of climate change communication, how would you assess the actual score on that scale of the broad, mainstream climate communication?

Sandman: Well, this is where my optimism comes into play. I would put them at about a thirty, but I think they’ve come up from about a ten. I think there is real movement in the directions that I am prescribing. And I don’t mean to imply that it’s because of me. But I think lots of people are beginning to realize that accusing your audience, and depressing your audience, and guilt tripping your audience, and trashing your opponents is not a winning formula. And I, you know, although they have a long way to go, I think there is the beginning of a much more empirically sound communication approach on behalf of climate change.

COMMENTS: 6

  1. Eric M. Jones says:

    I gather that most Risk Determinations assume you will live forever. I wonder how this changes things.

  2. Becky says:

    “They’ll come up with new hypotheses that attach their outrage to a new hazard. ”

    Like when the warmists keep changing what we’re supposed to be worried about . . . first it was “global warming” . . . then it was “climate change” . . . now it’s “climate disruption.”

    The warmists keep making prophecies that fail (50 million climate refugees by 2010, children won’t know what snow is, malaria will spread, etc.). Not only are they not true, but the OPPOSITE is happening!

    If they can’t tell us exactly what the risk is, why should we freak out about it? If their predictions keep failing, why should we trust them?

    • DaveyNC says:

      I just don’t understand what is so bad about a warmer world. A colder world would be much worse for mankind. Maybe some of you anti warming advocates can enlighten me?

      • Enter your name... says:

        They tried the “world is freezing” in the ’70s and it didn’t work. This is just a new tactic in a not-so-new strategy in an old, old game called “political control.” The payoff is in fame and fortune à la Al Gore or any one of a zillion publicly funded “green” start-up companies.

  3. David Finkelstein says:

    Your radio segment is on the topic of “why do people hold their particular beliefs about climate change.” Specifically, you cite that 95% of science shows that climate change is a real threat (which is true) and then ask, if there is so much evidence that the threat is real, why do half of Americans think that it is a hoax? This is a legitimate and important question.

    But the answer to the question is very straightforward. The 5% of scientists who claim that climate change is not real are doing junk science, and they are funded by the oil industry. If it were not for the industry funding, there would be 100% agreement among scientists. The industry has spent millions of dollars in a propaganda campaign to create the belief that climate change is a hoax, and the media is very powerful at creating irrational beliefs. If it were not for this propaganda campaign, no one would be talking about a “hoax.”

    I found it astonishing that you could leave out any mention of the propaganda campaign which created the belief in the first place. After discussing the issue solely in terms of people’s personal psychology, you mention that it has to do with a person’s “ideology,” but how can you have an economics program without mentioning the enormous amount of money the industry will make by manufacturing a belief that climate change is a hoax?

  4. John says:

    I find it ironic that all of these comments, for or against climate change, do not use any of the seven steps highlighted by Sandman to change the “outrage,” either increasing it or decreasing it. The article is not about climate change; it is about communicating risk and risk perception.
