Center for Biosecurity, UPMC
Bulls, Bears, and Birds

Conference sponsored by: Center for Biosecurity, Deutsche Bank, and The Contingency Planning Exchange


Risk Communication Before and During Epidemics
Peter Sandman, Risk Communication Specialist


Let me tell you the basics of risk communication, and then I want to apply them, a little bit, to bird flu. The fundamental principle of risk communication can be summarized in a single number: the correlation between how much harm a risk does and how upset people get about it. If you look at a long list of risks, rank them in order of how upset people get about them, then rank them again in order of how much harm they do, and then correlate the two rankings, you get a glorious 0.2.

Those of you who remember your statistics know you can square a correlation coefficient to get the percentage of variance accounted for: If you square 0.2, you get 0.04, or 4% of the variance.
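The arithmetic above can be sketched in a couple of lines. This is just an illustration of squaring a correlation coefficient to get the share of variance explained; the function name is mine, and the 0.2 figure is the one quoted in the talk.

```python
# Square a correlation coefficient r to get the fraction of variance
# in one variable accounted for by the other (the coefficient of
# determination, r squared).
def variance_explained(r: float) -> float:
    """Fraction of variance accounted for, given correlation r."""
    return r ** 2

# The talk's number: a 0.2 correlation between harm and upset
# accounts for only about 4% of the variance.
print(round(variance_explained(0.2) * 100, 1))  # prints 4.0
```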

That is, the risks that kill people and the risks that upset people are completely different. If you know that a risk kills people, you have no idea whether it upsets them or not. If you know it upsets them, you have no idea whether it kills them or not.

If you replace mortality with morbidity in the calculation -- you're not killing people, you're just making them sick -- our correlation remains 0.2. If you use ecosystem damage, the correlation is once again 0.2, and if, as this group likes to do, you correlate economic damage with public concern, the correlation is 0.2.

It doesn't seem to matter what your measure of harm is. Whatever your measure of harm, across a wide range of risks, the correlation between how much harm [a risk is] going to do and how upset people are going to get is this absurdly low 0.2 correlation.

In the mid-80's, I took these two concepts -- how dangerous [a risk] is and how upset people get [about it] -- and I called the first one "hazard" and the second one "outrage." The word "outrage" applies more readily to the environmental controversies that I was working on at the time than it does to the kind of public health issues we're focused on today, but the terminology stuck, so what we've got is this glorious 0.2 correlation between hazard and outrage.

It's worth noticing, by the way, that the correlation between hazard and perceived hazard is also very low, but the correlation between outrage and perceived hazard is very high. Now, as soon as you have a high correlation, of course, what you want to know is: What's the direction of the causality? That is the question we're asking when we look at the high correlation between whether people get upset and whether they think [a risk is] dangerous. Are they upset because they think it's dangerous, or do they think it's dangerous because they're upset?

That's an important question, because if you want to manage the system, you have to know which one is the cause and which one is the effect. You don't want to be in the awkward position of trying to manage a cause by manipulating the effect. That's not likely to work. So you need to know the direction of the causality. This is much studied, and as usual in social science, it turns out to be a cycle, but one of the arrows is very robust and the other arrow is very weak. The strong arrow is from outrage to hazard perception. That is, for the most part, it is not true that people are upset because they think [a risk is] dangerous; it's much more true that people think [a risk is] dangerous because they're upset.

The same is true in the negative: It's not true that people are calm because they think [a risk is] safe; it's much more true that people think [a risk is] safe because they're calm. It follows, [therefore], that if you want people to think [a risk is] dangerous, then you'd better get them upset, and if you want them not to think [a risk is] dangerous--if you want them to think it's safe--then you need to calm them down.

The outrage is the engine in this relationship, and the hazard perception is very much a result of what's happening to the outrage.

Given this reality, imagine a 2 x 2 table of hazard against outrage. In one corner, we have the high hazard/low outrage risk, where people are very endangered but not very upset. The task [in this setting] is precaution advocacy: You try to persuade them to take the risk more seriously so they'll take precautions. In the opposite corner, you have high outrage/low hazard, [where people are] very upset but not very endangered. The risk communication task [in this setting] is outrage management--to try to reduce the outrage so they'll stop wasting their time on this trivial hazard. Those are the two mismatches.

In the third corner, high hazard/high outrage, you have crisis communication. People are upset, and they're right to be upset because they're endangered. Crisis communication is entirely unlike the other two [settings]. In precaution advocacy, you want the outrage [to be greater]. In outrage management, you want the outrage lower. In crisis communication, the outrage is just fine.

But there's still a lot of communication to be done. If you were communicating in the Superdome a few weeks ago, or if you were communicating in Lower Manhattan a few years ago, there was a great deal of communication to be done, but it wasn't telling people to calm down, and it wasn't telling people to get excited. It was helping people bear their outrage and take wise rather than unwise precautions in the face of their outrage.

And finally, just to complete this 2 x 2 matrix, in the low hazard/low outrage corner, I have not found a way to earn a living. I do the other 3 for a living, and I think it's worth noticing that they are 3 different skill sets. You can have all of them, just as you can be a good carpenter and a good electrician at the same time, but you'd better bring the right tool kit to the task. If you think it's a carpentry task when it's really an electrical task, you're going to screw it up.
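The 2 x 2 map described above can be written down as a small lookup. This is only a sketch of the four corners as the talk lays them out; the function name and boolean framing are my own illustrative choices, not Sandman's.

```python
# The hazard/outrage 2 x 2 from the talk, as a simple classifier.
# Each corner maps to a different risk communication task.
def risk_comm_task(high_hazard: bool, high_outrage: bool) -> str:
    if high_hazard and not high_outrage:
        # People endangered but not upset: persuade them to take it seriously.
        return "precaution advocacy"
    if high_outrage and not high_hazard:
        # People upset but not endangered: reduce the outrage.
        return "outrage management"
    if high_hazard and high_outrage:
        # People upset, and right to be: help them bear it and act wisely.
        return "crisis communication"
    # Low hazard, low outrage: the corner where, per the talk,
    # there's no living to be earned.
    return "no task"

print(risk_comm_task(high_hazard=True, high_outrage=False))
# prints: precaution advocacy
```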

Figuring out where you are on this map of hazard against outrage is the beginning of risk communication.

So where is bird flu on this map? Where does avian influenza live? And of course all that means is, how serious do you think the risk is and how concerned do you think your public is? Those are the two questions.

Clearly, when it happens, when we are in the middle of a pandemic, it's going to be crisis communication. Everybody sees why that's true: People are going to be endangered, and they are going to feel endangered, and we are going to be talking to them about what they can do. What can they do to protect themselves? What can they do with their feelings? How can they get through this incredibly tough time?

One of the more obvious risk communication issues here is that we all need -- you folks need, the government needs, everybody needs -- a standby pandemic crisis communication plan. I will tell you that the crisis communication plans for avian flu I have seen so far are universally horrible. I've only seen a few, and they're all awful, and that includes the one that HHS produced a year or so ago.

I was delighted to hear this morning that HHS has a revised pandemic management plan. Hopefully it will include a revised pandemic communication plan. We'll see. But I have hope, a slight hope, that maybe they have good standby pandemic crisis communication plans, and they're secret. I would rather have a government that failed in transparency than a government that was flat-out completely unprepared to talk to people in a pandemic influenza crisis.

I think it's clear to everybody in this room that one of the things that went very badly awry in New Orleans a few weeks ago was precisely that we were unprepared to talk to people in a crisis. So that's a task, [and] it's an important task. I think we're not in good shape on that dimension at all.

But communication in the middle of a pandemic isn't what I want to talk about now.

The question I want to focus on is, where are we now with respect to pandemic flu? Where are we on this map? And of course [the answer] depends on whether you think the risk is serious [or not]. If you think the risk is serious, then we're in the precaution advocacy corner, where we are trying to persuade people to take it seriously.

But what if you think the risk isn't serious? What if you're worried that people are coming to take it too seriously: "My God, PBS and ABC talked about avian flu in the same week! That's a trend!" If you're worried about that, you might say, as Marc Siegel, the much-feted author of False Alarm, has said, that the problem is that people are excessively worried about avian flu, and we need to do some outrage management and get them focusing on something serious.

It's always amusing to see that happen because, as many of you know, flu is usually the example offered by people who don't want the public worried about something else -- the thing people supposedly should have been worried about instead.

So, when people are worried about SARS, there's always going to be some commentator to say, "My God! Look how many people flu killed, and we're not worried about flu, so why should we worry about SARS?" Same thing with West Nile Virus. Now, at last, people are worried about flu, and it turns out the same commentators don't want them worried about that either.

There are people out there who genuinely believe that avian flu isn't a serious problem, and for them, [the task is] outrage management. Presumably, most of the people in this room would agree that the task is precaution advocacy.

However, and this is important, there's another group of people who think that precaution advocacy isn't the task. [There are] people who do think avian flu is serious but don't think the public should take it seriously. That's a position held by a number of people in the government and a number of people in a number of governments who argue that, yes, we the government are going to prepare, but for God's sake don't tell the public, because . . . they might get excessively frightened, and that might be bad for their psychology and bad for the economy. God forbid people should be afraid just because they're going to be dead. As the economists earlier on pointed out, it doesn't hurt the economy all that much for a lot of people to die, but if a lot of people get frightened, that's bad for business! So, there's a sense that we dare not frighten people. The other half of the argument says, "It's serious, but let's not say so," [because] there's nothing for people to do anyhow.

So those are the two arguments, it seems to me, that are given for not talking to the public now about avian flu -- the arguments for having lots of plans about where we're going to put all the dead bodies, for example, but deciding not to tell the people who are going to be the dead bodies. We don't want to scare them, and there's nothing for them to do anyhow. What I want to do with the rest of my time is rebut those two arguments. [Or] I'll rebut one of those two arguments, and I'll assign the other one for homework.

So the question is, why do I think it's all right to scare people about avian flu? And let me describe for you, very quickly, some of the errors that underlie the view that we dare not frighten people -- errors that underlie official fear of fear.

First is the false expectation that fear will inevitably escalate into panic.

I wish I had ample time to demonstrate to you that panic is rare. Even in New Orleans, where the circumstances were as conducive to panic as any I've seen in a long time, there isn't that much evidence of panic. There's lots of evidence of panicky feelings. There's lots of evidence of misery. There's lots of evidence of lots of things going wrong. But if you ask yourself which was a bigger problem in New Orleans, people so frightened they couldn't think straight, or people insufficiently frightened who didn't get out of town, I think you can make a very strong argument that the latter was a bigger problem than the former.

Now, you do always have, in a crisis, lots of people feeling panicky, but people behaving in panicky ways is relatively rare. New Yorkers know this better than anybody because we have the lesson of 9/11. In the stairwells of the World Trade Center, people were more courteous than New Yorkers usually are, and more organized than New Yorkers usually are, and there were very few signs of panic among those who evacuated the Twin Towers. . . . When you interview the survivors, the vast majority tell you they panicked, but they didn't. They're wrong. They felt like panicking, and they did just fine.

Panic, in short, is rare. But official "panic panic" is common. That is, officials often imagine that the public is panicking or about to panic. And in order to allay panic, officials sometimes do exactly the wrong thing from a crisis communication perspective: They withhold information, they over-reassure, they express contempt for public fears, etc.

Panic is quite rare. What's quite common is denial; denial is why panic is rare. We are organized such that, when we're about to panic, we trip a circuit breaker instead and go into denial. There are good reasons for being worried about bird flu denial. There's very little reason, in my judgment, for being worried about bird flu panic -- with the possible exception of in the middle of the crisis, where that's going to be an issue. But it's not an issue now. So, I think that's the first misperception.

I obviously don't have as much time for the others, but let me list some of them for you quickly:

  • The mistaken belief that people cannot tolerate their fear. There are some people who can't; the vast majority of people can.
  • The underestimation of the frequency with which fearful people rise to resilient, pro-social, and even heroic behavior. We had ample evidence of that in 9/11, and I won't belabor the point.
  • The failure to recognize the positive value of fear in encouraging preparedness, vigilance, tolerance of inconvenience and expense, and so forth.

The relationship between fear and precaution-taking is an inverted U-curve. If people are insufficiently afraid, they don't take precautions. If people are excessively afraid, they don't take precautions. They don't panic either. They go into denial and sit around saying, "It'll happen to somebody else." But between apathy and the excessive fear that leads to denial is a range in which [people are] getting more concerned and are, therefore, doing more about [the risk]. So it's completely inconsistent to say we want the public to prepare, [but] we don't want the public to be frightened. The main incentive for people to prepare is becoming frightened.
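The inverted-U relationship just described can be sketched with a toy curve. The quadratic form below is purely an illustrative assumption of mine -- the talk gives no functional form -- but it captures the shape: precaution-taking peaks at moderate fear and falls toward zero at the apathy and denial extremes.

```python
# Toy inverted-U: precaution-taking as a function of fear level.
# The quadratic 4 * f * (1 - f) is an illustrative assumption, not a
# fitted model; it peaks at f = 0.5 and is zero at f = 0 and f = 1.
def precaution_taking(fear: float) -> float:
    """fear in [0, 1]; returns relative willingness to take precautions."""
    return 4 * fear * (1 - fear)

# Apathy (low fear) and denial (excessive fear) both yield little action;
# moderate concern yields the most.
for f in (0.1, 0.5, 0.9):
    print(f, precaution_taking(f))
```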

There's also the failure to understand that the initial burst of fear on first encountering a new piece of alarming information is temporary. Psychiatrists call this an "adjustment reaction." The normal reaction when you first discover that something bad is going to happen is to overreact temporarily. You become more vigilant. You stop doing things that look like they may be dangerous. If it's 9/11, you stop flying in airplanes. If it's SARS, you stop going to Chinese restaurants. Then there's a hepatitis outbreak in a Mexican restaurant, and you say, "All right, I'll go back to Chinese restaurants." In the short term, though, you went through an adjustment reaction.

The adjustment reaction is a rehearsal. It's a logistical rehearsal, and it's an emotional rehearsal, and the evidence is that people who have an adjustment reaction have two big advantages over people who don't. The first advantage is they are likelier to do the right thing in the crisis because they have rehearsed, and the second [advantage] is that they are more likely to notice if the crisis doesn't happen because they have rehearsed. They gear up better, and they stand down better. Ideally we want as many people as possible to have this adjustment reaction beforehand, so they'll be used to the idea if and when the crisis comes, and they'll be ready to take appropriate action.

A final error that underlies official fear of fear is the notion that when you make people more afraid of bird flu, you are making them more fearful people. That is flat-out not true. You are as fearful as you are. You get slowly more fearful as you get older. Remember your teenage years if you doubt this. But your fearfulness changes over the decades, not over the days and weeks and months. For the most part, we are who we are. In a crisis we become very temporarily more fearful -- that's the adjustment reaction -- but we quickly settle into the New Normal.

But even though our fearfulness can't be increased for long, it can be reallocated. It is fungible, to use a word you folks like to use. Greenpeace wants us afraid of genetically modified food, and the Christian Right wants us afraid of gay marriage, and I want us afraid of H5N1. You should not think of any of those three as trying to make people more afraid. What we are doing is competing with each other for our slice of the fearfulness pie.

So when you try to frighten people about bird flu, you're not changing who they are. You're trying to get more of their fearfulness for your issue. As evidence of that, after 9/11, telephone calls to pollution hotlines plummeted. People who were worried about terrorism didn't have the emotional energy left to worry about pollution. Little by little, as we settled into the New Normal, pollution hotline calls rose again, but not back to their pre-9/11 levels, because terrorism now has a smaller chunk of our concern than in the months immediately after 9/11, but a larger chunk of our concern than before 9/11.

That's what we're trying to get for bird flu. Let's do it now, because we're going to need it later. Thank you.
