One of the side effects of reading and writing about risk management for the past decade or so is that I have been conditioned to see the risk in everything. So far, this hasn’t turned me into a risk-averse shut-in ready to run from my own shadow; it has simply made me more aware of all the things in the world that are out to get me and given me a healthy sense of what you might call “realistic pessimism” (or maybe “pessimistic realism”). As Joseph Heller wrote, “Just because you’re paranoid doesn’t mean they aren’t after you.” At least risk managers are on the job, doing their best to make sure the bad things that could happen don’t. (And yes, that is shameless flattery on my part.)
What is troubling, though, is how so many people seem unable or unwilling to recognize risks in their own lives and take common-sense measures to avoid their ill effects. The internet is full of stories and images of would-be Darwin award-winners doing things that seem designed to remove them from the gene pool in the dumbest ways possible. It’s easy to chalk this up to, at best, a lack of awareness, or, at worst, plain stupidity, but there may also be a third culprit behind poor decision-making and illogical responses to risk: our traitorous brain and the cognitive biases it cooks up for us.
In a recent paper in the journal Health Promotion International, economists Frederick Chen of Wake Forest University and Ryan Stevens of New York University looked at these errors in judgment in the context of flu vaccination. The CDC reports that, between 1976 and 2007, anywhere from 3,000 to 49,000 people in the United States died each year from the flu, and the vaccine could cut the risk of illness by 60%. Yet the majority of Americans do not get vaccinated.
People cite a variety of reasons for not getting the flu vaccine: they don’t think they need it, they doubt that it works, or they believe the shot will actually give them the flu. The researchers suggest, however, that these beliefs are often the product of certain cognitive biases that create misconceptions and interfere with public health messages.
For instance, most people tend to notice adverse events—in this case, when the vaccine doesn’t work. If the vaccine is effective and no one gets sick, that is simply considered a normal state and doesn’t spark interest. This is called the “availability heuristic”: people overestimate the importance of the most recent or noteworthy examples when making decisions or forming opinions. It would be like not believing in climate change because it was cold one morning, or dismissing the risks of smoking because you read about a healthy grandmother who smokes a pack a day. In the case of flu vaccines, the availability heuristic leads a person to overestimate the likelihood that the vaccine won’t work and conclude that getting it isn’t worth the trouble.
Risk management itself faces a similar problem: in the ideal state, when risk management is effective, nothing bad happens and the status quo is maintained. It is extremely difficult to prove the value of “nothing,” so no one takes notice. But if things go south because the company was blindsided by some problem, risk management takes the blame. As with the vaccine, the thinking goes: if it doesn’t work, why bother?
Thus, risk managers (and vaccine proponents) need to take cognitive biases into account when trying to convince their constituents of a course of action they know to be beneficial. That might mean creating new, more vivid examples of successful programs to educate people and reset the availability heuristic in their favor. After all, if you haven’t been immersed in the field to the point that you see risk in everything, you could just as easily see risk in nothing.