I was taking my 10-year-old niece to see a movie the other night, and after she got into my car, we had an interesting conversation. “Zoe, please buckle your seat belt,” I said.
“We’re only going downtown, do I have to?” she asked.
“Of course you do,” I said. “Safety first.”
“But I’ve never been in an accident. It won’t happen. Let’s get going!”
Suffice it to say, Zoe buckled up, and we arrived without incident. We even enjoyed the movie. But she got me thinking: is the idea that nothing bad has happened before — and the conclusion that nothing bad will happen now — exclusive to children?
The short and unfortunate answer: No.
In its January 2011 report, the National Commission on the BP Deepwater Horizon Oil Spill and Offshore Drilling said that a root cause of the fatal explosion and subsequent colossal spill in the Gulf of Mexico was complacency. “The Deepwater Horizon disaster,” the report states, “exhibits the costs of a culture of complacency…There are recurring themes of missed warning signals, failure to share information, and a general lack of appreciation for the risks involved.”
Moreover, the report went on to detail how this can infect any culture, not just that of an energy giant. “These findings highlight the importance of organizational culture and a consistent commitment to safety by industry, from the highest management levels on down.”
The culture of complacency is not limited to oil rigs far out at sea. The recent recession — caused, in part, by an erroneous belief that the status quo would continue and the housing market would never decline — is one case in point. Nor is there anything new about the disastrous consequences of complacency; consider the overconfidence of the engineers who built the Hindenburg, or King Philip II of Spain’s misplaced confidence in his Armada’s ability to defeat Queen Elizabeth’s English navy. Complacency, after all, is a human characteristic.
The Merriam-Webster dictionary defines complacency as “self-satisfaction especially when accompanied by unawareness of actual dangers or deficiencies.” Complacency might be rooted in what psychologists call “confirmation bias” — the tendency to seek out or interpret information in a way that confirms currently held beliefs. It is likely that King Philip II listened only to ideas that supported his confidence in his Armada and rejected any contrary views.
Complacency can also arise from too much familiarity with a circumstance. A 2003 article by Steve E. Hrudey and William Leiss in the journal Environmental Health Perspectives offers some insight. “There is a risk of complacency developing among personnel in other sectors, from airport security to emergency response,” states the article. “They will predictably experience a preponderance of false positives in performing their routine screening responsibilities dealing with rare hazards.”
This manifestation of complacency is very familiar — even to children. While we discussed seat belts, Zoe happily reminded me of The Boy Who Cried Wolf.
It almost goes without saying that risk managers shouldn’t be complacent. Instead, they must help protect their organizations against this basic human behavior, which we can refer to as a new type of risk: complacency risk.
Let’s consider where this new risk fits into the greater scheme of threats facing today’s organizations. Many people may recall former Secretary of Defense Donald Rumsfeld’s famous taxonomy of “known knowns,” “known unknowns” and “unknown unknowns” — to which commentators later added a fourth category, “unknown knowns.”
Complacency about “known known” risks led to the collapse of Lehman Brothers, whose employees were aware of the risks associated with exotic derivatives yet ignored them. Complacency about “known unknown” risks led to the defeat of the Spanish Armada; King Philip II knew that he had no accurate assessment of the English fleet, but sailed for England anyway. Complacency about “unknown known” risks led to the questionable decision to inflate the Hindenburg with hydrogen, which burns readily, rather than helium, which does not.
And complacency about “unknown unknown” risks — the most dangerous of all — led to the spectacular failure of Galloping Gertie, a bridge over the Tacoma Narrows waterway in Washington state that famously twisted wildly in high winds before collapsing. Engineers later learned that steady winds had induced aeroelastic flutter — a self-amplifying twisting oscillation — that tore the bridge apart.
It is interesting that in all of these examples, the real culprit seems to be a decision to forgo what you could call a moment of insight. And these decisions seem to happen in just two ways: by saying “I don’t care” about known risks, or by saying “I have done enough thinking” about unknown risks. This makes sense, because complacency comes from a place of self-satisfaction — where “I don’t know and I don’t care” run rampant.
So what is the takeaway for risk managers? To me, it is the idea that complacency risk requires the consideration of residual risk. According to the Federal Aviation Administration’s System Safety Handbook, residual risk is the “portion of total risk that remains after management efforts have been employed,” and it “comprises acceptable risk and unidentified risk.”
As a risk manager, it is not enough that I have used enterprise risk management to identify, prioritize, treat and monitor risks. I am also required to consider the residual risks after this process and analyze whether more needs to be done. The natural tendency to become complacent — “I have done enough thinking” — is countered by asking, “Have I done enough thinking?” and “Am I ignoring residual risk?”
It is also important to consider whether my carefully crafted risk treatments have actually been implemented. If decision makers in my organization respond to those treatments by saying, “I don’t care,” that is tantamount to a decision not to treat the risk at all — and no matter how carefully the strategies were crafted, the risks remain untreated and the residual risk greatly increases.
So what is the solution when decision makers say that they don’t care about a risk? Well, if complacency is itself a risk, there are really only a few alternatives for addressing it: avoid, accept and monitor, transfer, reduce the impact, or reduce the likelihood.
Most alternatives are not feasible. Risk avoidance for complacency is impossible unless you replace every human in your organization with a robot. To my knowledge, robots do not become complacent.
Accepting and monitoring the risk is also not much of a solution. This treatment requires a practitioner to set a baseline monitoring condition which, if met, triggers another review of the risk for treatment. At best, it means putting off a decision on how to deal with complacency until a later date. At worst, it means that the deadly risk event will happen before a decision about what to do is ever made. This was probably one of Lehman’s errors.
Risk transfer is unlikely to be cost effective; a company that would contract to operate my organization would still need to address complacency risk.
Reducing the impact is, unfortunately, not always an option: complacency often attaches to substantial, game-changing risks whose events lie outside the organization’s control (Lehman and the global economy, or BP and highly pressurized oil a mile underwater).
To me, reducing likelihood is the only real solution. The likelihood of complacency will decrease if my organization creates a culture that embraces enterprise risk management. At a minimum, such an organization is far more responsive to risks and their treatments. You are very unlikely to hear an ERM-savvy CEO say, “I don’t care about these risks.”
So there you have it. Treat complacency risk by creating a high-functioning enterprise risk management culture. And hopefully, we will all remember to buckle up on the way to that destination.