Preventable errors rob profit and resources at every level of an organization. They undercut quality, safety, IT security and customer service. They create waste in otherwise useful systems and negate the return on investment from training, processes and procedures. To add insult to injury, the media may quickly discover an error and broadcast your embarrassing failure around the world, permanently damaging your brand and reputation in a matter of minutes.
In the April 2011 issue of Strategic Finance, Mark Frigo and Richard Anderson define strategic risks as uncertainties “that could inhibit an organization’s ability to achieve its...strategic objectives with the ultimate goal of creating and protecting shareholder and stakeholder value.”
Using this definition, there is little doubt that human error fits the bill. The question then becomes, how can companies reduce the frequency of human error?
The Cost of Error
“Every day in America, 13 people go to work and never come home,” said Secretary of Labor Hilda Solis in a speech on Memorial Day in 2012. “Every year in America, nearly four million people suffer a workplace injury from which some never recover.”
There are few business resources as dear, both socially and financially, as the people we work with. There are even fewer that have as great an impact on productivity when unavailable. Workplace injury not only takes an enormous human toll, it also leads to staggering costs. A director of research for a leading writer of workers' compensation insurance said that "though risk management strategies have reduced the frequency of workplace injuries and fatalities, they continue to represent a huge burden for U.S. businesses at over $50 billion in annual direct costs."
Other errors also cost money—lots of money. The global analyst firm IDC, in a 2008 white paper, examined human error in the form of “employee misunderstanding” and its financial impact on 400 U.K. and U.S. businesses.
“Large enterprises are each potentially losing tens of millions of dollars to what is termed employee misunderstanding,” stated the paper. It defined employee misunderstanding as actions by employees who have misunderstood or misinterpreted company policies, business processes, job functions—or a combination of the three.
The average cost of this misunderstanding, at a company with 100,000 employees, is $62.4 million per year. Combined, U.K. and U.S. enterprises are losing an estimated $37 billion every year. The cost of intangibles—like reputation or customer trust—could have even greater consequences.
The health-care industry knows these costs all too well. More than a decade ago, the Institute of Medicine issued a now-famous study on medical error (“To Err is Human: Building a Safer Health System”) that revealed dramatic statistics. At least 44,000, and up to 98,000, preventable deaths occur annually as a result of medical errors in U.S. hospitals.
These numbers, if accurate, would make medical error the eighth-leading cause of death in America—and this figure does not even include errors in the outpatient setting. It would rank the lethality of U.S. hospital errors ahead of motor-vehicle accidents, breast cancer and AIDS. In the last 10 years, these estimates have escalated to over 200,000 deaths per year, according to various studies by Health Grades, a U.S. company that develops and markets quality and safety ratings of health-care providers.
Another error-related threat comes from cybersecurity. After massive attacks on corporate giants like Google, Amazon, Citibank, JPMorgan, Sony and Lockheed Martin, strategic risk managers have realized that IT security is as much their concern as financial performance is.
While most of the buzz in the IT industry remains centered on sophisticated hackers, multiple studies show that the vast majority of data breaches are caused by human error. A 2011 study from the Ponemon Institute, a tech research nonprofit, revealed that organizations lose an average of $332 million in brand value in the year following a data breach.
This fact is compounded by Cisco findings that younger workers do not consider themselves accountable for IT security. The company's 2011 "Connected World Technology Report" found that 61% of the college students and young professionals surveyed do not think they are responsible for protecting corporate information. Worse still, 70% admitted to violating company security policy, and 80% either think restrictions on use of social media in the workplace are outdated or do not know whether such restrictions exist at their job. "This is our future workforce," said Scott Olechowski, Cisco's security and threat research manager.
Of course, not all brand setbacks are tech-based. Recent years have given us a host of spectacular brand breakdowns related to human failings (Penn State football, BP, Apple maps). And reputation disasters do not even have to be spectacular. In December 2011, for example, a funny-but-sad clip of a FedEx delivery man pitching a computer monitor over a fence attracted nearly 200,000 viewers on YouTube in a single day. It became fodder for late-night television comics and forced FedEx to launch a costly public relations campaign.
White Flags of Surrender
Surprisingly, even with huge related losses, human error remains below the radar. To understand why, look at the culture of malaise. What have companies been taught to expect from human capital? Not much.
The most basic problem is that the battlefield needed to wage the struggle against human error is strewn with an attitude of defeatism and decades of techno-hubris; companies cling to the belief that they can engineer away the effects of human error. To some degree, many have already capitulated to the idea that "to err is human" and that current losses are simply "the cost of doing business."
To challenge the status quo, organizations must challenge the premise that the human is the weakest link in the workplace. On the contrary, when properly prepared, people are not something to be protected against; they are the strongest part of the performance equation. While technology, technical training and culture are important, the individual mind is more critical.
Not everyone believes this. There are reams of evidence—in recent writings and public statements given by highly respected professionals (who will remain nameless here)—that this view has become pervasive. “People will always make mistakes, that’s a given.” “Trying to stop human error is a fool’s errand.” “It’s easier to change situations than people.” “It is easier to manage error than to prevent it.” “Unfortunately, we are forced to work with the crooked timber of human fallibility.”
While there is a grain of truth in each of these opinions, the general premise is a flag of surrender. The alternate view, however, will help companies improve the vigilance and attention to detail of their workforce. Many are doing so already.
From Tactical to Strategic
Until recently, human error has been approached indirectly, if at all. Throughout the industrialized world, there have traditionally been a few different responses.
They can be broadly lumped into the following five categories:
- Punish the individual who made an error and create remedial training
- Emphasize accountability by blaming leaders for failures that occur on their watch
- Create teamwork strategies to capture or contain errors through better communication
- Establish systemic approaches that put multiple layers of protection in place to avoid or respond to errors (the so-called “Swiss cheese” model)
- Introduce cultural approaches that focus on creating and sustaining social factors
In spite of these efforts, human error remains responsible for 60%-80% of failures, accidents and incidents in most high-risk industries. Human performance experts continue to struggle to find a broad-spectrum antibiotic to cure the human error “disease.”
If we extend this metaphor, we can shed some light on why it is not working. Human error is not a common infection that can be fought with a broad-spectrum antibiotic; it is more like a virus that the immune system must handle from within. And when you’re fighting a mutating virus, every battle is an inside job, won or lost at the individual level. As we continue to search for a more effective remedy against human error, individuals must be taught to see and defend against common error-producing conditions.
In recent years, organizations as varied as the U.S. Marine Corps, Federal Express and the FBI have begun to embrace “empowered accountability,” a concept where individuals are trained to recognize common error types and mistake-producing conditions. They are then urged to study their own performance—on and off the job—to recognize the types and frequencies of errors that they make. By making error control “a life skill first and a job skill second,” the new information and skills cross the work/home threshold and consistently reinforce themselves. The early results of this approach have been promising.
Another productive attempt to curb error is quality, safety, security and customer-service analytics, which are mined for patterns of errors. In a broad sense, this approach is reminiscent of W. Edwards Deming’s “the data will set you free” mind-set that revolutionized the quality movement in the 1980s.
Most critically, where organizations are having success, the issue of human error is no longer treated as an embarrassment. It is viewed as another ever-present risk that must be managed strategically.
A Short History Lesson: 216 BC
On August 2, 216 BC, the two largest armies in the civilized world stood face to face on an open plain near the mouth of what is now the Ofanto River on Italy’s east coast. The fate of the civilized world hung in the balance.
The Romans held the better ground and had almost twice as many troops as their adversary. Nearly 80,000 armed men stood in three bristling lines of attack. Opposing the Roman juggernaut was a far weaker adversary in an inferior tactical position. With the river on one side and the ocean to the rear, an estimated 49,000 Carthaginian forces—mostly mercenaries who did not even speak a common language—prepared for what appeared to be a crushing defeat from the Roman sledgehammer.
But the Carthaginian general Hannibal Barca knew his adversary, a hot-headed Roman general named Varro, and through a series of maneuvers designed to embarrass his rival, he drew the entire Roman center into an unwise advance.
Less than four hours later, nearly 60,000 Roman soldiers lay dead or dying on the ground near the village of Cannae. They were the victims of poor decisions born of two common human failings: ego and anger. Perhaps even more important than the errors committed is the fact that the Carthaginian commander predicted and induced these errors to defeat a far superior force on unfavorable terrain.
Therein lies the lesson for today.
In 216 BC, Hannibal was one of the few leaders in the world who understood the intricacies of human error and how to leverage them to his advantage. Today, that information is becoming available to all. Over the past two decades, human error research has expanded exponentially. The causes and effects of error have been studied and codified. The next step is behavioral change for all—individuals, team players and leaders. Human error is no longer the shadowy, ill-defined foe it once was, yet few have utilized the new discoveries to strategically attack error as a part of an enterprise risk management system.
In the near future, companies that continue to let human error rot their strategy from the inside out may suffer a fate similar to that of the Romans. Conversely, those that recognize this area as ripe for improvement may be able to outmaneuver their rivals—no matter how large the adversaries in their industry might seem.