Are Risk Assessments Increasing Your Risks?

Chris Cronin


March 2, 2015

The “observer effect” is a term from the scientific community referring to how “the act of observing influences the phenomenon being observed.” The concept applies equally to the information security industry: sometimes the instruments information security professionals use to analyze and measure have an unintended effect on what is being studied. And so it goes with information security risk assessments: if not prepared carefully, they may increase your risks and carry profound legal ramifications that few professionals are even aware of.

Some of these unintended risks are obvious. For example, if a risk assessor’s notes and reports fall into the wrong hands, they could create serious hazards for the assessed organization. Bad risk analysis can also cause organizations to ignore foreseeable breaches, delay the repair of poorly understood but critical weaknesses, or spend scarce resources on hyped vulnerabilities that pose little actual risk. In short, misreading the data can lead to misdirecting resources, which in turn can exacerbate existing problems or even create new ones.

Another poorly understood risk of information security risk assessments is ignorance of the law: conducting an assessment without understanding its legal implications can increase an organization’s liabilities.

In a 2001 Journal of Legal Studies article, W. Kip Viscusi detailed a study he conducted while a professor at Harvard Law School. Viscusi was trying to determine whether juries are more or less reasonable than judges when considering the liability of organizations that make risk-based safety investments. Viscusi’s research used the “Hand rule” as the standard for reasonableness. Used in many liability and negligence cases, the Hand rule states that the burden of a safeguard should not be greater than the probability multiplied by the loss that a risk could create. Note the rule’s similarity to the definition of risk that is commonly used among risk assessors: Risk equals likelihood multiplied by impact.
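The Hand rule comparison can be written as a one-line check. This is an illustrative sketch with hypothetical dollar figures, not numbers from Viscusi’s study:

```python
def safeguard_is_reasonable(burden, probability, loss):
    """Hand rule: a safeguard is reasonable (and skipping it may be
    judged negligent) when its burden does not exceed the expected
    loss, i.e. burden <= probability * loss."""
    return burden <= probability * loss

# Hypothetical figures: a $50,000 safeguard against a breach with a
# 10% annual likelihood and a $1,000,000 impact.
# Expected loss = 0.10 * 1,000,000 = $100,000, so the safeguard
# passes the test.
print(safeguard_is_reasonable(50_000, 0.10, 1_000_000))  # True
```

Note that this is the same arithmetic as the risk assessor’s likelihood-times-impact formula, just compared against the cost of the safeguard.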

Viscusi’s study presented nearly 500 jury-eligible people and a set of judges with scenarios depicting an organization’s risk-based decisions about investing to repair a known product vulnerability.

In one scenario, the participants were asked to assume the role of the organization. Their assessment identified a risk of property loss that, according to the Hand rule, would appear acceptable: the estimated likelihood and impact were lower than the cost of reducing the risk. When asked whether they would invest in reducing that risk, the majority of the jury-eligible participants said they would, while the majority of judges said they would not.

In another scenario, the participants were asked to play the role of the jury. In this scenario, a number of people died as a result of a risk that was appropriately protected according to the Hand rule. In this case, the majority of both jury-eligible participants and judges ruled against the organization, but the damages levied by the jury-eligible participants were many times higher than those the judges considered appropriate.

Most concerning is that the jury-eligible participants selected higher penalties against hypothetical companies that conducted risk assessments than against those that did not. Consider the importance of this: the very existence of a risk assessment increased an organization’s liability under jury review. Viscusi’s research showed that potential jurors were soured by the idea that an organization would weigh a safety investment against the value of a human life. Judges showed that bias only in extreme cases.

Risk assessments are required by information security and privacy regulations and guidance frameworks, but they may increase liability in jury trials. Viscusi found that lay persons simply are not familiar with the principles of risk management. So while our risk assessments help companies comply with regulations, they could be setting those companies up for outsized damages if security breaches go to court. Short of tort reform that permits only judges to award damages in negligence cases, risk assessors must prepare organizations for this liability risk.

When conducting a risk assessment, think carefully about how a jury will look at the risk register. If impacts are defined only in terms of potential harm to the organization, such as lower profits or market value, change that. Make sure your risk assessments also calculate impacts to interested parties, including customers, employees, business partners and the public. Also make sure investments in security safeguards are made in balance with those impacts.
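One way to reflect this advice in a risk register is to score impact separately for each interested party and rate the risk on the worst of them, so harm to customers or the public is never averaged away by low harm to the organization. The field names and 1–5 scales below are hypothetical, for illustration only:

```python
# Hypothetical risk-register entry: impact is scored per interested
# party, not just for the organization itself.
risk = {
    "name": "Customer database breach",
    "likelihood": 3,          # 1 (rare) .. 5 (frequent)
    "impacts": {              # 1 (negligible) .. 5 (severe)
        "organization": 2,    # e.g. remediation cost, lost revenue
        "customers": 5,       # e.g. identity theft, fraud
        "business_partners": 3,
        "public": 2,
    },
}

# Rate the risk on the worst impact to ANY party.
score = risk["likelihood"] * max(risk["impacts"].values())
print(score)  # 15, driven by the severe impact to customers
```

A register built this way shows a jury that harm to others, not just to the bottom line, drove the organization’s safeguard decisions.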

Are acceptable risk levels defined in terms that are odious to lay people? If the organization is accepting risks that would cause actual harm to individuals or organizations, change that too. Ensure that acceptable risks are no greater than a measurable, known variance that cannot be reduced regardless of the size of the investment. Confirm that managers, personnel, attorneys, auditors and the public understand that the organization knows no amount of investment will prevent all harm, so an acceptable risk is one that cannot be meaningfully reduced by further investment. Also, make certain that this reasoning is explicit in the risk assessment reports and documents.

Are your respondents estimating risks casually? Ensure that risk analyses are based on available evidence, and that they are rerun regularly using the best and latest information.

Finally, see that legal counsel is involved in risk assessments and is aware of Viscusi’s research. In the unlikely event that the organization faces a negligence case, counsel may work diligently to have a judge hear the case, or to make sure the jury is fully prepared to understand the role of the risk assessment.

Risk assessments are important not only because they are required, but because they help to secure a balance between our concerns with what may go wrong and our ability to invest against those threats. But if we are unaware of the potential harm that can come when strangers read our risk register, or if we do not write risks so that they reflect the obligations we owe others, then, ironically, we are allowing our risk assessments to increase risks.

Chris Cronin is a principal consultant at HALOCK Security Labs.