Risk and Impact
As much as I believe that a risk-based approach to cybersecurity is the way to go, I still feel a chill down my spine when I see the results of some risk assessments. I believe we are getting increasingly better at the overall estimation of the likelihood of an event. The impact side of the equation, however, quite often looks way off, and the results of the exercise end up being a nice piece of wishful thinking.
Risk assessments are usually performed on limited scopes, such as specific applications, projects or technology environments. The impact assessment for those usually limits the impact to losses related to that scope and the data flowing through or stored in that environment. The most conscientious assessors will also consider indirect losses like reputation impact (Secondary Loss Factors in FAIR). Still, I have a strong feeling (in fact, I'm basing this whole point on anecdotal evidence) that those assessments grossly underestimate the interconnectivity and cross-exposure that currently exist between technology environments.
If we look at recent high-impact breaches, such as what happened with HBGary, Target and Sony Pictures, the initial compromise is usually related to areas or systems considered of low business value or risk. From a low-importance Content Management System to HVAC systems, the list of good examples to illustrate the point keeps growing. Nevertheless, I keep wondering what would have happened (or what did happen) if those systems had been subject to risk assessments by the average risk assessor, using the most common methodologies. I wouldn't be surprised to see a lot of green or 'Low' labels used in the final reports.
My point is that risk assessments are vastly underestimating the interconnectivity aspects of today's networks and technology environments. From obvious interconnection points to more subtle cases of administrative password reuse, the fact is that seeing low business value assets compromised as a way to reach more interesting targets shouldn't be an unexpected story or a 'black swan' to the victims. However, given the way risk assessments are being conducted, it seems we are doomed to see it happen over and over again. We need to fix how those risk assessments are being done.
The solution involves many aspects. First, some organizations are still using risk assessment methodologies that don't support or cannot incorporate more refined information about impact. Some of those just use a single number for the impact, without considering ranges or the fact that the distribution of potential impact values won't necessarily be uniform. Even when the full impact of a breach is considered, including the worst-case scenario, it's still important to understand that potential impact values have likelihoods of their own. Certain values or ranges of values are more likely than others. When methodologies consider only an average or a worst-case scenario, they ignore very important information that should be used to properly reflect the resulting risk. Picture the difference between representing the potential impact as a single dot on a chart versus a curve (a bell curve, for example).
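To make that difference concrete, here is a minimal Python sketch, illustrative only, with made-up parameters for a hypothetical system: a Monte Carlo sample from an assumed lognormal loss distribution, showing how a single 'average' figure hides the tail values that represent the real worst-case exposure.

import numpy as np

# Illustrative only: assumed parameters for a hypothetical system's breach impact.
rng = np.random.default_rng(42)

# Assume impact follows a lognormal distribution (heavy right tail),
# a common modeling choice in quantitative risk analysis.
median_loss = 50_000   # assumed median impact in dollars
sigma = 1.2            # assumed spread; higher sigma = fatter tail
losses = rng.lognormal(mean=np.log(median_loss), sigma=sigma, size=100_000)

print(f"Single-point 'average' impact: ${losses.mean():,.0f}")
print(f"Median impact:                 ${np.median(losses):,.0f}")
print(f"95th percentile impact:        ${np.percentile(losses, 95):,.0f}")
print(f"99th percentile impact:        ${np.percentile(losses, 99):,.0f}")

A report that carries only the first number discards exactly the tail information that separates a 'Low' label from a headline-making breach.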
The second important aspect is the people behind the assessment. Risk assessors are usually blind to the worst-case scenarios and the technology components that make them possible. To make things more complicated, some assessors can see those scenarios but are not capable of understanding the subtle components that affect the likelihood of each case. It is impressive how often risk assessors are unaware of how a breach or intrusion actually happens. It would be very valuable for those professionals to learn these aspects by performing or observing penetration tests and red team exercises. The difference in understanding of how things can escalate between those with pentesting experience and those without it is striking.
Risk-based security is the way to do things; I'm not trying to suggest something different here. However, the most important part of the process, the risk assessment, has to be fixed so we won't keep seeing foreseeable events treated as 'black swans'.