Friday, October 31, 2014

Availability bias

So you are reading about the Russians, APT28, and thinking that that's where you should be putting your protection efforts, right? Not necessarily. You are probably just another victim of the availability bias:

The availability heuristic is a mental shortcut that relies on immediate examples that come to a given person's mind when evaluating a specific topic, concept, method or decision. The availability heuristic operates on the notion that if something can be recalled, it must be important, or at least more important than alternative solutions which are not as readily recalled. Subsequently, under the availability heuristic people tend to heavily weigh their judgments toward more recent information, making new opinions biased toward that latest news.
(Wikipedia) 

This is one of the cognitive biases studied in Behavioral Economics. It shows up frequently in our risk assessments, especially those in our day-to-day routine. We exaggerate the risks of anything related to the latest tragedy in the news. It happens with terrorist attacks, mass shooters, airplane disasters, diseases (Ebola?) and many other things. Things that are present (available) in our minds tend to look more likely to happen, skewing our risk perception.

What should you do about it?

First, keep cool when faced with new information. Look at it from the perspective of everything else you already know, and try to use data resources such as the Verizon DBIR to make more rational decisions. The basic advice is to ask yourself: "What are the chances of this happening again? How frequently do these things happen? How do the chances of this happening compare to other threats?" Asking those questions and thinking in terms of probabilities will help you move toward a more rational way of thinking, avoiding the impact of the availability bias.
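The base-rate thinking described above can be sketched in a few lines of code. All the incident counts below are hypothetical placeholders, not real figures; they merely stand in for the kind of frequency data you might pull from a source like the Verizon DBIR to put a headline-grabbing threat in context:

```python
# A minimal sketch of base-rate reasoning with made-up incident counts.
# The point: the threat dominating the news may be a small slice of
# everything that actually happens.
incidents_last_year = {
    "state-sponsored espionage (e.g. APT28)": 30,
    "opportunistic malware / crimeware": 4000,
    "insider misuse": 600,
    "lost or stolen devices": 900,
}

total = sum(incidents_last_year.values())

# Express each threat as a share of all observed incidents, so a recent
# headline can be weighed against the overall frequency data.
for threat, count in sorted(incidents_last_year.items(), key=lambda kv: -kv[1]):
    print(f"{threat}: {count / total:.1%} of incidents")
```

With these (invented) numbers, the headline threat accounts for well under one percent of incidents, which is exactly the comparison the availability bias makes us skip.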

Friday, October 24, 2014

(One more) long winter of this blog is about to end

I noticed I haven't written anything here since February...that's probably the longest I've gone without blogging since I started more than 10 years ago. That's unacceptable for someone who wants to stay at the top of his game and go beyond parroting the sounds coming from the echo chamber. I'll definitely work to keep this from becoming the new normal.

Now, what is making me pull my head out of the sand again is (at least for me) an amazingly interesting topic: behavioral economics. The implications for information security are many, from the most obvious ("user" behavior) to some less evident situations (the attacker's point of view, risk management, SOC operations, secure coding and development, among many others).

I recently read two books that are a great introduction to the topic:

The Art of Thinking Clearly - Rolf Dobelli

Thinking, Fast and Slow - Daniel Kahneman (Nobel Prize winner, seen by many as the 'father' of the field)

There are many others that I hope to add here over time, as I expand on what I've been thinking about this in our infosec environment. More to come.



P.S. There's also an ongoing online (and free) course about behavioral economics at edX...