Wiser is interesting for us because many decisions and processes in security involve groups. There are groups working on risk assessments, deciding about security controls and measures, and doing incident response. How groups fail to behave in an optimal manner, and how to correct that, is thus important to infosec. A good example of this just came up in a recent Twitter exchange.
Fascinating @a_greenberg article explains how FBI / DHS worked #attribution for Ross Ulbricht as Dread Pirate Roberts http://t.co/WJo87kn0Px
— Richard Bejtlich (@taosecurity) January 15, 2015
"Step that helped build consensus was the creation of a team dedicated to pursuing rival [#attribution] theories, none of which panned out."
— Richard Bejtlich (@taosecurity) January 15, 2015
@taosecurity "we cannot prove it's been someone else, so it must be them"? :)
— Stefano Zanero (@raistolo) January 15, 2015
@raistolo @taosecurity no, that's actually one of the best ways to reduce groupthink (check 'Wiser' by @CassSunstein )
— Augusto Barros (@apbarros) January 15, 2015
Richard Bejtlich was talking about the use of a "red team" to mitigate the risk of groupthink during an attribution exercise. This is a perfect example of a technique to improve group work being applied to a security-related process. He followed up on the Twitter exchange with a nice post on his blog.
(I understand Zanero's point from a logical standpoint; the fact that you can't prove A doesn't necessarily mean that B is true, if the universe of possibilities is bigger than A+B. However, I don't think that's the objective of the red team in that context. The red team is there to reduce the group's tendency to converge rapidly on a decision without properly considering the alternatives. It is a decision-making aid, not a logical argument.)