Friday, April 29, 2011
Post Mortem lessons from Amazon
Thursday, April 28, 2011
Must read for those working with vuln. management
McAfee VirusScan Enterprise: False Positive Detection Generic.dx!yxk in DAT 6329
McAfee Labs have issued an alert that McAfee VirusScan DAT file 6329 is returning a false positive for spsgui.exe. This is impacting SAP telephony connectivity.
McAfee have a workaround for the issue, documented in KB71739: https://kc.mcafee.com/corporate/index?page=content&id=KB71739
Chris Mohan --- Internet Storm Center Handler on Duty
They seem to be improving...at least it's not a core component of the OS this time :-)
Wednesday, April 20, 2011
Will we see the return of low level vulnerabilities?
With the push to migrate to IPv6 (and all the protocols related to it, such as ICMPv6) and DNSSEC, a lot of vendors are rushing to add support for those protocols to their products. Vulnerabilities in the lower layers of the OSI stack haven't been common lately, but there were plenty of them when the Internet first became popular (remember the Ping of Death?). The days when you could bring down a system with a simple "ping" seemed to be over, but now, with a lot of new code handling the basic plumbing being deployed, we'll probably see another surge of vulnerabilities like those being found and exploited.
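For anyone who doesn't remember the bug, the original Ping of Death was pure fragmentation arithmetic: the IPv4 total-length field is 16 bits (so a datagram can't legally exceed 65535 bytes), but the 13-bit fragment offset field, counted in 8-byte units, lets a final fragment land right at the 64 KB boundary. A back-of-the-envelope sketch of the numbers in Python (just the arithmetic, no exploit code):

# The classic Ping of Death arithmetic: a reassembled datagram can end up
# larger than the 16-bit Total Length field says is possible.
MAX_IP_DATAGRAM = 65535                          # 16-bit Total Length field
FRAGMENT_UNIT = 8                                # offset counted in 8-byte units
MAX_FRAG_OFFSET = (2**13 - 1) * FRAGMENT_UNIT    # 13-bit field -> 65528 bytes

# A last fragment at the maximum offset carrying more than 7 bytes of
# payload reassembles to more than 65535 bytes...
last_fragment_payload = 1000
reassembled_size = MAX_FRAG_OFFSET + last_fragment_payload

print(f"max legal datagram: {MAX_IP_DATAGRAM} bytes")
print(f"reassembled size:   {reassembled_size} bytes")
# ...which is exactly what overflowed the fixed 64 KB reassembly buffers
# in 90's-era network stacks.
assert reassembled_size > MAX_IP_DATAGRAM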
However, the scenario is quite different now. A few factors that may change the outcome:
- The Internet now is slightly different from the one we had in the '90s... I wonder what would happen if someone found a new PoD today.
- Developers know that their code will be attacked and that things like buffer overflows can be exploited. Big vendors have SDLCs in place.
- The research community is bigger and better prepared, with a lot of very good people trying to find bugs.
- The tools to find bugs have also evolved. A lot of researchers are pointing their shiny new fuzzers at everything that runs code (a toy sketch of why that works so well follows below).
- More powerful and well-funded organizations are searching for "cyberweapons".
Over the last few years we've seen attackers' targets moving up the OSI layers. With all the new code being deployed, there's no reason to believe they won't revisit the lower levels to find "lower-hanging fruit" (pardon the pun).
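To make the fuzzing point concrete, here is a deliberately naive sketch. The record format and the parser are invented for illustration (this is not real ICMPv6 code, and real fuzzers are far more sophisticated than random byte flipping), but the bug class, trusting a length field, is exactly the kind of thing that keeps resurfacing in new protocol code:

import random
import struct

def naive_parse(packet: bytes) -> bytes:
    """Parse a made-up [1-byte type][2-byte length][payload] record.
    The bug: it trusts the declared length instead of the buffer size."""
    ptype, plen = struct.unpack_from("!BH", packet, 0)
    payload = packet[3:3 + plen]
    # A naive consistency check that stands in for the crash a real
    # parser would suffer when it reads past the end of the buffer:
    assert len(payload) == plen, "declared length exceeds the buffer"
    return payload

def mutate(packet: bytes) -> bytes:
    """The simplest possible mutation strategy: flip a few random bytes."""
    data = bytearray(packet)
    for _ in range(random.randint(1, 4)):
        data[random.randrange(len(data))] = random.randrange(256)
    return bytes(data)

random.seed(0)                                 # reproducible run
seed = struct.pack("!BH", 1, 4) + b"ping"      # a well-formed record

for i in range(10_000):
    sample = mutate(seed)
    try:
        naive_parse(sample)
    except AssertionError as e:
        print(f"iteration {i}: crashing input {sample.hex()} ({e})")
        break

Even this toy loop finds the length-field bug within a handful of iterations; point a modern coverage-guided fuzzer at brand new IPv6 or DNSSEC parsing code and the odds are not in the vendor's favor.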
Tuesday, April 19, 2011
Quick comments on the Verizon DBIR 2011 report
Monday, April 4, 2011
Beware of "low impact" in risk assessments
Friday, April 1, 2011
World Economic Forum 2011 Risk Report
By the way, "cyber risks" are at the top of the "Risks to Watch" list; in other words, risks with a lot of uncertainty and hard-to-predict trends. It makes sense.
1 Raindrop: "I know" and "I don't know" schools of security architecture
Excerpt from Howard Marks’ July 2003 Memo “The Most Important Thing”:
"One thing each market participant has to decide is whether he (or she) does or does not believe in the ability to see into the future: the “I know” school versus the “I don’t know” school. The ramifications of this decision are enormous.If you know what lies ahead, you’ll feel free to invest aggressively, to concentrate positions in the assets you think will do best, and to actively time the market, moving in and out of asset classes as your opinion of their prospects waxes and wanes. If you feel the future isn’t knowable, on the other hand, you’ll invest defensively, acting to avoid losses rather than maximize gains, diversifying more thoroughly, and eschewing efforts at adroit timing.
Of course, I feel strongly that the latter course is the right one. I don’t think many people know more than the consensus about the future of economies and markets. I don’t think markets will ever cease to surprise, or thus that they can be timed. And I think avoiding losses is much more important than pursuing major gains if one is to achieve the absolute prerequisite for investment success: survival."
In security architecture terms, I differentiate Identity & Access Services, which are designed to help the enterprise achieve business goals (and despite what a lot of people say about ROSI, these services have ROI attached to them from day one); these are implicitly an "I know" kind of service, or at least "I guess."
On the other hand, there are defensive services like monitoring and logging, which implicitly say: "I don't know" how I am going to be attacked, or how and where things will fail, but I need to build a margin of safety into the system to be able to react if and when they do.
The Security Triangle shows that depending on whether you start from "I know" assumptions or "I don't know" assumptions, you'll end up with a different-looking architecture. Of course, it's not a binary choice; you will have some of both, but there are always priorities and choices. The goal for security architects is to be clear about those choices, because whether you are trying to know or accepting that you don't know, your security service delivery, measurements and processes will vary.
When you are building out monitoring services, you identify assets and event types with the goal of increasing visibility. This typically results in people, processes and technology, like an IRT (incident response team), that respond to catalysts and vectors that are often not known at the time the system is built.
When you are building out Identity & Access services, you are assuming much more knowledge of subjects, objects, attributes, data and applications. This mapping typically manifests in architecture like Identity & Access Management systems, publishing and enforcing known-known relationships.
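To make the contrast concrete, here is a rough sketch in Python (the names and data are invented for illustration, not taken from Gunnar's post): an "I know" authorization check driven by relationships enumerated up front, next to an "I don't know" monitoring hook that records every decision so there is a margin of safety when the unexpected happens:

import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO, format="%(message)s")

# --- "I know" service: the relationships are enumerated up front ---
ENTITLEMENTS = {                       # known-known subject/object pairs
    "alice": {"payroll-db": {"read"}},
    "bob":   {"payroll-db": {"read", "write"}},
}

def authorize(subject: str, obj: str, action: str) -> bool:
    """Allow only what was explicitly mapped when the system was built."""
    return action in ENTITLEMENTS.get(subject, {}).get(obj, set())

# --- "I don't know" service: capture everything for later reaction ---
def monitor(subject: str, obj: str, action: str, allowed: bool) -> None:
    """Log every decision; the attack that matters isn't known yet."""
    logging.info("%s subject=%s object=%s action=%s allowed=%s",
                 datetime.now(timezone.utc).isoformat(),
                 subject, obj, action, allowed)

def access(subject: str, obj: str, action: str) -> bool:
    allowed = authorize(subject, obj, action)
    monitor(subject, obj, action, allowed)    # margin of safety either way
    return allowed

access("alice", "payroll-db", "read")      # known mapping -> allowed
access("mallory", "payroll-db", "write")   # unmapped -> denied, but logged

Note that the two halves encode opposite assumptions about what can be predicted: authorize() only gets better by knowing more up front, while monitor() only pays off after something you didn't predict happens.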
In each of these cases the toolsets are different, how you staff for them is different, and the design and operations are totally different, but they all get lumped under the title "security."
A very good post from Gunnar Peterson. I've always thought there's a huge difference between security technologies. Mike Rothman likes to define them as "let the good guys in" and "keep the bad guys out" technologies. I even wonder if anyone has ever tried to model their security teams like that, or something like an "external threat team" and an "internal control team". That would be interesting.