"80 Percent of Cyber Attacks Preventable"
Threat Level (aka 27B/6) reported yesterday that Richard Schaeffer, the NSA’s information assurance director, testified before the Senate Judiciary Subcommittee on Terrorism, Technology and Homeland Security on the issue of computer-based attacks.
If network administrators simply instituted proper configuration policies and conducted good network monitoring, about 80 percent of commonly known cyber attacks could be prevented, a Senate committee heard Tuesday.
The remark was made by Richard Schaeffer, the NSA’s information assurance director, who added that simply adhering to already known best practices would sufficiently raise the security bar so that attackers would have to take more risks to breach a network, “thereby raising [their] risk of detection.”
I’m really curious, however, what data Director Schaeffer is basing his testimony on. Is it the DBIR? Another open set of breach data? Or is it based on data gathered by the NSA? Regardless, it’s great to see more folks talking about what the Verizon DBIR told us and what we’ve known anecdotally for a long time: we still aren’t even close to doing the basics well.
The article then goes on to tell us:
A 2009 PricewaterhouseCoopers study on global information security found that 47 percent of companies are reducing or deferring their information security budgets, despite the growing dangers of cyber incursions.
The thing is, as we’ve learned from the Verizon study, most of the issues found were failures at the basics: default passwords left in place, accounts not revoked when employees leave, and plain misconfigurations. Even in the case of patching, the vast majority of holes exploited had patches available for over a year, and 100% had patches available for over six months. This is not the stuff of big budgets and sexy technology, but rather of solid, repeatable and auditable processes; in other words, serious operational discipline.

Budget cuts might actually be a good thing, because they will force organizations to focus on the people and process portions of security rather than the technology. It’d be really cool if PwC were to track the correlation of budgets to breaches within their survey groups; then we’d have some actual data on potential optimal spend levels.
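To make “the basics” concrete, here’s a minimal sketch of the kind of account hygiene check the DBIR findings point at. The account records, field names and 90-day threshold are hypothetical; in a real shop this data would come from a directory service or asset inventory, not a hard-coded list.

```python
from datetime import datetime, timedelta

# Hypothetical account records; in practice these would come from a
# directory service or asset inventory, not a hard-coded list.
accounts = [
    {"user": "admin",          "default_password": True,  "last_login": "2009-01-15", "active": True},
    {"user": "jsmith",         "default_password": False, "last_login": "2009-11-18", "active": True},
    {"user": "old_contractor", "default_password": False, "last_login": "2008-06-30", "active": True},
]

STALE_AFTER = timedelta(days=90)          # hypothetical policy threshold
now = datetime(2009, 11, 20)              # fixed date so the example is repeatable

for acct in accounts:
    # Flag default credentials that were never changed.
    if acct["default_password"]:
        print(f"{acct['user']}: default password still set")
    # Flag active accounts nobody has used in a long time (e.g., departed staff).
    last_login = datetime.strptime(acct["last_login"], "%Y-%m-%d")
    if acct["active"] and now - last_login > STALE_AFTER:
        days = (now - last_login).days
        print(f"{acct['user']}: active account unused for {days} days")
```

Nothing clever here, and that’s the point: the value is in running it on a schedule and acting on the output, i.e., the operational discipline.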
I think there’s a flaw in the reasoning, even if you buy the “80%” number.
In terms of quantity of attacks, most attacks are opportunistic, exploiting the “low-hanging fruit” (practically lying on the ground). Even if you accept the proposition that basic/standard configuration practices would eliminate most or all of these vulnerabilities, there is still plenty of fruit in the tree that is accessible with only a modest increase in attacker effort or sophistication.
Of course, in the short term, a given company can reduce the success rate of opportunistic attacks if its configuration and security practices are better than those of its peers. Said one wildebeest to another, “I don’t have to run faster than the lion… I just have to run faster than you.”
But in the long run, if all organizations shift upward to basic/common/standard configuration and security practices, the attackers will simply shift to the next level of fruit.
This is like predator-prey and host-parasite coevolution in nature.
I would phrase it this way: “80% of cyber attacks can be shifted from low-sophistication to modest-sophistication attacks by following basic/common/standard configuration practices.” Not very attractive as a headline or in congressional testimony, but this is closer to the reality, IMHO.
Another thing to consider is the fact that the basics are not necessarily easy in big and complex organizations. For instance, things like privileged account control and patch management are very hard to do well when IT operations and infrastructure are not in a “best of breed/best practices” mode. And, again, attackers only need a single mistake. For the “basic stuff”, doing 80/20 is not good enough, but big organizations may not be in a position to do more than that.
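As an illustration of why patch management is hard at scale, here’s a hedged sketch of the sort of patch-age report the six-month figure above implies. The inventory entries and the 180-day window are hypothetical; real data would come from a vulnerability scanner or CMDB.

```python
from datetime import date

# Hypothetical patch inventory: (host, advisory, patch release date, installed?).
patch_state = [
    ("web01", "MS08-067",      date(2008, 10, 23), False),
    ("db01",  "CVE-2008-1447", date(2008, 7, 8),   True),
    ("app02", "MS09-001",      date(2009, 1, 13),  False),
]

today = date(2009, 11, 20)   # fixed date so the example is repeatable
WINDOW = 180                 # the six-month availability window cited above

for host, advisory, released, installed in patch_state:
    age = (today - released).days
    # Report hosts still exposed long after a fix shipped.
    if not installed and age > WINDOW:
        print(f"{host}: {advisory} still unpatched {age} days after release")
```

The hard part in a large organization isn’t this report; it’s getting the inventory complete and the remediation owned, which is exactly the 80/20 problem above.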
Hey Dave, I thought I would cross-post something that Wade Baker posted this last December. I think it fits in quite well with what you’ve written here.
http://securityblog.verizonbusiness.com/2008/12/03/crisis-could-improve-security-in-2009/
I would add that the bigger issue here is at the application level. What are companies doing to protect webapps where the network perimeter is useless?
I recently read an estimate that it takes an average of 67 days to fix something as common as XSS(!). Unacceptable. Everyone’s throwing stats around; they’re a dime a dozen, but we all know the issues are real and damaging.
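For context on why the 67-day figure is so galling: in the common reflected/stored case the fix is output encoding, often a one-liner. A minimal sketch (the render_comment helper is hypothetical):

```python
import html

def render_comment(comment: str) -> str:
    # Escape user-supplied text before embedding it in HTML; this is the
    # standard defense against reflected and stored XSS.
    return "<p>" + html.escape(comment) + "</p>"

# A payload that would execute in the browser if echoed back unescaped:
print(render_comment("<script>alert(document.cookie)</script>"))
# -> <p>&lt;script&gt;alert(document.cookie)&lt;/script&gt;</p>
```

Finding every output point in a legacy webapp is the real work, but the remediation itself is rarely exotic.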
http://artofdefence.wordpress.com/