Security experts take it as a truism that you can’t defend everything, so you have to make choices about which attacks to worry about and which ones to ignore. A study released today claims that unprotected hosts are attacked once per second. (USA Today reports on the study, and avantgarde.com is utterly swamped, so I have not read the study as I write this.) From their news page:
Working with Kevin Mitnick and USA Today, Avantgarde released a study on November 29th that showed that automated ‘bots,’ worms and other threats pummeled six computer platforms over a two-week period with 305,955 total attacks. Results also revealed that an inadequately protected computer fell victim to an actual compromise within four minutes of first plugging into the Internet.
The results are not particularly surprising. Attack frequency has been on the rise for quite some time, and attacks encoded in worms don’t need to sleep. One thing this means is that a prime goal of security management is to prioritize response by predicting and responding to those issues which are likely to become worms. Given that worms are now actively spreading to PCs when people visit web sites, a good firewall and a quarantine system are not enough. (Telling people to only visit ‘trusted’ websites isn’t enough either. Even the most trusted web sites are vulnerable to compromise. If you’re a home user, install Firefox. Firefox is not vulnerable to the same set of issues as IE is.)
From the security research perspective, I think there are more interesting questions. More vulnerabilities are discovered than are turned into worms. Some issues are harder than others to exploit. Some issues exist in a broader set of targets than others. Some issues have exploit code published sooner than others. Which of these factors predicts which issues become worms?
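To make the question concrete, here is a minimal sketch of how you might combine those three factors into a single prioritization score. Everything here is invented for illustration: the function name, the equal weighting, and the sample numbers are all assumptions, not anything from the study.

```python
# Toy prioritization sketch: score a vulnerability by the three factors
# named above. Weights and scales are arbitrary assumptions.

def worm_risk_score(ease_of_exploit, target_breadth, days_to_exploit_code):
    """Combine three factors into a rough 0-1 risk score.

    ease_of_exploit:      0 (hard) .. 1 (trivial)
    target_breadth:       fraction of hosts running the vulnerable code
    days_to_exploit_code: days from disclosure to public exploit code
    """
    # Exploit code published sooner means higher risk; decay with time.
    timeliness = 1.0 / (1.0 + days_to_exploit_code / 7.0)
    # Equal weights are a starting assumption, not a finding.
    return (ease_of_exploit + target_breadth + timeliness) / 3.0

# Two hypothetical issues, ranked highest risk first.
vulns = {
    "easy, broad, exploit in 1 day": worm_risk_score(0.9, 0.8, 1),
    "hard, narrow, exploit in 30 days": worm_risk_score(0.2, 0.1, 30),
}
for name, score in sorted(vulns.items(), key=lambda kv: -kv[1]):
    print(f"{score:.2f}  {name}")
```

The research question, of course, is whether any weighting of these factors actually tracks which issues get wormed; the sketch just shows what a testable model would look like.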