Comments on the Verizon DBIR Supplemental Report

On December 9th, Verizon released a supplement to their 2009 Data Breach Investigations Report. One might optimistically think of this as volume 2, #2 in the series.

A good deal of praise has already been forthcoming, and I’m generally impressed with the report, and very glad it’s available and free. But in this post, I’m going to offer up a stack of criticism. This isn’t intended to detract from the report, but to encourage and improve it.

My biggest request is to focus more on the “requests for additional recommendations for deterring, preventing and detecting breaches,” which is the fourth common theme. In particular, each of the elements of the threat action catalog has what appears to be a grab-bag of mitigators. The first threat action, “keyloggers and spyware,” has a dozen. (I’d make it a baker’s dozen by splitting host IDS and integrity monitoring.) How these were selected is unclear. How each was tested for validity is unclear. The relative costs of each are not assessed. I would urge the authors to include a table of all mitigators, showing how often each is present, which types of threat actions they expect it would impact, and an estimate of cost or relative cost. (A preventative mitigator present at over 50% of incidents should likely be removed from the list.)

In fact, on page 22, a list of lessons learned is presented. That breakdown would be a provocative top row of a table of mitigators.
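To make the request concrete, here’s a sketch of the shape of table I have in mind. (The rows and values are placeholders, not figures from the report; only the columns follow from my request above.)

```
Mitigator             | Present in incidents | Threat actions impacted  | Relative cost
----------------------|----------------------|--------------------------|--------------
Host IDS              | n%                   | keyloggers and spyware,… | low/med/high
Integrity monitoring  | n%                   | …                        | low/med/high
```

A table like this would let readers see at a glance which mitigators are cheap, broadly applicable, and actually absent in breached organizations.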

My next large request is to break out threat actions better. Figure 2 is hard to read, as it combines threat categories which Verizon calls Malware, Hacking, and Misuse, and which I’d characterize as gaining access and using that access. Perhaps that should also include expanding access, but two small tables would be easier to read, as would the same sorting applied to the data in the break-out.

Some other places where I raised my eyebrows:

  • Page 2 discloses that non-essential details have been altered. That’s an interesting choice of words, especially noting how carefully other words are chosen. (For example, page 26, “We find this interesting” versus “We find this fascinating.”)
  • Page 7, “Ubiquitous but particularly common …” would imply either that it’s not really ubiquitous, or that it’s not especially common in other industries.
  • Page 8, “non-sanctioned commercial remote desktop program.” Was a sanctioning process available? Was work at home a supported scenario? Was there software to support that scenario that was sanctioned? I ask because of the implicit judgement of the researcher who installed the software, and a guess that the blame may reasonably be placed on the IT department.
  • Page 15, “We very often see [unauthorized access],” but it’s listed as 8% of breaches and 1% of records compromised. Which is a … provocative … definition of very often. (The same idiom is used with similar numbers on page 19.)

The comparisons between the DLDB and the Verizon database are, to borrow a word, fascinating.

None of this is intended to detract from the report, which is worth reading and comparing to your control set. How’s your coverage on what they believe would be effective?