Doing Science With Near Misses

[Update: The final article is available at “That Was Close! Reward Reporting of Cybersecurity ‘Near Misses’,” at the Colorado Technology Law Journal.]

Last week at Art into Science, I presented “That was Close! Doing Science with Near Misses” (Slides as web page, or download the pptx.)

The core idea is that we should borrow from aviation to learn from near misses, and so learn to protect ourselves and our systems better. The longer form is in the draft “That Was Close! Reward Reporting of Cybersecurity ‘Near Misses’.”

The talk was super-well received, and I’m grateful to Sounil Yu and the participants in the philosophy track, who juggled the schedule so we could collaborate and brainstorm. If you’d like to help, by far the most useful thing would be to tell us about a near miss you’ve experienced using our form, and to give us feedback on the form. Since Thursday, I’ve added a space for that feedback, and made a few other suggested adjustments that were easy to implement.

If you’ve had a chance to think about definitions for either near misses or accidents, I’d love to hear about those, in comments, in your blog (trackbacks should work), or whatever works for you. If you were at Art Into Science, there’s a #near-miss channel on the conference Slack, and I’ll be cleaning up the notes.

Image from the EHS Database, who have a set of near miss safety posters.

3 Comments on "Doing Science With Near Misses"

  1. How about this definition? “A near miss occurs when a fortunate break in the chain of events, or an after-the-fact detection and resolution (i.e. a close call), prevents an impact to the organization.”

    1. I like it. And what is ‘an impact to the organization’? I ask because we’d like to give lawyers guidance that allows incentives, and that’s easier the crisper we are.

      If a SOC analyst sees it, does that count? After all, they’re not going to see something else. What if they act on it, clicking a button? Talk to someone about it? These are at the trivial end of impacts. At the far end of what might not be an impact, what if the security team holds a meeting to do some forensics or root cause work about this chain of events, and sends out a request for change because of it? I bet that, in their reviews, they’d say that that had an impact.

Comments are closed.