Rob is apparently confused about what risk management means. I tried to leave this as a comment, but there are limitations in commenting. So here goes:
Nowhere did I imply you were a bad pen tester. I just said that you should have a salient view of failure in complex systems (which I’m sure you do).
“I’ve never thought of incident analysis aka. causal analysis aka. failure analysis as part of risk management.”
First, risk management, done properly, is an implementation of the scientific method. If treated differently, it’s stupid numerology. What I mean by this is: you start with a hypothesis (model), it is tested, then refined. Pretty basic stuff, that. If you do NOT refine the model, then you’re just making up numbers for the sake of making them up. So incident analysis is a step that must be done before model refinement.
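That hypothesis–test–refine loop can be sketched as a Bayesian update. A minimal illustration (the prior, the incident counts, and the function name here are all mine, purely hypothetical, not anything from Rob or this post):

```python
# Minimal sketch of the hypothesize -> test -> refine loop as a
# Bayesian (beta-binomial) update. All numbers are illustrative.

def refine_model(prior_alpha, prior_beta, incidents, trials):
    """Refine a breach-frequency model with observed incident data."""
    # Posterior after observing `incidents` failures in `trials` periods
    post_alpha = prior_alpha + incidents
    post_beta = prior_beta + (trials - incidents)
    return post_alpha, post_beta

# Hypothesis (model): breach in roughly 10% of quarters -> Beta(1, 9) prior
alpha, beta = 1, 9
# Test: incident analysis finds 3 breaches over 12 quarters
alpha, beta = refine_model(alpha, beta, incidents=3, trials=12)
# Refined estimate of breach probability (posterior mean)
print(alpha / (alpha + beta))  # 4/22, about 0.18
```

Skipping the `refine_model` step is exactly the “making up numbers” failure mode: the estimate never moves no matter what the incidents show.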
Second, can you explain more about how “risk analysis” isn’t part of risk management? To me, the management of risk (be that engineering, financial or “natural systems” – three different concepts, each with very different approaches/models) is the establishment of a state of wisdom. Wisdom is predicated on establishing a state of knowledge (yes, I’m being very Bayesian here, it’s a bias) that requires analysis of the state of nature.
“Most people assume that risk management is about preventing bad things from happening. That’s not true. A “risk” could mean good or bad”
As far as a “risk” meaning “good”, that’s limited primarily to financial risk modeling, where you can have positive as well as negative returns. Engineering risk is different: that which resists is, by definition, incapable of resisting more than it’s designed for. “Natural Systems” risk, a different animal, is also focused primarily on identifying determinants which cause failure. The medical community, the ecological community, and others who operate in this realm don’t necessarily tie in positive outcomes. (Side note relevant to why we do the DBIR at Verizon: “Natural Systems” risk is done differently than the approaches we’re most familiar with in IT, engineering and financial, because it’s dealing with complex systems.) Of course, you could consider enterprise networks to have properties that indicate strong emergence, but that’s another argument. So a financial risk “positive” really isn’t applicable in this situation, because there isn’t really the potential for a positive return from a meltdown.
“That “maximizing opportunities” never comes up in cybersecurity risks management, which is why cybersecurity is so out of step with the rest of the company.”
No, we’re out of step with business because we have no clue how to simply relate our expense to revenue.
“Nobody does incident analysis after the website was delayed because of cybersecurity concerns.”
I’ll disagree, primarily because I do it every day. If you’re interested in learning more, http://herdingcats.typepad.com/ is one of the more salient blogs that discuss project risk.
“Another way of defining “risk management” is “uncertainty management”
Ugh, only if you’re a Knightian/Frequentist from the 1920s. We actually cover this in the SIRA podcast, I think: “uncertainty” is no longer considered the nature of risk by most probabilists. Uncertainty is a factor relevant to your prior and posterior distributions (usually expressed in the kurtosis of the distribution itself).
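To make that last point concrete, here is a rough sketch of uncertainty showing up as tail weight (excess kurtosis) in a loss distribution, rather than being the definition of risk itself. The loss model and all numbers are invented for illustration:

```python
# Sketch: two loss models with the same rough center, where the
# "uncertainty" lives in the tails and shows up as excess kurtosis.
# All numbers are illustrative, not from the post.
import random
import statistics

def excess_kurtosis(xs):
    """Sample excess kurtosis: heavy tails give positive values."""
    mu = statistics.fmean(xs)
    sd = statistics.pstdev(xs)
    return sum(((x - mu) / sd) ** 4 for x in xs) / len(xs) - 3.0

random.seed(42)
# Thin-tailed loss model: plain Gaussian noise around an expected loss
thin = [random.gauss(100, 10) for _ in range(10_000)]
# Fat-tailed model: same, but 1% of the time the loss is catastrophic
fat = [x if random.random() > 0.01 else x * 10 for x in thin]

print(excess_kurtosis(thin))  # close to 0: normal-ish tails
print(excess_kurtosis(fat))   # large and positive: fat tails
```

Two models can agree on the expected loss and still describe very different worlds; the kurtosis is where that disagreement, i.e. your residual uncertainty, is visible.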
“For example, as Alex points out, nobody trusts that TEPCO (the company operating the Fukushima power plant) is telling the truth. Alex says that means we can’t do risk management.”
This is not at all what I said. I said that this means it’s too early to do post-incident analysis due to:
- the fact that it’s still future-predictive rather than past-predictive
- even if we tried past-predictive analysis we’d have a high degree of uncertainty in factors (which would necessitate us moving to future-predictive).
You framed the discussion as “hindsight analysis”. I explained that it’s too early. I think we can do more predictive analysis around future states, sure, but not hindsight analysis.