The One Where David Lacey's Article On Risk Makes Us All Stupider
In possibly the worst article on risk assessment I’ve seen in a while, David Lacey of Computerworld gives us the “Six Myths of Risk Assessment.” This article is so patently bad, so heinously wrong, that it stuck in my craw enough to write this blog post. So let’s discuss why Mr. Lacey has no clue what he’s writing about, shall we?
First, Mr. Lacey writes:
1. Risk assessment is objective and repeatable
It is neither. Assessments are made by human beings on incomplete information with varying degrees of knowledge, bias and opinion. Groupthink will distort attempts to even this out through group sessions. Attitudes, priorities and awareness of risks also change over time, as do the threats themselves. So be suspicious of any assessments that appear to be consistent, as this might mask a lack of effort, challenge or review.
Sounds reasonable, no? Except it’s not altogether true. Yes, if you’re doing an idiotic RCSA (risk and control self-assessment) of “Inherent – Control = Residual,” it probably is, but those assessments aren’t the totality of current practice.
“Objective” is such a loaded word. And if you use it with me, I’m going to wonder if you know what you’re talking about. Objectivity / Subjectivity is a spectrum, not a binary, and so for him to say that risk assessment isn’t “objective” is an “of course!” Just like there is no “secure” there is no “objective.”
But Lacey’s misunderstanding of the term aside, let’s address the real question: “Can we deal with the subjectivity in assessment?” The answer is a resounding “yes” if your model formalizes the factors that create risk and logically represents how they combine to create something with units of measure. And not only will the right model and methods handle the subjectivity to a degree that is acceptable, you can know that you’ve arrived at something usable when assessment results become “blindly repeatable.” And yes, Virginia, there are risk analysis methods that create consistently repeatable results for information security.
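To make that concrete, here is a minimal sketch in Python (my own illustration, not Lacey’s method and not any particular standard) of what “factors that combine into units of measure” can look like: decompose risk into loss event frequency and per-event loss magnitude, calibrate each as a distribution, and simulate them into annualized loss exposure in dollars. Every parameter value below is invented for illustration.

    import numpy as np

    def annualized_loss(freq_per_year, loss_median, loss_sigma, years=100_000, seed=None):
        # Frequency: loss events per simulated year (Poisson).
        # Magnitude: dollars per event (lognormal, skewed right like real losses).
        rng = np.random.default_rng(seed)
        events = rng.poisson(freq_per_year, years)
        return np.array([rng.lognormal(np.log(loss_median), loss_sigma, n).sum()
                         for n in events])

    # Two analysts running the same calibrated model independently land on
    # the same curve: the "blindly repeatable" property.
    for seed in (1, 2):
        run = annualized_loss(freq_per_year=2.0, loss_median=50_000, loss_sigma=1.0, seed=seed)
        print(f"median ${np.median(run):,.0f}   95th pct ${np.percentile(run, 95):,.0f}")

With enough trials, the medians and tail percentiles of the two runs agree to within simulation noise, which is exactly the kind of consistency Lacey claims can only mask “a lack of effort.”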
2. Security controls should be determined by a risk assessment
Not quite. A consideration of risks helps, but all decisions should be based on the richest set of information available, not just on the output of a risk assessment, which is essentially a highly crude reduction of a complex situation to a handful of sentences and a few numbers plucked out of the air. Risk assessment is a decision support aid, not a decision making tool. It helps you to justify your recommendations.
So the key here is “richest set of information available” – if your risk analysis leaves out key or “rich” information, it’s pretty much crap. Your model doesn’t fit, your hypothesis is false; start over. If you think it’s a trivial thing for him not to understand, I’ll offer that revising a model that fails to fit the evidence is kind of the foundation of modern science. And mind you, this guy was supposedly a big deal with BS7799. Really.
4. Risk assessment prevents you spending too much money on security
Not in practice. Aside from one or two areas in the military field where ridiculous amounts of money were spent on unnecessary high end solutions (and they always followed a risk assessment), I’ve never encountered an information system that had too much security. In fact the only area I’ve seen excessive spending on security is on the risk assessment itself. Good security professionals have a natural instinct on where to spend the money. Non-professionals lack the knowledge to conduct an effective risk assessment.
This “myth” basically made me physically ill. The statement “I’ve never encountered an information system that had too much security” then made me laugh so hard I keeled over and hurt my knee by slamming it on the little wheel thing on my chair.
Obviously Mr. Lacey never worked for one of my previous employers that forced 7 or so (known) endpoint security applications on every Windows laptop. Of course you can have too much !@#%ing security! It happens all the !@#%ing time. We overspend where frequency and impact (<- hey, risk!) don’t justify the spend. If I had a nickel for every time I saw this in practice, I’d be a 1%er.
But more to the point, this phrase (never too much security) makes several assumptions about security that are patently false. Let me focus on this one: the statement implies that threats are randomly motivated. You see, if a threat agent has a targeted motivation (like IP or $), it doesn’t care about systems that offer no value in data or in privilege escalation. Thus, you can spend too much on protecting assets that offer no or limited value to a threat agent.
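A toy expected-loss comparison makes the point (all numbers invented for illustration): the same class of control spend that is justified on a targeted, high-value asset is pure overspend on an asset that offers a motivated threat agent nothing.

    # Toy numbers, invented for illustration.
    # asset: (control spend $/yr, loss events /yr, average loss $/event)
    assets = {
        "IP file server (targeted)":     (120_000, 0.50, 2_000_000),
        "kiosk with no data or privesc": ( 40_000, 0.05,     2_000),
    }

    for name, (spend, freq, magnitude) in assets.items():
        expected_loss = freq * magnitude  # crude annualized expected loss
        verdict = "spend justified" if spend < expected_loss else "OVERSPEND"
        print(f"{name}: ${spend:,}/yr vs. expected loss ${expected_loss:,.0f}/yr -> {verdict}")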
5. Risk assessment encourages enterprises to implement security
No, it generally operates the other way around. Risk assessment means not having to do security. You just decide that the risk is low and acceptable. This enables organisations to ignore security risks and still pass a compliance audit. Smart companies (like investment banks) can exploit this phenomenon to operate outside prudent limits.
I honestly have no idea what he’s saying here. Seriously, this makes no sense. Let me explain. Risk assessment outcomes are neutral states of knowledge. They may feed a “state of wisdom” decision around budget, compliance, and acceptance (or mitigation, or transfer), but that decision is a logically separate task.
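In code terms (a hypothetical sketch of my own, not anything Lacey describes), the separation looks like this: the assessment produces a neutral estimate, and a distinct decision step consumes that estimate along with inputs the assessment knows nothing about, such as tolerance and mitigation cost.

    def assess(freq_per_year: float, loss_per_event: float) -> float:
        # Assessment: a neutral state of knowledge, here an expected annual loss in $.
        return freq_per_year * loss_per_event

    def decide(expected_loss: float, tolerance: float, mitigation_cost: float) -> str:
        # Decision: a logically separate task fed by the assessment plus
        # budget and risk-tolerance inputs.
        if expected_loss <= tolerance:
            return "accept"
        return "mitigate" if mitigation_cost < expected_loss else "transfer"

    exposure = assess(freq_per_year=0.5, loss_per_event=400_000)  # invented numbers
    print(decide(exposure, tolerance=50_000, mitigation_cost=120_000))  # -> mitigate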
Dealing with the risk is a totally separate decision process, and if he cannot recognize that it is a separate modeling construct, these statements should be highly alarming to the reader. It screams “THIS MAN IS AUTHORIZED BY A MAJOR MEDIA OUTLET TO SPEAK AS AN SME ON RISK AND HE IS VERY, VERY CONFUSED!!!!”
Then there is that whole thing at the end where he calls companies that exploit this process “smart.” Deviously clever, I’ll give you, but not smart.
6. We should aspire to build a “risk culture” across our enterprises
Whatever that means it sounds sinister to me. Any culture built on fear is an unhealthy one. Risks are part of the territory of everyday business. Managers should be encouraged to take risks within safe limits set by their management.
So by the time I got to this “myth,” my mind was buzzing with anger. But then Mr. Lacey tops us off with this beauty. This statement is so contradictory to his past “myth” assertions, so bizarrely out of line with his last statement in any deductive sense, that one has to wonder whether David Lacey isn’t actually an information security surrealist or post-modernist who rejects reason, logic, and formality outright for the sake of random, disconnected, and downright silly approaches to risk and security management. Because that’s the only way this statement could possibly make sense. And I’m not talking “pro” or “con” for risk culture here; I’m just asking how his mind could possibly reconcile the notion that an “enterprise risk culture” sounds sinister with “Managers should be encouraged to take risks within safe limits set by their management,” and even with “I’ve never encountered an information system that had too much security.”
(Mind blown – throws up hands in the air, screams AAAAAAAAAAAAAAAAAHHHHHHHHHHHHHHHH at the top of his lungs, and runs down the hall of work as if on fire)
See? Surrealism is the only possible explanation.
Of course, if he was an information security surrealist, this might explain BS7799.
I’m occasionally a heretic in the eyes of the Church of Risk myself, but David Lacey comes off as a raving lunatic here. I have nothing against a bit of hyperbole to make a point, but Mr. Lacey’s “points” only seem to be used to stab himself.
What he means by 5) is this:
Asset/data owners are not permitted to field a system unless it has been through a risk assessment. Now, we wouldn’t want infosec to tell “the business” how to do things, so there’s an escape hatch – once risks have been identified, “the business” can choose to accept any or all of them. Think of it as informed consent (or acting against medical advice). So the stereotypical clueless LoB MBA tells the Risk Analyst, CISSP, “Great, tell me what form to sign so I can get this over with,” and we’re done. Time wasted, outcome not at all affected, but compliance achieved and everybody’s happy.
I don’t know this Lacey, but it seems he’s the type who has been told what not to do by one too many “security fascists” and has decided they represent everyone in the field. Rather than excoriate him, I’d like you to train your sights on *them*. If I am right, Lacey is a victim here. Judging by how full of himself he comes across, he may well be a victim who should have known better, but a victim nonetheless.
“I honestly have no idea what he’s saying here.” – Then I wonder which world you have been working in? Maybe you shouldn’t go off on one like that if you don’t understand what you’re reading?
“7 or so endpoint security applications on every Windows laptop” – so are you saying that this was good security? It sounds like you had one of the people David’s talking about do the risk assessment that justified it.
“I’ve never encountered an information system that had too much security” made me laugh so hard I keeled over …
Aw, poor you. Perhaps the straps on your jacket need tightening. Those of us who work in the real world are painfully aware that too little is spent on security by the vast majority of companies not subject to burdensome government, legislative or regulatory requirements.
“7 or so endpoint security applications” does not equate to “too much” security. In all probability it equates to inefficient and ineffective security. Point solutions tend to do that. The little Dutch boy plugs each hole when he sees it, but ignores the water slopping over the top of the dyke and flowing out of the open sluice gate.
David Lacey might be a tad (or two) pompous and inclined to overstate his importance in some processes [but don’t we all?], but he hits the mark with many of his comments.
There’s none so blind …
“Aw, poor you. Perhaps the straps on your jacket need tightening. Those of us who work in the real world are painfully aware that too little is spent on security by the vast majority of companies not subject to burdensome government, legislative or regulatory requirements.”
@shannon – I work for a bank. Doesn’t get much more “real world” – in fact, based on your comments vs. the data at hand from DLDB and the DBIR, I have to ask – compared to what and whose risk tolerance? Yours? Fuuuuu….
“David Lacey might be a tad (or two) pompous and inclined to overstate his importance in some processes [but don’t we all?], but he hits the mark with many of his comments.”
My point is that his comments aren’t indicative of the state of risk management as a whole (see the SIRA mailing list), just of his own small sample size; worse yet, they’re illogical and contradictory.