Cleaning Up
If you haven’t read Steven Johnson’s The Ghost Map, you should. It’s perhaps the most important book in print today about the next decade of computer security.
John Snow was a physician and a pioneer of anaesthesia who turned his attention to cholera when the worst epidemic yet hit London, where he lived, in 1854. The book isn’t just about Snow, however; it’s about theories, information, and how to select the right model.
The prevailing model at the time (this was pre-germ-theory) was that cholera was airborne, carried by “miasma,” namely stink. If it smelled bad, it was probably disease-ridden. It’s not a bad theory, actually; it’s just wrong. Snow came to believe that cholera was waterborne, despite the fact that the suspect wells in London were known for their largely sweet-tasting water.
Even though I’m giving away the plot (spoiler: we beat cholera, and major cities in Europe no longer have epidemics), Snow got there by examining data and coming up with the proper visualization of it (the Ghost Map) to show that cholera spread along the flow of water, not the flow of air.
Before Adam used Snow and Johnson’s book in his recent “Why Security Breaches Are Good For You,” I had read the book and been thinking about it and security.
I believe that our security problems need to be looked at both from the viewpoint of public health and from the viewpoint of quality. Snow beat cholera because he was fortunate enough to have the right insight, but insight isn’t enough. You need data. Fortunately, there was plenty of data, and it was available both to him and to the people who disagreed with him. Data was also part of the problem, as Johnson points out, because the larger problem was sorting through it. When it comes to computer security, however, we don’t yet have the luxury of too much data.
Everyone’s data center has its own little cesspool. Mine does, yours does. We have to figure out how to clean them up. We need to have more data. We therefore need to remove the stigma of disclosing data as well as insisting on it. This is why The Ghost Map is an important book for computer security: it will take you back a sesquicentury to the problems of building cities with millions of people in them, and in that history you can think about the problems of building networks with billions of people in them.
Johnson himself has a chapter on the future of cities and urbanization, which I wasn’t as impressed with. The book shifts from being a page-turner to a page-flipper when he gets away from the past and considers the future. Nonetheless, read it and think.
I was fortunate enough to be in London recently and made a pilgrimage to Broad Street (now Broadwick Street) and the pub in his honor. I also made a point to use the modern public convenience on Broadwick Street and was amused by the washing gizmo that soaps, waters, rinses, and dries one’s hands without one having to touch anything.
Photo of the pub sign for the John Snow pub courtesy of Mordaxus. I apologize for leaving the decent camera at home, and thus having to make do with the camera in my mobile.
I think you’ve got a dubious link here. John Snow (not to belittle his contributions) had a specific issue (cholera), occurring in a specific area (Soho), causing specific problems (illness/death). He could gather specific data about the location of illness and death, and plot it on a map.
Just addressing information leaks (and not security in general), we’ve got a general issue, occurring in many areas, causing a variety of problems.
While I’m certainly not going to claim that more information couldn’t be helpful, I remain decidedly unconvinced that it would turn out to be anything more than – well – more information.
How would you structure the analysis of breach data, to provide meaningful information?
As a trivial example, take information leaked from stolen laptops: we might conclude that laptops are most often stolen from vehicles parked near stadiums – so information breaches can be prevented by not parking near stadiums…
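To make the commenter’s point concrete, here is a minimal sketch of the Snow-style tallying being debated. The records, field names (`cause`, `location`), and values are all invented for illustration; real breach data would need far richer and more consistent fields before any conclusion could be drawn.

```python
from collections import Counter

# Hypothetical breach records; every field and value here is invented.
breaches = [
    {"cause": "stolen laptop", "location": "vehicle"},
    {"cause": "stolen laptop", "location": "office"},
    {"cause": "misconfigured server", "location": "data center"},
    {"cause": "stolen laptop", "location": "vehicle"},
]

# Tally incidents by (cause, location), the way Snow tallied deaths by address.
tally = Counter((b["cause"], b["location"]) for b in breaches)

for (cause, location), count in tally.most_common():
    print(f"{count:3d}  {cause} / {location}")
```

Of course, such a tally shows only where incidents cluster, not what causes them, which is exactly the stadium-parking trap described above: correlation in the counts is not a causal model.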
Overview
Do you cringe at the subjectivity applied to security at every turn? If so, MetriCon 2.0 may be your antidote: a chance to change security from an artistic “matter of opinion” into an objective, quantifiable science. The time for adjectives and adverbs has gone; the time for hard facts and data has come.
MetriCon 2.0 is intended as a forum for lively, practical discussion in the area of security metrics. It is a forum for quantifiable approaches and results to problems afflicting information security today, with a bias towards practical, specific implementations. Topics and presentations will be selected for their potential to stimulate discussion in the Workshop.
MetriCon 2.0 will be a one-day event, Tuesday, August 7, 2007, …
Cat, I think you are basically right. Snow was working at a time when data analysis to support conclusions was a new idea; now it is an old idea.
What is left is critical thought. I do not believe we have mastered that, as yet. Two examples:
Last year the Nobel for Medicine went to a pair of loony Australian doctors who thought that stomach ulcers were caused by bacteria, not stress. Well, everyone knew they were caused by stress. The two doctors saw the data differently, but still nobody was convinced. It was only when one of the doctors infected himself with the bacteria, gave himself an ulcer, and then cured it with antibiotics that people started to take notice.
The issue was not the data; that was easy to look at and repeat. It was the thought process: it took a crazy circus trick to get people to re-assess their beliefs.
Second example: we’ve known all about phishing since 2003–2004, and even before that, because it was a straightforward, well-known, historical, and predicted weakness. If you asked around, you could find people who would tell you what it was. The data was pouring in … and yet here we are in 2007, and you still can’t find two security experts (?!) who agree on how to solve it.
Why not? I surmise that it is because, to do that, we as a security industry would have to look at our flagship product — secure browsing to protect ecommerce — and basically say “we failed.” We put the wrong product in place at the wrong time, it failed when attacked, and then we did nothing.
Until we cross that hurdle, we’ll not get anywhere … I surmise … because none of us want to go where it is darkest. But you won’t get anyone to agree with me 🙂