TSA Breaks Planes (and a link to infosec)
Aero News Network has a fascinating story, “ANN Special Report: TSA Memo Suggests That Agency ‘Encourages’ Damaging Behavior.” It covers how a TSA goon climbed up a plane using equipment marked “not a handhold,” damaging it and putting the flying public at risk. It continues:
While this may be terrifying on a number of levels, the situation becomes far more questionable with the release of a recent memo from the TSA in which such damaging and destructive actions are apparently ENCOURAGED. The memo clearly states that, “Aircraft operators are required to secure each unattended aircraft to make sure that people with bad intent cannot gain access to the planes. But during the inspection, TSA’s inspector was able to pull himself inside of an unattended aircraft by using a tube that was protruding from the side of the plane. TSA encourages its inspectors to look for and exploit vulnerabilities of this type.”
There are a couple of things I want to say about this. The first is that the TSA seems to be orienting its “inspectors” toward the idea that no indignity or stupidity is too great. This is the natural result of there being no accountability.
While it’s fun to rage at the TSA like this, I don’t want to throw stones from a glass house. In information security, we sometimes tend the same way. Security risks are seen as accruing to the career of the CSO, and smart CSOs shift jobs often to avoid carrying that risk. (I forget who pointed this out, or I’d give credit.)
Implementing controls for a set of rare, high-impact risks is hard. The TSA, DHS and the President ought to be telling Americans not to be scared, and to accept that these things may happen again despite our best efforts. That has been the lesson of societies including the UK, France, Germany and Japan, not to mention Israel.
Fortunately, in information security, we have lots of common risks to go after, if only we’d pay attention.
So long as we define “safe” as “100% safe,” and treat anything less as “unsafe,” “in danger,” and therefore a problem, we as a culture will keep doing this sort of thing. That matters in the many roles we play throughout our lives: as citizens, security professionals, engineers and designers, parents and teachers (professional and otherwise).
We need to remind the world that, as democrats and thinkers from Pericles to Jefferson have said, a little loss, a little blood, is the price we pay for liberty, and that liberty is what actually makes us safer. Think of it as an investment in the future.
I suspect one issue here is defining the roles of the different people. TSA employees ought to be working out unauthorized ways to get into planes, since that is exactly how an attacker would put a bomb on a plane without blowing himself up. But there’s a place and a time for penetration testing, and it’s not on a safety-critical production system that’s in use. Similarly, while it would be nice for someone from the FDA to work out what frequency and power of microwaves will mess up a pacemaker, the cardiac ward of the local hospital isn’t the right place to go around testing pacemaker-jamming gear.