Evolve or Die

Or at least become more vulnerable. I’ve recently been helping a client with their secure coding initiative, and as a result I’ve been reading Michael Howard and David LeBlanc’s Writing Secure Code, which reminded me of an important and often overlooked aspect of maintaining a secure code base: as code ages, it becomes insecure.
This is most readily apparent with web applications, but it is true for any code base. I’ve worked with several clients who, years ago, brought in firms such as @stake, ISS, or iSEC Partners for an assessment and then addressed all of the problems that were found. Time goes by, and customers running scanning tools from vendors like NT Objectives and Watchfire start sending in bug reports. So the client calls me up and says something like: “What happened? Those security guys we hired years ago must have been crappy; suddenly customers are calling up claiming we’re insecure! How can that be? We haven’t changed the code in those modules in ages!”
The explanation is pretty straightforward. The state of the art in finding vulnerabilities has moved forward, while the client’s controls for dealing with vulnerabilities have stayed the same. As a result, the source code has naturally regressed and become more vulnerable over time, much like a piece of machinery wearing out. We like to say that old, well-understood, well-tested code is far better than new code, and while in general I’m inclined to agree, one needs to remember that “well tested” means adjusting the tests to keep up with the advancing state of vulnerability research.
While this regression is inevitable, there are things that can be done to slow it down. Most notable are practices that reduce the attack surface as much as possible: running applications with the least necessary privilege and following the other security principles of Saltzer and Schroeder. Similarly, designing filters to permit known-good data, as opposed to attempting to enumerate bad behavior, will get you a long way in the right direction. But in the end, you have to just keep on testing.
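As a rough illustration of the permit-known-good approach, here is a minimal C sketch; the function name is_valid_username and the particular character set are illustrative assumptions, not a prescription:

    #include <ctype.h>
    #include <stddef.h>

    /* Allowlist filter: accept only characters we explicitly permit
     * (ASCII letters, digits, '.', '-', '_') rather than trying to
     * enumerate every dangerous character an attacker might send. */
    static int is_valid_username(const char *s, size_t maxlen)
    {
        if (s == NULL || s[0] == '\0')
            return 0;                         /* empty input: reject */
        for (size_t i = 0; s[i] != '\0'; i++) {
            if (i >= maxlen)
                return 0;                     /* too long: reject */
            unsigned char c = (unsigned char)s[i];
            if (!isalnum(c) && c != '.' && c != '-' && c != '_')
                return 0;                     /* not permitted: reject */
        }
        return 1;
    }

The nice property is that a newly discovered attack encoding doesn’t change this filter at all, whereas a blocklist has to be revised every time the state of the art advances.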

6 comments on "Evolve or Die"

  • I think in the future we are going to change track. Bringing in the rafts of consultants and know-it-alls to fix what we did wrong didn’t work.
    For a start, there just aren’t enough consultants to go round, and that’s even before we filter out those who are narrowly focussed on their little patch, and are therefore out of sync with needs.
    It seems fairly logical, inescapable even, that if the consultants can’t do it (for whatever reason) then it’s the client’s job. He has to do it. The client has to build the security system to suit his own needs. Outsourcing security is negligence.
    In the future, we may be bringing in teachers instead. The goal is to learn how to do it right, ourselves. (Yup, Microsoft probably got that principle right.)

  • Ryan Russell says:

    I wish to argue semantics.
    The vulnerabilities were there the whole time; it’s just that more of them are found over time (yes, due to new techniques).

  • Dan Weber says:

    Software can definitely become more vulnerable over time.
    I can write a piece of software that links against libfoo.so but uses one of its library calls in an undocumented way. However, with the version of libfoo.so that I’m using, my software is absolutely unexploitable.
    A year down the road, the libfoo maintainers release a new version, fixing some other security problems, and changing the way that my library call works. Now I am exploitable, even though my code never changed.
    I suppose that one could stretch things to say that my use of the library in an undocumented way is the vulnerability. But then I could change my scenario to one in which the library maintainers change their documentation, so that my use was within the library’s spec at the time, even if it isn’t now. (A sketch of this scenario follows at the end of this comment.)
    This example really isn’t that far-fetched, because in a Web 2.0 world pieces of the running software are everywhere. Instead of changing libfoo.so, the web browsers and protocols and operating systems change all around you, like Marty McFly finding himself in a strange new world because Biff changed the world.
    For example, you rely on JavaScript’s same-origin policy, but there are common browsers out there which ignore it, which lets your users have all their cookies stolen. Oh, sure, it was the browser that had “the bug,” but do you think your customers care?
    Do you think CSRF attacks are a flaw in the web browser or in the web application? Should a web application written in 1997 have cared about someone browsing the web with multiple tabs? Or with a browser that has a version of Javascript that lets the refer(r)er location be altered?
    And, really, does it matter? Unless you’ve written some weird contract that lets you force the people who wrote your “secure application” 10 years ago to fix the vulnerabilities that were always there, do endless semantic arguments gain anyone anything?
    (Well, maybe it does, if Bruce Schneier manages to get any traction with his “make developers responsible for the vulnerabilities they make” proposals.)
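    (A minimal sketch of the libfoo scenario above; libfoo and foo_copy are hypothetical names, and the “undocumented behavior” here is NUL-termination:)

        #include <stdio.h>

        /* Hypothetical libfoo call: documented only to copy at most n
         * bytes of src into dst. The old implementation also happened
         * to NUL-terminate dst, and the caller below quietly relies
         * on that undocumented behavior. */
        extern size_t foo_copy(char *dst, const char *src, size_t n);

        void greet(const char *name)
        {
            char buf[32];
            foo_copy(buf, name, sizeof(buf));
            /* Fine with the old libfoo. A new libfoo that stays within
             * its documented contract but stops NUL-terminating turns
             * this printf into a read past the end of buf -- the code
             * here never changed, yet it is now vulnerable. */
            printf("hello, %s\n", buf);
        }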

  • I’m with Ryan’s comments above. The code doesn’t necessarily change or wear out, but subsequent patching (if it occurs) can degrade old code, and new techniques like fuzzing can reveal inherent flaws and bugs that were always there, just never found before.
    Those consultants may have done a fabulous job, and, in their day, the code may have been as secure as they knew how to make it. But…time reveals all…

  • Just to clarify…code security can change:
    – due to changes in assumptions (such as Dan’s underlying libraries in the above comment)
    – due to changes in the actual code (patches, tweaks, rewrites)
    – due to changes in the attackers or environment (newly discovered methods of compromise)
    – due to changing standards/specifications (we didn’t care before that sales people had access to the data this code protects, but we do now because of outside compliance, which changes the original spec of the code)

Comments are closed.