…he hoped that all the casualties and accidents, which had occurred during their progress, would be noticed in revising the Paper; for nothing was so instructive to the younger Members of the Profession, as records of accidents in large works, and of the means employed in repairing the damage. A faithful account of those accidents, and of the means by which the consequences were met, was really more valuable than a description of the most successful works. The older Engineers derived their most useful store of experience from the observations of those casualties which had occurred to their own and to other works, and it was most important that they should be faithfully recorded in the archives of the Institution.
Today Robert Stephenson would likely express the same hope, mutatis mutandis, about the failure of computer programs and the measures that have been taken to protect them.
Now, Petroski is talking about failures of computers in engineering, rather than the engineering of computers. But I have little doubt he'd say the same applies to the latter. Both the chapter “The limits of design” and the new afterword are worth reading with an eye to what they can teach us about information security and disclosure. Actually, the entire book is worth reading, but the analogies are strongest there.