James Reason’s entire career was full of mistakes. Most of them were other people’s. And while we all feel that way, in his case it was really true. As a professor of psychology, he made a career of studying human errors and how to prevent them. He has a list of awards that’s a full paragraph long, but perhaps most interesting is that he’s an honorary fellow of the Royal College of General Practitioners for his work in reducing medical errors. “The Human Contribution” is a broad, accessible and fun book on the human contribution to errors, failures and crises. At times, it rises from ‘merely’ thought-provoking to awe-inspiring.
Part I includes a “mind user’s guide” covering easily triggered failures of thinking, like the tip-of-the-tongue state. Part II covers unsafe acts, including (chapter 3) a set of ways of classifying errors, and the differences between rule-based mistakes (applying a rule that doesn’t apply) and knowledge-based ones, where people don’t know the right solution and errors are extremely common. Chapter 4 covers violations, including a long discussion of why people violate rules:
For many acts of non-compliance, experience shows that violating is often an easier way of working and brings no obvious bad effects. The benefits are immediate and the costs are seemingly remote and, in the case of accidents, unlikely.
Chapter 5 covers different ways people think about unsafe acts. The “plague model” is that these things just happen. They’re unpredictable and hard to control. The “person model” is focused on individual unsafe acts and their origins. The “legal model” adds a moralistic aspect to the person model that “someone must be punished.” The chapter closes with a system model, showing how individual choices, the organization and its policies and procedures can come together in a variety of ways that influence accidents.
Part III is short, covering accident traps, recurrent accident patterns and culture in chapter 6, and the influence of a few significant accident investigations. (That is, where the investigations were significant for advancing the state of our understanding of accidents.)
Part IV is a rare touch in books on errors. It covers heroic recoveries in four ways. Chapter 8, “Training, discipline and leadership,” focuses on two long-distance military retreats. Chapter 9, “Sheer Unadulterated Professionalism,” covers in depth the rescue of Titanic survivors and Apollo 13, as well as British Airways flight 09, a BAC 1-11, and surgical errors. Chapter 10, “Skill and luck,” covers the intersection of skill and luck with the Gimli glider and United 232. When a Boeing 767 flying as Air Canada flight 143 lost power, the pilot was an experienced glider pilot, and the co-pilot had flown out of Gimli. The degree of luck that AC 143 was in range of Gimli, and that the co-pilot knew where the base was, is nearly incalculable. (The base, being closed, was not on the flight charts.) But without the pilot’s skill at unpowered flight, and his willingness to risk flying a 767 like a glider, the odds of a landing people walked away from were very low. Chapter 11, “Inspired Improvisations,” covers the ways in which people’s unique skills and experiences can lead to unusual but effective solutions. The section closes with a chapter on “The ingredients of heroic recovery.” The actions covered here are heroic in many senses, and a fascinating collection of stories. But it’s more than that; it’s a set of lessons which can be extracted and applied elsewhere.
Part V, which closes the book, covers achieving resilience in a chapter on “Individual and Collective Mindfulness” and “In search of safety.”
The book has substantially influenced my thinking on product management and the tradeoffs between security, design beauty and time to market. That, perhaps, is another blog post. More importantly, this is an important book, and worth the time of readers of The New School.
It’s not that safety management and risk management are identical, but rather that they can and should inform each other. But the real New School angle to “The Human Contribution” is the underlying premise that we must study the real errors and even near-misses that systems produce, and how people react to them. It is only through that study that we can build systems which will be safe enough to satisfy us.
PS: Someone I spoke with at BlackHat recommended this book. Thank you!