Although not especially “current,” Normal Accidents: Living with High-Risk Technologies is a 1984 book by Yale sociologist Charles Perrow that analyses complex systems from a sociological perspective. Perrow argues that multiple, unexpected failures are built into society's complex and tightly coupled systems, and that the resulting accidents are unavoidable and cannot be designed around. Several historical disasters are analysed. I read a newer edition published in 1999, to which the author had added a chapter on Chernobyl, which turned out to be a textbook example of Perrow’s theory (in particular, that adding fail-safes also adds complexity, and so does not necessarily make a system any safer; the Chernobyl disaster was precipitated at least in part by a tight schedule to test a fail-safe system). The book is fascinating and hard to put down. Perrow’s book is best combined with a reading of The Doomsday Machine: Confessions of a Nuclear War Planner, by Daniel Ellsberg.
I'm a retired neurosurgical anesthesiologist (38 years in practice). I read Perrow's book several years after it was published, and I was struck by how relevant his points of failure were to the practice of anesthesiology, in particular the danger of tight coupling. I referred to the book over subsequent decades in my Grand Rounds presentations, but to my knowledge none of the residents or other attendings ever read it.
Read a sample here: https://www.amazon.com/Normal-Accidents-Living-High-Risk-Tec...
Other books I’ve much enjoyed, if your interest is in structural or other engineering failures:
Why Buildings Fall Down: How Structures Fail by Matthys Levy and Mario Salvadori, a wide-ranging history of structural failures of various kinds, and their causes.
Ignition!: An Informal History of Liquid Rocket Propellants by John Drury Clark, a personal memoir from a senior researcher with many decades' experience developing rocket fuels; he is the proverbial Rocket Scientist. Most interesting, and amusing (in a morbid way), is the quite different safety culture “back in the day” of this somewhat esoteric engineering/chemistry field.
(okay, I'll stop now!)
I just had a conversation about engineers not understanding the need for grounding.
I'm wondering if every generation has to relearn the basics for themselves through experience.
Each generation has to make the same mistakes. Because book learning doesn't seem to do it for some things.
Sure. Even a history of safety success contributes to this. “We haven't had an accident in 3,000 days; what was dangerous about this job again? And what's this stupid policy for, anyway? I've never seen anybody even come close to (non-dangerous-sounding fate) while working here.”
But the policy is probably there because that used to happen before the policy existed. It's just not obvious to people who have never seen the consequences.
Complacency kills! It's why it's usually the old farmers who die in stupid ways.
I'm also reminded of the Yale machine-shop safety supervisor who died after being wound around a lathe spindle: working alone, late at night, on powerful rotating machinery, wearing loose clothing.
Most people are just resistant to learning (without pain).
I'm not saying it is right, or correct.
Just that it happens.
Learn about failures.
Inviting Disaster: Lessons From the Edge of Technology was one of the texts for an aerospace class I didn't take (though friends did); honestly, you can just read the book.
There are lots of frameworks for teaching safety, and compliance programs and such, but they are far too easy to cargo-cult if you don't appreciate the need for a safety culture and UNDERSTAND what failures look like.
And when you really understand that need, and how significant failures actually happened, "state of the art" tools and practices take a back seat; they can be useful, but they're just tools. What you need is people developing the appropriate vision, and with that the right things tend to follow.
The STPA and CAST handbooks are available for free from the MIT Partnership for Systems Approaches to Safety and Security website. They are phenomenal.
http://psas.scripts.mit.edu/home/books-and-handbooks/