Comment by eternityforest

3 years ago

The real URL is how.complexsystems.fail

Point 2 is "Complex systems are heavily and successfully defended against failure"

Complex systems do fail. But airplanes are still extremely safe, because people stacked on even more complex systems, often making worldwide changes in response to an accident that happened once.

You constantly hear about how safe it is to fly. And yet hardly anyone seems to learn from their successes. When you stop accepting failure and are willing to disrupt everything if it saves even one life, you can do a lot.

Complex systems may be unreliable, but with enough work, it seems we can sometimes make the overall picture safer than not having them.

I can't firmware update all of mankind to never leave a baby in a hot car. But they can put sensors on seats and continually do studies to be sure it's working. Complex systems are sometimes more controllable than people or simple systems.

The choice sometimes seems to be "Add complexity, do nothing, or do something that nobody will accept"

I really see your point here; but I have to caution: Airplanes are "exactly as simple" as they need to be.

There is a lot that goes into their design to simplify things greatly; you're probably thinking of complicated computer systems that are used in planes.

But those computer systems are incredibly simple compared to what we use or build atop of: as simple as they have to be in order to be fully understood.

  • They're still far more complicated than a layman might guess after years of hearing that simple is always better.

    I'm guessing things like fly by wire would be automatically assumed to be unsafe by a lot of people.

When cars get features that are anything like what goes into planes, people tend to get upset and say "I'm not an idiot; you shouldn't make cars expensive and complicated just because some drivers can't stop crashing without a computer".

> are willing to disrupt everything if it saves even one life

I feel like you're walking away with the wrong lesson. Disrupting everything is a great way to blow up complex systems. You want to change things gradually, ensuring that the human side can keep up.

  • A lot of the time what happens is the human side doesn't need to keep up.

They'll say "This actuator fails if driven past its limit in hot weather after a rainstorm, and we have data showing that people can overdrive it accidentally in this condition".

Then they'll replace all affected actuators even if it costs millions.

    Or they'll add a software patch to keep you from overdriving it.

    What they don't do is say "It's probably fine, people just need to be more careful". If someone made a mistake once, someone else can make it again. Systems have to be built for the people who will actually use them, not theoretical elite users.

On occasion the technical fix has its own dangers that need to be evaluated, and you can't find any substitute for operators doing the right thing (see Gare de Lyon for the perfect example of multiple human errors by different people interacting with complex safety systems).

    But only some careful analysis will tell you what's more dangerous.

"I can't firmware update all of mankind to never leave a baby in a hot car. But they can put sensors on seats and continually do studies to be sure it's working."

We can also put mandatory sensors in people's bodies, to make sure they act and live all right.

But I think this would be overcomplicating things.

  • Complexity wouldn't be the problem, the issue would be violation of people's bodies.

    Something like a car seat sensor is just a consumer product safety regulation that does not imply any extreme expense, danger, or violation, except to very extreme anti-tech or anti-regulation people. It's a further development of the same trend as headlight or seatbelt related laws.

    Plus, it protects people who have no way of protecting themselves, from mistakes that are made by people who have actively been prevented all their life (via confidence culture) from having the tools to prevent making them.

    • "Plus, it protects people who have no way of protecting themselves, from mistakes that are made by people who have actively been prevented all their life"

Let me put it this way: idiots who leave their babies in a hot car would actually need a million other mandated sensors too, and I would not trust them with small lives, or with a dangerous, fast, heavy bullet like a car, in the first place.

But the default should be trust, not micromanaging people on the assumption that they are all idiots. Because that keeps people as idiots.
