
Comment by inimino

8 years ago

> The system did exactly what it was intended to do, it was the humans who screwed up.

It was the humans who designed, and who chose to deploy, a system with no human in the loop and no override (even after a director got involved) who screwed up.

Maybe, but the humans who didn't renew his employment status knew how the system worked. They screwed up more.

  • They screwed up, but that happens. The point where the system takes over and even the higher-ups can't override it is where the story becomes Kafkaesque.

    If you build an automation system that goes out of human control after a human error, that is a failed design.

    • >If you build an automation system that goes out of human control after a human error, that is a failed design.

      Unless it was designed with the intent that human intervention should be impossible once a process was started. That would make it a very poor design, but not a failed one.

      I've seen lots of internal software that doesn't have failsafes or rollbacks - the operator is simply trained on the procedure and then expected to follow it. Software that accounts for operator error is more complex, and therefore more expensive, to produce. Cost often takes precedence over quality or flexibility when these systems are developed.
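      The failsafe being discussed is cheap to sketch, if not cheap to retrofit. Here's a minimal illustration (all names are hypothetical, not from any real system) of the difference between a pipeline that runs to completion automatically and one that pauses for human approval before any irreversible step:

      ```python
      # Hypothetical sketch: an automated pipeline with a human-in-the-loop
      # checkpoint before irreversible steps. Names are illustrative only.
      from dataclasses import dataclass, field

      @dataclass
      class Pipeline:
          steps: list = field(default_factory=list)
          halted: bool = False

          def add_step(self, name, action, reversible=True):
              self.steps.append((name, action, reversible))

          def run(self, approve):
              """Run each step in order; irreversible steps require approval.

              `approve` is a callback standing in for an operator or a
              supervisor override -- the safeguard the thread says is
              often omitted to save cost.
              """
              done = []
              for name, action, reversible in self.steps:
                  if not reversible and not approve(name):
                      self.halted = True  # stop before the point of no return
                      return done
                  action()
                  done.append(name)
              return done

      executed = []
      p = Pipeline()
      p.add_step("disable_badge", lambda: executed.append("disable_badge"))
      p.add_step("wipe_laptop", lambda: executed.append("wipe_laptop"),
                 reversible=False)

      # An operator who declines the irreversible step halts the pipeline:
      p.run(approve=lambda step: False)
      # executed == ["disable_badge"]; "wipe_laptop" never ran
      ```

      The fully automatic system in the story is equivalent to hard-coding `approve` to always return `True` - which is exactly the design choice being criticized.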
