Comment by dan_quixote

5 years ago

I spent many years working very closely with Boeing, though never for Boeing, so I wasn't privy to meetings that may have included the alleged malfeasance you describe. What I did see was communication breakdown between major teams. Some interface features were mutually assumed to be owned by another team and fell through the cracks. MCAS, added in response to the MAX's altered thrust vector, is commonly cited as an example of this.

But I think it's worth noting to the (mostly software-oriented) HN crowd: aerospace projects have massive manufacturing cycle times. Some things literally take years to manufacture for the proof-of-concept stage, let alone production. You can't NOT be incredibly schedule-oriented in this environment. That can create perverse incentives for management (which must be mitigated), but if you want to deliver anything, there will always be somewhat risky last-minute changes that can look ill-advised in hindsight.

Schedule matters, especially for long-lead items. The problem arises when schedule overrides good judgement about safety. My personal opinion is that this is largely rooted in human cognitive biases that make us bad at objectively assessing risk. We simply aren't wired to think statistically, and on top of that we have a multitude of biases that undermine our judgement, especially about low-probability events.

  • > human cognitive biases that make us bad at objectively assessing risk

    I generally agree, strongly (e.g. car vs. air safety). Though this group is highly cognizant of KNOWN risks; it's the UNKNOWN risks (usually found at those interfaces I mentioned above) that get overlooked. Unknown risks will definitely be more problematic in a compressed-schedule environment.