Comment by bumby
5 years ago
I think this runs deeper than an “MBA vs Engineer” perspective. Having worked in the industry, but with no direct knowledge of how this project was run (outside of what’s been reported in the news), here’s my take:
If I were a betting person, my guess is they have created a culture of “schedule first”. Other orgs managed by engineers have been chastised for this in the past (looking at you, NASA). The problem is that the risk/reward is asymmetric when managing these projects. Even though the severity can be extremely high, the probability of an error of this magnitude tends to be very small. This creates an incentive for ambitious souls to be aggressive and continually roll the dice to meet schedule. As a former manager once told me when I was in a safety role, “you don’t want to get a reputation for slowing things down.”
To oversimplify things, say there’s a 1% chance any project can end in disaster. You can literally build an entire career ignoring that risk and still look like a star. Probability is on the individual’s side. Meanwhile, the manager pushing to do things the right way to minimize that risk will almost certainly bear a higher cost/schedule burden. But in aggregate, that probability will catch up to the organization.
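The asymmetry above is easy to make concrete with a back-of-the-envelope calculation. The 1% figure is the comment's own; the project counts here are invented purely for illustration:

```python
# Per-project probability of disaster, from the 1% figure above.
p = 0.01

# A manager who rolls the dice on 10 projects over a career will,
# more often than not, never see a disaster:
p_career = 1 - (1 - p) ** 10    # roughly a 10% chance of ever being burned

# But an organization running 500 such projects is almost
# guaranteed to get burned at least once:
p_org = 1 - (1 - p) ** 500      # over 99%

print(f"individual: {p_career:.1%}, organization: {p_org:.1%}")
```

The individual's odds look great; the organization's don't, which is exactly the incentive mismatch the comment describes.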
What makes me think this? There’s ample reporting that Boeing didn’t follow their own procedures because they didn’t want to delay. Hazard analysis documentation wasn’t reflective of the design because the paperwork wasn’t getting updated. They didn’t follow their own procedures regarding redundancy in the design of items identified in the HA. I suspect they knew this was wrong, which is why they obfuscated details in the investigation.
Culture matters, irrespective of academic degrees.
Surprisingly enough, it doesn't run deeper. Boeing was once an engineering center of excellence. Building bulletproof planes was what they did, and it showed. Then they decided to acquire McDonnell Douglas, which was run by accountants. For reasons I'm not sure anyone can explain, the MD executives somehow not only survived the merger but took control of the company. The end result was a move to pinch pennies and maximize profit vs. a focus on quality and engineering.
https://www.theatlantic.com/ideas/archive/2019/11/how-boeing...
>With ethics now front and center, Condit was forced out and replaced with Stonecipher, who promptly affirmed: “When people say I changed the culture of Boeing, that was the intent, so that it’s run like a business rather than a great engineering firm.”
I wonder if the implication is that being run like an “engineering firm” is no longer able to produce a competitive company culture. What the MBA vs Engineer framing doesn’t do is explain why other organizations run by engineers have made the same kinds of mistakes. I think there’s a more general through line.
I spent many years working very close to Boeing, though never for Boeing so I wasn't privy to meetings that may have included the alleged malfeasance you describe. What I did see was communication breakdown between major teams. Some interface features were mutually assumed to be owned by another team and fell through the cracks. MCAS in response to unstable thrust vector is commonly cited as an example of this.
But I think it's worth noting to the (mostly software oriented) HN crowd - aerospace projects have massive manufacturing cycle times. Some things literally take years to manufacture for the proof-of-concept stage, let alone production. You can't NOT be incredibly schedule-oriented in this environment. This can cause some perverse incentives for management (which must be mitigated), but there are always going to be somewhat risky last-minute changes that could seem ill-advised in hindsight if you want to deliver something.
Schedule matters, especially for long-lead items. The problem comes when schedule overrides good judgment regarding safety. My personal opinion is that this is largely rooted in human cognitive biases that make us bad at objectively assessing risk. We just aren’t wired to think statistically, and then we have multitudes of biases that undermine our ability to make judgments, especially about low-probability events.
> human cognitive biases that make us bad at objectively assessing risk
I generally agree strongly (e.g. car vs air safety). Though this group is highly cognizant of KNOWN risks. It's the UNKNOWN risks (usually found at those interfaces I mentioned above) that get overlooked. Unknown risks definitely will be more problematic in a compressed-schedule environment.
They DLC'd the secondary/redundant AoA sensor, a critical safety feature whose absence was literally a direct cause of the crashes.
True. But the hazard analysis reported in the Seattle Times showed Boeing listed it as a “hazardous” failure item. Setting aside that it should have been classified as “catastrophic” because of its ability to cause loss of the aircraft, even at the “hazardous” level their own procedures made a redundant element required.
Required, not optional if the customer pays enough.
Actually, I wonder whether having two sensors is a good idea anyway. There is no quorum of two: if one of the sensors is off, what's the actual angle of attack?
I don't understand much about sensor redundancy, but three would allow the bogus reading to be voted out.
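That's the classic triple-modular-redundancy argument: with three independent readings you take the median, so one bad sensor gets outvoted. A minimal sketch with made-up numbers, not real avionics logic:

```python
def voted_aoa(r1: float, r2: float, r3: float) -> float:
    # Median of three readings: a single sensor stuck high or low
    # is outvoted by the two healthy ones.
    return sorted([r1, r2, r3])[1]

# Hypothetical readings in degrees; the 47.0 sensor has failed high.
print(voted_aoa(4.9, 47.0, 5.1))  # -> 5.1
```

With only two readings there is no median to fall back on, which is the quorum problem raised above.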
It seems such a weird thing to cut corners on.
It fed the main new feature of the plane, one addressing a problem they knew was potentially dangerous, which is why they had to install the auto-nose-down system in the first place.
How expensive could a couple of extra sensors even be compared to a whole plane?
> Actually I wonder how having two sensors is a good idea, anyways. There is no quorum of two, if one of the sensors is off, what's the actual angle of attack?
Not knowing the angle of attack is miles better than not knowing that you don't know the angle of attack.
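In other words, two sensors can't tell you which reading is right, but they can tell you that something is wrong, which is enough to stop trusting the automation. A sketch of that idea (the 5-degree tolerance is an invented number, not the real AOA DISAGREE threshold):

```python
def aoa_agree(r1: float, r2: float, tolerance_deg: float = 5.0) -> bool:
    # With only two sensors you can't pick the correct value,
    # but you CAN detect a disagreement and fail safe.
    return abs(r1 - r2) <= tolerance_deg

if not aoa_agree(4.9, 47.0):
    print("AOA DISAGREE: inhibit automation, alert the crew")
```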
There are usually guidelines based on the criticality of the equipment and the required reliability.
"dlc"? What does that mean?
Downloadable content, a reference to large game studios' reputations for nickel-and-diming players on what should be core gameplay features of a video game.
They made it an optional extra that you had to pay more for.
This would be fine if we were talking about carpeting on the flight deck, but the AoA sensor is a critical feature.
They made it an optional feature you have to pay more for.
Just want to say I think your response is well articulated, and has been sitting with me for days. Reflecting on how "schedule first" is driving things in my org...