Re: Business, Economics, and Tax Policy 8: Bezos Takes Over the World
I'd like to know what our resident aerospace engineer thinks of this.
Definitely quite an article.
Human factors are the hardest things to design around, because they are so uncertain. Flight-critical functions (i.e., ones that have to work or the airplane crashes) are required to be designed to a failure rate of less than once per billion flight hours (1e-9). Commercial airplanes may have a useful life of up to 50,000 flight hours, so reaching a billion hours of flying would require a fleet of 20,000 airplanes each flying out its full life. So unless the engineers make errors (which can be either technical or judgmental), the systems themselves are supposed to be incredibly reliable.
So the weakest link in any aircraft should be the humans involved - the pilots and maintainers. I don't care how well designed the cockpit is or how much training you provide, you'll never convince me that during those same billion hours of flying, the human pilots and maintainers would make only a single error that would crash the plane. It's absolutely no surprise to me that pilot and maintainer error was a major, perhaps even dominant, factor in these cases. I've also never assumed it was a coincidence that the two airlines that crashed were both from countries with less-than-robust political and regulatory systems.
However, the aircraft designer has to give the pilots a fighting chance. Maybe pilots properly trained in both airmanship and safety attitude should be able to correctly diagnose and correct a runaway trim condition 99.99% of the time. That still doesn't mean it's okay to design an aircraft that can throw that condition at the pilots due to a single failure of an angle of attack (AOA) sensor. Boeing has built ~10,000 737s, so if each of them accumulates 50,000 flight hours, that's 500 million flight hours. Individual aircraft components typically have failure rates of around once per 20,000 flight hours, so that would be 25,000 failures of that AOA sensor over the life of the fleet. If pilots can "only" be counted on to correct the problem 99.99% of the time, that means that this design flaw would be expected to initiate .0001 x 25,000 = 2.5 crashes. That's the problem with huge numbers - even a tiny fraction of a huge number is still an unacceptably large number.
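The expected-crash arithmetic works out as follows (all inputs are my rough estimates above - fleet size, airframe life, sensor failure rate, and pilot reliability - not published reliability data):

```python
FLEET_SIZE = 10_000         # ~number of 737s built
LIFE_HOURS = 50_000         # assumed flight hours per airframe
SENSOR_MTBF = 20_000        # assumed hours between AOA sensor failures
PILOT_RELIABILITY = 0.9999  # pilots recover 99.99% of runaway-trim events

fleet_hours = FLEET_SIZE * LIFE_HOURS         # 500,000,000 flight hours
sensor_failures = fleet_hours / SENSOR_MTBF   # 25,000 AOA sensor failures
expected_crashes = sensor_failures * (1 - PILOT_RELIABILITY)
print(expected_crashes)  # about 2.5 crashes over the fleet's life
```

Even with near-perfect pilots, a tiny per-event probability times 25,000 events still yields multiple hull losses.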
In contrast, there are ~400 737 MAXes, and they've only been around for 5 years. If we generously say that they each have 5,000 hours, that's a total fleet life of 2M hours. If the AOA failure rate is close to my 20,000-hour estimate, that means the fleet should have experienced 100 AOA sensor failures by now. 2 of those have led to fatal crashes, so that implies that pilots have only reacted correctly 98/100 times = 98%. Extrapolated to the full fleet (500M hours), that would mean .02 x 25,000 = 500 crashes. There's just no way that an airplane should be designed such that a pilot has to react correctly more than 99% of the time to keep the number of crashes at an acceptably low level.
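The MAX extrapolation runs the same way (the 400-aircraft fleet, 5,000 hours each, and 20,000-hour sensor failure rate are the rough estimates above):

```python
MAX_FLEET = 400        # rough 737 MAX fleet size
HOURS_EACH = 5_000     # generous hours per airframe so far
SENSOR_MTBF = 20_000   # assumed hours between AOA sensor failures
CRASHES_SO_FAR = 2

max_fleet_hours = MAX_FLEET * HOURS_EACH            # 2,000,000 hours
estimated_failures = max_fleet_hours / SENSOR_MTBF  # ~100 AOA failures to date
observed_reliability = 1 - CRASHES_SO_FAR / estimated_failures  # 0.98

# Extrapolate to a full 500M-hour fleet life (25,000 sensor failures):
lifetime_failures = 500_000_000 / SENSOR_MTBF
projected_crashes = lifetime_failures * (1 - observed_reliability)
print(projected_crashes)  # about 500 crashes
```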
So yes, pilot error exists and was definitely a factor in these cases. But Boeing KNOWS that pilot error exists and therefore should not design in situations that require ridiculously high pilot reliability in order to prevent crashes. In my mind, the responsibility still lies with Boeing.