3rd May 2019, 15:21
GordonR_Cape
 
Join Date: Dec 2015
Location: Cape Town, ZA
Age: 62
Posts: 424
Originally Posted by 737 Driver
Since we have seemingly now moved on to a focus on the human factors element (i.e. the why behind the flight crew errors), I think it might be worthwhile to expand on this topic a bit.

Clearly, these accidents have exposed a case where some presumably highly trained and experienced professionals were put in a position where that training and experience were not equal to the task. Obvious errors were made that had fatal consequences. Since most of us would like to assume that this wasn't a case of malicious or negligent behavior, presumably there was some significant human factors element behind these lapses.

I am referring, of course, to the various engineers, technical and supervisory staff that designed and approved MCAS for service.

Imagine, if you will, a parallel online forum in which the various aircraft-related engineering specialties debate the elements of these accidents from their perspective. One could imagine certain individuals saying, "Why didn't they just design the friggin' software/hardware correctly?!" Others might defend their tribe by saying the design was sufficient; it was just that the operators weren't sufficiently skilled or trained to handle a malfunction. Still others might concede that, while yes, errors were made, the individuals who had their hands on the design/approval process were working under various constraints and pressures, and that their errors were perfectly understandable from a human factors perspective. They would plead that, please, everyone take a breath and quit trying to blame the engineers when it is obvious they were doing the best they could under the circumstances.

What would we make of such a conversation?

What I am trying to point out is that while some of us like to say "Boeing" messed up or "the FAA" messed up, the reality is that these organizations are simply made up of human beings who respond to their training, experience, and environment. Being human, they are just as much subject to the fallibilities of the human mind as were the pilots. There is even one study that lists 188 types of cognitive error to which the human mind is subject. These errors may be different from the ones the pilots made, but they were ultimately human errors.

At some point, we will have two final accident reports detailing a list of primary and secondary causes of these accidents. Behind a fair number of these causes will be a human being who was not acting out of malicious intent or neglect. They were simply performing according to their training, experience, and environment. In the discussions on this and related threads, there is quite often the refrain, "Stop blaming the pilots!" I don't have any problem with that sentiment, since the act of "blaming" is largely an emotional response that tends to avoid getting to the root of the problem. That being said, identifying the root causes and proposing remedies isn't the same as blaming (unless someone chooses to interpret it that way).

So yes, how about we all stop blaming everybody who had a hand in these accidents, understand that behind every error there was likely a human factors element, and support the efforts to address and/or remediate those issues?
The article that I just posted got lost in a flurry of comments, but is worth reading. It carefully describes the chain of human decisions involved in the design of MCAS, without apportioning blame: https://www.theverge.com/2019/5/2/18...error-mcas-faa

So MCAS was designed to compensate. It would use an angle of attack (AoA) sensor to detect when the airplane entered a steep climb. It would activate the airplane’s pitch trim system, which is routinely used to help stabilize the airplane and make it easier to control, especially during climb and descent. And it would trim the airplane in modest increments for up to nine seconds at a time until it detected that the airplane had returned to a normal AoA and ended its steep climb. It seems simple enough — on paper, that is.
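To make that description concrete, here is a rough sketch of the logic in Python. The names, the AoA trigger threshold, and the activation conditions are my own illustration of what the article describes, not Boeing's actual implementation:

[CODE]
# Illustrative sketch only, based on the article's description of MCAS.
# All names, the AoA trigger value, and the activation conditions are
# assumptions for this example, not the real system.

TRIM_INCREMENT_DEG = 0.6   # per-activation authority in the design spec (per the article)
MAX_BURST_SECONDS = 9.0    # "modest increments for up to nine seconds at a time"
AOA_TRIGGER_DEG = 15.0     # assumed threshold for "a steep climb"; the real figure is not given

def command_nose_down_trim(degrees, max_seconds):
    """Stub standing in for the pitch-trim actuator interface."""
    print(f"trim nose down {degrees} deg, burst of up to {max_seconds} s")

def mcas_step(aoa_deg, flaps_up, autopilot_off):
    """One evaluation of the simplified logic: while the (single) AoA sensor
    reads above the trigger, command another nose-down trim burst; once the
    reading returns to normal, stop."""
    if flaps_up and autopilot_off and aoa_deg > AOA_TRIGGER_DEG:
        command_nose_down_trim(TRIM_INCREMENT_DEG, MAX_BURST_SECONDS)

# A single stuck-high AoA reading keeps this condition true indefinitely:
mcas_step(aoa_deg=22.0, flaps_up=True, autopilot_off=True)
[/CODE]

The point of the sketch is how short the path is from one bad AoA value to repeated nose-down trim commands.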
MCAS received a “hazardous failure” designation. This meant that, in the FAA’s judgment, any kind of MCAS malfunction would result in, at worst, “a large reduction in safety margins” or “serious or fatal injury to a relatively small number of the occupants.” Such systems, therefore, need at least two levels of redundancy, with a chance of failure less than 1 in 10 million.
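A back-of-the-envelope illustration of why that budget effectively forces redundancy (the single-sensor failure rate below is an assumed number, purely for illustration, not a published figure for any real AoA vane):

[CODE]
# Illustrative arithmetic only; the single-sensor rate is assumed.
BUDGET = 1e-7                          # "a chance of failure less than 1 in 10 million"
single_sensor = 1e-5                   # assumed failure probability per flight hour
dual_independent = single_sensor ** 2  # both of two independent sensors failing together

print(f"single sensor:   {single_sensor:.0e}  (misses the {BUDGET:.0e} budget)")
print(f"two independent: {dual_independent:.0e}  (inside the budget)")
[/CODE]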
Worse still: the FAA did not catch the fact that the version of MCAS actually installed on the 737 Max was much more powerful than the version described in the design specifications. On paper, MCAS was only supposed to move the horizontal stabilizer 0.6 degrees at a time. In reality, it could move the stabilizer as much as 2.5 degrees at a time, making it significantly more powerful when forcing the nose of the airplane down.
But why had nobody caught it in the first place? The answer might be infuriatingly simple: nobody read the paperwork.

Although the FAA is responsible for the safety of any airplane manufactured in the United States, it delegates much of the certification to the manufacturers themselves.
So had anyone checked, they might have flagged MCAS for one of several reasons, including its lack of redundancy, its unacceptably high risk of failure, or its significant increase in power to the point that it was no longer just a “hazardous failure” kind of system.
In a strange way, the 737 Max’s story is less about what did happen and more about what didn’t. Nobody did anything criminal. Nobody did anything malicious. Nobody did anything wrong, in a strictly technical sense.
It’s a perfect example of the cross purposes at which business, technology, and safety often find themselves. With its bottom line threatened, Boeing focused on speed instead of rigor, cost-control instead of innovation, and efficiency instead of transparency. The FAA got caught up in Boeing’s rush to get the Max into production, arguably failing to enforce its own safety regulations and missing a clear opportunity to prevent these two crashes.