Boeing Resisted Pilots Calls for Steps On MAX
19th May 2019, 16:00
#71
PJ2
 
Join Date: Mar 2003
Location: BC
Age: 76
Posts: 2,484
Originally Posted by megan
PJ2, you absolutely sure? Three AOA inputs and it still managed to cause issues that ended with the Captain retiring because of PTSD.
https://www.atsb.gov.au/media/3532398/ao2008070.pdf
Originally Posted by Zeffy
The author of the thesis is not unknown in the world of engineering test pilots.
http://hoh-aero.com/home.htm
http://hoh-aero.com/FAA%20Rudder/AIA...er%20Paper.pdf
Megan, Zeffy, thanks for your responses.

A crossroads of a sort is emerging: "How much automation, how much pilot?". Are systems really "too complex"? Are pilots in over their heads when confronted with normal failures (vice outliers such as QF72 and the two recent B737M accidents)? Is training falling down on the job? Is the standards-and-checking process not rigorous enough, or afraid to confront weak candidates? Or both?

I think that the points raised (about complexity and pilot capacity) hint at taking autonomous passenger flight seriously. They raise questions about software engineering's capacity to address human factors/human behaviour issues.

It has been raised before on PPRuNe: could software have dealt with QF32 with equal success? With UA232? With Colgan 3407? With US Airways 1549? And so on. If not, where is the line between pilot and software? What are the saves vs. the failures? These questions need to be addressed by pilots and engineers together.

Clearly, this requires far greater work and thought than the space available here, but these are the questions raised when one thinks about "training vs. software" solutions to technical and human-factors failures.

To begin: no matter how strange or confusing any of the transports I flew became when the question "What's it doing now?" arose, every type was always flyable once everything was disconnected. I think that is really worth something - a huge testimony to the aeronautical engineers and system designers in terms of flyability and redundancy, and in terms of graceful failure.

Further, I believe that is the experience of 99% of the world's airline crews - not because I have the data, obviously, but because I view my experience as a kind of median: a down-the-middle career with a few interesting outliers and a couple of serious challenges which turned out okay but had the potential for alternative endings.

So, to your points if I may: neither Mr Hoh's considerable qualifications as an engineer and test pilot, which are certainly impressive, nor the QF72 accident example and its unhappy result, are counter-examples to what I am trying to convey.

From the WSJ op-ed Mr Hoh has written:
Many current airline pilots are simply not up to the standard necessary to operate current systems. With that in mind, the airline industry—not just Boeing—needs to lower expectations related to pilot competency in designing systems and dealing with failures.
Mr Hoh is calling for software engineers to fill in where crews fail. I think this is an attempt to address human factors problems vice aircraft technical failures, and that is a very difficult thing to do.

I include experience and a minimum knowledge base as qualifying conditions for an airline pilot to meet; if, given these, things still go wrong, we may look to human factors for cause and potential solutions. Clearly, automation (and FBW) software works with astonishing reliability and resilience; it is a demonstrable enhancement to flight safety - I know this having flown the types (Airbus, in my case) for a decade and a half. There were no encounters in which the airplane's software prevented appropriate pilot action and, to Mr Hoh's point, I also know there were "software saves".

In fact, I think we might find more in common regarding our views on system complexity, because I don't think he would disagree with the fundamental requirement for thorough, validated training and checking (auditing), or with the old principle of knowing one's airplane as thoroughly as available manufacturer information permits. This is all I am emphasizing.

I know very well that it is not possible to know one's airplane at the nuts-and-bolts level, nor is it possible for anyone, let alone pilots, to know that a piece of software - an algorithm - is 100% predictable and reliable under all conditions. As pilots, we operate far above that level, using software-engineered solutions as tools that are expected to work all the time. Pilots should never be expected to be the troubleshooters of bad software design.

Mr Hoh appears to be claiming that there is an increasing gap between the engineers who create aircraft systems and today's pilots' overall (worldwide) capacity to understand the engineers' complex designs/systems. I submit two initial observations regarding this view: 1) that most pilots do not comprehend complex aircraft systems at the engineering level is trivially obvious, and 2) "more software" cannot fix a problem which may be software-originated but which has become a human factors matter, not a software matter.

It is not possible to verify or even comprehend software in the way that we normally can see and comprehend mechanical systems. Software is pure design, without necessary and verifiable principles of operation*. Its outcomes are not 100% predictable, nor can its resilience or brittleness be tested with certainty. If there is a glaring example of this potential for capricious behaviour in what is normally an extremely stable system, it is precisely the QF72 PRIM failure cited above - and the airplane survived the fault.

* Littlewood, B. and Strigini, L., "Software reliability and dependability".
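Since QF72 is the example on the table, it may help to show, in toy form, the kind of boundary condition the ATSB report linked above describes: the flight control computers substituted a memorised value when an AOA input spiked, but only for 1.2 seconds, and two spikes spaced 1.2 seconds apart could defeat the filter. The little Python sketch below is my own loose illustration of that failure mode - the names, thresholds and trigger logic are invented for the example, and it is emphatically not the actual PRIM algorithm.

Code:
# Toy illustration of the AOA-spike filter weakness described in the
# ATSB QF72 report (AO-2008-070). NOT the real PRIM code: the names,
# threshold and trigger logic here are invented for illustration.

MEMORIZE_SECONDS = 1.2   # how long the memorised value is substituted
SPIKE_THRESHOLD = 5.0    # deviation (deg) treated as a data spike

def aoa_for_flight_control(samples):
    """Yield (time, AOA value) pairs as a simplified control law sees them.

    `samples` is an iterable of (time_s, aoa1, aoa2, aoa3) tuples.
    Rule modelled: the control law normally averages AOA1 and AOA2;
    if AOA1 deviates from the median of all three, the last good value
    is substituted - but only for a fixed 1.2 s window.
    """
    memorized = None
    memorize_until = -1.0
    for t, a1, a2, a3 in samples:
        if t < memorize_until:
            # Inside the memorisation window: the last good value stands in.
            yield t, (memorized + a2) / 2.0
            continue
        median = sorted((a1, a2, a3))[1]
        if abs(a1 - median) > SPIKE_THRESHOLD and t > memorize_until:
            # A fresh spike: hold the last good value for 1.2 s.
            memorized = median
            memorize_until = t + MEMORIZE_SECONDS
            yield t, (memorized + a2) / 2.0
        else:
            # Boundary flaw: a spike landing exactly as the window expires
            # (t == memorize_until) is trusted as valid air data.
            yield t, (a1 + a2) / 2.0

# Two 50-degree spikes on AOA1, exactly 1.2 s apart; true AOA is ~2 deg.
stream = [
    (0.0,  2.0, 2.0, 2.0),
    (0.5, 50.0, 2.0, 2.0),  # first spike: caught and filtered out
    (1.0,  2.0, 2.0, 2.0),  # still inside the memorisation window
    (1.7, 50.0, 2.0, 2.0),  # second spike, 1.2 s later: accepted!
    (2.0,  2.0, 2.0, 2.0),
]
for t, used in aoa_for_flight_control(stream):
    print(f"t={t:.1f}s  AOA fed to the control laws: {used:.1f} deg")

Run as written, the second spike slips through for one computation cycle (a false 26-degree AOA at t=1.7 s), and in the real event a brief burst of false high AOA was all the high-AOA protection needed to command a pitch-down. The point is not the specifics; it is that the filter behaved correctly in every scenario its designers imagined and capriciously in the one they didn't.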

I certainly would not be calling for a lowering of expectations of pilots by the manufacturers.

In billions of hours of flight since the 80's, aircraft system design has presented a genuinely insoluble problem for flight crews only a few times - far, far fewer than similar occurrences in other complex systems such as healthcare, automobile engineering and, of course, space flight. Even so, designers and software engineers cannot have a full and complete comprehension of the software systems they design; it has been understood for a very long time that it is not possible to know all variations in software system design. Need-to-know (NTK) training arose out of this, and aircraft manuals began refocussing or thinning out their system descriptions.

To use an old expression, I think that Mr Hoh's article is "throwing the baby out with the bathwater".

One can hardly argue against the success of automation and the high level of operational safety it has yielded since its introduction in the late 80's, but one is hard-pressed to argue that AI, robotics, or engineering solutions can encompass human creativity for doing the wrong thing for the "right" reasons.

PJ2

Last edited by PJ2; 19th May 2019 at 17:18.