PPRuNe Forums - Boeing pilot involved in Max testing is indicted in Texas
28th Mar 2023, 16:47
#244
alf5071h
 
The biggest assumption about communication is to assume it's taken place

Many discussions use acronyms, but not everyone has the same understanding of what they cover.
MCAS refers to a system (the 'S'), but the extent of that system may not be stated explicitly.

Normally a system is bounded by input, activity (computation), and output, together with monitoring / verification of input(s) and feedback.
Thus a failure of an input sensor could cause the 'system' to malfunction, for no apparent reason, if the design of the activities, output, and feedback is weak. A more robust system might identify internal errors and provide external alerting (MCAS FAIL), reducing the need for human reasoning.
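
To make that distinction concrete, here is a minimal sketch in Python of a monitored versus an unmonitored input path. The sensor names, the disagreement threshold, and the control law are hypothetical illustrations only, not the actual MCAS logic:

```python
# Minimal sketch of a 'bounded' system: monitored inputs, a computation,
# an output, and external alerting on internal error. All names, the
# threshold, and the control law are illustrative assumptions, not the
# actual MCAS design.

AOA_DISAGREE_LIMIT_DEG = 5.5  # hypothetical cross-check threshold


def mcas_command(aoa_left_deg: float, aoa_right_deg: float) -> tuple[float, str | None]:
    """Return (nose-down trim command, crew alert or None)."""
    # Input verification: cross-check the two AoA vanes before the
    # computation is allowed to drive the output.
    if abs(aoa_left_deg - aoa_right_deg) > AOA_DISAGREE_LIMIT_DEG:
        # Disagreement: inhibit the function and alert externally,
        # rather than acting on a possibly faulty input.
        return 0.0, "MCAS FAIL"
    aoa_deg = (aoa_left_deg + aoa_right_deg) / 2.0
    trim = 0.1 * max(0.0, aoa_deg - 10.0)  # placeholder control law
    return trim, None


# A single-input design, by contrast, passes one vane straight through,
# so a sensor failure drives the output for no apparent reason:
def mcas_command_single(aoa_deg: float) -> float:
    return 0.1 * max(0.0, aoa_deg - 10.0)  # no monitoring, no alert
```

The point of the sketch is only where the system boundary is drawn: the robust version treats input verification and external alerting as part of the system, so a bad sensor produces a fault annunciation rather than an uncommanded output.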

A core issue relating to this thread is the human involvement:

- The assumptions about the operator's ability to detect a system malfunction and act; these relate to the hazard classification in certification, which in turn drives the design.

- How the piloting task is defined when assessing the certification assumptions is critical.
Does the design simulator represent the final design?
Is the task constrained to the specific system (MCAS), or does it cover the wider context involving other aircraft systems and the alerts associated with an input failure; that is, a realistic operational context?
Who assesses the task, and are the findings widely communicated?

Endless prior discussion suggests weaknesses in all of these areas, and in other human checks and oversight.

Considering the Big System - specification (technical and commercial), design, engineering, testing, certification oversight, documentation, training - all involved weaknesses which individually could have been identified; but taken together, the system's complexity defied the processes which existed.

Neither an individual, nor a team, group, or organisation should be cited. This was a systemic failure, a failure of the aviation process.
Responsibilities can be attributed, but these more often reflect a legal viewpoint.

Dr. David Woods, discussing automation surprise, learning from incidents, and the complexity surrounding the Boeing 737 Max:

“...The industry has to stop rationalising away the lessons from accidents - the difficulties of learning after accidents have occurred are profound. It is easier to look at each event in isolation seeing only the specific details that define a narrow set of changes needed to facilitate a return to normal. But this narrow focus on one accident at a time makes it easy to discount the fundamental common vulnerability that needs to be addressed as an industry-wide priority. Neither the regulators nor the safety organisations have been able to recognise this critical vulnerability nor have they been able to energise a coherent, fundamental approach to overcome this vulnerability.”

i.e. we, the industry, are still at risk of an accident from design and certification processes; the continuing task is to minimise this risk.

"Attempts to prevent pilot error by additional regulation, both federal and corporate, have increased the complexity of the environment and increased pilot (design, engineering, test, certification) workload."