PPRuNe Forums - View Single Post - Boeing 737 Max Recertification Testing - Finally.
Old 27th Mar 2023, 16:31
  #1020
alf5071h
"Systems are designed and constructed from components that are expected to fail. As the complexity of a system increases, the accuracy of any single agent's (person's) own model of that system decreases rapidly."

A quote from a report on coping with complexity during IT malfunctions. There are many similarities with the operator and design issues of the Max, except for the timescales and the number of people involved.

Other 'cherry-picked' quotes follow; read the full report for context.
  • Each anomaly arose from unanticipated, unappreciated interactions between system components.
  • There was no 'root' cause. Instead, the anomalies arose from multiple latent factors that combined to generate a vulnerability.
  • The vulnerabilities themselves were present for weeks or months before they played a part in the evolution of an anomaly.
  • The events involved both external software and hardware.
  • The vulnerabilities were activated by specific events, conditions, or situations.
  • The activators were minor events, near-nominal operating conditions, or only slightly off-normal situations.

Surprise
In all cases, the participants experienced surprise. … mainly discoveries of previously unappreciated dependencies that generated the anomaly or obstructed its resolution or both. The fact that experts can be surprised in this way is evidence of systemic complexity and also of operational variety.
A common experience was "I didn't know that it worked this way." People are surprised when they find out that their own mental model of The System doesn't match the behavior of the system.

More rarely a surprise produces astonishment, a sense that the world has changed or is unrecognizable in an important way. This is sometimes called fundamental surprise … four characteristics of fundamental surprise that make it different from situational surprise:


1. situational surprise is compatible with previous beliefs about ‘how things work’; fundamental surprise refutes basic beliefs;
2. it is possible to anticipate situational surprise; fundamental surprise cannot be anticipated;
3. situational surprise can be averted by tuning warning systems; fundamental surprise challenges models that produced success in the past;
4. learning from situational surprise closes quickly; learning from fundamental surprise requires model revision and changes that reverberate.

This adjustment of the understanding of what the system was and how it worked was important to both immediate anomaly management and how post-anomaly system repairs add to the ongoing processes of change.
Uncertainty and escalating consequences combine to turn the operational setting into a pressure cooker, and workshop participants agreed that such situations are stressful in ways that can promote significant risk taking.

Reread the surprise section from alternative viewpoints: the operators were surprised, as were the manufacturer, the regulator, and ourselves. Which types of surprise were these?
PPRuNe and surprise: a forum for ill-considered post-mortems.

Experts are typically much better at solving problems than at describing accurately how problems are solved. Eliciting expertise usually depends on tracing how experts solve problems. … experts demonstrated their ability to use their incomplete, fragmented models of the system as starting points for exploration and to quickly revise and expand their models during the anomaly response in order to understand the anomaly and develop and assess possible solutions.

… focused on hypothesis generation.
[ not seeking to follow SOPs, existent or not ] These efforts were sweeping looks across the environment, looking for cues. This behavior is consistent with recognition-primed decision making.

"organizations which design systems ... are constrained to produce designs which are copies of the communication structures of these organizations." (Conway's law)

The alerts draw attention but they are usually not, in themselves, diagnostic. Instead, alerts trigger a complex process of exploration and investigation that allows the responders to build a provisional understanding of the source(s) of the anomalous behavior that generated the alert.


It is unanticipated problems that tend to be the most vexing and difficult to manage. … unappreciated, subtle interactions between tenuously connected, distant parts of the system.

Don't overlook the end sections: how much dark debt is the industry carrying? An ever-increasing amount, due to automation and operational complexity, while human performance remains constant and limited.

"dark debt": the vulnerability was not recognized or recognizable until the anomaly revealed it. … found in complex systems, and the anomalies it generates are complex system failures.

Dark debt is not recognizable at the time of creation. … it is a product of complexity; adding complexity is unavoidable as systems change.

Ref: https://snafucatchers.github.io

Last edited by alf5071h; 27th Mar 2023 at 16:42.