PPRuNe Forums - View Single Post - CRJ down in Sweden
Old 14th Dec 2016, 15:48
#279
alf5071h
The report gives a comprehensive overview of the limitations of human performance, but then, by inference, cites the human as a cause by recommending more procedures and training.
More callouts and procedures are unlikely to be effective, particularly in situations where human mental resources are minimal, and because hearing degrades before the other senses.

Many (most) LoC accidents involve misunderstanding of a complex situation; inappropriate action based on that misunderstanding then produces the upset conditions, the aircraft were not 'upset' to begin with. Also, each crew member's understanding appeared to differ, such that it was impossible to communicate helpful information, at least until the event had progressed, often past the point of no recovery.

Because of these human limits we should not expect to see effective cross-monitoring or CRM in this type of accident. This is an important safety issue, particularly as the industry increasingly relies on these defences; in extreme situations such as this accident they may be unavailable.
We cannot expect humans to cover for system failures in remote or confusing situations. The reliability of modern technology is now so good that when it cannot function (an extremely remote event), neither can the human. Furthermore, because it is difficult to identify beforehand the circumstances and contributions associated with 'extremely remote' events, it is unlikely that we can provide targeted training; and why train for the last accident, which may have no relationship with the next one (excepting the need for improved awareness and surprise management)?

The industry should also consider other, less obvious factors which could influence pilot behaviour: e.g. how is upset recovery training taught, has the industry over-promoted quick reactions with a nose-down control input, and which scenarios are trained in the simulator?
These 'minor', apparently inconsequential contributions could be disregarded, but in extreme circumstances they may be significant, although not immediately apparent. The industry has to be aware of 'Meldrew moments' *: "I don't believe it!"

This is similar to citing human error as a cause, which should trigger the need to look deeper into the accident; thus any potential contribution dismissed with "I don't believe it ..." should be reconsidered, looking deeper for the underlying assumptions associated with a less obvious contribution.

* 'Meldrew moments' (MM): from a British TV comedy whose catchphrase is "I don't believe it!"