PPRuNe Forums - IV-6936 MD83 overrun at Mahshahr, Iran
3rd Sep 2020, 17:46
safetypee

parkfell,
"… there is something going on in the minds of these accident Captains which appeared to prevent them from …"

A key question; however, we should not set 'these' individuals apart from everyone else, including ourselves, who make similar decisions every day, generally with successful outcomes.

- "… what is it that inhibits them from following what they know to be the SOP for an obvious unstable approach ?
- What are the common threads?
- Why this overwhelming desire to land when all the indications to the readers of these accident reports would say things are really not at all good."


These points hinge on hindsight - our judgement after the fact:
- We cannot judge what the crew knew, or, given that knowledge, whether they were able to recall it at the time.
- Nor can we reliably identify commonalities, not least because of the human tendency to 'see' patterns where there are none. At best we might identify associations from a few accidents, but these are unlikely to provide sufficient proof to justify widespread action. An alternative is to evaluate all operations, considering the range of behaviours which result in success, but again only after the fact; and who decides what is good or not, and how?
- The desire to land, often described as plan continuation bias, and again only identified with hindsight, has generated a range of academic views. One issue is that soft science (HF) depends on judgement, whereas people generally require 'fact'; if not from hard science, then facts as individually 'created': our own view, subject to our biases, culture, education, and experiences.

Orasanu and Martin provide a useful, simplified view of decision 'error'. *

There are more recent academic views, but they offer little practical advice to help improve aviation safety.
As I recall, one view suggests that changing an inappropriate course of action requires an opposing incentive roughly nine times greater than the incentive to continue (a toy numerical sketch of this threshold follows after this paragraph). For example, the mental gain in choosing to land straight in (saving time and fuel, despite tailwind, unstable, fast, because these choices have been successful on many previous occasions) has to be opposed by a perception of risk nine times stronger in those situations which could end in an accident.
Thus the need for a safety focus on procedural deviation, situation assessment, and knowledge of risk in decision making (education, training), together with checks on organisational influences: rota changes, disruption, and time and fuel pressure.
The industry must clearly restate the risks during landing and the false perceptions of trade-offs and safety margins, via the landing briefing and the accuracy and application of landing performance data.
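
As a purely illustrative aside (my own toy sketch, not taken from the paper linked below, and with all numbers made up), the recalled nine-to-one asymmetry can be read as a simple threshold: the plan only changes when the perceived risk outweighs the perceived gain by that factor.

# Toy illustration only: the ~9:1 incentive asymmetry as a simple threshold.
# The ratio and all input values are hypothetical, not drawn from any study.

ASYMMETRY = 9.0  # opposing incentive must be roughly nine times stronger to change the plan

def will_abandon_plan(gain_from_continuing: float,
                      perceived_risk_of_continuing: float,
                      asymmetry: float = ASYMMETRY) -> bool:
    """Return True if the perceived risk is strong enough to overturn the current plan."""
    return perceived_risk_of_continuing > asymmetry * gain_from_continuing

# Straight-in landing: a tangible gain (time, fuel), with the risk felt to be modest
# because it has 'worked before'. The felt risk never reaches nine times the gain,
# so the crew continues.
print(will_abandon_plan(gain_from_continuing=1.0, perceived_risk_of_continuing=3.0))   # False

# Only when the risk is perceived as overwhelming does the plan change.
print(will_abandon_plan(gain_from_continuing=1.0, perceived_risk_of_continuing=10.0))  # True

The point of the sketch is only that small, realistic-feeling risk signals lose against tangible gains; it is not a model of crew cognition.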

Also, review the implementation of safety strategies; e.g. the current focus on LOC (loss of control) combines avoidance with practical recovery procedures, whereas landing accidents have no recovery procedure, only avoidance.
And reconsider risk: LOC has historically resulted in more fatalities, but projections of future risk often overlook the remedial actions already taken (AF 447, 737 Max, technical changes); thus projecting future fatalities from history would not be an appropriate risk assessment. Landing overruns have a similarly high historical risk (number of events), but without any practical mitigation the accidents could be repeated in similar situations; the risk is higher because the number of events and fatalities cannot be prejudged. Safety action relies on frail human judgement.

* http://www.pacdeff.com/pdfs/Errors%2...n%20Making.pdf