PPRuNe Forums - View Single Post - Avoiding an overrun: what should be trained?
Old 7th Jan 2008, 19:46
  #10
alf5071h
 
Join Date: Jul 2003
Location: An Island Province
Posts: 1,257
IGh – plan continuation bias (PCB).
Good points, but PCB is just one of many biases – or the result of several biases – that can affect a decision.
Ref 1 (page 9…) describes a range of issues including bias: humans use reasoning shortcuts (patterns of thinking) which are generally reliable, but their use as quick estimators can result in error. Many of these shortcuts have built-in bias, e.g.:
  • Humans divide situations (and decisions) into ‘good enough’ or ‘not good enough’, but rarely seek the optimal. This might actually be an advantage in aviation, but ‘good enough’ and ‘not good enough’ require definition within expected situations (context), i.e. when is the weather / runway condition / wind / airspeed, etc, ‘good enough’? This requires knowledge, which has to be acquired / taught.
  • Humans are more likely to pass up an opportunity to make a gain than to risk a loss; risky landings are often seen as a ‘gain’ and a go-around as a loss. This requires a disciplined evaluation of the situation and knowledge of the risks associated with each option. Disciplined thinking must balance the interests of others against the safest option – the passengers’ on-time arrival, or what peers might think about a GA, are irrelevant when you are stuck in the overrun area.
  • Humans are more likely to select the option (as perceived/visualized) resulting in the most desirable outcome, irrespective of the actual probability of that outcome occurring. Visualization is thinking ‘what-if’ in pictures, but the content depends on knowledge of the probability (risks) and adherence to the key safety objectives; these require knowledge and discipline, e.g. 70% of landing overrun accidents occurred on wet runways.
  • Once humans have made a decision, they are reluctant to change it. Pilots need to understand the boundaries of a safe operation and be provided with strong physical and mental trigger points to force reconsideration of a plan, e.g. stabilized approach criteria and boundary ‘gates’ – if unstable at 1000 ft but correcting, check again at 500 ft. If unstable at 500 ft, or not correcting at any time below 1000 ft, then go around.
  • Humans usually overestimate the connection between their actions and attaining the desired outcome – hindsight. This occurs when successful events are wrongly reconstructed in memory or when recalling past failures – we think we did better than was actually the case; we fool ourselves. Accurate memories are aided by debriefing and self-analysis (admitting that you were wrong – at least to yourself). Briefing helps memory recall and strengthens memories, particularly when visualizing the course of a flight or operational segment. No briefing should be standard; every flight is different. Identify and brief the differences or unusual items, however small or insignificant, e.g. variability in weight, speed, wind, weather, runway condition, runway surface, distance available, braking technique, overrun area, etc; these can be ‘triggers’ for decision making.
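The stabilized-approach gates described above amount to a simple decision rule. A minimal sketch follows – the 1000 ft / 500 ft gates come from the text, while the data structure and names are my own illustrative assumptions, not any operator’s actual SOP:

```python
# Illustrative sketch of the stabilized-approach 'gates' above.
# Only the 1000 ft / 500 ft thresholds come from the post; the
# field and function names are assumptions for illustration.

from dataclasses import dataclass

@dataclass
class ApproachState:
    altitude_ft: float   # height above the runway threshold
    stable: bool         # within stabilized-approach criteria
    correcting: bool     # deviations trending back towards criteria

def gate_decision(state: ApproachState) -> str:
    """Return 'continue', 'check at 500', or 'go around'."""
    if state.altitude_ft > 1000:
        return "continue"          # gates not yet reached
    if state.stable:
        return "continue"          # stable: no trigger fired
    # Below 1000 ft and unstable:
    if state.altitude_ft > 500 and state.correcting:
        return "check at 500"      # unstable at 1000 ft but correcting
    return "go around"             # unstable at 500 ft, or not correcting
```

The point of coding it this way is that the decision is forced by the gate, not left to a mid-approach judgment call – which is exactly how such triggers counter the reluctance-to-change bias.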
The reference also reviews some defenses against bias to supplement those above:
  • Knowledge is required to appreciate the opportunities or range of outcomes during an evolving situation (good or bad). Pilots have to know how to use that knowledge: first by understanding the hazards, then by associating them with related, but often future, situations. E.g. Cbs (cumulonimbus) are a hazard; in flight we should consider lightning, hail, windshear, etc, but projecting this to a landing (perhaps mentally a ground operation) we should consider the rainfall rate and visibility – thence a contaminated runway, windshear / tailwind / crosswind – landing limitations.
  • Reframing the problem – looking at the situation from a different perspective – considering ‘what if’. Don’t ask ‘can we make this approach’; instead ask ‘should we be making this approach’, i.e. what are the risks?
  • Being vigilant; monitoring and using checklists. This also involves personal checks, including self-reflection – how am I doing, flying and thinking? Or ‘how goes it’ – a check of approach progress against the ideal, or how far the operation is from the edge of a safe boundary and in which direction it is heading – towards or away from the safe zone. Pilots can use deviations from the ‘ideal’ and ‘boundaries’ as trigger points for reconsideration or change of plan respectively.
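The ‘how goes it’ check above can likewise be sketched as a trend rule: compare the current margin to a safe boundary with the previous one, to see whether the operation is moving towards or away from the safe zone. All names and the zero threshold here are assumptions for illustration only:

```python
# Illustrative sketch of the 'how goes it' boundary/trend check.
# Names and the zero-margin threshold are assumptions, not from
# any published procedure.

def how_goes_it(margin_now: float, margin_before: float) -> str:
    """Classify the trend relative to a safe boundary.

    margin_* : distance from the boundary (positive = inside the
    safe zone), e.g. landing distance available minus required.
    """
    if margin_now <= 0:
        return "boundary crossed: change plan"      # trigger: change of plan
    if margin_now < margin_before:
        return "heading towards boundary: reconsider"  # trigger: reconsider
    return "within safe zone and improving"
```

Separating the two triggers mirrors the text: deviation from the ideal prompts reconsideration, while crossing a boundary demands a change of plan.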

Pilots have to be taught to be reflective and self-correcting; some of this is personality (hazardous attitudes), but a significant proportion comes from the manner in which we think. Basic education teaches thinking skills; flying training should teach the skills associated with thinking for flying – often termed airmanship. So how do we teach airmanship?

1. Analyzing Explanations for Seemingly Irrational Choices.
2. Thinking about thinking.
3. The Limits of Expertise – Key Dismukes.