The recent report into the overrun accident at Toronto Pearson Airport recommends more training “to better enable pilots to make landing decisions in deteriorating weather”. The recommendation stems from a well-considered analysis of the events and of the human interactions with the situation; unfortunately the report gives the industry no guidance as to what should be trained, nor how such training might be achieved or checked. Presumably many national authorities will endorse this recommendation, so we might anticipate additions to CRM, TEM, and HF training programmes. Pre-empting such change, we might consider: how can overrun events be avoided, what should we train to better enable pilots to make landing decisions in deteriorating weather, and how is this training to be achieved?
First, a minor correction. The report was written by the TSB, not TC.
As to what to train? The first thing that comes to mind for me is the decision whether or not to continue an approach when the missed approach path is not a safe option due to severe weather, as on the accident day at YYZ. Even when an approach is perfectly stabilized, a go-around must always be considered an equal possibility to a successful landing. You never know when someone will miss the hold line, followed by the tower ordering a go-around. That way, when a missed approach is required, it doesn't come as a surprise and one is better prepared to carry it out.
Stabilized approach criteria need to be considered all the way to the landing flare. Just because we're stable at 1,000 ft or 500 ft (or whatever our SOP says) does not guarantee that we'll be stable at 100 ft.
Touchdown point awareness is also an important element of effective landing decision making.
We all think about "contaminated criteria" when there's lots of snow and ice around. We need to start thinking about it when the runway is wet as well. When heavy rain falls in thundershowers, a runway can quickly become as slippery as if there were more than an inch of snow on it.
But again, the decision to continue the approach in the first place was, for me, the key link in the accident chain – breaking it could so easily have prevented the accident.
You make a very good point. I didn't mean for my list to be a definitive one. I am sure there are a few more ways that training of landing decisions can be improved. I hope others will post their ideas here. A critical element of effective simulator training is being willing to point out areas for improvement to the candidate. If you're a simulator instructor or check pilot, ask yourself this question. Would you rather debrief them after a less than perfect sim session, or after an accident on the line?
The decision to continue the approach and landing was not the cause of this accident. It was the loss of stabilised flight in the flare and the subsequent lack of correct retardation procedures, which led to the deep landing and overrun. Simulator training can help reinforce the need to touch down at the correct point and to use all retardation devices promptly; however, for the most part, runway conditions are conducive to 'soft' touchdowns. How often have you been briefed by the PF as to what retardation will be used and which turn-off point will be taken, only to witness a deeper-than-planned touchdown and the same braking/reverse used, resulting in turning off much further down the runway than briefed? Many inexperienced pilots seem to concentrate on gentle touchdowns rather than correct ones.

As for whether or not to continue the approach, this comes with good command training. I would suggest that even with an experienced co-pilot, this should have been a Captain's landing in those conditions. However, the previous meeting between these two pilots had ended unpleasantly (with a Captain-under-check failing the detail on a different type). This may have created an authority gradient in the co-pilot's mind and an unwillingness to object to the Captain's decisions.

Emphasising correct landing techniques in training is a relatively simple task, but it is often overlooked, especially when courses are shortened for cost reasons. Training out mind-sets, such as continuing approaches because preceding aircraft have already landed safely when prudence would suggest an immediate discontinuation, is quite a difficult thing to do. There is also the individual's 'macho' view that it is nothing he or she has not coped with successfully before and will cope with again.
Question posed in initial thread: "... what should be trained ...[?]"
Several studies suggest that pilots suffer diminished cognitive abilities under such stresses (TRW near the arrival corridor); then, late in the approach, the overloaded "Pilot Monitoring" isn't really monitoring (missing checklist items and then unable to effectively "monitor" the "approach").
www.flightsafety.org | Aviation Safety World | December 2006, pp. 28–33
Some interesting concepts here, subtly implicated in “unstable” approaches, eg:
-- Plan Continuation Bias,
-- late cognitive demands may overwhelm the human’s capabilities, and then inhibit his decision for go-around;
-- mixed messages from the airline (merely suggesting guidelines rather than imposing standards).
This study was done prior to the release of TSB’s analysis of the A340 landing rwy-excursion at YYZ.
The cognitive limitations described in this FSF paper (from Berman and Dismukes) suggest the earliest activation of Honeywell’s hosted RAAS [Runway Awareness Advisory System] and SAM [Stabilized Approach Monitor].
[A special thanks to FSF’s K. Ehrlich, Production Coordinator, Flight Safety Foundation, for sending the un-locked pdf file. That protection-free file made these excerpts easily available for you to read below.]
= = = \/ = = = EXCERPTS = = = \/ = = = =
“... two of the most common themes in the 19 accidents studied:
*** plan continuation bias — a deep-rooted tendency of individuals to continue their original plan of action even when changing circumstances require a new plan — and
*** snowballing workload — workload that builds on itself and increases at an accelerating rate....
“... the problems encountered by the crews seem to have centered on these two themes....
“Too often, pressing an approach ... is attributed to complacency or an intentional deviation from standards .... To understand why experienced pilots sometimes continue ill-advised approaches, we must examine the insidious nature of plan continuation bias. Plan continuation bias appears to underlie what pilots call “press-on-itis,” which a Flight Safety Foundation task force found to be involved in 42 percent of accidents and incidents they reviewed. Similarly, this bias was apparent in at least nine of the 19 accidents in our study. Our analysis suggests that this bias results from the interaction of three major components:
-- social/organizational influences,
-- the inherent characteristics and limitations of human cognition, and
-- incomplete or ambiguous information....
“... Our study suggests that ...
-- when standard operating procedures are phrased not as requirements ... that may appear to tacitly approve of bending the rules,
-- pilots may ... place too much importance on schedule and cost when making safety/ schedule/ cost tradeoffs.
“Also, pilots may not fully understand ... that the cognitive demands ... from an unstabilized approach severely impair their ability to assess ... the approach ...”
“...Although plan continuation bias is powerful, it can be countered once acknowledged. One countermeasure is to analyze situations more explicitly than is common among crews. This would include explicitly stating the nature of the threat, the observable indications of the threat and the initial plan for dealing with the threat. Crews then should explicitly ask, “What if our assumptions are wrong? How will we know? Will we know in time?” These questions are the basis for forming realistic backup plans and implementing them in time, but they must be asked before snowballing workload limits the pilots’ ability to think ahead.
“ Airlines should periodically review normal and non-normal procedures and checklists for design features that invite errors....
“... Operators should carefully examine whether they are unintentionally giving pilots mixed messages about competing goals such as stabilized approaches versus on-time performance and fuel costs. For example, if a company is serious about compliance with stabilized approach criteria, it should publish, train and check those criteria as hard-and-fast rules rather than as guidelines....”
= = = = /\ = = = END excerpts = = = /\ = = =
The last paragraph above is specifically cited as a needed task for the Board, the regulator, and the operator, regarding the variety of Approach and Landing mishaps (hard landings, tailstrikes, rwy excursions, etc.).
“... Simply labeling crew errors as ‘failure to follow procedures’ misses the essence of the problem.” [From above paper, pg 32, right column, mid.]
= = // = =
Excerpts, from an earlier study:
= = = \/ = = Excerpts = = = \/ = = =
the USA Board’s AAR-01/02, RUNWAY OVERRUN DURING LANDING AMERICAN AIRLINES FLIGHT 1420
... MD-82, N215AA LITTLE ROCK, ARKANSAS JUNE 1, 1999
[from AAR Analysis Section, pages 141-143.]
The Role of Situational Stress
The presence of weather as a potential threat to the safety of flight and efforts to expedite the landing were stresses to the flight crew. Research has demonstrated that decision-making can be degraded when individuals are under stress because they selectively focus on only a subset of cues in the environment. As a result, any situation assessment may be incomplete, and the resulting decision, even when made by an expert, may be degraded. Stress can also impede an individual’s ability to evaluate an alternative course of action, resulting in __a tendency to proceed with an original plan__ even though it may no longer be optimal. Research on decision-making has demonstrated a natural tendency for individuals to maintain their originally selected course of action until there is clear and overwhelming evidence that the course of action should be changed ... [ Rhoda, D.A. and Pawlak, M.L. 1999. __An Assessment of Thunderstorm Penetrations and Deviations by Commercial Aircraft in the Terminal Area__. Massachusetts Institute of Technology, Lincoln Laboratory, Project Report NASA/A-2.]
. . . A June 1999 report sponsored by NASA and conducted by research staff at the Massachusetts Institute of Technology’s Lincoln Laboratory . . . used weather radar and ATC radar data sources to document flight crew behavior during . . . terminal area . . . convective activity. This research documented that pilots routinely penetrated thunderstorms with NWS precipitation intensity levels of 3 (strong), 4 (very strong), and 5 (extreme) rather than deviated around them, especially when approaching an airport to land. ... pilots penetrated the thunderstorms ... 67 percent ... The study concluded that pilots were more likely to penetrate a thunderstorm when they were flying after dark, flying within 10 to 16 miles of the airport, following another aircraft, or running behind schedule by more than 15 minutes....
. . . Because the NASA study showed no discernible differences among operators and airplane types regarding the propensity to penetrate thunderstorms, Safety Board concludes that aircraft penetration of thunderstorms occurs industry-wide. . . .
= = = = end excerpts, from NTSB’s AAR-01/02 = = = = =
How can one expect long-haul pilots to be proficient in hand flying, or to have enough experience to base their decisions on, when most of them barely get one or two take-offs and landings per month? Also, one is perhaps at the end of a long and boring flight, maybe jet-lagged as well, and suddenly has to jump right into full alert mode. "It was the loss of stabilised flight in the flare and the subsequent lack of correct retardation procedures, which led to the deep landing and overrun" – too many empty, zero-experience-building flight hours in a far too automated aircraft. Long-haul ops takes you from pilot to manager in no time; that is the problem. To answer the initial question: pilots need to pilot every so often. This is not the pilots' "fault". It's just what can be expected when they never get to do any real flying...
skiesfull, crossunder – I had hoped that this thread would avoid focusing on a specific accident; the quest is for a generic solution. However, re handling: although improved handling skills will position a flight towards a safer (optimum) operating zone, they do not guarantee avoiding an overrun. If the runway conditions have deteriorated from those which were expected, there still may be insufficient runway to stop. The point of a pre-landing decision is not to ensure that the aircraft can land accurately and stop in the distance available; it is to ensure that it can stop safely within the distance. The additional margin allows, in part, for unforeseen changes in the situation and for some errors – note a similar absence of an adequate safety margin in many overrun accidents, e.g. Congonhas, Midway.
The danger in singling out handling skills is that ‘good’ landings may be completed in circumstances which have a higher level of risk than desired by the safety standards in our industry; this builds a false level of security (complacency), such that an encounter with the unexpected may suddenly result in no safety margin at all, i.e. handling skills do not prevent or mitigate a decision error. DM must be built on discipline, the maintenance of safety standards, and avoidance of complacency or inappropriate industry norms.
Decision making (DM) should include planning for the unexpected. In terms of landing distance, some of the 'unexpected' is included in the scheduled performance, but not, for example, a change from wet to contaminated conditions. Crew training should include knowledge of what is and is not included in the landing distance margins – what are the assumptions (Refs: AIC and Managing T&E slide 6). Also, DM in situations that include ill-determined runway conditions should consider the difference between wet and contaminated distances. It may not be necessary to plan for a contaminated landing in most circumstances, but a crew should at least arm themselves with the information, so that if they perceive the runway condition has changed they can decide to change the earlier plan, i.e. knowing that you should be able to stop safely on a wet runway vs knowing that you will not stop on a contaminated runway. A similar point can be made about braking levels: in what circumstances do you change from medium to max (know before you go)? SOPs are only 'standard' if their assumptions are met – what are the assumptions in SOPs (knowledge required – training)?
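To make the wet-vs-contaminated point concrete, here is a minimal sketch of an in-flight margin check. The function name and every factor in it are illustrative assumptions of mine, not performance data – real figures come from the AFM/QRH and vary by type and condition:

```python
def stop_margin_m(lda_m, unfactored_dry_m, condition):
    """Landing margin (metres) remaining after an assumed condition factor.

    The factors below are illustrative placeholders, not performance
    data: use the figures from your own AFM/QRH for any real assessment.
    """
    assumed_factors = {"dry": 1.0, "wet": 1.15, "contaminated": 2.0}
    required_m = unfactored_dry_m * assumed_factors[condition]
    return lda_m - required_m

# With a 2,800 m LDA and a 1,500 m unfactored dry distance, this sketch
# leaves a healthy wet margin but a negative contaminated one - i.e. you
# should stop on a wet runway but you will not stop on a contaminated one.
```

The point is not the numbers but the shape of the decision: knowing, before the event, which side of the line each runway state puts you on.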
The Captain's landing (or not) might be debatable, but whichever view is taken it involves a decision; thus we have to determine what goes into that decision and how it is taught. In the ref below (crew procedures) the proposals might provide guidance for a Captain if he has the authority (flexibility) in the operation; if not, this information could be in SOPs, which would simplify the decision, e.g. Cbs within 5 nm of the runway – Captain's landing (or an SFO landing with monitored-approach procedures). Provided the advice is justifiable to an operator, it matters little what the advice is, so long as it triggers consideration and subsequent DM; thus it is essential to provide 'trigger' advice to aid DM (SOPs or memory). However, we should be aware that guidance, SOPs, etc. cannot cover all eventualities or situations. For those rare occasions or unusual circumstances, operators require the flexibility enabled by human judgement, but with that comes a tendency for error – so DM guidance also has to provide error avoidance/mitigation; … what is that, and how is it taught?
The decision to continue the approach and landing was not the cause of this accident. It was the loss of stabilised flight in the flare and the subsequent lack of correct retardation procedures, which led to the deep landing and overrun.
Accident prevention needs to look at all areas of potential improvement in pilot performance and decision making if it's ever going to be successful. I was not attempting to identify a cause. I was simply trying to point out what I see as a significant weakness in crew decision making when it comes to conducting an approach when the missed approach path may involve flight through hazardous wind shear conditions. Let's face it, a different decision regarding continuing the approach would have created a much different outcome, although it also would have presented a new challenge to the crew in having to divert to another airport on pretty close to minimum fuel.
You make an interesting point about "captain's landing". Many operators (mine included) provide decision criteria as to when the captain is to carry out a takeoff or landing. We do not mention operations in the vicinity of severe weather, but it's certainly worth consideration.
IGh – plan continuation bias (PCB). Good points, but PCB is just one of many biases, or the result of several biases, which can affect a decision. Ref 1 (page 9…) describes a range of issues including bias: humans use reasoning shortcuts (patterns of thinking) which are generally reliable, but their use as quick estimators can result in error. Many of these shortcuts have 'built-in' bias, e.g.:-
Humans divide situations (and decisions) into ‘good enough’ or ‘not good enough’, but rarely optimal. This actually might be an advantage in aviation, but ‘good enough’ or ‘not good enough’ require definition within expected situations (context), i.e. when is the weather / runway condition / wind / airspeed, etc ‘good enough’; this requires knowledge which has to be acquired / taught.
Humans are more likely to pass up an opportunity to make a gain than risk a loss; risky landings are often seen as a 'gain' and a go-around as a loss. This requires a disciplined evaluation of the situation and knowledge of the risks associated with each option. Disciplined thinking must balance the interests of others against the safest option – the passengers' on-time arrival, or what peers might think about a GA, are irrelevant when you are stuck in the overrun area.
Humans are more likely to select the option (as perceived/visualized) resulting in the most desirable outcome, irrespective of the actual probability of that result occurring. Visualization is thinking 'what-if' in pictures, but the content depends on knowledge of the probability (risks) and adherence to the key safety objectives; these require knowledge and discipline, e.g. 70% of landing overrun accidents occurred on wet runways.
Once humans have made a decision, they are reluctant to change. Pilots need to understand the boundaries of a safe operation and be provided with strong physical and mental trigger points to force reconsideration of a plan e.g. stabilized approach criteria and boundary ‘gates’ – if unstable at 1000ft and correcting, check at 500ft. If unstable at 500ft, or not correcting at any time below 1000ft, then go-around.
Humans usually overestimate the connection between their actions and attaining the desired outcome – hindsight. This occurs when successful events are wrongly reconstructed in memory, or when recalling past failures – we think we did better than we actually did; we fool ourselves. Accurate memories are aided by debriefing and self-analysis (admitting that you were wrong – at least to yourself). Briefing helps memory recall and strengthens memories, particularly when visualizing the course of a flight or operational segment. No briefing should be standard – every flight is different; identify and brief the differences or unusual items, however small or insignificant, e.g. variability in weight, speed, wind, weather, runway condition, runway surface, distance available, braking technique, overrun area, etc.; these can be 'triggers' for decision making.
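The boundary 'gate' criteria described above (unstable at 1,000 ft but correcting – recheck at 500 ft; unstable at 500 ft, or not correcting at any time below 1,000 ft – go around) are simple enough to express as a few lines of logic. A sketch, purely to show how unambiguous a well-defined trigger point can be (the function name, gate heights and return strings are my own, not from any SOP):

```python
def gate_decision(altitude_ft, stable, correcting):
    """Stabilized-approach gate logic as described in the post above.

    Gates (illustrative): 1,000 ft and 500 ft. Unstable-but-correcting
    buys a recheck at the lower gate; anything else below the first
    gate is a mandatory go-around.
    """
    if stable:
        return "continue"
    if altitude_ft > 1000:
        return "continue"           # first gate not yet reached
    if altitude_ft > 500 and correcting:
        return "recheck at 500 ft"  # correcting: one more chance
    return "go-around"              # unstable at/below 500 ft, or not correcting
```

The value of a trigger this crisp is precisely that it leaves no room for the reluctance-to-change bias: the plan is reconsidered because the gate says so, not because the pilot happened to feel uneasy.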
The reference also reviews some defenses against bias to supplement those above:
Knowledge is required to appreciate the opportunities or range of outcomes during an evolving situation (good or bad). Pilots have to know how to use the knowledge, first by understanding the hazards, then associating them with related, but often future, situations, e.g. Cbs are a hazard; in flight we should consider lightning, hail, windshear, etc., but projecting this to a landing (perhaps mentally a ground operation) we should consider the rainfall rate and visibility – thence a contaminated runway, windshear/tailwind/crosswind – landing limitations.
Reframing the problem – looking at the situation from a different perspective – considering 'what if'. Don’t ask can we make this approach; instead ask “should” we be making this approach, i.e. what are the risks.
Being vigilant; monitoring and using checklists. This also involves personal checks including self reflection - how am I doing – flying and thinking? Or ‘how goes it’, a check of approach progress with the ideal, or how far from the edge of a safe boundary is the operation and in which direction is it heading – towards or away from the safe zone. Pilots can use deviations from the ‘ideal’ and ‘boundaries’ as trigger points for reconsideration or change of plan respectively.
Pilots have to be taught to be reflective and self-correcting; some of this is personality (hazardous attitudes), but a significant proportion comes from the manner in which we think. Basic education teaches thinking skills; flying training should teach the skills associated with thinking for flying – often termed airmanship. So how do we teach airmanship?
AC 91-79 provides background knowledge and a reminder of requirements to help avoid an overrun. The presentation of some of the issues appears muddled and at variance with world standards. Why does the FAA choose to introduce new terminology for runway braking conditions when there is an adequate ICAO standard, or is this a move towards standardization?
The recommendations are typical of many regulators in telling operators what to do and what their responsibilities are, but there is scant advice as to how these are to be achieved, e.g. “Adhere to SOPs and best practices”, but what are best practices and exactly how do pilots adhere to SOPs – how do they avoid error?
Technical knowledge and an understanding of the regulatory system are components of decision making, but a more critical element is how and when that knowledge is used. These details are absent.
I have located a more positive reference (link at ref 1). This provides a model of human decision making and concludes with a section on training for decision making, starting with decision-aiding systems:
“Humans are inaccurate at estimating probabilities, and are particularly poor at revising probabilities based on new information. Decision aids can be used to help improve this fault in decision making.” Perhaps we require some technology to help with the probabilities.
The training recommendations start with 'de-biasing' – the removal of bad habits or false beliefs; this has to overcome 'first learned, best remembered' items from basic training which might not apply to commercial operations.
The key training issues are:
Five hazardous attitudes, but there is no explanation of how to control them.
Self evaluation, self monitoring – ‘how goes it’.
Time management; staying in control of the situation, time, thoughts, and actions.
Memory recall; also related to how information is learnt and ‘memorized' – a reminder or trigger cues for recall.
Metacognitive skills; see “thinking about thinking” (link at #11).
Training in context; choose a realistic situation.
Situation awareness and mental simulation – visualization, but no details of how to improve situation assessment, except ...
… pattern recognition, development of mental models.
… pay attention to ambiguous, unexpected, or abnormal cues.
Risk assessment and resource management; mental resources and workload.
“For maximized effectiveness these are to be taught to pilots in naturalistic environments - time pressure, high workload, uncertainty, dynamic situations and other people involved”.
Funnily enough, I just read the AAIB report on the British Airtours TriStar overrun at LBA in June 1985.
There are some weird issues there as to what to train, what wasn't trained, which manual (FM) to use, the SOPs, etc.
The peculiar one for me was that British Airtours sent this aircraft into LBA for the first time ever and classified LBA as not a special-classification airport (despite a landing distance available well below 2,000 m) – and the flight-deck crew had never been there before this flight (nor had a TriStar).
I suggest the following as an effective measure to combat landing technique complacency:
Simulator training should be conducted on runway lengths just within the landing length performance limit, be it wet, dry, or slippery. A crosswind should be present. More often than not simulator runways at the destination are well in excess of the minimum length required.
It is important to demonstrate the dangers associated with excess speed and/or excess threshold crossing height. Briefings are fine, but seeing the result of a poorly managed approach and landing is better. A picture is worth a thousand words.
Have the pilot deliberately approach too fast, too high and with a tailwind. All things being equal the aircraft will over-run, and presumably an intelligent pilot will see the result of his "folly." Do this on a slippery runway to make the point that the over-run potential is greater. From then on, train pilots in the simulator to land consistently and safely at a runway-limiting weight. This does not mean autoland, either, but raw-data hand flying.
The manufacturer's speed additives need to be looked at as well. An example is where the additive called for is half the steady headwind component plus all of the gust.
For example, the 737 FCOM requires the steady-headwind additive to be bled off approaching touchdown. Few pilots accept that, and many hang on to all additives until the flare. On a wet runway this results in excess speed and, invariably, a long float. Boeing does not explain why half the steady headwind component is required as an additive, nor does it give accurate advice as to when the pilot should deliberately start to bleed it off.
"Approaching touchdown" is imprecise and open to liberal interpretation. The FCTM does, however, state that the flare manoeuvre washes off 3–4 knots. But if you have a 15-knot additive, then at what altitude or distance from the threshold should the speed bleed-off commence? It's anyone's guess, but presumably the pilot should start reducing speed at the point where surface friction begins to slow the wind. Free-stream air is considered to start at around 2,000 ft, so the half-steady-headwind component could begin to be bled back once below 2,000 ft. The support pilot needs to be briefed not to squawk Mayday when he sees the speed reduction occurring.
Overrun accidents are often due to excessive threshold speed with a subsequent float, and a common denominator is the manufacturer's recommended additives to Vref. You will not bleed off 15 knots in the 3–4-second flare manoeuvre – no way. The gust additive is usually held until the flare, and if the gust (or lull) fails to materialise, a long float is likely.
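The half-steady-headwind-plus-full-gust rule described above is easy to sketch in a few lines. This is an illustration of the arithmetic only – the function, its cap value and the example figures are my assumptions, not FCOM guidance, so check your own manuals:

```python
import math

def vref_additive_kt(wind_dir_deg, steady_kt, gust_to_kt,
                     runway_hdg_deg, cap_kt=20):
    """Illustrative Vref additive: half the steady HEADWIND component
    plus the full gust increment, with an assumed cap (the cap value
    is a placeholder, not manufacturer data)."""
    off_angle = math.radians(wind_dir_deg - runway_hdg_deg)
    steady_headwind = max(0.0, steady_kt * math.cos(off_angle))
    gust_increment = max(0.0, gust_to_kt - steady_kt)
    return min(cap_kt, 0.5 * steady_headwind + gust_increment)

# Runway 27, wind 270/20 gusting 35: half of 20 plus all of 15 is 25 kt,
# held to the assumed 20 kt cap. If the gust never materialises, all of
# that additive is still on the clock in the flare - hence the float.
```

Seeing the additive laid out like this makes the post's point plain: the gust portion is the larger term, and it is the one pilots are least willing to bleed off.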
Too many simulator sessions are flown on runway lengths far in excess of landing length limits, and in turn this breeds complacency. In the simulator we see pilots conducting the non-normal all-flaps-up landing (B737) on runways of at least 9,000 ft, and there is no sweat in pulling up even after a long float. When the same pilots are required to use shortened runways just within the length limits of an all-flaps-up landing, most over-run because of excess speed and float.
There is probably no harm in psycho-babble to explain why people land long and fast – but I see proactive training in the simulator as the answer to preventing over-runs on wet and slippery runways.
Are all pilots aware that kinetic energy = ½ m V²? If you double the landing speed you increase the energy by a factor of four, so even a few extra knots make a BIG difference to the amount of energy which has to be dissipated during the landing!
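A quick numerical check of the ½mV² point (the mass and speeds below are arbitrary illustrative values, not type data):

```python
KT_TO_MS = 0.51444  # knots to metres per second

def kinetic_energy_mj(mass_kg, speed_kt):
    """Kinetic energy in megajoules: E = 0.5 * m * v**2."""
    v = speed_kt * KT_TO_MS
    return 0.5 * mass_kg * v * v / 1e6

# Doubling the speed quadruples the energy; even a 10 kt excess over an
# illustrative 140 kt reference speed means roughly 15% more energy for
# the brakes, reversers and runway to absorb.
```

Because the speed term is squared, "a few extra knots" at the threshold is never a linear problem – which is exactly why excess threshold speed keeps appearing in overrun reports.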
Another issue related to this topic that I feel is "under trained" in most simulator scenarios is the Go Around itself. Particularly in newer automated aircraft, a go-around from a position other than DH/MDA (which is prepared for/briefed) can be a sporty event, particularly with a low missed approach altitude.