The "Threat" of Automatic Complacency


A37575
12th Jan 2008, 12:06
The latest issue of the United States flight safety magazine called (appropriately) "Flight Safety", Volume XXVIII, Number 1, January 2008, has a short paragraph advertising forthcoming articles. One of them is called "Automatic Complacency: Why it's so easy to get lazy and what you can do about it."

It got me thinking that despite documented research showing that over-reliance on full automation can lead to the habit known as "Automatic Complacency", both Boeing and Airbus push full use of the automatics as the safest way to operate their aircraft. So how does one define over-reliance? Certainly the simulator syllabi for Boeing and Airbus operators, whether for type rating or recurrent training, emphasise full use of the automatics.

Where does this leave a pilot who wishes to avoid "automatic complacency"?
Even those operators who permit occasional hand flying invariably require the flight directors and auto-throttles to be engaged - in other words, retaining partial automatics.

So is this subject just a case of "do as I say" (avoid automatic complacency), not "do as I do" (full automatics for as long as possible, take-off to touch-down)?

And given that full use of automatics is strongly recommended - for example, the introduction to Normal Procedures in the 737 states: "Normal procedures are used by a trained flight crew and assume all systems operate normally and the full use of all automated features LNAV, VNAV, autoland, autopilot and auto-throttle." - then what measures remain to combat "The Threat" of automatic complacency?

enicalyth
12th Jan 2008, 14:07
Which airline are you with? Can I see your SOPs?

Crossunder
12th Jan 2008, 16:52
The idea of "complacency" is just another desperate attempt by the human performance buffs to say something meaningful about "human factors".

It is an almost completely meaningless term when it comes to trying to understand how humans and machines interact. It is just another way of saying "human error". It makes a "neat" and "holistic" case for studying folk models of how accidents occur. "Complacency" is just as non-specific as the Yerkes-Dodson "U-curve".

Automation does not just solve problems. Like any other innovation/solution, it creates new problems as well. If you cannot trust automation - why use it? Using the word "complacency" in an accident investigation is just judgemental and over-simplistic. One often hears that "the crew's trust in automation was UNJUSTIFIED". So far, I have never seen anyone try to explain why the assumption of a satisfactory system state was regarded as JUSTIFIED by the crew. If their trust in automation were unjustified, and the crew knew that, they would not make the assumption and would consequently not become "complacent".

I'll write more later, after the staunch folk model worshippers have had their say! ;)

alf5071h
12th Jan 2008, 21:09
Another view of complacency, taken from my boy’s book of HF, is that it is the opposite of courage. Normally courage is opposed by fear. So perhaps the problem involves being fearful of automation, or just not understanding it?
“Without courage, wisdom bears no fruit.” Baltasar Gracian.

Automation covers a wide range of technologies and equipment.
Automatic systems have evolved from a simple workload relief for pilots (attitude hold), to complex systems that undertake tasks which are beyond human capability (Cat IIIB autoland, precision navigation).
Modern systems involve varying degrees of automatic activity and differing levels of integrity: do I trust the system to land me in all conditions (fail-safe within extreme probabilities), or does it require vigilant monitoring or cross-checking?

There is nothing wrong with relying on automatics for the tasks for which they were designed, provided the operation is conducted within the limitations of the system's certificated reliability (normally determined by procedures) and using the approved operating techniques - procedures.

Automation can be a threat if it is used in situations that it was not designed for – when we rely on it for a range of tasks beyond its envisaged capability – over-reliance.
Similarly, it is a threat if the manufacturer's operating procedures are not followed.
These are not necessarily the operator's SOPs. Additional problems can result from operator- (or individually-) derived SOPs that assume how and when automation can be used, i.e. operators (or individuals) decide independently on operating techniques and situations (the "I know better" syndrome – complacency?).

A good example is using the autopilot to fly the aircraft while the crew deals with a system failure. In principle, reducing the workload aids safety in the situation, but this assumes that the immediate operating rules have been followed:
1. Aviate: fly the aircraft; the crew has established stable flight and a trimmed condition.
2. Navigate: determine the future flight path, track, altitude, check speed (time / fuel functions).
3. Communicate: amongst the crew and externally as required; exchange thoughts (mental models), requests (more information) and plans (intentions and possibilities).
4. Manage the resultant situation, which includes engaging the autopilot and checking that 'it' is following rules 1 & 2 (your monitoring task), then checking rule 3, particularly because automation has difficulty in communicating – it cannot always tell you what it is doing or how, or, more importantly, what it is going to do. A human can detect and state that he is working hard or is near the limit of his performance – that he might fail; automation rarely has these capabilities.

In the situation above, automatic complacency occurs if you expect the autopilot to sort out the initial flight conditions and future path (rules 1 & 2) by engaging it immediately or in an ill-considered way. Perhaps your autopilot will normally manage, but unless it was designed for the specific task there may be situations where it will fail, often without warning (the limitations of rule 3).
Consider a similar situation in handing control to the other pilot: it is good practice (airmanship) to establish a stable flight path with the aircraft in trim; the intended flight path / navigation plan is briefed, and the handover is completed with a feedback check – 'I have'. Why then not the same for an autopilot? "Automation is like a dutiful first officer, except it never learns."
Automation may be a poor communicator, but it still deserves the respect afforded by good airmanship.

“ … then what measures remain to combat "The Threat" of automatic complacency?”

A. Fully understand what tasks the automation has been designed for and in what situations it can be used; do not exceed these tasks. (You don’t know better).

B. Maintain a good knowledge base of what the automation can achieve and how this is accomplished – basic operating procedures. If the system is complex and the depth of knowledge low, then only use a limited set of options or procedures.

C. Monitor (communicate with) automation in proportion to its level of integrity / reliability.

D. Respect automation, it too requires CRM, TEM, and Airmanship, but it rarely reciprocates.

“ It takes a lot of courage to release the familiar and seemingly secure, to embrace the new. But there is no real security in what is no longer meaningful. There is more security in the adventurous and exciting, for in a moment there is life, and in change there is power.” Alan Cohen.

Embrace automation as if flying with a new crewmember on every flight. When employed correctly it can be most rewarding, but if provoked or unmonitored, it is a threat.

AirRabbit
12th Jan 2008, 23:53
The idea of "complacency" is just another desperate attempt by the human performance buffs to say something meaningful about "human factors".

It is an almost completely meaningless term when it comes to trying to understand how humans and machines interact. It is just another way of saying "human error". It makes a "neat" and "holistic" case for studying folk models of how accidents occur. "Complacency" is just as non-specific as the Yerkes-Dodson "U-curve".

Automation does not just solve problems. Like any other innovation/solution, it creates new problems as well. If you cannot trust automation - why use it? Using the word "complacency" in an accident investigation is just judgemental and over-simplistic. One often hears that "the crew's trust in automation was UNJUSTIFIED". So far, I have never seen anyone try to explain why the assumption of a satisfactory system state was regarded as JUSTIFIED by the crew. If their trust in automation were unjustified, and the crew knew that, they would not make the assumption and would consequently not become "complacent".

I'll write more later, after the staunch folk model worshippers have had their say!
Well, just as the psychologists of the world still disagree about the accuracy, let alone the value, of the Yerkes-Dodson "curve," there are differing opinions among aviators (and those who comment on aviation) about when and where to use the term "complacency," and at least as many differing opinions about what is actually meant when that term is used. Many would argue that being or becoming complacent is not necessarily an error on anyone's part – and if these persons are operating under the accepted definition of the term, they would be correct. Complacent means self-satisfied … nothing more … nothing less. To be "satisfied" is not necessarily to make an error. One can be exposed to a set of deteriorating circumstances, take some actions to reverse that trend, and be satisfied that the actions taken are, or will become, sufficient to change those circumstances. The fact that the person is satisfied the actions taken are, or will be, sufficient is not necessarily an error – particularly if those actions are the ones that were taught as the correct ones. If the circumstances are not reversed, or are not mitigated to the extent the person expected, there are three possible conclusions that could be reached regarding that situation: 1) the person erred in assessing the circumstances; 2) the person erred in applying the actions taken; or 3) the actions that were taught were, in fact, not the correct actions – or some combination of these realities.

Certainly, aviation is a dynamic set of human and natural interactions, and, as such, any one circumstance cannot logically be taken in isolation, as in the situation I just described. The key, obviously, is to continually (as opposed to constantly) assess the situation, continually make decisions about the existing and evolving circumstances, and take action – or, more accurately, a series of actions – accordingly, to achieve and maintain satisfactory flight. But I believe that does not change the premise I described above: being satisfied the actions taken are, or will become, sufficient is not necessarily an error – particularly if those actions are the ones that were taught as the correct ones.

I agree completely with your premise that "Automation does not just solve problems. Like any other innovation/solution it creates new problems as well." I am hoping that the objection you registered comes from the fact that "stopping" the search for the reason for the occurrence under investigation at the point where the term "complacency" is used is at least as much an error as the use of the term in the first place. I say this because I don't believe it is a matter of "trusting" or "not trusting" the automation. The automation, unless it fails, when engaged properly at the correct time, is very likely going to do exactly what it was designed and installed to do. The problem is that this, alone, may be insufficient under the circumstances. In fact, it may contribute to a further unwanted circumstance – and will most likely do so in those cases where the precise functioning of the automation (including its sequencing) is either not known or not considered, whether by the organization that conducted the training or by the person currently activating the automation. And there are several examples available that highlight this process.

But there is a further problem with automation that has nothing to do with the timing of its use, its functional programming, or its limitations. It has to do with psycho-motor skills: the skills of thinking and doing, equally, with respect to a particular goal. Piloting an aircraft is very heavily dependent on the psycho-motor skills a pilot develops and practices. And the key word here is "practices." Even the most mundane and most basic of skills tend to deteriorate over time without sufficient, periodic practice – and unfortunately, I can provide an overabundance of examples, anecdotal as they may be, of just such deterioration because of what I describe as an "over-reliance" on automation in day-to-day operations. I am concerned that, with today's almost completely "automatic," computer-controlled aircraft, there is a question as to whether some pilots ever truly have the opportunity to develop, let alone practice, some of the skills that are necessary in a non-computer-controlled environment – which means in an airplane not designed to be computer-controlled, or in a computer-controlled airplane in which the computers have decided to "take a hike." No pilot who has ever hand-flown an ILS on a dark and stormy night more than once would ever denigrate proper and properly used automation. But, as much as automation was developed to assist the pilot, the fallacy has been the belief that having such automation available warrants a lessening of the time devoted to training. Those who believe this do so because it is easier, and quicker, to teach the use of automation. But what is often overlooked is the very real possibility of not having the automation at some point – for one of an ever-expanding list of reasons. It is at this point that fundamental "stick-n-rudder" training becomes a lamented loss. Arriving at this juncture is, in my book, what deserves the description COMPLACENCY.

Regarding automation, two things are necessary, in my (not so humble) opinion: 1) the cessation of "over-reliance" on automation in day-to-day operations – which I would expand to a "lessening of reliance" on the computer in day-to-day operations of computer-controlled airplanes; and 2) training the crew to be more "in the loop" when automation is being used.

I, too, will await comments.

airsupport
13th Jan 2008, 00:37
I do KNOW it is not exactly the same thing :rolleyes: but it reminds me of AN Airline I was at around 20 years ago: everything was running so smoothly with the new computerised check-ins etc., until one day all the computers went down nationwide. :(

The flight delays were massive, because hardly anyone remembered how to do it the old manual way - or had even been there when it was all done manually. :uhoh:

GMDS
13th Jan 2008, 00:57
1. Automation is great and safe if offered as a tool or aid that a pilot MAY use when deemed useful.

2. Automation leads into a cul-de-sac if IMPOSED on a pilot through the FCOM and SOPs.

Imposing something on someone who actually doesn't want it leads to a conflict. There is a reason why he doesn't want it, rightly or not, but it creates insecurity and defiance in his subconscious. He suspiciously monitors the automatics, unconsciously hoping they will prove to do worse than he would have done himself, yet anxiously watching them, not completely sure whether he is right or whether the system will prove him wrong by working better, or even in a way he has not anticipated.
Another effect is that with repeated situations like this, and with the automation working satisfactorily in most of them, most pilots' defiance gives way to a sort of resignation and boredom ("so what..."), which leads to distancing themselves from the system and not keeping up to date in training, which finally leads to what is discussed here: complacency.
I would, however, call it rampant incompetence. THAT is the real danger.

How can we counteract this? With even stricter SOPs? No, that proves to be counterproductive. With more training? Maybe, but training in ways you don't like has never really proved persistent.

Personally, I fully support the opening argument:
Automation is great and safe if offered as a tool or aid that a pilot MAY use when deemed useful.
Manufacturers and Chief Pilots would be well advised to go down this path. In my 30 years of aviation I have run through many systems, aircraft, manufacturers, SOPs, doctrines etc. etc. One thing has been true of pilots all along and has never changed: if you give them an option, it will be used, and used cleverly more often than not. If you impose something on them, it will mostly be followed, but more often than anticipated it will be defied and circumvented a little less cleverly.

Now what seems to be more intelligent and therefore safer?

GMDS

frontlefthamster
13th Jan 2008, 08:17
Even those operators who permit occasional hand flying invariably require the flight directors and auto-throttles to be engaged
:(

Not at our house (big UK carrier, mostly large Boeings).

I routinely hand fly, no FD, no AT, any phase of flight. :ok:

rogerg
13th Jan 2008, 08:22
I have never been in a company that does not allow, or even encourage, raw-data hand flying when appropriate. This includes BCAL, BA, plus many smaller operators.

A37575
13th Jan 2008, 10:15
No pilot who has ever hand-flown an ILS on a dark and stormy night more than once would ever denigrate proper and properly used automation. But, as much as automation was developed to assist the pilot, the fallacy has been the belief that having such automation available warrants a lessening of the time devoted to training. Those who believe this do so because it is easier, and quicker, to teach the use of automation. But what is often overlooked is the very real possibility of not having the automation at some point – for one of an ever-expanding list of reasons. It is at this point that fundamental "stick-n-rudder" training becomes a lamented loss. Arriving at this juncture is, in my book, what deserves the description COMPLACENCY.


I must say how much I have appreciated all of your erudite and well thought out replies. Believe me, there are operators out there - some major Asian operators in particular - that actively discourage en-route "practice" of hand flying the aircraft. I well remember being astounded on reading the applicable section of the company operations manual of a German LLC with 737s, back in the Seventies, which stated in large block type, akin to "shouting": "MANUAL FLIGHT IS ONLY TO BE UNDERTAKEN UNDER THE MOST EXCEPTIONAL CIRCUMSTANCES".

Whether or not this could lead to automatic complacency, or however you personally wish to define the term, I don't know. But those who originate SOPs would do well to read some of the replies above.

Empty Cruise
13th Jan 2008, 11:21
Some airlines - for reasons that they themselves find valid and justified - are taking it one step further.

We're now supposed to teach the cadets to use the FMC "option" over the MCP "option" every time. That is, if you're on a SID with various altitude restrictions and are instructed to climb to a higher FL, then, God forbid, you mustn't just dial in the new FL and hit LVLCHG! No no, what you have to do is look down at the CDU, check what level restrictions are in there, and then delete them with ALT INTV, one at a time.

And it goes on - if you don't use the fix page to draw a TOD-ring and a 10 NM-ring from your arrival runway threshold, it's considered "sub-optimal use of the FMC". We're already prohibited from doing timed approaches at some airports etc. etc.

This means that we go one step further into the automation jungle - we're now moving away from flying the aircraft on the automatics and towards managing the automatics - i.e. removing the pilot one step further from actual control of the aircraft.

Tell me I'm wrong, but I know the day will come when the FOs will look straight into the FMC when the aircraft does not do what they want it to - and they'll try to program themselves out of it. Gone are the days when a controller could just throw a last-minute holding at the pilots, since we'll now apparently need our one minute to program the hold :mad:, or instruct us to intercept a QDR, since that will first have to be put into the fix page.

We're talking loss of situational awareness - which will do nothing but compound the problem of over-reliance on automatics. Soon we won't even know when we're being led down the garden path :ugh:

sarah737
13th Jan 2008, 15:32
As a contractor for many years I have seen a lot of different ways of using the automatics. My conclusions:
-always using automatics is as wrong as always flying manually
-people who do a lot of hand-flying are still proficient in using the automatics
-people who always use automatics are losing their skills and are a danger without realising it

Use automatics when you have to or when you want to, but don't get to the point where you have to use automatics because you can no longer fly manually.

frontlefthamster
13th Jan 2008, 16:35
Too late, Sarah, too late by far...

(Remember there are some zero-to-hero 'cadets' out there who have never been, and will never be, able to hand-fly a transport aircraft competently - for the moment, they're probably all sitting next to people who can, but...) :(

CEJM
13th Jan 2008, 17:14
Frontlefthamster,

I don't agree with your post. While I have to agree that some 'zero to hero' cadets are not competent to hand-fly a transport aircraft, I have also seen a lot of these guys/gals trying to get in as much manual flying as they can. And everybody knows that practice makes perfect.

However, a trend which started a few years back is that a fair number of captains deny their F/Os the chance to hand-fly the aircraft, even when the conditions are right (not talking about inbound to LHR with weather at CAT 1 minima). They defend their decision by stating that the SOPs prefer an automated approach. However, the SOPs allow for manual flying (A/P, A/T, F/D off) when both pilots agree it is safe to do so.

I know that the skipper has the ultimate responsibility, but if you start denying your F/O the opportunity to hand-fly the aircraft, you can't come on this website (or the company's website) and claim that the low-hours F/Os have no manual flying skills.:ugh:

This raises the question: what is the reason these skippers don't let their F/Os hand-fly the aircraft? Is it because they themselves are reluctant to hand-fly, their own skills not being up to scratch?

Personally, I fly the aircraft manually whenever possible and encourage the other side to do so as well. In the end, you became a pilot to fly the aircraft, not just to sit and watch what the a/c is doing (of course, this depends on the situation).

CEJM

Cardinal
13th Jan 2008, 22:48
He suspiciously monitors the automatics, unconsciously hoping they will prove to do worse than he would have done himself, yet anxiously watching them, not completely sure whether he is right or whether the system will prove him wrong by working better, or even in a way he has not anticipated.
Another effect is that with repeated situations like this, and with the automation working satisfactorily in most of them, most pilots' defiance gives way to a sort of resignation and boredom ("so what..."), which leads to distancing themselves from the system and not keeping up to date in training, which finally leads to what is discussed here: complacency.

Very insightful. Relatively new to the world of massive automation, I've been trying to quantify and figure out the obtuse interactions I've seen some crewmembers have with the aircraft (the company grew up on first-generation Boeings and transitioned to the A320 just a few years ago; the heartburn continues). That makes perfect sense.

PPRuNe Towers
13th Jan 2008, 23:21
I sometimes wonder whether the autoflight sections of the MELs within certain company cultures reflect their automatics-only policy.

I've found over the years that despatch without the autothrottle seems far more disconcerting for many than despatch without the autopilot. If despatch is allowed without either or both, I consider currency and practice a basic professional requisite.

Rob