PPRuNe Forums > Flight Deck Forums > Tech Log

Automation Bogie raises its head yet again



Old 19th Jan 2011, 11:41
  #161 (permalink)  
 
Join Date: Jun 2002
Location: home and abroad
Posts: 582
@ PBL:

there is however, a chain of events leading up to an accident.

Removal of any of those links prevents the accident.

So reminding ourselves that airmanship is our last resort even when the rest of the design and circumstances conspire against us, is not a bad thing in my opinion.

Fact is, without the decision to change a good plan, none of the other factors would have been able to cause the accident.
They probably would have been part of another chain of events, leading to another accident or incident, but not this one.

So for us at the pointy end of whatever we fly, design of aircraft, equipment and airspace are a given that we cannot influence, only our behaviour and how we work with what we are handed on the day.

In this case, as in many other cases, the crew made poor choices, allowing the rest of the holes in the Swiss cheese to line up. And as human beings, we need awareness of why and how we are prone to making poor choices, because we will always have to deal with stuff designed by people without the full picture of what it means to work with it. We are always the last line of defence.
S76Heavy is offline  
Old 19th Jan 2011, 11:59
  #162 (permalink)  

Mach 3
 
Join Date: Aug 1998
Location: Stratosphere
Posts: 622
Pilots and Academics

It is surely necessary to understand PBL's and fdr's papers/commentary/reasoning if one is to advance the safety of the "universe" we, as commercial airline pilots, operate in.

Nothing other than a rigorous analysis of causal factors will result in a coherent strategy able to reduce incidents of a similar nature occurring in the future. It doesn't surprise me (rather, it depresses me) that PBL's team has found numerous mistakes in the logic contained within a significant proportion of aircraft accident reports.

However, in spite of my pretensions towards academia (a PhD in Aerodynamics), I'm not sure it is necessary for me to, for instance, retain an understanding of all 59 causal factors in the case of AA965 to help me do my job better in future when faced with similar circumstances.

But then my aim in reading the report is not quite the same.

My guess is most pilots will (excepting, perhaps, those who take an interest in contributing to discussions such as this) employ the same "economy of effort" they apply to executing complex tasks whilst airborne to their understanding of complex accident reports. To this end, they will endeavour to distil only one or two significant and salient ideas from the text to inform their modus operandi in future.

Whilst intending no disrespect, I see RAT's post along these lines. There is nothing rigorous about his commentary. The post serves as an admonition, to others and perhaps himself, that deviating from published procedures in the vicinity of high ground, in an unfamiliar location, at 400 mph carries a degree of risk.

I used to know a pilot who started every day in the flightdeck by asking the question:

"What don't we know about today that is going to kill us?"

SR71 is offline  
Old 19th Jan 2011, 14:56
  #163 (permalink)  
PBL
 
Join Date: Sep 2000
Location: Bielefeld, Germany
Posts: 955
Originally Posted by S76Heavy
there is however, a chain of events leading up to an accident.
Actually not. There is a network (not a chain) of states and events (not of events alone) "leading up to" an accident. Rephrasing, the chain of causal factors leads back from the accident (when one considers causal factors of causal factors and so on iteratively), and the structure so derived forms a directed graph.

Originally Posted by S76Heavy
Removal of any of those links prevents the accident.
Indeed so. Removal of any of the necessary causal factors of an accident avoids the accident. That follows directly from the definition of necessary causal factor.
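[Editorial illustration] PBL's two points here, that causal factors form a directed graph rather than a chain, and that removing any necessary factor avoids the accident, can be made concrete with a small sketch. All factor names below are invented for illustration and are not taken from any accident report:

```python
# Causal factors of an accident form a directed graph (a network), not a
# simple chain: one event can have several causal factors, and one factor
# can feed several events. Edges run from an event to its causal factors.
causes = {
    "accident": ["crew_accepts_straight_in", "fms_selects_wrong_navaid"],
    "crew_accepts_straight_in": ["time_pressure", "training_environment"],
    "fms_selects_wrong_navaid": ["duplicate_navaid_ids", "db_design"],
}

def factors_of(event, graph):
    """Iteratively collect causal factors of causal factors, and so on."""
    seen, stack = set(), list(graph.get(event, []))
    while stack:
        f = stack.pop()
        if f not in seen:
            seen.add(f)
            stack.extend(graph.get(f, []))
    return seen

print(sorted(factors_of("accident", causes)))

# Removing a necessary causal factor disconnects the accident from every
# factor that reached it only through that path:
pruned = {k: [f for f in v if f != "fms_selects_wrong_navaid"]
          for k, v in causes.items()}
print(sorted(factors_of("accident", pruned)))
```

The traversal is what an analyst does informally when asking "what caused the cause?"; the point of the graph structure is that the answer branches, so there is no single "chain" whose first link is *the* root cause.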

Originally Posted by S76Heavy
So reminding ourselves that airmanship is our last resort even when the rest of the design and circumstances conspire against us, is not a bad thing in my opinion.
That is an argument for the prioritisation of the crew's competent handling of the situation over those factors which remain constant throughout the playing out of the accident event - states of the runway, states of the navaids, state of the weather, *but not* design of the automation. The "but not" comes from the observation that it is the *behavior of* automated devices during the playing-out that is most immediately causally involved: the design is the cause of (among the causal factors of) the behavior, and the behavior is in turn (some of) the causal factors of the accident. That is true for pilots also, BTW. The behavior of the pilots in the Cali case was causal; their training, experience and expectations were (essentially) causal of their behavior (modulo questions of psychological causality and free will). That is why terpster's putting his finger on the US piloting environment (as he did in his quote from his 1996-7 work) is essentially as accurate a partition of the particular set of causal factors he is interested in as the behavior during the playing-out.

It does not by itself allow one to prioritise the behavior of the automation over the behavior of the crew as a set of causal factors.

Originally Posted by S76Heavy
So for us at the pointy end of whatever we fly, design of aircraft, equipment and airspace are a given that we cannot influence, only our behaviour and how we work with what we are handed on the day.
Of course a professional pilot focuses on what he/she, as a professional pilot, can do to avoid.... Just as an avionics system safety specialist will focus on what he/she, as a professional system safety specialist, can do to avoid.... and an ATCO will focus on what he/she, as a professional, can do to avoid...

But an accident analysis takes all factors into account. An objective prioritisation of some takes application of an explicit criterion (such as that above) and in my experience such criteria are always open to question by others. But making the criteria explicit allows the discussion to proceed as to whether the criteria are appropriate (serve such-and-such purposes, or not).

People here might be surprised at how often this crew-behavior versus kit-behavior argument gets played out, often behind closed doors and for very high stakes indeed. To my mind, the arguments are evolving from "the crew screwed up; end of story" to "the crew played their role in an environment set by the behavior of the avionics, ATC, weather, and so on. How did each of these classes of factors contribute to the outcome? How did, for example, the avionics provide a benign or a hostile environment in which the crew could execute (what they understood to be) their tasks?"

It wouldn't surprise me if in fifty years we didn't think along the mirror image. That is, the designers of the kit can in peace, quiet and with all the time in the world think how their kit can best operate with help from the pilots, and when the pilots attempt to achieve what they think is task A but which has different outcome B, or to monitor what they think is task A but which turns out to be task B, the design is faulted because the pilots are operating under heavy resource constraints with the usual human weaknesses, and the designers should have accounted for it. (That won't happen until the execution of complex functions is as well understood and standardised as the T-display of essential flight data, but this understanding and standardisation is very likely to come about within fifty years, I think.)

PBL
PBL is offline  
Old 19th Jan 2011, 16:58
  #164 (permalink)  

Do a Hover - it avoids G
 
Join Date: Oct 1999
Location: Chichester West Sussex UK
Age: 91
Posts: 2,206
Chaps

Let us call those who say "it was pilot error - pure and simple" group A

Let us call those who say "it was a complex accident that happened because of all sorts of factors" Group B

May I suggest the best thing we can do in the SHORT term to reduce accidents is to brief Group A's position to as many pilots as possible.

However, in the LONG term the only way pilots' performance will improve is through applying lessons learned from Group B.

I think A and B both have a lot to contribute to safety.

JF
John Farley is offline  
Old 19th Jan 2011, 17:57
  #165 (permalink)  
 
Join Date: Dec 2002
Location: UK
Posts: 2,451
Hear, hear, JF.

And as James Reason put it:-
“We need to find some workable middle ground that acknowledges both the psychological and the contextual influences on human performance, as well as the interactions between active failures and the latent conditions that serve, on rare occasions, to breach the system’s defences.”

Who moved my Swiss Cheese?
safetypee is offline  
Old 19th Jan 2011, 19:10
  #166 (permalink)  
 
Join Date: Jun 2002
Location: home and abroad
Posts: 582
Agree, JF.

@ PBL: I'm by no means saying that all we should look at is how pilots can prevent the poor choices which can lead to accidents. I am fully aware of what bad ergonomics, for example, can do to someone's performance, or of how other, more complex "chains" find their origin in the human trying to cope with a design; but as a pilot, all I can do is what is within my limited scope of control, as rightly identified by John Farley.

However, I'd be more than delighted to be able to pass my experience on to those designing the stuff that pilots will use and in fact, have already done so on a few occasions.

So to me it is not black and white, just long term and short term: learning how to survive in the short term so I can benefit from the long-term developments made by others.
S76Heavy is offline  
Old 19th Jan 2011, 22:43
  #167 (permalink)  
 
Join Date: Aug 2005
Location: fl
Posts: 2,525
The Cali and other incidents discussed here could have been prevented by pilot intervention: by not allowing the automation to do what the pilot didn't intend. When I declined the Cali straight-in approach after the accident it wasn't because it was unsafe; it just didn't seem right to show how easy it was. A lot of people died doing that approach because the pilots let automation take them out of the loop. Once you are out of the loop you are along for the ride. I think the new 250 hr wonders are out of the loop. I think if they can push the right buttons they will succeed. If not, watch the news.
bubbers44 is offline  
Old 19th Jan 2011, 23:01
  #168 (permalink)  
 
Join Date: Dec 2010
Location: New York & California
Posts: 414
I'm of the belief that almost anything can be taken to excess, and automation is just another example.
Jane-DoH is offline  
Old 20th Jan 2011, 11:04
  #169 (permalink)  

Mach 3
 
Join Date: Aug 1998
Location: Stratosphere
Posts: 622
Flight International Jan 18-24 has an article on the levelling off of aircraft accident statistics and cites work done/being done by FAA Human Factors expert, Dr Kathy Abbott.

She suggests inadequate crew knowledge of automated systems was a factor in 40% of accidents.

EXCLUSIVE - Qantas QF32 flight from the cockpit | Aerospace Insight | The Royal Aeronautical Society includes an interview with David Evans, QF32 Check Captain.

I'm not sure we're learning the lessons with respect to automation, as it took a crew of 5 over 2 hrs to get this jet back on the ground, having to deal with 43 ECAM messages, some of them contradictory.

In the interview he is asked whether he believes the result would have been the same had there been only 2 crew on board.

I think he is generous in his response, suggesting that it "may" have taken longer but the end result would have been the same....

How much longer?

(5/2)x2 hrs?

I'm curious as to whether or not there is an optimum amount of information that needs to be presented to a pilot to do his job and whether advanced flight decks are merely saturating pilots with information on the assumption that "more is better"?

Is there an argument for increasing minimum crew complements again because of this complexity?
SR71 is offline  
Old 20th Jan 2011, 11:19
  #170 (permalink)  
 
Join Date: Jun 2000
Location: last time I looked I was still here.
Posts: 4,507
Greetings. To my mind this was not an automation issue. Bubbers44 says, perhaps rather sarcastically, he didn't want to show how easy the NPA was. Indeed any NPA can be easy if planned, prepared and briefed properly. However, they are a disaster waiting to happen when attempted without proper preparation. The worldwide CFIT stats bear that out. Situational awareness goes out the window as you slide ever further behind the a/c. You are a passenger on a sled ride. Sometimes you are saved by busting out of cloud in an unexpected place; you breathe a huge sigh of relief, wipe the seat, spray some air freshener, then land. No-one except you and the FDM monitor is any the wiser. Another statistic avoided. On the other hand...

An a/c at altitude has an amount of energy; this has to be reduced drastically to land safely. Our God-given computer used to do this; now we trust VNAV/LNAV, but hopefully monitor it to make sure it is doing what we would have done ourselves. It is a great reducer of workload when used properly. In this case the a/c was too high on energy: it was too high, too fast and too short of time to reduce the energy sufficiently to make a safe approach and landing. That would have been true for a needles-and-dials a/c or an EFIS a/c. It's physics and aerodynamics. Situational awareness told them how far they were from the rwy, the altimeter told them their height and the ASI told them their speed. Added together, they had too much energy. Without proper planning they did not have enough time to set up safely.

Why they made a decision to defy energy I have no idea. Were there causal factors at play, e.g. behind schedule, tight on FTLs, commercial pressure, poor training culture, etc.? I do not know. Some guys can push the boundaries and so learn, by mistakes or by successes; some guys stay in the middle of the box and are considered stodgy, but safe. The hairs on the back of my neck are a good barometer. I've pushed boundaries in many things, but usually with an escape route. When I've got close to the edge unintentionally (= mistake), it's scary. Knowingly going into a cul-de-sac usually alerts the hairy sensors and I back off.

I just cannot see this having, as a root cause, a mis-managed-automation conclusion. In daylight, airfield in sight, perhaps it was worth a go. At night, PF on 1st visit, IMC, high on energy, NPA, it just seems a very odd human decision to even attempt such a manoeuvre. A couple of holes had lined up when they started to consider a straight-in. By saying "No thanks" the 3rd hole stayed closed and the cheese was back in the fridge, at least for this scenario. By saying OK the holes started to line up one by one.
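[Editorial illustration] The energy point above can be put in rough numbers. A point-mass sketch, with altitudes and speeds invented purely for illustration: specific mechanical energy is g·h + v²/2, and being both high and fast leaves a large surplus that must be dissipated in whatever track-miles remain.

```python
# Rough point-mass physics only: total mechanical energy per kilogram
# (potential + kinetic) that must be shed before touchdown. All numbers
# are made up for illustration, not taken from any accident data.

G = 9.81  # gravitational acceleration, m/s^2

def specific_energy(altitude_m, tas_ms):
    """Mechanical energy per unit mass, in J/kg: g*h + v^2/2."""
    return G * altitude_m + 0.5 * tas_ms ** 2

on_profile = specific_energy(3000, 130)     # roughly on a sensible profile
too_high_fast = specific_energy(4500, 160)  # high AND fast

excess = too_high_fast - on_profile
print(f"excess energy to dissipate: {excess:.0f} J/kg")
```

The only way to shed that surplus is drag over time and distance, which is why "too high, too fast and too short of time" is the same statement whether the cockpit has needles and dials or an EFIS.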
From the spectrum of posts I suspect this thread will rotate and continue without reaching a consensus. I'll read with interest, hopefully learn, but my tuppenny-worth is spent.
RAT 5 is offline  
Old 20th Jan 2011, 12:45
  #171 (permalink)  
 
Join Date: Nov 2007
Location: Northampton
Posts: 5
I think the new 250 hr wonders are out of the loop. I think if they can push the right buttons they will succeed.
Maybe the 250 hr wonder would realise that what he was being asked to do was beyond his experience level and would have declined the offer, as this crew should have done.
rogerg is offline  
Old 23rd Jan 2011, 12:46
  #172 (permalink)  
PBL
 
Join Date: Sep 2000
Location: Bielefeld, Germany
Posts: 955
Well, the same question that arises on a half-dozen other threads seems to have chosen different venues for its emergence. A bit like thunderstorm cells, isn't it? One matures and dies within a fraction of an hour but the squall line persists. This thread seems to have died over people's positions on the Cali accident.

Let's see if we can resurrect the discussion.

Terpster wants to prioritise the crew's violation of good arrival/approach practice, as laid down in the design assumptions for the arrival/approach (say, similar to TERPS). RAT and bubbers more or less join that view.

I argue for consideration of all the factors, and consider that view proposed also by fdr, SR71, and Dozy.

Even though we were arguing, these views are not necessarily in contradiction.

Alf pointed out that, besides the comprehensive analysis (which he termed "academic" - surely he knows better than that!), there is a need to provide guidance to the front-seaters. My view on that: yes.

I pointed out that there is a trope:
Originally Posted by PBL
(a) there was a failure of airmanship; (b) other stuff is secondary; (c) failure of airmanship is thus "root cause".
"Airmanship", or "adherence to TERPS-like approach procedures", or what-have-you. There was a failure of two organs sitting behind the eyes in the front seat; we have agreed. That failure was, as I have argued, facilitated in a way that those who have not experienced suchlike will not necessarily readily understand.

Let me offer an extreme example. Suppose it is well known that, at one specific crossing in the middle of town, the traffic lights show red for "go" and green for "stop". Everybody knows it. They know to take it into account. Nevertheless, would we be surprised if more accidents take place at that intersection than elsewhere in town? I would suggest: we would not be surprised.

Now, suppose we look at causes. We have a specific accident in mind. Mr. Jones went through a green light and collided with Mrs. Smith. One might argue: (i) Jones knows about the light and the convention; (ii) he willfully went through a green light without sufficient care and attention; (iii) he hit Smith's car; (iv) his insurance pays. One might also argue: (I) as (i) above; (II) the city council knows about the light, knows that it is a counterintuitive affordance (in Don Norman's use of the word), and knows the result of counterintuitive affordances, namely that people are induced to do the "wrong" thing even though they might theoretically "know better"; (III) the council's insurance pays.

There are two issues here. One is causality, the other is responsibility (phrased here as responsibility for compensation, as it often is in real life). I have tried previously to separate the two, with limited success. Let me point out here that in terms of causality we cannot ignore any of (i), (ii), (iii) or (II). Ideally, Anglo-Saxon common law (for example) suggests in general that responsibility follows causality, but in practice (that is, in the actual legal environments of countries following this paradigm) other factors mitigate.

Automation provides affordances in Norman's sense, lots and lots of them, for example the behavior of nav databases in unanalysed situations (two navaids within reception range with identical ID and FREQ). And so do variant designs of approach procedures (naming an approach after an end point, for example). Causal analysts cannot ignore those, as I have tried to argue.

There is a criterion that will prioritise crew behavior over other causal factors, as terpster, RAT and bubbers wish to do, namely that other factors, such as DB design, FMS design, approach design, are temporally prior to the playing-out of the accident event, and the behavior of the participants during that playing-out should be prioritised over the temporally-prior factors. The reason for this causal priority can be seen to lie in the phenomenon that the participants are expected to avail themselves of free choice, and choices are available to them whose consequences are not the accident scenario. Furthermore, those choices (which they unfortunately did not make) are worth making for more than the reason that they would have avoided *this specific* accident.

Examples of these choices are: (A) follow the full approach procedure diligently, including the arrival; (B) don't descend below MSA unless following the arrival/approach diligently; (C) in the sim, practice cleaning up the airplane upon GPWS warning.

Difficulties with the application of this criterion are that, although the design of the DB/FMS/airplane automation in general temporally precedes the playing-out of the accident event, that design is a design for certain specific future behavior, and that future (at design time) behavior is part of the behavior during the playing-out of the accident event. Forethought is explicitly required of the designers: possible future behavior must be analysed and hazards avoided or mitigated.

Similarly, crew behavior, with its MttB characteristics, is heavily mediated by the flying environment in which they have operated (which is the basis for terpster's 1996 critique of the US regulators, which he reprinted here).

So in fact it is not so easy to discriminate on the basis of the temporal situation of factors, because we look with different-colored glasses. For the crew, we look at behavior-at-the-time and fail to look so closely at what in the past has afforded their current behavior (rose-colored glasses). For the kit designers, we look at the decisions they made in advance, and not so much at the behavior-at-the-time of the kit (lilac-colored glasses). There are other participants, past and present: controllers, approach designers, navaid-placement designers, etc. - let me stop for the moment with rose- and lilac-colored.

I propose we should look with clear-colored glasses. This is where I think there is lots of material to continue the discussion. For I don't think this is all about a few variously-experienced people blabbering in their free time on an internet forum. I think it is a major research topic for those interested in reducing complex-system accidents in general and commercial-aviation accidents in particular. SR71's citation of Kathy Abbott's view is pertinent.

PBL
PBL is offline  
