PDA

View Full Version : "To err is human": differing attitudes to mistakes in EK and Turkish accidents


Gibon2
4th May 2009, 14:26
SLF here. As one who regularly puts his safety in your hands, I come humbly to your forum with observations and questions from the two current threads on the Turkish Airlines accident in Amsterdam, and the Emirates tailstrike and near-accident in Melbourne. (Moderators: I'm posting this in R&N as it relates directly to two threads here, and concerns what I think is an "item that may be of interest to professional pilots".)

Although the respective investigations continue, the preliminary reports and the lively armchair analysis on PPRuNe suggest that both the EK and TK cases involved serious errors by the flight crew. In the Turkish case, the crew failed to notice abnormally decreasing airspeed, resulting in a fatal crash. In the Emirates case the crew apparently used an incorrect aircraft weight in their take-off calculations, very nearly resulting in a fatal crash.

In both cases the mistake was serious enough, in itself, to cause a crash. And in both cases, there was no layer of redundancy to catch the mistake.

But the responses from pilots on the respective threads here have been curiously different. The emerging consensus on the TK accident thread is that the crew made an inexplicable and inexcusable error, all on their own. No systems/management failure, no fatigue issue, nothing. Some put it more bluntly than others, but the prevailing view is perhaps best summed up in this thoughtful comment from bjornhall:

Of course we all know that humans are susceptible to errors, the best pilots can make the worst mistakes, the system is only made safe by trapping and mitigating such errors, not by relying on them never occurring, etc etc etc. I doubt anyone contributing here lacks that understanding.

But not monitoring airspeed on final is too basic to fall under the "to err is human" heading. The system shall be pilot-proof, not idiot-proof. Sweeping every display of pilot error under the same "human error" blanket risks making one blind to an industry-wide training problem, if such a problem exists.

In contrast, the response to the Emirates accident has been more along the lines of "there but for the grace of God go I", with lots of discussion of the possible role of fatigue, management problems, etc. One gets the impression that, in the view of pilots, this is a very different and less serious sort of "mistake" than in the TK case.

In general, as far as I can see, there are three basic types of "mistake":

1. Deliberately ignoring or contravening procedures, cutting corners, etc (e.g. as in the Garuda 737 accident, for which the captain was recently jailed).

2. Mistakes that "anyone" can make (where "anyone" presumably means "any properly qualified and reasonably competent and experienced airline pilot").

3. Mistakes that, although inadvertent, are so egregious or just plain stupid that no qualified pilot should make them ever, in an entire career.

Type 1 is not my concern here, although I know there is another whole debate over the application of criminal sanctions in safety-critical fields.

Type 2 mistakes are, or should be, covered by a system of checks and redundancies, both human and mechanical, so that when (not if) they happen, they are caught and rectified without major consequence.

Type 3 mistakes have no such coverage, since they are to be "trained out" of the human, and are presumably thought to be so unlikely as to require no further precaution or remedy.

It is fairly clear that the "PPRuNe consensus" (if such exists) views the Turkish accident as a type 3 mistake. The consensus is less clear on the EK accident, but many of the comments seem to view it as type 2, and there is a lot of sympathy for the crew.

Why the difference? Is there a difference? And if the EK case really is a type 2 mistake, shouldn't there be redundant systems to catch it?

RoyHudd
4th May 2009, 15:04
Both Type 3 mistakes. IMHO. TK has a very poor safety record (in the public domain), so bias may well creep in. Mind you, this is not the first T/O screw-up on EK's record; the JNB A340 incident springs to mind.

I think your thread is valid, and thought-provoking.

captplaystation
4th May 2009, 15:20
I think your summing up of the "consensus" is a fair representation of how most people see it. Personally, I can imagine a situation arising, particularly in a training environment, that could lead to one "approaching" the Turkish scenario. What I find harder to justify is taking my eye off the ball long enough for it to develop as catastrophically as it did. Regrettably, whichever way you dress it up, it really was one god-almighty loss of situational awareness. :confused:
As I understand it, the EK incident was caused fairly simply by getting a number wrong.
I agree with you entirely that there should be a system of checks and balances in place to stop it happening. If there is one, and it was circumvented/ignored, then this becomes a Type 3 accident, as the error becomes a bit less "forgivable"; but it seems the system allowed this error to slip through far too easily. :=
Another accident that is difficult to categorise is the ditching of the Tuninter ATR 72 off Sicily. In this case a simple system of crosschecks would have shown up the ridiculous discrepancy in the fuel uplift, but either the system didn't call for the check, or it was simply omitted/misunderstood... again teetering between Type 2 & 3.
I hate to say it here also, for fear of being flamed, but had it been a Lufthansa/Air France/BA rather than Turkish aircraft involved, perhaps slightly more understanding might have been shown.
Unfortunately, whether we like it or even care to admit it, we are all guilty in life, to a greater or lesser extent, of just a teensy-weensy bit of "casual racism" from time to time. Alas, that is truly imprinted in the nature of human beings, as any thorough research can demonstrate. It is a primeval instinct, which we don't consciously seek, and can at best only partially ignore.
Interesting question you pose though :ok:

Dysag
4th May 2009, 15:27
To anyone trying to be objective, it's saddening to hear Turkish (could be Egyptian, Brazilian.....) pilots unions, national politicians and local press instantly claiming "it wasn't our fault", like kids in the playground.

That attitude is so irritating that, rightly or wrongly, it surely helps turn opinion against the parties they are trying to defend.

captplaystation
4th May 2009, 15:35
Indeed it does, and "national" characteristics (which is undoubtedly how an inability to accept loss of face may be characterised in many cases) are part of the reason that we DO differentiate between accidents depending on the nationality of the parties involved. To do so may be "politically incorrect", but to fail to do so is to be blind to the fact that we are indeed all "different" in ways that are both good and bad. Unfortunately, these days it is seen as a crime to say so, but we don't all react the same, and we shouldn't pretend otherwise.

erichartmann
4th May 2009, 15:48
A two-person crew aircraft has one pilot flying (PF), or monitoring and responsible for what the autopilot (including autothrottle) is doing, and the other pilot monitoring (PM) the actions of the PF and the aircraft. An approach would typically have the PM calling out airspeed (and sink rate) at set points based on Vref, typically at 500' AGL and again at 100' AGL. It is, however, incumbent on the PM to call out any serious deviation from the planned or "targeted" approach speed, or a higher or lower than normal sink rate, at any time (at my company Vref + 5 is the minimum target airspeed allowed on the approach), along with deviations from glideslope (path) and localizer or course. Going below Vref with a decreasing trend would be very serious and would certainly generate a call. This is really very basic airmanship, and speaks to a lack of basic situational awareness on the part of the crew.
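For the non-pilots reading, a minimal sketch of the kind of deviation call being described - purely illustrative, with invented threshold values rather than any operator's actual SOP:

```python
# Illustrative only: a toy "pilot monitoring" speed check on final approach.
# Vref, the +5 kt minimum target and the deviation bands are hypothetical
# example numbers, not any airline's real criteria.

def speed_callout(ias_kt: float, vref_kt: float) -> str | None:
    """Return a callout if the indicated airspeed deviates from the approach target."""
    target_kt = vref_kt + 5                 # minimum target speed on the approach
    if ias_kt < vref_kt:                    # below Vref is always worth a call
        return f"SPEED - {vref_kt - ias_kt:.0f} kt below Vref"
    if ias_kt > target_kt + 15:             # hypothetical fast tolerance
        return f"SPEED - {ias_kt - target_kt:.0f} kt above target"
    return None                             # within tolerance, no call needed

# Example: Vref 140 kt, speed decaying through 126 kt on final
print(speed_callout(126, 140))              # -> "SPEED - 14 kt below Vref"
```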

The Takeoff Weight entry is more of a problem. Suffice it to say there are a number of cross checks made in an effort to trap a mistake, but if the payload weight is wrong everything else will be wrong.

ALK A343
4th May 2009, 15:52
According to your definition both are type 2 mistakes in my opinion, as both should have been caught by the system or the other crew-member. The EK crew were just lucky they got away with it, the TK guys were not. Having said this, we all make type 2 mistakes once in a while, there is nobody who has never made one. There are of course a lot of contributing factors leading to a type 2 mistake, but it should never result in a serious incident or accident. When this happens the system or the other crew-member has failed. Safe flying to all of you!

411A
4th May 2009, 16:36
According to your definition both are type 2 mistakes in my opinion, as both should have been caught by the system or the other crew-member. The EK crew were just lucky they got away with it, the TK guys were not. Having said this, we all make type 2 mistakes once in a while, there is nobody who has never made one. There are of course a lot of contributing factors leading to a type 2 mistake, but it should never result in a serious incident or accident. When this happens the system or the other crew-member has failed.

Agree.
There is normally quite a lot of finger pointing going on at PPRuNe...mostly toward non-European operators which adds little to the specific safety discussion.
Ain't likely to change, either.:rolleyes:

fotoguzzi
4th May 2009, 16:47
(Self-loathing freight here) TK's airspeed is something that pilots have always monitored, so a lapse there would be Type 3 while perhaps EK's data entry would at one time have been performed by an engineer in a three-crew cockpit (so, Type 2?).

I do not know if a navigation error should be considered Type 2 because that is something that the navigator used to do in a four-crew cockpit. . . .

Edit: transposed 3 and 2--now fixed. I guess that's why my chair doesn't have rudder pedals.

mensaboy
4th May 2009, 17:23
''Type 3 mistakes have no such coverage, since they are to be "trained out" of the human, and are presumably thought to be so unlikely as to require no further precaution or remedy.''

Can't speak for TK but the EK accident was definitely Type 3

1- This is the third instance of such an input error in the past 7 years at EK alone (there have probably been many more input errors that were never captured because they did not cause a problem).

2- Input errors into a computer are common, although less so for takeoff data.

3- There may have been contributing factors, such as cockpit interruptions, fatigue or MFF flying, which counter the ''trained out'' possibility. 'Cockpit interruptions' has already been pointed out by the LOSA evaluations, yet no further precautions or remedies were instituted and even now, very little has been done to address this issue. Fatigue is an ongoing problem at EK, one that is hard to quantify especially if there is no intention on the part of management to even acknowledge the possibility. MFF is debatable for sure but it is also a possibility. Add the three together, and who knows the result!

SPA83
4th May 2009, 18:00
In the TK accident, the pilots made mistakes, that's right. But we cannot ignore that the aircraft was not really fit to fly:
"The data show instances of left radio altimeter malfunctions on some of the nine previous flights. In the recorded cases, the autothrottle also entered the retard mode above the intended flare altitude, and the thrust levers moved to idle, because of a malfunction of the left radio altimeter on two of the nine flights. The data of these flights are being investigated."

captplaystation
4th May 2009, 19:11
Don't really think the A/T selecting idle thrust when you don't particularly want it means "the aircraft was not really fit to fly".
A bit of a pain in the ass perhaps, but ultimately the A/P & A/T are merely there to do your job for you. If they don't do it as you wish (and it is incumbent upon you to check that they do) you switch them off and DIY.
Really, truly, let's drop this cr@p.
Chances are the reason previous crews didn't write up the radalt prob in the tech-log wasn't to avoid making work for the ginger-beers, but more likely because they simply didn't notice.:rolleyes:
On a Cat 1 approach to a manual landing it is a total non-event, so let's quit trying to blame the A/T or RadAlt for the fact that the crew (for whatever reason, still to be established) were not flying, nor monitoring the performance of the A/P, the A/T and, more pertinently, the A/C. :=

END OF STORY. :ugh:

Scimitar
4th May 2009, 19:15
Some of the non-pilots who have been reading about this accident seem to be having difficulty in working out just why the pilots amongst us are so horrified by what appears to have happened.

It is the very basic need to "watch the shop", for at least one of the pilots to be keeping an eye on the instrument panel, that very quickly becomes second nature.

I can only describe what went on, in non-aviation terms, as having increased one's speed when driving to, say, 70 mph - and then closing one's eyes! And waiting for the bang.

If someone had told me that an accident like this would one day take place, well, I just wouldn't have believed them!

The Real Slim Shady
4th May 2009, 20:25
Automation, in modern glass cockpit aircraft, is dutifully dumb: tell the airplane to fly to Point X with a mountain between the airplane and Point X and it will, very accurately, impact with said mountain (Cali).

Sit back and watch the thrust levers come back to idle thrust and, with the autopilot engaged ( and no Alpha Floor protection), it will fly a textbook full stall manoeuvre.

We are in danger of losing sight of the big picture: the "Playstation" generation are now sitting in the right seat, and the left seat, of modern commercial aircraft: "what's it doing now" is becoming more common.

Aviate, navigate, communicate...stick to the basics.

A 737 / A320 is a glorified C150 and you can always go back to basics.

4Greens
4th May 2009, 22:51
To all concerned: get hold of a copy of James Reason's book "Human Error". It's all in there.

PJ2
4th May 2009, 23:57
Gibon2;

For an SLF contribution to this discussion, you get an 'A'. Nice work.

Categorising errors into types is useful to begin with, but very quickly the process becomes a human one and is therefore subject to enormous subtleties which defy categorisation. But I suspect you already know and appreciate that.

4Greens;

Along with Jim's book there are others, equally if not more important in terms of human error, which I highly recommend to anyone wishing to comprehend why this accident is so horrifying to professional aircrews and so misunderstood by non-pilots, including engineers. Hopefully this thread will run its course soon.

Nothing excuses what this crew did not do - fly the airplane. All the rest are unimportant details raised by those who won't/can't admit this fundamental fact.

Further, there is no understanding in this accident that is useful in the addressing of organizational, systemic or human factors accidents.

For an understanding of those kinds of accidents, (which aren't really accidents at all but happenings-with-precursors), the following references will be of interest:

Some titles are:

Normal Accidents: Living with High Risk Technologies,
Charles Perrow

The Field Guide to Understanding Human Factors
and
Just Culture,
- both by Sidney Dekker

Why-Because Analysis, (web-based)
Ladkin

Accident Precursor Analysis and Management: Reducing Technological Risk Through Diligence,
National Academy of Engineering

The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA,
Diane Vaughan

Organization at the Limit: Lessons From the Columbia Disaster,
Starbuck & Farjoun

Beyond Aviation Human Factors,
James Reason




The Real Slim Shady;
"what's it doing now" is becoming more common.
And, perhaps in increasing frequency given the accidents we have seen thus far this year, "I think I'll just let it do what it wants...".

The training issues alone are substantial but checking issues as well as flight data programs are crucial. Without those tools, an airline is flying blind and every incident or accident will be a shock and a surprise.

infrequentflyer789
5th May 2009, 00:43
In the TK accident, the pilots made mistakes, that's right. But we cannot ignore that the aircraft was not really fit to fly:

The aircraft wasn't fit to fly itself... which is why (I thought) we still have real pilots at the pointy end. The aircraft's automatics might have developed a slight suicidal tendency, but nothing reported so far appears to have suggested that it wasn't fit to fly had someone actually been flying it.

More concerning to me is that:
this aircraft is built with nice big real moving controls to make it (allegedly) a lot more obvious to the pilot what the automatics are doing, compared to a certain other type of aircraft;
and (as you noted) the aircraft warned its handlers, using its big moving controls, of this suicidal tendency on multiple previous flights;
and... then we have no published info. What happened to those warnings? No one noticed? No one wrote it up? No one acted on it?

If this aircraft actually was "not really fit to fly", then why did nine previous crews let it go back up? Whichever way you look at it, it looks like a huge chunk of human error turning what should have been a minor technical problem into a fatal crash.

BOAC
5th May 2009, 06:53
If there are any mods around, can I make a plea that we do not immerse ourselves yet again in a relentless repetition of "it was u/s", "why was it not snagged?" (soon we'll be asked about autolands here:ugh:). PLEASE can you transfer any such posts to the already 'expired' Schiphol crash thread?

PJ as usual says it for me -

"And, perhaps in increasing frequency given the accidents we have seen thus far this year, "I think I'll just let it do what it wants...".

The training issues alone are substantial but checking issues as well as flight data programs are crucial. Without those tools, an airline is flying blind and every incident or accident will be a shock and a surprise."

There are substantial lessons we all need to learn from recent accidents where 'automatics' have played a deadly part. Those who dismiss the Schiphol crash as 'irrelevant' preach a highly dangerous, blasé and over-confident doctrine which serves no-one's interests but their own. In all these accidents, supposedly competent and experienced crews have allowed the unthinkable to happen. These events appear to be becoming less isolated than we had assumed they were. This is the major issue we need to address without delay.

Whether we have a type 1, 2 or 3 really is only of passing interest. This is 'Human Factors' in the raw.

757_Driver
5th May 2009, 07:28
Good thread. I agree with BOAC - let's not pollute this interesting discussion with pointless comments about rad alts and autolands.

There are substantial lessons we all need to learn from recent accidents where 'automatics' have played a deadly part. Those who dismiss the Schiphol crash as 'irrelevant' preach a highly dangerous, blasé and over-confident doctrine which serves no-one's interests but their own. In all these accidents, supposedly competent and experienced crews have allowed the unthinkable to happen. These events appear to be becoming less isolated than we had assumed they were. This is the major issue we need to address without delay.

Well said - however, that's why I think the type 1, 2, 3 issue in the opening post IS relevant. I know the EK error got through the system, but in general, as pointed out, there are checks and balances to trap the type 1 and 2 errors. As demonstrated by the horror at the stunning level of 'incompetence' that caused the Turkish crash, the type 3 error has no error traps on an operational basis. It is assumed that these errors are prevented by the training and checking regime. Quite why that appears to be no longer the case I don't know. Although, to be fair, and at the risk of a politically incorrect flaming, it does still appear that decent operators from enlightened societies are still preventing and trapping these sorts of errors. And it's important that we understand why, so that we don't go down the same road.


The reason why we tend to point fingers more at places like Turkey (and, as pointed out, Egyptian and Brazilian operators are another couple that spring immediately to mind) is not because we are all superior western sky gods; it is because a society that instantly mobilises, and has government and press output such as we've seen from Turkey, almost by definition cannot be 'safe' from a human factors point of view.
After all, how can Turkish possibly learn, or train out, whatever behaviour caused this accident if they refuse to acknowledge that it exists?
Couple this attitude with the authoritarian/subservient issue in many cultures, whereby an FO dare not question the mighty captain, and you have a real safety issue.

Why does this appear to be happening more and more? Who knows; perhaps it's that the recent aviation expansion has thrust some operators and regulatory regimes into top divisions - where they don't really deserve to belong. It doesn't matter how new the aircraft are, how shiny the paint, or what club (Star Alliance in this case) you belong to. If your regulator and accident investigator think that their job is about face-saving and shifting blame, then you will never, ever get the human domain correct enough to trap those type 3 errors.

Gibon2
5th May 2009, 07:35
Thanks all for the interesting comments so far. If I could just second BOAC's request to avoid rehashing here arguments that have been thoroughly played out in the respective accident threads. My purpose in starting this thread was to look at the different attitudes to the actions of the respective crews in the two accidents, and to explore what this might show about underlying beliefs and assumptions about human factors and safety.

Anyway, so far we have:


Both Type 3 mistakes. IMHO.



According to your definition both are type 2 mistakes in my opinion


So much for the PPRuNe consensus! But still, it seems that many view the two errors as being of the same type. As PJ2 says, my types are at best approximations and at worst completely arbitrary. But this kind of rough categorisation can perhaps shine some light on the difficult cases at the margins - which are typically the ones that result in accidents. As captplaystation puts it:

teetering between Type 2 & 3

For example, type 2 errors that are insufficiently covered by checks and balances, or type 3 errors that are easier to make and more common than is generally recognised.

As SLF, I am happy to accept the tiny risks of catastrophic mechanical or structural failure, extraordinarily unlucky combinations of multiple factors (e.g. crew forgets flaps AND flap warning fails), freak weather, terrorist attacks, midair collisions, etc. What worries me more is that I may be a typo away from a fiery death, or that the crew may just "forget" to fly the plane.

Worse than this, if I have understood correctly, is that there's no agreement on what to do about the problem. PJ2 says of the TK crash:

Further, there is no understanding in this accident that is useful in the addressing of organizational, systemic or human factors accidents.

So what to do? Are you saying these kinds of risks cannot be reduced?

SPA83
5th May 2009, 07:36
You accept that an aircraft can be allowed to fly without being repaired, just to save time or money or for other crazy reasons, so you must also accept that pilots may sometimes not be in the loop, just along for the ride.

If you think pilots should always be professional, you must also consider that an aircraft must always be in good condition, with no defects.

757_Driver
5th May 2009, 08:05
Worse than this, if I have understood correctly, is that there's no agreement on what to do about the problem. PJ2 says of the TK crash:


Further, there is no understanding in this accident that is useful in the addressing of organizational, systemic or human factors accidents.

So what to do? Are you saying these kinds of risks cannot be reduced?

No. They can be, and are, reduced, but generally only where the operator, regulator and accident investigation bureau understand what their role is in a true safety system. As soon as "national pride", "face saving", "cultural influences"... call it what you will, comes into the picture, then this system fails, and the ultimate manifestation of that failure is often a human factors accident. Many of these accidents are the ones where, after the event, we all say "what the hell....."
We are in real danger in our society (under the current regime) of going down the same route - where face-saving, the good of the 'party' and political correctness become more important than telling the truth and having a sensible, safe regulatory regime. After all, just look at how the banking regulator bent to political will and refused to question anything - a situation that is hugely relevant to our industry and should be a real wake-up call about the importance of the political independence of national aviation regulators.
Fortunately it appears the CAA and AAIB are generally impervious to the insidious influence of Westminster; however, it is easy to see how these things can happen, and we can all see where it ends up.
To be honest I'm not sure what the answer is, but I see the errors made by the Turkish crew not just as a human error (granted, one that is almost impossible to understand), but as the symptom of an entire systemic failure.
The fact that this is not the first in this particular airline's and authority's case, and indeed is another statistic in quite a shocking record, leads one even further to that conclusion.

PanPanYourself
5th May 2009, 08:17
I was embarrassed when a year ago today Turkey banned access to youtube.
I was dismayed when they banned richarddawkins.net based on the ravings of an indicted criminal lunatic.
I was in disbelief when they banned several blogging sites owned by google, and free website hosting domains such as geocities.
I was in shock when they made themselves out to be Islamic fundamentalists at the recent NATO summit over a complaint about free speech in Denmark.

But in recent memory I haven't been truly ashamed of my country as much as I was in the days following the THY crash at Schiphol. The entire Turkish media made the pilots out to be heroes based on the most ridiculous "evidence" just hours after the accident. The government and THY lied shamelessly and the overwhelmingly ignorant Turkish public lapped it all up.

So yes, when I hear of an accident involving Turks my immediate reaction is to wonder which idiot caused it and how, whereas I give civilized Westerners the benefit of the doubt.

BOAC
5th May 2009, 08:32
"the type 3 error has no error traps on an operational basis" - while I would prefer not to 'categorise' here, I would pick you up there briefly (hopefully without diverting the thread too much from the original question!) - there IS indeed an error trap, namely the stable approach criteria. They were patently WRONG at 1000' and again at 500'. One would HOPE that a TC would be watching this like a hawk, if not actually monitoring airspeed per se, as this should be a major training point for all pilots. What we need to try and understand is why these pointers were missed. Indeed, does this airline have these criteria?
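In the same illustrative spirit as earlier in the thread, a toy sketch of a stable-approach gate of the kind described above - the gate heights, speed band and sink limit are invented examples, not any airline's actual criteria:

```python
# Illustrative only: a toy "stable approach gate" check.
# Gate heights and tolerances are invented examples, not real SOP values.

GATE_HEIGHTS_FT = (1000, 500)   # heights at which the approach must already be stable

def stable(ias_kt: float, vref_kt: float, sink_fpm: float, configured: bool) -> bool:
    """A deliberately simplified stability test applied at each gate height."""
    speed_ok = vref_kt <= ias_kt <= vref_kt + 20   # hypothetical speed band
    sink_ok = sink_fpm <= 1000                     # hypothetical sink-rate limit
    return configured and speed_ok and sink_ok

def gate_decision(ias_kt: float, vref_kt: float, sink_fpm: float, configured: bool) -> str:
    return "CONTINUE" if stable(ias_kt, vref_kt, sink_fpm, configured) else "GO AROUND"

# Speed already decaying well below Vref at the 1000 ft gate -> the trap should fire
print(gate_decision(ias_kt=120, vref_kt=140, sink_fpm=700, configured=True))  # GO AROUND
```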

icarus sun
5th May 2009, 08:39
I suggest that all pilots of highly automated aircraft should hand-fly at least one approach per month, with the AP/AT system disengaged. This should be a legal requirement. I know that on long-haul operations this may not be easy; then do it in the sim. For the MEL EK near miss, I suggest the companies publish on the flight plans the estimated power settings and speeds for the planned weight, to be cross-checked/updated against the actual weights from the computer.
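Even a crude gross-error bound would do the job being suggested here. A minimal sketch, with an invented tolerance and hypothetical weights purely for illustration (real cross-checks compare independently derived loadsheet, flight-plan and FMC figures):

```python
# Illustrative only: a toy gross-error trap for take-off data entry.
# The 5-tonne tolerance and the example weights are invented for illustration.

def weight_gross_error(entered_tow_t: float, flightplan_tow_t: float,
                       tolerance_t: float = 5.0) -> bool:
    """True if the weight keyed into the performance calculation is implausibly
    far from the independently produced flight-plan estimate."""
    return abs(entered_tow_t - flightplan_tow_t) > tolerance_t

# A single-digit slip of the kind discussed in the EK thread (hypothetical figures):
if weight_gross_error(entered_tow_t=226.9, flightplan_tow_t=326.9):
    print("TAKE-OFF WEIGHT MISMATCH - recheck the performance figures")
```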

4Greens
5th May 2009, 08:43
It may be relevant to suggest that FOQA or Flight Data Analysis becomes a standard tool in all airlines. It is such a good device for Fleet Monitoring and hence improved safety that it should be on your ticket if the airline has it. Those that do are the ones to fly with.

757_Driver
5th May 2009, 08:48
I see your point, BOAC. I was thinking of the wider issue - off the top of my head, all the 'type 3' traps (I prefer not to categorise either, but it makes this thread more readable if we do!) rely on a human factor, and if the entire system is broken, then what value those checks and criteria?

Anyway, before we get too much into the self-flagellation, let's not forget that for all its faults the aviation safety model is still amongst the best in the world. I don't know off the top of my head the number of fatalities annually in the world, but it's in the medium/high hundreds, if that.
I could have an accident tomorrow (3,000 deaths on the roads in the UK every year), or go to a hospital and get sliced open by a doctor who went to a med school decades ago, possibly anywhere in the world, who has minimal requirements to do any form of ongoing training, education or checking, has no restriction on his duty hours, and is largely unaccountable for his errors. Over 10,000 'avoidable' deaths occur in UK hospitals every year under this safety model.
So let's not throw the baby out with the bath water, and recognise that, largely, we have a great and durable safety regime. Yes, we can tinker and improve it, but let's not ruin it in the process.

FrequentSLF
5th May 2009, 09:20
I have a question that I have wanted to ask for a long time.

Training (or the lack of it) comes up as a contributory cause almost every time an accident happens.
I wonder who is ultimately responsible for the training, the pilot or the airline?
I mean, if a pilot feels that he or she is not sufficiently trained, can they ask for more training, or might their position be in jeopardy for admitting the lack of training?

Thanks

jetopa
5th May 2009, 10:21
I mean, if a pilot feels that he or she is not sufficiently trained, can they ask for more training, or might their position be in jeopardy for admitting the lack of training?

Well, in an ideal world he/she should do so. But: there might be somebody pointing at you and your 'apparent' lack of self consciousness - or whatever you like to call it. Been there, done that. Unfortunately.

Best thing is to publicly discuss such matters so that we can all profit from them. No need to say: 'look, those stupid idiots punched the wrong waypoint into their FMS and did not realize that it was in a different country with a mountain in their way...'.

Nobody (only the Type 1 guys) deliberately flies into disaster. The way the Schiphol accident was handled in Turkey was shameful. But the fact that an incident like the one in Melbourne was allowed to happen again is, in my opinion, no good PR for Emirates either.

It all boils down to acknowledging that we all are human and that even LH, BA, AF or THY 'heroes' can f... up pretty badly. No need to cover up.

The Real Slim Shady
5th May 2009, 10:37
In the final analysis we self regulate: that is why we have 2 pilots, not a man and a boy, but PF and PM.

When the PF screws up and PM doesn't correct the error the self regulation fails and problems occur.

Sound training programmes, adherence to SOPs and an open blame free reporting culture all contribute to the safety management system but ultimately the atmosphere in the office remains the weak link.

ZuluKilo66
5th May 2009, 11:09
Congratulations on one of the most thoughtful opening posts I've seen on PPRuNe. Clear analysis and questions provoking clear, thoughtful responses. I look forward to reading more.

Tailspin Turtle
5th May 2009, 12:48
According to The Right Stuff by Tom Wolfe (an excellent read, okay movie), the TK pilots were presented with a problem, didn't cope with it, and crashed and got killed - they had the wrong stuff. The EK pilots were presented with a problem, handled it, and got back on the ground with nobody hurt - they had the right stuff. According to Wolfe's understanding and description of the test pilot psyche (and that of the race car driver, mountain climber, etc.), he (or she) copes with the risk of fatal failure by "knowing" that they have the right stuff and will not crash, whereas someone who crashes and is killed did not.

Centaurus
5th May 2009, 13:50
So what to do? Are you saying these kinds of risks cannot be reduced?

Some operators have a long history of dangerous practices which can largely be blamed on an ethnic culture centred around an unshakeable belief that "my Deity will always protect me" - a numb acceptance that what will be, will be.

You can debate the subject ad nauseam, but what is certain is that cultural mores should have no place on the flight deck. Where they exist, sooner or later there will be dangerous and undisclosed close shaves where, to these pilots, their personal Deity does his job and saves the day - or he doesn't step in and people die. Forget James Reason and all the other books of that genre. Interesting stuff to the interested, but gobbledygook to a culture-driven pilot.

lomapaseo
5th May 2009, 14:33
You can debate the subject ad nauseam, but what is certain is that cultural mores should have no place on the flight deck. Where they exist, sooner or later there will be dangerous and undisclosed close shaves where, to these pilots, their personal Deity does his job and saves the day - or he doesn't step in and people die. Forget James Reason and all the other books of that genre. Interesting stuff to the interested, but gobbledygook to a culture-driven pilot.

One may live in a culture, but in the cockpit you live by the rules, not by a culture. That's the whole idea of training before you fly in a two-person (or more) cockpit.

This is a good thread :ok:

I'm hoping that we can get beyond the culture of a country and its tabloid news pandering to an ignorant public, and concentrate more on what goes on, or should go on, in the living environment of the cockpit.

I like to think of the error chain as:

Skill based

Knowledge based

Rule based


My narrow view at this time is that maybe we should evaluate the rule-based part of human error and ask why the rules were broken.

If you think that it was culture based then why was it not trained out?

Who here has the facts of what their training consisted of, compared with that of all the surviving pilots?

Blacksheep
5th May 2009, 14:37
I don't believe that cultural influences are as significant as some may make out. There are plenty of examples of aircraft being allowed to fly themselves into the ground while the crew busied themselves about a set of baffling indications.

What were the cultural aspects of the crew (of a 3-crew aircraft, too) who allowed their aircraft to fly itself into the Everglades while they messed about trying to fix a landing gear green light? Then there was the B757 that crashed into the sea because the crew were baffled by a pitot-static problem. Another crew, another place, and the captain flew the aircraft to a successful landing using the right power settings for the aircraft configuration and each flap setting.

A captain barrel-rolled his 747 into the sea while trying to follow a failed ADI indication at night.

I could go on - there are dozens of similar examples - but it is a fact that when distracted by indications that they have never encountered before, human beings become fixated on the anomaly and abandon attempts to monitor other events. It is a Human Factors or basic psychology issue based upon the many millennia of evolution of our species, and it cannot simply be written off as a training shortfall, a character flaw or a 'cultural thing'.

screwballburling
5th May 2009, 15:50
We will never eliminate human error. Never. In fact, man is fast becoming the "weak link" in the chain.

I am a great believer in keeping things simple. Things are now getting complicated as we get away from the basics. Complicate things and more people screw up.

Here is an example: at flying school we were taught to use full power for take-off. If you didn't get full power you aborted! Now we are asked to use reduced power for take-off!! Can you imagine taking off in, say, a trainer at 2200 rpm instead of 2500-2600 rpm? Of course not, yet we are doing it in airline operations, FFS!! Are the savings worth it?

There is a recipe for disaster right there, as it is one more weak link in the chain.

This is my 2 cents' worth from someone who started in this business 40 years ago this year. Maybe I have it all wrong and should give it up. Look at the Airbus accident in the South of France late last year. All so unnecessary, in my view.

biscuit74
5th May 2009, 15:57
I think Blacksheep's description is excellent.

The TK issue was apparently very much a Human Factors, classic-distraction case, described at length by James Reason and others. The cultural aspect may be that if others in the crew noticed something wrong, perhaps they felt inhibited about mentioning it in time. In which case the holes in the Swiss Cheese start to line up: training activity + distraction + cultural inhibition. Any one of those could happen anywhere. We'd like to believe they should not all line up at once.

I think that is why, for many pilots, this seems to hover between Gibon2's Type 2 & Type 3. For myself, I view it as a Type 3, because loss of situational awareness in what was not an apparently high-stress environment should not happen to a well-organised & competent pilot. This is part of what we teach people as basic airmanship. But I am a (now ex-) instructor, not an airline pilot.

The EK event sounds to have been an oversight and, presumably, a failure to follow procedures. Since I understand it is quite common for departure routines to be interrupted - not unusual; in most professions the 'ideal' and the actual worlds are quite different - an error like this is easier to visualise, and I suspect that is why rather more folk have the 'there but for the grace.....' feeling about that one, hence leaning towards Type 2.

Interesting discussion. I don't see any way to totally avoid these things, because of the way human beings operate.
Maintaining awareness of the pitfalls and snags in any activity does help. However, we all get complacent, in all activities. Every so often, at the very least, someone gets a nasty surprise. If we are lucky, we learn from that 'incident' and the general awareness threshold is back up higher for a while. That seems to be a consistent pattern in most pursuits, not just aviation. In theory, if we could graph the awareness curve, intervention at the right moment could help improve safety. The services used to do this with extra supervision at the various high-risk flying-hours points.

White Knight
5th May 2009, 17:03
Screwballfellah - a Cessna 152 is grossly underpowered even at full bore. A 345 (which I fly) is fantastically overpowered at full thrust (108,000 lbs). The two also fall under totally different performance and certification categories, so the comparison is ridiculous:ugh::ugh::{

I suggest that after 40 years you've still got some learning to do with regard to Performance A aeroplanes...

On a related matter - if you go and use TOGA on every take-off you'll soon find that your engine failure rates shoot up!!!

Herc708
5th May 2009, 17:05
Icarus Sun has a good idea, but I would go further and remove the FD as well (the FD is too easy) - pure raw data at least once a month.

Unfortunately, the hysteria being created about level busts etc. is having the side effect of reducing the overall skill level of the pilot population. In a complex SID or STAR, only the very brave fly raw data on a regular basis.

Ideally, each pilot should fly a raw data SID or STAR once a month, and allowance for this should be included in SOPs. Extra briefing etc., but the advantages outweigh the disadvantages. This avoids the embarrassment at the sim when re-takes are the norm because everyone has forgotten how to fly raw data.

bjornhall
5th May 2009, 17:32
My simplistic understanding of rule-based behavior goes something like this: you are faced with a situation, you analyze it (more or less subconsciously) to identify it with some category for which there is a rule, you recall or choose that rule, and you apply that rule. The process is not rational in the way it appears in writing, but it nevertheless follows a chain or a cycle of "identify - choose - apply".

IMHO, if that is the way one manages speed control on final approach, then there is something seriously wrong. My view is that it should be a skill based, not a rule based behavior. Admittedly an amateur pilot's view, but am I really that far off the mark?

Skill based behavior on the other hand is an automatic, ongoing process. Importantly, it is far harder to get distracted from a skill based activity than it is from a rule based activity. Distraction means the "identify" step never occurs, and then the "apply" action is never taken; furthermore, one does not realize that something important was missed, until the indications become overwhelming (e.g., stick shaker).

Transforming an ability from a rule based to a skill based behavior is the role of training; maintaining it in the skill based category to prevent a relapse into rule based or even knowledge based is the role of recurrent training.

I like to think there are few professional pilots who make Type 3 errors because they fit in the "wrong stuff" group; i.e., that they could not perform to an acceptable standard no matter how well they are trained. I think they make such errors because they are insufficiently trained, and that with sufficient training they could perform as required.

My tentative conclusion then is that the mere existence of Type 3 errors indicates training problems.

Type 2 errors, on the other hand, are precisely the type of problems where what can now be referred to as "classical human factors issues" and the traditional solutions to such problems can be applied (read: J. Reason, Swiss Cheese, System Failures and so on).

EDITED to add: On the other hand, "someone should always mind the shop" seems to have its correct place in the rule-based category. It would be interesting to understand whether that step failed. Did nobody realize that nobody was minding the shop? Or did the person who was doing so get distracted and fail to perform it properly?

PJ2
5th May 2009, 18:22
Gibon2;
Worse than this, if I have understood correctly, is that there's no agreement on what to do about the problem. PJ2 says of the TK crash:

Quote:
Further, there is no understanding in this accident that is useful in the addressing of organizational, systemic or human factors accidents.
So what to do? Are you saying these kinds of risks cannot be reduced?
757_Driver makes the statement,
Anyway, before we get too much into the self-flagellation, let's not forget that for all its faults the aviation safety model is still amongst the best in the world.
with good reason. We are focussing on an extremely tiny but equally extremely visible aspect of the industry - the success rate of this industry (its fatality rate) out-performs by magnitudes other industries which must accommodate and mitigate levels of risk, such as the health-care system, the automobile transportation system, even one's private home, where fatality rates are far higher - a fact which is almost completely ignored by users, victims and regulators alike (but not by insurance companies).

To your question about "So what to do?", there are indeed good answers.

First, the notion of human performance is new, as are the notions of the causal pathways to accidents (which is one of a number of reasons why most should not be grouped as "accidents" but as "preventable occurrences" - that language, however, would not suit the politically correct among the critics, especially those in the direct line of responsibility; it reminds me of the positions CEOs and now our bankers adopt - "Your Honor, the bed was on fire when I got in it" - but... I digress).

Preventative action takes quite different forms today than it did in the 50's and 60's. Today the industry has largely solved the technical design, weather, navigational, and mechanical causes of accidents. We needn't delve into how, just to keep this short.

Almost all fatal accidents are caused by, or have significant contributions from, "human factors". Machines and, to some extent, systems may be robust (although how "brittle"* they are is an important question), but humans are not; we know why - we forget, get distracted, get tired and have emotions, all of which intervene in technically perfect performances.

The defences do not emerge in terms of "a solution", but rather as a layering of defences. This may seem obvious, but it is these very layers which, when other priorities such as costs take over, are removed or defeated, usually employing justifications known as the "normalization of deviance". When a process (SOP) designed to prevent an untoward human response or behaviour is assessed, usually by those who don't know what they're talking about when it comes to safety processes (such as those in the more senior ranks of Flight Operations and at any executive level of management), it is "suggested" that such processes are "expensive" and, because "nothing" comes of them, can be set aside, changed or removed.

Almost always, the removal of a layer of defence is met with short-term success, reifying such flawed decision-making as "good cost control".

Somebody here asked about FOQA programs. This is a good example to illustrate this point. These programs are expensive and require significant resources yet are not a "profit center", (a very common corporate approach which does not differentiate "safety" from "marketing" priorities, benefits or costs).

Since such programs don't produce "anything" in terms of adding to the bottom line, and instead produce potentially "inconvenient" data, they do not enjoy wide acceptance. Most people simply do not wish to know things, either because such things are beyond their control or because they are "too expensive" to fix - and besides, "nothing happened when we stopped ___________" (fill in the program or SOP).

So a layer is removed.

You need to understand that in such a complex system (apropos my comment on getting down to the details), many layers can be "successfully" removed before trends in incidents and close calls begin showing up. It can take years in some cases where high competence, good hiring practices and reasonable training regimes exist. (These notions relate to the MPL, both positively and negatively, but I will keep this short.)

A manager who is intent on pleasing his supervisor must, in an SMS environment, be very careful about which safety issues he or she is going to draw attention to. In a self-regulating safety system where the first priority is profit and cost control (no corporation places safety first), what will be on any manager's mind will be "how will it look if I draw attention to an item and I am mistaken?". In a bureaucracy, one only has so many chances for advancement before one becomes "suspect". The key in a healthy safety culture is to promote those who "disagree with reasons" and to eschew those who "go along to get along". That rarely occurs.

It should be quite clear, that such corporate dynamics are directly related to the notion of layering safety processes.

However such corporate dynamics unfold, the crucial aspect of "connections" is missing, simply because no one is ever assigned to see patterns. A manager's job is to put out fires, not to see causes - "that's for higher-ups". But those higher-ups, (described above as senior and executive level management), are doing the same thing - putting out corporate fires, watching costs and share prices as well as the competition. I know for a fact that at least some CEOs do not understand and cannot see safety processes. If the CEO doesn't want it, it almost always doesn't happen.

In summary - what has this got to do with the Turkish accident? As I said, precious little, so egregiously negligent were this crew's actions. However, as BOAC correctly observes, the stable approach criteria, if they had been adhered to, would have rendered this another "experience" for the crew instead of killing them and some of their passengers.

What this has to do with the issues described herein, and with the question "what can we do?", is first of all "awareness" of what is already very successfully being done, which includes the layered-defence approach. Another important approach - one whose non-use or non-engagement is today inexcusable, perhaps even negligent - is the implementation of a robust data gathering program, so that the airline knows what its airplanes are doing. It must be entirely confidential, trusted by the pilots and non-punitive. This is asking to live in an ideal world, I know - some Asian and even some European lo-cost carriers use FOQA data to "seek and destroy" individual crews rather than take the wiser, smarter, less expensive learning route - but the more enlightened approach is the rule. Airlines such as BA, BM, Lufthansa, Air France, QANTAS, United, Continental, US Airways and South African come to mind right away as having such robust, long-established programs, which are significant layers of safety awareness and which have paid huge, though un-credited, dividends.

As you already know, Gibon2, none of this (not just in aviation but in human affairs) lends itself to polar-opposite measurement where the continuum is "good > bad". A far more subtle appreciation of what it is to "be safe" and to "do safety", while still making a profit for the shareholders, is required. Otherwise, outcomes such as we have seen will obtain. The tragic part is that very few can see connections, and therefore they only blame crews, ensuring that a repeat accident will occur.

These processes (or variations, depending upon the industry) are, or should be, at work in all endeavours which are accompanied by risk. Medicine and the health-care industry are just beginning to see that killing patients in the numbers they do (iatrogenesis, incompetence, negligence - not the unavoidable deaths) can be addressed, as can the number of deaths on highways (45,000/year in the US alone, or the equivalent of a fatal B747 accident every 3 days).

I hope this is helpful in coming to terms with your important questions. Clearly bandwidth and a long and (for some) boring post are not the answer, but reading the titles I suggested is. There are some very positive posts here by many contributors/professional aircrews, and it should be gratifying that a) they have taken the time and b) they understand all the above intuitively.

For the extreme outliers (a good book, by the way) such as the TK accident, perhaps there is no rational explanation which can help us prevent such a vast and inexplicable human failing. While it isn't acceptable (to me and many others) to say "to err is human", until organizations themselves begin to comprehend that the notion of "accident" must change, that will remain the only excuse.

* The notions of "brittleness" and "resilience" are useful in understanding how a system or set of SOPs designed to enhance safety, may respond. A brittle system is one in which small changes increase the potential for wide failure, while resilient systems are error-tolerant.

mercurydancer
5th May 2009, 20:05
This is a most interesting thread.

I work in risk management for the NHS (mainly - sometimes for the H&S Executive) and frankly aviation safety is about 10 years ahead of medical safety, which is why I find such fascination in threads like this. I will gladly put my anorak on and explain why...

This is not meant as a criticism in any way, I just want to know what is going on in the next level of thought...

Someone commented that possibly culture affected the crew, in that the person who can get away with something is not seen as wrong but as clever. Almost everything established has some form of "creep", be it threads, bombing patterns or financial targets. Without ranting on about management, could I ask to what extent safety is obstructed in companies, and how does that sit with the crew? I know of several examples where I have been involved in safety issues with medical staff and the experience of taking the matter to the boardroom was made difficult and harrowing, and much of it didn't actually question my findings. Any interruption to service received a particularly hostile response.

Rainboe did describe flying an aircraft with automatic systems, and there were some comments about Cat 3 landings with autoland, and, as is his way, he described it with clarity. I can understand that if the machine does something different from what is intended, then the pilot should take over; but if the precision of the machine exceeds your own in certain circumstances and it is not performing in a questionable way, is the machine to be trusted? In many ways I think this is the crux of the matter - autoland, alpha floor and the other engineering (soft- and hardware) are just more tools to aid the professional, and should not be considered to be anything more, but when functioning well they are reliable and can be trusted to aid the pilot to a safe landing in conditions where hand flying is not conceivable.

Could I also ask a question about the comments that the aircraft is trying to kill you? I understand the general meaning - that flight involves physical forces and conditions which can easily exceed what humans are built to withstand - but is the comment an expression of the attitude of the flight crew? It seems to imply that there is some conflict between the aircraft and the pilots. Does it indicate a mistrust of the aircraft and its systems? If so, to what extent?

GlueBall
5th May 2009, 20:16
I hate to say it here also, for fear of being flamed, but had it been a Lufthansa/Air France/BA rather than Turkish aircraft involved, perhaps slightly more understanding might have been shown.

Not so when LH had the dubious distinction of seriously crashing the first B747, killing a bunch of German tourists at NBO in 1974 due to crew error: an unverified leading edge flap setting. The European press had a field day, and LH's shiny image was dragged through the mud.

The Real Slim Shady
5th May 2009, 20:23
And Captain Van Zanten (KLM) is talked about to this day!

Gegenbeispiel
5th May 2009, 21:12
Just one thing to add to this excellent thread: it seems to me that partial automation, as in the TK case, is more hazardous than the maximal (in that case, FMS-VNAV-LNAV to GS capture then autoland) or minimal (AT off, hand-fly or MCP control). In max. auto, the crew knows that the job is to watch the system very closely. In min. auto, the crew uses its classical training and is fully engaged with the aircraft. Anywhere between, things can get confused more easily. And TK at AMS was in such a mode.

herkman
5th May 2009, 22:02
My exposure to flight safety with the RAAF, reveals to me that one factor is often not the single cause of and incident or worse.

The Hercules fleet must have by now at least 800,000 hours of operation, with no lose of life and airframes over a period of 50 years.

This is because of the through training, and constant checks and non flying sim periods, which well exceed normal civilian ones, but it also the attitude that is continually re enforced in the operating climate that all crews must display constant diligence. Yet in spite of the constant operations, which are more demanding than civilian operations, the flying is a safe as it can be made. The RAAF has a safety record which is second to none and is the eveny of many Air Forces of the world. Many reasons including the constant training and requalification of crews and the fact that all know that we cannot afford the loss of an airframe and life.

My personal and observed experience show that seldom, even in a motor car, is there one factor which causes an accident. It is almost like when the events start, and little factors creep in, that unless someone breaks the cycle, then the cycle continues on its way with sometimes a fatal result.

One of the problems of commercial aviation is that it is often driven by cost factors, where the Bean Counters who have in my opinion too much say, often place operational restrictions on matters that should not be introduced.

If we look at the Melbourne instance where penny pinching, stopped providing two laptops, which should picked up that error, was the norm.

How much for flight safety? Sometimes it cannot be fully calculated until one has both a hull loss and loss of human life. It is only then that the full costs come out. Who wants to be the one to tell the loved ones that, in this case, a lousy $2000 laptop means your men will not be coming home tonight?

Regards

Col

PJ2
5th May 2009, 22:19
mercurydancer;
could I ask to what point is safety obstructed in companies and how does that sit with the crew? I know of several examples where I have been involved in safety issues with medical staff and the experience of taking the matter to the boardroom was made difficult and harrowing, and much of it didn't actually question my findings. Any interruption to service received a particularly hostile response.
The venue of the "corporate safety board meeting" carries similar characteristics. "Who" rapidly or chronically becomes more important than "what", and the result is a posturing defensiveness on the part of the manager of the department "under the lights", instead of a discussion of 'what', and how to fix it. Territoriality, siloing, denial of data and saving face are all very active pressures and forces if a safety culture is not healthy.

Now, nobody likes to have their performance reviewed harshly so to keep the process going and to prevent people from being driven towards such responses (or worse, being driven underground where nothing is admitted/mentioned), a process of support for dissidence is crucial but an atmosphere of professionalism and deep respect for the integrity and personal qualities of the participants is an absolute must.

The other important aspect of this is, the organization may not know that anything is wrong. NASA "normalized deviance" in both the Challenger and Columbia accidents, (see the book list, previous page). Nobody thought they were doing anything wrong or even unsafe.

In fact, this is how most accidents happen: good, highly-trained, well-intentioned people acting within a flawed system in which risk is masked or dismissed and therefore responses are not seen as needed. In other words, it is extremely rare, especially in aviation, (I suspect in medicine too, but cannot speak to it), for intentional acts or even negligence or incompetence to be the cause of an accident. The job of safety people is changing awareness and highlighting the unseen - the job isn't to admonish somebody for not wearing a safety vest on the ramp, etc. That is the key point most managements miss entirely and by which they dismiss their safety people's observations and entreaties.

I believe it is eminently possible to accomplish but it requires solid leadership straight from the top and that, in turn, requires some familiarity and understanding with the processes of doing safety work and what makes a company safety-minded. Some think that all safety is, is admonishing individual "unsafe" acts. Others believe that if employees feel that it is safe to report unsafe acts, that is a safety culture. It isn't unless/until the information feedback loop is completed through changes that are led and supported right at the top of the organization. One approach is a "way of doing and travelling" and the other is just swatting at flies - they always come back and land.

The intention is never to management-bash - not at all. That implies irrational anger without the intent or expectation of change. Such an approach informs far too many management-employee, management-union discussions today. Safety is not the basis of industrial-type discussions; safety is not an industrial matter.

Management is just that: in a leadership position, and as such it is expected to lead, make critical decisions and, under SMS, be accountable for all outcomes as a result of their decisions. The messages sent by senior management will be hearkened to by "all the knights and those knights are going to ride out among the people with the message the king has proclaimed".

To return to the critical point you made regarding the hostility with which safety decisions are often greeted - yes, that has been, and in some places remains, the way safety information is received within airlines. That is a culture which historically has had to learn the hard way through aviation's (or medicine's) harshest lesson - a serious incident or a fatal occurrence which, in some circumstances (such as these days), can place the organization itself at risk.

Some posters above chose to mention decades-old examples of fatal accidents as exceptions to the general notions posited here, (in 2009). At that time, (30+ years ago), flight data analysis only existed at British Airways, pioneers in the field, (and still are, where management, the union and the regulator all have access to FOQA flight data). Also, CRM - crew/cockpit resource management - and other human factors understandings were not widely known or understood. What is missing in the posts citing these two examples is the acknowledgement that these carriers, and the ones mentioned, learned and changed and are today models of an active safety culture - not perfect, but still excellent examples of how it should be done.

QANTAS is another example of positive change. Before their accident at Bangkok, their flight data was telling them the risk of an overrun was high on the 747, (ATSB Report states this). They changed after the accident.

It is extremely difficult to "make the safety case" to managers who are driven by the pressures of the bottom line, creating shareholder "value" and the pressures of departmental performance, which never measures "how safe" one's department is because such a concept can't be quantified as '6', or powerpointed in dot-points and attention-getting graphics in the ten minutes usually allotted for such discussions at such venues as corporate safety board meetings.

If hostility and diffidence is what you are greeted with, it is my experience that the only other way that an organization will embrace good sense is the school of hard knocks.



Re the saying, "the airplane is trying to kill you", it's more of an acknowledgement that if you don't pay attention, the machine will have an accident, as we have seen. Reagan said years ago, "trust, but verify". I think that applies here: a Category 3 landing (CATIII) can be done in 600' visibility, where it is tougher (and, apparently in Canada, illegal now) to taxi to the terminal than it is to do the autoland. Pilots trust such systems implicitly, mainly because they perform now, and historically, nearly flawlessly. It is the benign, sunny day (in both our "operating theatres") which one must be cautious of.

"Autoland", "Alphaprot", TCAS, EGPWS, ADS, airport ground radar etc etc and the hundreds of technological changes now available are all "aids" designed to reduce operational tolerances while still keeping a constant level of safety. There must be medical changes in procedures, tools and insights which do the same thing - get closer to heightened risk but maintaining a high degree of safety.

infrequentflyer789
6th May 2009, 00:20
The Hercules fleet must have by now at least 800,000 hours of operation, with no loss of life or airframes over a period of 50 years.
[...]
One of the problems of commercial aviation is that it is often driven by cost factors, where the bean counters, who in my opinion have too much say, often place operational restrictions on matters where they should not.


With no disrespect to your fleet safety record, there is no general exemption from cost factors, bean counters, etc. in military or any non-commercial aviation. You may also be more likely to end up with politicians involved as well...

The recent history of the Nimrod program is the obvious military example, but there are plenty of others.

Mad (Flt) Scientist
6th May 2009, 01:45
The Hercules fleet must have by now at least 800,000 hours of operation, with no loss of life or airframes over a period of 50 years.

And, of course, the historic accident rate for large commercial aircraft over the last 20-30 years (based on Boeing and ICAO studies, and generally accepted to the extent that it's virtually part of the regulations now) is one accident per million flight hours. So no accidents in 800,000 hours actually isn't that out of the ordinary - it's somewhat close to what you might expect.
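To put a rough number on that (a back-of-the-envelope sketch of my own, not from the Boeing/ICAO studies cited): if accidents are treated as a simple random process at one per million flight hours, the chance of an 800,000-hour history containing none at all comes out at roughly 45% - unremarkable, as you say.

# Back-of-the-envelope sketch (my own figures, not from the studies cited above):
# treat accidents as a Poisson process at 1 per 1,000,000 flight hours and ask
# how likely an 800,000-hour history with zero accidents is.
import math

rate_per_hour = 1.0 / 1_000_000          # assumed long-run accident rate
fleet_hours = 800_000

expected = rate_per_hour * fleet_hours   # 0.8 accidents expected on average
p_zero = math.exp(-expected)             # Poisson probability of zero events

print(f"expected accidents: {expected:.1f}")   # 0.8
print(f"P(no accidents):    {p_zero:.2f}")     # ~0.45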

White Knight
6th May 2009, 02:25
Herkman - there have been TWO laptops aboard EK's aeroplanes for at least a year already. However the SOP was to use ONE for the take-off performance calculations, so you are incorrect in your summary.
As for 800,000 hours accident free, well the EK Airbus fleet is way past the 1,000,000 hour mark... Just the 345 fleet alone I would guess to be around 250,000 hours in 5 years, but I mention the whole fleet because most of us fly MFF..

CONF iture
6th May 2009, 02:36
How much for flight safety? Sometimes it cannot be fully calculated until one has both a hull loss and loss of human life. It is only then that the full costs come out. Who wants to be the one to tell the loved ones that, in this case, a lousy $2000 laptop means your men will not be coming home tonight?
To me a single laptop is good enough; what matters is the way you use it:

PF gets some perf figures from the laptop, inserts them in the FMS, and exits the program.
When ready, PNF takes the same laptop, gets some perf figures as well and compares them with what was already inserted in the FMS.


If a difference exists, then both guys discuss ... ?

I don't know if that is the way EK SOPs are written, but I would find it logical.
At least that's the way we proceed with a single set of performance paper charts.
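Purely to illustrate the idea (a sketch of my own, with made-up function and field names - not EK's actual LPC software or SOP): the value is that the same tool is used twice, independently, and nothing is accepted until the two sets of figures agree.

# Sketch of the independent dual-entry cross-check described above.
# All names here are invented for illustration; this is not any real LPC or FMS interface.

def cross_check(pf_figures, pnf_figures):
    """Compare the figures the PF entered in the FMS with the set the PNF
    computed independently; any difference forces a discussion before departure."""
    mismatches = {k: (pf_figures[k], pnf_figures.get(k))
                  for k in pf_figures
                  if pf_figures[k] != pnf_figures.get(k)}
    if mismatches:
        raise ValueError(f"Perf figures disagree - resolve before takeoff: {mismatches}")
    return True

# Usage: PF computes and enters; PNF recomputes from the raw weights and compares.
pf  = {"v1": 153, "vr": 158, "v2": 164, "flex_temp": 52}
pnf = {"v1": 153, "vr": 158, "v2": 164, "flex_temp": 52}
cross_check(pf, pnf)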

Uncle Fred
6th May 2009, 03:44
Very good thoughts PJ2, particularly your mention of the pitfalls of looking at accidents that occurred before the advent of FOQA, CRM, trend analysis etc. We have collectively learned from previous accidents.

I very much agree with your thought (at least as I interpreted it) that airline safety is not about berating someone for not wearing a safety vest but rather striving for both a changed awareness and looking for flaws that might occur in "the" system, e.g. the NASA review.

It has taken decades to adopt an attitude of "fix the problem and not the blame". The more enlightened operators in any industry, airline or not, realize that fixing only the blame ends up fixing nothing. Safety is an attitude that is fostered, grown, and reinforced, not mandated.

Great remarks as well about fatigue that you posted in another thread - definitely the 800 lb gorilla in the room, and one that all too often only gets lip service but never seems to be seriously acknowledged. As it was once well put, "if you awaken Rip van Winkle from a 20-year slumber and keep him awake for one night, the 20 years of sleep are for nought." Why it is so constantly denied that rising at 1 a.m. body time for a ten-hour flight is a fatigue risk beggars belief, although that is a topic for a different thread...

PJ2
6th May 2009, 06:44
Uncle Fred;

Re your response...these are the reasons why I don't fully trust the IOSA process. I've seen an IOSA safety audit give a passing grade to a carrier in circumstances that didn't deserve it, (and another country to which the carrier flew agreed and banned their flights until the issue was dealt with). Too much politics and economic pressures in trade for the title, and not sufficient diligence in my view.

The certification is at risk of being a sought-after label or even a brand and not an "ISO 9004" type standard. IATA is a corporate lobby group for airlines and while it has a significant presence in flight safety, they also strongly lobbied for example, against changes in flight time and duty regulations in Canada.

It's never black and white of course but you can't suck and blow at the same time.

The industry is extremely good at what it does but there exist illusions and denials which are inappropriate, and at odds with the known data in, for example, FOQA, LOSA and ASAP programs. That's the only message I am ever really, constantly serious about - illusions are fine for stage management but don't work in aviation. "Fly the airplane" (meaning, know you're in the aviation business and all that entails including knowing what your airplanes are doing), whether you're the CEO or the guy who checks the oil, and the business can be profitable over the long run and you won't hurt people.

4Greens
6th May 2009, 08:18
For those who wish to delve deeper into why safety is so difficult to achieve with error-prone humans, try reading up on Risk Homeostasis. A gentleman called Wilde is the chief guru.

alf5071h
6th May 2009, 12:55
Error is a very complex subject. An interesting view is given by Woods & Cook in Perspectives on Human Error. (http://csel.eng.ohio-state.edu/woods/error/app_cog_hand_chap.pdf) Error could be a categorization after the accident (hindsight bias), or considered as a process generating the accident. The references to knowledge in this paper, IMHO are important both to this discussion and aviation safety.
Other error categorizations are given by Reason in GEMS (generic error modeling system) which includes Skill, Rule, and Knowledge based behavior (note that rule in this instance refers to the ‘rules of mind’ not SOPs); there are similar categorizations in HFACS.

A more practical view is given in Errors in Aviation Decision Making (www.dcs.gla.ac.uk/~johnson/papers/seattle_hessd/judithlynne-p.pdf) (Orasanu et al). Here decision error is broken down as a failure to understand the situation, or, with correct awareness, an incorrect choice of action; these definitions appear to relate to the TK and EK accidents respectively.

A cultural view of situation assessment (or accident review) might come from New Scientist “East meets west: How the brain unites us all.” (www.newscientist.com/article/mg20126981.700-east-meets-west-how-the-brain-unites-us-all.html?full=true)
My very simplistic interpretation of this is that a traditional ‘Eastern culture’ might perceive a situation holistically – a wider view including the relationships within it, whereas the ‘Western culture’ perceives a situation analytically – a narrower logical view.
These views would not be restricted to national culture, thus organisation and professional cultures (independent of nationality) should be considered, possibly with the largest divide amongst the ‘professionals’ e.g. commercial pilots / GA.

Relating the above to the original question, the different views might indicate how the situation preceding the accident is judged in hindsight. In addition, there could be bias due to different understandings of human factors (beliefs, myths, experience, knowledge of error) and the expected professional standard; see my post at http://www.pprune.org/4904796-post2429.html

Thus the mode of thought (or capability for measured thought) might account for the differing views of the accidents.
If this is relevant then a focus on our thinking skills could open opportunities for safety solutions. Crews should be trained how to perceive situations and assess their own performance (self awareness - thinking). i.e. we cannot train-out type 3 mistakes as they are inherent in human behavior, but we could train crews how to think more effectively instead of relying on rules, procedures, and guidance, which have limited applicability.

A form of ‘self’ TEM, or ‘self’ LOSA, but aren’t they the basis of airmanship?

FlexibleResponse
6th May 2009, 14:28
mercury dancer,
Could I also ask a question about the comments that the aircraft is trying to kill you? I understand the general meaning - that flight involves physical forces and conditions which can easily exceed what humans are built to withstand - but is the comment an expression of the attitude of the flight crew? It seems to imply that there is some conflict between the aircraft and the pilots. Does it indicate a mistrust in the aircraft and its systems? If so, to what extent?

Just to take this one point...perhaps the phrase "the aircraft is trying to kill you" is analogous to "the patient is trying to die on you".

One false move, or one moment's inattention, and the patient will die on you due to the way in which you are conducting, or failing to conduct, the procedure.

In aviation, those in the team assisting the Captain are empowered and encouraged to voice their concerns if they perceive that some action or decision might in some way be detrimental to the safety of the flight. They are not ridiculed or punished in any way if such concerns ultimately are shown to be unfounded for whatever reason (say inexperience). In fact, the concern expressed is directly addressed and justified or explained, or the decision is found to be faulty and modified. Anyone contributing will be rewarded with a thank you for their observation and interjection.

Special phrases are sometimes used to alert the Captain that a concern is not a minor event and a very positive acknowledgment and response is required. Some well known airlines use the phrase, "Captain, you must listen to me..."

This is a real attention getter and cannot be ignored.

The outcome of a flight is the product of the team not just the Captain. In the same way the outcome of an operation is the outcome of the team and not just the chief surgeon.

That is not to say that the individual skill of one of the team may not be the determining factor in a good outcome, but it is to say that an aircraft or a life should not be lost due to an error that was noted by one of the team and not vocalized in a timely manner.

Oh, I thought he saw or realized that, so I didn't say anything at the time...

I have spent a lot of time training pilots to fly airliners. But the first lesson is preceded by, "If you don't understand what I am doing, or if you see me doing something wrong, you must tell me immediately." On occasion trainees have made a very direct contribution to the safety of my flights.

I didn't invent this type of crew training and interaction; it came from CRM training that was given at my airline and is now part of the curriculum to gain an airline pilot's licence.

dessas
6th May 2009, 14:35
Some questions and my answers:
Q: Why did this new Boeing not have "ALPHAPROT" or at least a low-energy warning as all the buses do?
A: Because it is cheaper to make it like this (higher mark-up for Boeing)...
Q: Why do we have to use LPCs and various similar customer programs to calculate the speeds in an un-pilot-like manner?
A: Because it is cheaper to do it this way, rather than implement a totally new purpose-built design that would never allow the proverbial alignment of the cheese holes...
Q: Why, after so many incidents and accidents, has Boeing not yet redesigned the oxy supply on the 737?
A: Because it is cheaper to make it like this (higher mark-up for Boeing)...
I can go on for ever!

On a different tack:
IMHO the problem is that the machine-human interface is becoming ever more sophisticated, but dangerous as well, as it invites operators to diminish the importance of experience! Because operators and regulators are being told to by Airbus and Boeing!!!
I got my first taste of wide-body flying about 6 years and 3500 hours into my career. It was close to the industry standard of the mid-80s. Now every other time I have an F/O on my right with about 1500h, the "playstation" type as someone rightfully named them.
These playstation guys can interpret fairly well what the dials and various fancy diagrams are telling them, but can they use this chewed-up info to build a good enough "greater picture", especially under extreme stress?
On the other side, could the "old hands" be lured into the "cheese trap" by just becoming happily complacent because the computer "flies itself so well"?
I guess those last ones are the zillion-dollar questions, but they keep coming to me on these long, long, long 2-man operated flights...
:mad:

Kerosene Kraut
6th May 2009, 15:00
"Not so when LH had the dubious distinction of seriously crashing the first B747and killing a bunch of German tourists at NBO in 1974 due to crew error: Unverified leading edge flap setting. The european press had a field day, and LH's shiny image was dragged through the mud."

Gluey,
the Lufty flight crew got cleared by a German court back then. There was no proof against them, but a possible problem with the slat indicator in the cockpit that seemed to have happened to a British 747 in a similar way shortly before too. However, the Germans had not been aware of that incident before.

Blacksheep
6th May 2009, 22:08
From EASA Safety Information Bulletin 2009-12, published as a result of the Amsterdam accident.

"If one LRRA provides erroneous altitude readings, the
associated flight deck effects may typically include:
• Inappropriate Flight Mode Annunciation (FMA)
indication of autothrottle RETARD mode during
approach phase with the airplane above 27 feet Above
Ground Level (AGL). There will also be corresponding
thrust lever movement towards the idle stop. The FMA
will continue to indicate RETARD after the thrust levers
have reached the idle stop rather than change to ARM.
• Large differences between displayed radio altitude.
• Inability to engage both autopilots in dual channel
approach (APP) mode.
• Unexpected removal of the Flight Director Command Bars
during approach on the pilot’s side with the erroneous radio
altimeter display.
• Unexpected Configuration Warnings."

Faced with multiple unexpected and confusing indications and warnings a crew should "fly the aeroplane". Human nature is however, attuned to solving problems by shutting out everything but the task in hand. The instinctive reaction to multiple problems is to focus on the first to be noticed and ignore the rest. It is this conflict between instinctive behaviour and trained behaviour that lies at the root of "human factor" accidents.

mercurydancer
6th May 2009, 23:02
pj2 and flexible response

Thank you for your replies. It's appreciated.

greywings
7th May 2009, 03:42
How refreshing to see such a fascinating debate on such an important subject. Although I cannot add anything of particular import to the proceedings, I would like to join the debate with part of the conclusion of a paper that I recently wrote, entitled, 'Where are we going with the Accident Rate?'. Although I will be guilty of repeating some of the points already made I consider it a privilege to be part of this discussion.

Thank you,

GW


"6. Conclusion

The above in no way implies criticism of the entire industry; indeed, most pilots and managers perform to admirable standards, often under enormous commercial and operational pressures. However, as an industry we have no right to go on killing people and losing aircraft at the present rate, especially when most losses are totally avoidable. If I had to identify one area that causes me most concern it would be over-reliance on automation and the reduction of manual handling skills. Engaging the autopilot immediately after take-off and disengaging it immediately prior to landing may result in an efficient, fuel-saving flight, but insistence by management that automation is used exclusively leads to a reduction of handling skills and possible confusion when an unfamiliar situation develops. The current increase in loss-of-control-in-flight incidents / accidents may well be at least partly attributable to this. Although the current move to improve 'Upset Recovery' training is all well and good, we should also be ensuring that crews do not get into the situation in the first place. Knowledge of the handling characteristics of the aircraft in all stages of flight, combined with good handling skills, will help alleviate the problem.

In an ideal world, every flight would be routine, the weather perfect. Engine failures would occur precisely at V1 and all abnormal and emergency situations would occur in the cruise, without any distraction and with plenty of time to run appropriate checklists. However, as we know, this is rarely the case. Whereas some pilots are fortunate, with their incidents occurring under ideal conditions and no decisions required other than to land the aircraft immediately, others face situations that test their technical knowledge and handling skills to the limit. Unfortunately, and all too often, the latter situations do not get the publicity they deserve, which is a lost public relations opportunity apart from anything else.

As an industry we need to train thinking pilots and all too often we are not doing that. Instead we place our faith in technology and rather than developing human skills and knowledge to match the improvements in technology, we often do the opposite. New cockpit designs and procedures place great emphasis on heads-down activity, drawing attention away from what is happening outside the cockpit with a commensurate decrease in situational awareness. Recruitment and training bear special mention. In times of rapid growth in the industry demand often exceeds supply and people end up in the cockpit with less than desirable experience and, in some cases, aptitude for the job. More needs to be done to weed out those who do not have the personal attributes to perform consistently as required, in all situations and always to an acceptable standard. Fear of legal consequences has led to training and check reports being marked 'satisfactory' or 'unsatisfactory'. I would suggest that there is nothing less satisfactory than those two comments. Unless there is a comprehensive paper trail that can be followed in the event of an accident, how can we ever validate our recruitment and training systems? Failing to do this is dishonesty of the highest order.

Pilots and managers share responsibility for the standards in our industry. When things go wrong it is far too easy to blame the regulators, manufacturers and others who have an influence. At the end of the day, we know what is right and what is not, what we should and should not do. We need to make more effort to come together to address those issues that we know are in need of attention."

PJ2
7th May 2009, 04:43
For me GW, very well stated indeed - it may be the same elephant in the living room but stating it as many ways as possible is needed. Well done, in my view. I think Canada in particular, needs to hearken to the message - the FAA in the U.S. has already been chastened by events under SMS. We can only avoid the outcomes of the privatization of safety for so long.

SPA83
7th May 2009, 06:11
When you have both poor maintenance and poor airmanship in an airline, it's no use to blablabla about human factors after an accident. You need first to question the mediocrity and the safety culture.

lomapaseo
7th May 2009, 13:20
When you have both poor maintenance and poor airmanship in an airline, it's no use to blablabla about human factors after an accident. You need first to question the mediocrity and the safety culture.

You're on page 1 and we're on page 4. The question has already been answered and now we're at how to address the answer.

FrequentSLF
7th May 2009, 17:53
I found this thread very interesting.
In my opinion there is something missing on this thread.
Great contributions from the likes of PJ2, lomapaseo and many others...
However, I still find puzzling the talk of "lack of training" and "poor airmanship in the airline".
I questioned in the past how a pilot would deal with feeling that he/she was in need of extra training. I did not get a clear-cut reply!
I am SLF, therefore my opinions/feelings are definitely questionable, but let me say that the issue of lack of training sounds more like a good excuse to shift the blame. The same goes for the "culture issue", another way to find something to blame for systemic failures.
Let me explain: I assume that all pilots are highly trained professionals who are "immune" (cannot find a better word) to incompetence and "culture issues". I also assume that all pilots are 100% dedicated to their job and would not compromise on any safety-related issue.
If my assumptions are correct, why blame the lack of training? If a professional feels that his/her training is not the best, he/she should voice concern about it...
Furthermore, a highly trained professional should not be influenced by "culture issues". He/she should be able to overcome such issues without blinking.
Am I really out of bounds?
Are we witnessing a degrading of the level of professionalism of your profession?
I might sound provocative, and I regret that, but I feel that it should be a topic of discussion. We (you, actually) cannot easily dismiss the type 3 errors (Turkish) by saying that those pilots were incompetent. IMHO the first question should be "why were they incompetent?"
Just my 2c.

greywings
7th May 2009, 18:12
Thanks PJ, I am encouraged by all the comments made so far. Clearly, we are all thinking along the same lines. However, the piecemeal approach that we are taking merely dilutes the efforts. Well-intentioned people work hard to improve things - and I include managers as well as pilots - but often with differing interpretations of what is the most effective way to proceed. The mandating of SMS (for instance) is all well and good but unless there are clear guidelines of exactly what is required there could be as many different systems as there are operators.

As someone who has seen the problems from both the line pilot and management sides, the solution seems obvious. Both sides have to work closely together. Each has to set aside the mistrust that predominates in many operations, and I include here corporate (business) operators as well as the airlines. There are glimmers of hope: pilot / management relations are excellent with some operators, but, sadly, not in all.

Clearly, there isn't a shortage of good ideas. We have seen many offered here in the last few days. How we collate those in a useable form and get all stakeholders to buy in is the challenge.

Regards,

GW

PJ2
7th May 2009, 19:24
FrequentSLF;

Your questions and observations are valid in the sense that they are legitimate and thoughtful questions to ask.

First, your observation regarding the very high level of professionalism, dedication and "address" (a term that intends to convey the notion that one is "aggressively, thoughtfully and caringly professional") by 99% of aircrews is correct in a number of senses, the main one being: aviation kills those who are habitually inattentive or careless.

I think it is safe to say that there is not one person now flying commercially, professionally, who does not know the instant rush of adrenaline and all the other sudden physiological responses which follow a serious incident in which one almost lost one's life and almost cost the lives of one's passenger(s); one never forgets the sensation - it is never "familiar". It is equally safe to say that 99% of all pilots will know at least one friend or acquaintance, if not a few, who were killed in an aircraft accident.

These are viscerally effective teachers, the lessons from which are not forgotten.

It is a pity that neither the bean-counting crowd nor the CEOs, Presidents or other executives of aviation companies experience such primordial levels of physiological response; they might be a bit more careful and attentive themselves to the business they're in and "fly the virtual airplane" more mindfully. I don't mean this unkindly or in a "bashing management" way - I think it is crucial for executive-level managements to know how safety works and that it certainly isn't admonishing someone to "not run with a knife", or to "be careful out there", etc.

This in hand, we all know that comfort and complacency are psychological states of mind to which humans are susceptible and which, when permitted, have sociological and organizational expressions, captured under many new awarenesses the most recent of which is called "the normalization of deviance".

It is not the easiest thing to "be aware of what one doesn't know", at least until one meets circumstances which challenge one's "address" significantly enough to momentarily set one back on one's heels. That is a human trait as well. That is one of the chief projects of safety work - to highlight heretofore "invisible" factors, bringing into awareness those issues and circumstances which may harm one, or one's organization.

In a healthy safety culture, if one feels the need for training, one asks for it and one is granted the opportunity.

The caveats are interesting however. One asks because one is "lacking" - and it is hard for a pilot to admit same, or to be shown as lacking through failure, especially in a culture where macho he-men "make no mistakes" (or which fires those who do). Such cultures exist, (as we have seen here), and one does not always obtain the necessary and appropriate response.

In airlines which have a functioning FOQA Program with an associated safety culture that takes such a program seriously, an agreement between the airline and the pilots' association will inevitably be in place which permits the addressing of competency/training issues as may be showing up in the data.

This doesn't mean that management is using data to "go after" a pilot. The arrangement is, the pilots' association accepts the responsibility of approaching an individual, discussing the what and why and, through the FOQA Agreement, scheduling training as needed. The entire process is "below management's radar" but they are aware due diligence is served. These kinds of responses are extremely rare; in circumstances which may call for some comment, it is a "fine-tuning" response usually from simulator or checkride sessions and the individual almost always voluntarily addresses the situation.

I have, with others, built these processes, seen them in play and know all of this works and works extremely well, but the company must obviously be onside and trust the process.

Usually, long before anything "semi-formal" arises, one's colleagues with whom one shares a pretty tiny environment, may make respectful but necessary observations which go to the same point - competency and training. Increasingly, this is done without fear of reprisal, hostility or pouting in the corner... That's a small part of what CRM is. One would hope that at some point the medical profession will arrive at this same level of mini-intervention where one's professionalism and integrity remains intact but a caring observation from a colleague can still remind one that something may need attention.

Enlightened organizations which know that employees are tremendous assets and not mere millstones around profit's neck will have programs which can formally or informally intervene when something is coming off the rails. Financial issues, family issues, health issues, addiction issues and so on affect everyone; discipline or dismissal is almost never an appropriate response. Peer-to-peer programs, Employee Care programs and so on provide necessary responses long before the collective agreement clauses are anticipated or used.

I know of no major carriers who enjoy long-term success in this squirrely industry which do not have all these approaches to human factors in place to some degree or other.

We should be under no illusions either; these programs aren't about social "wellness" just for the sake of it; corporations are clearly not in the social welfare business as we well know - these programs are about employee productivity and keeping valuable, highly-trained resources at work and functioning towards the company's only goal - financial success.

To your last point regarding, "are we experiencing a degrading of professionalism...?", - In my view and as I have expressed a number of times here, yes, we are.

Even though personal standards and integrity in pilots are in the end a survival tool, such endeavours and standards must be fostered and supported within the organization. It is a complex inter-relation which cannot rely solely on "personal standards" to retain high safety levels. The best example I can think of are the two Shuttle accidents. Can one think of a more highly-trained, dedicated and professional crew than those of Challenger and Columbia? But it is clearly demonstrated in the hundreds of studies, papers and books on these two accidents that they were purely "organizational" in nature; there was absolutely nothing either crew could have done, either before launch or during flight, that could have saved the mission.

I realize that these are outlying examples and that airline work is much farther away from such operational boundaries; but the principles are the same. If we turn to the pilots for the reason an accident occurred, we will not know the whole story and so it will repeat itself again, either within the same organization, or, because sharing information is still very much in its infancy (in terms of trust as well as airline interest), within other airlines.

FSLF, hope this is useful - this is the stuff of large books and late nights, y'know!

lomapaseo
7th May 2009, 21:37
I'm just probing a little more here to better adjust my own points of view.

I'm always bothered by throwing a problem over the fence with such throwaway words as:

they screwed up... end of story

The bean counters are in the way

management doesn't care

it's in their culture

etc. etc.

OK maybe those words are a little over the top, but they do serve as an introduction to what I sense

In my own interactions with the safety professionals within our industry I do see dedication and an unwillingness to sweep things under the rug. Furthermore I do not hear any sounds of the crass statements that I made above in my introduction.

So why am I hearing it here on this board?

I do recognize that the great majority of us don't work directly in a flight safety office, but from personal knowledge of those that do (third hand or better), just how much are your hands tied when it comes to identifying shortfalls and following up that they are addressed?

Am I to understand that this is an ad-hoc issue, even driven by culture? Or do the great majority attend to these issues in an openly funded process free from interference from bean counters and management politics?

bubbers44
7th May 2009, 22:26
#1 RA wasn't working properly. It did not cause the crash; the pilots, who are normally monitoring instruments, caused the crash by not monitoring. I don't think Boeing has much to do with this crash. You cannot make an aircraft so safe that the most incompetent crew in the world can fly it with no problems. Airbus may try to make pilots unable to screw up, but I hope Boeing doesn't. Having final control of the aircraft, with no computer overriding me, has been wonderful in my career. I love Boeing products that let you override anything that doesn't do what you want. I know this was a Boeing, but nobody was flying it.

PJ2
7th May 2009, 23:58
lomapaseo;
I do recognize that the great majority of us don't work directly in a flight safety office, but from personal knowledge of those that do (third hand or better), just how much are your hands tied when it comes to identifying shortfalls and following up that they are addressed?

Am I to understand that this is an ad-hoc issue, even driven by culture? Or do the great majority attend to these issues in an openly funded process free from interference from bean counters and management politics?
In my 20-year experience as a safety specialist I never experienced tied hands, fear of reprisal, threats to job security, or restrictions in identifying shortcomings, even when speaking directly to senior executives. Funding, while strategically parsimonious to a fault, was still there even under difficult economic circumstances.

Your question assumes however, perhaps without intending it, that if these aren't issues then everything will proceed normally because who could be in possession of the data and then not use it? It seems so obvious as to be not worth considering. But in my experience it must be considered.

If a carrier collects FOQA safety data but does little or nothing with it, or simply ignores or dismisses it as "wrong" (when inconvenient), or otherwise explains serious occurrences away, is one still "doing safety"? I submit that such box-ticking is not doing safety at all but is merely satisfying SMS requirements, or rather the illusion of "having a program in place." That's just not how SMS is supposed to be done, at least in my understanding - you're supposed to collect data, review it regularly and, where indicated, change, and not take a few years to do so when the data indicates a consistent high-risk trend in aspects of the operation.

Obviously this isn't the place for a detailed discussion but I can assure you without hesitation and with plenty of examples that the impeding factors and "characterizations" which you describe above have been, in my 2 decade experience as a safety specialist, unmistakable and significant. I don't use the terms I do lightly.

But you make a valuable point nevertheless. "Parsimony" or aggressive cost-control does not, in and of itself, "cause incidents/accidents". I think that is an important point to understand and I think this is your point: we cannot "blame" the bean-counters for "not installing this-or-that, or funding this-or-that" in and of itself. Some very fine operations continue unaffected with thin budgets. It is when an organization over-reaches its funded risk mitigation strategies and, further, isn't aware that it has done so, that risk increases significantly.

To support this point even more, most of the serious incidents and high-risk trends I/we became aware of and communicated (vigorously and often) to the appropriate personnel were not high-cost items to fix. I think this is the case with the vast majority of causal pathways (why-because) - shortage of money was not a factor; awareness and the corporate will to act, however, were.

I think the term "beancounter" is used pejoratively by many, including me, to inform others of their frustrations with the larger processes of change, or rather the thwarting of change when/where indicated from a safety program point of view. I think the duty day and crew fatigue issue is an exact case in point. Crew costs are always right at the coal-face just as fuel costs are. Where our professional differences lie is in the tremendous lobbying efforts by those who we see as interested only in paring down costs, without sufficient or educated justifications, to enhance "the bottom line".

That is the "story" for us. How it is viewed from our beancounters' side has, to my knowledge, never been provided, at least not to us. They simply ignore what we have to say about crew fatigue and the causes of accidents and carry on lobbying and complaining about the cost of fourth pilots or pilots doing safety work with reduced flying schedules. I am certain that pilots are seen as whiners, complainers and prima-donnas by those who have never been along on a trip and seen what is done to keep our, and their, operation going. 'Twas ever thus.

lomapaseo
8th May 2009, 01:29
PJ2

Thanks for your as always thoughtful reply. Let's see what others might also contribute.


You have pointed out a couple of points which I take as:

Some safety folks don't recognize or properly use the data available to them. That's a big problem if it turns out to be pervasive.

The pilot fatigue problem is not being captured in the safety data and as such continues as an unabated and possibly increasing risk.

I'll probably have more to say about both of these points if they turn out to be well supported across the other posters

FrequentSLF
8th May 2009, 05:14
P2J,

Thanks for your reply, I could not ask more!

We shall never forget that "employees are the biggest asset of a company".

FSLF

PJ2
8th May 2009, 06:01
lomapaseo;
Some safety folks don't recognize or properly use the data available to them. That's a big problem if it turns out to be pervasive.

No, that's not what I meant - it is when the safety people do their due diligence, report what is being seen in the data, and what is reported to the operations people is ignored, dismissed, discredited or simply not even acknowledged by a response from them.
The pilot fatigue problem is not being captured in the safety data and as such continues as an unabated and possibly increasing risk.
The pilot fatigue issue is being captured in the data. My point was, the data is being strategically ignored (with corporate "justifications").

We already have accidents directly attributed to crew fatigue - AA at Little Rock, MK at Halifax, to cite two examples from memory, one domestic, one international. That, and crew reports, as well as the available science on the physiological effects of long wakefulness, especially across time zones (domestic or international).

The issues swirling about this thread have to do with human factors, and how the aviation system is addressing them - coming to terms with them through due process. One poster asked if I thought these issues were degrading professional standards - I said yes, they are. As always, there are pockets of "resistance" and pockets of concern within an overall highly successful safety record. It is trends and the nature and quality of recent accidents which are on safety people's radar. The accidents are not technical failures, bad weather, running into terrain, etc. They're loss of control and situational awareness, as well as approach-and-landing accidents which were well outside stabilized approach criteria. These are bread-and-butter issues for this industry, but there they are - occurring. We can see them in the data, now...today, and have continued to report them. How clear must one be before someone "gets it"?

In terms of flight duty times alone (and I don't have my needle stuck on that single issue, but I've done a career-load of long-haul and know the effects inside out), nothing has materially changed since the 70's, and in Canada it's worse, where two crew members and a relief pilot (not certified/trained to take off or land - legally in the seat only at cruise altitudes) can be on duty for up to 20 hours, 23 in "unforeseen circumstances". Where else except in the third world does such laxity obtain?

In terms of data use to assess trends in risk levels, the record is spotty at best and we are moving towards self-regulation in flight safety, with data collection and use being the center-piece of SMS. If the data is ignored by those responsible for cost control (flight operations management personnel), SMS is not being done. That is, in my view, a distinct, "clear and present" risk. While auditing by the regulator may discover such shortcomings, staffing levels within the government department cannot begin to deal with the work of actually sifting through and dealing with the results. The trend is toward "self-regulation", all the way down and the safety people know it.

There are a lot of people in the industry who comprehend all this very clearly and who have also commented here. This isn't "me" talking - this is the industry.

I hope too, with you, that others will at least offer a comment, perhaps even in disagreement - the dialogue is the thing, not "being right" or "succinct". There is much more to this than I am capable of expressing and we have seen a bit of this already.

lomapaseo
8th May 2009, 14:06
I hope too, with you, that others will at least offer a comment, perhaps even in disagreement - the dialogue is the thing, not "being right" or "succinct". There is much more to this than I am capable of expressing and we have seen a bit of this already.


We are not necessarily in disagreement but probably approaching this from different directions. However we do agree that we need other views as well.

I sense a degree of frustration on your part about the use of the data. I also agree that there will never be enough regulators to audit and identify poor use of the data.

Of course in my many years I have come across the same. The way that it can be addressed is to promote a uniform process that forces the use of the data in a gated-system approach. The regulator then audits that an approved system is in place without necessarily dotting the i's and crossing the t's. This method does require some buy-in by the stakeholders. This latter comment is a biggie and my words below may be beyond the interest of most thread readers.

1) You need buy-in by the stakeholders that there is a problem and addressing it will be good for all.

2) Proposals are then made for adoption of best practices (your bigger airlines probably have developed some of these already)

3) Several of the best practices (those that best fit their operations) are then identified for all to embrace. Shake hands on this.

4) The regulators then follow up to identify which of these best practices have been adopted.

Now when/if an accident occurs the more advised in the industry (on boards like this) can point at the best practice that could have addressed the causal chain without blaming the crew or their culture.


And I realize that I am not addressing crew fatigue, since I have no data other than anecdotal.

Silverspoonaviator
8th May 2009, 17:05
This is perhaps the most logical and non confrontational thread ever on Pprune.

No bitching, or "blaming", (well almost), and a logical progression of thought trying to explain the accident factors and processes.

Would it be possible for the major contributors to produce a combined document that covers this thread, in a PDF version, for presentation to management, as well as a back-up resource for any safety lectures?

ssa

alf5071h
8th May 2009, 18:04
FrequentSLF, IMHO your assumption goes too far (#65).

If ‘incompetence immunisation’ is having the required skills to undertake the job, then pilots are trained and checked to both have and maintain minimum standards. However, these may not qualify them individually, for dealing with all situations. In commercial aviation, there are instances where both crew are required to provide an acceptable level of safety, i.e. a training flight could have higher risk.
Skill is an acquired behaviour which has to be developed. Flying skills are relatively easy in modern aircraft, but ‘operating’ skills within the modern aviation system are more difficult to teach and improve – the problems of gaining experience in the digital world, hi tech etc.

One problem identified repeatedly in this forum is the human-automation interface; this is a complex subject where there are many differing types and standards of automation. Furthermore, the implementation of automation (technology) can be quicker than changes in the bounding regulations or training (particularly the removal of old habits – changing skills). E.g. the 737 is a relatively old aircraft design, leading to a perception of inadequate automation. When operated 'as designed' it's good and meets the safety requirement; however, if used in a complex situation, perhaps with a more recent hi-tech operational philosophy (always use autos), then workload might increase and safety reduce, with opportunities for error.
Most of the piloting skills in flight operations are mental – hence my previous post re the need to train pilots to think. Thinking is a dominant element of professionalism ~ airmanship ~ (discipline, skill, proficiency, knowledge, situation awareness, judgement, - T. Kern).

Similarly, ‘training’ pilots for resistance to cultural issues is difficult; the extreme is like saying that pilots will not suffer errors. Cultural resistance, (national, organisational or professional culture) is also a thinking skill, closely associated with CRM training, but often not recognised in this area and hence not taught. Yet operators meet the requirements so perhaps the regulators need to take note.

‘Is the level of professionalism falling’?
Probably, when judged by the high ideals set by the professionals in the industry.
Possibly, when having to maintain a high level of safety in a rapidly changing (technical and economic) aviation system; pilots may be unable to acquire the necessary knowledge in the timescales available – this could be seen as a shortfall in training or poor professionalism.

‘Were they incompetent’; we wait for the safety reports.
If incompetent – lack of skills, then why … what was the standard of training provided, materials available, money, management policy, etc, etc; the conclusion cannot rest solely on the crew.
The alternative is that the crew had sufficient skills (a minimum for the task), but were these at the limits of human performance for the situation the crew encountered? As above, the investigation cannot stop there (pilot error). There should be judgement elsewhere on the required standards and whether the crew met them. Were procedures / guidance to minimise error in such situations provided, were these the optimum, etc, etc? Why did the crew encounter such a situation? These may be operator or regulator responsibilities.
This begs the question of how deep you go with the investigation. In my experience it has to be deep and practical enough to prevent another accident, but in this quickly evolving and complex world what are the timescales that an assessor has to consider?

greywings
8th May 2009, 18:05
Sounds like a good idea.

I have recently produced the paper I referred to above which I will be pleased to send to you if you like. It is philosophical rather than full of detail but it does highlight many of the points that I have raised or have been raised by others in this thread and may be of use / interest to you.

As I stated previously, there isn't a shortage of ideas; in fact, the opposite is true, and maybe that is why it is so hard to make some serious progress. A paper that encompasses the best ideas and presents them to the aviation community via various periodicals and conferences may have the best chance of making a difference.

Despite the above and comments by others, we have to remember that although safety should never be compromised, there will always have to be a compromise between operational excellence and commercial expediency. We have to accept that as a necessity and work with management and regulators to provide something that, while guaranteeing safety (as far as is possible), does not impede commercial success - otherwise we will all be out of a job.

Thank you for your suggestion.

GW

FrequentSLF
8th May 2009, 19:01
alf5071h

IMHO your assumption goes too far

You are right. My intention was to be "provocative" in a positive way.
I pushed the rationale a bit to the limit, but that has provoked two great replies, yours and PJ2's, which have provided very exhaustive lines of thinking about the issues I have raised.
I am confident that you understand that I do not have the knowledge to write even half of what you and PJ2 have expressed in your posts. However, let me say that I was expecting some replies along those lines.
I thank you for spending your time to write your post, which adds a lot to this thread.

‘Is the level of professionalism falling’?
Probably, when judged by the high ideals set by the professionals in the industry.
Possibly, when having to maintain a high level of safety in a rapidly changing (technical and economic) aviation system; pilots may be unable to acquire the necessary knowledge in the timescales available – this could be seen as a shortfall in training or poor professionalism.

If you allow me to comment the above

Do you mean that the industry standards of professionalism are set too high? I do not think that saying "the bar is set too high" means that the levels are falling. However, if that is used as an excuse (i.e. the bar is set too high to jump it), it is a symptom of falling professionalism.
The second part of your statement is the most intriguing. How do we address it? Airlines are always looking to expand the network and the frequency, which will mean shorter timescales. Is it humanly possible to train "good" pilots in a shorter time? We might have hit the nail right on the head.

Thanks once again
FSLF

PJ2
8th May 2009, 19:02
Silverspoonaviator;
I think your suggestion is a good one but if I might offer an observation followed by a suggestion...

What we are seeing here is the informal expression by both safety and pilot professionals who are experienced in this work and who are offering native understandings of aviation safety and how it should work. These understandings are largely (though not wholly) intuitive to the profession, though likely are not part of the toolbox of many operational personnel. The notions here expressed have been formally presented and otherwise addressed in a number of superb books listed by me and a few others earlier in the thread. If I might ensure an understanding about what's being said here, there is nothing new being said which is not already covered, at far greater depth, in those works.

If there is anything that is frustrating or confusing to those who do safety work, it is the misapprehension that safety means "wearing reflective vests on the ramp", etc, etc, etc. I agree that this thread illustrates that such work has far deeper groundings in organizational behaviours and priorities. Contrary to another misapprehension, such processes needn't impede the primary goal of the organization - to make money for the shareholders - and can be done reasonably without attracting huge expenditures.

Knowing what your airline and airplanes are doing (in terms of Air Safety Reports and Flight Data as well as LOSA - Line Operations Safety Audits), and comparing that data with what you expect and what the SOPs are, is a significant first step, and while such a program is initially expensive, the awareness (and thus the safety) dividends are significant. That said, such expenditures are extremely difficult to argue for and justify to the beancounters because they think in terms of "quantities", but flight safety, by definition, is about "quality" - thus the latent frustration being expressed here. In other words, in response to the question, "how safe are we?" (which would be asked by somebody who didn't know what they were talking about), you cannot say "6"... but you can say, "we have a trend towards non-stabilized approaches, especially at such-and-such an airport". A suitable, diligent response would be to ask your training and checking people what they are seeing as well, to make SOP changes, and then watch the data for positive responses. Clearly, this being a human activity, recidivism (reverting to old habits) is an issue which requires addressing in any SOP or other procedural changes.
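Purely to make that "what's in the data" idea concrete, here is a toy sketch (in Python, and emphatically not any airline's actual FOQA/FDM tooling) of the kind of trend question described above - "do we have a trend towards non-stabilized approaches at such-and-such an airport?" The gate criteria, thresholds and field names are illustrative assumptions only.

# Illustrative sketch only: flag non-stabilised approaches from simplified
# per-flight snapshots and compute a rate per airport and month.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class ApproachSnapshot:
    airport: str              # destination airport code
    month: str                # e.g. "2009-03"
    ias_minus_vref: float     # knots above Vref at an assumed 1000 ft AAL gate
    sink_rate_fpm: float      # descent rate at the gate, feet per minute
    gear_down: bool
    landing_flap_set: bool

def is_stabilised(s: ApproachSnapshot) -> bool:
    """Very simplified stabilised-approach gate (assumed criteria)."""
    return (
        -5 <= s.ias_minus_vref <= 20
        and s.sink_rate_fpm <= 1000
        and s.gear_down
        and s.landing_flap_set
    )

def unstable_rate_by_airport_month(snapshots):
    """Return {(airport, month): fraction of approaches not stabilised}."""
    totals = defaultdict(int)
    unstable = defaultdict(int)
    for s in snapshots:
        key = (s.airport, s.month)
        totals[key] += 1
        if not is_stabilised(s):
            unstable[key] += 1
    return {k: unstable[k] / totals[k] for k in totals}

if __name__ == "__main__":
    sample = [
        ApproachSnapshot("XYZ", "2009-03", 25, 1200, True, True),  # fast and steep
        ApproachSnapshot("XYZ", "2009-03", 5, 700, True, True),
        ApproachSnapshot("ABC", "2009-03", 10, 800, True, True),
    ]
    for (airport, month), rate in unstable_rate_by_airport_month(sample).items():
        print(f"{airport} {month}: {rate:.0%} of approaches flagged non-stabilised")

The point of such a sketch is simply that the answer to "how safe are we?" comes out as a trend one can watch over time and after SOP changes, not as a single number.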

In my view, lomapaseo strikes a very good note in observing the concept of "best practices". One can do no more, but today in an environment of increasing "visibility" and self-regulation, one must do no less.

This kind of dialogue is unfamiliar to most operations people but shouldn't be. Education and then, as lomapaseo also states, "buy-in" by operations personnel are necessary but very difficult to come by. Most see this work as "data - deadly boring", but that is only for lack of understanding. Most safety departments are seen by the bureaucratically-ambitious as dead-ends for their careers and, frankly, from what I have seen, safety departments are equivalent to either corporate purgatory or the "senate". However, again from what I am seeing, because of "visibility", "corporate governance" issues, the stupendous liabilities inherent in not doing this work and the ethical issues (for me, this last is by far the largest issue), positive changes are happening. Turning a large corporation onto a slightly different course takes enormous patience, time, and effort by many, many people comprehending these issues and talking about them instead of ignoring "the elephant in the living room", to borrow a metaphor from other interventionist dialogues which have the same goals.

My suggestion would be to take what one can from an informal thread and migrate slowly towards the books and other literature which are readily available, then, perhaps, formalize the issues within one's organization (if such exist!) through a series of meetings. Even if the outcome merely confirms that one's operation is safe, the examination is worth it. Such a dialogue must be respectful but honest, frank and open. Changing world views is a very difficult and at times emotional challenge, but as this industry is taken further towards SMS, the "privatization of flight safety" may have a positive outcome. If not, and both the courage to act on data and the self-audit process are less than forthright, kicking tin is the alternative.

For my money, I would suggest first reading Diane Vaughan's book, (The Challenger Launch Decision) as well as Dekker's two excellent books. For a more succinct paper on Challenger, William Starbuck and Frances Milliken wrote, "Challenger; Fine-tuning the Odds Until Something Breaks" for the Journal of Management in 1988.

These processes don't hobble operations: they are a way of travelling.

"Non-stable approaches save fuel because they're fast and clean" - that illusion informs operations' thinking about what kind of approaches to tolerate. I have heard the justifications over and over, that, "the runway is long, so why are stable approaches which cost fuel, so important?" Believe it or not, I have heard management people genuinely ask this question in their desire to keep airplanes fast and clean as long as possible, despite both the data and the historical accident record.

Sometimes the data indicate that a serious incident, undetected by other processes, has occurred, and an expensive decision must be made to ground the airplane. The organization must trust the process it put into place to render such decisions reasonable, even though commercial priorities and pressures are high and senior managers are breathing down one's back about getting the airplane back in the air. Such "setbacks" are a matter of perception, which exists in the sense of immediacy that pervades airline life - but a healthy safety culture takes the "long view", which is, as has been observed, a "best practice".

This whole process is like putting on a new pair of glasses. The familiar things which we have seen (or rather, because they are so familiar, we haven't seen them!), are perceived in entirely new ways, which permit newly-conceived outcomes to be positively viewed and accepted. I think that is the greatest value of this discussion.

soem dood
8th May 2009, 20:02
"To all concerned, get hold of a copy of James Reasons book "Human Error". Its all in there."


Besides heartily endorsing Reason's book, whose lessons appear central to this flight, I also recommend Perrow's "Normal Accidents" as well as Dorner's "The Logic of Failure," since the human element is but one part of the complex high risk overall system to be considered. All three are on my shelf.

greywings
8th May 2009, 20:34
With the greatest respect I think we should now be focussing on Implementation rather than Reading Recommendations. It seems fairly obvious that we have had some well-informed opinions expressed here but the challenge is - and has been for all of my aviation career - getting those good ideas implemented.

If management are approached with long-winded academic arguments without a clearly-defined course of action, they will never buy into the proposal. They don't have time. Most of their day is spent putting out fires, and even though many of them may still be current, active pilots, you won't find one sitting at his / her desk with nothing to do but wait for the next good idea to come breezing through the door. As we all know, if the CEO / Accountable Manager doesn't buy in to the idea, it probably isn't going to happen or, if it does, then there may only be lip service paid to it.

In many ways we are our own worst enemies. Compare the number of instances of pilots taking strike action over safety matters with the ones supporting industrial ones. In my previous airline the best-attended general meetings of the pilots were ones to do with conditions of service. It wasn't worth considering putting technical items on the agenda because no-one would show up. We need to get our own colleagues to buy into these ideas as much as we need management to do so.

This thread has shown that there are some good minds out there with ideas that warrant serious attention. We should maybe,

1. Gather the ideas
2. Collate and edit
3. Circulate for discussion and comment
4. Rewrite as necessary
5. Develop a plan for implementation

There is much, much more, but that could be a start.

GW

PJ2
8th May 2009, 21:11
GW;

I certainly concur with your views on implementation and CEO/Executive Management buy-in - if that isn't there, nothing else is going to happen. Also agree on the observation that nobody has the time to read - and it is really obvious when trying to open a safety discussion with flight operations people - their eyes glaze over instantly.

But educating themselves is their responsibility and at some point the buck has to stop with those in charge. Reading is part of the education on the safety work necessary to run any airline.

All that said, you can lead a horse to water...

Re implementation, here's one way:

After a long period (five years) of what we expected would be automatic buy-in to data analysis, because of what was in the data (some not very pretty events) and because the need was so "obvious", there was still nothing - no interest, no engagement, no support other than the lip-service we're all familiar with. For some strange reason, while other aspects of the safety culture were excellent, nobody wanted to know what was in the data, but they kept collecting it. That was how we understood it and perceived it.

So, through a series of techniques, we began showing the pilots themselves what was in the data, which included some "interesting" incidents. The reported disbelief was really interesting: "that was one of us?" was a common reaction. Sometimes we contacted crews, but we never judged; we just presented and answered questions where we could from what was in the data. We never went public and never embarrassed or disrespected any crews or the organization. It took a long time.

It was only after we began doing that, that things began to slowly change. Our confidentiality processes were such that it was impossible to find out who the crew was in the data - it was all about "what" and never about "who". Our FOQA Agreement had absolutely clear mandates to which both the pilots' association and airline management were signatories. We stayed strictly apart from any industrial aspect and to everyone's credit, such an approach was never attempted.

The reporting culture is superb. It is confidential and risk-free under an excellent safety policy - these are part of building the tools that need to be in place. Instead of continuing to push rope, we went directly to the pilots and it seems to be working. That was one way to get something implemented. There are others.

lomapaseo
8th May 2009, 22:35
Greywings

I take your points, and appreciate your clarity of expression.



With the greatest respect I think we should now be focussing on Implementation rather than Reading Recommendations. It seems fairly obvious that we have had some well-informed opinions expressed here but the challenge is - and has been for all of my aviation career - getting those good ideas implemented

We need to move off the identification issue of who screwed up and onto the corrective action issues following an accident, else why bother even investigating accidents.

I'm not a fan of the idea that more regulation is good, basically because I don't trust that the regulator really has all the answers. The end user can more clearly see the problems, given enough vision. It's the lack of vision, or of willingness to open one's eyes, that is the problem.

I have been impressed with the magnitude of data and understanding that exist in totality in our industry and are readily available through the safety professionals and offices in our industry. Yes, I recognize that these same professionals are often busy putting out fires and thus justifying their day-to-day jobs.

But in my view there is nothing wrong in calling a group of these experts together to share "best practices" on issues that we all agree are important to how the public views our industry. Once this is done, then and only then should the regulator enter the arena, if only to take credit for overseeing such best practices.

I suspect that this thread may have drifted off the idea of Rumors & News, but I apologize: my initial intent was to stop people from viewing an accident thread du jour as one where they could most easily find fault with who/what caused the accident in the first place.

It's OK with me that the contributors take this thread to where they want.

greywings
9th May 2009, 01:45
Thank you for a truly interesting and stimulating discussion on an important issue. Although we may have strayed away slightly from the original thread (many thanks to Gibon2), the ground that we have covered has been relevant and productive. Clearly, amongst others, PJ and colleagues have made enormous progress with their safety programme(s) and are to be congratulated. Regrettably, others live and work within a culture that would seem foreign to those of us who enjoy enlightened and professional debate - and subsequent action.

I have spent many years working in cross-cultural environments and know that 'authority gradients' (for instance) play their part in degrading or, in some instances, enhancing crew performance. However, I think it is true to say that, at the end of the day, a pilot who truly loves what he / she does and strives for the highest professional standards will perform to a universally-acceptable standard. I had the great pleasure of serving IFALPA as a Regional Vice-President for a while and was occasionally appalled at the primitive standards that prevailed in some airlines. Frequently, and more often than not, their pilots sought help because they wanted to be amongst the best but lacked the resources to be so. Naturally, others, more fortunate than themselves, were usually happy to help.

Various situations in the past have demonstrated that we are far from being a 'happy band of brothers'; in fact, we are just another group of professionals with mortgages, family and other commitments. If we are to make any serious progress and have piloting accorded the respect that other professions have, we need to unite behind a banner of pride in what we do and care for those we serve. I am not sure where we began to lose our way, but I think that we have strayed too far from the reputation that we had as leaders rather than followers, and have lost the respect of those outside of our profession who tend to believe the numbskull reporting that depicts us as merely button-pushers.

We need to come together to demonstrate that being a pilot is not only tremendously rewarding personally but provides an invaluable service to the global community.

We need to become leaders again.

GW

alf5071h
9th May 2009, 23:14
FrequentSLF, thanks … particularly for the provocation ( ‘po’ (http://en.wikipedia.org/wiki/Po_(term)) – de Bono).

Re: “… standards of professionalism are set too high.” (#78)
No, I do not think so. Standards, when set, are aspects of professionalism, not professionalism itself, i.e. skill or knowledge.
My view of professionalism includes a never ending objective of self-improvement, and as long as individuals in the industry maintain this ethos we should be able to maintain a high level of professionalism.

However, therein is a significant problem: how to maintain these standards in a rapidly changing social environment – how to prevent professionalism from deteriorating, which I fear it is.
Pilots and other professionals are challenged by an evolving ‘culture’ of instant gratification, self centeredness, low respect, deficient self esteem, and under confidence (‘Beyond Feelings’ – Ruggiero). What is there to look forward to beyond the age of 40 when pilots can become Captains in their late 20s/early 30s?

Re: “How to address it. … can we provide good pilots in a shorter time”.
I doubt that training timescales can be reduced; risks (opportunities for error) increase during times of change, so seeking such an improvement might cause more problems.

IMHO current ‘post graduate’ training is not providing new pilots with the skills, or the other constituents of professionalism, to cope with the demands of a technologically changing industry. We fly reliable aircraft yet focus on failure, we require flexibility but operate in constrained airspace and ‘have to follow procedures’, there is increasing regulation, etc, etc. Thus, there are fewer opportunities to learn from failure, from minor errors, or particularly from senior pilots, as the majority of them are also children of the technological age (but not the real ‘oldies’).

Pilots acquire knowledge and basic flying skills during ab-initio training. However, the practical application – tacit knowledge, the know-how as opposed to know-what, is more difficult to acquire; ‘on-the-job-learning’.
This requires experience in situ, guidance from others, and self-reflection to facilitate learning. As above, the opportunities for this are reducing, Captains should coach the less experienced pilots (increasing workload), and we all need to debrief every flight – reflect on what has been learnt – self improvement, the reinforcement of our professional ethos. There are weaknesses in all of these areas.

Note the range of questions in this forum – more ‘how to do it’ than requests for pure technical knowledge.
Consider the ‘signal to noise ratio’ of forum posts – ‘signal’ is knowledgeable information, as opposed to the noise of supposition, myth, or bias. There is an increasing amount of noise, similar to the world-at-large, e.g. sensationalist media reporting or more regulation (verbiage); there are far fewer ‘signals’ of value.

Whatever problems the industry faces in this changing world, it is unlikely that they will diminish, thus the industry has to live with them – we have to adapt.
How can we adapt … how do we address it? Perhaps the other discussion in this thread is considering this, - how people have adapted or what is required to adapt.

Given that humans are very good at adapting, we will change; but will it be in time and without significant risk?

Centaurus
10th May 2009, 11:36
East is East and West is West and never the twain shall meet.


The culture of the West (Europe and the Americas) will always be very different from that of the East (Asia). (Twain means "two.") This saying is part of the refrain of "The Ballad of East and West", a poem by Rudyard Kipling.


In terms of flight safety, culture has great significance. I recall a Japan Air Lines (?) DC8 that crashed into Tokyo Bay, piloted by a captain intent on suicide. The last words on the CVR were those of his first officer pleading with the captain not to "kill us all". Culture prevented the F/O from taking positive action to prevent the tragedy. There are countless examples of aircraft accidents in which ethnic culture played a deadly part. As they say, "Watch this space" and wait for the next culture-driven accident.

alf5071h
12th May 2009, 23:01
“East is East and West is West”
This might be just an ‘artistic’ cultural view, whereas there are many more components in an ‘aviation’ culture – things that affect the way we behave.
“Culture” – what we do when no one is looking.

Professionalism and Culture are linked; perhaps professionalism is an outward indication of culture. I associate professionalism with airmanship, thus good airmanship might reflect aspects of a positive culture, but exhibiting good airmanship is not necessarily related to national culture.

“East is East and West is West”, could be equated to “those who have encountered a serious error, and those who will”.
The different views of error amongst the professionals might originate from how individuals consider the outcome of an encounter. There are those who ask ‘why’, seeking a deeper understanding of the human involvement and the situation, and thus learn from mistakes. Alternatively, some pilots will dismiss the encounter – ‘it was just me’, they consider that it won’t happen again, but without understanding why; this view leads towards an attitude of invulnerability.

Some academic views indicate similar divisions. Error either originates from the situation, with the human a pawn in the proceedings, or error commences with the individual – who may generate the situation – and human behaviour or limitations dictate the outcome.
At a practical level these views might be one and the same, or that individually, both contribute to the process of error.

The differences amongst pilots may just be their choice of view – situation or individual – or particularly which end of the error tunnel they look down. A backward look encourages hindsight bias, whereas attempting to see the situation as if one were the person in the error path might open many more alternatives.
This is similar to reactive and proactive views of safety. Neither should dominate, we require both to be successful. As the industry improves safety (reactive bias), it is approaching an asymptotic level - “The paradoxes of almost totally safe transportation systems” (www.ida.liu.se/~eriho/SSCR/images/Amalberti%20_(2001).pdf). This paper argues that as total safety is approached, we need to bias more towards proactive safety.

The differences in pilot attitude to error may just reflect these different views of safety or the emphasis placed on them – reactive, blaming, fix the problems of this accident; or proactive, understanding, prepare defenses for a range of accident situations.

The next culture driven accident is just as likely to be dominated by poor corporate culture as it is by poor professionalism, it depends which one dominates; then again there are other aspects of a dominant culture. (http://uk.geocities.com/[email protected]/DOMINANTANDMINORITYCULTUREOBLIGATIONS.pdf)

Bergerie1
13th May 2009, 07:08
PJ2, Greywings, et al,

This has been one of the most interesting and professional threads I have read. In another life I was both a flight instructor/IRE/TRE and a pilot manager, and what you have had to say about safety management and all its surrounding processes is excellent. Thank you for such a stimulating discussion.

FrequentSLF
13th May 2009, 19:17
alf5071h

sorry for the late reply...I was on duty :}

Given that humans are very good at adapting, we will change; but will it be in time and without significant risk?

Well....the title of the thread says "Err is human"...in Latin the complete sentence is "Errare humanum est, perseverare diabolicum"...
I believe the industry is working hard to forget and delete the last part of that sentence. However, as you said, the working environment is changing so fast that we should always consider the complete sentence.

“East is East and West is West”

Being an Italian and living in Malaysia for 17 years I cannot agree more with your post!

Cheers

alf5071h
13th May 2009, 20:19
For others who, like me, may not have appreciated the full quotation:
Errare humanum est, perseverare diabolicum - 'to err is human, but to persist is diabolical.' - Seneca.

A more recent version might be:
“Somebody does somethin' stupid, that's human. They don't stop when they see it's wrong, that's a fool.” - Elvis Presley.

I won’t persist with the discussion for fear that I be associated with the quote …

Kalistan
14th May 2009, 01:17
To cut through all the bs... it's blatant racism, pure and simple. You can articulate and spin all you want: both the Turkish crew and the EK crew made crucial mistakes, and only the element of luck prevented a disaster at Tullamarine.

greywings
14th May 2009, 03:20
Bergerie1, thank you for your comments. I have been fortunate to have spent my aviation career in the company of like-minded professionals of all colour, creed and ethnicity. Our 'differences' were ignored when we came together as aviators, proud of our profession and of our dedication to serving the travelling public to the best of our ability. Although I sympathise with the comments made by Kalistan, I would recommend a thorough re-reading of the previous posts, as I think it will become obvious that, apart from one recent post, the comments were 'racism-neutral'.

It is quite correct to say that there are close correlations between the Emirates and Turkish airline accidents. Emirates were lucky and Turkish were not. That really should not be the issue. The issue is why professional aviators make mistakes of such proportion. Is there a cultural factor that we have ignored? Is fatigue more insidious than we realise? Or are accidents merely an example of the randomness of the behaviour of complicated technical / human systems?

In my former life I had the great pleasure of working with some of the industry's best minds: Japanese, Australian, British, American and, in more recent years, Chinese, as well as others. No one culture has a monopoly on common sense and airmanship. True, we often view the world from different perspectives, but we all strive for the same thing.

If we persist in bad-mouthing others because they come from a different part of the world, we, as an industry, will lose our credibility and the ability to influence - positively - the way it will develop in the future.

Best wishes,

GW

GemDeveloper
17th May 2009, 19:34
alf5071h

This is similar to reactive and proactive views of safety. Neither should dominate, we require both to be successful. As the industry improves safety (reactive bias), it is approaching an asymptotic level - “The paradoxes of almost totally safe transportation systems” (http://www.ida.liu.se/~eriho/SSCR/images/Amalberti%20_(2001).pdf). This paper argues that as total safety is approached, we need to bias more towards proactive safety.


Some years ago, I was in a conversation with a relative who was running a fairly large and respected airline that was about to introduce the 767. His background included flying a large number of multiengine types, starting with underpowered beasties flown under trying circumstances in the years immediately before 1945.

His concern was that the approach and landing on a 767 was so smooth and uneventful, compared with the need to wrestle the airline's current type to the ground, particularly in any sort of crosswind, that in due course there would be an accident: the PF would not be 'on the edge of his seat' and would thus be insufficiently in the loop to react in a timely fashion.

An observation of possible circumstances that I suspect require a mixture of both reactive and proactive safety to counter, probably biased towards the proactive; and I doubt they have got any simpler to manage with the passage of time.

SILLY GOOSE
21st May 2009, 13:05
In addition to 757 driver's comments on the authoritative nature of the culture, there is also the military background of some of their captains; it is not in their nature to apply effective CRM in the cockpit.

This is labelled as one of several hazards in aviation. It was very visible in the several Korean Air accidents in the past, as they had recruited from the air force, with the same behaviour, until it was ironed out by intense organisational changes.

PJ2
26th May 2009, 18:21
Greywings:
If we persist in bad-mouthing others because they come from a different part of the world, we, as an industry, will lose our credibility and the ability to influence - positively - the way it will develop in the future.

Very well and appropriately stated. Cheers to you sir.

Bergerie1;

Indeed, to echo Greywings, thank you - it's gratifying that such input is valuable to some. I have been writing about these issues for about a dozen years, from within a major carrier with whom I worked as well as on various aviation forums. One never expects an airline's management to hearken to an anonymous forum for "advice"; management's job is, a) to pluck the most feathers from the goose with minimum hissing (maximum productivity, minimum money/working conditions), and, b) to make a profit at all cost (I know such a comment may be accused of being hyperbole in most cases, but not in some that I've witnessed internally), so they won't listen to this kind of input even if they read it. But at least this kind of rational and non-agenda'd discussion helps others who may not be aware that their flight operations departments are concerned not with safety but with profit, (forced into the position by deregulatory pressures and shareholder demands for "instant results"). With few exceptions, most safety departments are seen as corporate bureaucratic backwaters, viewed derisively as career dead-ends with little glamour and even less benefit (due to the ignorant "millstone around profit's neck" impression most have of the work).

I would challenge any, no, every airline CEO to state what she or he knows about flight safety and specifically about their own airline's safety programs and processes, not, obviously, because they need to run the program or even supervise it but because they absolutely must support it with comprehension so that they can lead, and, where needed, defend expenditures which are viewed as unnecessary by the less-informed, and to remind all those underneath of the business they are all engaged in - the fundamental insurance policy of aviation. Frankly however, in my experience they know little to nothing about the processes and don't know risk from a balance sheet - many don't even know the nature of the business they're in, so far away are they from what "aviation is" and what it requires. For some, it's simply another cash-cow with extorted largesse, (pay me well or I'll leave you).

The entire story of deregulation, (both in the financial sector and, for decades now, in the aviation sector) has been about the gradual removal of "layers of cheese", so to speak, under the illusion of the "normalization of deviance" - the notion that "expensive" layers of safety defence can be removed without resulting decreases in safety (because knowledge of how the present excellent record got that way does not exist within airline managements), but with a commensurate increase in profit, a very "desirable result" so far as bottom-line thinking goes. The stupidity and commensurate risk of such an approach is, many believe, now showing itself. There are too many polished-but-ignorant MBAs and marketing experts and not enough people who know the smell of kerosene. Both are necessary, but it is time to re-balance and re-dedicate a knowledge of aviation and not just of spreadsheets and ASMs etc.

The psychology of "cost vs benefit" is what they know, and therefore they can neither make the connection to complex safety processes nor comprehend why pilot training, reasonable wages, working conditions which attend to fatigue issues, sick-pay programs which do not financially punish pilots for booking off when they ought to, and so on, should be both understood and led by top management so that such programs remain effective "layers of cheese".

With these "glasses" on, we can see quite easily why aviation has gone the way it has and why the fatal accident rate is going to rise because airlines have been dismantling or even ignoring these trends out of a sense of satisfaction with the present level of safety and risk. The safety conferences I have attended have long said otherwise and have cautioned airlines against this but to no avial.

Instead, as always in aviation, managements must learn the hard way, and the preventative programs in place, while effective where installed and actually employed by the carrier, remain secondary alternatives to kicking tin and managing the publicity disaster that a fatal accident always brings to an airline.

The thread on the Colgan accident is especially painful to read because the accident was both predictable and therefore preventable - not before the actual flight but years ago when airline managements began desecrating the career of the professional pilot on their way to satisfying the demands of a profit-driven enterprise.

The chickens are coming home to roost, right at the point in aviation's history where a good safety initiative is being implemented very poorly, with few resources, little regulatory oversight and no comprehension by anyone: SMS. As it is being presently implemented, it is nothing less than the de-regulation of flight safety. The record is already established and, if not addressed the trend will continue: Airlines have already tried to get away with substandard operations beyond the eye of the regulator and, where commercial pressures are high and visibility, (the chance of being caught) is low, the choice may invariably be in favour of commercial priorities.

These are large patterns which must be examined from "afar" - they cannot be seen in the individual flights or pilots or managements. Yet no one is examining the industry thus.

While a wholesale examination and change is neither possible nor necessary because the fundamentals are sound, (it is the trends, not the foundations, that are disconcerting), certainly an examination of the factors set out in this thread is in order.

As an aside, or more bluntly a diversion, (it has been discussed here before), there is very little difference between what has been happening to aviation and what has happened (since the early '70s) to the U.S. economy, which resulted in its own "fatal accident" last October thanks to massive de-regulatory initiatives. Though the outcomes look different, the principles are the same - there are important lessons here for both.

FoolsGold
27th May 2009, 12:46
>"Layers of cheese"...
Unusual analogy but it does occur to me that it just may be an apt one! That final layer of cheese has no warning on it. It appears to be just another way for the MBA bean-counter to slice a few cents from some cost-center somewhere.

When a craps dealer doesn't know the proper payout he can simply drop chips onto the layout until the player cracks a smile and then take back one chip. The feedback from the player is immediate and there is but one factor involved. With a bean-counter cutting corners it may be a long time before there is a sufficient chain of events to provide any feedback. And then it's too late to take back that last layer of cheese!

I recall reading decades ago about a small plane that had gone through nine "cheapie annuals" until the tenth annual inspection found an altimeter that had been recalled by the manufacturer ten years prior to the inspection. You get what you pay for.

greywings
28th May 2009, 00:21
PeeJay et al,

What an interesting and thought-provoking thread this is turning out to be. Hopefully, we can continue to attract more contributors and thus more good ideas.

I would like to take up on a couple of points before digressing slightly.

'Management' comes in for a lot of stick, and much of it is justified. PJ alluded to a typical CEO having little or no understanding of the intricacies of an SMS and called into question the validity of MBAs' standing in our industry. Both very valid points. However, airline managers - in many cases - are not carefully nurtured as they are in many other industries. Typically, one day a chap is a competent line pilot and the next he / she is sitting in an office with their name on the door, wondering what the heck they are supposed to be doing. Consequently, ill-briefed and ill-prepared, they take the 'safe course' and don't make decisions, or they make decisions that don't agree with the 'management party line' and get a severe 'career interview'. Either way, they may well become ineffective managers, and not necessarily due to latent incompetence. Either way the industry suffers.

Many CEOs (Accountable Managers) have little or no understanding of the minutiae of the line operation. I replaced, as a JAA Accountable Manager, someone who had absolutely no idea of what his responsibilities were. Although I was surprised, I quickly realised this is not an unusual situation.

Summary of the above: many airline managers may be well-intentioned but desperately ill-prepared for the job.

We cannot avoid the 'commercial expediency' that pervades our industry, any more than any other. The mindless, and unsustainable, pursuit of constantly improved quarterly returns (ie; profits) puts CEO's in untenable positions and the stress is passed down the line to all management and supervisory positions, and results in 'extraneous' expenses, such as training, safety, etc, being pared back. The result is an operation that ostensibly meets all regulatory requirements but has nothing in reserve. Assumptions are made that all is well until something goes wrong. Concerns expressed by the pilot group are often seen as a nuisance that can be ignored. We all keep our fingers crossed that there won't be an accident or even a serious incident though often we feel as though we are living on borrowed time.

From time to time the industry becomes heavily involved with a trend that warrants attention. For many years this was Controlled Flight Into Terrain (CFIT); now it is Loss of Control In-flight (LOC-I). LOC-I is now the largest single cause of accidents, having replaced CFIT by a significant margin. I have recently been involved in some interesting exchanges on the subject and offer my most recent thoughts for consideration and comment. They are presented slightly tongue-in-cheek, but I think you will get the gist of what I am trying to say.

"Thanks for this. Funnily enough I have just been reading the article in Flight International on the proposed (expected?) training for 'upset flight', LOC, LOC-I, and all the other myriad definitions it has. I am also wading through a document on the same subject. I am sure it is all well-intentioned, but there is very little being said about what, to my mind, is the principal cause of this situation - overreliance on automation to the exclusion of handling practice.

Loss of control is not like Swine Fever. It hasn't suddenly mysteriously blossomed somewhere then spread like wildfire through the pilot ranks. It has, instead, insidiously grown to be the major factor in aircraft accidents due to diminishing professional standards amongst pilots.

Do I mean that we have all suddenly become a bunch of glue-sniffing, soap-opera watching, layabouts more interested in a perceived glamorous lifestyle 'down route' than serious attention to what is an extremely demanding - though rewarding - profession?

No, but what I do mean is that we have allowed ourselves to be led by the nose by accountants and aircraft manufacturers to the extent that we no longer fly the aircraft: we have delegated what used to be the fun bit to the automatics.

This didn't start yesterday. It started with the Airbus and 'Next Gen' Boeings with their marketing emphasis on 'ease of operation' with little or no human intervention.

("Frankly, Monsieur Airline CEO, we could teach ze monkey to fly zis plane").

As the 'older generation' pilots, with their years of handling lesser-technology aircraft, chose when and whether to engage the automatics with little or no fear of the consequences, others, with minimum experience - and, in some cases, rank unsuitability for the job - were coerced by their managers to make maximum use of the magenta line, LNAV/VNAV, coupled approaches, etc, etc.

("You see, Monsieur, all ze peelot 'as to do is taxi in and taxi out, and we already 'ave ze fix for zat in the future").

Eventually, with rapid expansion in the industry, the hoary old guy in the left seat was often replaced with a spotty-faced youth with little experience but a fascination with wiggly amps and computers. He did really well until the automatics played up and he and his sidekick had to figure out - quickly - what all the waggly bits on the wing and tail were for, and how they reacted when the stick was moved.

In my humble opinion the answer does not merely lie with investing squillions of dollars on more technology and more simulator time and more expense and more procedures to learn, but enforcing - and reinforcing - training of the basics to ensure familiarity with the characteristics of the aircraft, THROUGHOUT THE FLIGHT ENVELOPE.

"Hey, Guys, you can actually fly the aircraft MANUALLY. I know it is a strange concept these days, but it still works the way the Wright Brothers envisioned. You don't automatically die if the autopilot disengages in the cruise. It is fun to hand fly the aircraft in the climb, as well as the descent, and there is nothing, absolutely nothing, quite as satisfying as doing a visual approach on a nice sunny day"

I could go on for hours on this subject, and I am about to put pen to paper. Although I agree that more and better simulation can help, I also think we could do more to avoid some of the bone-headed decisions that some of our brethren make / made / will make in the future".

GW

stilton
28th May 2009, 01:00
Well, I hand fly every approach, autothrottles off, disconnecting everything by 1500' at the latest.


This is unless weather demands a monitored approach or a full autoland, or if I feel excessively fatigued.


Several reasons: the currency, 'keeping your hand in', is absolutely vital; I find it simpler to hand fly; but most of all I enjoy it enormously. The challenge, the satisfaction of doing a good job, is for most of us, I think, the reason why we became pilots in the first place.


There is far too much dependence on Automation out there, I find the reasons behind the recent THY crash to be incomprehensible.

Cacophonix
28th May 2009, 01:08
I have lurked around the good points on this interesting and thought-provoking thread and suddenly realised that the whole thing was summed up by Cecil Lewis (RIP) in his book "Sagittarius Rising". I refer to his training of the Chinese and the many issues he encountered when trying to put together an airline there post-1918, before he became disillusioned and returned to set up the BBC.

I'd put money on any Chinese pilot today. Time and tide. I worry about the culture here these days. Where is "here"? Britain unfortunately.

We all share the same human traits. No nation, race, people, is necessarily culturally better equipped to produce pilots. It just takes time, training and patience.:ok:

lomapaseo
28th May 2009, 01:20
Well as long as the floor is still open for discussion, I'll weigh in with some different perspectives.

It's not useful to highlight the airline CEO or even the bean counters when discussing safety. They have their jobs to do, and it isn't to lead or even to promote safety. Their best role is a supporting role. The promotion and leading of safety has to be done by people who understand the differences between risk and safety.

An organization needs the equivalent of an ombudsman who tracks items that identify risk, unsafe practices and has a process that brings to bear corrective actions that manages identified risk items to a level of safety commensurate with the rest of the industry. Sure it's nice to be risk free or perfectly safe (in your own mind) but you'll only dream of this. But woe to the organization that allows itself to be perceived as less safe than others.

So look around you: who in your organization is the person responsible for identifying risky practices and promoting actions to correct these? The bigger the organization the bigger the staff, but it sure ain't the CEO or a bean-counter staff. I've come to believe that it really is some of you on this board. But I don't like to hear about impediments in this process that have us pointing fingers away from ourselves with the idea that it's somebody else's job and they just don't understand what they are doing.

dbee
28th May 2009, 16:38
Sorry, but the THY accident surely reflects a lack of awareness or monitoring; otherwise, how did the autothrottle shut off the power, resulting in a speed loss of over 40kt that not one of the 4 on the flight deck noticed?

Maybe 'what is it doing now?' is correct; a category of forgetfulness that is never repeated, as all the flight deck crew were killed. A sad day, but it would have been sadder still if there were no survivors. dbee:uhoh:

Oakape
28th May 2009, 17:01
I see your point lomapaseo, but I can't help thinking that this is 'ideal world' stuff & doesn't have a lot of bearing on what actually happens out there in the real world.

The people who understand the differences between risk & safety & the "ombudsman" you mentioned are already there as the safety manager & his/her department, supported by line pilot reporting. The safety department is there because the various regulatory authorities mandate it. The problem, as has been mentioned earlier, is that these people are generally kept out of the inner circle. Flight ops management generally do little to support them & the non-flying types don't understand why they are required anyway. Coupled with the fact that they generally have no power to effect change & rely on the goodwill of the decision makers to get any recommendations implemented, they are often only there to pay lip service to the regulatory requirement. The other problem is that many reports from the line are seen by senior management as a grab by line pilots for better conditions under the guise of safety.

The only way to have a really effective safety department is to have total buy-in at the most senior level of executive management. If the CEO has a safety manager he/she totally trusts & has them reporting directly to him/her, safety recommendations stand a chance of being implemented. But because most CEOs these days are not the aviation people of years gone by, they do not understand the importance of this. CEOs these days jump from one position to another & from one industry to another quite regularly. Their skills are seen as readily transferable & not industry specific now. This generally works out OK in most industries, but the quite unique nature of the aviation business seems to be no longer understood.

Good or bad pilots, training, experience all mean nothing to the modern CEO or even senior management. The only requirement in their minds is a licensed pilot. Modern management has been influenced by the manufacturers' line that "any pilot can fly this aeroplane" &, due to their lack of aviation background, they believe it. Greywings summed it up perfectly!

The problem is that everything costs. Finding good pilots & getting rid of bad ones costs (before you all get stuck into me, I'm talking about the very few who have no business on a flight deck), training costs & even experience costs. And of course, safety costs. When you have a bunch of accountants in middle management, all with the ear of a cost aware CEO who is trying to keep the shareholders happy, cost is a dirty word! Someone once said that accountants know the cost of everything & the value of nothing. Perhaps they were right.

So the flight ops departments have to come up with ways to enable any pilot to fly their aircraft & still keep cost under control. The solution is often to have what they consider strong SOPs & rigid adherence to them. An often inflexible automation policy also helps. This works fine while all is going to plan, but often falls way short when things start going wrong. It also has the subtle tendency to de-skill pilots in many areas, including manipulation & decision making. I think that we are starting to see the long-term effects of this with some of the recent incidents & accidents. When a very senior manager of a flight ops department in a large ME carrier states to a group of pilots in a meeting, "I do not believe that low morale is in any way related to flight safety", you can see how far these people have bought into the SOP/automation myth!

I believe that this all started with deregulation & low cost carriers, but I am sure that many will have another take on the reasons for the problems the industry faces today. However, we would all do well to listen to Greywings & PJ2. They certainly know what they are talking about in my humble opinion.

PJ2
28th May 2009, 18:56
GW;
Summary of the above: many airline managers may be well-intentioned but desperately ill-prepared for the job.
Precisely.

With a few notable exceptions, (native ability/intelligence/experience) it was the same where I worked as well. Those that went into management were often very junior, intent on improving their working conditions by controlling what their "juniority" could not, or getting a promotion (bigger airplane or a left-seat job which their seniority would not permit outside line pilot seniority). That's not a cynical opinion nor is it intended to slam management personnel. That's what happened, routinely - it's how middle management was (and is) usually staffed. The experienced ones flew the line and kept their head down because they knew the work was thankless with little support from above, the politics were distracting and sometimes risky to one's advancement, and the monetary benefits minimal.

All this said, your point is exactly on the mark - by virtue of the absence of formal training, mentoring or internship programs for such a purpose, most airline management below the senior and executive levels are ill-prepared for the work and do indeed make very conservative decisions, (I have written about this problem before in relation to SMS and the need for courageous but expensive safety decisions and will not repeat comments here), but are, in almost every case, extremely well-intentioned.

Again, the Challenger and Columbia accidents and the dynamics at NASA in both cases are highly instructive, and I would commend anyone interested in or fascinated by these dynamics to find and read the following texts; they give a far better treatment than we can offer here of a complex, often-ignored subject - the factors, characteristics and issues being discussed in this thread. The books are:

The Challenger Launch Decision, Diane Vaughan, Chicago Press;
Organization at the Limit, Moshe Farjoun, William Starbuck, Blackwell

As mentioned before, Sidney Dekker's books are of immense value in putting flesh to the points raised here. Just Culture and The Field Guide to Understanding Human Factors are both well worth the investment in careful reading.

In particular GW, your point regarding the well-intentioned manager is made a number of times in the first two works cited above, the point being of course, that "amoral calculation" (Vaughan, 1996) is not a factor in accidents and instead "the best of intentions" is cited along with the valuable notion, the "normalization of deviance", a human factor with which we are all familiar in our daily lives but which, because we are so accustomed to such justifications for change, is invisible in high-risk environments, a factor which is exacerbated by the aforementioned lack of specialized, focused training for flight operations management personnel.

lomapaseo;
Re, It's not useful to highlight the airline CEO or even the bean counters when discussing safety. They have their jobs to do, and it isn't to lead or even to promote safety. Their best role is a supporting role. The promotion and leading of safety has to be done by people who understand the differences between risk and safety.

Hm, I think you will find that most safety literature will disagree with your assessment that the role of executive management is not to lead and/or promote safety within the organization they are responsible for. To be sure, (and I have been careful to qualify this thought a number of times), it is not necessary that they be safety specialists or even knowledgeable about the intricacies of flight safety work; I even agree that their main responsibility is indeed to ensure the commercial health of the enterprise - obviously safety is immaterial to a grounded, bankrupt carrier.

But if the organization's managers/employees are receiving the message straight from the top that "safety doesn't count when commercial or scheduling pressures are high", that employee productivity is always the overriding concern, or that maintenance "delays" keep airplanes in the hangar for "too long" (I'm exaggerating, I know - such "messages" are almost always far more subtle, but again, I'm trying to keep posts to a reasonable length and not succeeding very well), you may be certain that those messages will be acted upon by well-intentioned managers, not because they are apple-polishing but because "that is the way things are done here".

If, however, an organization has a healthy safety culture, not "led" but "engendered" within the organization from the very top, then managers know that they will be supported when a critical SMS-driven decision runs counter to the organization's commercial interests and priorities. One thing is a bureaucratic and psychological certainty within the social network and structure of an organization: an employee will invariably listen to the message delivered by his/her boss rather than to one delivered by a trainer or safety specialist.

That is what is meant by CEOs/beancounters etc "understanding safety". If an executive only knows, say, marketing, then all problems and solutions are seen within a marketing "discourse", and the responses may be wholly inappropriate in terms of support for sufficient resources. The role of the CEO/President etc cannot be overstated, but it should not be confused with the need for "subject-matter expertise". One can lead, motivate and support quite effectively even if one is not an expert in one field or another. In fact, that is the very definition of "manager". This is what is meant by holding the CEO accountable for leadership in safety.

I would like to address your second point,
An organization needs the equivalent of an ombudsman who tracks items that identify risk, unsafe practices and has a process that brings to bear corrective actions that manages identified risk items to a level of safety commensurate with the rest of the industry. Sure it's nice to be risk free or perfectly safe (in your own mind) but you'll only dream of this. But woe to the organization that allows itself to be perceived as less safe than others.

So look around you: who in your organization is the person responsible for identifying risky practices and promoting actions to correct these? The bigger the organization the bigger the staff, but it sure ain't the CEO or a bean-counter staff. I've come to believe that it really is some of you on this board. But I don't like to hear about impediments in this process that have us pointing fingers away from ourselves with the idea that it's somebody else's job and they just don't understand what they are doing.
If I might respectfully offer, the "ombudsman" you describe is the organization's flight safety department and the tracking and risk assessment activities are the various flight safety programs created, put in place and properly, effectively resourced and then actually employed and otherwise hearkened to by the organization's flight operations department. The establishing of such programs and the appropriate resourcing of same is not the responsibility of safety people - it is the CEO's/beancounters' responsibility to ensure such programs are created and sufficiently resourced. That has been my point all along, perhaps poorly expressed at times!

But examination of the point you make regarding how such an "ombudsman" would work within an organization, in terms of authority and independence of voice, must be taken further.

The assumption is that the safety people in place to assess risk and trends will actually be listened to, that their input and that of their safety programs will be taken seriously, and that their authority to interdict in a number of established and accepted ways will be respected. This is a somewhat idealistic expectation and does not always obtain. The notions of "box-ticking" and "pushing rope" express real manager and employee frustrations with the absence of what would seem an obvious involvement. Such engagement simply cannot be taken as given. Both the Challenger and Columbia accident reports indicate this, as do the more enlightened and broadly-conceived reports on accidents, including Moshansky's report on the Air Ontario Dryden accident. Ignoring flight data was cited as a causal factor at, of all carriers, QANTAS, in their overrun accident at Bangkok. I have witnessed flight data which indicated a very serious airframe exceedance sent to all relevant departments, including operations and maintenance, but which was summarily ignored in favour of dispatching the aircraft. I can assure you, without details, that we were not silent, but it took a significant intervention over a relatively long period of time to obtain appropriate action, though all after the fact. So there is not an automatic relationship between any "ombudsman", to use your term, and the operations people, but the appearance and subsequent illusion of one may lead one to believe it so, if not examined for what it is.

In fact, under a de-regulated SMS environment in Canada, the US and elsewhere, where SMS documentation and accreditation (IOSA, for example) are perhaps viewed by CEOs/beancounters as sufficiently acceptable measures and even sought after (the accreditation game), and may be imbued with greater importance than actually seeing/knowing what is going on within their organization, it is not unreasonable to observe, given a host of recent accidents the nature of which has not been seen before, at least in such frequency, that we have a safety system at significant risk of further deterioration.

In pointing to pilots as responsible entities, I think you re-state a valuable factor which requires highlighting: that it is indeed "us", as well, who must maintain our guard. This is a systems approach, not a component approach. While management must always lead (unions, employees, even the regulator, do not and cannot ultimately lead - effective leadership for a safety culture must always emanate straight from the top), employees and their representatives are not bystanders but integral contributors and, in many cases where priorities are seen to be imbalanced, responsible for intervention, whether singular or structural. Flight crews are at the coal face and are in the best position to provide feedback. Pilot unions have a long history of flight safety involvement but, just as companies will place commercial priorities higher, unions often place industrial matters above the less sexy, "pedantic" safety agenda. While pilots and their representatives play a significant role in regulatory reform and in aircraft and procedural design, procuring volunteers or arranging displacements from flying so that such work can be done is often problematic in terms of resources. So I think your point in this is well taken - we do indeed have a large and unavoidable responsibility.

In the end, however, the metaphor of "pushing rope" applies. If the CEO doesn't want it, or doesn't make it hurt when a tolerance for compromising flight safety becomes established within the organization, there is little that pilots, unions, or even the regulator can do.

kind regards as always,

Oakape;
But because most CEOs these days are not the aviation people of years gone by, they do not understand the importance of this. CEOs these days jump from one position to another & from one industry to another quite regularly. Their skills are seen as readily transferable & not industry-specific now. This generally works out OK in most industries, but the quite unique nature of the aviation business no longer seems to be understood.

I have mentioned this here before, but I heard it straight from a very senior Boeing manager involved in flight safety work that, in his experience, today's airline CEOs know little to nothing about flight safety, or even about the engineering/technical aspects of aviation or aircraft purchasing, and thus are very difficult to communicate with concerning the matters raised in this thread - all of which have serious relevance for the organization of which one is CEO. It was, for me, a disappointing confirmation of my own experience where I once worked and is, in my view, together with other equally serious issues, near the heart of the presently-unfolding problems being discussed here.

PJ2

lomapaseo
28th May 2009, 21:34
I guess, PJ2, you will always out-wordsmith me in your replies. Yes, there is much that we can agree upon, but of course, in the end, what is going to be done about our frustrations?

You continue to raise the specter of the ignorant CEO being a block.

I don't see it that way.

It's the job of the people within the organization charged with safety (assessments and corrective actions) to perform the task, while at the same time being allowed to do so. Yes, I have seen ignorance from above, but an awakening driven by the safety professionals within the organization is what is needed, and it almost always works.

You even mentioned a highly placed safety executive from a manufacturer agreeing with you about the shortcomings of airline CEOs - amen to that. But I will add that, in such cases where safety standards were not being met, these same manufacturers' safety organizations have met with the airlines of concern and brought about immediate recognition and correction of the problems. I guess it was a shame that the airlines didn't see the writing on the wall on their own, but in the end they did recognize that they indeed had a problem when a safety expert laid the data in front of them.

So I'm not willing to throw this problem over the fence towards a CEO or whomever. It truly is our problem to communicate effectively with anybody standing in the way of safety as measured against today's "best practices".

PJ2
29th May 2009, 00:52
lomapaseo;

I never thought of it either as "wordsmithing" or a competition in who expresses ideas better. It is just the way I write. :)

but an awakening from the safety professional within the organization is what is needed and almost always works.
Most definitely not, in my experience. We presented very serious incidents, and one or two near-accidents, to senior management and, in one case, to the CEO and the full executive, and it made no difference whatsoever. They simply did not "see", nor did they comprehend, what the airplane had just done in any of the cases we presented, and no one from Ops spoke up in support of the program. It was extremely disheartening and, in my view, foreboding, but there it was.

I am afraid your expectations regarding the ability of earnest safety people to do this work alone "in spite of" an unsupportive or non-engaged CEO are very much in error. The attitude expressed was, we're doing fine thank you. The data said otherwise but went nowhere.

Yes, we have a common problem and it is why I continue to write about it here and elsewhere so that others who experience this kind of management diffidence and over-confidence can know that they are not crazy but that it happens elsewhere in very large organizations. The issue is indeed driven from the top down and in many organizations only changes after an accident. It is this kind of disastrous intervention which is clearly to be avoided.

I don't wish to concentrate on narrow circumstances other than to illustrate what I believe others are also up against, and why. One way or another, aviation provides its lessons. The CEO is, prima facie, the leader and is accountable to the airline's passengers, employees and shareholders. No flight safety group can function without that leadership; otherwise it would be, at best, a sham and, at worst, a legal liability. I am afraid we'll have to agree to disagree and permit the thread, hopefully, to take the broader course.

best,
PJ2

lomapaseo
29th May 2009, 02:35
I am afraid your expectations regarding the ability of earnest safety people to do this work alone "in spite of" an unsupportive or non-engaged CEO are very much in error. The attitude expressed was, we're doing fine thank you. The data said otherwise but went nowhere.


My experiences do not match your own.

Perhaps it lies in communication.

I've never had a problem in painting a bullseye around the safety problem at hand (data, data, data; history; best practices; etc.). Once the bullseye has been illuminated, everybody in the room does their best to get out of its aim. If they can't counter it with data and illustrative arguments, then they at least become supportive of somebody who can make it go away.

One thing becomes clear early on... as the presenter of the data, I am not going away, so the problem remains to be solved and not swept under a rug.

You are correct that the discussion we are having here is not between you and me. It is a discussion about where the problem lies and who will fix it. I don't care for those who throw it away as not their problem, as many of the posters tend to do by fixing blame after an accident.

So yes, the discussion may continue with others as well. :)

PJ2
29th May 2009, 05:32
You are correct that the discussion we are having here is not between you and me. It is a discussion about where the problem lies and who will fix it. I don't care for those who throw it away as not their problem, as many of the posters tend to do by fixing blame after an accident.
Touché. Now we will see where it will go.