PPRuNe Forums (https://www.pprune.org/)
-   The Pacific: General Aviation & Questions (https://www.pprune.org/pacific-general-aviation-questions-91/)
-   -   Norfolk Island Ditching ATSB Report - ? (https://www.pprune.org/pacific-general-aviation-questions/468378-norfolk-island-ditching-atsb-report.html)

blackhand 26th Sep 2012 00:49


The PIC is merely the final enabler. He pulls the trigger of a weapon that has been primed, loaded and cocked by others. And we sit here and sanctimoniously expect him to be the guardian of everyone else's mistakes.

This is your opinion and is respected as such; others on here who are ATPLs see it completely differently. You seem to be enthralled with James Reason's model. This model is designed to take the responsibility away from the end user and put it on the corporate entity.

The crew (CRM) elected to fly the aircraft with minimum fuel; the risks far outweighed the benefits.

Brian, I read your reference; its conclusions are about the same as the ATSB findings for the Norfolk ditching. What is your point?

Lester Burnham 26th Sep 2012 00:58

Brian;


There was information published (forget where) that had the tanks been full the aircraft would have been stuck at a lower altitude, which would have reduced the range even further.
Wrong; there was speculation published about this on the Plane Talking blog that isn't supported by the facts. What is the SGR (specific ground range) for a Westwind at LRC at non-RVSM levels versus normal cruise at RVSM levels?

There is evidence in the CASA report that in the weeks leading up to the accident the PIC flew the same route in the same aircraft and burnt more fuel than he took off with on the night of the accident. Shouldn't this have rung alarm bells?
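
For anyone who wants to run that comparison, the arithmetic is simple: specific ground range is just ground speed divided by fuel flow. A minimal sketch in Python (every number below is a placeholder I've made up for illustration, not actual Westwind AFM data):

```python
# Specific ground range (SGR) comparison.
# All figures are illustrative placeholders, NOT real Westwind performance data.

def sgr(groundspeed_kt: float, fuel_flow_kg_hr: float) -> float:
    """Specific ground range: nautical miles covered per kg of fuel burned."""
    return groundspeed_kt / fuel_flow_kg_hr

lrc_low = sgr(360, 750)      # assumed: LRC at a non-RVSM level
cruise_high = sgr(430, 640)  # assumed: normal cruise at an RVSM level

print(f"LRC, non-RVSM level : {lrc_low:.3f} nm/kg")
print(f"Cruise, RVSM level  : {cruise_high:.3f} nm/kg")
print(f"Low-level penalty   : {(1 - lrc_low / cruise_high) * 100:.0f}%")
```

Plug in the actual AFM numbers and the question answers itself.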

gobbledock 26th Sep 2012 01:29


You seem to be enthralled with James Reason's model. This model is designed to take the responsibility away from the end user and put it on the corporate entity.
What a load of crap Blackhand. The Swiss Cheese model is a look at causal factors in an accident. The emphasis is on 'factors', plural. No one accident has just one cause. Nowhere in the Reason model is there a trigger, formula or method designed to induce blame or throw the monkey onto a corporate entity.
The Norfolk accident contains numerous factors. I agree that the PIC, in this case Dom, was the last line of defence. He stuffed up and became the last hole in this accident. But what has also come to light is some of the other factors that contributed to the accident - inadequate SOPs, operational pressure, lack of company support, company culture, fatigue, lame regulatory oversight, and the list goes on.

blackhand 26th Sep 2012 02:01


What a load of crap Blackhand.
The Reason model was designed for errors in patient care.
It has become a badly understood and widely utilised panacea for all safety issues.
An unsafe act is at the pinnacle of the pyramid and, as in this case, can happen without a causal chain.

Sarcs 26th Sep 2012 02:47

Hmm...Blackie I hope you're not peddling that vintage of pony pooh to your clients??

Since you're a guru on LAME issues, take a look at this quote from a very good (yeah I know....well at least in some areas of the ATSB remit they still set a benchmark) ATSB report:


The errors of maintenance personnel can be the most visible aspects of maintenance human factors, but to understand how and why maintenance errors occur, we need to understand the organisational context in which they occur.

Figure 5 below shows the main causal elements involved in accidents and incidents. It is an adaptation of the ‘Swiss Cheese’ model originally developed by James Reason.

According to this model, accidents or incidents are usually triggered by the actions of operational personnel, such as pilots or maintenance engineers. However, these actions occur in the context of local conditions, such as communication, workplace conditions, and equipment. The task environment also includes risk controls. These are features such as procedures, checks or precautions designed to manage hazards that threaten safety. Risk controls, local conditions and individual actions can, in turn, be influenced by organisational factors such as company policies, resource allocation, and management decisions.

In order to understand and ultimately prevent accidents, it is necessary to trace the chain of causes back through all the elements of the system, including organisational influences. This is often referred to as root cause analysis.

And since you are a guru of the 'blackhand world', you should be able to tell me what report that came from, because it should be included in any LAME Human Factors course, i.e. required reading??

ps Oh in case you're stumped, here's the link!:ok:

http://www.atsb.gov.au/media/27818/ar2008055.pdf
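
And for the doubters, the model that report describes boils down to layered defences, each with holes, where the bad day only comes when the holes line up. A toy sketch in Python (the layer names follow the report's figure; the hole probabilities are pure inventions for illustration, nothing measured):

```python
# Toy 'Swiss cheese' sketch. Layer names follow the ATSB figure; the hole
# probabilities are invented for illustration, NOT taken from any report.
import random

LAYERS = {
    "organisational influences": 0.10,
    "risk controls":             0.05,
    "local conditions":          0.20,
    "individual actions":        0.02,
}

def holes_line_up() -> bool:
    """An accident trajectory needs a hole in EVERY defensive layer at once."""
    return all(random.random() < p for p in LAYERS.values())

trials = 1_000_000
simulated = sum(holes_line_up() for _ in range(trials)) / trials

analytic = 1.0
for p in LAYERS.values():
    analytic *= p  # independent layers: probabilities multiply

print(f"simulated accident rate {simulated:.1e} vs analytic {analytic:.1e}")
```

Point being: shrink any one layer's holes and the product collapses, which is why the model looks at every layer, not just the last one.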

gobbledock 26th Sep 2012 03:08


The Reason model was designed for errors in patient care.
It has become a badly understood and widely utilised panacea for all safety issues. An unsafe act is at the pinnacle of the pyramid and, as in this case, can happen without a causal chain.
Oh dear. I see that the Reason Model is understood differently in PNG?

Blackie, you better tell Mr Reason that his cheese is being abused by the airline industry. You should also advise the ATSB, NTSB and most Western regulatory bodies that Reason's cheese is really not applicable in aviation and they have all got it wrong.

blackhand 26th Sep 2012 03:26

Sarcs, I am embarrassed by your praise:cool:
Yes, I have a modicum of knowledge on safety systems and regulations pertaining to aircraft maintenance. Have been involved in half a dozen helicopter accident investigations and a couple of fixed wing ones.
In my experience this Reason model allows the blame to be spread around, sometimes for no tangible benefit.
The maintenance incidents nominated in the Overview of Human Factors in Aviation Maintenance highlight, to me, bad maintenance carried out in spite of the systems in place.
The mech did not put the O-rings on, inspectors did not find indications of skin stress, etc.

Gobbles, James Reason makes a great living out of his model; I shouldn't imagine he is complaining.

Hmm...Blackie I hope you're not peddling that vintage of pony pooh to your clients??
Maybe, but I am certainly not averse to using it to pull them out of the poo.

Jinglie 26th Sep 2012 03:32

Beyond Reason.
 
This is beyond Reason! There are so many holes here there is hardly any cheese!!
I'm doing fuel calcs on the leg, and I doubt you can use Noumea as an alternate, particularly non-RVSM. Probably why PEL-AIR had it in Airwork rather than Charter! The ATSB numbers in the report don't add up. Maybe Beaker did them too!
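
Roughly the sort of back-of-envelope check I mean, in Python. Every figure below is an assumption for illustration only, not actual Westwind or ATSB data:

```python
# Back-of-envelope fuel sanity check.
# Every number here is an assumed placeholder, NOT actual Westwind/ATSB data.

TAS_KT = 420      # assumed cruise true airspeed, knots
BURN_KG_HR = 700  # assumed average fuel burn, kg per hour

def fuel_kg(distance_nm: float, wind_kt: float = 0) -> float:
    """Fuel needed to cover distance_nm; wind_kt positive means tailwind."""
    return distance_nm / (TAS_KT + wind_kt) * BURN_KG_HR

leg = fuel_kg(1700)       # assumed Apia -> Norfolk distance, nm
alternate = fuel_kg(430)  # assumed Norfolk -> Noumea diversion, nm
reserve = 600             # assumed fixed reserve, kg

print(f"leg {leg:.0f} kg + alternate {alternate:.0f} kg + reserve {reserve} kg "
      f"= {leg + alternate + reserve:.0f} kg required")
```

Swap in the real distances, winds and burn figures and you'll see why I doubt the alternate works.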

gobbledock 26th Sep 2012 03:43


The ATSB numbers in the report don't add up. Maybe Beaker did them too!
http://www.itsasafety.org/images/upl..._Australia.jpg
MEMEMEMEMEMEME

Jinglie 26th Sep 2012 03:44

Great summation
 
I found this on Avweb regarding this accident. Great summary.

"Finally, there have been pilot comments on the ATSB's inaccurate statements about fuel calculations and requirements for single engine or depressurised calculations.

Dominic was not without fault, but he has been nailed to a cross of an operator of failed management, a regulator of failed oversight, inadequate equipment, and an unfairly biased if not incompetent ATSB."

I suspect a lot of poo is going to be launched about this in the Inquiry. I hear there is more poo to be flung and it is capital B, capital A, and capital D for the ATSB and CASA.

Sarcs 26th Sep 2012 04:03

Top shot Gobbles!:ok: All you need now is a profile shot of Beaker from the Muppets....hmm, Muppets, that's a fairly appropriate description of the bureau's standing in all this..:E


The ATSB numbers in the report don't add up. Maybe Beaker did them too!
He may be a beancounter but I'm not sure he would have had his hands anywhere near this. It's more likely he would have issued a directive to the investigators to narrow their focus on the actions/inactions of the pilot, which ultimately compromised the investigation and the final report version 1..ah.2 or is it now 3, I'm confused!:(

Brian Abraham 26th Sep 2012 04:59


The Reason model was designed for errors in patient care
Not so. His book "Human Error" was first published in 1990 and as the jacket cover says

Modern technology has now reached a point where improved safety can only be achieved through a better understanding of human error mechanisms. "Human Error" spans the disciplinary gulf between psychological theory and those concerned with maintaining the reliability of hazardous technologies. This is essential reading not only for cognitive scientists and human factor specialists, but also for reliability engineers and risk managers. No existing book speaks with such clarity to both the theorist and the practitioners of human reliability.
The medical profession has, however, been addressed in some of his writings.

One thing Reason said was

Blaming individuals is emotionally more satisfying than targeting institutions.
Human error: models and management

I think we might be seeing some of that on display here.

gobbledock 26th Sep 2012 05:02

ASIC photo?
 
http://i2.listal.com/image/3012333/500full.jpg http://www.itsasafety.org/images/upl..._Australia.jpg
MEMEMEMEMEMEMEME

Sarcs 26th Sep 2012 05:28

That's perfect Gobbles!
 
Take off the ears and he's a 'dead ringer!':ok::D

Top article BA. Tony Kern's books are extremely good references on the subject as well.

Lookleft 26th Sep 2012 07:12

James Reason never intended his model to be used as a template for accident investigations; it was a theory of why accidents occur. In this accident the Reason model is a valid explanation for the poor support given to the crew to commence the flight. It is also useful as a way of explaining how two pilots with not much experience were on the flight deck. It can also be used to explain why the rules governing the flight were not as tight as they could have been (i.e. it was conducted as aerial work instead of charter).

It cannot be used to absolve the PIC of his responsibilities once he realised that the weather was not as forecast and that he did not have the fuel to divert. CRM and TEM are supposed to be the defences available to a crew to prevent that last hole lining up, but where were they? I notice that a lot of the PIC's advocates are silent on the decisions (and the lack of decisions when it came to notifying the Unicom of where they were ditching) that were made from ToD, other than the ridiculous assertion that a 146 that made a successful landing when faced with a similar situation proved that the PIC was not responsible for the outcome (and I'm the one who's claimed to be the idiot!)
This is where Reason exits the stage and CRM is supposed to enter, but it didn't happen. The fact that it didn't happen is the crew's and ultimately the PIC's responsibility. If anyone can explain the organisational issues involved in the period from ToD to ditching that were beyond the crew's control, I would be interested to read it.

Wally Mk2 26th Sep 2012 07:44

"LL" interesting post there.
I don't think we need James's models, or anyone's for that matter, in this case; it's really a black & white story that's getting 'coloured' in big time.

From TOD the scene was already set; the organizational issues were lacking well before this guy even took off. The fact that he did attempt the mission is almost irrelevant: as far as the organizational part of it goes it was doomed to fail anyway, and the 'play' had an ugly ending that few even saw coming, inc the regulators.

I love the words CRM, TEM etc etc; they are really just warm & fuzzy words to make people feel good &/or feel protected. Man was crashing planes well before the boffins dreamed up those 'feelgood' words, he has kept crashing them ever since those words have been around, & he will continue to do so 'till the end of time.

You mix man & machine together and no fancy words, teachings, beliefs etc will ever stop that interaction bringing a plane to a screaming halt in pieces!
Some might say, well, CRM, TEM etc at least reduce the accidents/incidents; but how do we all know that? Let's just say, for example, that we could erase all the previous CRM teachings from every pilot's mind, count the number of wrecked planes against the hills over the next 10 yrs, then re-install the same rubbish & count again to see any difference. Well, we can't (obviously), so we just accept that CRM, TEM & all the Reason stuff is better! Way of the future I guess, right or wrong.
I do it 'cause I have to, & perhaps I 'might' learn something from it (I'll never really know anyway), but it's not the be-all and end-all, that's for sure! The biggest defence in these cases is experience, something that's getting cut short more & more due to the almighty dollar!

Dom was armed with insufficient knowledge to complete the task on that night, & the argument now rests with whose fault it was. Will all this bickering & blaming fix anything? Nope; it will happen again sometime, somewhere, you can guarantee it, & we'll be right back here going round in circles, like man has done since Adam was a boy, when he was hit on the head by a falling apple & tried to 'reason' it all!:)


Wmk2

Sarcs 26th Sep 2012 08:13

Well said Wally, I couldn't agree more; this matter has well and truly moved on from who was at fault.

The issue now is what appears to be a severely compromised Final Report from the ATSB and the reasons why...coercion, political expediency, covering the Minister's ass etc.

Here's a quote from an experienced pilot who reviewed the ATSB report before it was released:

“Tomorrow, the ATSB will release a Final report into an aviation accident at Norfolk Island.

The Final report in its current form contains factual errors. These have been brought to the attention of the (name and position withheld), and also the (name and position withheld).

Despite these many areas under dispute having been brought to the attention of both of these men on numerous occasions and requests that the Final report be delayed pending dialogue to resolve the inaccuracies, (name withheld) has determined that this report will be released tomorrow.

(Name withheld) has also let it be known that he is expecting a lot of media attention. I ask what is the point of media attention to a factually incorrect report?

Obviously a report with facts in dispute should not be released. Having spoken to both pilots concerned, and as a professional aviator myself, I am left dismayed at the attitude of the ATSB and their willingness to issue such a questionable report. As Minister, I would ask you to stop the release to give all parties involved the opportunity for continued dialogue before its release.

At the moment, I am extremely disappointed in the attitude of the ATSB and have been left with great doubts about their investigations. At this point, I would most certainly be advising my colleagues in the airlines that they cannot rely on the ATSB to report the facts nor give them a fair hearing if they are ever involved in an incident.

This is obviously not the reputation that the ATSB should be making for themselves.”

From Planetalking 26/09/2011
Pel-Air report errors ignored by ATSB says expert reviewer | Plane Talking



If Beaker thinks this will all just fade away into the "Never, Never" I've got news for him and it's all bad!

Lookleft 26th Sep 2012 08:33

There is a PhD in that for you Wally! I agree with Sarcs, people have their positions on the ditching and we are just going around in circles. Maybe it's time to start a new thread.

DBTW 27th Sep 2012 04:36

This is an incredible debate with loads of blame flying around and responsibility appearing to stick nowhere without generating much further debate. In all these discussions, everyone is taking the word of the pilot that he meant to ditch.

What if he didn't mean to ditch?

Let's all face it, the pilot continued on this flight with the information he had to hand, and for one reason or another, he made the decision very early that he was going to land at Norfolk Island. The pilot did land, and it was clearly not quite in the way he expected. Much of the debate is therefore about when he should have made some other decision that he (also clearly) did not make. And of course the follow on to that is who should be responsible for decisions made or not made.

Maybe, in this case, the ATSB and CASA are actually just being polite?

In this case the pilot, having done much planning (regardless of whether it was flawed or not), had a fuel panic, did some instrument approach work, made decisions about not descending below minima or converting to a home-made GPS approach, and then decided to fly out over the water and crash...and didn’t really tell anyone about it. Specifically, he didn’t mention it to the Unicom operator.

Does that really sound plausible?

We have all seen situations in aviation where a certain outcome makes an individual create a seemingly plausible explanation as to how they got to where they were in order to make it seem as though they had a plan.

He may have said something inside the cabin as a precautionary measure, but judging by the earlier decisions, and especially the one to continue to NI and land, it seems much more likely that in the end he was over water looking for a cloud break and quite simply found himself in the water.

I have heard of that happening several times before, but in my time I have never heard of anyone trying a controlled, powered ditching at night...especially when there is a well-lit and instrumented runway nearby.

Most aeroplanes when performing instrument approaches successfully manage to descend below the minimum descent altitude without hitting anything, so it stands to reason if he simply tried a stabilised approach using any instrument procedure the chances were good that he would have made it to the runway. If he had done that, there would be no ditching and no story.

Similarly, it has been brought up in this thread that people talk about descending over the sea to get a cloud break if all else fails with traditional instrument approach methods.

In my experience, low altitude over water can be extremely difficult, especially when associated with night, poor visibility, low cloud and rain. In those conditions, the risk of flying into the sea due to lack of awareness and disorientation is high.

We are now really just discussing an unfortunate and unplanned outcome to what could have been a complete non-story. Doesn’t really matter who is right or wrong anymore. The aeroplane is lost and everyone survived. That’s lucky!

blackhand 27th Sep 2012 04:47

DBTW, that would have to be the most succinct appraisal I have seen on this issue so far.
Much food for thought, and not a bit of cheese in sight.

Capn Bloggs 27th Sep 2012 04:50


it seems much more likely that in the end he was over water looking for a cloud break and quite simply found himself in the water.

I have heard of that happening several times before, but in my time I have never heard of anyone trying a controlled, powered ditching at night...
Watch Four Corners.

Dora-9 27th Sep 2012 06:21

But Four Corners went on and on about what a hero Dominic was in carrying out a rare successful ditching in a jet, yet the ATSB report states they never even saw the water until after impact, so surely in that case there was much more luck prevailing here than good management?

DBTW, you just might be right. I've thought it strange that the ATSB were surprisingly uncritical about the actual ditching.

Wouldn't you just love to hear the FO's version?

Capn Bloggs 27th Sep 2012 06:58

Pitch-black night with no lights, how do you reckon they'd see the sea?? Doesn't mean you can't attempt a perfectly reasonable ditching with a reasonable chance of pulling it off.

Given that the pax had time to put on their jackets AND get the rafts ready, I'd say DBTW's conspiracy theory, that they flopped into the water with no warning as the crew were doing descending orbits through holes in the clouds hoping to break visual, is a bit far-fetched.

DBTW 27th Sep 2012 07:13

CB, you obviously have a clear understanding of what really happened, and I clearly won't budge you on that. Just a couple of questions then. Didn't the jackets go on during the instrument approach phase? And were they doing orbits?

I don't know. If so, I certainly think it unwise because it would be even more disorienting! Most of the near and actual unintentional controlled flights into the sea I know about have been in straight line descents. If performed in a turn they are (pretty much) always fatal.

blackhand 27th Sep 2012 07:15


Pitch-black night with no lights, how do you reckon they'd see the sea?? Doesn't mean you can't attempt a perfectly reasonable ditching with a reasonable chance of pulling it off
Which makes an attempted landing at the actual airstrip more reasonable than flying it into the ocean.

flying-spike 27th Sep 2012 07:37

How the FDR and CVR could have helped
 
It brings me back to the issue of the appearance of both engine nacelles in the underwater footage: no bulging of the nacelles, which would be evident had they been spinning on impact, and no curling of the compressor blades. I heard that when Dom said they were going to ditch, the FO asserted herself and said "No we're not". I reckon the last missed approach was asymmetric (a witness said it sounded "different") and the other engine died as they turned back in. Although they may have briefed the ditching, it happened a little sooner than they thought.
Just my 20c worth.

Cactusjack 27th Sep 2012 07:43


"No we're not"
Sounds just like the AN VH-INH 'mishap' in SYD when el Capitaino asked for TOGA and the Flight Engineer said "no you won't"!! Well, something like that, the memory is fading a little, and I do miss Big Trev so much, and he gave his pilots a 'bulging nacelle' over that episode also!! Does he still wear the wig?
Oh yeah, a big hello to Gary T in PNG also:ok:

Capn Bloggs 27th Sep 2012 08:21


I heard that when Dom said they were going to ditch, the FO asserted herself and said "No we're not". I reckon the last missed approach was asymmetric
Thanks for that, Love!

Capn Bloggs 27th Sep 2012 08:31


Which makes an attempted landing at the actual airstrip more reasonable than flying it into the ocean.
Possibly, but the actual inbound tracks (for the 11/29 VORs, at least) go nowhere near the runway itself. Given the terrain, it is questionable whether a forced landing on land would have been more successful than what happened. Now, the VOR 04 (VOR at the threshold) and slide off the runway at low speed? A possibility.

DBTW, I was a bit tongue-in-cheek about descending orbits until visual. However, your theory:

it seems much more likely that in the end he was over water looking for a cloud break and quite simply found himself in the water.
just doesn't stack up. The report clearly states what happened in the few minutes prior to alighting. If the report is correct, the crew knew exactly when they were going to touch down.

blackhand 27th Sep 2012 08:43

Thanks Capn.
Makes his decision to ditch seem an option. Would you have done the same? Or maybe tried to drop to the 600 ft level and then slam it on, you know, like a bandit approach into Bulolo when it's clouded in.

Capn Bloggs 27th Sep 2012 08:52


Makes his decision to ditch seem an option, would you have done the same?
I refuse to answer on the grounds that I might incriminate myself! :}


you know, like a bandit approach into Bulolo when it's clouded in.
I don't know what a bandit is, nor do I know where Bulolo is, thank goodness! :p

DBTW 27th Sep 2012 08:59


If the report is correct the crew knew exactly when they were going to touchdown.
There's the rub CB...the report can only go on what the investigators were told, and I am only asking the question as to whether what they have been told stacks up...I am not comfortable with what they have been told.

Jabawocky 27th Sep 2012 09:00

I wonder how planned it was.

The report says that at 10.25 they conducted an approach, and at 10.26 there was an unreadable transmission, assumed to be at the point of ditching.

So 1 minute? Either they had flamed out, and it was rather unplanned, or perhaps DBTW's theory is closer to the mark.

If it was well planned, why then were only 3 out of 5 likely jacket wearers in jackets? The lady strapped in I can understand, but a well-planned ditching would have had all the other pax and the crew in jackets.

I also think that a planned ditching would have involved a very clear instruction to the unicom operator on whereabouts he intended to conduct it, and that he wanted all resources pointed in that direction.

I think the report is lacking in a lot of things..... 2 things in particular: the FDR & CVR. They are only 48 m down, which is within diving range for pro folk. Actually, I know some folk who can do it. Why did the ATSB not go and get them? As the Air France A330 has shown, if you find them, even buried under great amounts of pressure for a year or more, they survive.

ATSB.... GO GET THE F:mad: RECORDERS!!!!!!!!!!

blackhand 27th Sep 2012 09:14


I refuse to answer on the grounds that I might incriminate myself!
Na, there was a hole and we followed it all the way down:E

Capt Claret 27th Sep 2012 10:47

If the ditching was planned then, IMO, doing so without advising the NLK R/T man, with whom they were in contact, was the most negligent act of a poorly executed flight.

And one can't blame Pel Air, CASA, the ATSB, or anyone else for that.

I find it so difficult to believe that one could ditch and not tell anyone where, that I find it even more difficult to believe that the ditching was planned.

Dog One 27th Sep 2012 10:57

The ATSB needs to recover both the cockpit voice recorder and the flight data recorder to resolve this investigation.
It is my belief that they flew into the water before they were ready. No real ditching briefing had been given, nor was the cabin prepared. Why did the FO tell Norfolk radio to stand by when the operator was after details, if they were about to ditch?

I understand that the gear is held up under pressure and that it would droop down after the pumps and pressure were lost, but it certainly would not extend fully and lock down. The underwater photos show the gear down and locked.

Did they get partially visual, decide to circle at low level back to the runway, and fly into the water?

It would be interesting to see how much the flight crew statements would change if the FDR and cockpit recorders were located!

Capn Bloggs 27th Sep 2012 11:44

As with other crashes, the crew often don't say much. This wasn't your standard sim ride; they knew they were about to die.

Agree about the recorders, although while they will tell us more about the last bits of the flight, the systemic-issues discussion probably will not benefit a lot.

Worrals in the wilds 27th Sep 2012 11:46

Did they ever make a mayday call?

Lookleft 27th Sep 2012 12:17

No, they didn't make a Mayday. It is also unlikely that the scenarios put forward here occurred, as the passenger accounts of what happened match the crew accounts. I think the crew would have understood their obligations under the TSI Act: they were legally required to provide all information, with self-incrimination not being a defence for failing to tell the truth.
Also, the F/O would not have had any reason to give an account that was not accurate. If they didn't tell the truth then I doubt that they will be looking forward to the Senate Enquiry. If I were the crew I wouldn't be looking forward to it anyway, because I don't think that it will exonerate them, if that is what they are after. The Senators are not fools, and they will soon realise that the accident could have been avoided if different decisions had been made by the crew, even after they realised that the weather was below minima and they didn't have the fuel for an alternate.

Remember the Mt Hotham accident? After the report was released there was all sorts of speculation that the ATSB had got it wrong and that it wasn't the pilot's fault but a failed engine that caused them to hit the hill. A supplementary report released in response to the publicity showed that the pilot had flown a similar dodgy approach in the morning. End of speculation.

Brian Abraham 29th Sep 2012 04:11

I couldn't let the thread die without posting something the iconoclasts (it's all Dominic's/the crew's fault) might mull over. Those whose olfactory organs are so distressed as to be unable to smell the cheese might wish to log off now.

Reconstructing human contributions to accidents:
The new view on error and performance


Sidney W. A. Dekker

Abstract

Problem


How can we reconstruct the human contribution to accidents? Investigators easily take the position of retrospective outsider, looking back on a sequence of events that seems to lead to an inevitable outcome, and pointing out where people went wrong. This does not explain much, however, and may not help prevent recurrence.

Method and results

In this paper I examine how investigators can reconstruct the human contribution to accidents in light of what has recently become known as the new view of human error. The commitment of the new view is to relocate controversial human assessments and actions back into the flow of events of which they were part and which helped bring them forth, to see why assessments and actions made sense to people at the time. The second half of the paper is dedicated to one way in which investigators could begin to reconstruct people's unfolding mindsets.

Impact on industry

In an era where a large portion of accidents gets attributed to human error, it is critical to understand why people did what they did, rather than judging them for not doing what we now know they should have done. This paper contributes by helping investigators avoid the traps of hindsight, and by presenting a method with which investigators can begin to see how people's actions and assessments could actually have made sense at the time.

Introduction

In human factors today there are basically two different views on human error and the human contribution to accidents. One view, recently dubbed "the old view" (AMA, 1998; Reason, 2000), sees human error as a cause of failure. In the old view of human error:

• Human error is the cause of most accidents.

• The engineered systems in which people work are made to be basically safe; their success is intrinsic. The chief threat to safety comes from the inherent unreliability of people.

• Progress on safety can be made by protecting these systems from unreliable humans through selection, proceduralization, automation, training and discipline.

The other view, also called "the new view", sees human error not as a cause, but as a symptom of failure (Rasmussen & Batstone, 1989; Woods et al., 1994; AMA, 1998; Reason, 2000; Hoffman & Woods, 2000). In the new view of human error:

• Human error is a symptom of trouble deeper inside the system.

• Safety is not inherent in systems. The systems themselves are contradictions between multiple goals that people must pursue simultaneously. People have to create safety.

• Human error is systematically connected to features of people's tools, tasks and operating environment. Progress on safety comes from understanding and influencing these connections.

The new view of human error represents a substantial movement across the fields of human factors and organizational safety (Reason, 1997; Rochlin, 1999) and encourages the investigation of factors that easily disappear behind the label "human error"—long-standing organizational deficiencies; design problems; procedural shortcomings and so forth. The rationale is that human error is not an explanation for failure, but instead demands an explanation, and that effective countermeasures start not with individual human beings who themselves were at the receiving end of much latent trouble (Reason, 1997) but rather with the error-producing conditions present in their working environment. Most of those involved in accident research and analyses are proponents of the new view. For example:
"...simply writing off ... accidents merely to (human) error is an overly simplistic, if not naive, approach.... After all, it is well established that accidents cannot be attributed to a single cause, or in most instances, even a single individual." (Shappell & Wiegmann, 2001, p. 60).

However, our willingness to embrace the new view of human error in our analytic practice is not always matched by our ability to do so. When confronted by failure, it is easy to retreat into the old view. We seek out the "bad apples" and assume that with them gone, the system will be safer than before. An investigation's emphasis on proximal causes ensures that the mishap remains the result of a few uncharacteristically ill-performing individuals who are not representative of the system or the larger practitioner population in it. It leaves existing beliefs about the basic safety of the system intact.

The pilots of a large military helicopter that crashed on a hillside in Scotland in 1994 were found guilty of gross negligence. The pilots did not survive—29 people died in total—so their side of the story could never be heard. The official inquiry had no problems with "destroying the reputation of two good men", as a fellow pilot put it. Potentially fundamental vulnerabilities (such as 160 reported cases of Uncommanded Flying Control Movement or UFCM in computerized helicopters alone since 1994) were not looked into seriously (Sunday Times, 25 June 2000).

Faced with a bad, surprising event, we seem more willing to change the players in the event (e.g. their reputations) than to amend our basic beliefs about the system that made the event possible. To be sure, reconstructing the human contribution to a sequence of events that led up to an accident is not easy. As investigators we were seldom—if ever—there when events unfolded around the people now under investigation. As a result, their actions and assessments may appear not only controversial, but truly befuddling when seen from our point of view. In order to understand why people could have done what they did, we need to go back and triangulate and interpolate, from a wide variety of sources, the kinds of mindsets that they had at the time. But working against us are the inherent biases introduced by hindsight (Fischhoff, 1975) and the multiple pressures and constraints that operate on almost every investigation—political as well as practical (Galison, 2000).

In this paper I hope to make a contribution to our ability to reconstruct past human performance and how it played a role in accidents. I first capture some of the mechanisms of the hindsight bias, and observe them at work in how we routinely handle and describe human performance evidence. Trying to avoid these biases and mechanisms, I propose ways forward for how to reconstruct people's unfolding mindsets. Most examples will come from aviation, but they, and the principles they illustrate, should apply equally well to domains ranging from driving to shipping to industrial and occupational safety.

The mechanisms of hindsight

One of the safest bets we can make as investigators or outside observers is that we know more about the incident or accident than the people who were caught up in it—thanks to hindsight:

• Hindsight means being able to look back, from the outside, on a sequence of events that led to an outcome we already know about;

• Hindsight gives us almost unlimited access to the true nature of the situation that surrounded people at the time (where they actually were versus where they thought they were; what state their system was in versus what they thought it was in);

• Hindsight allows us to pinpoint what people missed and shouldn't have missed; what they didn't do but should have done.

From the perspective of the outside and hindsight (typically the investigator's perspective), we can oversee the entire sequence of events—the triggering conditions, its various twists and turns, the outcome, and the true nature of circumstances surrounding the route to trouble. In contrast, the perspective from the inside of the tunnel is the point of view of people in the unfolding situation. To them, the outcome was not known, nor the entirety of surrounding circumstances. They contributed to the direction of the sequence of events on the basis of what they saw on the inside of the unfolding situation. For investigators, however, it is very difficult to attain this perspective. The mechanisms by which hindsight operates on human performance data are mutually reinforcing. Together they continually pull us in the direction of the position of the retrospective outsider. The ways in which we retrieve human performance evidence from the rubble of an accident, represent it, and re-tell it, typically sponsors this migration of viewpoint.

Mechanism 1: Making tangled histories linear by cherry-picking and re-grouping evidence

One effect of hindsight is that "people who know the outcome of a complex prior history of tangled, indeterminate events, remember that history as being much more determinant, leading 'inevitably' to the outcome they already knew" (Weick, 1995, p. 28). Hindsight allows us to change past indeterminacy and complexity into order, structure, and oversimplified causality (Reason, 1990). In trying to make sense of past performance, it is always tempting to group individual fragments of human performance which prima facie point to some common condition or mindset. For example, "hurry" to land is a leitmotif extracted from the evidence in the following investigation, and that haste in turn is enlisted to explain the errors that were made:

"Investigators were able to identify a series of errors that initiated with the flightcrew's acceptance of the controller's offer to land on runway 19…The CVR indicates that the decision to accept the offer to land on runway 19 was made jointly by the captain and the first officer in a 4-second exchange that began at 2136:38. The captain asked: 'would you like to shoot the one nine straight in?' The first officer responded, 'Yeah, we'll have to scramble to get down. We can do it.' This interchange followed an earlier discussion in which the captain indicated to the first officer his desire to hurry the arrival into Cali, following the delay on departure from Miami, in an apparent attempt to minimize the effect of the delay on the flight attendants' rest requirements. For example, at 2126:01, he asked the first officer to 'keep the speed up in the descent'… (This is) evidence of the hurried nature of the tasks performed." (Aeronautica Civil, 1996, p. 29)

But the fragments used to build the argument of haste come from over half an hour of extended performance. The investigator treats the record as if it were a public quarry to pick stones from, and the accident explanation the building he needs to erect. The problem is that each fragment is meaningless outside the context that produced it: each fragment has its own story, background, and reasons for being, and when it was produced it may have had nothing to do with the other fragments it is now grouped with. Also, behavior takes place in between the fragments. These intermediary episodes contain changes and evolutions in perceptions and assessments that separate the excised fragments not only in time, but also in meaning. Thus, the condition, and the constructed linearity in the story that binds these performance fragments, arises not from the circumstances that brought each of the fragments forth; it is not a feature of those circumstances. It is an artifact of the investigator. In the case described above, "hurry" is a condition identified in hindsight, one that plausibly couples the start of the flight (almost 2 hours behind schedule) with its fatal ending (on a mountainside rather than an airport). "Hurry" is a retrospectively invoked leitmotif that guides the search for evidence about itself. It leaves the investigator with a story that is admittedly more linear and plausible and less messy and complex than the actual events. Yet it is not a set of findings, but of tautologies.

Mechanism 2: Finding what people could have done to avoid the accident

Tracing the sequence of events back from the outcome—that we as investigators already know about—we invariably come across joints where people had opportunities to revise their assessment of the situation but failed to do so; where people were given the option to recover from their route to trouble, but did not take it. These are counterfactuals—quite common in accident analysis. For example, "The airplane could have overcome the windshear encounter if the pitch attitude of 15 degrees nose-up had been maintained, the thrust had been set to 1.93 EPR (Engine Pressure Ratio) and the landing gear had been retracted on schedule" (NTSB, 1995, p. 119). Counterfactuals prove what could have happened if certain minute and often utopian conditions had been met. Counterfactual reasoning may be a fruitful exercise when trying to uncover potential countermeasures against such failures in the future.

But saying what people could have done in order to prevent a particular outcome does not explain why they did what they did. This is the problem with counterfactuals. When they are enlisted as explanatory proxy, they help circumvent the hard problem of investigations: finding out why people did what they did. Stressing what was not done (but if it had been done, the accident would not have happened) explains nothing about what actually happened, or why.

In addition, counterfactuals are a powerful tributary to the hindsight bias. They help us impose structure and linearity on tangled prior histories. Counterfactuals can convert a mass of indeterminate actions and events, themselves overlapping and interacting, into a linear series of straightforward bifurcations. For example, people could have perfectly executed the go-around maneuver but did not; they could have denied the runway change but did not. As the sequence of events rolls back into time, away from its outcome, the story builds. We notice that people chose the wrong prong at each fork, time and again—ferrying them along inevitably to the outcome that formed the starting point of our investigation (for without it, there would have been no investigation).

But human work in complex, dynamic worlds is seldom about simple dichotomous choices (as in: to err or not to err). Bifurcations are extremely rare—especially those that yield clear previews of the respective outcomes at each end. In reality, choice moments (such as there are) typically reveal multiple possible pathways that stretch out, like cracks in a window, into the ever denser fog of futures not yet known. Their outcomes are indeterminate; hidden in what is still to come. In reality, actions need to be taken under uncertainty and under the pressure of limited time and other resources. What from the retrospective outside may look like a discrete, leisurely two-choice opportunity to not fail, is from the inside really just one fragment caught up in a stream of surrounding actions and assessments. In fact, from the inside it may not look like a choice at all. These are often choices only in hindsight. To the people caught up in the sequence of events there was perhaps not any compelling reason to re-assess their situation or decide against anything (or else they probably would have) at the point the investigator has now found significant or controversial. They were likely doing what they were doing because they thought they were right, given their understanding of the situation and their pressures. The challenge for an investigator becomes to understand how this may not have been a discrete event to the people whose actions are under investigation. The investigator needs to see how other people's "decisions" to continue were likely nothing more than continuous behavior—reinforced by their current understanding of the situation, confirmed by the cues they were focusing on, and reaffirmed by their expectations of how things would develop.

Mechanism 3: Judging people for what they did not do but should have done

Where counterfactuals are used in investigations, even as explanatory proxy, they themselves often require explanations as well. After all, if an exit from the route to trouble stands out so clearly to us, how was it possible for other people to miss it? If there was an opportunity to recover, to not crash, then failing to grab it demands an explanation. The place where investigators look for clarification is often the set of rules, professional standards and available data that surrounded people's operation at the time, and how people did not see or meet that which they should have seen or met. Recognizing that there is a mismatch between what was done or seen and what should have been done or seen—as per those standards—we easily judge people for not doing what they should have done.

Where fragments of behavior are contrasted with written guidance that can be found to have been applicable in hindsight, actual performance is often found wanting; it does not live up to procedures or regulations. For example, "One of the pilots…executed (a computer entry) without having verified that it was the correct selection and without having first obtained approval of the other pilot, contrary to procedures." (Aeronautica Civil, 1996, p. 31).

Investigations invest considerably in organizational archeology so that they can construct the regulatory or procedural framework within which the operations took place, or should have taken place. Inconsistencies between existing procedures or regulations and actual behavior are easy to expose when organizational records are excavated after-the-fact and rules uncovered that would have fit this or that particular situation. This is not, however, very informative. There is virtually always a mismatch between actual behavior and written guidance that can be located in hindsight (Suchman, 1987; Woods et al., 1994). Pointing out that there is a mismatch sheds little light on the why of the behavior in question. And for that matter, mismatches between procedures and practice are not unique to mishaps (Degani & Wiener, 1991).

Another route to constructing a world against which investigators hold individual performance fragments, is finding all the cues in a situation that were not picked up by the practitioners, but that, in hindsight, proved critical. Take the turn towards the mountains on the left that was made just before an accident near Cali, Colombia in 1995 (Aeronautica Civil, 1996). What should the crew have seen in order to notice the turn? They had plenty of indications, according to the manufacturer of their aircraft:
"Indications that the airplane was in a left turn would have included the following: the EHSI (Electronic Horizontal Situation Indicator) Map Display (if selected) with a curved path leading away from the intended direction of flight; the EHSI VOR display, with the CDI (Course Deviation Indicator) displaced to the right, indicating the airplane was left of the direct Cali VOR course, the EADI indicating approximately 16 degrees of bank, and all heading indicators moving to the right. Additionally the crew may have tuned Rozo in the ADF and may have had bearing pointer information to Rozo NDB on the RMDI" (Boeing, 1996, p. 13).

This is a standard response after mishaps: point to the data that would have revealed the true nature of the situation. Knowledge of the "critical" data comes only with the omniscience of hindsight, but if data can be shown to have been physically available, it is assumed that it should have been picked up by the practitioners in the situation. The problem is that pointing out that it should have does not explain why it was not, or why it was interpreted differently back then (Weick, 1995). There is a dissociation between data availability and data observability (Woods et al., 1994)—between what can be shown to have been physically available and what would have been observable given the multiple interleaving tasks, goals, attentional focus, interests, and—as Vaughan (1996) shows—culture of the practitioner.

There are also less obvious or not documented standards. These are often invoked when a controversial fragment (e.g. a decision to accept a runway change (Aeronautica Civil, 1996), or the decision to go around or not (NTSB, 1995)) has no clear pre-ordained guidance but relies on local, situated judgment. For these cases there are always "standards of good practice" which are based on convention and putatively practiced across an entire industry. One such standard in aviation is "good airmanship", which, if nothing else can, will explain the variance in behavior that had not yet been accounted for.

While micromatching, the investigator frames people's past assessments and actions inside a world that s/he has invoked retrospectively. Looking at the frame as an overlay on the sequence of events, s/he sees that pieces of behavior stick out in various places and at various angles: a rule not followed here; available data not observed there; professional standards not met over there. But rather than explaining controversial fragments in relation to the circumstances that brought them forth, and in relation to the stream of preceding as well as succeeding behaviors which surrounded them, the frame merely boxes performance fragments inside a world the investigator now knows to be true. The problem is that this after-the-fact world may have very little relevance to the actual world that produced the behavior under investigation. The behavior is contrasted against the investigator's reality, not the reality surrounding the behavior in question at the time. Judging people for what they did not do relative to some rule or standard does not explain why they did what they did. Saying that people failed to take this or that pathway—only in hindsight the right one—judges other people from a position of broader insight and outcome knowledge that they themselves did not have. It does not explain a thing yet; it does not shed any light on why people did what they did given their surrounding circumstances. The investigator has gotten caught in what William James called "the psychologist's fallacy" a century ago: he has substituted his own reality for the one of his object of study.

It appears that in order to explain failure, we seek failure. In order to explain missed opportunities and bad choices, we seek flawed analyses, inaccurate perceptions, violated rules—even if these were not thought to be influential or obvious or even flawed at the time (Starbuck & Milliken, 1988). This search for people's failures is another well-documented effect of the hindsight bias: knowledge of outcome fundamentally influences how we see a process. If we know the outcome was bad, we can no longer objectively look at the behavior leading up to it—it must also have been bad (Fischhoff, 1975; Woods et al., 1994; Reason, 1997).

