Latest from James Reason


Genghis the Engineer
4th Sep 2009, 14:27
Okay, we all know about James Reason's Swiss Cheese model.


However, in a moment of enthusiasm the other day, I looked him up on Amazon and discovered a fair number of books concentrating upon safety, accident causality, and so on.

Amongst them was a new book which came out in the last year, called The Human Contribution (http://www.amazon.co.uk/Human-Contribution-Unsafe-Accidents-Recoveries/dp/0754674029/ref=sr_1_1?ie=UTF8&s=books&qid=1252074137&sr=8-1).

In even more enthusiasm, I spent my own money on a copy - which I'm currently about a third of the way through.

After a rather turgid introduction, it gets into some really interesting concepts around "violations" - failures to follow procedure - and how these relate to good and bad procedures, and also to learning opportunities. The extensive use of case studies (in particular a lot of discussion of Chernobyl, but also various aviation cases) makes it particularly readable, and I find it's giving me a lot of food for thought about the way my organisation works.

Once I've finished, I'll find this thread again and post a fuller review, but in the meantime I thought I'd at least flag it up as a very interesting read, and a good sample of the latest thinking of somebody whom we all regularly refer to, but probably too few have actually read.

G

Daysleeper
4th Sep 2009, 14:39
Read it!
His point that not every accident is an organisational one was interesting, but his use of the Columbia accident board as an example was perhaps not well explained. I'll avoid any more plot spoilers other than to say it's a very readable work that even my wife (a non-aviator) really enjoyed.
When you've finished it we can re-engage the discussion :8

alf5071h
5th Sep 2009, 00:17
Also see the Customer review here (www.amazon.com/Human-Contribution-James-Reason/dp/0754674029/ref=sr_1_1?ie=UTF8&s=books&qid=1250455509&sr=1-1) and the FSF write up ‘We Still Need Exceptional People’ (www.flightsafety.org/asw/mar09/asw_mar09_p53-56.pdf).

For more on ‘violations’ see the CRM Devel forum ( http://groups.yahoo.com/group/crm-devel/messages) – files section – “Bending the rules; The Violation Manual” Reason et al (registration required, free).

Also at Aviation.org (http://aviation.org/) see the ‘library’ section “Improving Procedural Compliance” (registration required, free).

Other online info from J Reason:- Publications listing. (http://unjobs.org/authors/james-reason)

A37575
5th Sep 2009, 14:10
Okay, we all know about James Reason's Swiss Cheese model.


It has been done to death a million times over on PPRuNe. His point is that pilots are never to blame for accidents and that, by his account, command responsibility is an old-fashioned, irrelevant term. Some may find this hard to digest, but pilot error is often the direct cause of an accident and all the Swiss Cheese holes have little to do with it. But one thing is certain, and that is that Professor Reason has made a bucketful of money with his theory.

Miles Gustaph
6th Sep 2009, 08:07
A37575, what utter drivel you have written… ”Some may find this hard to digest but pilot error is often the direct cause of an accident and all the Swiss Cheese holes have little to do with it” …. Well, on planet earth a pilot is one part of a whole process… the cheese “blocks” …. Before the pilot are ops, planners, SMS, CRM, training etc… after are SMS, engineering, CRM, training etc… meaning that in a self-regulating industry it is almost impossible for a single action to cause an accident; normally a single-point accident is an act of negligence or sabotage.

If a pilot hasn’t been negligent, and hasn’t been today’s saboteur, then it is reasonable to suggest that the SMS, CRM, Ops manual, planning etc. did, or did not do, something that affected the pilot’s ability to carry out his duties.

Don’t get me wrong here, I do realize that the accident rates are pointing at pilots acting “outside of the box”, but that is in itself a failure of the whole process, not just of the pilot.

The Reason model is exceptional and if you don’t get it then I suggest you get yourself off for another round of Human Factors training.


P.S Genghis, thanks and if you could let us have more info it would be appreciated.

Tee Emm
8th Sep 2009, 13:31
meaning that in a self regulating industry it is almost impossible for a single action to cause an accident

Absolute rubbish. A pilot fails to flare and hits hard, jerks back on the control column, and gets a tail strike caused by a combination of thrust still applied, spoiler actuation on impact and hard back stick. The pilot may have done perfectly serviceable landings previously, but on this occasion he didn't. So how the hell can the regulator, airline management, the ATC who asked him to keep up speed on final approach, the rostering staff and Uncle Tom Cobley and all be blamed for an error of judgement by the pilot? The verdict must be pilot error.

xetroV
8th Sep 2009, 14:50
His point is that pilots are never to blame for accidents and that by his account command responsibility is an old-fashioned, irrelevant term.
This proves that you have probably never read any of his books and certainly have never understood them. Reason's models do not rule out individual responsibilities at all. If you think they do, you're applying them incorrectly.

Miles Gustaph
8th Sep 2009, 17:01
Tee Emm, you said it buddy: "caused by a combination of thrust still applied, spoiler actuation on impact and hard back stick". The use of the noun "combination", denoting multiple events causing one accident/incident, supports my assertion that "it is almost impossible for a single action to cause an accident"... and which pilot do you blame? The left-hand pilot or the right-hand pilot, or was the Flight Engineer in on it? If the situation that you propose above happened then you already have a pile of failed blocks, not least of which is inter-pilot communication.

"So how the hell can the regulator, airline management, ATC who asked him to keep up speed on final approach, the rostering staff and Uncle Tom Cobbly and All, be blamed for a error of judgment by the pilot. The verdict must be pilot error." ...before you guys get a full blown snott-on about the reason model you need to get three things very clear...
1. aviation is a team game: every one who participates has an input, that input can have an effect in a causal chain. Reason model "blocks" (The bits of cheese)
2. The blame culture, such as your suggested situation where we "must" blame the Pilot is pointless and promotes a closed culture rather than an open culture (blocks fail because of people living in fear of their jobs, litigation etc) &
3. Pilots and Engineers are not emanations of the Creator of the Universe, they are unfortunately in the majority of accidents/incidents the end user, meaning that all the blocks (cheeses) and their own training/skill/prep/briefing etc... has failed, so since they do not have Creator of the Universe status and we cant assume it was an act of divine judgment we must treat them as mortals...shock horror... and support them and look into the ...big word here..."root cause" of the accident/incident to find out what else failed that meant they were in the hot seat.

So if you do take the Reason model into account, rather than the hang the Pilot/engineer/ATC etc argument then I stand by my statement "it is almost impossible for a single action to cause an accident, normally a single point accident is an act of negligence or sabotage."

Tee Emm, or should it be Hang Emm? If you genuinely believe what you have written then get yourself onto another Human Factors/CRM course ASAP, or go and work in the Middle East and see how well the culture you advocate works!

Genghis the Engineer
8th Sep 2009, 18:00
Gosh - I started something here!

Miles, I agree with everything you've said so far, but will hold fire until I've finished the book then try and compose a reasonably thorough review of it.

G

alf5071h
8th Sep 2009, 18:46
A document well worth reading is “Revisiting the Swiss Cheese Model of accidents” (www.eurocontrol.int/eec/public/standard_page/DOC_Report_2006_017.html), which provides historical background and insight into the use of the model.
Although James Reason is credited with the ‘Swiss Cheese’ title, IIRC he acknowledges that it originated from Dr Rob Lee. However, the concept of error and systematic failure, and the graphic model are from Reason’s work.

The short paper “Human Error” (www.pubmedcentral.nih.gov/articlerender.fcgi?artid=1070929) compares the ‘person view’ with the ‘system view’ of safety and thereafter the origins of the Swiss Cheese Model.
I was particularly impressed by the paragraphs on high reliability organizations (Page 4), which could be taken as the basis of SMS or a personalized CRM/TEM.
Also of note is the comment on military organizations – clarity of goals.

For those who wish to reconsider the concept of blame and punishment see Safety and the 'Just Culture' (www.mers-tm.org/support/Marx_Primer.pdf).
From this, individuals would do well to remember that “we must all be held accountable for our efforts to make the system safer”, but also when judging others that “the important question is whether human factors learning from events outweigh the deterrent effect of punishment against negligent employees. If the threat of discipline increases one’s awareness of risk or at least increases one’s interest in assessing the risks, does this heightened awareness outweigh the learning from thorough error investigations”.

The paper does not provide a clear-cut answer – there isn’t one; even Dekker notes that whilst having a just culture is important, the real key to this is in who draws the line.
After reading the above, I agree that a just culture is essential, but in those few marginal cases, perhaps a tribunal consisting of Operations (safety/training), Human Factors, and Legal (company/personnel interests) might provide an optimum solution as opposed to the view of just one individual.
However, there could be some wide ranging differences amongst cultures, e.g. the extreme legalistic approach in the USA vs a more tolerant approach (trusting/ learning) in other parts of the world.

Tee Emm “…ATC who asked him to keep up speed on final approach…” is just one example of a latent factor, a systematic weakness, which both individuals and operators can challenge. See the theme in ‘Just Say No – its OK to Go Around’ (www.flightsafety.org/ppt/managing_threat.ppt).

Tee Emm
11th Sep 2009, 13:29
If Pilot Error is now an offensive non-PC term, then how come in Flight International 8-14 Sept, page 16, the headline is "China Airlines 737 Fire was due to Maintenance Error", with Japan's Transport Safety Board (JTSB) confirming that the fire which destroyed a China Airlines Boeing 737-800 in August 2007 was due to maintenance error?

Perhaps the venerable JTSB have never heard of the famous James Reason and his Swiss Cheese theory on the cause of aircraft accidents. "Maintenance Error" indeed. Wash that man's mouth out..:ok:

Miles Gustaph
12th Sep 2009, 08:40
Tee-Emm, it's not a PC issue, it's an issue of fact... if you're hell-bent on wanting to live in a world with a blame culture then go forth and live in the Middle East; it will give you the certainty that you so seem to desire.

If on the other hand you are a manager, or senior pilot and actually believe what you are saying then, as said before, get yourself onto a Human Factors course or CRM course ASAP because your attitude stinks and people with your attitude should be thrown out of our industry as soon as we can hunt them down, there is no place in modern aviation for your attitude, you are a dinosaur and I hope your species becomes extinct very soon!

To take this further, people with your Hang-Emm attitude stop perfectly decent team-playing members of our industry from reporting anything, because people like you support the sack-Emm attitude, which means SMS systems do not work because people are frightened to report or provide feedback to the SMS manager/QA/Safety etc... which means that the regulation of "The System" fails... and in case you hadn't noticed what's happened in the "real world", aviation is a self-regulating industry, so a failure of the safety/QA systems can end up in dead people. Yes, when we mess up in aviation people have this inconvenient habit of dying.

So the Hang-Emm school of thought teaches us:

Hang-Emm culture=frightened people who don't report "errors"=failure of the system=dead people

In comparison the Reason model teaches us:

No blame culture=reports=continual assessment and improvement of the system=reduction in the number of dead people

I don't know, it may be just me, but I think James Reason may be onto something here.

However, to come back to your assertion that "maintenance error" is a filthy phrase that belongs on your "hang-Emm" side of the fence: well, no it isn't! It fits the Reason model, and incidentally, if you'd read the actual report, you'd have got this...

Maintenance is a generic term to describe a number of differing activities with regard to aircraft, in the EASA zone it starts with Part-M on the Operators side and transfers across to the maintenance organization, Part-145.

The Part-145 maintenance organization will normally cover line & base maintenance, planning, technical support, Quality Assurance & several reporting systems, which include SMS.

Maintenance error in the context of the China Airlines 737 showed that there was a failure of the maintenance process, with errors at numerous stages; these would be failures of the Swiss cheese blocks that James Reason describes, so no single person to blame!
Go read the report, it's quite staggering how many individual processes had to fail to result in that accident.

Tee-Emm, there is no place for your circa-1950s ex-military type attitude in our modern industry; people with your attitude are a danger to themselves, their colleagues, the passengers we serve, and the potential third-party victims of an aircraft accident.

Get a human factors course, a CRM course or get out of our industry.

I apologize to the rest of the decent people who frequent this thread for the long post.

Miles Gustaph
12th Sep 2009, 08:57
Tee-Emm,

how can someone who wrote such an intelligent James Reason supporting post as the following have your attitude?

"Most ADI's in today's low cost synthetic trainers, offer full 360 degrees of roll and a pilot can practice unusual attitude recoveries on an el cheapo machine and be a safer pilot for it. Pull through from inverted and any aeroplane will lose considerable altitude in an attempted recovery. A pilot never having been taught to recover from that manoeuvre - or any other extreme attitude - would surely crash if faced with the real event - particularly in IMC.

So, whether you practice recovery on instruments in a 737 flight simulator or an Elite synthetic trainer; or even under the hood in a Cessna 150, the instrument indications would most probably be the same. Better a little knowledge of unusual attitude recovery technique than none at all. And certainly a lot better than reading all about it in a text book, and expecting that to get you out of trouble if you are upside down in the real thing.. "

What you advocate in your post of July 9th is the Reason Model...
Problem identified-training needed-added to training program-reduction in the accident rate.

you've just advocated the Reason model...

What gives?

frontlefthamster
15th Sep 2009, 06:05
A few words to say that I, like others, was taken in some time ago by those who claimed Jim might be our saviour...

Lovely chap though he is, there are others doing better work now, notably Sid Dekker. A quick search on Amazon will find things to brighten up your bookshelf...

Burr Styers
21st Sep 2009, 15:00
I think JR is a bit old hat these days. Sidney D is very good, but there is a new kiddy on the block - Prof Erik Hollnagel, who has written "The ETTO Principle", his view on the Efficiency-Thoroughness Trade-Off. It postulates that life is a balance between being efficient (the minimum number of steps to achieve the aim) and being thorough (taking time to achieve the aim). You can have too much of either: too efficient and mistakes occur due to shortcut-taking, too thorough and the deadline/aim is missed.

The book (via Ashgate) is very compelling, but, as with much of the abstract theory that aviation seems to attract, can I turn it into something tangible for the good and benefit of the organisation?

Have not finished the book yet, but I would encourage all safety practitioners to have a read of it.

As a f'rinstance, "Commercial pressure" is quite commonly quoted by pilots trying to get a job done. What I think they are expressing is that they are trying to be thorough in a particular part of the job (esp turnrounds), but feel pressured to sacrifice thoroughness for efficiency, and the potential for mistakes to happen increases.
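
To make that trade-off a bit more tangible, here is a toy sketch in Python - every number in it is invented purely for illustration, it is not Hollnagel's model:

def expected_cost(check_minutes: float) -> float:
    # Toy ETTO illustration - all figures invented, not from the book.
    # More minutes spent checking -> fewer undetected errors, but more missed slots.
    p_undetected_error = max(0.0, 0.30 - 0.02 * check_minutes)  # thoroughness pays off
    p_missed_slot = min(1.0, 0.04 * check_minutes)              # but efficiency suffers
    return 500 * p_undetected_error + 200 * p_missed_slot       # arbitrary cost weights

for minutes in range(0, 21, 5):
    print(minutes, "min of checking -> expected cost", round(expected_cost(minutes), 1))
# Too little checking and errors dominate the cost; too much and delays dominate.
# The workable balance sits somewhere in between - which is the ETTO point.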

He also hit the nail on the head when he says "Safety is not something you have - it is something you do". Agree with that wholeheartedly.

There is no metric in this world that can be applied to safety, but there is when it comes to "doing the do". It is the physical and tangible efforts of an individual or an organisation that will define how safe they are or not. SMS is the tangible part of an organisation that supports its day-to-day flying programme.

I spent too many man hours pursuing or trying to implement the various abstract theories that have surrounded SMS over the last few years. The penny dropped a couple of years ago, and now all is clear.

Everyone has a role, and everyone has a place in the SMS structure.

Flight ops folk tend to be too flight ops centric with regard to safety management (been there, done that); it is all about the big picture.

Aviation is a very linear and cyclical business, which makes it predictable.
(It has a beginning middle and end, what I call the "Shed to Bed" theory)

Yep, it is a collection of complexities that all need to work within themselves, and with other agencies that they have to interact with.

Where do most problems occur? Where one organised system has to interact with another organised system; it is at the point of interface that most problems occur. There are thousands of examples when you start to think about it.

In aviation today, and in my domain (UK-based regional carrier), the problems are on the ground, not in the air. And that is where we concentrate our efforts - a rich hunting ground.

Why on the ground? Because they are in a permanent battle to get the efficiency/thoroughness balance right, and are most likely to get it wrong.

Just my three penn'orth. Don't spend too much time wondering about abstract theories - just get on and "do the do":ok:

BS

MONT BLANC
21st Sep 2009, 20:02
Hi Burr,

very much agree with the sentiment of your post. Have never been entirely convinced by the Reason Model. Eurocontrol published a paper at Bretigny named "Revisiting the Swiss Cheese Model"; see that for a further discussion on this.

Erik Hollnagel I first came across at a NATO Advanced Science Institute in 1992 - and Jim Reason was there as well (played goal full back to Erik's goalkeeper). Erik has been writing for many years, and Sidney was with him at a university in Sweden (not Lund). Erik's work is particularly good.

His new book builds a lot on his previous work. If you haven't read it, I commend "Barriers and Accident Prevention" (Ashgate 2004).

Not sure Erik is the new kid on the block:) Jim Reason's latest book, particularly the last two chapters, is very Dekker/Hollnagelian!


MB

Burr Styers
21st Sep 2009, 20:40
Hi MB

perhaps I should have qualified that, and said EH is the new kid on my block. Having done a bit more research I can see he is a prolific and widely read academic.

I try to keep my range of reference books quite narrow; the ICAO publications regarding SMS are pretty good, and some academic books to stretch the mind and challenge one's biases and views are always good.

I suspect that like many aviation safety managers these days, I have to find my own personal balance between "Information pull" and "Information push", or Mr "Information Overload" becomes an unwelcome guest at the party.

Certainly EH has caused me to stop and think, and if I can turn his ETTO theory into a win for me, and a win for the organisation that I am part of - how good would that be :ok:

Thought I might create a bit of a firestorm by quoting EH who says "Safety can't be measured, it is something you do".........Blindingly simple, and absolutely the way to approach safety management.

Too often I have seen Directors/senior managers, middle managers, and the workforce at large glaze over when you start to get onto some of the abstract language and concepts that have gone with the development of aviation SMS over the last few years.

It doesn't work.

What does work, is showing an individual where they are in an organisation, what the safety expectation of them is, and what we will do to support them. I basically brief the workforce to be my spies.

If you see something you come and tell me. I'll put the mechanisms in place to do that, and then I will get those responsible for whatever area or problem it is, to deal with it...........And then we'll tell you what we've done.

And at the top of the organisation, I want you to resource me, and your company, in a reasonable measure so that we can achieve "joined up" safety.

I could draw you a picture, but briefly it is about "horizontal" communications, procedures, practices which are pan organisational, and "Vertical" communications/procedures, practices that apply to one specific area, i.e. flight operations, maintenance, admin staff etc.

I found my own answer to SMS (After 4 years of trying) and have had it in working form for about 3 years now.

Always a process of fine tuning and tweaking, but I know what to discard, and what is relevant, the CAA like it too :)

Enough from me, a glass of something cool crispy and white beckons...

BS

MONT BLANC
21st Sep 2009, 21:26
Hi Burr,

enjoyed your last post immensely. Wish I'd had access to it before I took on a safety manager role!

Obfuscation by the use of overly technical language isn't helpful. Sadly I'm very guilty of this (academic background as well as operational), so your mail is a timely reminder to watch it. But I have started to use safety concepts and philosophies that are anathema - partly to shock (not terribly helpful) but partly because I think the organisation has reached the point in its "safety" development where it needs to consider new philosophies of safety that will lead the way to a safer future.

To do this, I believe, we need to think of "safety" differently, and as you say provide the "gashshag" controller, engineer and pilot with the wherewithal and environment or context to make it happen. Very interested by what you have achieved: I suspect it is what we aspire to, but it is further away than some of us would like.

Your quote from EH on measuring safety is very apt. Lost the battle though on that one, still waiting to win in the end!

At times the lack of inertia really frustrates me, particularly when I hear the language of "old view" safety. In the last week I have heard it a lot. Indeed I challenged a very senior manager today over the use of the phrase "an own goal". Safety climate is brittle and can be swayed easily by acts with unintended consequences. Where ETTO scores, I think, is that most of the operational community can see it, feel it, and they do it every time they sit and perform the task. It is not the same for senior managers, it is argued, but they are just as subject to "ETTO" as the operational community.

Hope your glass was cool crispy and delectable

MB

Burr Styers
22nd Sep 2009, 09:02
A tenet that I have (indeed it is on the office wall) is a quote from Peter Drucker, the American economist.......

"The first duty of a business is to survive, and the guiding principle of business economics, is not the maximisation of profit......It is the avoidance of loss"

If you delete "Busines economics" and insert "Safety", it makes a whole lot of sense.

Loss is something tangible that everyone can understand, and the list is almost endless; obvious examples being loss of a hull, loss of man-hours (through injury), loss of time (caused by someone else's actions) etc etc.

By looking at those areas that are causing you loss (Recently in our case - ground incidents) then you can be objective in how you tackle a particular area.

(Giving all my tricks away here)

Another one up the sleeve.

Just deal with the top 20% of whatever is giving you grief. (Ol' Pareto was right). Focus your (and others') energies into just dealing with a short list.

My experience is that it has a depressive/suppressive effect on the other "stuff" that is going on. (Collateral effect)
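
For anyone who wants to see what that "top 20%" cut might look like against a safety database, here is a minimal sketch in Python - the incident categories and counts are made up for the example, not real data from any operator:

from collections import Counter

# Hypothetical incident counts by category - illustration only.
incidents = Counter({
    "ground handling damage": 42,
    "paperwork / tech log errors": 17,
    "pushback communication": 15,
    "FOD on stand": 9,
    "fuelling interruptions": 6,
    "other": 5,
    "de-ice delays": 4,
})

total = sum(incidents.values())
running = 0
short_list = []
for category, count in incidents.most_common():
    short_list.append(category)
    running += count
    if running >= 0.8 * total:  # the few categories driving roughly 80% of the grief
        break

print("Work this short list first:", short_list)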

You will never solve the world as a safety manager/safety dept. Just accept that s**t happens, but reduce the amount to an acceptable level.

ok, no more clues;)

Back to the day job

BS

turbocharged
22nd Sep 2009, 12:21
Jim Reason himself said at the RAeS a few years back that he felt that the pendulum has swung too far away from 'pilot error' towards 'organizational error'.

You cannot avoid the fact that problems arise because of the actions of individuals. Airlines don't crash aeroplanes; pilots do. But individuals act within an organizational context. What Reason was suggesting at the RAeS is that we need to reconcile these 2 positions.

Hollnagel put Reason into context for me when he was at Linkoping with Dekker. Bit of a throwaway line, but he said that the 'swiss cheese' isn't a model, it's a metaphor for an organization. Get your head around that and you can see why it's easy to get disillusioned with a 'model' that doesn't do anything.

Dekker says we need new models of failure. What's needed is an approach that throws light on that dynamic relationship between organizations and people.

Burr Styers
22nd Sep 2009, 13:35
TC

perhaps we need to consider what is the model for success (of anything), before we consider a model for failure.

Would a model for success have a series of elements, all interacting in a linear sort of way, that lead to a successful conclusion ?

Would a failure model be the same, but with one component that is changed/modified for whatever reason, in whatever way, which then gives a different, less-than-expected result?

I'd keep a wide open mind on this, and try not to be too aviation-specific. Good question for us all to ponder. Worth starting a separate thread on this?

BS

turbocharged
22nd Sep 2009, 13:48
There does seem to be a desire to talk more about success rather than harp on about failure these days. I suppose the problem is that after an adverse event there is a need (desire) to make sense of why things went wrong and so we look at 'failure'.

A model of success, on the other hand, would need to identify opportunities for failure and then develop preventative measures. Hollnagel's barrier approach.

If we accept that everyone is trying to do a good job, then what we need to examine is the ways in which normal operations become exposed to hazards.

Burr Styers
22nd Sep 2009, 16:07
I guess there can be a million reasons why a "something" might be successful, but it only takes one (sometimes completely unrelated) event to have a very significant adverse effect.

Agree with the human desire to order/model/make understanding of, adverse events, as that is probably the easier thing to do, and it will be very event specific.

Accidents still happen on a regular basis, and sometimes are almost identical in nature. They will also happen on opposite sides of the world, and be entirely unrelated. Doesn't stop them happening though.

There is no doubt that the more complex a process, and the more technology it involves, the more likely something is to go wrong.

You don't see much in the papers about the monks in the local monastery having a poor accident rate with regard to their ale-making activities!

(P'raps we have something to learn from those guys):}

I don't have an answer to the success/failure model - perhaps there isn't one?

BS

What Limits
22nd Sep 2009, 17:51
So are you saying that Pilot/Maintenance/Human Error is a metaphor for NEGLIGENCE?

Can anyone recall an aircraft accident solely caused by human error?

john_tullamarine
23rd Sep 2009, 01:00
So are you saying that Pilot/Maintenance/Human Error is a metaphor for NEGLIGENCE?

The legal folk probably would go down the black and white negligence path. However, Industry folk are more interested in the shades of grey. Current thinking is tending along the path of

(a) we have come a long way in tidying up the systemic problems

(b) the individual and his/her individual errors are becoming a significant part of the whole and we now need to skew our attention back, somewhat, toward the individual. Doesn't matter what discipline the individual is involved with .. pilots, maintainers, techo folk etc.

turbocharged
23rd Sep 2009, 08:37
Not sure who WL's comment is addressed to but, first, I thought it was now accepted that almost all adverse events are solely caused by human error. I can think of some aircraft destroyed by debris from buildings trashed by hurricanes (Homestead AFB a few years back, for example) but the rest is down to us.

However, we need to be careful to distinguish between accidents that arise from operations exceeding a threshold of controllability and those arising from acts of negligence.

When I was talking about a model of success earlier, I had in mind the sort of training needed to make pilots more sensitive to the cues that indicated that they were close to being in a state of 'uncontrol'. Elsewhere on this site there is a video of the ALPA safety conference closing address. The speaker is actually making this point. Current training and checking regimes have been trimmed to the extent that some pilots are unaware of the performance margins available to them ... and so they screw up. That's error, but it's not negligence.

What Limits
23rd Sep 2009, 14:25
My comment is aimed at the forum in general.

What is an error? I believe it to be 'an undesired state'. How do we arrive at this?

If a person makes an error, why do they make it?

Lack of knowledge
Lack of understanding
Lack of ability
Lack of process

For each of these examples (non-exhaustive) there is some degree of negligence on the part of the person or the system.

I hypothesise that system error accounts for around 80% of all undesired states. These are the errors that need to be managed first because they are the easy ones to correct.

Changing human behaviour will take many generations although I personally do not support the position that rationalised training programmes or increasing automation in aviation will make a positive contribution to reducing errors.

Burr Styers
24th Sep 2009, 14:06
Hi WL

Your post leads to something that I have been looking into this year and that is "Threat and Error Management" - TEM.

"TEM is a conceptual framework that assists in understanding from an operational perspective, the inter relationship between safety and human performance, in dynamic and challenging operational contexts"

The components of a TEM model are;

Threats (usually of the environment)
Errors (usually of people, or latent errors of the system)
Undesired aircraft state (Expensive clanging noises)
Countermeasures. (Happiness)

A threat and/or error can lead to an undesired aircraft state. An examination of threats and errors will lead to "Countermeasures", which are either "Hard" (SOPs, briefings, training etc) or "Individual and Team", which covers planning, execution, and review.
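
As a rough sketch of how a single event might be written up against those four headings, here is an illustrative Python structure - the field names and the example entries are my own invention, not Maurino's or ICAO's template:

from dataclasses import dataclass, field
from typing import List

@dataclass
class TEMRecord:
    # Illustrative structure only - not an official TEM form.
    threats: List[str] = field(default_factory=list)        # usually of the environment
    errors: List[str] = field(default_factory=list)         # people, or latent system errors
    undesired_state: str = ""                                # the expensive clanging noise
    hard_countermeasures: List[str] = field(default_factory=list)  # SOPs, briefings, training
    team_countermeasures: List[str] = field(default_factory=list)  # planning, execution, review

# A hypothetical ramp incident written up under the four headings
event = TEMRecord(
    threats=["night turnaround in heavy rain", "congested stand"],
    errors=["tug driver not briefed on the revised pushback route"],
    undesired_state="wingtip contact with ground equipment",
    hard_countermeasures=["revised pushback SOP", "wing-walker required in low visibility"],
    team_countermeasures=["pre-push briefing to cover route and obstacles"],
)
print(event)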

The model can be used in several ways;

1.Debrief of a specific incident, or
2. Reviewing a cluster or trend of incidents, or
3. Review of your safety database

I have used it in all three instances, and this summer ran a series of workshops to tackle some problems we were having in ground operations.

We found out so much that we just did not know about, basically because people don't have the time to make safety reports (in the ground ops/ramp environment)

We were able to come up with a range of countermeasures, suggested by the very people who we initially thought were part of the problem; they actually became part of the solution - how neat is that?

This stuff works in a real world environment.

Have a Google of "Threat and Error Management" and then put in Capt Dan Maurino.

He has written an excellent paper, which led me to other research, which led to the construction of my own model, which we have used in the company, which gave us answers (countermeasures) - and as I look at the ground incident stats, they haven't quite collapsed compared with last year, but it's a big positive move in the right direction.:ok:

Don't get hung up philosophising about errors and blame, just go and "do the do"

BS

Miles Gustaph
28th Sep 2009, 18:51
"So are you saying that Pilot/Maintenance/Human Error is a metaphor for NEGLIGENCE?

Can anyone recall an aircraft accident solely caused by human error?"

Human error pilot:
Emirates EK407 - wrong data entered into computer
The Airbus at the Paris airshow - pilot switched off the safety computers

Human Error engineer:
BA 5390 - engineer fitted wrong screws to window, 2nd engineer did the same
JAL 123 - engineers incorrectly repaired the pressure bulkhead

Human error pilot & engineer:
Olympic - classic train of errors, engineers left valve open, pilots ignored the warning.
Tuniair - Sicily 2005 - pilot prayed instead of doing emergency drills, engineers didn't fit a bit correctly

And as for "Human Error is a metaphor for NEGLIGENCE" no it's not and with an attitude like that I feel sorry for anyone who works with you or for you. An error, (one of many) is just one part of a chain of events leading to an event. Negligence is acting in a manner that is likely, or you know could lead to an event in it's own right.

Or you could just wiki negligence and read what it says...

Genghis the Engineer
29th Sep 2009, 12:08
I'm working on a report on a light aircraft accident at the moment - four separate errors I can identify that led up to the accident, all ultimately by human beings (or organisations run by human beings), and the good old Swiss Cheese model covers it very well since I can show how any of them, handled differently, could have prevented the accident (well, three out of the four anyhow).

The only thing that doesn't quite accord in this particular case with the standard Swiss Cheese diagram is that it didn't finish with an obvious operator error - simply that too many opportunities had been missed to trap a structural fault.

MG:
pilot prayed in stead of doing emergency drills
Thank you, that just made my (agnostic!) day.

G

Clandestino
29th Sep 2009, 22:49
Sorry Genghis, but our Miles Gustaph is pretty unreliable here. A good text on the Tuninter ditching was written by Mark Lacagnina and can be found in the July edition of Aviation Safety World. It nicely shows how a train of small undetected typing errors led to the wrong type of FQI being installed, and a lax attitude towards expected vs. actual fuel uplift sealed the flight's fate. The F/O's prayer is not mentioned, but yes, he said a brief prayer; it didn't affect the outcome much. I guess it was just a well-mannered man's reaction to extreme stress. If I found myself in the same situation, the CVR transcript would be full of #s.


Human error pilot:
Emirates EK407 - wrong data entered into computer
The Airbus at the Paris airshow - pilot switched off the safety computers

Where I fly, the procedure is for each pilot to calculate the takeoff speeds independently, and they are compared before being set. The chances of both pilots making exactly the same mistake and short-circuiting the check are pretty small, especially as more often than not the speeds do match. If this or a similar procedure was in place at EK but the pilots didn't follow it, then it is pilot error. If not, it is still pilot error but enabled by organizational error. I need help on resolving this, as I didn't follow the EK thread closely.

It wasn't the Paris show, it was Habsheim! It was a case of a poorly planned flypast with even worse execution, and the PIC of the flight did prepare the rope to hang himself. Switching the alpha floor off was nowhere near the top of the causal factors. However, three dead people were the result of some AF genius authorizing the carriage of passengers on an airshow flight.

Olympic (...) Tuniair
It was Helios and Tuninter!