
View Full Version : Air accidents at record low


isi3000
3rd Jan 2008, 13:14
Air accidents are at a record low. The number of air accidents fell to its lowest level for more than 40 years last year.
There were 136 accidents worldwide over the past 12 months against 164 the previous year, making it the best year since 1963.
Aviation in general is becoming safer every year :O

the_hawk
3rd Jan 2008, 13:27
google hints at source being

http://www.baaa-acro.com/Communique%20de%20presse-UK-010108.htm

(text in link explaining numbers as well)

cited eg by http://www.theaustralian.news.com.au/story/0,25197,23001414-12377,00.html

Captain Planet
3rd Jan 2008, 17:04
Don't tempt fate!!!

Loose rivets
3rd Jan 2008, 17:14
Don't tempt fate!!!


Yes, it's funny how even scientists feel reluctant to prod at fate.


As I said on t'other thread, it does seem that SOPs and other modern teachings, coupled with high-tech systems, are countering a lower experience level in crews. I was once told that I needed 8,000 hours (that's not a mistake) to fly a certain Apache out of Luton. Things have changed a tad.

However, I'm mindful of one of my favorite sayings. 'Randomness comes in lumps.'

All the best for a safe new year.

isi3000
3rd Jan 2008, 17:21
lower experience level in crews

does not apply to all crews, and training has improved greatly!:p Although next year could be totally different, let's not speak too soon...

lomapaseo
3rd Jan 2008, 17:36
Great play on Title wording:)

Anytime you have more than 5 minutes between accidents, one could argue that air accidents are at their all-time low of zero in between accidents.

On the other hand, if one considers that both pro-active and re-active preventive measures take years to implement and to generate a meaningful statistical effect, then maybe a 5-year rolling average rate would be a more realistic measure, although it makes a lousy headline or grabber thread title.

the_hawk
5th Jan 2008, 15:54
Here are the numbers from http://aviation-safety.net

("number of fatal aircraft accidents and total number fatalities involving civil multi-engine airliners of which the basic model has been certified for carrying 13 or more passengers")

year, #accidents, #fatalities, øaccidents, øfatalities (ø over the last 5 years)
leading zeroes for formatting :)


2007 26 0750 28 0761
2006 27 0888 30 0831
2005 35 1059 31 0807
2004 28 0429 31 0812
2003 25 0679 34 0860
2002 37 1101 36 0968
2001 28 0768 37 0996
2000 36 1082 42 1206
1999 42 0671 45 1224
1998 39 1219 47 1382
1997 42 1240 49 1366
1996 52 1817 52 1426
1995 50 1173 52 1288
1994 53 1462 50 1192
1993 48 1138 52 1205
1992 57 1541 54 1206
1991 53 1125
1990 39 0693
1989 61 1530
1988 59 1143


I think we can speak of decreasing numbers here ;)
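For anyone who wants to check the ø columns, a short script (mine, not from aviation-safety.net) reproduces the 5-year trailing averages from the raw counts posted above:

```python
# Fatal-accident counts and fatalities from the table above, most recent first.
years = list(range(2007, 1987, -1))  # 2007 down to 1988
accidents = [26, 27, 35, 28, 25, 37, 28, 36, 42, 39,
             42, 52, 50, 53, 48, 57, 53, 39, 61, 59]
fatalities = [750, 888, 1059, 429, 679, 1101, 768, 1082, 671, 1219,
              1240, 1817, 1173, 1462, 1138, 1541, 1125, 693, 1530, 1143]

def rolling5(series):
    """5-year trailing mean (the given year plus the four before it), rounded."""
    return [round(sum(series[i:i + 5]) / 5) for i in range(len(series) - 4)]

avg_acc = rolling5(accidents)
avg_fat = rolling5(fatalities)
for y, a, f in zip(years, avg_acc, avg_fat):
    print(y, a, f)
```

Running this reproduces the ø columns exactly (e.g. 2007: 28 accidents, 761 fatalities on average over 2003-2007), so the posted averages are trailing 5-year means.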

frontlefthamster
5th Jan 2008, 16:12
Without wishing to cause this thread to drift, I'll mention first that 2007 was a bad year for GA and sport aviation fatalities in the UK, and that no-one has posted a comparison between normalised fleet age and fatalities, to which the cognoscenti would refer before drawing other conclusions... :cool:

Oftenfly
6th Jan 2008, 04:57
Wait a minute... Before comparing one year with another, using data such as those in the post from the hawk, we need to standardise the numbers of accidents for activity level. That is to say, we need to compute the accident rate, not just the raw number of accidents.
There is a variety of ways of defining accident rate: it could be the number of accidents per cycle, or per passenger-kilometre flown, for example. It's a little worrying that, in their announcement, B.A.A.A./A.C.R.O. does not seem to have done this, but maybe the rates are given somewhere else in their database.
Of course, our intuition tells us that the number of cycles and the number of passenger kilometers were both higher in 2007 than in 2006, and so if the raw accident data show a decrease then so too should the accident rates. But the numbers that would go into the denominator (that is, the number of cycles, or the number of passenger kilometers, etc) probably haven't been increasing for each year over the last decade (consider the post 9/11 effect), and so we need to be cautious in interpreting long-term trends.
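To make the rate-versus-count point concrete, here is a toy calculation. The departure figures below are invented round numbers, not real traffic data; they only illustrate how a falling raw count combined with a rising denominator drives the rate down faster than either alone:

```python
# Raw accident counts from the article; departure figures are HYPOTHETICAL,
# chosen only to illustrate rate vs. count, not taken from any database.
accidents  = {2006: 164, 2007: 136}
departures = {2006: 21_000_000, 2007: 23_000_000}  # invented denominators

rates = {y: accidents[y] / departures[y] * 1_000_000 for y in accidents}
for year in sorted(rates):
    print(year, round(rates[year], 2))  # accidents per million departures
```

With these made-up denominators the rate falls from about 7.8 to about 5.9 accidents per million departures, a steeper relative drop (roughly 24%) than the 17% drop in the raw counts alone.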

Southernboy
6th Jan 2008, 09:52
It does look encouraging & if accurate, obviously a welcome trend but remember Mr Disraeli - "Lies, damned lies and statistics."

As lomapaseo says it takes years to develop safe practice but not long to lose it. Be wary of managers using this to deny the effect of pilot fatigue/poisoned cabin air/exhausted engineers etc.

Having made the industry safe we need to keep it that way and the first place complacency will emerge is in the boardroom.

Huck
6th Jan 2008, 09:57
Ten years ago, at an ALPA safety conference, we were told that if the accident rate was as bad in 2000 as it was in 1960, we'd have a major accident every week (in the US).

And if the accident rate is as bad in 2030 as it was in 2000, we'll have a major accident every week.....

frontlefthamster
6th Jan 2008, 17:33
Southernboy,

Safety improvement at present is coming from the industry trend for new aircraft in high-capacity (and thus modern) operating environments. New aircraft are more economical and better proofed against fatigued, inexperienced, and poorly-trained pilots.

Decisions which are good for shareholders are good for safety.

PJ2
6th Jan 2008, 18:25
Decisions which are good for shareholders are good for safety

Exactly. Now, if only the CEO, Executive management and the shareholders could realize this and support those "expensive" safety programs which protect their livelihood and investment.

frontlefthamster
6th Jan 2008, 18:37
PJ2, you miss my point. Buying modern aircraft and operating them in well-developed environments is worth far more, in safety terms, than any airy-fairy 'safety initiative'. :rolleyes:

These modern aircraft, and their operational environment, support safe operations by borderline-competent crew on a daily basis.

PJ2
6th Jan 2008, 22:28
frontlefthamster;

I did miss your point, thanks.

I would add to the notion that aircraft today make it safer by virtue of their design and reliability, that there are alongside such primary contributions, many other processes which, because they are "normal" to the daily ops are transparent to the operation and crews yet nevertheless are effective in avoiding serious incidents. One example might be an organizational decision to create policies (vice relying upon ops-specs or regs) regarding "Low visibility operations"; who does the approach under what conditions, etc. There are many other such examples.

Safety initiatives such as FOQA/FDM, ASRs, LOSA etc are.... "airy-fairy"?

Southernboy
7th Jan 2008, 08:01
Of course new aircraft are the reason, and v welcome for all that, but are you (FLH) suggesting that we can now sit back & ignore all the hard-won lessons honed when a/c were unreliable?

New technology did not save the numerous Indonesian nor the recent Turkish CFIT victims, and many others, did it?

Tired poorly trained pilots can still kill people, hi tech or no.

frontlefthamster
7th Jan 2008, 09:22
Tired poorly trained pilots can still kill people, hi tech or no


True, but at a rate, and in parts of the world, where it seems 'acceptable'. :uhoh:


FOQA/FDM


A bit airy-fairy but a useful tool to keep folk worried about doing anything out of the ordinary. :p

ASRs

Quite airy-fairy, although correctly used, the system can have significant impact in reducing, for example, ramp events (which are very costly). :ok:

LOSA

The most airy-fairy of the group you put forward, but a great place for wannabe academics to pretend that they aren't dull, boring pilots. (I think I got that the right way round...) :oh:

Mungo Man
7th Jan 2008, 13:46
Looks like the 2008 stats are already counting...

PJ2
7th Jan 2008, 23:19
flh:

Quote:
Tired poorly trained pilots can still kill people, hi tech or no
True, but at a rate, and in parts of the world, where it seems 'acceptable'. :uhoh:

Quote:
FOQA/FDM
A bit airy-fairy but a useful tool to keep folk worried about doing anything out of the ordinary. :p

Quote:
ASRs
Quite airy-fairy, although correctly used, the system can have significant impact in reducing, for example, ramp events (which are very costly). :ok:

Quote:
LOSA
The most airy-fairy of the group you put forward, but a great place for wannabe academics to pretend that they aren't dull, boring pilots. (I think I got that the right way round...) :oh:

FOQA, ASR and LOSA Programs don't need defending from me of course but I am interested in the views expressed nevertheless. I am not assuming naivete here at all and am taking your response seriously.

Would you stop work on "aviation safety" at the aircraft-design stage then? What are your views on the role of the regulator, and of individual airline policies and procedures, including SOPs both in and beyond the cockpit?

I ask because in one approach, unacceptable risk is (so your line of thought appears to suggest) designed out of the system, whereas in the other approach some level of risk, acceptable or otherwise, is assumed to always be present, and so programs are designed to highlight such areas so that something can be done to mitigate risk-made-visible.

The notion that the rate of fatal accidents is somehow more acceptable in Africa than "elsewhere" depends upon whose views you are invoking. I doubt that ICAO, IATA or even the FSF would concur with your opinion. The only logical conclusion one may draw from the observation, then, is that perhaps the Africans themselves accept the fatality rate as somehow "inevitable"? Is this more of a political observation than a tactical one? Either way, the point requires clarification.

I note a strong disdain for the work of "academics, pretenders or otherwise". I don't know how one actually distinguishes "a wannabe" from a genuine academic, but if I pursue that, this thread will likely be moved! :) I think suspending judgement in favour of curiosity is a better approach, and is always a healthy attitude, but only on the basis of the work produced. If it's poor work (and there is a lot of it around, and a lot of guys out there stumping ideas and selling books, though I note the snake-oil doesn't survive very long), then academia needs to hear about it. Otherwise, dismissing "academics" for the sake of it is simply an irrational prejudice and not worthy of further discussion. I doubt that's what you meant, but that is for you to clarify, not me. For sure there are charlatans everywhere, in all walks. One's notions must either withstand the marketplace of ideas or fail on their own merit (despite individual attitudes towards either profession). That goes for pilots and academics alike, although the results of anyone "pretending to the throne" as a pilot are clearly more serious. The business has a way of winnowing, as you likely know.

For me, and this will not come as a shock I am sure, I see a partnership between knowledge and design. Some here have said it differently - bad pilots still kill people regardless of how push-buttony, CRT'd and idiot-proofed the craft they're flying is. Not sure if you fly the Airbus product, but that airplane will still very nicely and smoothly fly you into the ground. Are you a fan then of auto-go-arounds, auto-TCAS interventions and auto-CFIT responses? My assumption is that you are, for if you are not then there is indeed a limit to which design and automation will intercede in flight path control beyond the crew's authority, which then clearly points to data-gathering, risk-management and human-intervention responses.

FWIW, I find your challenge very interesting indeed as one place to find either justifications for, or the true criticisms (and change) of, any such "airy-fairy" work particularly in aviation where very expensive fluff does not belong, is from these very observations. Just because something is complex, has a long history and is the creation of and subject of many academic works does not automatically imbue them with the royal jelly.

But I will leave you with one example, and you can either take it from there or I will assume you weren't serious in the first place and were just having fun with the remarks made: Don Bateman, the inventor (literally) of GPWS and later EGPWS, has almost certainly (although I don't have the exact numbers) single-handedly saved more lives from his chair in academia than the Airbus design itself. That's obviously a guess, but I think it's a pretty good one, given that most airliners carry Don's invention and perhaps even EGPWS, but not all airliners are Airbus 320 fleet types - in fact, most are not. Nevertheless, there it is: a blend of academia, intelligently conceived and applied, intelligently comprehended and implemented (where worthy, which, see above), and good design such as the Airbus (with which I heartily agree with you, having flown and instructed on the 319/320/330/340 types for many thousands of hours) is the best recipe for safe flight. I believe that an unquestioned disdain for academia and its products and place in aviation is a significant mistake, but that is a personal view again which needs no defence here. The marketplace of ideas, and perhaps the statistics, have already independently voted.

Best,
PJ2

lomapaseo
8th Jan 2008, 00:01
PJ2

You seem to have accepted and understood one poster's invented word "airy-fairy", and together you have cornered this thread into a one-on-one match between somewhat unequal backgrounds.

Preferring not to step into a sparring match between only two people, I'll just sit this one out until I see something productive to discuss.

alf5071h
8th Jan 2008, 00:48
PJ2, I agree with much of your recent posts.
However, one minor amendment; surely Don Bateman would be preferred to be described as an ‘Engineer’ and not as having a ‘chair in academia’. His achievements are most laudable and have contributed a high proportion of the reduced accident rate, but his work is more than designing and sitting back. He has pursued the avoidance of CFIT relentlessly, over many years challenging the FAA’s objections, encouraging ICAO and FSF with their education programs, and harassing manufacturers and operators to install the equipment. The momentum for safety has been maintained with new and insightful technology, presentations, and publications; every pilot should acquire a copy of Terrain II and ‘From Take off to Landing’ (out of print).

The appropriate use of technology will be a major aid to maintaining a low accident rate. Automated responses must be considered as the abilities of human factors training, involving knowledge and behaviour, reach either a cost-effective or a self-limiting plateau – a possible reason why pilots do not pull up after a TAWS warning?
However, there is still room for improvement in the organizational human activities: why don't all manufacturers fit the latest technology or update existing systems, why don't operational management buy the technology, invest in training, or publish procedures? Financial constraint is probably a major influence, as are those international cultural differences cited previously.

The need to reduce the number of accidents as the industry expands, in order to maintain or lower the current accident rate, is based on the premise that the public will not accept any increase.
I wonder if anyone has considered that this might be incorrect. Consider some of the ideas of Charles Perrow – 'Normal Accidents'.
A normal accident typically involves interactions that are "not only unexpected, but are incomprehensible for some critical period of time."
A normal accident occurs in a complex system, one that has so many parts that it is likely that something is wrong with more than one of them at any given time.
Taking these points to the extreme: once safety efforts reach maximum effectiveness, accidents will still increase as systems grow more complex – will this be acceptable? Intuitively we might answer no, but then consider road accidents and the public's perception of these.
I don’t have any suggestions; I support current technological and human based safety initiatives, but that might be because I haven’t thought about the problem or projected – what if – far enough ahead.

PJ2
8th Jan 2008, 01:50
alf5071h;

Yes, I think Don Bateman would prefer "engineer" and the example, reading it again, was a bit far-fetched. After I posted I thought of all the engineers who worked on the Airbus (for example) and they're cut from the same cloth as Mr. Bateman is so the example is a bit stretched.

I am well aware (from your posts) that you're familiar with all this, but for the purposes of a bit more discussion for others: the industry has largely conquered the historical causes of accidents such as mechanical failure, navigational error, weather, collision/CFIT and ATC issues - all those areas which, through technical achievement, have built safety into today's aircraft. The area which remains the largest single factor in accidents is the human factor, and it is here that "academics" can contribute best. Such learning results not in "improved turbine blade design", etc., but in safety policies which, through academic research, have shown where the human weaknesses and strengths are. In fact I would submit that pilots in and of themselves (unless trained, or having taken up the field out of interest) are not very good at "human factors", as it is not their primary bread-and-butter. Further, the pointy-end is not the place where policy is, or should be, created and set (even though airlines will routinely leave a crew out on a limb "making up policy" where there should be an overarching one in place - another topic entirely). Such protective layering comes from the academic research of, and I think I am on safer ground here, people like Jim Reason (and earlier proponents of organizational accident theory such as Heinrich) and Robert Helmreich, who come readily to mind, they being among thousands of like-minded individuals whose expertise lies well beyond the cockpit and who provide academically-based guidance material for the harder-wired safety programs such as FOQA/FDA, LOSA and even ASR programs.

The point of the post being, of course, that academic research has its place, and that a dismissive attitude towards such programs is harboured at one's own, as well as an organization's, risk. The levels of safety achieved in the industry come not only from good and clever design by the engineers but from those who take human factors seriously, those factors being best expressed (made visible) in detailed data programs such as those under discussion.

The notions expressed in Diane Vaughan's "The Challenger Launch Decision; Risky Technology, Culture and Deviance at NASA", and the work done on Columbia, "Organization at the Limit - Lessons from the Columbia Disaster" have resonance in airline work. Although far, far less risky, the human factors such as the "normalization of deviance" (normal accidents) are the same in any organization, a fact which continues to be lost in the white noise of cost-control and profit... - another topic.

lomapaseo, the idea was not to monopolize the dialogue and drive out others thereby, (although I hardly think I qualify as having the power, that power residing only in one's choice to participate), but to continue an interesting and valuable point regardless of how it was originally conceived. I have had thoughts along the lines expressed by flh not because I think that such programs are as was described but because I think that a continuous examination of one's assumptions even in the face of broad acceptance, is valuable and I thought this was an opportunity to do so. Sorry it took the thread off topic...

best,
PJ2

4Greens
8th Jan 2008, 06:28
Every accident has, as a contributory factor, human error.
Human error is normal. It is how we learn, amongst other things.
Accidents are normal.
As a preventative measure, a knowledge of human factors is vital.
The only purpose of investigation is to prevent the next accident.
At the moment the best approach is defence in depth.
The more defences the better.

Signed 4Greens

Ex Operator and Academic

Southernboy
8th Jan 2008, 08:30
Succinctly put and a return to where I came in. The surest way to see the numbers go back the wrong way is to ignore all the human elements that still cause accidents.

So, I stick with my observation that tired, poorly trained pilots working for companies that cut corners remain the vulnerable area. Just because UK/EU/US hasn't had a major accident for a while doesn't mean we can relax.

An ex colleague with a large regional carrier tells of v poor standards right here and now. Let's hope the holes remain non aligned.

DODO97
8th Jan 2008, 14:22
Hope it continues to be that way. Although we can never relax about it, because fatigue can set in without warning and applying the wrong procedure is inevitable if you're too tired to think.

alf5071h
8th Jan 2008, 19:10
PJ2, re “The areas which remain the largest single factor in accidents is the human factor and it is in these areas that "academics" can contribute best.”
Our opinions appear to diverge at this point.
Human involvement is the largest factor in accidents, but I am not convinced that academics can provide the best contributions for improving safety.
There are and will be notable exceptions, but in very general terms, even where the academics identify critical areas affecting safety, they fail to deliver practical solutions, e.g. Helmreich defined CRM as "the application of human factors", but he has not provided a good workface application / implementation of CRM (or is TEM/LOSA another attempt?).

The dependence on academics is perpetuated by the regulators; IMHO there are very few 'HF'-experienced pilots in administration, and the regulator/researchers, although domain-focussed, are often years behind industry's needs. Even when using basic academic research, only that which is proven is used, thus it may be quite old or based on dated operational information in a rapidly changing world.
The problem is twofold: first, the regulators believe that they can regulate safety with 'laws', and second, that they can apply the academic output directly. Whereas safety (the maintenance of the good safety record) actually requires a good working partnership with industry (a combined safety culture), involving a two-way dialogue about problems and solutions, and wide-ranging guidance materials to aid implementation of the regulations. Again there are exceptions.

Some of the better training / safety applications originate from the pilot-academics (not necessarily academic-pilots), but the most effective people are experienced pilots who can 'translate' applicable research and provide practical applications. These and similar people with knowledge and experience in the industry are able to identify the most important safety issues. Although these issues may be identified by the academics, they often remain hidden in theoretical general principles or as unproven hypotheses. Then there are many subjects where the academics do not agree, e.g. decision making is a critical safety area in aviation, yet only recently has industry been apprised of 'Naturalistic Decision Making', which represents what actually happens in operational situations. Not all academics agree with the mainstream NDM theories, but for those who do, the descriptions of their work are best translated as 'aspects of airmanship'; so the academics now tell us that we need to teach airmanship.
I am sure that there are similar examples from TEM and LOSA – new-found safety initiatives that, when deciphered from academic/regulatory language, match what many experienced pilots have been doing for years. The problem then is that of educating – training, time, and money; thus these are the emerging threats to the good safety record.

As stated in previous posts the industry requires defences in depth, but which are the best defences, where do we put our scant resources, can the operators withstand any more initiatives / regulations?
Perhaps the academics could answer these questions, but even the astute James Reason did not provide guidance (the "how to") when he said that it is now time to 'pick the high fruit' (the difficult-to-reach targets) if safety is to be improved; unless of course he was referring to operator management or even the regulators.

4Greens
8th Jan 2008, 21:41
Aviation degree programmes, on one of which I lecture, aim to redress this mismatch.

PJ2
8th Jan 2008, 22:56
alf5071h;

Some of the better training / safety applications originate from the pilot-academics (not necessarily academic-pilots), but the most effective people are experienced pilots who can 'translate' applicable research and provide practical applications.

I think we're closer in agreeing than my posts may indicate.

The gulf between academic research, production and the conveyance of same in practical application for the industry is a significant impediment to progress, not to mention the other significant issue which is, how to divide the paltry resources typically dedicated towards such applications.

For example, I've read the work on HFACS and listened to the authors' presentations but I find it difficult to apply from a crew's point of view even though I know the authors desired to make the system entirely practical to daily ops. The difficulty is in translating the understandings into programs which make a difference. The greatest difficulty is in convincing the financial people that such programs are valuable, as you say. Also, after all this time during which Reason's work has established itself, coming to terms with the swiss cheese model as it may apply to discerning what is and is not a weakness in a safety system so that already-thin resources can best be directed, is not a straightforward organizational task.

To me this is where SMS fails badly, as almost nobody in a flight operations department really understands safety, and so they are prone to all manner of fads and strange ideas as well as to ignorance of the true threats. In training sessions (annual recurrent safety training), the swiss cheese poster was always posted and people always talked about the holes, as is sometimes blithely trotted out here, but rarely with any true understanding. We would discuss accidents and causes with the idea that personnel behaviour would change, but we were never allowed to discuss organizational causes because the facilitators had a script and the topic wasn't in it, nor were the answers to the questions.

I think your term "pilot-academic" vice "academic-pilot" is an important distinction and one which, with a couple of reservations, I agree as well. The ability to take the long-learned skills and experience out of the cockpit and translate them into research and then convey findings in a way that makes sense to line crews as well as operations people is a key aspect here and is rare, as you say. Rather than repeating one key statement, I'll quote it as again, I fully concur:
Whereas safety (the maintenance of the good safety record) actually requires a good working partnership with industry (combined safety culture) involving a two way dialogue about problems and solutions, and wide ranging guidance materials to aid implementation of the regulations. Again there are exceptions.

SMS is in and of itself a superb change in traditional approaches to safety, in part removing the notions of blame and enforcement from the mix and requiring that data be collected and examined for trends. (That such collected data remains unprotected from non-aligned interests such as lawyers etc is a related matter only). Few operational people understand SMS however and, as I have posted elsewhere, think that safety is now "everybody's business" believing that safety is ensuring that others don't "run with knives" while leaving even further behind the larger institutional programs which, for better or worse, have resulted from academic research and implementation.

4Greens
9th Jan 2008, 00:36
I reiterate. A number of our graduates are now serving in airlines. Their syllabus included Flight Safety and CRM both in theory and practice. SMS was of course also taught.
I am convinced that this is the way to improve safety by bridging the gap between academia and the coal face.

PJ2
9th Jan 2008, 06:00
4Greens;

I have put forward some fairly definitive views on SMS and its potential effectiveness. Others in Canada have testified along the same lines at a Senate hearing early last year. I would be interested in your views from your vantage point, on the practicalities of SMS based upon the criticisms I and others have offered.

It is reasonable to accept, in theory, SMS as a positive step forward providing it is fully embraced and resourced by airlines, but I don't believe it is, and as such it cannot work in practice. I would be interested in hearing otherwise, and why.

Cheers,
PJ2

4Greens
9th Jan 2008, 06:19
PJ2,

SMS is a good approach but it still requires a competent Regulator to audit the operators. In other words it is not the complete answer but then nothing is.

Defence in depth is always the way. SMS, HF, Audit, FOQA, good training, maintenance etc. Last but not least, guard against complacency.

Rgds 4Greens

captplaystation
9th Jan 2008, 15:31
What these statistics overlook is that the difference between an incident and a (big) accident is often just down to luck. Read the safety review for the year in (for example) Flight International and you will always identify a few very, very near misses that would have been enough to throw the statistics totally the other way. Last January, for instance, an F100 had a take-off incident at Pau, with the sole casualty being the unfortunate car driver who was squashed after the aforesaid Fokker took off, stalled, and crashed back onto and overran the departure runway. The difference in outcome between this and a very similar fatal accident a few years ago involving Macedonian Airways is purely luck. The AF and Iberia A340 overruns could similarly have finished very badly. Counting bent airframes can also be useful when you assess safety, but that too fails to draw your attention to the near misses. The difference between an incident and a fatal accident unfortunately owes as much to luck as it does to skill, training or experience.
On that note I wish you all, a Happy and Lucky New Year

CONF iture
9th Jan 2008, 16:41
That event (http://www.liveleak.com/view?i=ac4_1173146230) is not from 2007, but it is posted just to emphasize the preceding post.

PJ2
10th Jan 2008, 00:00
alf5071h;

Further to my response, and in the light of the discussion, I would like to add a comment from the D&G PPRuNe thread, posted by "spanner90", post #100 at (http://www.pprune.org/forums/showthread.php?t=307552&page=5);

I'm sure everyone who has completed HF training will remember, that if a potential failure path is identified, you should put in place another line of defence. In other words, if the holes in the cheese look like they might line up, put another slice of cheese in.

I think that poster said it better than I...an identified HF problem where there is no technical fault involved demands as a response, a line of defence. Such defence can be a change in existing procedures, a new procedure or a new rule or SOP.

I would only add that such a line of defence from an identified problem should be a matter of overall Flight Operations/Flight Safety policy and not attached to any one individual or position as is the case many times, even in large organizations; - (The former installs the procedure for all time and all employees, the latter is followed only as long as that person or that position is around).

alf5071h
10th Jan 2008, 19:36
PJ2, defending against an identified failure path is good practice, but a slice of cheese is not necessarily a line of defence.
“Put another slice of cheese in” is a common view of the Swiss Cheese model; however, IMHO Prof Reason portrays a different aspect.
The added slice is not a steel barrier, it is just cheese, and like all of the other slices it could have holes in it. From a numerical view, the more slices the better, but if there is a common origin of the holes, it could apply to all slices simultaneously, including the new one.
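That numerical point can be illustrated with a minimal Monte Carlo sketch. The hole probabilities below (p_hole, p_common) are purely hypothetical values chosen for illustration, not figures from Reason's work: with independent holes, each added slice cuts the breach probability multiplicatively, but a common-cause condition that holes every slice at once puts a floor under it that extra slices cannot remove.

```python
import random

def breach_prob(n_slices, p_hole=0.1, p_common=0.0, trials=100_000):
    """Estimate the probability that a hazard passes through all slices.

    p_hole:   chance each slice independently has a hole in the hazard's path.
    p_common: chance of a common-cause condition (e.g. an organisational
              latent factor) that opens a hole in every slice at once.
    """
    breaches = 0
    for _ in range(trials):
        if random.random() < p_common:
            breaches += 1  # common-cause hole defeats all slices together
        elif all(random.random() < p_hole for _ in range(n_slices)):
            breaches += 1  # independent holes happened to line up
    return breaches / trials

random.seed(1)
for n in (1, 2, 3, 4):
    print(n, breach_prob(n), breach_prob(n, p_common=0.01))
```

With independent holes the estimate falls roughly as 0.1 ** n, but with p_common = 0.01 it flattens out near 0.01 no matter how many slices are added: more cheese is no substitute for fixing the common origin of the holes.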

Some holes may be permanent, resulting from latent factors, or they may open and close as opportunity for error or ‘the unsafe act’ arises (context / situation dependent).
The latter are predominantly the holes due to human aspects (error, unsafe acts) and are always present, even if not in the same place. Defences for these may involve complex solutions – it’s like trying to hit a moving target; however, this is still a necessary task, as HF training can reduce error, e.g. situations might be better recognised and behaviours shaped to avoid the unsafe acts – an appropriate course of action is chosen – improved decision making.

Prof Reason argues that a more effective defence is to rectify the organisational / management aspects (latent issues – static holes), as these may apply to many or all of the slices, and because these issues might be identified more easily. Some may involve higher organisational aspects such as regulation or the wider operational system (industry norms, culture).
The solutions for these differ from the operational HF aspects, which tend to focus on ‘the last slice’ and the situation in which an accident can occur (the sharp end / outside of the cheese); the management aspects require a rigorous process equivalent to financial management, and the full participation of management to enact the changes (HF – high level management behaviour, decision making, safety culture).

Overall the Swiss Cheese model shows that we need to reduce the number of permanent holes and reduce the frequency / opportunity of the variable ones. It is not so much the number of defences that are provided, but the quality of the defence that is important. ‘Depth’ can be used to describe the quality of an item such as wine or …. cheese.


Connecting SMS with this discussion; SMS is a major ‘top down’ safety tool, often associated with Reason’s organisational issues.
It is not the presence of a SMS that provides safety; it is the quality of the SMS that matters - the way in which it is implemented – practical operational use.
A major problem for the working industry is that ICAO and Regulators often promote high level initiatives – the ‘overarching Philosophy’ or ‘Policy’, without fully considering the implementation and practical use (how to …); these are the required Procedures and Practice to complete the ‘PPPP’s of SMS. Every new initiative is a problem for the industry; if you give someone a problem then you should provide guidance or an example solution.
Many of the new safety initiatives aimed at maintaining / reducing the low accident rate are depicted as Management Systems involving processes or models that loop back on themselves, e.g. TEM and SMS. The important aspects in these are the points that enable a breakout from the loops: what do management do when a hazard is identified; what are the new defences, where are they, and when; how do crews identify threats and errors, and what practical activities are required to defend against them – is the knowledge and thinking-skills training being provided?

The Swiss Cheese model is simpler than these modern processes – it’s a straight-line path. Originally it may have aided accident investigation, and thus was a tool of hindsight. However, the way in which we learn and gain experience is by reflecting on the components of our hindsight, components that may be identified as the holes or potential holes in the cheese.

If thin cheese slices are heated they tend to stretch and leave more holes, but thicker cheese slices (or higher quality cheese) spreads out covering everything with a ‘defensive’ blanket.

The industry is under pressure to maintain its good safety record, the heat is being turned up, thus we need thicker, more resilient cheese slices, or better quality cheese in our defences.

Revisiting the Swiss Cheese model of accidents. (www.eurocontrol.int/eec/gallery/content/public/documents/EEC_notes/2006/EEC_note_2006_13.pdf)

What do you think Gromit?

frontlefthamster
10th Jan 2008, 19:41
Nope, we need to throw away swiss cheese, because it's only applicable in the aftermath of an incident or accident (and they are now too rare), and we need to move on to better models such as Syd Dekker's, which don't extend Barry Sweedler's principle to safety improvement. :\

alf5071h
10th Jan 2008, 19:58
frontlefthamster, I am not familiar with Dekker’s ‘better model’. This might only be another model like Swiss Cheese, or the lesser models of TEM and SMS.
Although Dekker enlightens us with ‘a new view of human error’, I haven’t seen anything approaching the type of practical guidance which Reason gives (what to fix and why) … and his work has stood the test of time, with implementation across several industries.

Furthermore, Reason’s work on organisational error, based on ‘a few accidents’ in hindsight, provides the industry with proactive solutions – things to do. Some might argue that this work was the precursor of SMS.
Reason views safety as having two aspects: a negative one associated with accidents, and a positive one – “a system’s intrinsic resistance to its operational hazards”. Think proactive, think positive.

An example of Dekker’s model / solution please.

Human Error and organisational failure – Reason. (http://qhc.bmjjournals.com/cgi/content/full/14/1/56)
Diagnosing vulnerable system syndrome - Reason. (http://qhc.bmjjournals.com/cgi/reprint/10/suppl_2/ii21)

sweeper
10th Jan 2008, 20:02
Certain types of accidents, e.g. "landing long when windy/wet/:uhoh: slippery", seem to be keeping their numbers up.

frontlefthamster
10th Jan 2008, 20:44
Alf,

Syd's less published than Jim is...

But, you might like to whet your appetite with this interesting account:

http://www.safetyfromknowledge.com/pdf/FOCUS%20Paper%20on%20Human%20Error.pdf

... or read his book...

http://www.amazon.com/gp/product/0805847456/ref=cm_cr_pr_product_top

...or dip out of aviation into healthcare and read another perspective...

http://www.longwoods.com/product.php?productid=18448&cat=452&page=1

I know which approach I find more satisfactory. I also know that Jim makes a lot of money by pushing his approach into many industries. I just don't, honestly, believe it's the way ahead...

But, you asked about models and practical solutions. Any decent method will give you the opportunity to improve, but from which perspective? Regulator? Investigator? Manager? Owner/shareholder? That's the key. I don't believe that Syd will claim to have all the answers; he'll just give you a big help once you're facing in the right direction. If Jim is a McDonald's drive-'thru', Syd is the really nice Italian place just off the high street with the fresh parmesan...

alf5071h
10th Jan 2008, 22:22
frontlefthamster, thanks.
Dekker’s ideas, papers, and books are refreshing, and bring a much needed catalyst for change in our views of human error, but they too are based on accident investigation and the explanation of error.
I understood your point to be that because the Swiss Cheese model provides a retrospective (aftermath) view of accidents, it is not well suited to providing solutions for maintaining the good safety record.
The Swiss Cheese model is just that – a model. It can be used to aid accident investigation, but also in other ways to help the industry. It is simple enough for the average operator to understand, and thus a basis for communicating a safety message. Alternatively, its theoretical complexity enabled the industry to see the importance of organisational contributions to accidents, and from those to seek solutions. Also, its flexibility lends itself to explaining what should be done, such as defending in depth and identifying the opportunities for error (latent conditions). It has critics; their points are reviewed in the link (#35).

The industry has a good safety record, and thus it requires exceptional effort or extraordinarily potent initiatives to maintain it. The Swiss Cheese model is not a complete answer, but it can give individuals, operators, management, and regulators an opportunity to visualise the components of ‘safety’ which have to be addressed.

sweeper: "landing long when windy/wet/slippery" – yes, a major problem; this thread, Avoiding an overrun: what should be trained? (www.pprune.org/forums/showthread.php?t=306748), seeks some solutions.
Overruns appear to be an increasing threat and might justifiably warrant special attention. Recent incidents and accidents could be examined with the Swiss Cheese model and from the results solutions proposed, but this I believe is exactly what some of the investigators have already done, now it is up to us to identify the defences and ensure that they are in place / used.

frontlefthamster
11th Jan 2008, 09:22
To my mind, the two most significant problems with Jim's cheese are:

1. We know almost nothing about the 'slice' which we might label 'good fortune'. Very often, it's the most significant slice. :cool:

2. Big accidents happen at extraordinarily low levels of probability, when all sorts of weird things have happened. They are often a very poor cue to where and when the next big accident will be, whereas an holistic (and I am being careful not to use that word in a crass sense) approach based on Syd's principles will yield better chances of identifying and eliminating future big accidents. Note that there are exceptions to this, but they're generally very obvious, such as poor alerting systems on certain aircraft types, and operations on poor runway surfaces in demanding environmental conditions. :sad:

alf5071h
14th Jan 2008, 00:39
The originating report indicates that the accident rate has reduced. It would be interesting to look at those features which contributed to this and consider if they indicate how our future activities might be shaped.
Undoubtedly, technology has improved safety, with reliable engines, structures, and systems; there have been similar improvements in ground facilities.
More recently, technology has targeted the reduction of human error, e.g. EGPWS, ACAS, aircraft envelope protection, and crew workload reduction. In parallel with this, training for human factors has been introduced.
It could be assumed that during the reporting period many accidents were due to human error; a further assumption is that the human contribution to accidents has not changed significantly over past years, as indicated by several current studies – “80% of all accidents involve human factors”.
Thus, for the future, do we continue to apply technology and maintain the training effort on human factors? On current evidence, the latter might be a program of diminishing returns – the human factor in accidents cannot be completely eliminated. Further advancements in technology could protect humans from themselves or even replace them in many critical activities, but some source of error would inevitably remain.
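The diminishing-returns point can be made with back-of-envelope arithmetic. The only figure taken from the post above is the ~80% human-factors share; the mitigation levels are hypothetical. Because mitigation cuts into only the human-factors share of the rate, each further improvement buys less, and a floor remains:

```python
def residual_rate(base_rate=1.0, hf_share=0.8, hf_reduction=0.0):
    """Overall accident rate after mitigating a fraction of human-factor accidents.

    base_rate:    normalised current accident rate.
    hf_share:     fraction of accidents involving human factors (~80% per the post).
    hf_reduction: fraction of those human-factor accidents eliminated.
    """
    hf_remaining = base_rate * hf_share * (1.0 - hf_reduction)
    other = base_rate * (1.0 - hf_share)
    return hf_remaining + other

for r in (0.0, 0.5, 0.9, 1.0):
    print(f"eliminate {r:.0%} of HF accidents -> residual rate {residual_rate(hf_reduction=r):.2f}")
```

Even eliminating human error entirely leaves the non-HF 20% of the rate untouched, which is why further HF training alone cannot drive the rate towards zero.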
What is the outlook for safety in the industry?
We must recognise that these systems are nearing the end of their life, and should not be placed off-balance by requiring operations to take place within unreachable performance and safety objectives. …

Optimum safety is achieved through the careful monitoring and the tolerance of a minimum number of incidents. The message is twofold:
• It is essential to fully understand system behaviour.
• Such a system must be treated with methods allowing it to remain simultaneously at the edge of safety and at a sufficient performance and competitiveness level to resist market constraints.
From The paradox of ultra-safe systems. (www.casa.gov.au/fsa/2000/sep/FSA58-60.pdf)