Air accidents at record low

Old 8th Jan 2008, 00:48
  #21 (permalink)  
 
Join Date: Jul 2003
Location: An Island Province
Posts: 1,257
Likes: 0
Received 1 Like on 1 Post
PJ2, I agree with much in your recent posts.
However, one minor amendment: surely Don Bateman would prefer to be described as an ‘Engineer’ rather than as holding a ‘chair in academia’. His achievements are most laudable and have contributed to a large proportion of the reduction in the accident rate, but his work is more than designing and sitting back. He has pursued the avoidance of CFIT relentlessly, over many years challenging the FAA’s objections, encouraging ICAO and the FSF with their education programs, and harassing manufacturers and operators to install the equipment. The momentum for safety has been maintained with new and insightful technology, presentations, and publications; every pilot should acquire a copy of Terrain II and ‘From Take off to Landing’ (out of print).

The appropriate use of technology will be a major aid to maintaining a low accident rate. Automated responses must be considered as human factors training, involving knowledge and behaviour, reaches either a cost-effective or a self-limiting plateau – a possible reason why pilots do not pull up after a TAWS warning?
However, there is still room for improvement in the organizational human activities: why don’t all manufacturers fit the latest technology or update existing systems? Why don’t operational managements buy the technology, invest in training, or publish procedures? Financial constraint is probably a major influence, as are the international cultural differences cited previously.

The need to reduce the number of accidents as the industry expands, in order to maintain or lower the current accident rate, is based on the premise that the public will not accept any increase.
I wonder if anyone has considered that this might be incorrect. Consider some of the ideas of Charles Perrow – ‘Normal Accidents’.
  • A normal accident typically involves interactions that are "not only unexpected, but are incomprehensible for some critical period of time."
  • A normal accident occurs in a complex system, one that has so many parts that it is likely that something is wrong with more than one of them at any given time.
Taking these points to the extreme, safety efforts will eventually reach their maximum effectiveness and accidents will then increase – will this be acceptable? Intuitively we might answer no, but then consider road accidents and the public’s perception of them.
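As a rough illustration of Perrow’s second point (the component counts and the 1% figure are purely assumed, not data for any real fleet), the chance that more than one thing is wrong at any instant rises quickly with complexity:

```python
# Illustrative only: n independent components, each with an assumed 1% chance
# of being in a degraded state at any given moment.
p = 0.01
for n in (10, 100, 1000):
    p_none = (1 - p) ** n                  # nothing wrong
    p_one = n * p * (1 - p) ** (n - 1)     # exactly one thing wrong
    p_multiple = 1 - p_none - p_one        # more than one thing wrong at once
    print(f"{n:4d} components: P(more than one fault present) = {p_multiple:.3f}")
```

With a thousand interacting parts it is all but certain that several are degraded at once – Perrow’s ‘normal’ starting condition.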
I don’t have any suggestions; I support current technological and human-based safety initiatives, but that might be because I haven’t thought about the problem, or projected ‘what if’, far enough ahead.
alf5071h is offline  
Old 8th Jan 2008, 01:50
  #22 (permalink)  
PJ2
 
Join Date: Mar 2003
Location: BC
Age: 76
Posts: 2,484
Received 0 Likes on 0 Posts
alf5071h;

Yes, I think Don Bateman would prefer "engineer", and the example, reading it again, was a bit far-fetched. After I posted I thought of all the engineers who worked on the Airbus (for example); they're cut from the same cloth as Mr. Bateman, so the example was a bit stretched.

I am well aware (from your posts) that you're familiar with all this, but for the purposes of a bit more discussion for others: the industry has largely conquered the historical causes of accidents such as mechanical failure, navigational error, weather, collision/CFIT and ATC issues – all those areas in which, through technical achievement, safety has been built into today's aircraft. The area which remains the largest single factor in accidents is the human factor, and it is in this area that "academics" can contribute best. Such learning results not in "improved turbine blade design", etc., but in safety policies which, through academic research, show where the human weaknesses and strengths are. In fact I would submit that pilots in and of themselves (unless trained, or unless they have taken up the field out of interest) are not very good at "human factors", as it is not their primary bread-and-butter. Further, the pointy end is not the place where policy is, or should be, created and set (even though airlines will routinely leave a crew out on a limb "making up policy" where there should be an overarching one in place - another topic entirely). Such protective layering comes from the academic research of people like Jim Reason (and earlier proponents of organizational accident theory such as Heinrich) and Robert Helmreich, who come readily to mind, they being among thousands of like-minded individuals whose expertise lies well beyond the cockpit and who provide academically based guidance material for the harder-wired safety programs such as FOQA/FDA, LOSA and even ASR programs.

The point of the post being, of course, that academic research has its place, and that a dismissive attitude towards such programs is harboured at one's own, as well as an organization's, risk. The levels of safety achieved in the industry come not only from good and clever design by the engineers but also from those who take human factors seriously, those factors being best expressed (made visible) in detailed data programs such as those under discussion.

The notions expressed in Diane Vaughan's "The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA", and the work done on Columbia, "Organization at the Limit: Lessons from the Columbia Disaster", have resonance in airline work. Although airline operations are far, far less risky, human factors such as the "normalization of deviance" (normal accidents) are the same in any organization – a fact which continues to be lost in the white noise of cost control and profit... another topic.

lomapaseo, the idea was not to monopolize the dialogue and thereby drive out others (although I hardly think I qualify as having that power, which resides only in one's choice to participate), but to continue an interesting and valuable point regardless of how it was originally conceived. I have had thoughts along the lines expressed by flh, not because I think that such programs are as described, but because I think that a continuous examination of one's assumptions, even in the face of broad acceptance, is valuable, and I thought this was an opportunity to do so. Sorry it took the thread off topic...

best,
PJ2

Last edited by PJ2; 8th Jan 2008 at 04:37.
PJ2 is offline  
Old 8th Jan 2008, 06:28
  #23 (permalink)  
 
Join Date: Jul 2000
Location: London
Posts: 1,256
Likes: 0
Received 0 Likes on 0 Posts
Every accident has, as a contributory factor, human error.
Human error is normal. It is how we learn, amongst other things.
Accidents are normal.
As a preventative measure, a knowledge of human factors is vital.
The only purpose of investigation is to prevent the next accident.
At the moment the best approach is defence in depth.
The more defences the better.
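For what the arithmetic is worth – a sketch only, with assumed numbers, and it treats the defences as fully independent (an assumption questioned later in this thread):

```python
# Assumed: each independent defence misses a given threat 10% of the time.
p_miss = 0.10
for layers in range(1, 6):
    p_through = p_miss ** layers   # the threat must slip past every layer
    print(f"{layers} defence(s): P(threat gets all the way through) = {p_through:.5f}")
```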

Signed 4Greens

Ex Operator and Academic
4Greens is offline  
Old 8th Jan 2008, 08:30
  #24 (permalink)  
 
Join Date: Nov 2007
Location: Down South
Posts: 98
Likes: 0
Received 0 Likes on 0 Posts
Succinctly put and a return to where I came in. The surest way to see the numbers go back the wrong way is to ignore all the human elements that still cause accidents.

So, I stick with my observation that tired, poorly trained pilots working for companies that cut corners remain the vulnerable area. Just because the UK/EU/US haven't had a major accident for a while doesn't mean we can relax.

An ex-colleague at a large regional carrier tells of very poor standards right here and now. Let's hope the holes remain non-aligned.
Southernboy is offline  
Old 8th Jan 2008, 14:22
  #25 (permalink)  
 
Join Date: Jan 2008
Location: south east asia
Posts: 1
Likes: 0
Received 0 Likes on 0 Posts
That's good news

Hope it continues to be that way. Although we can never relax about it, because fatigue can set in without warning, and applying the wrong procedure is inevitable if you're too tired to think.
DODO97 is offline  
Old 8th Jan 2008, 19:10
  #26 (permalink)  
 
Join Date: Jul 2003
Location: An Island Province
Posts: 1,257
Likes: 0
Received 1 Like on 1 Post
PJ2, re “The area which remains the largest single factor in accidents is the human factor, and it is in this area that "academics" can contribute best.”
Our opinions appear to diverge at this point.
Human involvement is the largest factor in accidents, but I am not convinced that academics can provide the best contributions for improving safety.
There are and will be notable exceptions, but in very general terms, even where the academics identify critical areas affecting safety they fail to deliver practical solutions; e.g. Helmreich defined CRM as ‘the application of human factors’, but he has not provided a good workface application / implementation of CRM (or is TEM/LOSA another attempt?).

The dependence on academics is perpetuated by the regulators; IMHO there are very few ‘HF’-experienced pilots in administration, and the regulator/researchers, although domain-focussed, are often years behind industry’s needs. Even when basic academic research is used, only that which is proven is used; thus it may be quite old, or based on dated operational information in a rapidly changing world.
The problem is twofold: first, the regulators believe that they can regulate safety with ‘laws’; and secondly, they believe that they can apply the academic output directly. Whereas safety (the maintenance of the good safety record) actually requires a good working partnership with industry (a combined safety culture) involving a two-way dialogue about problems and solutions, and wide-ranging guidance materials to aid implementation of the regulations. Again, there are exceptions.

Some of the better training / safety applications originate from the pilot-academics (not necessarily academic-pilots), but the most effective people are experienced pilots who can ‘translate’ applicable research and provide practical applications. These and similar people with knowledge and experience in the industry are able to identify the most important safety issues. Although these issues may be identified by the academics, they often remain hidden in theoretical general principles or as unproven hypotheses. Then there are many subjects on which the academics do not agree; e.g. decision making is a critical safety area in aviation, yet only recently has industry been apprised of ‘Naturalistic Decision Making’, which represents what actually happens in operational situations. Not all academics agree with the mainstream NDM theories, but for those who do, the descriptions of their work are best translated as ‘aspects of airmanship’ – so the academics now tell us that we need to teach airmanship.
I am sure that there are similar examples from TEM and LOSA – new-found safety initiatives that, when deciphered from academic/regulatory language, match what many experienced pilots have been doing for years. The problem then is one of educating – training, time, and money; thus these are the emerging threats to the good safety record.

As stated in previous posts, the industry requires defences in depth; but which are the best defences, where do we put our scant resources, and can the operators withstand any more initiatives / regulations?
Perhaps the academics could answer these questions, but even the astute James Reason did not provide guidance (how to) when he said that it is now time to ‘pick the high fruit’ (the difficult-to-reach targets) if safety is to be improved; unless of course he was referring to operator management or even the regulators.
alf5071h is offline  
Old 8th Jan 2008, 21:41
  #27 (permalink)  
 
Join Date: Jul 2000
Location: London
Posts: 1,256
Likes: 0
Received 0 Likes on 0 Posts
Aviation degree programmes, on one of which I lecture, aim to redress this mismatch.
4Greens is offline  
Old 8th Jan 2008, 22:56
  #28 (permalink)  
PJ2
 
Join Date: Mar 2003
Location: BC
Age: 76
Posts: 2,484
Received 0 Likes on 0 Posts
alf5071h;

Some of the better training / safety applications originate from the pilot-academics (not necessarily academic-pilots), but the most effective people are experienced pilots who can ‘translate’ applicable research and provide practical applications.
I think we're closer to agreeing than my posts may indicate.

The gulf between academic research, its production, and the conveyance of same into practical application for the industry is a significant impediment to progress, not to mention the other significant issue, which is how to divide the paltry resources typically dedicated to such applications.

For example, I've read the work on HFACS and listened to the authors' presentations, but I find it difficult to apply from a crew's point of view, even though I know the authors intended to make the system entirely practical for daily ops. The difficulty is in translating the understandings into programs which make a difference. The greatest difficulty is in convincing the financial people that such programs are valuable, as you say. Also, after all this time during which Reason's work has established itself, coming to terms with the Swiss cheese model as it may apply to discerning what is and is not a weakness in a safety system, so that already-thin resources can best be directed, is not a straightforward organizational task.

To me this is where SMS fails badly, as almost nobody in a flight operations department really understands safety, and so people are prone to all manner of fads and strange ideas as well as to ignorance of the true threats. In training sessions (annual recurrent safety training), the Swiss cheese poster was always put up and people always talked about the holes, as is sometimes blithely trotted out here, but rarely with any true understanding. We would discuss accidents and causes with the idea that personnel behaviour would change, but we were never allowed to discuss organizational causes because the facilitators had a script and the topic wasn't in it, nor were the answers to the questions.

I think your term "pilot-academic" vice "academic-pilot" is an important distinction and one with which, with a couple of reservations, I agree as well. The ability to take the long-learned skills and experience out of the cockpit, translate them into research and then convey findings in a way that makes sense to line crews as well as operations people is a key aspect here and is rare, as you say. Rather than repeating one key statement, I'll quote it, as again I fully concur:
Whereas safety (the maintenance of the good safety record) actually requires a good working partnership with industry (a combined safety culture) involving a two-way dialogue about problems and solutions, and wide-ranging guidance materials to aid implementation of the regulations. Again, there are exceptions.
SMS is in and of itself a superb change in traditional approaches to safety, in part removing the notions of blame and enforcement from the mix and requiring that data be collected and examined for trends. (That such collected data remains unprotected from non-aligned interests such as lawyers etc. is a related matter only.) Few operational people understand SMS, however, and, as I have posted elsewhere, they think that safety is now "everybody's business", believing that safety means ensuring that others don't "run with knives", while leaving even further behind the larger institutional programs which, for better or worse, have resulted from academic research and implementation.
PJ2 is offline  
Old 9th Jan 2008, 00:36
  #29 (permalink)  
 
Join Date: Jul 2000
Location: London
Posts: 1,256
Likes: 0
Received 0 Likes on 0 Posts
I reiterate. A number of our graduates are now serving in airlines. Their syllabus included Flight Safety and CRM both in theory and practice. SMS was of course also taught.
I am convinced that this is the way to improve safety by bridging the gap between academia and the coal face.
4Greens is offline  
Old 9th Jan 2008, 06:00
  #30 (permalink)  
PJ2
 
Join Date: Mar 2003
Location: BC
Age: 76
Posts: 2,484
Received 0 Likes on 0 Posts
4Greens;

I have put forward some fairly definitive views on SMS and its potential effectiveness. Others in Canada have testified along the same lines at a Senate hearing early last year. I would be interested in your views from your vantage point, on the practicalities of SMS based upon the criticisms I and others have offered.

It is reasonable to accept, in theory, SMS as a positive step forward, providing it is fully embraced and resourced by airlines; but I don't believe it is, and as such it cannot work in practice. I would be interested in hearing otherwise, and why.

Cheers,
PJ2
PJ2 is offline  
Old 9th Jan 2008, 06:19
  #31 (permalink)  
 
Join Date: Jul 2000
Location: London
Posts: 1,256
Likes: 0
Received 0 Likes on 0 Posts
PJ2,

SMS is a good approach but it still requires a competent Regulator to audit the operators. In other words it is not the complete answer but then nothing is.

Defence in depth is always the way. SMS, HF, Audit, FOQA, good training, maintenance etc. Last but not least, guard against complacency.

Rgds 4Greens
4Greens is offline  
Old 9th Jan 2008, 15:31
  #32 (permalink)  
 
Join Date: Apr 2002
Location: FUBAR
Posts: 3,348
Likes: 0
Received 0 Likes on 0 Posts
Grrr

What these statistics overlook is that the difference between an incident and a (big) accident is often just down to luck. Read the safety review for the year in (for example) Flight International and you will always identify a few very, very near misses that would have been enough to throw the statistics totally the other way. Last January, for instance, an F100 had a take-off incident at Pau, with the sole casualty being the unfortunate car driver who was squashed after the aforesaid Fokker took off, stalled, and crashed back onto and overran the departure runway. The difference in outcome between this and a very similar fatal accident a few years ago involving Macedonian Airways is purely luck. The AF and Iberia A340 overruns could similarly have finished very badly. Counting bent airframes can also be useful when you assess safety, but that too fails to draw your attention to the near misses. The difference between an incident and a fatal accident unfortunately owes as much to luck as it does to skill, training or experience.
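To put rough – and entirely made-up – numbers on it: say a year produced 20 fatal accidents and another 10 events that ended harmlessly only through luck; a slightly different roll of the dice changes the headline total dramatically:

```python
# Hypothetical figures only, not the actual 2007 statistics.
fatal_accidents = 20
lucky_escapes = 10
for ends_badly in (0, 3, 6, 10):
    total = fatal_accidents + ends_badly
    print(f"{ends_badly:2d} of the lucky escapes ending badly -> {total} accidents "
          f"(+{100 * ends_badly / fatal_accidents:.0f}%)")
```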
On that note I wish you all, a Happy and Lucky New Year
captplaystation is offline  
Old 9th Jan 2008, 16:41
  #33 (permalink)  
 
Join Date: Jan 2005
Location: W of 30W
Posts: 1,916
Likes: 0
Received 0 Likes on 0 Posts
That event is not from 2007, but it is mentioned just to emphasize the point of the preceding post.
CONF iture is offline  
Old 10th Jan 2008, 00:00
  #34 (permalink)  
PJ2
 
Join Date: Mar 2003
Location: BC
Age: 76
Posts: 2,484
Received 0 Likes on 0 Posts
alf5071h;

Further to my response, and in the light of the discussion, I would like to add a comment from the D&G PPRuNe thread posted by "spanner90", post #100 at (http://www.pprune.org/forums/showthr...=307552&page=5):

I'm sure everyone who has completed HF training will remember, that if a potential failure path is identified, you should put in place another line of defence. In other words, if the holes in the cheese look like they might line up, put another slice of cheese in.
I think that poster said it better than I... an identified HF problem where there is no technical fault involved demands, as a response, a line of defence. Such a defence can be a change in existing procedures, a new procedure, or a new rule or SOP.

I would only add that such a line of defence arising from an identified problem should be a matter of overall Flight Operations/Flight Safety policy and not attached to any one individual or position, as is so often the case even in large organizations. (The former installs the procedure for all time and for all employees; the latter is followed only as long as that person or that position is around.)
PJ2 is offline  
Old 10th Jan 2008, 19:36
  #35 (permalink)  
 
Join Date: Jul 2003
Location: An Island Province
Posts: 1,257
Likes: 0
Received 1 Like on 1 Post
PJ2, defending against an identified failure path is good custom, but a slice of cheese is not necessarily a line of defence.
“Put another slice of cheese in” is a common view of the Swiss Cheese model; however, IMHO Prof Reason portrays a different aspect.
The added slice is not a steel barrier, it is just cheese, and like all of the other slices it could have holes in it. From a numerical view the more slices the better, but if the holes have a common origin, that origin could affect all slices simultaneously, including the new one.
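A crude Monte Carlo sketch of that point – my own illustration with assumed numbers, not Prof Reason’s formulation – shows how a common-origin condition that can hole every slice at once soon sets a floor which extra slices cannot lower:

```python
import random

def p_breach(n_slices, p_hole=0.1, p_common=0.0, trials=200_000):
    # p_hole:   chance each slice independently has a hole in the hazard's path
    # p_common: chance of a common-origin condition that holes ALL slices at once
    breaches = 0
    for _ in range(trials):
        if random.random() < p_common:
            breaches += 1          # one latent cause defeats every slice
        elif all(random.random() < p_hole for _ in range(n_slices)):
            breaches += 1          # independent holes happen to line up
    return breaches / trials

for n in (2, 3, 4):
    print(f"{n} slices  independent holes only: {p_breach(n):.4f}   "
          f"with 1% common cause: {p_breach(n, p_common=0.01):.4f}")
```

Adding a fourth slice cuts the ‘independent’ figure by another factor of ten but barely moves the common-cause figure – hence the case, made below, for rectifying the common origins rather than simply adding cheese.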

Some holes may be permanent, resulting from latent factors, or they may open and close as the opportunity for error or ‘the unsafe act’ arises (context / situation dependent).
The latter are predominantly the holes due to human aspects (error, unsafe acts) and are always present, even if not in the same place. Defences for these may involve complex solutions – it’s like trying to hit a moving target; however, this is still a necessary task, as HF training can reduce error, e.g. situations might be better recognised and behaviours shaped to avoid the unsafe acts – an appropriate course of action is chosen – improved decision making.

Prof Reason argues that a more effective defence is to rectify the organisational / management aspects (latent issues – static holes), as these may apply to many or all of the slices, and also because these issues might be identified more easily. Some may involve higher organisational aspects such as regulation or the wider operational system (industry norms, culture).
The solutions for these differ from the operational HF aspects, which tend to focus on ‘the last slice’ and the situation in which an accident can occur (the sharp end / outside of the cheese); the management aspects require a rigorous process equivalent to financial management, and the full participation of management to enact the changes (HF – high-level management behaviour, decision making, safety culture).

Overall, the Swiss Cheese model shows that we need to reduce the number of permanent holes and reduce the frequency / opportunity of the variable ones. It is not so much the number of defences provided, but the quality of the defence, that is important. ‘Depth’ can be used to describe the quality of an item such as wine or … cheese.


Connecting SMS with this discussion: SMS is a major ‘top-down’ safety tool, often associated with Reason’s organisational issues.
It is not the presence of an SMS that provides safety; it is the quality of the SMS that matters – the way in which it is implemented, its practical operational use.
A major problem for the working industry is that ICAO and regulators often promote high-level initiatives – the ‘overarching Philosophy’ or ‘Policy’ – without fully considering the implementation and practical use (how to …); these are the Procedures and Practice required to complete the ‘PPPP’s of SMS. Every new initiative is a problem for the industry; if you give someone a problem, then you should provide guidance or an example solution.
Many of the new safety initiatives aimed at maintaining / reducing the low accident rate are depicted as management systems involving processes or models that loop back on themselves, e.g. TEM and SMS. The important aspects of these are the points that enable a breakout from the loops: what do management do when a hazard is identified? What are the new defences, where are they, and when? How do crews identify threats and errors, and what practical activities are required to defend against them – is the knowledge and thinking-skills training being provided?
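Purely as a hypothetical illustration of what a ‘breakout from the loop’ could mean in practice (the record fields and names below are my own invention, not drawn from ICAO or any regulator’s material), a hazard entry might only be allowed to leave the loop once it carries a concrete defence, an accountable owner, and a date:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class HazardRecord:
    description: str
    identified_on: date
    defence: Optional[str] = None    # the new barrier / procedure to be put in place
    owner: Optional[str] = None      # the manager accountable for implementing it
    due_by: Optional[date] = None    # when it must be in place

    def broken_out(self) -> bool:
        """True once the hazard has an actionable defence, not just a log entry."""
        return None not in (self.defence, self.owner, self.due_by)

log = [HazardRecord("Repeated unstabilised approaches at a hypothetical airfield",
                    date(2008, 1, 10))]
print("Still circulating in the loop:", [h.description for h in log if not h.broken_out()])
```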

The Swiss Cheese model is simpler than these modern processes – it’s a straight-line path. Originally it may have aided accident investigation and thus was a tool of hindsight. However, the way in which we learn and gain experience is by reflecting on the components of our hindsight, components that may be identified as the holes or potential holes in the cheese.

If thin cheese slices are heated, they tend to stretch and leave more holes, but thicker slices (or higher-quality cheese) spread out, covering everything with a ‘defensive’ blanket.

The industry is under pressure to maintain its good safety record – the heat is being turned up – thus we need thicker, more resilient cheese slices, or better-quality cheese, in our defences.

Revisiting the Swiss Cheese model of accidents.

What do you think Grommet?

Last edited by alf5071h; 10th Jan 2008 at 20:00.
alf5071h is offline  
Old 10th Jan 2008, 19:41
  #36 (permalink)  
 
Join Date: Dec 2007
Location: France
Posts: 481
Likes: 0
Received 0 Likes on 0 Posts
Cool

Nope, we need to throw away Swiss cheese, because it's only applicable in the aftermath of an incident or accident (and those are now too rare), and we need to move on to better models, such as Syd Dekker's, which don't extend Barry Sweedler's principle to safety improvement.
frontlefthamster is offline  
Old 10th Jan 2008, 19:58
  #37 (permalink)  
 
Join Date: Jul 2003
Location: An Island Province
Posts: 1,257
Likes: 0
Received 1 Like on 1 Post
frontlefthamster, I am not familiar with Dekker’s ‘better model’. This might only be another model like Swiss Cheese or the lesser models of TEM and SMS.
Although Dekker enlightens us with ‘a new view of human error’, I haven’t seen anything approaching the type of practical guidance which Reason gives (what to fix and why) … and Reason’s work has stood the test of time, having been implemented across several industries.

Furthermore, Reason’s work on organisational error, based on ‘a few accidents’ in hindsight, provides the industry with proactive solutions – things to do. Some might argue that this work was the precursor of SMS.
Reason views safety as having two aspects: a negative one associated with accidents, and a positive one – “a system’s intrinsic resistance to its operational hazards”. Think proactive, think positive.

An example of Dekker’s model / solution please.

Human Error and organisational failure – Reason.
Diagnosing vulnerable system syndrome - Reason.

Last edited by alf5071h; 10th Jan 2008 at 20:28. Reason: links
alf5071h is offline  
Old 10th Jan 2008, 20:02
  #38 (permalink)  
 
Join Date: Jan 2000
Posts: 46
Likes: 0
Received 0 Likes on 0 Posts
Certain types of accident, e.g. "landing long when windy/wet/slippery", seem to be keeping their numbers up.
sweeper is offline  
Old 10th Jan 2008, 20:44
  #39 (permalink)  
 
Join Date: Dec 2007
Location: France
Posts: 481
Likes: 0
Received 0 Likes on 0 Posts
Cool

Alf,

Syd's less published than Jim is...

But, you might like to whet your appetite with this interesting account:

http://www.safetyfromknowledge.com/p...an%20Error.pdf

... or read his book...

http://www.amazon.com/gp/product/080...pr_product_top

...or dip out of aviation into healthcare and read another perspective...

http://www.longwoods.com/product.php...cat=452&page=1

I know which approach I find more satisfactory. I also know that Jim makes a lot of money by pushing his approach into many industries. I just don't, honestly, believe it's the way ahead...

But, you asked about models and practical solutions. Any decent method will give you the opportunity to improve, but from which perspective? Regulator? Investigator? Manager? Owner/shareholder? That's the key. I don't believe that Syd will claim to have all the answers; he'll just give you a big help once you're facing in the right direction. If Jim is a McDonald's drive-'thru', Syd is the really nice Italian place just off the high street with the fresh parmesan...
frontlefthamster is offline  
Old 10th Jan 2008, 22:22
  #40 (permalink)  
 
Join Date: Jul 2003
Location: An Island Province
Posts: 1,257
Likes: 0
Received 1 Like on 1 Post
frontlefthamster, thanks.
Dekker’s ideas, papers, and books are refreshing, and bring a much-needed catalyst for change in our views of human error, but they too are based on accident investigation and the explanation of error.
I understood your point to be that because the Swiss Cheese model provides a retrospective (aftermath) view of accidents, it is unable, or at least not very well suited, to provide solutions for maintaining the good safety record.
The Swiss Cheese model is just that, a model; it can be used to aid accident investigation, but it can also be used in other ways to help the industry. It is simple enough for the average operator to understand, and thus a basis for communicating a safety message. Alternatively, its theoretical complexity enabled the industry to see the importance of organisational contributions to accidents and, from those, to seek solutions. Also, its flexibility lends itself to explaining what should be done, such as defending in depth and identifying the opportunities for error (latent conditions). It has critics; their points are reviewed in the link (#35).

The industry has a good safety record, and thus it requires exceptional effort or extraordinarily potent initiatives to maintain it. The Swiss Cheese model is not a complete answer, but it can give individuals, operators, management, and regulators an opportunity to visualise the components of ‘safety’ which have to be addressed.

sweeper: "landing long when windy/wet/slippery" – yes, a major problem; the thread Avoiding an overrun: what should be trained? seeks some solutions.
Overruns appear to be an increasing threat and might justifiably warrant special attention. Recent incidents and accidents could be examined with the Swiss Cheese model and, from the results, solutions proposed; but this, I believe, is exactly what some of the investigators have already done. Now it is up to us to identify the defences and ensure that they are in place / used.
alf5071h is offline  

