PPRuNe Forums - View Single Post - Computers in the cockpit and the safety of aviation
Old 4th Jul 2009, 19:49
alf5071h
BOAC, a ‘big’ answer (well, a lengthy one) from someone who is not a big player; and not really an answer, more observations and questions.
Re #1 “… to ensure something usable remains…”
“… for the pilot fraternity to press hard for a change in the philosophy and application of training and recurrent testing.”

CS 25 (the certification requirements for large aircraft) provides ‘the something usable’.
Invariably, use of ‘the something usable’ is assumed (see the relatively new HF design requirements - CS 25 AMC 1302), but how can we guarantee that the crew will use it in all circumstances?
Many crews and operations do not understand the basis of certification and the assumptions therein; the result can be inappropriate training, poor SOPs, and unfounded concerns that may lead to inappropriate actions following a failure – an ‘I know better’ attitude. Thus the associated problem is one of understanding, which stems from knowledge – training – or, more accurately, education.
Due to the inherent limitations of human performance, there is always the risk that crews will focus on troubleshooting and the reinstatement of the high-tech systems. This trait is perhaps more prevalent as the industry’s operations and training become technology dependent.

Our problem is that we are becoming technology ‘junkies’; no longer are we merely ‘children of the magenta line’, but we are developing into hardened technology addicts with all of the dependencies therein. This is partly a function of the techno-sociological world we live in; initial schooling, play and relaxation, and the behaviours within other industries all around us – technology dependency becomes our ‘expectation’.

An additional problem is that similar sociological, commercial and operational pressures, which generate the technological dependency, also affect the application of crew training and testing. Operators can elect to maintain a standard higher than the minimum specified by regulation; thankfully many do, but unfortunately this does not provide complete immunity from the random nature of accidents in a highly reliable industry.
Thus, within training and testing, the problem could be associated with a lowering of standards and a minimal application of the rules (‘what can we get away with?’), i.e. falling industry professionalism. This is often reflected in personal attitudes to professionalism – airmanship – but also in corporate culture.
Just because ‘your training system’ does not teach an aspect should not preclude self-improvement, even though we work in a high-pressure, time-critical environment with ever-increasing demands on ‘our’ spare time.

(FAST) A mid-1990s study by a major manufacturer looked at accidents in which airplane systems were involved, or in which they could have prevented the event and did not. It found that in approximately 70% of the accidents involving airplane systems, the original design assumptions were inadequate for the situation existing at the time of the accident, due to changes in…
  • the aviation system
  • airplane operational usage
  • personnel demographics
  • evolving infrastructure or other considerations.
Thus current problems probably result from ‘change’ and ‘systemic complexity’, the ‘systems’ involving human activity. Complexity itself isn’t a problem; the problem is the way we deal with complexity and the human interface, including the understanding, need, objective, and mechanism of the ‘system’. In this respect we may be overregulated – too many operational regulations, thus too complex to expect reliable implementation or correlation with certification regulations. E.g. certification claims alleviation for short-duration ‘safe’ flight without airspeed, assuming that crews are adequately trained – is this always true, say with a P2/P3 combination?
Can we see these changes in the complexity – are we even looking? If they are seen, how is their importance assessed, and do we choose an appropriate activity to combat any hazard?
We are a safe industry by most standards, but in safety there is no place for complacency – failures in looking, assessing, and deciding.

The industry depends on technology; in general, we created that need. The industry has yet to understand all aspects of technological dependency (accidents are often an unfortunate learning process), and individually we need to have greater understanding of the technology and the surrounding objectives and assumptions when we use it.
We have yet to learn to live with ‘aviation’ technology – we have to change, but in this change there may be more hazards. Combating these aspects requires thought and analysis – basic thinking.
For all of the human weaknesses, the human is still a powerful defensive aid – we can identify problems and deduce solutions. Do we teach pilots to think in these ways and adequately train them for the range of critical situations (stress, time dependent) which might be encountered?
Thus this problem is not only about technology, but the process of how to think – situation awareness, decision making – full circle back to airmanship, including skills and personal standards.

We are part of a self-generated complex system. In implementing technology, perhaps we have forgotten to look at (think about), or judge, wider-ranging aspects of the larger system which, unfortunately, may only surface in use – on contact between aspects of the system and us, the humans.

“No plan survives contact with the enemy” – Helmuth von Moltke (the Elder); often misattributed to Clausewitz.

CS 25 – Certification Specifications for Large Aeroplanes.
FAST Presentation.