PPRuNe Forums - Computers in the cockpit and the safety of aviation
26th Jun 2011, 19:04
#156
alf5071h
 
BOAC et al, this is an excellent thread, but like that of AF447, the search for meaningful understanding and solutions to a complex problem often leads to repetition and entrenched thinking.
However, by revisiting ‘the two tasks’ in #1, it may be possible to take a deeper, although more conceptual, view of the issues; I offer the following:
  • … the manufacturer/regulators/operators to ensure something usable remains, …
If we remove AF447 and its speculative aspects from the wider view of current safety, then the existing requirements appear to be satisfactory. Where technical issues appear to dominate accidents, regulatory interpretation and/or the operational implementation (human factors) also contribute serious weaknesses, e.g. 737 Rad Alt, A320 Congonhas, MD80 TOCW.
Other accidents almost exclusively involve the use, the application, of what equipment/knowledge ‘remains’ (or is normally available); LOC, disorientation, overrun.
  • … a change in the philosophy and application of training and recurrent testing.
This task reflects the problems of applying what ‘remains’, what is normal (as above). One solution proposed so far is what I have described as ‘more of the same’ (blame and train), and what other posts have described as specific changes in education, training, checking, and operation. The report linked @ #146 follows the same theme.

However, with a conceptual view, I suggest that these solutions only treat the symptoms of a much deeper problem. An obvious candidate is the increasing use of technology, though without discarding the interrelated aspects of human behaviour and the overall operational ‘system’.

Technology / automation may encourage complacency in operational, organisational, design and regulatory judgement; we are assuming too much, and there is a technological bias in our risk management.
Not that the human is lazy, but we do like to be efficient: high in trust, and making many (often undisclosed) assumptions.
We depend on automation, and in the extreme may believe that automation can replace the unique human ability to think. We no longer practise the ‘old skills’ associated with understanding (situational awareness) because the required level of ‘understanding’ is presented in suitable formats (EFIS, FMS, Autopilot/FD), yet most modern human-machine interfaces are adequate only for basic flying tasks.

At a regulatory level this search for efficiency might result in lower standards (old assumptions), allowing greater complexity in operations - crowded airspace, longer duty time, etc.
At the operator-management level, there is lower-calibre recruiting, reduced training, etc.
And at the sharp end … what exactly is the problem with automation? Not seeing emerging problems; not appreciating ‘change’, or the need to change our thoughts or actions; not being very thorough.

Much of the above comes from ‘The ETTO Principle’, the Efficiency-Thoroughness Trade-Off (Hollnagel): how we balance getting the job done against cost, time, and resources. This involves the sharp end, management, regulators, and designers.
  • AF 447; regulatory assumption that crew can fly pitch / power, delay in retrofitting pitots judged acceptable, crew fly close to Cbs because of route structure / other traffic – efficiency!
  • 737 Rad Alt, MD 80 TOCW; maintenance, fault reporting and rectification, lower regulatory standards – grandfather rights (assumption) – efficiency!
  • Disorientation; crew rush to engage the autopilot, early turns, weak crosschecking – efficiency!
  • Overrun; press-on-itis, approximate calculations, poor knowledge – efficiency!
How do we balance our quest for efficiency in normal operations with thoroughness to maintain safety?
We need to “enhance [our] abilities to respond, monitor, anticipate, and learn” (Hollnagel, The ETTO Principle).