PPRuNe Forums - View Single Post - Computers in the cockpit and the safety of aviation
Old 3rd Sep 2010, 17:20
alf5071h
Peter, your provocative article #95 argues for improving safety by replacing the human with automation. For the moment, I set aside the human aspects of design and maintenance.

Technically, full automation might be possible, but as discussed in http://www.pprune.org/safety-crm-qa-...ml#post5889774 this would involve constraint.

The military operations cited as examples are constrained to specific airfields, tasks, and, I suspect, weather conditions. Would such constraints be acceptable in civil operations? If not, what costs (in practicality) will the industry and the travelling public tolerate to achieve such idealised safety?

Accepting constraints might well improve safety; no precision guidance, no autolanding, no flights to that airport = safety. Taken to its conclusion, this theoretical argument implies that it is safer to stay on the ground than to fly, or to use other means of transport (which may not be as safe as aviation).

I would argue that when discussing automation, practicality has to be the foremost view. By all means consider academic theories, but don’t lose sight of the practicalities.
Perfect safety may only exist in theory; in practice it involves managing risk – “safety is the avoidance of unnecessary risk” - safety is a compromise.

Practical solutions for improving safety should come from identifying and avoiding risk, both strategically and tactically, and in planning and practice.
We have to define ‘unnecessary’, which depends on the situation, both now and in the future: what is the goal or objective, and how do these change with time and task?

Automation (technology), I suggest, is not better than the human at these tasks, even allowing for the human weaknesses that result in error. The currently accepted judgement is that the human (with current automation) meets the requirements for safety – the public perception (TM #97 !!!).

For the accidents cited, assuming human involvement, we need to understand why human performance did not meet the requirements of safety when the vast majority of similar operations have done so – why are these accidents, or apparently the human behaviour in them, different from daily operations?
With such understanding, drawn from accident reports (not always forthcoming or of sufficient depth), it might then be possible to pursue a combination of man and machine – e.g. technology-aided decision making and situation awareness, EGPWS-like systems with auto pull-up, and automatic loss-of-control recovery – as a stepping stone to increased automation.
Perhaps a practical study of the human and the man-machine interface would be more worthwhile.