PPRuNe Forums - View Single Post - Computers in the cockpit and the safety of aviation
Old 12th Jul 2011, 01:47
#188
alf5071h
Lonewolf50, slick quips and quotes; good enough to debate.
But seriously:

Yes, skills degrade without practice, but computers don’t fly upset recoveries (perhaps they should, and handle ASI failures too). So which skills are degrading due to automation/computers, and are they relevant to system failures (man and machine) in current operations?

Prevention is a good place to start, but nothing is perfect. Safety requires a combined effort, so recovery (action during, and after, the ‘upset’) is also necessary.

For failures of the man-machine interface, I would use the generic ‘malfunction’ rather than just ‘failure’. The problem space consists of a combination of technology, human, and situation.

Generally, those who argue for more training align with a ‘blame and train’ culture. However, your views appear to represent an alternative: addressing the man/machine aspects, particularly the man.

Well-reasoned arguments indicate that the pilot can no longer be expected to fully understand the technical system, nor can designers accommodate the irrationalities of human behaviour or every combination of technical failures; and neither can understand the entirety of complex situations.

In operation, ‘malfunction’ happens, and we expect the human (best placed, and probably best equipped in brain power) to solve the problem, which is primarily to fly the aircraft. But this is not normal flying, not a normal situation; indeed, it is a situation that some humans may have been unable or unwilling to foresee.
At great cost we could train for a wide range of scenarios with extensive knowledge requirements, yet never be sure that every situation had been considered or that the knowledge would be recalled.

Some areas of the industry might consider themselves safe enough; the trade-off between safety and economics is balanced. Thus their task is to maintain the status quo; this could be an aspect of technological complacency, or the reality of a sufficiently safe system (public perception).

If the latter is true, then the primary safety task might be to avoid the ‘big one’. Apparently what we don’t know is whether an accident will be the ‘big one’ or just an accumulation of many ‘small ones’, and whether either situation has automation/technology as a root cause.
More likely, as with previous accidents, it is a combination of man, machine, and situation: complexity, which we are ill-equipped to understand.