PPRuNe Forums - Computers in the cockpit and the safety of aviation
alf5071h - 28th Jun 2011, 01:54 - #164
Lonewolf_50 refers to ‘probabilistic’ regulation, which has served the industry well, but the method does not capture human activity except by assumption. Adverse human contributions have, to some extent, been mitigated by selection, training, and proficiency, yet even these mitigations remain subject to basic human fallibility.
Most new designs overtly aim to guard against error, but a complacent, technology-driven industry, ‘by counting numbers’, might have unwittingly accepted that safety improvements apparently rooted in technology would compensate for lower human standards.
More likely, commercial pressures have argued for stabilisation (containment) in the quest for ever higher levels of safety through technology; this might be a facet of an ‘almost totally safe transport system’ (Amalberti).

Tee Emm reiterates some of the sharp-end issues and in part identifies a solution: “If automation dependency has you by the short and curly, then you have only yourself to blame.” A facet of self-discipline, perhaps?
This solution still only treats a symptom, because there are now many situations in which humans have to depend on automation (e.g. RVSM, P-RNAV); the industry has changed. Thus the availability of automation (and other technologies) has altered both the operating situation and the freedom to choose how a task is executed (automatic vs manual). Furthermore, human nature biases individual assessment of capability – we think that we are better than we are; complacency, ‘we can do that when required’, and so on – “repetition and entrenched thinking”.

If, as BOAC states, modern systems are ahead of (beyond) human capability, in that crews lack the skills for the failure cases, then the context of the failure should be avoided. But that context is driven by the perception of safety, the risk of encountering a situation – probability. Moreover, if this is a public perception – a social perception – then the industry might have more to fear from the media than from technology (cf. the nuclear industry).

Avoiding the context (the operational situation) could involve one, or a combination, of: highly reliable technical solutions (everything automatic), focused training, or changing the workplace / task.
IMHO it is not the technology that is beyond human capability; it is the situations which the human has to face when technology fails that demand too much. This is an artefact of the modern ‘system’ – the modern, technologically complacent industry.
Assuming that the problems are perceived as severe enough to worry about (probability), solutions may not emerge until the industry recognises that some situational demands are too great for the human, whether those individuals are at the sharp end, in management or design, or regulators.

“We cannot change the human condition. But we can change the conditions under which humans work”, Professor James Reason.

“… to really understand risk, we have no choice but to take account of the way people interpret events.” Professor Nicolas Bouleau in “To understand risk, use your imagination”, New Scientist, 27 June 2011.