PPRuNe Forums - View Single Post - Automation Bogie raises its head yet again
18th Jan 2011, 16:23
fdr
 
mental models: "You can observe a lot by watching."

Alf,

"Always go to other people's funerals, otherwise they won't go to yours."
Yogi.

Sorry if the malapropism is lost in translation from Mr Berra's comment of "you have got to be careful..."

The application of FMS, NDs etc. changed the problems, as you say, for various reasons, not merely as a response to loss of SA (a slip between reality and the operator's mental model). In most incidents and accidents, however, there is evidence of a loss of SA, structural-failure events aside.

Advanced applied technology such as APFD/display systems and increased automation resulted in new ways of getting hurt, as humans didn't act as the designer predicted. Nor do the systems, always. Removing the human from the control loop has also produced new failure modes, increasing the detachment of the individual's cognitive capacity from the system and so reducing SA on some occasions. The new opportunities for errors and slips change the failure modes, as with MK's B742F OPT issues at Halifax. If humans are considered excessively prone to failure, then for a real mess just add computers. Yet for all the frailty of the human in respect of system failure modes, humans also remain the best last hope of intervention when conditions are not exactly as programmed by the... human in the design or automation processes.

Human conditioning is not immune from similar problems: an excessive expectation of the benefits of CRM, or a misapplication of its concepts, can just as easily result in target fixation, poor workload management and loss of SA. It is embarrassing to see a crew CRM a problem to perfection and then not implement a solution... just as it is pretty depressing to see a crew working through all the processes of CRM, EMC and similar programs while running out of fuel in a holding pattern, or flying away from an airfield while on fire, etc. The salient point remains that, in most cases, any process that reduces SA is not conducive to good health... including, say, company SOPs that saturate the crew at a critical phase of operation, such as EK's pre-departure routines, or KAL's pedantic application of pre-takeoff reminders... or large legacy carriers operating an advanced cockpit variant in the same manner as a legacy type in order to "standardise".

"...And I for winking at your discords too
Have lost a brace of kinsmen: all are punish'd".

Wm Shakespeare (1564-1616), Romeo & Juliet, Act V Scene III (the Prince)

Management are accountable, both legally and morally, for the unintended consequences of ill-considered process changes.

My views may be considered depressing, or not. Human failure tends (IMHO) to result mainly from loss of SA; even a large percentage of violations arise from the individual reaching a state of lost SA through the extent of the deviation undertaken. That is actually a point of some hope for improved safety, as it follows that processes, procedures, practices and system designs that promote SA will generally be advantageous. More specifically, training individuals in heightened SA awareness, and at the very least alerting them to the precursors and indicators of SA loss, is a practical program. This is not warm fuzzy stuff; simulating conditions in which a student ends up with a substantial slip in their SA model can be fairly traumatic.

System safety does improve to some extent with the application of emerging technology developed from detailed analysis of system behaviour. But the demands on the system also evolve, so the performance level required to achieve an acceptable risk is itself an ever-changing target.

PBL; thanks as always for adding qualitatively to these discussions on this forum.