PPRuNe Forums - View Single Post - AF447 Thread No. 3
Old 3rd Jun 2011, 19:19
  #1277
BJ-ENG
 
Join Date: Jul 2009
Location: SUSSEX UK
Age: 76
Posts: 57
PJ2 & Gums - Nicely put..

Just to add some quotes from two publications that are well worth reading, and particularly relevant here.

Nancy Leveson (Safeware: System Safety and Computers, 1995):

Computers and other automated devices are best at trivial, straightforward tasks. An a priori response must be determined for every situation, whereby an algorithm provides predetermined rules and procedures to deal only with the set of conditions that have been foreseen. Not all conditions are foreseeable, however, especially those that arise from a combination of events, and even those that can be predicted are programmed by error-prone humans.

Sidney Dekker (The Field Guide to Understanding Human Error, 2006) in his preface nicely sums up his 'New View' of human error as follows:

What goes wrong:

Human error is a symptom of trouble deeper inside a system. To explain failure, do not try to find where people went wrong. Instead, find how people's assessment and actions made sense at the time, given the circumstances that surrounded them.

How to make it right:

Complex systems are not basically safe. Complex systems are trade-offs between multiple irreconcilable goals (safety and efficiency). People have to create safety through practice at all levels of an organisation.

Note: N. Leveson is Boeing Professor of Computer Science at the University of Washington, and a NASA advisor on the Shuttle software development process.

S. Dekker is Professor of Human Factors at the School of Aviation, Lund University, Sweden. He also has experience as a pilot, and is type trained on the DC-9 and A340.