PPRuNe Forums - View Single Post - AF 447 Thread No. 10
Old 24th Mar 2013, 21:33
#1061
PJ2
 
franzl;

The notion of "amoral calculation" is discussed in Diane Vaughan's work, "The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA". It expresses the common worry and perception that "bean-counting, profit-seeking managers" will let instrumental priorities drive decisions in spite of clear evidence of outcomes in terms of deviance, failure, incidents and accidents.

Vaughan found that, contrary to her starting premise (that managers at NASA were making an amoral calculation in their operational work), there was almost no evidence of cynical wrongdoing motivated by profit, but only the best of intentions towards the goals at hand. From Vaughan:
Repeatedly, I was struck by the difference between the meaning of actions to insiders as the problem unfolded and interpretations by outsiders after the disaster. As I gained an understanding of cultural context and chronological sequence, many actions, much publicized and controversial, took on new meaning. Incidents that when abstracted from context contributed to an overall picture of managerial wrongdoing became ordinary and noncontroversial. For example, after writing a 1978 memo objecting to the joint design, Marshall engineer Leon Ray helped develop corrections to the joint that assuaged his concerns, leading him to believe that the design was an acceptable flight risk. Ray's memo became part of the official record creating the impression that managers had been overriding engineering objections for years; his subsequent action did not. Now alert to the importance of relocating controversial actions in the context of prior and subsequent actions, I expanded my focus beyond its restricted attention to rule violations. (Vaughan, p. 60)
The other notion that Vaughan pioneered (though we in this business are familiar with it by other names) is the "normalization of deviance". For those new to the notion, one way of expressing it is: the gradual erosion of margins of error in standardized, proven systems, because each small reduction of the standard appears to succeed while seemingly maintaining sufficient margin. (There are other ways of expressing this, of course!)

So rather than nefarious activities behind the engineers' backs, most managers could claim to be onside with the safety people, but they also knew they had to be mindful of schedules, budgets, regulatory affairs, government politics and public perceptions. As you would expect, these are very bright and aware people, but none of that guarantees that phenomena such as normalizing standards through "reasonable justifications" are the right thing to do. Often such conduct is seen as "amoral", and as calculated towards narrow goals, only in hindsight.

The recent review of the "courses of action not taken" regarding in-orbit video and photographs of Columbia's wing damage (initially discussed in papers in Starbuck and Farjoun's "Organization at the Limit: Lessons from the Columbia Disaster") is one such clear example, and a sad one, showing that NASA had not fully learned the lessons of Challenger seventeen years earlier.

If I were looking for a place in organizational dynamics that could lead to present circumstances (your point about "mostly stuff what experienced old school pilots learned", etc.), I would look at a relatively unquestioned stance towards the privileged place of technology in present-day operations. Such a stance is not informed by what we could call the "fiscal discourse", yet it appears to mimic the effects of "bean-counting" priorities. Putting the question that way leaves room to recognize the good that technology contributes, while examining the permissions we grant technology in terms of relinquished control, all of which can lead to an inappropriate reliance which, as we know well, will (not may) let us down at the most critical moment.

I think the industry has been changing for some time now, i.e., returning to the old ways while keeping the technology, and the RAeS lecture is a recognition of this and as such is indeed information in aid of this change.

That doesn't mean that pure bean-counting decisions which dismiss safety in favour of commercial gain aren't made. I've seen it happen (despite senior managers seeing the FDA data). However, from what I saw, the other side is by far the weightier one!

I know from conversations with those doing the work (I'm retired) that this knowledge has been in the simulator scripts and training courses for some time now. I'm less confident that an abiding, mild skepticism regarding automation is making similar inroads, but I think it is moving inexorably in that direction.
