PPRuNe Forums - View Single Post - Computers in the cockpit and the safety of aviation
Old 7th Jul 2011, 01:25
  #174  
safetypee
 
Join Date: Dec 2002
Location: UK
Posts: 2,461
“… a courageous realization …”

BOAC has a problem, “How to live in an unfathomable world”, as we all do according to New Scientist (17 May 2011).

“… opposing positions are predictable, but they are also incoherent, unintelligible and entirely unhelpful in navigating the complexities of our technological age.”

The gist of the New Scientist article is that we fail to distinguish between various levels of technology.
Level 1 is simply functional; Level 2 is part of a network with increasing complexity; Level 3 is a highly complex system with adaptive subsystems and human interaction, which we cannot fully understand. Level 3 systems are beyond our cognitive abilities.

The problem is that we tend to focus on Levels 1 and 2 because we can understand and assess them, and manage their complexity. We expect all technology to be like this, with us remaining in control, but in reality at Level 3 we are not.

“Level 3 systems whose implications you cannot fathom.”

“We are not the ‘knowledge society’; that's Level 1. We are in fact an ignorance society, continually creating more and more ignorance as we busily expand the complexity of the anthropogenic Earth. But our ignorance is not a 'problem' with a 'solution': it is inherent in the techno-human condition.”

“The question now is how to enable rational and ethical behaviour in a world too complex for applied rationality, how to make our ignorance an opportunity for continual learning and adjustment.
This necessary evolution does not demand radical changes in human behaviour and institutions, but the opposite: a courageous realisation that the condition we are always trying to escape - of ignorance and disagreement about the consequences of our actions - is in fact the source of the imagination and agility necessary to act wisely in the Level 3 world.”

Take care not to interpret the final quote out of context:

“… that to participate ethically, rationally and responsibly in the world we are creating together, we must accept fundamental cognitive dissonance as integral to the techno-human condition. What we believe most deeply, we must distrust most strongly.”

IMHO this is not about distrust of technology/automation; it’s about how we should trust or distrust what we feel about it, how technology can be used, and what can be expected from human interaction. We need to be a learning society, but in this instance there is a limit to our understanding, and we need the “agility necessary to act wisely in the Level 3 world”.

We have to accept that we may never understand aspects of ‘Level 3’: complex technical systems in a vast operational environment, with human interaction, such as AF 447.