PPRuNe Forums - Man-machine interface and anomalies
12th Apr 2012, 01:34
safetypee
 
The initiating question implies that increasing technological sophistication has changed the balance of the man-machine interface.
The interface represents a process which takes place within a situation (context); thus any change in the process can originate from the man, the machine, or the situation. However, none of these is independent; there is human input in all aspects.

One way to view such a system is with the SHEL model of human factors (Software, Hardware, Environment, Liveware). Normally the focus is on the central Liveware (L), but from a technology viewpoint (H) the links remain the same. The limitation of SHEL is its simplistic view; in reality every factor in each category will have some interface with the other factors, e.g. an autopilot alt hold mode (H) will link with the effects of weather (E) & (S) in the choice between a hard/soft ride and accuracy – a human decision in design (L) and in procedures and training (S) & (L).
Thus the balance sought involves a highly complex process (effectively a chaotic system), where a small change in input can have an unexpected, disproportionate result.
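As an aside, that sensitivity can be shown with a toy example. The Python sketch below uses the logistic map, a standard textbook chaotic system (nothing aviation-specific is assumed): two inputs differing by one part in a million produce completely different outputs within a few iterations.

# Toy illustration of sensitivity to initial conditions in a chaotic system.
# The logistic map x' = r * x * (1 - x) with r = 4.0 is the classic example;
# nothing here models an aircraft, it only shows the 'small input change,
# disproportionate output' behaviour referred to above.

def logistic_trajectory(x0, r=4.0, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000)
b = logistic_trajectory(0.400001)  # input differs by one part in a million

for step in (0, 10, 20, 30):
    print(f"step {step:2d}: a={a[step]:.6f}  b={b[step]:.6f}  diff={abs(a[step] - b[step]):.6f}")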

The behavior of complex systems can be analyzed with techniques such as FRAM (the Functional Resonance Analysis Method).
Note the four steps:-
- Identify and describe essential system functions, and describe these by six basic parameters.
- Describe the context.
- Define the functional interactions.
- Identify safety barriers and specify required performance monitoring.
A technological comparison: describe the automatic system, its use, and the conditions in which it is usable; define the dependencies and interactions, and the safety barriers. The human is involved in these and in the six basic assessment parameters – input, time, control, output, resource, and precondition; in effect an exercise in 'risk assessment'.
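To make the six-parameter description more concrete, here is a minimal sketch in Python of how such a functional description might be held and its couplings found. The function names and their aspects are hypothetical illustrations, not taken from any published FRAM analysis.

# Sketch of a FRAM-style function description: each function is characterised
# by the six aspects named above (input, output, precondition, resource, time,
# control), and functions couple where one function's output matches another's
# input or precondition. Function names are hypothetical examples.

from dataclasses import dataclass, field

@dataclass
class FramFunction:
    name: str
    inputs: set = field(default_factory=set)
    outputs: set = field(default_factory=set)
    preconditions: set = field(default_factory=set)
    resources: set = field(default_factory=set)
    time: set = field(default_factory=set)
    control: set = field(default_factory=set)

def couplings(functions):
    # Step 3 of the outline above: find where one function's outputs
    # feed another function's inputs or preconditions.
    links = []
    for src in functions:
        for dst in functions:
            shared = src.outputs & (dst.inputs | dst.preconditions)
            if src is not dst and shared:
                links.append((src.name, dst.name, sorted(shared)))
    return links

alt_hold = FramFunction("engage ALT HOLD",
                        inputs={"selected altitude"},
                        outputs={"altitude held"},
                        preconditions={"autopilot serviceable"},
                        control={"mode logic"})
monitor = FramFunction("crew monitoring",
                       inputs={"altitude held"},
                       outputs={"deviation detected"},
                       resources={"attention"},
                       time={"continuous"})

for link in couplings([alt_hold, monitor]):
    print(link)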

We need not stray into higher science with these ideas; in the broadest sense they are like managing everyday life, e.g. humans' use of fire – there are advantages and hazards, we elect to use it and take precautions (fireproofing, water, restricted situations), and we have procedures for guidance. The human continually assesses and adjusts activity as the situation changes; the objective is to remain in control of the situation.

In many cases it's the increasing complexity of operational situations that demands an increase in technical capability; both operations and capability are driven by economics. The changes (inputs to the system) can be either proactive – the pilot wishes to achieve an objective by using automation – or reactive, where pilot activity is required (normal and non-normal situations); again the objective is to remain in control of the situation.
Situational aspects are present in most (if not all) scenarios, and where these require a change of action, i.e. after a malfunction (technological, human, or situational), the human has to understand the changed situation and its significance within the larger complex situation, and then choose a revised course of action. This will be an iterative process.
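That iteration can be pictured as a simple assess-decide-act loop. The Python sketch below is only a schematic of the cycle just described; the situation states and actions are made-up placeholders.

# Schematic of the iterative process described above: detect a change in the
# situation, understand its significance, choose a revised course of action,
# and re-assess. The 'situation' states and actions are made-up placeholders.

def assess(situation):
    # understanding step: what does the current situation mean?
    return "abnormal" if situation["malfunction"] else "normal"

def choose_action(significance):
    # decision step: pick a (revised) course of action
    return {"normal": "continue existing plan", "abnormal": "revise plan"}[significance]

# the situation changes over time; here a malfunction appears on the second cycle
timeline = [False, True, False]

for cycle, malfunction in enumerate(timeline):
    situation = {"malfunction": malfunction}
    significance = assess(situation)
    print(f"cycle {cycle}: situation={significance}, action={choose_action(significance)}")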

The above is a very high level view; however most events with technical / operational interfaces can be viewed this way, e.g. accident investigation.

As much as technology has changed, so too may the human have changed, due to changes in the social environment, education, career expectations, and interaction with complexity. This does not automatically represent a change in training standards. Aspects of the context also change: economic pressure, airspace accuracy requirements, operational need.
In addition we may be biased by the current salience of technology-related accidents. The industry has a very low accident rate, thus any significant event will stand out. In a complex system, the fact that increasing automation use is seen as an input does not mean that it is a dominant cause of the 'accident' output.
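That last point is essentially a base-rate observation, which a few made-up numbers can illustrate (every figure below is assumed for the sake of the example, not industry data):

# Illustrative (made-up) numbers for the salience / base-rate point above:
# when automation is in use on almost every flight, it will appear as an
# 'input' to almost every accident, whether or not it caused anything.

flights = 10_000_000
automation_use_rate = 0.98   # assumed fraction of flights flown with automation
accident_rate = 2e-7         # assumed accidents per flight (illustratively very low)

accidents = flights * accident_rate
with_automation = accidents * automation_use_rate  # automation present; cause unknown

print(f"accidents: {accidents:.0f}")
print(f"accidents with automation in use: {with_automation:.1f}")
print("automation appears in ~98% of accidents here simply because it is")
print("used on ~98% of flights; presence is not the same as cause")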