PPRuNe Forums - Man-machine interface and anomalies
Old 16th Apr 2012, 01:26
  #41  
safetypee
 
Join Date: Dec 2002
Location: UK
Posts: 2,451
Lyman / Nats, you appear to be attributing the majority of the interface problems to the machine.
To paraphrase Jens Rasmussen:
"The problem in interface design is not to match the technology to the user's mental model, but to create an interface which generates and maintains an effective and safe mental model."
Note that the requirement is to create an interface, not just to design the automation; thus there is a need for a close relationship between the automation, the human, and the situation.

Attempting to meet just one pilot's wish list with automation may satisfy only that one pilot. This is similar to designing a fix for the most recent accident: you fix 'the cause', whereas there could have been many contributing factors, any of which can recur in other circumstances, with different situations and different people.
The alternative is to design automation which helps the operator form a mental model (a construct of awareness and interpretation of the real world, together with knowledge and know-how) and which thus may benefit a wide range of pilots in a variety of situations. For the operator this requires knowledge of the automation's objectives, capabilities, and limiting situations, and hence a need for training, education, and understanding.

Perhaps these requirements identify a weakness in training, but also in human attitudes to automation. Many posts stated what the automation 'must give them', but an interface has to be two-way, and it is of particular value where the pilot extracts information from the machine and applies it.
Modern society has a need for instant gratification, solutions without thought (Google, SOPs); this is not the purpose of automation in aviation, except perhaps for a few fully automatic operations. The pilot has to think, create a mental model, understand the situation, and decide on the automation's role in that situation: risk assessment, decision making, action, and checking.


Switch it off.
Re "… the human ability to switch it off …": this requires discipline, knowledge, and again risk assessment. Do you know when to switch it off, and will you? Humans are biased in risk, action, and belief, often believing 'that they know better' (a macho attitude). Training should help control these behaviors, but occasionally, in stressful or surprising situations, human performance is insufficient and the automation is allowed to continue too far.
Where such a need to switch off is recognized, it is an indication of poor situation awareness. This is not to say that automation has or has not contributed to this, but the human must maintain control of these aspects as well as controlling their own thinking processes; that too is part of the interface.

Re "… if we get p----d off with it then …": this is a loss of control of your thinking, your discipline, your CRM.

Jens Rasmussen, "The concept of human error: is it useful for the design of safe systems?"