PPRuNe Forums - View Single Post - AF 447 Thread No. 6
Old 29th Aug 2011, 01:10
#570
Ian W
Join Date: Dec 2006
Location: Florida and wherever my laptop is
Posts: 1,350
Human cognitive limitations

Clandestino posted:
I am sorry sir, but ambiguity of "STALL STALL STALL STALL STALL STALL STALL STALL STALL STALL STALL STALL STALL STALL" aural warning is lost upon me but it is for BEA HF group to discover why crew went selectively deaf on stall warning. Granted, stickshaker and big red flashing "STALL" in the middle of PFD or as separate light somewhere on glareshield might have helped a bit but questions remain: why did the RH pilot pulled the aeroplane in the stall in the first place? Why didn't the LH pilots survival instinct kick in?
A little exercise for those who are interested.
Go back a page or two and read one of the more technical posts - say, one by Owain. While doing that, try to recite a well-known children's rhyme to yourself, and at the same time have someone read you a paragraph of different text to write down. You will find that you cannot do all of these at once - in fact, if you concentrate on the reading, YOU WILL NOT EVEN HEAR the person talking to you.

This is not selective deafness - it is because the human brain has a limited number of cognitive channels, and each can only handle ONE input at a time. Aural verbal, visual verbal and spoken verbal activities all use the same single verbal cognitive channel (see the work of Christopher Wickens on multiple resource theory).

Some people, if trained and practised, can rapidly switch between various verbal inputs and outputs - but if something important happens on one input, they WILL NOT HEAR the others.

Think about how many times, when you are driving through complex lane changes and road signs, you have had to ask a passenger to repeat themselves.

The reason that 'steam gauges with needles' seem to be easier to read is that they are a spatial cognitive load and form patterns that can be recognized without much cognitive effort. All the glass-cockpit tapes with numbers, and the ECAM, require visual verbal analysis; and no one can read one thing and fully understand it while saying something else and simultaneously listening to and comprehending something else again. The human brain cannot do it, so it will simply 'drop' whichever input causes the overload.

It is perfectly possible that the pilots literally did not hear the stall warning because their verbal processing cognitive channel was already overloaded. A stick shaker or other haptic input - like someone tapping you on the shoulder when you are busy - can have an immediate attention-getting effect that a voice alarm or flashing words may not.

One of the aspects I expect the BEA Human Factors investigators to look at is the cognitive workload that the ECAM and failure messages put on the pilots - especially the extent to which particular cognitive channels were overloaded. Perhaps every potential emergency scenario should be subjected to what is called a 'cognitive walk-through': an assessment that maps the cognitive loads step by step and identifies likely overloads before they occur in the air.

Older pilots may well have fallen back on an old but repeatedly successful dictum - disregard the cacophony and aviate (i.e. pitch and power), navigate, then communicate.

It mightn't pass the sim check ride - but it may well have produced a better outcome in the real world.