PPRuNe Forums > Safety, CRM, QA & Emergency Response Planning

human factors and cognitive biases: how we can be functionally blind
#1 · 12th Jun 2013, 08:04

Some accidents never fail to make me wonder "how could they miss that? how could they fail to notice what was going on?" I've been trying to understand some cognitive biases, how my mind (or your mind) could be functionally blind to blazingly evident clues.

One thing I think I'm starting to understand, a bit simplified: if you're in a deep stall, falling out of the sky fast, but you're looking for clues to confirm an overspeed condition, you may well find some clues that seem to indicate overspeed. And you could be completely blind to the obvious: the stall. With 20/20 hindsight, when you know what to look for, it's easy to see everything, but can you spot something you're not prepared for? Are you a good observer?

Can you see what's going on in this amazing card trick?


How is this relevant to aviation safety? Well, I found this video helpful for understanding the limitations of my own mind. A trained observer who knows what to look for, and who systematically scans for information with the big picture in mind, will notice the obvious at first sight. The rest of us tend to believe we are born naturally good observers, but in reality we're functionally blind. Task-specific training and preparedness are key.

(mods, wasn't sure where to put this, feel free to move)

— deptrai
#2 · 12th Jun 2013, 09:35
The only counter to your 447 'observation' is that, cognition permitting, for years as an instructor I had been pointing out "High nose attitude, high rate of descent - means......?" Was this missed from their basic training?
— BOAC
#3 · 12th Jun 2013, 10:45
I didn't intend to offer an explanation for the 447 accident specifically; maybe my stall example was a bit distracting. Anyway, if you subscribe to the Swiss cheese model, there is usually no single cause. Particularly in aviation, with all its redundancies and safeguards to mitigate possible failures, accidents tend to happen after a sequence of multiple failures (to state the obvious).

It's just that when reading accident reports, I've often found myself wondering "how can they be so blind?". I don't know about you, but for me, it's been very hard to understand. This card trick was merely an attempt to demonstrate that such blindness could happen to anyone.

But to answer your question (which is another issue), whether there was something wrong with the 447 pilots' basic training - I don't know, but my knee-jerk reaction would be to assume AF should have included a high-altitude stall scenario, and possibly pitot failure, in their recurrent training, and (hopefully) it could have made a difference. If you already know what to look for, you'll spot it more easily. I'm sure Air France did modify their training. If we follow that line of thinking, we also need to consider that there are practical limits to the number of scenarios that can be trained. Should training departments add one more specific scenario to sim sessions after every accident, because modern aviation is highly complex and training needs to be task-, situation- (and type-) specific? That's a typical, reactive approach, and legal advisors tend to approve of it ("we need to do something specific to address this possible liability and cover our ass").

When I read your "High nose attitude, high rate of descent - means......?", it strikes me how brilliant simplicity is at solving complex problems. Many tend to assume that complex problems require complex solutions. Anything related to aviation is complex, and discussions tend to get even more complex. But look at examples of good decision-making, e.g. Sully and Skiles: after realizing they have no thrust, how do they figure out whether they can make it back to La Guardia? Do they use tables, solve equations, or consult the Flight Management System? Consider vectors, altitude, mass, velocities, aerodynamics, drag, glide ratio, wind, etc.? No - the obvious answer is you "intuitively" look out of the window and see whether a visual reference moves up in the windshield; if it does, you won't make it. A single angle is all you need to consider to solve that problem and avoid making a hole in the ground - you can safely disregard all the other complexities and intricacies of physics.
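That windshield test can be written down as a one-angle comparison: the target is reachable when the angle of depression from you down to it is at least as steep as your best glide angle. A minimal sketch of that idea (the numbers are purely illustrative, not the actual US1549 figures):

```python
import math

def can_reach(distance_m: float, altitude_m: float, glide_ratio: float) -> bool:
    """Reachable in a glide if the angle of depression to the target is at
    least as steep as the aircraft's best glide angle."""
    angle_to_target = math.atan2(altitude_m, distance_m)  # depression angle to the target
    glide_angle = math.atan2(1.0, glide_ratio)            # best achievable glide slope
    return angle_to_target >= glide_angle

# Illustrative only: an airliner gliding at roughly 17:1 from 900 m altitude
# can cover about 15.3 km over the ground.
print(can_reach(10_000, 900, 17.0))  # True  - target within glide range
print(can_reach(16_000, 900, 17.0))  # False - target too far away
```

The point, of course, is that the pilot computes none of this: watching whether the aim point drifts up or down in the windshield answers the same one-angle question directly.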

So when you ask "Was this missed from their basic training?" I did start to wonder: maybe some training - let's randomly single out JAA ATPL theory as an example - has become too theoretical to be of much use in real life? And/or incomplete, and too complex? Real-life decision making requires you to cope with complexity and uncertainty. "Simple" and "obvious" things like "high nose attitude, high rate of descent = probably a stall" can be very helpful. Maybe "more training" isn't always the answer, but better, more practical, "simpler" training - training which focuses on equipping pilots with the skills to reduce complexity? I'm just thinking out loud here.

— deptrai
#4 · 12th Jun 2013, 11:44
I quite agree - how can folk (myself at times included) be so blind to the obvious cockpit clues the aircraft is giving us? It is cognitive overload - as we know, hearing is thought to be the first sense to fail in overload conditions.

I was not trying to pick your post to pieces. I just read "If you're in a deep stall, falling out of the sky fast, but you're looking for clues to confirm an overspeed condition, possibly you'll find some clues which could indicate overspeed" as a direct 447 allusion, and asked myself why 'basic' instinct did not kick in. After all, a stall/major loss of flying speed is probably the most threatening basic thing for a pilot - of any sort - and should be ingrained from the 'nappies' stage.
— BOAC
#5 · 12th Jun 2013, 22:49
Related to training and making sense:-

Outcome feedback (“you got it wrong”) isn’t nearly as useful as process feedback (“you did it wrong”), because knowing that performance was inadequate isn’t as valuable as understanding what to modify in the reasoning process.

Making Sense 1

Making Sense 2

and an interesting view of making sense – A320 Hudson ditching:- http://www.pprune.org/safety-crm-qa-...ml#post7886479
— safetypee
#6 · 13th Jun 2013, 01:09
Originally Posted by deptrai
How is this relevant to aviation safety? Well I found this video helpful to understand the limitations of my own mind. A trained observer who knows what to look for, and systematically scans for information, with the big picture in mind, will notice the obvious at first sight. The rest of us - we tend to believe we are born naturally good observers, but in reality, we're functionally blind. Task-specific training, and preparedness is key.
There's a similar example here, from 23:40 onwards, that also shows how we can make a decision and immediately accept the alternative "wrong" choice as being the result of our own rational judgement.

I believe this is very relevant to aviation safety when humans are increasingly being given a supervisory role in the cockpit. Our brains are simply not wired for that kind of task and, ironically, the more we improve aircraft safety through automation, the greater the risk of "pilot error" when carbon and silicon disagree.
— CelticRambler
#7 · 13th Jun 2013, 07:34
Originally Posted by CR
ironically, the more we improve aircraft safety through automation, the greater the risk of "pilot error" when carbon and silicon disagree.
- indeed, the more we 'automate' the job, the more is the 'surprise' factor when it goes pear-shaped, whereas in 'the old days' when you were actually having to consciously FLY the aircraft, you were far more 'in tune' in SA terms.
— BOAC
#8 · 14th Jun 2013, 14:21
"Experts can perceive things that are invisible to the novice."
"It takes a decade or more for someone to become an expert in most significant domains."

Another essay from the same stable as the previous references:- Perceptual (Re)learning.

"The experts didn't just detect cues but understood the meaning and significance of cues that were present, and ones that were absent."
— safetypee
#9 · 13th Jul 2013, 08:07
Very interesting.

Have you considered how the law of primacy and confirmation bias can play a big part in the decision-making process?
— Pull what
