human factors and cognitive biases: how we can be functionally blind


deptrai
12th Jun 2013, 08:04
Some accidents never fail to make me wonder "how could they miss that? how could they fail to notice what was going on?" I've been trying to understand some cognitive biases, how my mind (or your mind) could be functionally blind to blazingly evident clues.

One thing I think I'm starting to understand, a bit simplified: if you're in a deep stall, falling out of the sky fast, but you're looking for clues to confirm an overspeed condition, you may well find clues that seem to indicate overspeed. And you could be completely blind to the obvious: the stall. With 20/20 hindsight, when you know what to look for, it's easy to see everything, but can you spot something you're not prepared for? Are you a good observer?

Can you see what's going on in this amazing card trick?

Colour Changing Card Trick - YouTube

How is this relevant to aviation safety? Well, I found this video helpful for understanding the limitations of my own mind. A trained observer who knows what to look for, and who systematically scans for information with the big picture in mind, will notice the obvious at first sight. The rest of us tend to believe we are born naturally good observers, but in reality, we're functionally blind. Task-specific training and preparedness are key.

(mods, wasn't sure where to put this, feel free to move)

BOAC
12th Jun 2013, 09:35
The only counter to your 447 'observation' is that, cognitive biases permitting, for years as an instructor I had been pointing out "High nose attitude, high rate of descent - means......?" Was this missing from their basic training?

deptrai
12th Jun 2013, 10:45
I didn't intend to offer an explanation for the 447 accident specifically; maybe my stall example was a bit distracting. Anyway, if you subscribe to the swiss cheese model, there is usually no single cause. Particularly in aviation, with all its redundancies and safeguards to mitigate possible failures, accidents tend to happen after a sequence of multiple failures (to state the obvious :}).

It's just that when reading accident reports, I've often found myself wondering "how can they be so blind?". I don't know about you, but for me, it's been very hard to understand. This card trick was merely an attempt to demonstrate that such blindness could happen to anyone.

But to answer your question (which is another issue), whether there was something wrong with the 447 pilots' basic training - I don't know, but my knee-jerk reaction would be to assume AF should have included a high altitude stall scenario, and possibly pitot failure, in their recurrent training, and (hopefully) it could have made a difference. If you already know what to look for, you'll spot it more easily. I'm sure Air France did modify their training. If we follow that line of thinking, we also need to consider that there are practical limits to the number of scenarios that can be trained. Should training departments add one more specific scenario to sim sessions after every accident, because modern aviation is highly complex and training needs to be task, situation (and type) specific? That's a typical, reactive approach, and legal advisors tend to approve of it ("we need to do something specific to address this possible liability and cover our ass").

When I read your "High nose attitude, high rate of descent - means......?", it strikes me how brilliant simplicity is at solving complex problems. Many tend to assume that complex problems require complex solutions. Anything related to aviation is complex, and discussions tend to get even more complex. But if we look at examples of good decision-making, e.g. Sully and Skiles: after realizing that they have no thrust, how do they figure out whether they can make it back to LaGuardia or not? Do they use tables, solve equations, or consult the Flight Management System? Consider vectors, altitude, mass, velocities, aerodynamics, drag, glide ratio, wind etc.? No, the obvious answer is you "intuitively" look out of the window and see if a visual reference moves up in the windshield; if it does, you won't make it. A single angle is all you need to consider to solve that problem and avoid making a hole in the ground - you can safely disregard all the other complexities and intricacies of physics.
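To make that single-angle rule concrete, here's a minimal sketch of the geometry (my own illustration, not anything from the accident reports; the glide ratio, altitude and distances are made-up numbers, not US1549 data). If the depression angle to the aiming point grows as you glide, the point sinks in the windshield and you'll overshoot it; if the angle shrinks, the point rises and you'll land short:

import math

def sight_angle(height_m, dist_m):
    """Depression angle (degrees below the horizon) to the aiming point."""
    return math.degrees(math.atan2(height_m, dist_m))

def glide_toward(height_m, dist_m, glide_ratio, steps=5):
    """Fly a constant-ratio glide toward the target and watch the angle.

    Growing angle -> target moves down in the windshield -> reachable.
    Shrinking angle -> target moves up in the windshield -> you land short.
    """
    descent = height_m / (steps * 2)    # lose half our height over the demo
    forward = descent * glide_ratio     # ground covered per step
    last = sight_angle(height_m, dist_m)
    for _ in range(steps):
        height_m -= descent
        dist_m -= forward
        ang = sight_angle(height_m, dist_m)
        trend = "down (will make it)" if ang > last else "up (will land short)"
        print(f"height {height_m:5.0f} m  dist {dist_m:6.0f} m  "
              f"angle {ang:5.2f} deg -> target moves {trend}")
        last = ang

# Assumed ~17:1 glide ratio from 900 m: still-air range is roughly 15 km.
print("Target 10 km away (inside range):")
glide_toward(900, 10_000, 17)
print("Target 20 km away (beyond range):")
glide_toward(900, 20_000, 17)

The point of the sketch is the same as yours: one observable angle collapses all the vectors, mass and drag into a single go/no-go cue.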

So when you ask "Was this missing from their basic training?" I did start to wonder: maybe some training - let's randomly single out JAA ATPL theory as an example - has become too theoretical to be of much use in real life? And/or incomplete, and too complex? Real-life decision making requires you to cope with complexity and uncertainty. "Simple" and "obvious" things like "high nose attitude, high rate of descent = probably a stall" can be very helpful. Maybe "more training" isn't always the answer, but better, more practical, "simpler" training - training which focuses on equipping pilots with the skills to reduce complexity? I'm just thinking out loud here.

BOAC
12th Jun 2013, 11:44
I quite agree - how can folk (myself at times included) be so blind to the obvious cockpit clues the aircraft is giving us? It is cognitive overload - as we know, it is thought that hearing is the first sense to fail in overload conditions.

I was not trying to pick your post to pieces. I just saw "If you're in a deep stall, falling out of the sky fast, but you're looking for clues to confirm an overspeed condition, you may well find clues that seem to indicate overspeed" as a direct 447 allusion, and asked myself why 'basic' instinct did not kick in. After all, a stall/major loss of flying speed is probably the most threatening basic thing for a pilot - of any sort - and should be ingrained from the 'nappies' stage.

safetypee
12th Jun 2013, 22:49
Related to training and making sense:-

“Outcome feedback (“you got it wrong”) isn’t nearly as useful as process feedback (“you did it wrong”), because knowing that performance was inadequate isn’t as valuable as understanding what to modify in the reasoning process.”

Making Sense 1 (http://xstar.ihmc.us/research/projects/EssaysOnHCC/Perspectives%20on%20Sensemaking.pdf)

Making Sense 2 (http://xstar.ihmc.us/research/projects/EssaysOnHCC/Sensemaking.2.pdf)

and an interesting view of making sense – A320 Hudson ditching:- http://www.pprune.org/safety-crm-qa-emergency-response-planning/509356-crm-training-question-about-its-operational-limitations.html#post7886479

CelticRambler
13th Jun 2013, 01:09
How is this relevant to aviation safety? Well, I found this video helpful for understanding the limitations of my own mind. A trained observer who knows what to look for, and who systematically scans for information with the big picture in mind, will notice the obvious at first sight. The rest of us tend to believe we are born naturally good observers, but in reality, we're functionally blind. Task-specific training and preparedness are key.

There's a similar example here: How to Make Better Decisions (BBC Documentary) - YouTube, from 23:40 onwards, which also shows how we can make a decision and immediately accept the alternative "wrong" choice as the result of our own rational judgement.

I believe this is very relevant to aviation safety when humans are increasingly being given a supervisory role in the cockpit. Our brains are simply not wired for that kind of task and, ironically, the more we improve aircraft safety through automation, the greater the risk of "pilot error" when carbon and silicon disagree.

BOAC
13th Jun 2013, 07:34
ironically, the more we improve aircraft safety through automation, the greater the risk of "pilot error" when carbon and silicon disagree. - indeed, the more we 'automate' the job, the greater the 'surprise' factor when it goes pear-shaped, whereas in 'the old days', when you were actually having to consciously FLY the aircraft, you were far more 'in tune' in SA terms.

safetypee
14th Jun 2013, 14:21
“Experts can perceive things that are invisible to the novice”.
“It takes a decade or more for someone to become an expert in most significant domains”.

Another essay from the same stable as previous references:- Perceptual (Re)learning. (http://xstar.ihmc.us/research/projects/EssaysOnHCC/Perceptual_%28Re%29learning.pdf)

“The experts didn’t just detect cues but understood the meaning and significance of cues that were present, and ones that were absent.”

Pull what
13th Jul 2013, 08:07
Very interesting

Have you considered how the law of primacy and confirmation bias can play a big part in the decision-making process?