View Full Version : The Dorsolateral Prefrontal Cortex


Sciolistes
30th Dec 2009, 04:36
12WattTim posted a link to this Wired article (http://www.wired.com/magazine/2009/12/fail_accept_defeat/all/1) about the kind of human factors that scientists experience, which suggests that the scientific process is not all it is cracked up to be. No surprises there for the worldly wise.

But the article also seems potentially relevant to flight crew. It seems to me that the DLPFC (sorry, you're going to have to read the article) can actively hinder our ability to see undesired situations, which suggests it isn't simply a matter of alertness.

Unlike scientists, we are trained to deal with abnormal situations to the extent that the abnormal is normal in a simulator. As a consequence, certainly in my meagre experience, reaching for the QRH doesn't seem that unusual, and I would consider the precipitating event to be something I expect at some point in my career. In other words, largely thanks to our training, the word "abnormal" is a synonym for another kind of expected event. That's easy for us to deal with, especially with systems-related issues.

However, in terms of situation awareness, I think there are plenty of examples. We know the reports of gear-up landings despite the blaring horn, and of CFIT despite the "whoop whoop pull up"; I'm sure there is a veritable myriad of less extreme examples. These seem to fit the 'doesn't fit the picture - delete' function of that part of the brain. Yet I have never heard of this part of the brain and its functions, or at least this one. If my correlation with flight crew is a reasonable one, perhaps this is a bit of an omission in human factors teaching.

Consider what it means to be aware that your own trained ability will work against you in a devilishly insidious manner just when recognising such a situation in a timely manner is of critical importance. I think specific knowledge of the brain's limitation in this regard is the one thing that can enable a pilot to question himself, not only when something is clearly wrong but also when there may be no immediately apparent reason to do so, and not to give his own mental picture any further credence until it has been consciously revalidated.

Well, it certainly piqued my interest!

turbocharged
30th Dec 2009, 08:07
It's an interesting article but...

We cover confirmation bias during discussions about decision-making. Isn't this what is being discussed in the article?

It is probably time to move HF/CRM onto a more rigorous footing - ecological/evolutionary explanations can now be offered for most behavioural precursors - but I wonder whether the audience would be as interested as we are.

Sciolistes
30th Dec 2009, 11:50
Maybe. But my perception is that confirmation bias is taught as something we combat by looking for assumptions, reviewing and cross-checking.

The area of the brain the article refers to seems to become more efficient with training and experience as the norms become more ingrained; that brain is thus automatically filtering out what it believes is irrelevant noise without any conscious effort.

This issue, then (as I infer), seems to occupy a lower, more basic level, in that the conscious effort to review or cross-check is potentially and unknowingly thwarted.

So, it doesn't seem to be simply a case of looking for what you perceive to be correct, but more a case of being blinded to what is probably unusual, even if you are looking for something wrong. As the article says, the 'unwanted' information can actually be inhibited.

turbocharged
30th Dec 2009, 13:16
My point was that confirmation bias, along with many other decision-making heuristics, is a manifestation of some psychological process. As such, it has to happen somewhere in the brain.

Seems to me that there will still be some requirement for deliberate, conscious testing of data in order to establish the validity of assumptions ... no matter how hard we try to 'train the brain'