Thread: AF447
26th Jun 2009, 16:34
#2375
PJ2
 
Join Date: Mar 2003
Location: BC
Age: 76
Posts: 2,484
YRP;
But be careful about blaming automation for this. How much better would a human pilot do with three separate sensors connected to three separate airspeed indicators? If indeed there is a pitot icing problem here, the problem is a mechanical/physical pitot/static sensor design problem NOT one in the software/automation.
[my italicizing/bolding]

Very well stated. In fact (if I may beg the indulgence of the mods for a moment), from designer to pilot, the notion you have expressed is fundamental to any comprehension of what is at the heart of "automated" flight. Software that can exercise human judgement (as when a pilot must make a judgement based upon 'what is reasonable') has yet to be written, not because it is difficult technically but because it is difficult philosophically, as anyone fascinated by Turing and the notion of AI will attest.

The notion, "philosophically", is not meant in any ethical sense - it is not a "should we or shouldn't we?" question; it is a question of understanding what human understanding and judgement mean. Further, the act of understanding "understanding" is itself a philosophical act.

And when we "understand" (or 'see'), we do so in a particular way and not just any way. Our "seeing" imposes a particular template; it is not a neutral act from which we can then derive "objective" data or perceptions.

This is an implicit limitation on knowing what "human judgement" is, in much the way Heisenberg framed uncertainty when he posited that the act of measurement itself affects that which is measured.

In other words, our perceptions, or the "way" we perceive, are neither innocent nor given - thus understanding judgement is an elusive project, one which demands not a technical understanding but a philosophical one.

Hopefully that circuitous discussion of your point regarding automation will add to understanding why designing and writing software to "mimic" human responses (i.e., judgement) is a different-order problem and not an "automation" problem per se.
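As a concrete footnote to that (and begging the mods' indulgence once more): below is a minimal sketch in Python, with hypothetical names and a hypothetical threshold - it is not any actual ADR monitoring logic. A fixed rule can compare its three airspeed sources; what it cannot do is judge whether the value they agree upon is reasonable. Common-mode pitot icing defeats the comparison precisely because the sources fail together:

Code:
from typing import Optional

# Hypothetical illustration only - not any certified monitor's logic.
def select_airspeed(adr1: float, adr2: float, adr3: float,
                    disagree_threshold_kt: float = 16.0) -> Optional[float]:
    """Median-select across three airspeed sources (knots).

    Returns the median if all three agree within a fixed threshold,
    otherwise None (airspeed declared invalid).
    """
    low, med, high = sorted((adr1, adr2, adr3))
    if high - low <= disagree_threshold_kt:
        return med       # the sources "agree" - but with what reality?
    return None          # disagreement: reject the data

# Healthy probes in cruise: a sensible answer.
print(select_airspeed(275.0, 276.0, 274.0))   # 275.0

# Common-mode icing: all three probes under-read together, the fixed
# rule is satisfied, and a wrong value passes with no fault flagged.
print(select_airspeed(182.0, 185.0, 180.0))   # 182.0

The "judgement" a pilot brings is exactly the context that rule was never given: 182kt is not a reasonable cruise value for the weight, level and Mach, advisory or no advisory.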

Graybeard;
PF saying, "Hey, my airspeed is falling off." The PNF replies, "Mine, too; and so is the standby, just like that recent advisory."

PF then disconnects the automatics and flies attitude.

Meanwhile, the automatics behave exactly as they were programmed 15 years ago. They didn't read the advisory...
Precisely the point, said in much clearer terms!
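
Sketched the same hedged way (a caricature, not real flight-guidance code): when the fixed rule does trip, the only response the automatics possess is the one written into them years ago - reject the data and hand the aeroplane back. Recognising why, and flying pitch and power instead, is the human half of the system:

Code:
# Hypothetical caricature - not Airbus flight-guidance logic.
def monitor_step(speeds_kt: list[float],
                 disagree_threshold_kt: float = 16.0) -> str:
    """One pass of a fixed airspeed-disagree rule."""
    if max(speeds_kt) - min(speeds_kt) > disagree_threshold_kt:
        # The advisory lives in the crew's memory, not here: this
        # branch executes exactly as it did the day it shipped.
        return "AP/ATHR DISCONNECT"   # the problem is now the crew's
    return "AP/ATHR ENGAGED"

print(monitor_step([275.0, 276.0, 274.0]))   # ENGAGED
print(monitor_step([120.0, 270.0, 268.0]))   # one probe iced: DISCONNECT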

PJ2