PPRuNe Forums - View Single Post - Can automated systems deal with unique events?
Old 26th Oct 2015, 20:30
  #11
slast
 
Join Date: Jan 2010
Location: Marlow (mostly)
Posts: 364
Responses to a few points

Good to get some serious answers so fast...!

DaveReidUk, I pondered long and hard over whether to make it unforeseen or unique (or both). Can you continue that thought with examples of events, if any, that are unique but not unforeseen, and vice versa?

PD (and several others!): just to be clear, I DON'T consider that pilots ARE the predominant cause - that's the pro-automation lobby viewpoint. But it gets support from graphs like this one from an MIT study on "Safety Management Challenges for Aviation Cyber Physical Systems". This was picked at random from many similar ones.


Darkroom, re your "in theory"... para. The failures I see as problematic to deal with are not ones that HAVE ever happened, but ones that have not YET happened and almost certainly never will. These are for practical purposes infinite in number - certainly many orders of magnitude greater than the number of possible games of chess (10^120?).
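For anyone curious where that 10^120 figure comes from, it is roughly Shannon's classic back-of-envelope estimate, which a few lines of Python can reproduce. The branching factor (~30 legal moves per position) and game length (~40 moves per side, i.e. 80 plies) are Shannon's assumed round numbers, not figures from this thread:

```python
import math

# Shannon's rough assumptions (not exact chess statistics):
branching_factor = 30   # typical legal moves available in a position
plies = 80              # ~40 moves per side = 80 half-moves (plies)

# Number of possible games ~ branching_factor ** plies;
# take log10 to express it as a power of ten.
exponent = plies * math.log10(branching_factor)
print(f"~10^{exponent:.0f} possible chess games")  # prints ~10^118
```

So "10^120" is the right ballpark, and the point above stands: a failure space many orders of magnitude beyond even that is not something you can enumerate and train against in advance.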

A human brain within a human body can be pretty good at chess, but is now relatively easily beaten by specialist programmes. However, the same brain/body combination can also deal with umpteen other issues (e.g. raising children, creating music) at which the same programme and hardware has zero capability. To what extent would a system "trained" to handle the QF32 scenario and every historic event be able to deal with a second QF32 in which one hot chunk went in a direction one degree different, with significantly different consequential failures? A human would ATTEMPT to cope just the same.

172Driver, I agree entirely with your comment about the information bias. I have devoted a couple of pages specifically to this on my own website, but the point there is that where pilots ARE responsible, they need collectively to do something about it. See these diagrams I made to illustrate that public perception does not align with the underlying reality:


Thanks for the link, very useful. The self-drive car issue is of course the canary in the mine: if they can't resolve the liability issue for cars, then the prospect goes away for aircraft too. See Volvo's recent statement... Volvo will accept liability for self-driving car crashes


FC101: see earlier, I do not consider that pilots ARE the major problem. I have an adaptation of Jim Reason's diagram here... and I agree that Gawande's Checklist Manifesto is a good read.



Thanks for the input, keep it coming...