PPRuNe Forums - View Single Post - "Pilotless airliners safer" - London Times article
Old 2nd Apr 2015, 15:54
Lonewolf_50
 
Originally Posted by Tourist
Aguadalte

Well, I think we have just seen another reason not to have humans on the flight deck...
An erroneous conclusion drawn from an outlier.
Not a good use of the scientific method you chirped about a few pages back.
More recently, the AD covering the A320 AoA signal fault was issued to address a potentially lethal failure mode that has to be overcome by fleshy cockpit occupants.
The 30-plus known Unreliable Airspeed events that did not end in a crash (the reference here is AF 447) were likewise overcome and mitigated by fleshy cockpit occupants.

The toxic cultural environment of today, the toxic culture within organizations driven by the min/max imperative (as the airline industry seems to be, in a general sense), and the potentially toxic cockpit culture within a given crew on a given day are symptoms of a belief system: the primacy of rules and of if/then statements.
If you want to look at root causes for a safety case, look at culture (writ large) and culture (organizational).
I'll put this to you, Tourist: the over-mathematicization of modern civilization has led to a min/max attitude that ended up in events like Calgon, and it is cousin to the belief that "if we just had another law" or "if we just had another rule" we'd make it better. In some ways, use of those tools has made things like automotive design, fuel efficiency, and road network management better. Min/max, sadly, leaves out most people, since most people exist within three sigmas of the bell curve. Most people aren't the min, and aren't the max.

The post provided a few pages back about the Y2K discovery process in a serious IT system should give you pause in your belief in computer programs as anything other than tools for meaty, fleshy, human beings.

As to military flying: I got to use a variety of kit, from CR2s to tube radios to NDBs to some pretty fancy digital cockpit devices by the time I left Naval Aviation.
The one common factor in all of them (save the CR2) was that now and again you had to turn the sumbitch off and back on again as a troubleshooting step when it acted up. That was the meat system, me, overcoming the machine system in order to get it to work.

For the systems you couldn't, like the power-sharing control system for the two T700 engines on a Seahawk, it was a bit frustrating to note that even a comparatively simple control system like that took the engineers and patent holders of the systems and subsystems involved years of troubleshooting to chase spurious inputs out of the loop, inputs that led to, among other things, uncommanded engine shutdowns in flight.

At one point while this electronic mess was being untangled by the card carrying smart guys, the Pacific Fleet curtailed certain hover training evolutions. That's right, for a while some of our training that involved hovering over water, which is a fundamental helicopter thing, was taken off the books until some of the electronic problems were identified and overcome.

In the decades since then, I have been very impressed with the reliability improvements of both turbofan and turboshaft engines: impressive work by Rolls, GE, Pratt, etc.
Exceptional reliability doesn't equal fault-proof. Whatever rare one-off event occurs becomes a sensation, it seems, just as the one-off at Germanwings has evoked a reaction that may or may not be in proportion to the root causes of that crash.

Compared to a top-to-bottom control system for a passenger aircraft, the logic/control system for these remarkable modern engines is primitive.

No thanks, your brave new world doesn't sell. The root cause is more likely to be the dehumanization of the person in the cockpit than the presence of the person in the cockpit. The question isn't "what was a human doing in that cockpit" but "what was that particular human doing in that cockpit that day?" Each day, humans in the cockpit do a great job of getting folks from point A to point B. As Dr Deming might suggest to you, from a statistical analysis point of view, if you make system-modification decisions based on an outlier rather than on a sound statistical basis, your change won't improve your product.
Your crazy FO example is an outlier.
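The statistical point above can be sketched in a few lines of Python. This is a hypothetical illustration, not anything from the thread: it simulates a bell-curve population of routine outcomes, shows that nearly all of them sit within three sigmas, and shows how little one extreme outlier tells you about the system as a whole.

```python
import random
import statistics

random.seed(42)

# Simulated "performance scores" for 10,000 routine flights: a bell curve,
# as the three-sigma argument in the post assumes.
scores = [random.gauss(0, 1) for _ in range(10_000)]

# For a normal population, roughly 99.7% of outcomes fall within 3 sigma.
within_3_sigma = sum(abs(s) <= 3 for s in scores) / len(scores)

# Add one extreme outlier (the "crazy FO" event) and compare summaries.
with_outlier = scores + [50.0]

mean_before = statistics.mean(scores)
mean_after = statistics.mean(with_outlier)
median_before = statistics.median(scores)
median_after = statistics.median(with_outlier)

print(f"share within 3 sigma: {within_3_sigma:.3f}")
print(f"mean shift from one outlier:   {mean_after - mean_before:.4f}")
print(f"median shift from one outlier: {median_after - median_before:.4f}")
```

The robust summary (the median) barely moves when the outlier is added; only the mean twitches, and even then by a fraction of a sigma. A system change justified by that single point is a change justified by noise.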

When someone says "pilot error" without ever having investigated a crash where pilot error may be a factor, they typically fail to ask "did the system he or she was in set that person up to fail?" Answering that question requires deep digging and careful attribution. The OP article, written by someone who doesn't understand that, wasn't worth the bandwidth it used.

Back to the rule-and-law obsession and its attendant legalism and lawsuit-crazy cultural cousin: the corporations who build, the agencies who govern and monitor, and the operators who run the business all have a strong financial incentive NOT to open the kimono when something goes wrong, due to liability concerns.

Easier to do as was common in the sheep and goat herding society of the ancient Hebrews: find a scapegoat and sacrifice him or her. Even better is to convince some of the sheep that it's in their best interest to not even exist.

My, how far we humans have come, culturally.

NOT!