
End of Aircraft Operation

Old 2nd Jul 2020, 11:55
  #21 (permalink)  
 
Join Date: Dec 1999
Posts: 342
Likes: 0
Received 0 Likes on 0 Posts
One thing is certain... whoever ends up sat at the pointy end will not be paid very much...

Better get used to it...
waco is offline  
Old 2nd Jul 2020, 13:15
  #22 (permalink)  
 
Join Date: Jun 2007
Location: Wanderlust
Posts: 3,404
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by twistedenginestarter
By the way Sully did not handle the Hudson River ditching perfectly. As it happens his mis-settings were not fatal, but they could have been in different circumstances.
That's true. He was faced with an unpracticed situation and tremendous time pressure. The engines had not failed but were too damaged to produce any meaningful thrust, so the aircraft remained in Normal Law. Configuration was not the critical issue, but he didn't maintain his speed. The low speed triggered alpha protection (Valpha prot) and he wasn't able to flare sufficiently, leading to a heavy impact. Perhaps twistedengine wants to say that if automation had been possible, it could have controlled the speed better.
vilas is offline  
Old 2nd Jul 2020, 13:20
  #23 (permalink)  
ENTREPPRUNEUR
Thread Starter
 
Join Date: Jun 2001
Location: The 60s
Posts: 566
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by esscee
Any computer is only as good as the input information to it plus its own "100%" solid safe operating software. You only have to consider the situation of Windows 10 and the many problems of nearly every "Update".
This, I think, is the biggest argument against automation.

The perspective I proposed is that, in an era of concern over the competence and authenticity of airline pilots, you can guarantee quality performance by changing to computer systems: they deliver consistency, can be made aware of every known parameter, procedure, and optimum failure response, and are largely immune to the temptation to do things that are not supposed to be done.

The argument is that billions of dollars can be invested in your pilot, because your pilot is exactly the same as everybody else's flying the same aircraft type. Your pilot can know more than any human pilot, because your pilot will have millions of man-hours of training. Your pilot will have been tested for thousands upon thousands of hours in the most extreme and rare situations. Clever aeronautical engineers will have honed your pilot to the pinnacle of precision.

The logic is irresistible.

Until you spend a few hours on the internet.

We are all exposed to websites which are chronically inept and dysfunctional. These are usually from large, global organisations with huge revenue streams at stake. Companies with super sophisticated and super expensive IT departments. Yet they punt out endless persistent rubbish. I totally fail to understand why a) software engineers create crap and b) testers do not pick up the problems (having been both a software developer and a tester during my working life).

So the calculation becomes: do you want to risk your life (or your massive aeronautical colossus) on slightly flawed but mostly very, very effective human pilots, or choose a digital nirvana where virtually anything is possible, but where history shows that if there is a cock-up that can be made, it somehow can't be ruled out?

If only we knew what Boeing and Airbus are thinking...
twistedenginestarter is offline  
Old 2nd Jul 2020, 16:29
  #24 (permalink)  
 
Join Date: Oct 2008
Location: united states
Age: 45
Posts: 113
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by esscee
Any computer is only as good as the input information to it plus its own "100%" solid safe operating software. You only have to consider the situation of Windows 10 and the many problems of nearly every "Update".
Great point. Even with machine learning (ML), a subset of AI in which the system/computer/machine improves its performance with experience over time, errors are made--potentially fatal ones. In supervised ML the machine learns from explicitly labeled examples by adjusting parameter weights; in semi-supervised learning there is a balance between pre-labeled data and flexible learning on unlabeled data; and then there is unsupervised ML, which proved very useful in uncovering genes we did not know existed during the Human Genome Project. Theoretically, yes, these ML algorithms can save lives or land an aircraft with greater ease under some circumstances, but no one knows in advance, and at least after Boeing's recent failures (787, 737 Max) we should all be more skeptical of the so-called capabilities of new smart autonomous systems. In research with my team--biometrics, AI financial/stock-market analysis, autonomous-car research, and cybersecurity--so much goes wrong in testing, a lot of it that we did not even anticipate. Speaking with my pilot friends here on PPRuNe and elsewhere, split-second decisions on an aircraft are not only complex but often life and death.
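To make the supervised/unsupervised distinction concrete, here is a minimal Python sketch (scikit-learn on made-up toy data, purely for illustration -- nothing like certified avionics software):

# Minimal illustration of supervised vs unsupervised ML.
# Toy data: two noisy clusters in 2-D; labels are known for the
# supervised case and withheld for the unsupervised one.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
cluster_a = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
cluster_b = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(50, 2))
X = np.vstack([cluster_a, cluster_b])
y = np.array([0] * 50 + [1] * 50)  # ground-truth labels

# Supervised: learn a decision rule from labelled examples.
clf = LogisticRegression().fit(X, y)
print("supervised prediction:", clf.predict([[2.8, 3.1]]))

# Unsupervised: discover structure with no labels at all
# (loosely analogous to finding unknown gene groupings).
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print("unsupervised cluster:", km.predict([[2.8, 3.1]]))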
jcbmack is offline  
Old 2nd Jul 2020, 16:37
  #25 (permalink)  
 
Join Date: Oct 2008
Location: united states
Age: 45
Posts: 113
Likes: 0
Received 0 Likes on 0 Posts
"This I think is this biggest argument against automation.

The perspective I proposed is, in an era of concern over the competence and authenticity of airline pilots, you can guarantee quality performance by changing to computer systems which deliver consistency and can be made aware of every known parameter, procedure, and optimum failure response, apart from being largely immune to the temptations of doing things that are not supposed to be done."

I wanted to address this part of your post in isolation. In no industry in the world can any computer system guarantee quality performance. In all my years of programming, algorithm assessment, design, testing, implementation, and leading specialist teams with their own expertise, it has never worked out that way. During the Cold War it was believed that so-called "AI" would be able to interpret Russian and other languages near flawlessly--it is only in the last few years that even Google Translate has become more than proficient in a plethora of languages, and even today it often misses sub-dialects and known slang in languages like Bisaya (Visayan/Cebuano) and Russian. Computer vision is still in many ways in its infancy too--an area I worked on with researchers from Saarland University in Germany. Even in self-driving cars there are significant performance and safety issues, sometimes ending in death--and while aircraft may operate in clearer airspace than vehicles on the road, when things go wrong the result can be widespread fatalities. Clearly, not every pilot is a "PPRuNe heavyweight", but that does not make the argument for autonomous computer systems working perfectly--a negative in one area does not prove a positive in another. I have a deep respect for pilots and I love traveling by air--I work with technology that has enhanced human beings' lives--but the limitations are greater than most realize.
jcbmack is offline  
Old 2nd Jul 2020, 16:40
  #26 (permalink)  
 
Join Date: Oct 2008
Location: united states
Age: 45
Posts: 113
Likes: 0
Received 0 Likes on 0 Posts
Complex

Originally Posted by vilas
That's true. He was faced with an unpracticed situation and tremendous time pressure. The engines had not failed but were too damaged to produce any meaningful thrust, so the aircraft remained in Normal Law. Configuration was not the critical issue, but he didn't maintain his speed. The low speed triggered alpha protection (Valpha prot) and he wasn't able to flare sufficiently, leading to a heavy impact. Perhaps twistedengine wants to say that if automation had been possible, it could have controlled the speed better.
I have no reason to doubt that claim from multiple professional pilots. Still, Sully got the A320 safely down, with no casualties, and everyone was grateful. I am skeptical that the system itself could have controlled for all the variables and done a better job--not impossible; I don't work on the systems in Airbuses, but knowing the technology's limitations, I can still be skeptical.
jcbmack is offline  
Old 2nd Jul 2020, 17:13
  #27 (permalink)  
 
Join Date: Jun 2007
Location: Wanderlust
Posts: 3,404
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by jcbmack
Speaking with my pilot friends here on PPRuNe and elsewhere, split-second decisions on an aircraft are not only complex but often life and death.
There are simply no split-second decisions for any failures; if anyone told you that, it's an exaggeration. Most accidents/incidents are attributed to pilot error. The long list of human factors responsible for errors makes it tempting for technology to attempt to remove the human. Accidents happened many times more often without automation. All-weather CAT3 operations and reduced vertical separation to increase the number of aircraft using an airway are simply not possible without automation. And if accidents cannot be avoided with humans present, then the question is whether there would be fewer of them under full automation. The B737 Max failures are not at all because of automation. They are the result of trying to match a competitor by fitting a more fuel-efficient engine onto an unsuitable airframe, then trying to correct the problems that arose through software without redundancy, and selling it in immoral haste. Don't compare that.
vilas is offline  
Old 2nd Jul 2020, 19:57
  #28 (permalink)  
 
Join Date: Oct 2008
Location: united states
Age: 45
Posts: 113
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by vilas
There are simply no split-second decisions for any failures; if anyone told you that, it's an exaggeration. Most accidents/incidents are attributed to pilot error. The long list of human factors responsible for errors makes it tempting for technology to attempt to remove the human. Accidents happened many times more often without automation. All-weather CAT3 operations and reduced vertical separation to increase the number of aircraft using an airway are simply not possible without automation. And if accidents cannot be avoided with humans present, then the question is whether there would be fewer of them under full automation. The B737 Max failures are not at all because of automation. They are the result of trying to match a competitor by fitting a more fuel-efficient engine onto an unsuitable airframe, then trying to correct the problems that arose through software without redundancy, and selling it in immoral haste. Don't compare that.
There are no split-second decisions to make, at all? No novel events not foretold by an AC, flight manual, or computer system? I know the NTSB likes to blame pilots virtually straight away--this I have been told by a few experienced pilots here on PPRuNe, and the same case was attempted against Sully. I cannot comment on actually flying a B737, but I can guarantee automation failure and other computer-based factors were at play there--just as the idea of using lithium-ion batteries in the 787 was a grave error. I can tell from the information I have access to, which is not all Boeing has, that the software was buggy on the 737 Max and the issue there was system design. There are other systems-error examples: Qantas Flight 72, where a malfunction in the ADIRU was found, and a Lufthansa Airbus A321 that had autopilot issues--when the autopilot was turned off, the nose dropped. The Boeing 737 Max's MCAS did indeed malfunction. Now, we all know the claims of William Langewiesche--while not the pilot his father was (the author of Stick and Rudder), he is still a pilot, so I did read his statements very carefully. There is ample evidence that Boeing downplayed the serious issues with MCAS, the FAA did not look hard enough, and now we have disparate experts trying to blame the pilots. It is easy for us non-pilots to blame the pilots, but I do know, with years of experience in software engineering and AI/ML research, and from talking to seasoned pilots, that the narrative is not so clear-cut. Sure there are pilot errors, and they get found out, but to blame the pilots and gloss over Boeing's sloppy engineering of the 737 Max upgrades does a great disservice. These computer systems are not infallible either.
jcbmack is offline  
Old 3rd Jul 2020, 02:40
  #29 (permalink)  
ENTREPPRUNEUR
Thread Starter
 
Join Date: Jun 2001
Location: The 60s
Posts: 566
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by jcbmack
we have disparate experts trying to blame the pilots;
With MCAS, the argument is whether or not it was correct to expect the pilots to do the right things at the speed they needed to. MCAS strongly supports the proposition of automated flying, simply because MCAS doesn't work when the plane is flying automatically. The autopilot does not need any special help to cope with the evil larger engine configuration.

If you take the Max crashes and factor out MCAS, you are left with an interesting question: if the plane had been designed so that the computers were in charge, how easy would it have been to program them to cope with the sensor failures? Inconsistent or suspicious speed (etc.) readings can be caused by a number of system failures, so they would have to be a class of situations the computers could cope with. Probably the answer is that Boeing would have had to build in greater redundancy (i.e. resilience), more like Airbus does.

And more like Boeing will now have to do anyway.
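As a rough illustration of the kind of redundancy logic suggested above -- a hypothetical Python sketch, not anything from a real flight-control system -- a voter takes three independent channels and flags the one that disagrees:

# Hypothetical triple-redundant sensor vote (illustration only,
# not real avionics code). Take the median of three readings and
# flag any channel that strays too far from it.
from statistics import median

DISAGREE_LIMIT = 5.0  # assumed tolerance, e.g. degrees AOA or knots

def vote(readings):
    """Return (voted_value, list_of_suspect_channel_indices)."""
    voted = median(readings)
    suspects = [i for i, r in enumerate(readings)
                if abs(r - voted) > DISAGREE_LIMIT]
    return voted, suspects

value, suspects = vote([4.8, 5.1, 26.0])  # one faulty channel
print(value)     # 5.1 -> the faulty reading is outvoted
print(suspects)  # [2] -> channel 2 flagged for the crew/monitor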
twistedenginestarter is offline  
Old 3rd Jul 2020, 03:38
  #30 (permalink)  
 
Join Date: Oct 2008
Location: united states
Age: 45
Posts: 113
Likes: 0
Received 0 Likes on 0 Posts
Mixed Results

Originally Posted by twistedenginestarter
With MCAS, the argument is whether or not it was correct to expect the pilots to do the right things at the speed they needed to. MCAS strongly supports the proposition of automated flying, simply because MCAS doesn't work when the plane is flying automatically. The autopilot does not need any special help to cope with the evil larger engine configuration.

If you take the Max crashes and factor out MCAS, you are left with an interesting question: if the plane had been designed so that the computers were in charge, how easy would it have been to program them to cope with the sensor failures? Inconsistent or suspicious speed (etc.) readings can be caused by a number of system failures, so they would have to be a class of situations the computers could cope with. Probably the answer is that Boeing would have had to build in greater redundancy (i.e. resilience), more like Airbus does.

And more like Boeing will now have to do anyway.
http://www.b737.org.uk/mcas.htm

There was an early issue regarding how the system responded in slow flight (slow-speed activation was added alongside the high-speed pitch-up conditions). There was also an issue with the AOA disagree alert, which was not properly activated on all 737 Max aircraft at all airlines. This was a contributing factor in the Lion Air accident.

In the Ethiopian accident, after the pilots turned off the stab trim cutout switches, they could not move the trim wheel manually, so they switched them back on to regain electric trim--but this also re-enabled MCAS. It is not known why they could not manually move the trim wheel.

As for Boeing becoming more like Airbus: historically the two have had different engineering philosophies, and perhaps one issue here is Boeing trying too hard to be like Airbus. These are very different aircraft, with different styles and engineering practices.

Here is a salient quote: "During the House Committee on Transportation & Infrastructure hearing in October 2019, an email exchange was disclosed between Boeing employees from 2015 which read: 'Are we vulnerable to single AOA sensor failures with the MCAS implementation?' The response from CEO Dennis Muilenburg was that the email showed that 'our engineers do raise questions, in an open culture,' but that the single-sensor design met the standards. John Hamilton, chief engineer for Boeing's commercial airplane division, who testified alongside Muilenburg, said that single points of failure are allowed in airplane design depending on the hazard assessment. Any dissent the committee could present on the final assessment that a single sensor was merited 'highlights that our engineers do raise questions and it's an open culture.'"

The final KNKT investigation report into the Lion Air accident cited as a contributing factor: "The replacement AOA sensor that was installed on the accident aircraft had been miscalibrated during an earlier repair. This miscalibration was not detected during the repair." The angle it registered was 21 degrees too high. Following the publication of this report, the FAA revoked the certificate of approval of Xtra Aerospace of Miramar, Fla., the company that supplied the faulty AOA sensor. Xtra subsequently issued a statement saying that "we respectfully disagree with the agency's findings." It added that the revocation of its certificate "is not an indication that Xtra was responsible for the accident."

Again, sensor issues have become more commonplace, and this needs to be addressed even as everyone tries to avoid responsibility. More significant redundancy, perhaps--but maybe MCAS itself was the wrong way to gain longitudinal stability. That takes more digging.
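To illustrate, a hypothetical Python cross-check of the kind that could catch a miscalibrated vane like the one described above -- the threshold is an assumed value, not Boeing's actual logic:

# Hypothetical AOA cross-check (illustration only, not the real
# 737 Max logic). Compare left/right AOA vanes and latch a
# disagree flag instead of acting on a single channel.
AOA_DISAGREE_THRESHOLD = 10.0  # degrees; assumed value

def aoa_disagree(left_aoa, right_aoa, threshold=AOA_DISAGREE_THRESHOLD):
    """True if the two vanes differ by more than the threshold."""
    return abs(left_aoa - right_aoa) > threshold

# Lion Air style fault: one vane reading ~21 degrees too high.
left, right = 5.2, 26.2
if aoa_disagree(left, right):
    print("AOA DISAGREE - inhibit single-sensor functions, alert crew")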

Last edited by jcbmack; 3rd Jul 2020 at 03:59.
jcbmack is offline  
Old 3rd Jul 2020, 07:25
  #31 (permalink)  
 
Join Date: Dec 2006
Location: The No Transgression Zone
Posts: 2,483
Received 5 Likes on 3 Posts
Originally Posted by twistedenginestarter
When I started flying, a Trident could do a Cat 3C autoland. That was over 50 years ago. Self driving cars are here now. True, they can't drive on ordinary roads. That may take a decade or more. But flying is not like driving a car. It takes place in a highly organised, structured and protected context.

I am not suggesting there will be pilotless airliners in my lifetime. That would be far too ambitious. What Boeing/Airbus want is to reduce costs and avoid Max-type complications. Going too far would simply present new alternative deadly threats to their revenue streams.

Single pilot is missing the point. That still implies a pilot operating the controls. It still means the pilot has to know what to do in every situation. It still means pilots making mistakes or doing bad things.

No, I think the logic is to enable the plane to carry out all phases of every flight. That in turn puts pressure on everyone to ensure all the pieces are in place (eg The Bergerac Ryanair 737 being set up to perform a GPS approach).

The remaining pilot (or pilots) would still have an important and skilled job in managing the mission (including handling emergencies) although it would quite likely be less enjoyable.

By the way Sully did not handle the Hudson River ditching perfectly. As it happens his mis-settings were not fatal, but they could have been in different circumstances. I think the essence was he didn't know what the correct configuration was for a Hudson River landing because it is something no one would ever train you for. Airbus of course did know, so a computer operated landing would have been just that tad safer.
How would Airbus know how to ditch with computers? Sully had very little time to think; he made the correct decision to ditch in the Hudson rather than attempt a turnback, which would most likely have ended in disaster.
Pugilistic Animus is offline  
Old 3rd Jul 2020, 14:27
  #32 (permalink)  
 
Join Date: Oct 2015
Location: 43N
Posts: 264
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by vilas
There are simply no split-second decisions for any failures; if anyone told you that, it's an exaggeration. Most accidents/incidents are attributed to pilot error. The long list of human factors responsible for errors makes it tempting for technology to attempt to remove the human. Accidents happened many times more often without automation.

What we don't read about are the thousands of flights each day which are operated safely because of, not despite, the decisions of the pilots. We make a myriad of seemingly insignificant decisions during flight which result in a safe outcome of that flight.

Technology is great until it isn’t. Pilots aren’t paid for when things go right, we are paid for when things go wrong.

Regards,
CaptainMongo is offline  
Old 3rd Jul 2020, 15:23
  #33 (permalink)  
 
Join Date: Oct 2008
Location: united states
Age: 45
Posts: 113
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by CaptainMongo
What we don't read about are the thousands of flights each day which are operated safely because of, not despite, the decisions of the pilots. We make a myriad of seemingly insignificant decisions during flight which result in a safe outcome of that flight.

Technology is great until it isn’t. Pilots aren’t paid for when things go right, we are paid for when things go wrong.

Regards,
Great points, and I have deep respect for you pilots who fly day in and day out.
jcbmack is offline  
Old 3rd Jul 2020, 18:50
  #34 (permalink)  
 
Join Date: Jun 2007
Location: Wanderlust
Posts: 3,404
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by jcbmack
There are no split-second decisions to make, at all? No novel events not foretold by an AC, flight manual, or computer system? I know the NTSB likes to blame pilots virtually straight away--this I have been told by a few experienced pilots here on PPRuNe, and the same case was attempted against Sully. I cannot comment on actually flying a B737, but I can guarantee automation failure and other computer-based factors were at play there--just as the idea of using lithium-ion batteries in the 787 was a grave error. I can tell from the information I have access to, which is not all Boeing has, that the software was buggy on the 737 Max and the issue there was system design. There are other systems-error examples: Qantas Flight 72, where a malfunction in the ADIRU was found, and a Lufthansa Airbus A321 that had autopilot issues--when the autopilot was turned off, the nose dropped. The Boeing 737 Max's MCAS did indeed malfunction. Now, we all know the claims of William Langewiesche--while not the pilot his father was (the author of Stick and Rudder), he is still a pilot, so I did read his statements very carefully. There is ample evidence that Boeing downplayed the serious issues with MCAS, the FAA did not look hard enough, and now we have disparate experts trying to blame the pilots. It is easy for us non-pilots to blame the pilots, but I do know, with years of experience in software engineering and AI/ML research, and from talking to seasoned pilots, that the narrative is not so clear-cut. Sure there are pilot errors, and they get found out, but to blame the pilots and gloss over Boeing's sloppy engineering of the 737 Max upgrades does a great disservice. These computer systems are not infallible either.
You are making too many assumptions. Pilots are not blamed straight away; the inquiry states that its purpose is not to assign blame, and it takes years to publish an inquiry report. There was no question of blaming Sully. He was faced with something neither in the book nor practiced, and he landed in the water. Had he turned back he would have been blamed, because he wouldn't have made it. That the majority of accidents are due to pilot error is a fact of life. In the Emirates 521 B777 go-around accident the pilot didn't know his thrust was at idle--who should be blamed? In PK8303 at Karachi the pilot did not do a single thing as per procedure--who will be blamed? We as humans have problems in the air; that's why they are called human factors, but for beings that aren't birds we have done a pretty good job. Don't bring up the MAX, because it should not have been designed at all; yes, Boeing trying to blame the pilots was nothing short of criminal. It is now a juggling act between humans and computers. If humans make mistakes, so will computers; it's a matter of which is safer. Naturally it's an emotional issue for pilots, but technology marches relentlessly. Besides, aviation is a business like any other. Whether you sell wine, computers, hotel rooms, or aircraft seats, it's no different: it's done to make a profit, and what's profitable will decide the future. Poor piloting is a bad advertisement for human presence in the cockpit. Take the AF447 unreliable-speed accident. How was it solved? Automation. First came the backup speed scale; now in the A350 it's backup speed itself. The pilot does nothing; the computers do it and inform the pilot that they are on alternate speed. This is an endless discussion. Piloting is not the only issue: humans won't be required for most jobs, or at least not in those numbers. It's a serious problem.
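For what it's worth, the physics behind an AOA-derived backup speed of the kind described above can be sketched in a few lines of Python -- all coefficients here are invented for illustration, and this is nothing like the actual Airbus backup speed implementation:

# Rough physics behind an AOA-based backup speed (illustration
# only; every coefficient is invented). In steady 1-g flight,
# lift = weight, so speed can be recovered from AOA via the
# lift equation: V = sqrt(2W / (rho * S * CL(alpha))).
from math import sqrt

RHO = 1.225        # air density at sea level, kg/m^3 (assumed)
WING_AREA = 122.6  # m^2, A320-ish (assumed)
CL0, CL_ALPHA = 0.2, 0.09  # invented lift curve: CL = CL0 + CL_ALPHA*alpha_deg

def backup_speed(weight_newtons, alpha_deg):
    """Estimate equivalent airspeed (m/s) from AOA in level 1-g flight."""
    cl = CL0 + CL_ALPHA * alpha_deg
    return sqrt(2.0 * weight_newtons / (RHO * WING_AREA * cl))

# 60-tonne aircraft at 5 degrees AOA:
print(round(backup_speed(60_000 * 9.81, 5.0)), "m/s")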
vilas is offline  
Old 3rd Jul 2020, 19:08
  #35 (permalink)  
 
Join Date: Oct 2008
Location: united states
Age: 45
Posts: 113
Likes: 0
Received 0 Likes on 0 Posts
Originally Posted by vilas
You are making too many assumptions. Pilots are not blamed straight away; the inquiry states that its purpose is not to assign blame, and it takes years to publish an inquiry report. There was no question of blaming Sully. He was faced with something neither in the book nor practiced, and he landed in the water. Had he turned back he would have been blamed, because he wouldn't have made it. That the majority of accidents are due to pilot error is a fact of life. In the Emirates 521 B777 go-around accident the pilot didn't know his thrust was at idle--who should be blamed? In PK8303 at Karachi the pilot did not do a single thing as per procedure--who will be blamed? We as humans have problems in the air; that's why they are called human factors, but for beings that aren't birds we have done a pretty good job. Don't bring up the MAX, because it should not have been designed at all; yes, Boeing trying to blame the pilots was nothing short of criminal. It is now a juggling act between humans and computers. If humans make mistakes, so will computers; it's a matter of which is safer. Naturally it's an emotional issue for pilots, but technology marches relentlessly. Besides, aviation is a business like any other. Whether you sell wine, computers, hotel rooms, or aircraft seats, it's no different: it's done to make a profit, and what's profitable will decide the future. Poor piloting is a bad advertisement for human presence in the cockpit. Take the AF447 unreliable-speed accident. How was it solved? Automation. First came the backup speed scale; now in the A350 it's backup speed itself. The pilot does nothing; the computers do it and inform the pilot that they are on alternate speed. This is an endless discussion. Piloting is not the only issue: humans won't be required for most jobs, or at least not in those numbers. It's a serious problem.
There is no determination as to whether computers or humans are safer. Aviation, like many endeavors, is based on human-computer interaction. Self-checkouts are more convenient but more error-prone than a live cashier; MCAS killed a great many people through sensor and system errors; drivers crash all the time on the road; self-driving cars also crash. There is no statistical technique that can robustly show that more automation will make flying safer in and of itself. Airbuses are a different breed from Boeings, and I like both aviation philosophies, but Airbuses malfunction too. There is no way to know whether computers will reduce the estimated 75-85% of causal factors attributed to pilot error, if such an estimate is even accurate. Look, I love CS/IT and AI/ML--it is a huge and blossoming field--but I believe it is you who are making too many assumptions.
jcbmack is offline  
Old 3rd Jul 2020, 22:31
  #36 (permalink)  
ENTREPPRUNEUR
Thread Starter
 
Join Date: Jun 2001
Location: The 60s
Posts: 566
Likes: 0
Received 0 Likes on 0 Posts
MCAS is a red herring here. It is a red herring because it is not automation. It is a fly-by-wire feature to help human pilots to fly manually. It doesn't work when the Max autopilot is engaged. In fact I recall that one of the crash crews tried to engage the autopilot because they knew, or believed, it would correct the trim problem.

For the purposes of this thread, the 737 Max crashes were caused by faulty angle-of-attack data, which amongst other things gave incorrect speed data. I can't remember exactly whether anyone said the autopilots could have coped with the sensor failures. I believe the autopilot tripped out when selected in the relevant crash, but the cause was the excessive misconfiguration, i.e. things had gone too far. If the autopilots had been engaged from the first possible moment, they may have been able to avoid the two crashes.

If that were true, it would make an interesting case for enhanced use of automation. Even if it were not, I suspect you wouldn't need a huge change to the software to make it cope with sensor failure.

The argument I am making is not about whether computers should fly planes rather than pilots. My point is simply that Boeing have suffered massively, and they might well feel they could easily get into the same situation again. They have to make things more sophisticated (i.e. complicated) to make progress. You can therefore imagine why a spin of the strategy wheel could end up pointing at backing automation and leaving pilots behind. That would have been wild speculation until we found out this week that Airbus have spent two years making something like 450 flights in a big A350. Imagine the cost of that. You can only think momentum is building up.
twistedenginestarter is offline  
Old 4th Jul 2020, 05:32
  #37 (permalink)  
 
Join Date: Oct 2008
Location: united states
Age: 45
Posts: 113
Likes: 0
Received 0 Likes on 0 Posts
Automation

Originally Posted by twistedenginestarter
MCAS is a red herring here. It is a red herring because it is not automation. It is a fly-by-wire feature to help human pilots to fly manually. It doesn't work when the Max autopilot is engaged. In fact I recall that one of the crash crews tried to engage the autopilot because they knew, or believed, it would correct the trim problem.

For the purposes of this thread, the 737 Max crashes were caused by faulty angle-of-attack data, which amongst other things gave incorrect speed data. I can't remember exactly whether anyone said the autopilots could have coped with the sensor failures. I believe the autopilot tripped out when selected in the relevant crash, but the cause was the excessive misconfiguration, i.e. things had gone too far. If the autopilots had been engaged from the first possible moment, they may have been able to avoid the two crashes.

If that were true, it would make an interesting case for enhanced use of automation. Even if it were not, I suspect you wouldn't need a huge change to the software to make it cope with sensor failure.

The argument I am making is not about whether computers should fly planes rather than pilots. My point is simply that Boeing have suffered massively, and they might well feel they could easily get into the same situation again. They have to make things more sophisticated (i.e. complicated) to make progress. You can therefore imagine why a spin of the strategy wheel could end up pointing at backing automation and leaving pilots behind. That would have been wild speculation until we found out this week that Airbus have spent two years making something like 450 flights in a big A350. Imagine the cost of that. You can only think momentum is building up.
MCAS is certainly part of automation; I suggest you open the link I left for further reading. Airbus is cool, but different from Boeing. I fear you may need a little more CS background.


http://www.b737.org.uk/mcas.htm

https://www.airwayslife.com/news/201...n-in-aviation/
jcbmack is offline  
Old 4th Jul 2020, 05:46
  #38 (permalink)  
 
Join Date: Jul 2008
Location: Australia
Posts: 1,253
Received 195 Likes on 90 Posts
How many times do we have to put up with tech nerds getting all excited about the impending demise of the pilot? There is one simple reality: there is no current airliner in the advanced design stage that does not have two pilots in the front. If Boeing thought they were going to get a quantum leap in the marketplace, they would not have persisted with a 50-year-old design but gone down the pilotless aircraft path.
Lookleft is offline  
Old 4th Jul 2020, 08:11
  #39 (permalink)  
 
Join Date: Nov 1999
Location: UK
Posts: 2,493
Received 101 Likes on 61 Posts
Human error exists amongst computer programmers too. Anyone who has written computer software will know that even though your program looks good and logical, the first time you run it it will almost certainly throw up some errors, or lock up and not work at all because of a logical cul-de-sac you hadn't thought of. You then spend time working through the errors and rewriting the code. Even then, if someone other than yourself uses the software, they will often find bugs that you hadn't, because they will almost certainly take the software to places you never thought of.
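A trivial, made-up Python example of the kind of cul-de-sac described above -- code that looks correct and passes the author's own test, yet fails the first time another user feeds it something unanticipated:

# Looks good and logical, and works for the author's test data...
def average_fuel_flow(samples):
    return sum(samples) / len(samples)

print(average_fuel_flow([2300, 2400, 2350]))  # fine: 2350.0

# ...until another user calls it with an empty sensor window:
print(average_fuel_flow([]))  # ZeroDivisionError - never anticipated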

AF447 had the STALL warning stop because the aircraft thought it was below 60 kt IAS.

The tragically unnecessary PIA crash possibly had the landing gear warnings suppressed by other warnings.

Which self-driving car was it that thought the grey side of a large lorry was just sky, and crashed into the lorry?

Even if an autonomous flight deck were possible - and deemed safe - who would compute the take-off performance, monitor and cross-check the fuelling (and decide the fuel load), do the walk-around, and monitor the loading? Engineers could, if there were many more of them, but I really think these things need to be done by someone whose life will depend on getting them right by actually flying in the aircraft, not waving it goodbye from the ground. Who would liaise with Ops and ATC regarding slots etc.? How would the aircraft taxi to and from the runway? Who would check that the controls were not reversed before take-off?

The cabin crew have their own work to do; they would not have time to prepare the aircraft as well.


Boeing MCAS is not FBW, it is a bodge, and a bad bodge at that. If Boeing had introduced FBW on the 737 years ago - even just in pitch - and certified it, they would still be a brilliant manufacturer and the MAX would have just required some software parameter changes to the FBW in pitch to account for the longer engines. As it was, the company changed and concentrated on making money rather than making airliners.
Uplinker is offline  
Old 4th Jul 2020, 11:13
  #40 (permalink)  
 
Join Date: Dec 2002
Location: UK
Posts: 2,451
Likes: 0
Received 9 Likes on 5 Posts
There is little value in considering replacing humans with computers in the short term, or similarly in comparing current operations with future developments.
Future operations will be innovative, evolving as a combined man-machine team, with each side contributing its particular strengths.

New design concepts will enable single pilot operations with economic benefit. Some highly automated systems are already capable, but without technical enhancement may not have sufficient redundancy for an acceptable level of safety with single crew.
A critical 'failure' is pilot incapacitation; a rare event nowadays - not that rare for a single pilot when considering the overall level of safety.

Semiautonomous operation (automatic recovery) would only be required after the first failure (no human), and, depending on the certification requirement - probability and capability - only to avoid adverse consequences. A 'relatively simple' recovery system might be sufficient; it would not have to manage pilot incapacitation plus a lightning strike, etc. - combinations excluded by certification probability - just a lower minimum standard with accepted risk.
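As a toy illustration of what such a 'relatively simple' recovery trigger might look like -- thresholds and logic entirely invented, nothing resembling a certified system:

# Toy sketch of a 'relatively simple' recovery trigger (purely
# illustrative; all thresholds are invented, not certified values).
from dataclasses import dataclass

@dataclass
class State:
    seconds_since_pilot_input: float
    bank_deg: float
    pitch_deg: float

def recovery_required(s: State) -> bool:
    """Engage automatic recovery only when apparent incapacitation
    coincides with an abnormal attitude - the single-failure case."""
    incapacitated = s.seconds_since_pilot_input > 30.0   # assumed limit
    abnormal = abs(s.bank_deg) > 45.0 or abs(s.pitch_deg) > 25.0
    return incapacitated and abnormal

print(recovery_required(State(40.0, 60.0, -5.0)))  # True -> recover
print(recovery_required(State(3.0, 60.0, -5.0)))   # False -> pilot flying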

One problem is the level of piloting skill for one-crew operation. Current operations provide the opportunity to build experience: First Officers aspire to Captaincy, though perhaps with less experience than in previous operations. With single-pilot ops, however, there may not be any opportunity to learn on the job - there is no FO - so the automation will have to compensate for reduced levels of experience within the pilot-automation team. The technology needed for that may be no more than would be required for abnormal flight without a pilot, although the tasks differ.

Airbus is thinking ahead, considering means, method, and validation; most of all they are not depending on simulation. Their tests are real, man and machine, warts and all, to provide a wider experience base and safety confidence in choosing a way forward.

Background reading; 'Complex Operational Decision Making in Networked Systems of Humans and Machines: - Integrating Humans, Machines, and Networks: A Global Review of Data-to-Decision Technologies'
https://download.nap.edu/cart/downlo...ecord_id=18844 'Download free pdf'
Summary ~ page 16, with interesting findings.
Chapter 3 is a valuable view of the human element - HF, CRM, etc.
safetypee is offline  

