PPRuNe Forums


BOAC 1st Jul 2009 10:28

Computers in the cockpit and the safety of aviation
 
PROMPTED by the AF447 accident, but NOT directed at Airbus specifically, I would like to open a discussion on this. ALL manufacturers are moving towards FBW/software control and protection etc.

It seems to me that we have reached a dangerous 'fork in our airway'. The FBW and software make for an amazing, clever and safe operation when they are working. Our 'new' pilots don't really need the old-fashioned basic flying skills, since these systems prevent abuse/mishandling.

What is frightening to me is that after 4 weeks of 'phone-a-friend'/post a PDF chapter/analyse ACARS messages we STILL do not really seem to be sure what the AF crew had left. Experts sift back and forth, 'maybe this and maybe that, but....' - all with the luxury of time. This crew had minutes to sort out an apparent cascading deterioration.

To me this says we need 2 things, 2 basic foundation-level things for starters.

We need a system in the cockpit that DEFINITELY leaves a crew with a basic flying panel, albeit limited - maybe no IAS or altitude, but at least power and attitude - and does not just dump a pile of hot poop in the crews' laps and go off shrugging its shoulders. If that means a simple, battery-powered AI, then fit it.

We need the crew to be able to revert to this basic instrumentation and make a reasonable fist of descending away from performance-limiting altitudes, where they can take time and try to 'reboot' all the gizmos at a more leisurely pace. We need basic skills, as demonstrated by the AMS, PGF and Buffalo accidents, and far less 'over-confidence' in the magic.

Two tasks, then, as I see it. One is for the manufacturers/regulators/operators to ensure something usable remains, and not to be seduced into glittery-eyed fascination with how clever everything is. The second is for the pilot fraternity to press hard for a change in the philosophy and application of training and recurrent testing. Learning how to programme and push the buttons is important, but more important is to be able to pick up the pieces. These requirements WILL impact on the bean-counters. The question is how do we get it done?

EGMA 1st Jul 2009 19:33

BOAC: Spot on!

I'm a private pilot and know how easy it is to become disoriented in IMC at night, even when all is well. I also write safety-critical software (non-aviation), so I understand the problems of handling real data that may become corrupt/invalid; generally a tidy abort is the best outcome.
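
To illustrate what I mean by a 'tidy abort': something along these lines (a toy Python sketch, nothing from any real avionics system - the sensor names, limits and data structure are all invented for illustration), where a bad channel is flagged as invalid rather than passed downstream as a plausible-looking number.

Code:

from dataclasses import dataclass
from typing import Optional
import math

@dataclass
class AirData:
    ias_kt: Optional[float]   # indicated airspeed, knots; None = flagged invalid
    alt_ft: Optional[float]   # pressure altitude, feet; None = flagged invalid

def validate_airspeed(raw_kt: float) -> Optional[float]:
    """Reject values that cannot be real rather than passing them on."""
    if math.isnan(raw_kt):
        return None
    if not (30.0 <= raw_kt <= 450.0):   # outside any plausible envelope (made-up limits)
        return None
    return raw_kt

def read_air_data(raw_ias_kt: float, raw_alt_ft: float) -> AirData:
    """The 'tidy abort': downstream code sees an explicit invalid flag, never garbage."""
    ias = validate_airspeed(raw_ias_kt)
    alt = raw_alt_ft if -1000.0 <= raw_alt_ft <= 60000.0 else None
    return AirData(ias_kt=ias, alt_ft=alt)

print(read_air_data(raw_ias_kt=float("nan"), raw_alt_ft=35000.0))
# AirData(ias_kt=None, alt_ft=35000.0) - the bad channel is flagged, not guessed at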

Humans are lousy at monitoring automated systems, and most automated systems are lousy at giving meaningful error messages when they eventually FU. The situation that the AF pilots found themselves in IS going to happen again, and with future advances in FBW technology it will be even harder for the pilots to troubleshoot.

In an ideal world the human would fly the plane and the computer would monitor his/her performance ... but that's not going to happen ... the bean counters don't like it when our pilot shortens the fatigue life of an airframe with a heavy landing.

Let's not fool ourselves; the ultimate goal of FBW is to facilitate aerodynamically unstable passenger transports, with the fuel savings that that would bring. We are in a learning phase; the important lesson we must learn is that any computerized system has limitations, just like our pilot. The problem is getting our FBW system (when it can no longer cope) to hand over to the pilot efficiently. Herein lies the problem: it can't hand over until it fails, and when it has failed it's too late. The pilot needs to know what started the sequence, not the result.

It seems to me that a possible solution would be to provide an independent flight performance monitoring system. It need not concern itself with who is flying (computer/pilot) or provide corrective action. A simple aural 'CFIT in 30 seconds' would be all that was necessary; in the AF case, 'airspeed' would have been all they needed to be told to know that things were going south.
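
Conceptually it needn't be much more than this (again only a toy Python sketch to show the idea - the thresholds, callout wording and crude terrain arithmetic are all invented): the monitor only announces, it never moves a control surface.

Code:

def airspeed_disagree(ias_sources_kt, tolerance_kt=20.0):
    """Flag a disagreement between redundant airspeed sources (invented tolerance)."""
    return max(ias_sources_kt) - min(ias_sources_kt) > tolerance_kt

def seconds_to_terrain(radio_alt_ft, vertical_speed_fpm):
    """Crude time-to-ground estimate; infinite if not descending."""
    if vertical_speed_fpm >= 0:
        return float("inf")
    return radio_alt_ft / (-vertical_speed_fpm / 60.0)

def monitor(ias_sources_kt, radio_alt_ft, vertical_speed_fpm):
    """Return plain-language callouts only; the monitor takes no corrective action."""
    callouts = []
    if airspeed_disagree(ias_sources_kt):
        callouts.append("AIRSPEED")
    if seconds_to_terrain(radio_alt_ft, vertical_speed_fpm) < 30.0:
        callouts.append("TERRAIN IN 30 SECONDS")
    return callouts

# Two air-data sources disagreeing by 60 kt while descending fast at low level:
print(monitor([275.0, 215.0, 270.0], radio_alt_ft=2000.0, vertical_speed_fpm=-6000.0))
# -> ['AIRSPEED', 'TERRAIN IN 30 SECONDS']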

I don't know what happened to AF447 but I'm certain that distraction/misinformation would have been a major contributory factor.

And yes, you're right, an independent basic flight control system would have been much more use than a trouble shooting manual.

BOAC 1st Jul 2009 20:00

Thank you, EGMA, for your support. I was beginning to think we ought to be discussing plumb bobs and cats' tails a la the R&N thread...

Mr Optimistic 2nd Jul 2009 11:31

is a simple vertical gyro still in use
 
...in modern aircraft? In addition to the full strapdown IMU? If so, doesn't this give 'attitude of last resort' info?

GlueBall 2nd Jul 2009 11:55

A standard, inexpensive, self-contained, electro-mechanical SAI [Standby Attitude Indicator with built-in gyroscope], completely independent of everything and anything, hot-wired to its own battery, would save the day whenever the glass dashboard goes on vacation.

BOAC 2nd Jul 2009 12:56

Should we consider non-pressure-driven engine instruments, i.e. N1, not EPR? I would hope the pitch/power tables offer a back-up if they are EPRs?

Mr Optimistic 2nd Jul 2009 21:31

pls don't get annoyed but
 
..if it were possible to have an attitude reference of last resort (or be able to switch to a known good source... yes, I know), would a big button which did no more than try to get wings level and pitch to +3 degrees be an impossibility? It would have to have control algorithms built up by extensive testing of modelled upsets, take very conservative assumptions about the control deflections it could apply, and leave throttle settings to the pilots. It might be a compromise (does the right thing in this scenario but not that, acts too slowly in this case, etc.) but better than guessing when disorientated?
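
Something like this toy loop is all I have in mind (a Python sketch with invented gains, limits and sign conventions, purely to show the shape of the idea - certainly not a certifiable control law), with thrust deliberately left to the pilots:

Code:

TARGET_PITCH_DEG = 3.0
TARGET_BANK_DEG = 0.0
MAX_SURFACE_CMD = 0.25     # very conservative fraction of full deflection (invented)

def clamp(x, lo, hi):
    return max(lo, min(hi, x))

def recovery_command(pitch_deg, bank_deg):
    """Proportional-only elevator/aileron demand from the last-resort attitude source.
    Throttle is not touched."""
    k_pitch, k_roll = 0.02, 0.02    # invented gains; a real law needs modelled-upset testing
    elevator = clamp(k_pitch * (TARGET_PITCH_DEG - pitch_deg), -MAX_SURFACE_CMD, MAX_SURFACE_CMD)
    aileron = clamp(k_roll * (TARGET_BANK_DEG - bank_deg), -MAX_SURFACE_CMD, MAX_SURFACE_CMD)
    return elevator, aileron

# Nose 10 degrees low with 40 degrees of bank: gentle nose-up and roll-level demands.
print(recovery_command(pitch_deg=-10.0, bank_deg=40.0))
# -> (0.25, -0.25) after clamping to the conservative limits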

If this is rubbish, pls laugh and then delete.

I still like vertical gyros, but these were only good for a few seconds in an accelerating environment (where they could no longer find 'g' and correct themselves).

BOAC 2nd Jul 2009 21:51

Not really - the pilots need to be able to select the pitch they require - maybe level flight, maybe a descent, and, of course, there's more software involved in that.

Tmbstory 3rd Jul 2009 07:24

Computers in the Cockpit and Safety of Aviation
 
BOAC:

Thank you for the post. It is 100 percent on the mark.

Your basic foundation-level things, numbers 1 and 2, have a lot of merit.

The regulatory authorities and the industry (manufacturing, management and pilots) must work for and understand safety, and apply the original concept of Part 25 certification.

Regards


Tmb

Mr Optimistic 3rd Jul 2009 16:14

The question is how do we get it done?
 
..and doing it without articulating the problem(s) so starkly, and backing them so explicitly with your (collective) professional experience, that you start a public row and put the wind up the paying customers. That's a big ask, as they say.

BOAC 3rd Jul 2009 16:28

....and hoping for 'big' answers from 'big' players!

Carnage Matey! 3rd Jul 2009 17:39


We need a system in the cockpit that DEFINITELY leaves a crew with a basic flying panel, albeit limited - maybe no IAS or altitude, but at least power and attitude - and does not just dump a pile of hot poop in the crews' laps and go off shrugging its shoulders. If that means a simple, battery-powered AI, then fit it
Perhaps we should turn the question around and ask what aircraft does NOT provide this information. I know on the A320 you will always have at least a PFD and an N1 indication.

BOAC 3rd Jul 2009 17:49

Did the BA 320 that had the electrical problem a while back retain both of those?

Carnage Matey! 3rd Jul 2009 18:09

Perhaps I should have said you'll always have an AI and an N1. The standby AI was working, and I've not seen anything in the AAIB report that suggests the auto-switchover of the engine indications to the lower ECAM display had failed (that said, IIRC on an A320 the UNRELIABLE IAS drill requires the pilot to select the thrust to certain physical gates on the thrust quadrant (CLB/MCT/TOGA), so an actual N1/EPR isn't required). Notwithstanding that, the failures in this case were not to do with computers in the cockpit but with electrical failure, so is it really pertinent to the thread?

Of course if you are talking about a further failure following dispatch with the lower ECAM display inop then we are starting to get into a whole new realm of possibilities.

BOAC 4th Jul 2009 07:35

CM - I was specifically TRYING to keep away from types, manufacturers and specific failures; merely to try and see if we can set some ground rules in an industry which is changing significantly in many ways.

Safety Concerns 4th Jul 2009 15:08

Sorry to be the one to buck the trend, but the ground rules have already been set.

Pilots, due to their human nature, were and still are the weakest link in the chain (that is not intended to be derogatory, just fact). The figures are clear: more automation, fewer accidents, better safety. Yes, the automation does occasionally fail or get it wrong, or there may still be situations yet to be covered by software, but the answer most certainly isn't to hand control back to fallible humans.

The answer unfortunately for pilots is to continue striving forward until the automation has been perfected.

BOAC 4th Jul 2009 16:41


Originally Posted by sc
The answer unfortunately for pilots is to continue striving forward until the automation has been perfected.

- unfortunately, since we have no idea how long this process will take, you need to change that to 'The answer unfortunately for pilots and passengers...'. I think we owe it at least to the latter to seek some escape from where we are heading.

Mr Optimistic 4th Jul 2009 17:23

humans v machine automation
 
Echoing SC, isn't the question settled in principle, with the only room for consideration now being how to reduce the very small residual risk even further, i.e. when human intervention is still needed (software goes for a walk, equipment failures, bad weather right in front of you)? Tailored training, (even) better interfaces, better sims, ops margins?

Even if it is not agreed, isn't that how the issue will be presented/managed?

BOAC 4th Jul 2009 19:10

You have chosen your user-name well!:)

alf5071h 4th Jul 2009 19:49

BOAC - a ‘big’ answer (well, a lengthy one), not a big player; and not really an answer, more observations and questions.
Re #1 “… to ensure something usable remains…”
“… for the pilot fraternity to press hard for a change in the philosophy and application of training and recurrent testing.”

CS 25 (certification requirements for large aircraft) provides for ‘the something usable’.
Invariably, use of ‘the something usable’ is assumed (see the relatively new HF design requirements - CS 25 AMC 1302), but how can we guarantee that the crew will use it in all circumstances?
Many crews and operations do not understand the basis of certification and the assumptions therein; the result can be inappropriate training, poor SOPs, and unfounded concerns that may lead to inappropriate actions following a failure – an ‘I know better’ attitude. Thus the associated problem is one of understanding, which stems from knowledge – training – or, more accurately, education.
Due to the inherent limitations in human performance there is always the risk that crews will focus on troubleshooting and the reinstatement of the high-tech systems. This trait is perhaps more prevalent as the industry’s operations and training become technology-dependent.

Our problem is that we are becoming technology ‘junkies’; no longer are we ‘children of the magenta line’, but we are developing into hardened technology addicts with all of the dependencies therein. This is partly a function of the techno-sociological world we live in; the initial schooling, play and relaxation, and the behaviours within other industries all around us – it (technology dependency) becomes our ‘expectation’.

An additional problem is that similar sociological, commercial and operational pressures, which generate the technological dependency, also affect the application of crew training and testing. Operators can elect to maintain a standard higher than the minimum specified by regulation; thankfully many do, but unfortunately this does not provide complete immunity from the random nature of accidents in a highly reliable industry.
Thus within training and testing, the problem could be associated with a lowering of standards and the application of the rules (‘what can we get away with?’), i.e. falling industry professionalism. This is often reflected in personal attitudes to professionalism – airmanship – but also in corporate culture.
Just because ‘your training system’ does not teach an aspect should not negate self-improvement, even though we work in a high-pressure, time-critical environment with ever-increasing demands on ‘our’ spare time.


(FAST) A mid-1990’s study by a major manufacturer looked at accidents in which airplane systems were involved in an accident or where they could have prevented the event and did not. It was found that in approximately 70% of the accidents involving airplane systems, the original design assumptions were inadequate for the situation existing at the time of the accident due to changes in…
  • the aviation system
  • airplane operational usage
  • personnel demographics
  • evolving infrastructure or other considerations.

Thus current problems probably result from ‘change’ and ‘systemic complexity’, the ‘systems’ involving human activity. Complexity itself isn’t a problem; it’s the way we deal with complexity and the human interface, including understanding, need, objective, and mechanism of the ‘system’. In this respect we may be overregulated – too many operational regulations, thus too complex to expect reliable implementation or to be correlated with certification regulations, e.g. certification claims alleviation for short-duration ‘safe’ flight without airspeed, assuming that crews are adequately trained – is this always true, e.g. with a P2/P3 combination?
Can we see these changes in the complexity – are we looking? If seen, how is their importance assessed, and do we choose an appropriate activity to combat any hazard?
We are a safe industry by most standards, but in safety there is no place for complacency – failures in looking, assessing, and deciding.

The industry depends on technology; in general, we created that need. The industry has yet to understand all aspects of technological dependency (accidents are often an unfortunate learning process), and individually we need to have greater understanding of the technology and the surrounding objectives and assumptions when we use it.
We have yet to learn to live with ‘aviation’ technology – we have to change, but in this change there may be more hazards. Combating these aspects requires thought and analysis – basic thinking.
For all of the human weaknesses, the human is still a powerful defensive aid – we can identify problems and deduce solutions. Do we teach pilots to think in these ways and adequately train them for the range of critical situations (stress, time dependent) which might be encountered?
Thus this problem is not only about technology, but also about the process of how to think – situation awareness, decision making – coming full circle back to airmanship, including skills and personal standards.

We are part of a self-generated complex system. In implementing technology, perhaps we have forgotten to look (think), or to judge wider-ranging aspects of the larger system, which unfortunately may only surface with use – contact with aspects of the system, i.e. us, humans.

“No plan survives contact with the enemy” - commonly attributed to Helmuth von Moltke the Elder.

CS 25 Large Aircraft.
FAST Presentation.

