Computers in the cockpit and the safety of aviation
Safety, CRM, QA & Emergency Response Planning: a wide-ranging forum for issues facing Aviation Professionals and Academics
Old 9th Jul 2011, 13:00  #181
I would be very surprised if Boeing recommended (officially) a ground speed (GS) check during take-off.
This is an airmanship check. The Boeing 737 FCTM in fact does mention the subject under a section on unreliable airspeed where it states that "ground speed information is available from the FMC and on the instrument displays (as installed). These indications can be used as a cross-check".

With ground speed read-outs in full view of both pilots during take-off, it could be argued in court that the crew would be negligent not to use such a valuable resource if available.
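The cross-check being discussed is a single line of mental arithmetic: at the 80 kt call, IAS should roughly equal GS plus the headwind component. A minimal sketch of that comparison (the function name and the 10 kt tolerance are illustrative assumptions, not anything from the FCTM):

```python
def gs_crosscheck(ias_kt: float, gs_kt: float, headwind_kt: float,
                  tolerance_kt: float = 10.0) -> bool:
    """Rough airmanship cross-check at the 80 kt call.

    At low altitude and low speed, indicated airspeed should be close
    to ground speed plus the reported headwind component (headwind
    positive, tailwind negative).  A large disagreement suggests an
    unreliable airspeed indication.
    """
    expected_ias = gs_kt + headwind_kt
    return abs(ias_kt - expected_ias) <= tolerance_kt

# 80 kt call with a 12 kt headwind: a GS of about 68 kt agrees
print(gs_crosscheck(ias_kt=80, gs_kt=68, headwind_kt=12))   # True
# Blocked pitot: IAS stuck at 40 kt while GS reads 75 kt does not
print(gs_crosscheck(ias_kt=40, gs_kt=75, headwind_kt=12))   # False
```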
A37575 is offline  
Old 9th Jul 2011, 20:46
  #182 (permalink)  
 
Join Date: Jul 2003
Location: An Island Province
Posts: 1,257
Likes: 0
Received 1 Like on 1 Post
A37575, “With ground speed read-outs in full view …
However, the prosecuting ‘Airmanship’ counsel would argue that the displayed GS value is delayed by processing plus a few seconds’ smoothing, carries potential errors from ‘your’ wind evaluation and mental addition/subtraction, and requires looking at the display, head down, during take-off; so what’s the legal score?
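The smoothing delay alf5071h describes is easy to put a number on: a first-order smoothing filter tracking a steadily increasing ground speed settles to an error of acceleration times the filter time constant. A rough sketch (the 2 kt/s acceleration and 3 s time constant are illustrative assumptions, not avionics figures):

```python
def smoothing_lag(accel_kt_per_s: float, tau_s: float,
                  dt: float = 0.1, steps: int = 600) -> float:
    """Lag of a first-order smoothing filter tracking a ground speed
    that increases linearly, a crude model of the displayed GS
    trailing the true GS during the take-off roll."""
    alpha = dt / (tau_s + dt)   # discrete first-order filter gain
    displayed = 0.0
    true_gs = 0.0
    for _ in range(steps):
        true_gs += accel_kt_per_s * dt
        displayed += alpha * (true_gs - displayed)
    return true_gs - displayed

# With ~2 kt/s acceleration and a 3 s smoothing constant, the display
# settles about 6 kt (acceleration * tau) behind the true ground speed.
print(round(smoothing_lag(2.0, 3.0), 1))
```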

It’s a fatuous debate. I am sure that we would agree that the airmanship issue is primarily about judgement; but judgement in aviation has to be supported with knowledge, which if inaccurate, misused, or absent when essential, then the resultant decision might have greater risk than for a less complex situation without that information.

This is part of the debate on automation. Recent posts suggest that it is impossible for the human to ‘know’ (understand) everything about highly complex systems (technology, human, and environment). Thus operations have to be conducted with a relatively lower level of knowledge. However in many instances this may represent a lower risk than with older systems due to greater data accuracy, clarity of display, and lower technology failure rates.

I have argued that if the overall risk has not been lowered, either due to the way in which technology is being used (erroneous organisational requirement or individual choice), or increasing operational complexity, then the situation should be changed – change the task.

Trying to use too much or inappropriate data (all available resources) because of complexity-induced knowledge deficiencies is just as ‘culpable’ as (or more so than) overlooking or mistaking ‘good’ data. Both of these (opposing) views contain facets of human fallibility (use everything because ‘I know better’, vs mistakes in perception).
I would argue that the industry has to accept that the greater use of technology requires us to change the way we operate, or at least to think about it – because the humans are at or beyond the limit of capability.
Adding more tasks to an already over-tasked human is not a good choice, particularly when airmanship is a strenuous mental task.
alf5071h

Old 10th Jul 2011, 06:20  #183
Adding more tasks to an already over-tasked human is not a good choice, particularly when airmanship is a strenuous mental task.
Your point is well made, except for your statement that airmanship is a strenuous mental task; presumably you jest? The design of the PFD in sophisticated aircraft is such that all the information needed to fly the aircraft on instruments is in the PFD. With that information also fed into the flight directors, in theory the instrument scan can be reduced to one flight instrument - the PFD.

This has always been the problem when for regulatory reasons raw data is required to be tested during proficiency instrument rating tests. Pilots who are rusty at raw data instrument flying are like this because most of their flying on jet transports is gazing at the PFD with tunnel-vision.

Most PFDs have a ground-speed read-out, and it should not exactly overload the crew to note that figure during the 80-knot call-out, which is made from the IAS on the PFD anyway.
Tee Emm

Old 10th Jul 2011, 14:00  #184
MountainBear, #180

That leads to a circular argument. They are 'nothing like smart enough'
because they haven't been programmed to be. They haven't been programmed
to be precisely because the pilot is there.
My response to that, to break the loop, is that they haven't been programmed
to be because the technology to do it safely doesn't exist.

It's no problem to program computers to take off from London, fly to New
York and land. The problem is how to handle the probably millions of
failure modes, their combinations and sequences, that could interfere
with the task. This is without considering environmental factors such as
weather and other unpredictable events. Computers are pretty dumb,
in that they can only be programmed to process a set of rules within strictly defined
limits. Outside these limits, the machine has no code to execute and no
algorithm with which to process the data, and can only generate a failure
message: "does not compute". It is very difficult, if not
impossible, to program a machine to handle chaos.
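The point about strictly defined limits can be caricatured in a few lines: the machine has a response only for the cases its designers enumerated in advance, and everything else falls through to a failure message. A toy sketch (the events and responses are invented purely for illustration):

```python
def handle(event: str) -> str:
    """Toy rule-based handler: responses exist only for the failure
    modes the designers enumerated in advance."""
    rules = {
        "pitot_blocked": "revert to pitch-and-power tables",
        "engine_fire":   "run the engine fire checklist",
        "gen_1_fail":    "shed non-essential electrical load",
    }
    # Outside the enumerated limits there is no algorithm to run,
    # only a failure message -- the "does not compute" case.
    return rules.get(event, "DOES NOT COMPUTE")

print(handle("pitot_blocked"))                 # a rule exists
print(handle("ash_cloud_plus_double_gen"))     # outside the rules
```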

You might then argue that we can throw AI techniques at the problem, but
the response would be that AI technology is nowhere near mature enough
to be given responsibility for a high-risk activity like flying several
hundred souls at 35,000 feet under full automatic control. It possibly
never will be, despite any claims to the contrary by optimistic
technologists.

It's unfair to blame the machine or the programmers behind the machine
for the inadequacy of the design document they were handed. The
human/machine interface only becomes an issue when you assume that a
human being must be on the flight deck. Take away that design
requirement and the design of a FBW system is going to look a lot
different than it does now.
I don't think of it in terms of "blame" and have never subscribed to
blame culture. I'm sure that the systems have been tested rigorously to
the original specification and all involved were altruistic and
dedicated in their intent. All I'm arguing for is a much more holistic
(sorry about that word) view of the whole system. In the same way that
intuition and instinct, as well as learning, all contribute to the
merging of man and machine in m/cycle riding, car driving, light a/c
flying and more. The exact opposite to current civil aviation practice,
where crew seem to be trained and encouraged to fly the computers, not
the aircraft.

As for the flight deck, if it ever got to the stage that the crew became
redundant, there would most likely be no flight deck, just more racks of
avionics humming away in the background. But what an opportunity for the
beancounters. Not only do you save the space and weight of the kit on
the flight deck, but dispense with the services of expensive and sometimes
unpredictable crew as well. A win-win situation all round, I'm sure ...

Regards,

Chris
syseng68k

Old 10th Jul 2011, 19:19  #185
My response to that, to break the loop, is that they haven't been programmed to be because the technology to do it safely doesn't exist.
Necessity is the mother of invention :-)

My point is that fully automated flight decks deserve the opportunity. I cannot emphasise that word enough. I make no predictions or promises. I hold no bias one way or the other. I hold that position because I believe two things to be true:

(1) Automation of flight decks has historically proven to be safer than human beings alone. Unless one is willing to put it all down to grand coincidence, I know of no other explanation for the majority of the decrease in accident risk in the 20th century.

(2) That the human population continues to expand. That we continue to put more and more planes into the air, ever bigger planes filled with more and more people. That the margins demanded in terms of airline separation continue to shrink. That the system works only because it is a system and not people flying randomly.

So for me, computers and computer programmers, who are already doing an excellent job with military drones, deserve the opportunity to take it to the next level. Will they succeed? I do not know. But I don't think that pilot training, and ever more training, is going to be adequate to the future demands of a system meeting the needs of 20 billion people.

From weaving looms to cars to computers, automation has been resisted at every step by certain populations. Automation in airplanes has been no different. Even today, certain minority groups like the Amish continue to resist all modern technology; that is their prerogative. Yet I don't think that most people desire to go back to the horse-and-buggy days, occasional fits of pastoral romanticism aside.
MountainBear

Old 11th Jul 2011, 18:33  #186
TM #183,
We might debate if mental tasks are strenuous or not; perhaps your view is from one with expertise, but we must not overlook the average individual, or even the experienced in demanding circumstances.
I use Kern's definition of airmanship, which includes many mental tasks – discipline, skill, knowledge, awareness, judgement.
Thus I argue strenuous, but will accept ‘relative’ in relation to experience / situation.
alf5071h

Old 11th Jul 2011, 19:30  #187
alf5071h
I would not disagree that unusual attitude recovery training is an important subject, but what is the exact relevance to automation / computers?
Relevance is that if your computer flies and you don't, you get rusty. More importantly, your scan gets rusty, so that when you need it you have to play catch-up, or it is too slow for conditions.
It may be more beneficial to look at the reasons for the loss of control.
If the airplane can do it, you need to be trained to deal with it. Arguing perfect prevention is a good way to fill up graves.
If there have been system failures, then why did they fail, and how did the crew manage these failures given that in most, if not all circumstances the aircraft is still flyable – rule 1 fly the aircraft.
Rule one requires practice and proficiency. If you don't practice, you won't remain proficient.

‘Loss of control’ accidents without system failure appear to have elements of non normal operation, surprise, and hazards of physiological disorientation – these are not failures of technology or the aircraft.
They are a failure of the man machine interface. I do not find it that useful to pretend they can be separated.
Thus, the higher priority for training might be related to how pilots manage system failures, how they fly an aircraft in a degraded state, and how they manage themselves when dealing with the unexpected or when challenged by weakness of human physiology – always trust the instruments.
Trust the instruments, and know when the instruments are working, or aren't. I completely agree that training in degraded mode is critical for safe flying ... since eventually, any machine will break, or, as a computer is the topic here, have a small hitch and need at the least to be cycled on and off, if not reprogrammed back at base once safely on the ground.
It would better to avoid the hazardous situations, rather than relying on recovering from an upset, if indeed it is recognized / recognizable.
Presuming pure and perfect prevention fills graves. Yes, work on airmanship and judgment to improve hazard prevention, but if the plane can do it, you need to know how to fly out of it, and practice it.

In re the RTO and an FO whose attention is asserted as wandering ...
This is where teaching condition based scan patterns is useful. Teach and train particular scan patterns, and scan variations, that are tailored to particular critical conditions. That allows you to pick up on critical performance data in a timely fashion.

sys: (chris?)
They say a picture is worth a thousand words. Why? Because a picture is effectively parallel-processed by the brain, while reading text or scanning instruments is serial, and it takes much longer to assimilate the meaning.
Possibly why those suites of nice round gauges worked so well for so long.

They painted a picture.
Trying to diagnose problems by wading through pages of error messages, and/or getting out handbooks, finding the right page, ad nauseam, takes far too much time in an emergency. There just has to be a better way. In some ways, modern a/c are quite primitive, despite all the complexity and shiny paintwork.
In a multi-place aircraft, the trick to all that is: pilot flying, FLY; pilot non-flying, work to filter out the non-essential from the essential.

That is another area in this era of computers in the cockpit that absolutely must have emphasis in training. (Sim seems a great place to practice such things.)

Back to BOAC's original premise:

When I flew T-28's, I had to know its systems. When I flew Hueys, I had to know its systems. They were similar enough in complexity, with the added worry of hydraulics in the latter. Avionics mostly a wash.

When I flew SH-60's, I had to know a HELL of a lot more stuff, since it had more systems. So, the training was more intense, and the amount of work I had to do to stay proficient in my aircraft was considerably more detailed. (I sometimes hated knowing that I knew what a Mux De Mux was, but know it I did, since I had to talk to technicians when it went south.)

You have to know your systems, and I think the training and professional end of this lays a serious burden on leadership in the industry, in pilots, in training departments, and in the corporate management sector.

How do you encourage and incite active curiosity among your pilot work force in diving into any and every detail of how the bird works? This sort of enthusiasm can't be limited to the pilots. It needs to permeate the entire culture of your airline, because now and again, you'll find things out that you want to bring to the attention of the supplier, and others who fly your bird.

That last is a cultural imperative that I think increases in gradient as the computer era puts its stamp more firmly on flying.
Lonewolf_50

Old 12th Jul 2011, 01:47  #188
Lonewolf50, slick quips and quotes; good enough to debate.
But seriously;

Yes, skills degrade without practice, but the computers don’t fly upset recoveries (perhaps they should, and ASI failures), so what skills are degrading due to automation/computers, and are these relevant to system failures (man and machine) in current operations?

Prevention is a good place to start, nothing is perfect. Safety requires a combined effort, thus recovery (action during, and after the ‘upset’) is also necessary.

Failures of the man-machine interface, I would use the generic ‘malfunction’ not just failure. The problem-space consists of a combination of technology, human, and situation.


Generally those who argue for more training align with a ‘blame and train’ culture. However, your views appear to represent an alternative of addressing the man / machine aspects, particularly the man.

Well-reasoned arguments indicate that the pilot can no longer be expected to fully understand the technical system, nor can the designers accommodate the irrationalities of human behaviour or every combination of technical failures; and neither can understand the entirety of complex situations.

In operation, ‘malfunction’ happens and we expect the human (best placed and probably best equipped - brain power) to solve the problem, which primarily is to fly the aircraft. But this is not normal flying, not a normal situation, indeed it is a situation which some humans may have been unable or unwilling to foresee.
At great cost we could train for a wide range of scenarios with extensive knowledge requirements, yet never be sure that every situation has been considered or that knowledge would be recalled.

Some areas of the industry might consider themselves safe enough; the trade off between safety and economics is balanced. Thus their task is to maintain the status quo; this could be an aspect of technological complacency or the reality of a sufficiently safe system (public perception).

If the latter is true, then the primary safety task might be to avoid the ‘big one’. Apparently what we don’t know is if an accident is the ‘big-one’ or just an accumulation of many ‘small ones’, and if either situation has automation/technology as a root cause.
More likely, as with previous accidents, it is a combination of man, machine, and situation – complexity, which we are ill equipped to understand.
alf5071h

Old 12th Jul 2011, 08:10  #189 (Thread Starter)
Overall, most here seem to be dancing around 'the head of my pin' even if sometimes in opposite directions.

I come back to post #1 - as long as there is going to be a 'human' in the cockpit, we must ensure that when the 'perfect protection' fails - as it inevitably will - the human is:

1) Left with basic information to enable a reasonably rapid analysis of the situation and an equally rapid 'plan of action' to be formed

2) Equipped with the basic skills to execute the 'plan'

3) Given flight controls that will 'execute' his/her demands without interference

4) Given the opportunity (and training) to process 1) without feeling (or being) 'pressured' into working through some complex electronic jungle of 'information/action' BEFORE it can be done.

Returning (I know!) to the trigger for this thread, 447, why could the whole shooting match not just be dropped? Autotrim into a stall - no - if the PF finds not enough elevator authority, move the trim wheel. Voting out a series of confusing (to the computers, certainly) input conflictions - why not an earlier default to "Dave, I don't really understand all this. Here is a basic aeroplane"?

When it all goes 'south' we do not need any more 'bells and whistles'. The crew were not trying to land on another planet or achieve earth-orbit docking with the ISS. They needed to stabilise, descend and turn out of the weather - a fairly basic (yes, challenging) flying task to the 'older generation'. They did not achieve this. Industry needs to address this, be it lack of training, the wrong training, the wrong 'automatics', the wrong 'mind-set' or whatever. Whatever it was, to find a crew apparently 'looking at' a high nose, high power and a huge rate of descent over several minutes and NOT working out that the AB-taught recovery is not working needs to be examined, and examined thoroughly, not just written off as 'pilot error'. After all, how many times have my 'gums' beaten around the fact that a high r of d is one of the 'symptoms' of a stall, both on the blackboard and in the cockpit? Having the 'fact' that you cannot stall an aircraft drummed into you will probably suppress any sense of recognition that you just have.
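The 'blackboard symptoms' above reduce to a pattern simple enough to state in one line: high nose plus high power plus a huge rate of descent should shout 'stall'. A toy sketch of that recognition rule (the thresholds and the AF447-like figures are illustrative, not type-specific):

```python
def looks_stalled(pitch_deg: float, n1_pct: float, vs_fpm: float) -> bool:
    """High nose, high power, yet a large rate of descent: the classic
    stall picture described above.  Thresholds are illustrative only."""
    return pitch_deg > 10 and n1_pct > 90 and vs_fpm < -4000

# AF447-like picture: nose well up, near-TOGA power, huge descent rate
print(looks_stalled(pitch_deg=15, n1_pct=100, vs_fpm=-10000))  # True
# Normal climb: nose up, high power, but climbing -> not the pattern
print(looks_stalled(pitch_deg=15, n1_pct=95, vs_fpm=2500))     # False
```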

I do not see progress here. I see 'defensive positions' at many points and an assault on the child who shouts about the King's clothes. Where does the responsibility lie? On another thread I was accused of "You have been busy ripping software engineers a new set of orifices it seems", whatever that bizarre fetish might be. I am not 'anti' the software. I am not 'anti' the writers. They will write, and write as well as humanly possible, within their brief and any 'understanding' they might have of the flying 'task'. We have HR and accountants taking a lead role in how we train, practise and operate. The 'business' pressures are immense. Often 'blaming' the crew is expedient. Is the 'long-stop' perhaps the test pilots? Is it amongst a 'stronger' training community? Are they able to say 'hang on'? Where is the 'moderation/reality check' to be?
BOAC

Old 13th Jul 2011, 01:47  #190
BOAC, “I am not 'anti' the writers. They will write, and write as well as humanly possible within their brief and any 'understanding' they might have of the flying 'task'.”

But if the writer’s understanding came from the piloting community, then aren’t we the cause of your concerns? Did we think that it wouldn’t happen, couldn’t happen, that we can understand system failures, we avoid Cbs by a large margin, and we can hand fly pitch/power with an airspeed failure?
More likely these concerns represent the lack of understanding about AF447; – yes, wait for the report, or at least the data.

Alternatively such concerns only arise with hindsight, such situations were not, or even could not be conceived beforehand. Recall the speculation about events before AF447; the other ‘successful’ events, possible regulatory assumptions about the relative ease of flight without airspeed, a short duration malfunction which average crews could manage easily.

http://archlab.gmu.edu/people/rparas...uto5302008.pdf - another theoretical view for the mixing pot.
safetypee

Old 13th Jul 2011, 07:17  #191 (Thread Starter)
SP - you did not address the rest of my last para in your 'quote'.

I'll TRY to answer your points:

But if the writer’s understanding came from the piloting community, then aren’t we the cause of your concerns?
- quite possibly -see my last para.
Did we think that it wouldn’t happen, couldn’t happen, that we can understand system failures,
- not in my case, anyway - cannot speak for the rest.
we avoid Cbs by a large margin, and we can hand fly pitch/power with an airspeed failure?
Yes - and?
More likely these concerns represent the lack of understanding about AF447; – yes, wait for the report, or at least the data.
- basically what I have been saying on the other threads, but you need to FORGET 447 and think 'big picture', please? 447 is just one small cameo. In my opinion the issue is the speed and direction the industry is moving relative to the 'progress' of human understanding.
Alternatively such concerns only arise with hindsight, such situations were not, or even could not be conceived beforehand.
I do not agree - the 'writing on the wall' was there in many people's minds and was reinforced by the absolute disbelief following the ?first? AB crash in service (Air India?) caused by mode confusion.
BOAC

Old 14th Jul 2011, 19:55  #192
Alf5071h
Failures of the man-machine interface, I would use the generic ‘malfunction’ not just failure. The problem-space consists of a combination of technology, human, and situation.
Agree, which points to an issue that was raised in another thread. Some of the pilots pointed out that a lot of their simulator training was spent doing regular procedures with all systems on. Of all the things to train, that's what one needs a simulator for least. Each repetition of those evolutions that you undertake in the daily job reaffirms that pattern. Apparently, most flights go along nicely with all systems working, thanks to the fairly high-reliability systems in place today.

So what do you do with that expensive sim time?

You do what you can't do in the aircraft with a load of paying customers: you set up situations that address the man and machine interfaces where the crew have to make a difference. I'd say that's time better spent in the sim. (Personal bias, I must confess, given how I used to run training sim events all those years ago. Heh, I liked to see 'em sweat!)

If the computers make things work, you need to work out how they work, and how they don't work. That way, you get the most out of the computers, regardless of how many are, or are not, working.
Well-reasoned arguments indicate that the pilot can no longer be expected to fully understand the technical system, nor can the designers accommodate the irrationalities of human behaviour or every combination of technical failures; and neither can understand the entirety of complex situations.
I think you are selling pilots a bit short on that score. I do not believe that comprehension of how the bits and pieces fit together is beyond the average professional pilot. Education is a continuum. I firmly believe that pilots get significant job satisfaction from expanding their professional knowledge.
Lonewolf_50

Old 15th Jul 2011, 14:00  #193
Lonewolf_50,
So what do you do with that expensive sim time?
As you indicate we spend a lot of time training specifics, but in complex ‘high tech’ situations we require greater generic skills, particularly those associated with situation assessment. Thus we require fewer action skills (many are still essential), and more thinking, pattern generating / matching skills.

Pilots have to be taught how to think in a high-tech world – the patterns will differ from those in basic training – and then use and practise these skills in representative scenarios; not just the man–machine interface, but the man–situation interface.

I agree that we should not ‘sell the pilots short’. Humans are still a valuable resource in complex situations, but also the element most prone to failure.
Perhaps my bias comes from experience; like you I had the pleasure of flying the T-28, but I’m not sure that I really understood all of it.

If the computers make things work, you need to work out how they work,”
I am not convinced of the need for that, or the practicality of achieving it – generating appropriate knowledge and the reliable ability to recall it when needed.
The previously linked ‘Systems Failure’ paper suggests considering complex systems at a higher level; e.g. autoland is computer controlled, but a pilot does not need to know whether this involves dual-dual or dual-dissimilar software/hardware; what is important is fail-passive vs fail-operational – ‘do something’ vs ‘do nothing’ (do less) in the event of failure. There has to be a balance between knowing what and knowing how; technology has changed this balance – has the training changed too?
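The fail-passive vs fail-operational distinction can be stated in a few lines of pseudologic: what matters to the pilot is what the system does after a channel failure, not the redundancy architecture behind it. A toy sketch (the names and behaviour strings are invented for illustration, not any certified autoland logic):

```python
from enum import Enum

class Design(Enum):
    # 'do something': redundancy lets the autoland continue
    FAIL_OPERATIONAL = "continue the autoland on remaining channels"
    # 'do nothing' (do less): disconnect cleanly, hand over to the pilot
    FAIL_PASSIVE = "disconnect without a hard-over; pilot takes over"

def on_channel_failure(design: Design, channels_left: int) -> str:
    """What the pilot needs to know: the behaviour after a failure,
    not whether the box inside is dual-dual or dual-dissimilar."""
    if design is Design.FAIL_OPERATIONAL and channels_left >= 1:
        return Design.FAIL_OPERATIONAL.value
    return Design.FAIL_PASSIVE.value

print(on_channel_failure(Design.FAIL_OPERATIONAL, 1))
print(on_channel_failure(Design.FAIL_PASSIVE, 1))
```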

Education, yes, but time and resource has to be balanced under currently dominating economic pressure. Are we, the industry, safe enough, or are we blinded by the potential of automation? Only the next accident might determine that.
alf5071h

Old 16th Jul 2011, 13:41  #194
Education, yes, but time and resource has to be balanced under currently dominating economic pressure
During a visual descent in the real aircraft the crew were directed by ATC to join the circuit downwind. F/O was PNF. To the surprise of the captain and a qualified observer in the jump seat, the PF turned base far too soon with the result the aircraft was so high on final that a go-around had to be made.

Later the F/O explained that despite 1000 hours on type he had never done circuits and landings in the simulator, and had certainly never been given practice at entering the circuit on the downwind leg. But he had done well over a hundred ILS approaches on the autopilot in the simulator, lots of one-engine-inop go-arounds, numerous multiple emergencies during LOFT exercises, no shortage of taxiing, and another hundred holding patterns on automatics and LNAV – but NEVER a circuit or a 35-knot crosswind landing.

All of these were regulatory box-ticking exercises during the cyclic training regime. And all he wanted to do was to increase his pure flying skills by manually flying a few circuits without any automatics, preferably in a crosswind. But the syllabus did not allow for that.

There surely must be a lesson here for the trainers, as this disregard of manual flying practice in the simulator is almost certainly widespread, in favour of heads-down button pressing and autopilot ‘monitoring’.
A37575

Old 16th Jul 2011, 20:25  #195
BOAC, #191
Re last part of your paragraph: - Where is the 'moderation/reality check' to be? There is little point in seeking additional responsibility without evidence of what the problem is – “dancing around 'the head of my pin'”.

If the piloting community are ‘a’ cause, and apart from you there are no great shouts for change, then is the industry happy with the current safety level? We still might be deluding ourselves – complacent, or overpowered by the bean counters.
In the first instance the moderation / reality check should be with the regulators, who have to balance the public (political) inputs with those of the industry, and of course the facts from accidents.
EASA safety review 2010 – a good, safe year.
EASA action plan – only tenuous links with automation (sect 5, automation policy; sect 6).

Avoiding Cbs by a large margin; do we? We may not avoid Cbs by a sufficient margin compared with previous operations with low-res radar etc. Investigating incidents in similar conditions to AF447 indicated that some regional crews tend to cut the Cb miss distance quite fine. This appeared to be aided by the use of modern technology – high-definition radar and an accurate autopilot-controlled flight track – we know where the storms are, where we are, and where we are going – technological complacency; but do we know the significance of knowing, or the limitations of our knowledge?

The Big Picture; many LOC accidents.
How many were a direct result (prime contribution) of failed automation, – few if any.
How many were due to crew/auto interface, - superficially a significant number, and of these speed awareness and trim contributed to many.
How many were due to disorientation, Go Around, FBW vs steam aircraft, - there is a mixture, but all in normal non-emergency operations.

If these contributors are all clearly identified, then which is more significant?
My biased, non-statistical rough cut places the human, and only the human, in pole position.
This is not blame, but recognition of human limitations, although from a different viewpoint – that involving modern technology – but that may just be a part of evolution.

"tempora mutantur nos et mutamur in illis"
safetypee

Old 17th Jul 2011, 23:19  #196
- basically what I have been saying on the other threads, but you need to FORGET 447 and think 'big picture', please? 447 is just one small cameo. In my opinion the issue is the speed and direction the industry is moving relative to the 'progress' of human understanding.
I don't think you're thinking big enough.

My issue with the whole training/complexity debate is that the debate tends to implicitly assume that the state of affairs we have at present will continue. It won't. Whether you call that 'progress' or use a more neutral term like 'evolution', technological change is a fact of life. That's a major reason why you can't rely on training to get you out of the difficulty. The training is always changing because the underlying technology is always changing, so there is no fundamental store or bank of experience that one can rely on. The idea that you or I or any group of people can yell "stop, wait till we catch up" is just sheer folly. The captain with 30 years' experience is in some ways worse off than the new guy, because he's got the old technology cluttering his head and the new guy doesn't.

Mode confusion is just an interim problem. That doesn't mean it should be ignored while it exists. But the long-term trend is going to be the elimination of mode confusion by the elimination of the being that is confused: the human. In the short run, training is probably our best hope. But improvements in training are not a sustainable long-term model for continued improvements in airline safety over the next 30-50 years.
MountainBear is offline  
Old 18th Jul 2011, 05:23
  #197 (permalink)  
Moderator
 
Join Date: Apr 2001
Location: various places .....
Posts: 7,183
Received 93 Likes on 62 Posts
The idea that you or I or any group of people can yell "stop, wait till we catch up" is just sheer folly

That's a real worry to me.

A basic tenet of activity in many areas is the "knock it off" (or some similar catch cry) approach to discontinuing whatever it is that appears to be going off the rails.

Clearly, we can't adopt a stop-and-wait approach, but we certainly need the option of some other technique to replace/supplement whatever may be causing us grief in the short term. For us old pharts, that might be a preference for the big O.F.F button, followed by getting things under control via a bit of pushing and pulling, and then the ON button to go back onto the automatics.

While I am quite happy with the idea that the Airbus machines enhance safety statistics overall, there remains a niggling disquiet if I can't find an OFF button to give me some chance in the event that the automatics simply give up. Caveat: I haven't flown any Airbus machines, so I am a tad in the dark as to the cockpit specifics.
john_tullamarine is offline  
Old 19th Jul 2011, 00:06
  #198 (permalink)  
 
Join Date: Apr 2011
Location: Oz
Age: 80
Posts: 2
Likes: 0
Received 0 Likes on 0 Posts
Noticed this thread is still alive. Now what exactly is being said? The point?

It seems every one of these posts lives and dies within a CRM-type rarefied crew environment so far removed from the reality of normal aviation business that, honestly, nothing here is worth commenting on.
senseofrelief is offline  
Old 20th Jul 2011, 17:00
  #199 (permalink)  
 
Join Date: Aug 2009
Location: Texas
Age: 64
Posts: 7,197
Received 391 Likes on 242 Posts
alf5071h

I appreciate that sim time costs money. I have some small experience in pilot training syllabus change and revision with the Navy. We were plagued by know-nothings who would harp on "replace aircraft hours with sim hours" without knowing what limitations sims have ... or even knowing which generation of sims we had been funded for.
(Grrrr, it still makes me mad, even now, how foolish this "guidance" was compared to the tools at hand. And then, if we wanted to buy new or upgrade the sims ... where was the money?) See also the thorny problem discussed in sim training for the stall or upset or spin case: can one afford to build the sim that can give you that training?

I suspect that the airline industry runs into similar institutional problems.

In re knowing your systems to the depth I advocate, versus a "need to know" level of training:

I cannot concur with knowing the systems only to the depth of one or two briefing slides. The ability to troubleshoot and work through a degraded mode requires both well-crafted SOPs and procedures (QRH/memory items/ECAM/what have you) and an understanding of what the system is doing as you turn various things on and off. You need depth of understanding.

A rough analogy is the understanding of how a car works when one has overhauled an engine and replaced a transmission,
versus
"get in and drive" level of knowledge typically resident in a motorist.

The former is often able to get more out of a car, or know what not to do with it, than the latter as things begin to go wrong.

As systems get more complex and interrelated, pilots must be educated, or educate themselves, or both, on how these complex pieces interact with one another.

Any organization will want to standardize training to ensure that a certain minimum standard is achieved and maintained, and a predictable result attained. (Education in depth will also help aircrew interact with techs/maintenance, and thus reduce fault-isolation and remedy time cycles.)

The institution has to invest in the continuing education.

See John T's comments about change. It is ever with us.

So too is the requirement, not the option, of both education and training, so that you get the most out of your system. <== That would seem to help the bottom line, would it not, if only via cost avoidance?

(At this point, segue to FOQA and someone yelling about Six Sigma in pilot training, and I'll be riding off into the sunset.)

Final thought: the computer in the cockpit is like a firearm in the hands of the ordinary citizen. Dangerous if you aren't well trained and educated in its use; a great asset if you are.
Lonewolf_50 is offline  
