PPRuNe Forums - View Single Post - Computers in the cockpit and the safety of aviation
Old 26th Jan 2011, 06:55
  #127  
PBL
 
Join Date: Sep 2000
Location: Bielefeld, Germany
Posts: 955
Originally Posted by alf
Peter, I disagree that the role of the certification pilot is limited by ‘math’.
alf, that may be either because I haven't explained myself well, or because you are not that familiar with the statistical reasoning, or both.

The practical limit of statistical testing of software-based functionality is around one failure (or dangerous failure) per hundred thousand hours, that is, a rate of 10^(-5) per op hour. You can bench-test the kit to this level, and maybe perform a certain limited variety of partial-integration tests, but you can't do full integration without flight test.
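To make the arithmetic concrete, here is a minimal sketch of the standard zero-failure demonstration calculation, assuming a constant-failure-rate (exponential) model and failure-free testing; the 99% confidence level and the function name are illustrative only:

```python
import math

def hours_needed(target_rate, confidence=0.99):
    """Failure-free operating hours needed to claim, at the given confidence,
    that the true failure rate is no worse than target_rate (per op hour).
    With a constant failure rate, the chance of seeing zero failures in T hours
    is exp(-rate * T), so we need exp(-target_rate * T) <= 1 - confidence."""
    return math.log(1.0 / (1.0 - confidence)) / target_rate

# Roughly the practical bench-test / flight-test limit described above:
print(f"{hours_needed(1e-5):,.0f} failure-free op hours for 10^-5")  # ~460,000
```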

Keep in mind that the certification standard for DAL A critical kit is 10^(-9) per op hour, that is, ten thousand times beyond the reliability level of which you can be assured to any reasonable level of confidence by bench testing and flight experience.
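For a feel of the gap, the same calculation (again only a sketch, assuming the constant-rate model and a 99% confidence level) applied to the usual targets:

```python
import math

def hours_needed(target_rate, confidence=0.99):
    # Zero-failure demonstration under a constant-failure-rate model, as in the sketch above.
    return math.log(1.0 / (1.0 - confidence)) / target_rate

bench_limit = hours_needed(1e-5)   # roughly what bench and flight testing can reach
for label, rate in [("catastrophic, DAL A", 1e-9),
                    ("hazardous", 1e-7),
                    ("the 10^-6 level discussed below", 1e-6)]:
    t = hours_needed(rate)
    print(f"{label}: {t:,.0f} op hours needed, {t / bench_limit:,.0f}x the testable level")
```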

If you want to be assured with reasonable confidence that dangerous anomalies will not occur at a rate any greater than 10^(-6) per op hour, it will actually take you the total op hours in the entire service life of the fleet to do so. And at 10^(-6) you are still a factor of one thousand short of the usual certification requirement for catastrophic events, and a factor of ten short of that for hazardous events. That is the combinatorics of software anomalies, and there is no way around that math. Recall what you said earlier:
Originally Posted by alf
raw data or otherwise, those at the front of the aircraft are going to use whatever is presented. Thus as you know, a key aspect of certification is that this data must not be hazardously misleading.
and it follows from what I just said that you currently cannot confidently get within a factor of ten of that assurance by general methods. There are some specific methods for specific architectures which promise to attain such assurance with confidence, but those methods are state-of-the-art research (I have just reviewed what will be a seminal piece of work on this, which should appear in 2011; then add the umpteen years it will take for this to become common knowledge...).

It took ten years of flying Boeing 777s around the world before the critical configuration anomaly showed itself out of Perth in 2005. It took fifteen years of flying A330s around the world before the filtering anomaly showed up at Learmonth.

Software-based systems are simply different. The math was put out there by a couple of seminal papers in 1993, and at the turn of the century there were still some supposedly knowledgeable avionics designers who did not know the hard limitations on testing of software, or on "proven through experience" supposed validations. Ten years after that, with Byzantine anomalies on one heavily-used machine that came within days of having its airworthiness certificate revoked, with the 2005 Perth incident, and with Learmonth and similar events, avionics engineers and assessors are somewhat more aware of the severe limitations.

I work on critical-digital-system standardisation committees with engineers who were still not precisely aware of the statistical limitations even a couple of years ago, fifteen years after the published results, even though there was a general awareness. However, the situation has recently changed in some countries such as Germany. I can't talk about the work until it is concluded and published, though, because of the protocols involved in standardisation work. It does not cover either avionics or medical equipment - just everything else.

Originally Posted by alf
Modern systems certification involves both man and machine; thus, more than one perspective is required, and neither need dominate.
Unfortunately the math dominates, as the auto industry now knows well. Manufacturers and component suppliers do extensive road testing of all bits of kit, as well as an enormous amount of unit testing and partial-integration testing. But some of that kit really does accumulate 10^8 to 10^10 op hours, amazingly, across all the installations throughout the industry. And it fails. And that costs the manufacturers and suppliers huge amounts of money in compensation, which they don't talk about but would dearly like to reduce.

The aviation industry doesn't often see that, because the op hours simply aren't there.

That doesn't make the role of a certification test pilot any less important than it ever was, as you carefully point out with good reason. But there are some things he or she just can't do.

PBL

Last edited by PBL; 26th Jan 2011 at 07:10.