
Aircraft Safety


5206
27th Mar 2005, 22:32
Sitting quietly in the background, I've read some interesting stuff on safety in a few threads. I have a few things to pose for thought from a procurement perspective:

1. How great is the pressure to accept lower safety standards?

2. How can this pressure be vented effectively?

3. How do our flying brethren view procurement safety - do they just want to get on with it and leave any argument to "at the subsequent BOI" or is there any interest in the overall safety of the platforms they strap themselves to?

Safety_Helmut
27th Mar 2005, 22:50
5206

You've raised some interesting points....


....but your motives for asking ? :suspect:

Safety_Helmut

5206
27th Mar 2005, 23:05
Been there, had that pressure, wonder what others do. Also wonder what those flying on the frontline make of what goes on in this field. My experience ranges from hostile (you just want to spoil our toys) to supportive (tell it like it is mate). Not after any project specific info - not my game - just general flavour of how project level safety is perceived out there.

Safeware
27th Mar 2005, 23:13
5206,

1. The more far reaching the effect, the greater the pressure.
2. Tell it like it is and let whoever is responsible up the chain make the call.
3. Varied, as you say.

sw

engineer(retard)
28th Mar 2005, 09:10
Agree pretty much with safeware, it is rare for anyone to be open and on the record about relaxing safety standards, it goes against ingrained training and experience. Point 2 is usually the focus for cases where the risk is not tangible. It also depends on who is going to pick up the tab for the fix as to where the pressure goes. If it is a fault of industry, then it is easier for the individual, if it is MOD requirements then hold tight for a rocky ride.

My own experience is that it is rare for someone higher up the chain to overrule or accept a stated risk, unless operations are involved. You have to be a brave man to put your nuts on the block in this way. If mitigating evidence to accept the risk cannot be produced, pressure may come to bear for advice or recommendations to be changed to give the required answer, then it is down to the strength of character of the individuals in the chain.

A single cause/effect scenario, do this and you die, is easily defended. It is when the risk is more esoteric that life gets difficult, i.e. in a given situation there is a statistical chance that a catastrophic event may occur if a particular combination of events happens. This is something that people cannot understand intuitively and will naturally resist.
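The "combination of events" point above can be sketched in a few lines: a catastrophic outcome that needs several independent conditions to line up at once has a per-flight probability far below anything intuition handles well. The individual event probabilities here are made-up illustrative numbers, not figures from any real hazard assessment.

```python
def combined_probability(event_probs):
    """Probability that all independent events occur together."""
    p = 1.0
    for prob in event_probs:
        p *= prob
    return p

# Illustrative only: sensor fault (1e-4) AND a particular flight mode
# (1e-2) AND crew distraction (1e-2), all assumed independent.
p_catastrophe = combined_probability([1e-4, 1e-2, 1e-2])
print(p_catastrophe)  # roughly 1e-8: "one in a hundred million"
```

A one-in-a-hundred-million chance per flight sounds negligible in isolation, which is exactly why the esoteric combined-event risks meet natural resistance.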

With 3, I have found it boils down to the experience of the individual. Aircrew who have been involved in procurement tend to be much more sympathetic because they have an understanding of the process that goes into providing a release to service. Hostile reactions tend to come when the perceived risk is seen to be blind obedience to rules or statistics.

Safeware
28th Mar 2005, 10:22
eng(retard),

re"hostile reactions", I agree, but would also add the situation where they don't understand the level of "the bar", or why, in that context, it is there. One in a million seems a very small risk, and software failure in particular is hard to put into a numbers game so when we talk about not meeting the required level, it can take significant effort to win the sceptics over.

sw

Safety_Helmut
28th Mar 2005, 10:38
Safeware
software failure in particular is hard to put into a numbers game
Software, something many people in the MoD would even struggle to spell, never mind understand the associated complex issues, and that's not even considering the issues of safety related software, or complex electronic hardware.

Consider statements such as "we haven't got time to worry about the software, we're too busy trying to get the hardware running".

Major projects about to enter service with no proper software support arrangements, oh yes, and a new 00-56 which requires a greater level of competence to understand and apply (from both parties). Not a criticism of the new standard by the way, just an observation, and a concern for an organisation with very little in the way of competence in this area. Give the aviation functional safety office a call and ask to speak to their software specialist.

:\

Safety_Helmut

Safeware
28th Mar 2005, 11:13
S_H

Major projects about to enter service with no proper software support arrangements

nobody listens eh? I had a quick look at the new 00-60 Pt 3 with eager anticipation, boy was I disappointed - couldn't really find what had changed in a completely re-issued document!

Give the aviation functional safety office a call and ask to speak to their software specialist

Methinks you are being ironic? ;)

From your perspective then, how do you see things changing under the new 00-56? And do you think it will make any difference to 5206's questions?

sw

Safety_Helmut
28th Mar 2005, 17:21
Safeware

From your perspective then, how do you see things changing under the new 00-56?
The new 00-56 is intended to allow a more flexible approach in developing safety related systems, and in principle this is a sound concept. The very prescriptive nature of 00-54/55/56 has caused many problems in the past. For example, certifying a civil derivative accepted by FAA/CAA/EASA etc as acceptably safe, but not meeting or not being shown to meet MoD's standards.

However, if you look at Mil-Std 882D, this too was intended to be much less prescriptive, now have a look at the draft 882E, it's gone the other way again, just as we reverse direction. If the MoD can properly staff/support its IPTs with properly educated and experienced system safety engineers then the future might look brighter, but how likely is this ?

Perhaps of more pressing concern are the issues surrounding the safety targets currently set in JSP553. Why set targets which cannot be met ? What position does that leave MoD in ? If the target is not achievable, and the fast jet fleet indicates that in their case it is not, then MoD should be looking to redefine the target. Unfortunately, one document which sought to do this, albeit in a poor manner, is BP1201 (ES(Air)); this is about to disappear because it is felt that POSMS can replace it. Well, have a look at POSMS and nowhere in there are aviation safety targets defined.

I have heard it said that one of the worst things you can do in this field is to say that you are doing something (safety activity) and then not do it. You really do leave yourself open to problems by taking this course.

So to your question, will it make any difference to 5206's questions: well, reading across all 3 of them, the MoD is likely to find contractors wanting to be much more flexible in their approach. Unfortunately the MoD is unlikely to understand what contractors are doing without bringing in consultants. On the plus side, contractors are becoming increasingly aware of the business risks they are taking regarding product liability.

Oh yes, Safeware, did you get your name from Nancy ?

Safety_Helmut ;)

Safeware
28th Mar 2005, 19:56
S_H,

did you get your name from Nancy ?
Well, it seemed to fit, given what I do now, and I certainly wouldn't want to claim to be Nancy's love child would I? :)

Intriguing how we always seem to be out of phase with the US, isn't it! I agree about achieving targets - demonstrating them in a safety case is hard enough; as for reality, hmm.

I have heard it said that one of the worst things you can do in this field is to say that you are doing something (safety activity) and then not do it.

I think a pragmatic approach is: even if you say you are doing something and then don't, is the safety case affected? Does it matter? Is there alternative evidence? It's the stuff that dents the safety case that matters when it isn't done.

If the MoD can properly staff/support its IPTs with properly educated and experienced system safety engineers then the future might look brighter, but how likely is this ?
It can't in the long term, mainly because when it does get the right people in, it can't keep them: "safety is everybody's business" but not for a career path.

sw

Safety_Helmut
28th Mar 2005, 20:22
Safeware

I may not have got my message across very well when I talked about claiming to do something which is not done. You are of course correct when you question the importance of such if it does not affect the safety case.

I had heard of this situation with regard to the manner in which investigations have been conducted in the past, particularly when lawyers have been involved.

I think a very good example of this would be claiming to operate a SMS, this would obviously then add significant weight to the safety case. Have a look at DASMS and see what you think ?

Safety_Helmut :\

Blacksheep
29th Mar 2005, 03:32
Two important "words"

Risk and ALARP

No-one deliberately messes about with safety in the aeronautical world. Sometimes the risk assessment turns out to be wrong, true, but people don't deliberately ignore other folks' safety either.

We just don't.

Safety_Helmut
29th Mar 2005, 07:01
Two misunderstood and abused words:

Risk and ALARP

Once heard some fairly senior management discussing "injecting risk" into aircraft maintenance activities. When asked what sort of risk they were referring to, eg safety, business, mission etc, they didn't know; they hadn't even considered it.

Of course you are right, no one would deliberately compromise safety, would they ? But when your safety management arrangements are dependent on competence which does not exist, and you know that, are you not still culpable ?

Safety_Helmut

engineer(retard)
29th Mar 2005, 10:44
The problem with ALARP is what is reasonable and to whom? I know that you (Helmut and SW) can cite case law and I would be interested to see it.

Also, defining systems engineers is another bone of contention. I often see job ads for systems engineers that only discuss software (sometimes only Windows); to me a systems engineer is broad based - for aerosystems that should also include structures, engines, aerodynamics etc. Software is only part of the necessary skill set.

I'm sorry to say that I do not entirely agree with Black Sheep. I have had a situation in the very recent past where all parties except the industry project management agreed that there was a significant and unacceptable safety risk (even the industry test pilot and chief engineer agreed with us, until they were no longer invited to the meetings) that could not be mitigated.

It took nearly 18 months of point blank refusal to endorse a design review and attached milestone payment before industry relented. SW will almost certainly know of this case, happy to provide a PM as a memory jogger. They did not set out to compromise safety, but having done a poor preliminary hazard assessment and been late in providing their safety documentation, they ended up a long way down a commercial creek without a paddle. This was a staring match to see who would blink.

Regards

Retard

Safeware
29th Mar 2005, 11:24
BlackSheep,

Maybe it is your commercial flying environment but "No-one deliberately messes about with safety in the aeronautical world" isn't always correct - it is rare, and not just for (military) operational reasons.

eng(retard),
I can't personally cite case law, but I can see where it could be cited in future (but hopefully not). I do agree with you on the systems engineer point though. The RAF purports to train aerosystems engineers, but few get the whole picture as you describe.

Have also been in staring contests - doesn't do anyone any good.

sw

5206
29th Mar 2005, 11:30
Thanks guys, some interesting discussion.

Equally? interesting is the fact that it is all engineers who have replied. :(

5206

Safety_Helmut
29th Mar 2005, 11:32
ALARP (As Low As Reasonably Practicable) is a principle based on the 1949 case between Edwards and the NCB. In his summing up the judge stated:
“’Reasonably practicable’ is a narrower term than ‘physically possible’ and seems to me to imply that a computation must be made by the owner in which the quantum of risk is placed on one scale and the sacrifice involved in the measures necessary for averting the risk (whether in money, time or trouble) is placed in the other, and that, if it be shown that there is a gross disproportion between them - the risk being insignificant in relation to the sacrifice - the defendants discharge the onus on them.”
The following quote is taken from "Principles and Guidelines to Assist HSE in its Judgements that Duty-Holders Have Reduced Risk As Low As Reasonably Practicable"
“In any assessment as to whether risks have been reduced ALARP, measures to reduce risk can be ruled out only if the sacrifice involved in taking them would be grossly disproportionate to the benefits of the risk reduction.”
Establishing this level of gross disproportionality is of course contentious. Look back at the examples where companies have known of a risk and made a decision to take the hit and pay compensation to those affected.
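The "computation" the judge describes can be rendered as a toy sketch: risk-reduction benefit on one scale, sacrifice on the other, and the measure can be ruled out only if the sacrifice is grossly disproportionate. The disproportion threshold of 10 here is an illustrative assumption for the sketch, not an HSE-endorsed figure, and real ALARP judgements are far less mechanical.

```python
def measure_required(cost_of_measure, value_of_risk_reduction,
                     gross_disproportion_factor=10.0):
    """Return True if the ALARP principle obliges the duty-holder
    to take the risk-reduction measure (sacrifice not grossly
    disproportionate to the benefit)."""
    return cost_of_measure < gross_disproportion_factor * value_of_risk_reduction

# Measure costs 50k and averts 10k of statistical harm: 5x is not
# "gross" disproportion under this threshold, so the measure is required.
print(measure_required(50_000, 10_000))   # True
# At 500k against the same benefit the sacrifice is grossly
# disproportionate, so the measure may be ruled out.
print(measure_required(500_000, 10_000))  # False
```

The contentious part, as noted above, is exactly where the disproportion factor sits and who gets to set it.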

The HSE document "Reducing Risks Protecting People" (R2P2) has some very good guidance on the subject. Particularly useful for the "what is reasonable and to whom" question. It is possible, and relatively straightforward to establish mortality rates for different occupations, pastimes, lifestyles etc. These are then used to establish levels of acceptability. Consider the differing levels of risk accepted by, a pilot, a maintainer, a storeman and finally the public who we overfly.

An interesting comparison is that of our ALARP based risk management for safety (despite having quantitative airworthiness targets) and the process used for large commercial aeroplanes, eg the target set in CS25.1309. Now it is obviously not possible to make such a straightforward comparison, but ALARP does not seem to appear in the civil aeroplane building world. However, if you look into it more deeply, the targets are based upon historical loss rates that have been deemed to be acceptable. Combining these with an assumed number of critical systems then gives us the mythical 1x10^-9 figure for individual systems.
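The derivation mentioned above can be sketched numerically: start from a historically acceptable fatal accident rate, assume some share is attributable to systems, and spread that over an assumed count of potentially catastrophic failure conditions. The numbers below are the conventional illustrative ones usually quoted around CS25.1309, not values taken from the standard itself.

```python
# All three inputs are assumptions for illustration.
historical_fatal_accident_rate = 1e-6   # per flight hour, all causes
system_related_share = 0.10             # ~10% attributed to system failures
catastrophic_failure_conditions = 100   # assumed per aircraft type

per_system_target = (historical_fatal_accident_rate * system_related_share
                     / catastrophic_failure_conditions)
print(per_system_target)  # approximately 1e-9 per flight hour
```

Which is how an empirical fleet-level loss rate turns into the "mythical" 1x10^-9 per-system figure.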

Safety_Helmut :confused:

BEagle
29th Mar 2005, 13:40
"Principles and Guidelines to Assist HSE in its Judgements that Duty-Holders Have Reduced Risk As Low As Reasonably Practicable"

Mmm - that sounds a right riveting read. Way up there in the best-seller lists alongside 'The Paperclip' and the 'APFS Newsletter', perhaps?

:p

Safety_Helmut
29th Mar 2005, 13:58
5206

Part 3 of your original question has perhaps been answered, the number of engineers answering your question, oh yes, and BEagle's response above.

Safety_Helmut

John Farley
29th Mar 2005, 14:56
Hi 5206.

I have thought long and hard before contributing to this thread because for the last 40 plus years I have been trying to improve (aircraft) safety in one capacity or another. All I have learned tells me that the whole thing is a very complex issue and not easy to simplify in any worthwhile way.

However I will try and show where I agree with previous posters but also where I may have other views. But it will be a long post.

I think your three questions were a very good start at trying to identify the relevant issues; however, in the context of such a complex total subject they have limitations.

For example your first question – exactly what do you mean by ‘lower safety standards’? I think I know what you mean but I may not be right. I think you mean ‘lower than we have happily experienced and accepted in our various fields in the past’. Even if I am right this will mean different things to different people depending on their expertise. An engine man today may say his engines are now very much more reliable, by any measure you care to use, than those of previous generations – and it would be hard to disagree with that example. But as we introduce radically new systems and concepts we necessarily open the door to new failure cases.

As an example of what I mean, in the late 1960s when the development Harriers started flying around with a HUD (the first HUD for the RAF), it was not uncommon for the four pilots involved at Dunsfold to each experience a HUD failure every month. Many did not matter (ie did not create a flight safety risk at the time) because they were failures to off or obvious corruptions of the display. But a few were potentially real killers. One day flying visually back home I happened to notice the HUD showed 30 deg of bank even though I had my wings level at the time. It was not frozen and responded normally to bank changes. This was a case of incorrect information being perfectly displayed by what was also a very compelling display. Tricky for anyone in cloud.

The response of the ‘establishment’ was to say pilots should not use the HUD as a primary flight reference but should also scan head down instruments as a cross check. So now the HUD has made the pilot’s life harder not easier (at least in my book). In the 70s there was a spate of Jaguar and Harrier fatals where the HUD was implicated by the BoI. OK tell the chaps to scan better. Bloody rubbish. They were being badly let down by their kit.

To cut a long story short I wrote an AGARD paper entitled 'Modern Flight Instrument Displays as a Major Military Flight Safety Weakness' which started like this:

Quote:

Consideration of the major causes of flying accidents over which the airframe and engine manufacturers can exert a powerful influence shows the following list:

1. Structural Failure

2. Engine Failure

3. Flying Control Failure

4. Instrument Failure

5. Pilot Error

With the first three of these causes - Structural, Engine and Flying Control failures - while mistakes do occur, the manufacturers have a reasonable record, there is no evidence of complacency, and in addition there is a large well established, government controlled, national bureaucracy offering valuable checks and advice on testing and airworthiness certification. Pilot error in different, but appropriate ways, also attracts much effort aimed at its reduction. Most importantly, so far as the purposes of this paper are concerned, the accident trends related to the first three causes, as well as those due to pilot error, do not appear to have changed fundamentally during the last decade. The same cannot be said of instrument display related accidents.

Since the advent of Head Up and computed displays in general, and the operator's real need to expand the non-visual manoeuvre envelope, there has been a marked increase in display related accidents/incidents in both operational and development flying.

This note suggests that attempts at curing the problem have been based on a false assumption that has ignored the reality of the piloting task in modern high performance jet aircraft. Proposals are offered to improve the situation by both engineering and organisational changes.

End of quote

It seemed to me from my BLEU autoland experience that no engineer would ever send a signal to the tailplane jack of an airliner during an autoland without validating it before it was sent. I suggested in that AGARD paper that flight instrument information should be similarly validated before it was sent to the cockpit because it is no less VITAL to the safety of flight in many circumstances.

Now you say – interesting, but so what? The point I am trying to illustrate is that each generation of developers has their own safety issues to IDENTIFY and then confront and fix. I suspect that many of you have concerns regarding flight critical software, in which case I can only sympathise with you in trying to put across your concerns to non software literate people. But try you must, because only YOU are in possession of the facts and understanding – which brings me to the second question.

Only workers at the coal face can vent the pressure to make do with less than you think is safe.

I was a worker once BUT I was very privileged to work for a design office that had a good culture. If I got the information and put it across properly my views were usually accepted as the culture was to make better aeroplanes not more profit. And before anybody says it is not like that in the real world today I understand the problem but it never does any harm to mention that better goods sell better – year after year and if they are good enough will even be bought by the Pentagon. (You don’t do that by just doing the minimum to meet some home grown spec and contract). In a few cases my views were not accepted so then you need plan B where you get the customer in the pub and explain what the problem is so that the next time they come to fly your jet they amazingly trip over the problem which elevates its status to the required level. Civilian court martials do exist but providing you are right they can’t really toss you out.

Safeware said ‘Tell it like it is and let whoever is responsible up the chain make the call.’ Mmmmm. Not entirely sure about that.

In my book a test pilot who flies the jet, gets the data, reports on it and then sits quietly down until he is asked to fly again is NOT a test pilot but a pilot who flies flight tests. To me a test pilot is someone who accepts the responsibility to push until the report is implemented. He wants a safer jet. I think many engineers are able to push harder and higher if they choose to. It is a lot of work, involves personal risks but can give the greatest satisfaction when what you know needs doing is done.

Safeware’s answer to question three gives me no problem. He is right. Ask any five pilots a question and you will likely get six answers because one of ‘em will later change his mind. I jest a tad of course but aircrew are the same as engineers some are better quality than others when it comes to the issues we are talking about here.

JF

PS edited to include the title

5206
29th Mar 2005, 15:40
Thanks John, it seems that some things never change. Is your paper widely available?

What I meant by "accept lower safety standards" was that the bar has been set (rightly or wrongly) and, having failed, excuses as to why accepting a lower standard of evidence is ok may be made. This as opposed to "life's getting harder"

As for pilot response, it strikes me that when things go wrong a la Mull of Kintyre, people become very interested in what went on beforehand - the Boscombe FADEC analysis etc. All this is quite right - IMHO the pilots can't be accused of negligence because there is insufficient evidence to support this claim. However, all this after-the-event stuff is because of a tragedy - what is the perception of things before the tragedy (and the support for those trying to argue for a robust safety case), without the benefit of hindsight?

What are we going to say when C-130J/Merlin/JSF/Typhoon/Apache crash because of a software error that can't be traced but the safety case is weak?

Flying is risky, we engineers don't aim to take the fun out of it though.

5206

Safeware
29th Mar 2005, 15:54
John,

Safeware said ‘Tell it like it is and let whoever is responsible up the chain make the call.’ Mmmmm. Not entirely sure about that
This was from my perspective - I am not in the same organisation as the man making the decision, but he wants my advice, so all I can do is explain as best I can:

I can only sympathise with you in trying to put across your concerns to non software literate people. But try you must, because only YOU are in possession of the facts and understanding
That's what I have to do, but as a systems engineer.

I like your view on the role of the tp.

sw

John Farley
29th Mar 2005, 16:20
Thanks chaps

If you have access to AGARD papers it was AGARD Conference Proceedings No 347. ISBN 92-835-0342-2

But if anybody wants the text just send me your email address

As to your point 5206 about arguing before the event I guess any case for the installation of adequate crash recorders - including retrospectively - is hardly a difficult one to justify on cost grounds. May not be a sexy cause but it sure is one good way to spend money. Look at what they are trying to establish from a heap of Herc bits in Iraq just because it had no recorder.

Future aircraft safety always has depended on learning from the past - whether that was yesterday or yonks ago. Without recorders one's learning may well be very limited. But you all know that.....

JF

MovinWings
29th Mar 2005, 16:59
JF- More words of wisdom from the wise, and well informed! As another poster mentioned, things don't change much.

However, how do the posters on this forum feel about the introduction of quite drastic man-power cuts on the front line? Single man see-offs, self checking / signing technicians etc.... Same Task....Less People + More Limited Supervision = Disaster (Surely?)....... Also, those self signing / supervising technicians will be working harder, that has to help.

Will this be a tough lesson in the obvious? I truly hope not.

MW

5206
29th Mar 2005, 17:22
MW,

I agree that cuts at the sharp end increase risk when there is no corresponding cut in task.

IIRC, there was (about 10 years ago) a sqn boss who was v frustrated that there wasn't enough manpower to see off either 4 or 6 jets. This was just after a MAVA which had drastically cut sqn manpower - working from HAS site. Boss went to the Sqn WO, responsible for manpower, and demanded to know what was up. The WO ran through the manpower chart - leave, courses, GDT, detached etc etc - and identified that if everyone on site was thrown at the see-offs - no trade cover etc - they would be one man short. "What would that man have to do?" asked the Boss. Man a fire extinguisher was the reply. Job was done, jets seen off, Boss returned to office and declared the Sqn non-op due to a lack of manpower. Went down like the proverbial.

5206

BEagle
29th Mar 2005, 17:44
JF - many thanks for your sage words!

I am currently involved in a little discussionette with a certain aeroplane manufacturer regarding manual override for certain safety critical systems controls which they consider are not needed - software will be sufficient. Hoorah....

Really?

An ex-Harrier TP chum (Airbedane) has given me an insight into his view of the dangers of relying upon s/w after it nearly killed him - and the particular s/w I'm discussing has never been specified to be flight critical... Thus my old-fashioned and slightly jaundiced aircrew view that s/w will f**k up and I want my aircrew chums to survive that incident is reinforced by very high-priced TP opinion!

Just see the recent Virgin Atlantic experience with the A340-600 fuel system computerised madness to see what I mean. For ex-Hunter mates, it's like having the bingo lights monitored by the fuel gauges - so if the gauge says there's plenty and the bingo sensors say there isn't, then the bingo lights are overridden.....

Until it goes quiet, that is...... Then on comes the "Oh Bugger" caption.
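The override BEagle describes above reduces to a few lines of flawed logic: a low-fuel sensor whose warning is gated by the fuel gauge, so a gauge that fails reading high silently suppresses a genuine warning. The function name, the threshold and the logic are invented for illustration, not taken from the actual A340-600 design.

```python
def bingo_light(bingo_sensor_low, gauge_reading_kg, bingo_level_kg=1000):
    """Toy version of the flawed design: the computer trusts the
    gauge over the independent bingo sensor."""
    if gauge_reading_kg > bingo_level_kg:
        return False  # gauge says plenty, so the warning is overridden
    return bingo_sensor_low

# Gauge stuck high while the tanks are actually nearly empty:
# the crew get no warning until it goes quiet.
print(bingo_light(bingo_sensor_low=True, gauge_reading_kg=5000))  # False
```

The independent sensor only adds safety if its warning cannot be vetoed by the very channel it is meant to back up.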

Safety_Helmut
29th Mar 2005, 19:41
5206, you asked:
What are we going to say when C-130J/Merlin/JSF/Typhoon/Apache crash because of a software error that can't be traced but the safety case is weak?
The short answer I suppose is that it may be very difficult to conclusively prove a software fault was to blame.

Many years ago I discussed a project to look at 'instrumenting' real time critical software systems to aid post incident investigation. It would in theory I believe be possible to do, however, in practice there are significant issues to contend with, eg timing overheads, increased complexity etc.
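The 'instrumenting' idea above can be sketched minimally: keep a small fixed-size ring buffer of recent events inside the critical software itself, so a post-incident investigator has something to read back. The fixed buffer bounds the memory overhead, but the per-call timing cost of the logging is exactly the practical objection raised. Class and field names are invented for the sketch.

```python
from collections import deque

class FlightSoftwareRecorder:
    """Toy in-software event recorder with bounded overhead."""
    def __init__(self, capacity=1000):
        self._events = deque(maxlen=capacity)  # oldest entries drop off

    def log(self, timestamp, label, value):
        self._events.append((timestamp, label, value))

    def dump(self):
        """Read back the surviving history after an incident."""
        return list(self._events)

rec = FlightSoftwareRecorder(capacity=3)
for t in range(5):
    rec.log(t, "pitch_cmd", t * 0.5)
print(rec.dump())  # only the 3 most recent events survive
```

In a hard real-time system even this append has to be shown not to perturb the timing budget, which is where the theory meets the significant issues mentioned.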

As to the safety case being weak: A standard approach to improving integrity in critical systems is design and development diversity, for example the FCS architectures in modern airliners. Now have a look at the avionics and flight control architectures of some of the aircraft cited above, multiple lanes of identical hardware, catering for random hardware failure. Unfortunately often running identical software which will fail systematically. Have a look at Ariane 501 for a good example of what can happen. So in these situations, are the safety cases already weakened ?
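The identical-lanes point above can be made concrete: triple-redundant hardware with a majority voter defeats random failures, but if every lane runs the same software the same systematic bug fires in all three lanes at once and the voter happily confirms the wrong answer. The wrap-around fault here is a loose nod to the Ariane 501 overflow, not its actual code; everything is illustrative.

```python
def buggy_lane(sensor_value):
    """One software lane with a systematic fault: values outside the
    assumed range silently wrap, like a 16-bit signed conversion."""
    return ((sensor_value + 32768) % 65536) - 32768

def vote(outputs):
    """Majority voter: returns any value at least two lanes agree on."""
    for candidate in outputs:
        if outputs.count(candidate) >= 2:
            return candidate
    raise RuntimeError("no majority - lanes disagree")

lanes = [buggy_lane, buggy_lane, buggy_lane]  # identical software in each lane
reading = 40000                                # outside the assumed range
outputs = [lane(reading) for lane in lanes]
print(vote(outputs))  # all three lanes agree on the same wrong value
```

Design diversity means the lanes would not share the bug, so at least one would disagree and the voter (or a monitor) would have something to catch.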

Safety_Helmut

John Farley
29th Mar 2005, 19:57
BEags

Stick at it mate. It can hardly be career limiting now!

JF

5206
29th Mar 2005, 20:40
Found this today:

Autopilots 'turn crew into machine-minders and threaten safety'
By Barrie Clement, Transport Editor
29 March 2005


"Autopilot" is a word from aviation that has entered common usage, but the increasing use of computers on aircraft is potentially dangerous, according to pilots.

Senior aircrew believe that the growing reliance on electronics has reduced pilots to "machine minders'' with a decreasing ability to fly the planes manually.

Manufacturers are being warned that the principal cause of passenger deaths - "controlled flight into terrain", where a serviceable aircraft is inadvertently flown into the ground - is being caused by the domination of computers.

Malcolm Scott, a senior British Airways pilot, warned that in an emergency flight crew could be misled into shutting down the wrong engine.

Writing in The Log, the journal of the British Airline Pilots' Association (Balpa), he says that Ecam, an electronic aid to decision making, can actually give the wrong advice.

He gives the example of a bird-strike encountered just after take-off by an Airbus A320. "One engine indicates an engine fire that is delivering full power, while the other engine has failed ... The Ecam prioritises the fire and instructs the crew to shut down the only engine delivering thrust. To follow the Ecam would result in the certain loss of the aircraft,'' Mr Scott writes.

He said that he had demonstrated the scenario on a simulator, but the senior training captain concerned was convinced that Ecam must be right and must be followed.

Mr Scott argued that in the late Nineties the industry was facing a crossroads - the pilots could either be progressively "designed out'' of the system or aircraft engineers could ensure that the captain's role was strengthened. Mr Scott believes the aviation industry has "more or less abandoned'' the era of the pilot.

He said his employer was increasingly discouraging manual flying on its Airbus fleet. "This has led to a de-skilled workforce with a consequent rise in manual flying errors,'' he writes. Mr Scott believes that the trend will lead to fully automated airliners. A transition phase would be a fully automatic aircraft with one human "systems monitor'' on board.

He called for better training so that instead of instructing pilots to follow computers blindly, they would be educated about their fallibility. "We need to develop procedures that take advantage of human strength while being tolerant of human weaknesses,'' he writes.

Mervyn Granshaw, chairman of the association and a working pilot, said that the industry had been warned about the concerns on the flight deck. "It might be time to get a grip on the situation, although it might be too late,'' he said. He believes that Airbus had gone further in reducing the input of pilots than Boeing.

"The great thing about computers is that they can process that amount of data and give you answers, but they are not perfect. There are scenarios that have not been thought about. It's not because we as pilots are special or precious, but that the human intellect has something to contribute."


It was here:
http://news.independent.co.uk/uk/transport/story.jsp?story=624483

So, we need more systems monitors then? :) :) :)
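Mr Scott's bird-strike scenario in the article quoted above is, at heart, a prioritisation bug: a checklist generator that ranks a fire warning above a failure warning without checking whether the fire-warned engine is the only one producing thrust. The structure and names below are invented for illustration and bear no relation to the real Ecam logic.

```python
def ecam_advice(engines):
    """engines: list of dicts with 'fire' and 'delivering_thrust' flags.
    Returns the index of the engine the naive priority logic says to
    shut down first, or None if no action is advised."""
    # naive priority: deal with any fire first, failures second,
    # with no cross-check on remaining thrust
    for i, eng in enumerate(engines):
        if eng["fire"]:
            return i
    for i, eng in enumerate(engines):
        if not eng["delivering_thrust"]:
            return i
    return None

engines = [
    {"fire": True,  "delivering_thrust": True},   # engine 0: fire warning, full power
    {"fire": False, "delivering_thrust": False},  # engine 1: failed
]
print(ecam_advice(engines))  # 0 - the only engine producing thrust
```

Which is the article's point in miniature: the logic is internally consistent, confidently presented, and wrong for a scenario its designers never thought about.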

Blacksheep
30th Mar 2005, 01:06
In my earlier post I said
Sometimes the risk assessment turns out to be wrong, true, but people don't deliberately ignore other folks' safety either.
Much of the foregoing discussion seems to bear this out. Facing down people, who due to a lack of technical insight may try to manipulate the Risk Assessment, is part of the job; ALARP means what it says and we can't simply build in everything that anyone can possibly think of in the name of safety. Everything that adds cost must be justifiable on the basis of a proper Risk Assessment. That the assessment may sometimes turn out to be wrong doesn't necessarily defeat the argument.

Some of John Farley's discussion, especially regarding 'taking the customer down the pub', illustrates what one might need to do when testing indicates that the risk has been incorrectly calculated. It's part of the process, but I still suggest that aviation people do not deliberately ignore other folks' safety. Actually, being belligerent by nature, I find that fighting my corner is one of the most enjoyable aspects of my job; as some of you have already noticed, many engineering people possess this characteristic.

engineer(retard)
30th Mar 2005, 12:47
Blacksheep

The issue I discussed previously was purely a financial case, and the people I was fighting were engineers who had become project managers. Admittedly, there may have been financiers pulling the strings behind them. Whilst I agree that fighting your corner is a common trait, you are not always in the office where the final decision is made. Sometimes your objections have been paraphrased and weakened as your report moves through the system. Fortunately, I have not been in the position where "I told you so" has happened and hopefully never will.

Beagle keep fighting, I would suggest that you make them present their safety case proving they are correct. I have found this to be the path of least resistance.

Helmut

I find the common hardware case a bit scary, as systematic hardware failures are not unknown, particularly if there is a divergence from the cleared envelope or environment. As I am sure you are aware, even with multiple redundant systems, single points of failure are found late in development. In my experience this has been at either end of the system: stick or control surface.

Safeware
30th Mar 2005, 20:42
eng(retard)

True, there are issues with h/w redundancy, but diversity and No Single Point of Failure (NSPF) "should" also be considerations.

sw

Blacksheep
31st Mar 2005, 01:52
engineer(retard), I'm sorry to have to admit that I'm one of those turncoat project managers myself. I do try never to let safety slip - it's something that was ingrained in me from the very first day of my apprenticeship and I see it as one of the most important considerations in aviation.

5206
31st Mar 2005, 20:46
For those that don't cruise other forums, found this:

A340 Fuel computers (http://www.pprune.org/forums/showthread.php?s=&threadid=168501)

A good example of why the UK looks to more stringent development (and safety evidence) for High Integrity software.

And for our military aircraft with fewer engines and fewer crew??

5206