PPRuNe Forums - View Single Post - Error reporting and safety psychology
Old 20th Nov 2001, 01:51
Genghis the Engineer

(Dr?) Grant,

A very interesting question, and one I've debated on several occasions with medical friends.

Speaking as both a pilot and an engineer, I'd say that the culture you are asking about pervades - has to pervade - all of professional aerospace. It's not specific to pilots, although the piloting profession has its peculiarities.

First, let me try to explain what the professional aerospace culture is. It is built on obsessive attention to detail, and on continual doubt and questioning of one's own and one's colleagues' abilities. Anybody not seen to doubt their own infallibility is regarded with great suspicion. A co-pilot is encouraged to question his captain's actions; I praise any of my subordinates for spotting my own cock-ups, and they know I will point out their mistakes, then "file and forget". This is a hard culture to adopt, and most of us take a few years to get the hang of it - I'd say about eight in my case.

How is this achieved? Well, probably a lot of it is because all of us are subject to constant monitoring and review - by senior pilots, by the authorities - in a continuous chain. Even national authorities such as the FAA and CAA are annually audited by an international body, ICAO. This is ingrained in the culture, but there is also the fact that next month you might be auditing the chap who is auditing you now (or vice versa), which discourages blame. Next month my office will be ransacked by the annual CAA audit, which I look forward to as a way to ensure I'm doing the job right - honest. If (or usually when) they find deficiencies, we will agree a programme to bring things back up to speed; they will NOT take disciplinary action except after repeated opportunities to correct problems (which has happened, but only when the CAA thought it necessary to protect the public).

However, there are other, more tangible elements to the system; one of the most significant (at least here in the UK) is something called CHIRP, the Confidential Human Factors Reporting Programme ( www.chirp.co.uk ). CHIRP is an independent organisation to which anybody may send a report on a failing by themselves or anybody else. The receiving body carefully de-identifies the report, then passes it to a panel of experts for investigation. If the panel has questions, they go through the receiving body (the only people who know your identity). Ultimately a report is issued in a way that ensures everybody within the industry knows a mistake was made and how to avoid repeating it. Exceptionally, a company (say, an airline) might be told "this problem occurred with one of your crews - amend your training procedures". Once the report is issued, the reporter's details are destroyed. CHIRP exists throughout civil aviation, in both flying and engineering, and the military have their own equivalents which work the same way.

Another example (which, it should be said, works better in the UK than in the USA, at least in civil aviation) is accident investigation. The UK Air Accidents Investigation Branch has a mandate to investigate accidents "so that there will be no more accidents". They are allowed to blame procedures or manuals, or to recommend amendments to equipment. They are not allowed to blame people. I have actually sat in a meeting at the AAIB where the Principal Inspector stopped the meeting and said "I'm sorry, we're getting too close to blaming somebody in this discussion."

So, when professionals know that their mistakes are investigated like that, they are prepared to be open. Nobody wants to be bad at their job, so if they are allowed to go to their colleagues for help, or to report their own mistakes for others to learn from, they will.

I hope this helps, no doubt others will have their own views.

G