PPRuNe Forums - Simulator Training for strong crosswind landings
15th Jun 2014, 21:47
  #71
AirRabbit
 
Join Date: Apr 2005
Location: Southeast USA
Posts: 801
Originally Posted by mnttech
AirRabbit

Interesting posts. In your opinion, what has the effect on US simulator standards been of the change from AC 120-40C to FAR Part 60? What has the effect been of having most, if not all, of the inspectors based out of one city instead of all over the country?
Hi mnttech
With respect to your first question (In your opinion, what has the effect on US simulator standards been of the change from AC 120-40C to FAR Part 60?): About 18 to 20 months ago I took part in a seminar whose participants included a substantial representation of airplane manufacturers, simulator manufacturers, airlines, pilot organizations, training organizations, and training providers, as well as regulatory authorities. During one of the session breaks, the 15 or 20 delegates who gathered around the podium discussed what each believed was the most significant advancement in simulation in the past decade. A rather surprising majority agreed that it was likely the publication of the US FAA's Part 60 regulation. Some cited the fact that the document was a regulatory requirement yet was also full of explanations and example references. Others who cited its explanatory nature also noted that it covered simulators and training devices for both airplanes and helicopters, at all the appropriate levels for each; still others felt that the manner in which the document was developed (the wide-ranging participation of all interested parties) made it the most significant advancement in the use of simulators.

Additionally, you might be interested to know that Advisory Circular 120-40C was actually never completed and published, although I understand a few organizations were authorized to use "Draft -40C" for some applications, as there appeared to be significant commitment and funding that might otherwise have been lost (attesting to the FAA's willingness to recognize the "real world" rather than take a cold, disinterested view of individual circumstances). The Part 60 document did, in fact, pick up many of the major differences from AC 120-40B on which -40C was to be focused. I know the FAA's stated motivation was to consolidate, as far as possible, the regularly used and relied-upon references and standards into a single document (the Part 60 regulation), so that individuals would not have to continually consult 5 or 6 other documents to answer a single question or determine a single requirement. I suspect this was not accomplished to the extent some had desired, but it did provide a wider range of answers to questions that previously required researching multiple documents.

With respect to your second question (What has the effect been of having most, if not all, of the inspectors based out of one city instead of all over the country?): The FAA organization charged with the responsibility and authority to conduct inspections and issue qualifications was originally a small staff headquartered in Washington, DC, named the National Simulator Evaluation Team (NSET), and it was later moved to Atlanta. This mirrored the agency's goal of having each of the 9 US regions, i.e., New England, Eastern, Southern, Central, Southwestern, Great Lakes, Rocky Mountain, Western, and Northwestern (an Alaskan region was established later), serve as the Region-of-Responsibility for one of the major FAA functions. The Southern Region, headquartered in Atlanta, was designated as responsible for Simulator Evaluation and Qualification. This "team," however, remained organized the same way: each of the 9 regions designated one inspector, domiciled in that region, to be that region's representative, and that structure was maintained for roughly the first 20 years of its operation.

However, as anyone who deals with remotely sited operations would probably recognize, the level of standardization and the understanding of directives, orders, agreements, etc., can easily become focal points for misinterpretation and non-standard actions. Part of the justification for the regional arrangement was that there were insufficient funds to move each NSET member to Atlanta, and leaving the inspectors where they were would allow the simulators in their areas to be evaluated regularly without incurring travel costs. This presented two other issues, though: first, only one NSET member would likely ever evaluate those specific simulators; and second, with devices as wholly dependent on technology as simulators are, having them critically examined by only a single individual could easily result in perceived or actual compromises that would be neither possible nor suspected if every simulator were at least periodically inspected by other members of the NSET. Additionally, the regional assignment required all inspectors to travel to Atlanta twice a year for a week-long standardization meeting, which meant that at least twice a year, for a 2-week period (allowing time for travel), no simulators anywhere were being evaluated, on top of the expense of holding such meetings.

Starting in the 1990s, the NSET changed its identity, preferring to be known as the NSP (changing the designation from an "evaluation team" to a more broadly functioning "program"). With that change came an examination of the level of standardization actually achieved by twice-yearly meetings that essentially suspended evaluation schedules for 2 weeks at a time, weighed against the alternative of moving all inspectors to Atlanta. There, formalized week-long standardization meetings could be replaced with conversations on a much more regular basis: morning coffee discussions, periodic water-cooler meetings throughout the day, and one-on-one exchanges with whoever was in the office when a question needed field experience to answer competently, all without interrupting evaluation travel requirements.

As I understand the NSP's functioning, this arrangement has benefitted a majority of the simulator sponsors and most of the NSP staff. However, change is always on the horizon, and the NSP is surely not insulated from it. I am also aware of several NSP inspectors who needed to be in locations other than Atlanta for personal (family or medical) reasons, and in some cases these few individuals have been authorized to move their homes permanently to other locations. I do not know the numbers, nor do I know the corporate attitude with respect to these few examples. Perhaps there will be additional moves in the future; perhaps not. But I am of the opinion that the central mission of this organization has been, and will likely continue to be, the competent structuring of simulation standards and the competent, effective evaluation of those devices, providing assurance that each device remains in a condition to perform the functions for which it was designed, qualified, and approved. One example of that is the NSP's dedication to the development, incorporation, and functioning of a reasonable and efficient Quality Management System (QMS) at each simulator sponsor's location. I have been told the reason for this is that the number of US-qualified and regularly evaluated simulation devices continues to grow, while the US government is reluctant to increase the size of the NSP staff. Given these diametrically opposed circumstances, it is easy to understand why the NSP is seeking to develop and incorporate a professionally constructed and dependable QMS.