How valid are the JAA learning objectives?


SuperTed
14th Jan 2002, 16:56
How valid are the JAA learning objectives? Do the exam writers stick only to the topics mentioned in those objectives?

Thanks,
ST

Gazeem2
15th Jan 2002, 01:28
I often wonder what the JAA examiners are actually after.

The multi-guess format always makes exams easier and makes question-spotting possible; now, as professionals, we are obviously not going to stoop to question-spotting, BUT:

a proportion of JAA questions have no right or wrong answer, only a more-right or least-wrong answer, and in some cases it depends upon the inflection or interpretation of the question.

In these cases applied knowledge may not work and learning the correct response (answer) probably will.

If the CAA truly wanted to test an individual's knowledge, surely long-answer questions should be set?

SuperTed
15th Jan 2002, 14:32
What possible chance do we have if there is no right or wrong answer to these things?

How could you possibly know the best response to the questions when you haven't seen the question bank?

Keith.Williams.
16th Jan 2002, 01:32
SuperTed,

You appear to be asking three questions:

ARE THE LEARNING OBJECTIVES VALID?
The short answer is they probably are not.

The objective training process is based on the concept of identifying exactly what the student needs to be able to do when qualified, then training him/her to do it. The first part of the process is a training needs analysis, which identifies the main tasks that make up the job for which the training is intended. These tasks are then broken down into sub-tasks, which are again broken down into sub-sub-tasks. This process is repeated until the "any fool can do that" stage has been reached. Each of the tasks, sub-tasks and sub-sub-tasks is then analysed to identify the knowledge, skills and attitudes required to achieve it. A list of learning objectives is then constructed to provide the necessary knowledge, skills and attitudes. Exam questions are then constructed to test the achievement of these objectives.

By now, you have probably concluded that no such training needs analysis has ever been conducted for the JAR ATPL. In fact, the sum total of full training needs analyses carried out in the entire field of human endeavour can probably be counted on the fingers of one foot. Many organisations claim to do them, but the temptation to take shortcuts is irresistible.

So the JAR ATPL learning objectives are little more than an expansion of those used for the old national systems. However, the introduction of JAR has at least resulted in an improvement in the clarity of the objectives. For some subjects the old CAA lists, comprising less than half a page of A4, now cover ten or twenty pages. This doesn't necessarily mean that more material is covered, but the material is more clearly defined. As for the question of whether the material is relevant… well, if you ever find yourself at 40,000 ft needing to communicate with your aircraft systems in binary code, you would probably be wiser to spend your time cancelling future newspaper and milk deliveries!


ARE THE EXAMS BASED ON THE LEARNING OBJECTIVES?
Well, most of the examiners would (usually) say that they (generally) are. Many students and instructors would, however, disagree. It is certainly true that in the early days the FTOs interpreted the objectives much more narrowly than the examiners. This inevitably led to a good many surprises in the early exams. But the system has now been in place for a couple of years and the FTOs have lots of feedback questions and a pretty good idea of what is required. There will of course always be a few surprises as examiners push the envelope just that little bit further as they build up (pad out) the question bank.


HOW CAN YOU HOPE TO DEAL WITH QUESTIONS FOR WHICH THERE IS MORE THAN ONE CORRECT ANSWER?
There is nothing inherently wrong in giving a number of answers of varying degrees of accuracy. This is in fact a good way of ensuring that students really do know their subjects.

For example, if a question asks for the relationship between Vx and Vy, the options might include the following:

a. Vx is greater than Vy.
b. Vx is less than Vy.
c. Vx is equal to Vy.
d. Vx is less than or equal to Vy.

Option a is only true if you take Vx for a jet and Vy for a prop, so we can (probably) discount that one. Option b is true at all altitudes below the absolute ceiling and option c is true at the absolute ceiling. But option d is correct at all altitudes. This question tests not only knowledge of the basic relationship between Vx and Vy but also how they vary with altitude.

In reality, however, some of the questions have been far more dubious. One in the Performance papers about a year ago took the form of:

How do Vx and Vy vary with increasing altitude?

a. Both increase.
b. Both decrease.
c. Vx increases and Vy decreases.
d. Vx decreases and Vy increases.

In this case, none of the options are correct. Vx (in terms of CAS) remains constant, while Vy decreases until they are equal at the absolute ceiling.
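If you want to see where that behaviour comes from, here is a toy numerical sketch (entirely my own illustration, with invented aircraft numbers, nothing from the question bank): with a parabolic drag polar and thrust assumed independent of speed but lapsing with density, Vx sits at the minimum-drag speed and stays put in EAS (near enough CAS at these speeds), while Vy starts higher and slides down towards it as thrust lapses.

import math

W, S = 50000.0, 30.0      # weight (N) and wing area (m^2) - invented figures
CD0, K = 0.02, 0.045      # parabolic drag polar: D = q*S*CD0 + K*W**2/(q*S)
RHO0 = 1.225              # ISA sea-level density (kg/m^3)

def climb_speeds(thrust, sigma):
    """Brute-force Vx and Vy in EAS (m/s) at density ratio sigma,
    with thrust assumed independent of speed at a given altitude."""
    best_t, best_p, vx, vy = -1e9, -1e9, 0.0, 0.0
    for i in range(60, 400):
        v_eas = 0.5 * i                           # 30 to 200 m/s EAS
        q = 0.5 * RHO0 * v_eas ** 2               # dynamic pressure from EAS
        drag = q * S * CD0 + K * W ** 2 / (q * S)
        excess_thrust = thrust - drag             # sets climb GRADIENT -> Vx
        excess_power = excess_thrust * v_eas / math.sqrt(sigma)  # RATE -> Vy
        if excess_thrust > best_t:
            best_t, vx = excess_thrust, v_eas
        if excess_power > best_p:
            best_p, vy = excess_power, v_eas
    return vx, vy

T0 = 15000.0                                  # sea-level thrust (N) - invented
for sigma in (1.00, 0.74, 0.53, 0.38, 0.25):  # roughly msl to 40,000 ft
    vx, vy = climb_speeds(T0 * sigma, sigma)  # crude thrust lapse: T ~ sigma
    print(f"sigma {sigma:.2f}:  Vx {vx:5.1f}  Vy {vy:5.1f}  (m/s EAS)")

Run it and Vx stays at the minimum-drag speed throughout, while Vy converges on it from above as the excess power shrinks towards zero at the absolute ceiling - which is exactly the POF answer that was being disputed.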

This question (or one very much like it) appeared in a number of exams and led to a great deal of discussion (heated argument) between POF and Performance instructors and (even more heated argument) between FTOs and the CAA. The matter was discussed at one of the regular meetings between examiners and FTOs, and it soon became apparent that none of those present knew the true answer. So the CAA announced that such questions would be withdrawn until the matter was resolved. The FTOs advised students accordingly.

In the next set of examinations the Performance paper included 4 questions along the lines of:

How do Vx and Vy vary with changes in mass?

How do Vx and Vy vary with changes in ambient temperature?

The students of course objected, but the examiners pointed out (quite correctly) that these questions were valid. This of course did little to placate the students.

The examiners eventually concluded that because the rate of decrease in Vy is very small (only about 1 kt per 4,000 ft) it is okay to say that it does not vary with altitude. This led to the quite remarkable situation that in Performance, Vx and Vy both remain constant with changing altitude, while in POF, Vx is less than Vy at low level but equal to Vy at the absolute ceiling.

WATCH THIS SPACE IN FUTURE EXAMS!!!!!

[ 16 January 2002: Message edited by: Keith Williams. ]

Tinstaafl
16th Jan 2002, 17:07
It also doesn't look like the JAR questions are properly validated prior to use.

There are criteria that must be met before a question can be considered acceptable. These include things such as:

1. Does it test the intended syllabus topic?

2. Level of difficulty. There are a number of standards, e.g. a 70% pass rate from the intended candidate population. Does the question achieve the required rate? It could be too difficult or too easy. An example would be an ATPL-level question being used for a PPL exam, or a PPL-level question used for an ATPL exam.

3. Correct grammar and spelling, and the question is understandable.

4. Only one correct response that meets all conditions set in the question stem.

etc etc (can't remember the rest. I last studied it 15 years ago)

If the difficulty level hasn't been tested on a statistically valid sample of the candidate population, then the exam must be graded on a curve, i.e. bottom 20% fail or whatever, until enough data has been collected to allow an equivalent cut-off percentage to be used, e.g. 75% minimum to pass the exam.
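To make that concrete, here is a rough sketch of both checks (my own illustration, with invented trial data and thresholds, not anyone's actual procedure): item difficulty measured against a trial population, and a curve-based cut-off used until enough data exists to justify a fixed percentage.

import statistics

# responses[c][q] = 1 if candidate c got question q right (invented trial data)
responses = [
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [1, 1, 0, 0],
    [0, 1, 0, 1],
    [1, 1, 0, 1],
    [1, 0, 0, 1],
]
n_cand, n_q = len(responses), len(responses[0])

# 1. Item difficulty: proportion of the trial population answering correctly.
#    Anything far outside the target band gets flagged for review.
for q in range(n_q):
    p = sum(row[q] for row in responses) / n_cand
    flag = "" if 0.3 <= p <= 0.9 else "  <-- review: too hard or too easy?"
    print(f"Q{q + 1}: difficulty {p:.2f}{flag}")

# 2. Curve grading: with no normalised question bank, set the cut-off from
#    the distribution of this sitting (mean minus k standard deviations).
scores = [100.0 * sum(row) / n_q for row in responses]
mean, sd = statistics.mean(scores), statistics.stdev(scores)
cutoff = mean - 0.5 * sd      # k = 0.5 SD below the mean, purely illustrative
print(f"mean {mean:.1f}%  SD {sd:.1f}%  curve cut-off {cutoff:.1f}%")
print("passes:", sum(s >= cutoff for s in scores), "of", n_cand)

Only once enough sittings have been accumulated can a curve cut-off like this be translated into an equivalent fixed percentage.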

Question design and validation is just about a field in itself. Would anyone like to take bets on the validation process that these exams have undergone?

Dick Whittingham
18th Jan 2002, 00:40
First, in the fields of writing objectives, validating the question bank and weeding out ambiguities, I was there, for Met, and I think that a generally sound job was done. As we go through the exams we, the FTOs, continually ask for revision of doubtful questions. Again, the CAA is generally helpful and flexible.

Second, in dealing with specific questions, IMHO, you have to be very careful. The matter of Vx and Vy is complicated. For a start, you can’t just use “Vy” without defining what Vy is. I suppose that a reasonably tight definition might be: “for a given aircraft and power plant installation, the EAS at which the power available exceeds the power required for level flight by the greatest margin at ISA msl conditions”.

Narrowing it down further, to jet powered aircraft, we find that Vy will be a figure determined by the shape of the power required and power available curves that are specific to one aircraft. Thrust, for example, usually varies with EAS, and the variation depends on the engine installation. What happens at altitudes above msl, and in conditions that are not ISA will depend on how the power available and required curves alter. This again will be specific to the aircraft.

Nevertheless, it is generally true for typical jet aircraft that Vy as defined reduces with increasing height. Note that Vy in CAS/RAS/IAS will rise in relation to Vy in EAS as TAS increases. The decrease in best-rate-of-climb speed can be marked. Figures in IAS for the atypical Hunter Mk 1 were:

msl 430 kt
10,000 ft 400 kt
20,000 ft 365 kt
30,000 ft 329 kt
40,000 ft 260 kt

The IAS given at the higher altitudes is defined by Mcdr.
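The speed-scale point is worth a quick numerical illustration (my own sketch, with rounded ISA values and an invented Mach limit, not any real aircraft's figures): hold EAS constant in the climb and Mach rises with altitude; conversely, a climb held at a constant Mcdr-type limit sees its EAS fall away, which is why a Vy quoted in IAS/CAS behaves differently from Vy in EAS.

import math

# Rounded ISA density ratio (sigma) and speed of sound a (m/s) vs altitude (ft)
isa = {0: (1.000, 340.3), 10000: (0.738, 328.4), 20000: (0.533, 316.0),
       30000: (0.374, 303.2), 40000: (0.246, 295.1)}

V_EAS = 120.0       # a constant-EAS climb speed in m/s (illustrative)
M_LIMIT = 0.80      # an illustrative Mcdr-type limit - invented figure
for alt, (sigma, a) in isa.items():
    v_tas = V_EAS / math.sqrt(sigma)                # TAS = EAS / sqrt(sigma)
    eas_at_limit = M_LIMIT * a * math.sqrt(sigma)   # EAS for a constant Mach
    print(f"{alt:>6} ft: const EAS -> M {v_tas / a:.2f};"
          f"  const M {M_LIMIT} -> EAS {eas_at_limit:5.1f} m/s")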

The figures given in aircraft operating manuals, like those above, are not necessarily Vy as defined above. If wide variations of speed produce only small variations in time to height, it might be easier and simpler to use a single CAS in the climb. This, of course, is not the same thing as saying that Vy as defined is constant with increasing height. Air traffic control might also require a fixed CAS in the climb for traffic separation.

If you are short on power, you will soon arrive at a point where Vy has reduced to approximately Vimd and where you have no excess power available. Roughly then, Vy has reduced to Vx and you are at your absolute ceiling. This does not happen if you have unlimited power available. In those conditions, Vy will be constrained by Mcdr at first, with EAS reducing, and when the EAS for Mcdr reaches Vimp, again Vy and Vx coincide. Not that this has much practical significance, for terrain-avoidance climbs are rare at FL400. Vy/Vx then remains at approximately Mcdr as you climb, with EAS reducing further, until EAS reduces to your minimum control speed (normally taken as 110% Vs1g). Vy/Vx remains at this EAS until Mach number rises to Mmo. Now you really have reached your absolute ceiling, coffin corner.

I hope I do not fall into the same category as another instructor who said he did not make posts to show off his own skill and knowledge - and then demonstrated at least partial achievement of his aim! If you want the full picture, I recommend a book, “Aircraft Performance Measurement” by Dr Eschelby at Cranfield.

Dick W

Tinstaafl
18th Jan 2002, 21:49
Hi Dick,

The point I make about validation of exams is that the exam must be tested on a statistically valid & representative population prior to release if a percentage cut-off is to be used for pass/fail assessment.

If that hasn't been done, then the exam must be graded on a curve, i.e. using x standard deviations for the candidates sitting that particular exam/question combination.

Only after adequate numbers have sat the exam can the percentage cut-off be correlated with the standard deviation.

AFAIK the exams were introduced on Day 1 with an already set percentage pass mark. When were they tested against the candidate population to establish the percentage?

Dick Whittingham
18th Jan 2002, 23:33
Tinstaafl,

You've got me there. I don't know.

I think (PROB30 in TAF terms) that all that has happened is that allowances have been made when nearly everybody fails one or more specific questions.

There can be such things as absolute standards that you either pass or fail, regardless of the distribution of ability in the general population.

What should have happened in the JAA, and should have happened long ago in the CAA, is that users should have defined the skill and knowledge levels required for, say, an FO on his first pax sortie. This would have defined a syllabus of training and, in turn, a series of training objectives. All objectives have the unwritten introduction "At the end of this course the student should be able to…"

Schools would then take the objectives and set up training programs to achieve the aim. This would be tested internally by school tests and externally by the JAA exams.

The system would then hand over to the users new FOs with the stated qualifications.

Of course, this did not happen, and the rest is history. Still, with a lot of effort and care we have a reasonably good system, not perfect, but not complete cr*p.

When the Portuguese rep first saw the proposed Air Law objectives he said "this means no Portuguese will ever fly again". It isn't all that bad.

Dick W

Keith.Williams.
19th Jan 2002, 01:48
Dick,

If I may take a few of your comments in turn:

Quote: "First, in the fields of writing objectives, validating the question bank and weeding out ambiguities, I was there, for MET, and I think that we did a generally sound job"

This illustrates my point regarding the lack of a proper training needs analysis. When setting about creating a set of objectives, it is tempting to get together a group of interested parties and have a good old brainstorming session. This might well generate a set of objectives with which most of the participants agree, but it is not really a very scientific approach. Nor is it a proper training needs analysis. Using this method invariably results in some irrelevant material being included and/or some essential material being omitted. Teaching too much is simply wasteful, but missing out essential material can be positively dangerous. A good many people have managed to kill themselves (and others in their vicinity) through not having the correct skills, knowledge and attitudes. My point regarding the validity of the objectives is simply that without a proper analysis we are unlikely to identify truly valid learning objectives.


Quote: "As we go through the exams we, the FTOs, continually ask for revision of doubtful questions. Again, the CAA is generally helpful and flexible."

I agree. Nothing in any of my posts has indicated that the CAA is anything other than (generally) helpful and flexible. But the need to continuously go back to them to query problem questions is a direct result of the failures of the initial analysis and question-writing processes. If the system is so good, why did so many students get caught out last year by the sudden introduction of lots of probing questions on tacho systems? Did Bristol's notes already cover the subject to the required extent, or did you (like all the other FTOs) quickly revise them? Your notes have a reputation for being excellent, so if yours had to be improved, what does that mean for students in less excellent schools?


Quote: "Second, in dealing with specific questions, IMHO, you have to be very careful".

I agree, but SuperTed asked about the validity of the exam questions. If we do not deal with specific examples, we will get nowhere. Students complain that questions are unfair… the CAA says they are not… end of discussion.


Your discussion of Vx and Vy is excellent, but I feel we need to be careful about definitions. You are correct in saying that if you have infinite power it is aerodynamics that will eventually limit your altitude. But absolute ceiling, aerodynamic ceiling and maximum operating altitude are all different things. The syllabus requires students to have a clear understanding of each, and to be able to distinguish between the three. Students reading a couple of your comments might be confused.

I am sorry if you feel that my last post was simply intended to demonstrate my knowledge of the subject. This was not my intention. A question was asked and I responded to it. Nothing in my comments about Vx and Vy was beyond the level required by the students. My description of the training needs analysis process was intended only to provide a context for my assessment that the learning objectives probably aren't valid.

What I actually said in the much earlier post to which you referred was "we must ensure our answers are pitched at the correct level to ensure they are of use to the recipients". That isn't quite the same as saying "don't demonstrate your knowledge".


TINSTAAFL,
I agree with everything you have said about how the process of question/exam validation SHOULD work. But unless the original learning objectives are based on a proper training needs analysis, any validation process can do no more than confirm that the examiners and the FTOs are employing the same interpretation of the objectives. To take an extreme example, we might teach the students lots of interesting things about all the valves and switches in their hydraulic systems. The teaching might be excellently designed and executed and the questions perfectly constructed. But unless we tell them to put the gear down before landing……

One of the ways in which the validation process can go wrong is illustrated in a current string in the tech log. A member asked for advice concerning the number of gimbals, axes of motion and freedoms of motion of various gyroscopes. After a couple of days I noticed that no responses had been made, so I consulted a reference (The RAF's AP 3456) and posted a response, together with the health warning that I am not an expert on gyroscopes. My reference indicated that the degrees of freedom of a gyroscope include spin. This means that the number of degrees of freedom is one more than the number of gimbals. A subsequent response revealed that although the commonly accepted convention is that degrees of freedom do include spin, the French JAR examiner responsible for this subject had decided that this was incorrect. So in order to pass the examination, students must know not what is generally accepted to be true, but what an individual JAR examiner believes to be true! HUMMMMMMMMMMM???


FINALLY
Because of the way this string has developed, readers might conclude that I have a very low regard for the JAR exams. This is not entirely true. Because a great deal of feedback has been gathered over the past couple of years, the FTOs have a pretty good idea of what is required. Students can therefore expect to be properly prepared for their exams. If some of the figures recently quoted in this forum (91% passes at first attempt) are accurate, students have little to fear.

Most systems, however badly designed, eventually evolve to a reasonably effective state. But the fact that the objectives have no real basis makes them neither better nor worse than the old national systems. Improving pass rates do not mean that JAR ATPL holders will have been taught all those important little things necessary to keep themselves and their aircraft in one piece.

[ 20 January 2002: Message edited by: Keith Williams. ]

Tinstaafl
19th Jan 2002, 03:33
Definitely agree, Keith. Appropriate objectives are very necessary. Well-constructed objectives in turn lead to appropriate questions. Mind you, that doesn't necessarily imply 'good' questions.

Doesn't that open a new can of worms! What objective safety analysis was used to derive data to determine whether or not a particular knowledge area is necessary? None, I should think.

I suspect it was a combination of each country being unwilling to give up some of its most cherished limitations, plus some grouping of various industry types giving their opinions on what they each believed should be included.

Dick, a slightly absurd example that illustrates what I've been saying would be if those of us who teach aviation theory (i.e. presumably experts in the field) sat a new PPL exam and our scores were used to decide whether or not the exam was too 'easy'.

That's where testing against the target population is required. This testing normalises the questions/exams to a common reference, e.g. difficulty level, understandability, etc., using standard deviations.

Having to go back to the CAA after the fact to ask that questions be altered indicates an abysmally poor validation process.

Effectively, the first students to sit the exam(s) were the test subjects in the 'validation' process. They should have had their exam scores determined using standard-deviation marking, not the fixed-percentage method. Only after the data had been accumulated from a valid candidate sample could the marking legitimately be changed to a fixed score.

Keith.Williams.
19th Jan 2002, 17:19
Dick,

Further to my last post (yes, I know it was long enough already): your figures for the Hunter show that the quoted Vy lapse rate of 1 kt per 4,000 ft is incorrect. It should be noted that this figure is not my invention, but was produced by (would you believe it) the CAA examiners.
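The arithmetic is easy to check against your table (the Hunter figures are yours; the sums are mine):

hunter = {0: 430, 10000: 400, 20000: 365, 30000: 329, 40000: 260}  # ft: kt IAS

alts = sorted(hunter)
for lo, hi in zip(alts, alts[1:]):
    drop = hunter[lo] - hunter[hi]          # kt lost over this 10,000 ft band
    print(f"{lo:>6}-{hi} ft: {drop:>2} kt, i.e. 1 kt per {(hi - lo) / drop:.0f} ft")

total = hunter[0] - hunter[40000]
print(f"overall: {total} kt over 40,000 ft = 1 kt per {40000 / total:.0f} ft")
# The claimed 1 kt per 4,000 ft would amount to only 10 kt in the whole climb.

Even allowing that the Hunter is atypical and that these are IAS rather than EAS figures, on these numbers the quoted lapse rate is out by more than an order of magnitude.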

At the CAA/FTO meeting to which I referred earlier, the PPSC representative was asked to come up with a definitive statement of how Vx and Vy behave as altitude increases. He invited me to contribute, and after a day or so we came to the conclusion that Vx remains constant while Vy decreases to converge on Vx at the absolute ceiling. He then passed this on to the CAA.

A couple of days later I asked him if there had been any further developments, and he stated: "Apparently there is an aerodynamics expert upstairs in the CAA building. He has stated that Vy only decreases by 1 kt per 4,000 ft altitude increase, so it is okay to say that it remains constant." At this point I realised that there was no hope of resolving the matter, so I threw my hand in with this particular discussion.

[ 20 January 2002: Message edited by: Keith Williams. ]

Alex Whittingham
21st Jan 2002, 15:02
Keith, et al.

I have had to use Alex's name to log on, but it is Dick writing.

No, I wasn't poking fun at you. The phrase was used by another poster, and I just couldn't resist it!

The rest of my posts are simply a plea for clarity and accuracy in the JAA training and exams, a sentiment which, I am sure, you all share.

On the specific point of Vy, after the meeting that you quote, Alex took some actual data and Martin Eshelby's book, Aircraft Performance, Theory and Practice (correct title and spelling), to the CAA, who retired to think about it. We have heard no more, but I still teach Vy reducing, with a verbal warning about the possible conflict.

As to the writing of the LOs, we all agree it was done in the wrong sequence, as I replied to Tinstaafl. Perhaps we will eventually have a systematic review, but don't hold your breath.

Eshelby's book is ISBN 0 340 75897 X, recommended for instructors, but not for students.

JAR25 gives gusts in EAS.

hasta pronto,

Dick