PPRuNe Forums

Performance ATPL (https://www.pprune.org/professional-pilot-training-includes-ground-studies/582719-performance-atpl.html)

Dream2Jet 8th Aug 2016 17:16

Performance ATPL
 
Anyone know what to do for this exam? It's just a nightmare, I hear: totally rejigged, and even the schools don't know where to study from. Any suggestions?

RedBullGaveMeWings 9th Aug 2016 07:09

Ehm, care to explain? What do you mean by "rejigged"?

paco 9th Aug 2016 15:58

Probably lots of new questions that no-one has seen before.

That's what is supposed to happen. :)

Phil

Odai 9th Aug 2016 21:51

When I did my ATPLs (finished summer 2015), it was a real pain to work so hard to properly understand all the material (even as a Physics undergrad) and yet get significantly lower averages across most exams than those candidates who simply "bashed the question bank". There were only one or two exceptions (GNav especially).

Doesn't exactly fill me with confidence in the ability of scores of candidates that will go on to sit in the RHS of airliners. Glad to see this is all changing.

paco 10th Aug 2016 05:40

EASA are making a determined effort to stop the database bashing. In my opinion, a lot of recent accidents have been down to that (i.e. lack of deep knowledge), and it's now a matter of public safety.

Performance is linked very much with POF though, so that adds an extra layer.

Phil

Martin_123 10th Aug 2016 10:48

I'm all in support of changing the QBs to prevent database bashing; however, my experience with HPL and MET is that the new questions coming along haven't really gone through any quality control at all! I had a number of questions where none of the answers was correct, and a number where two answers were correct. I feel like a guinea pig, leaving feedback on nearly every second question, pointing out the terrible wording and mistakes. I'm sure they will catch on eventually and improve the new questions over time, but guess what? That doesn't improve my scores!

OK, so far I have nothing to complain about personally: I have four exams done with good scores, all first-time passes. But I know a number of people who really studied, who demonstrated their knowledge in revision courses, yet ended up with a big fist in their faces simply because they got questions that not only make no sense from a syllabus point of view but are a grammatical nightmare. Their career prospects are seriously damaged simply because some clerks in authority didn't do their job properly, and the lack of accountability means the ATPL study process will continue to be unfair to people.

If we want proper change for the better, ATPL theory needs to be taught and examined by accredited universities, and exams need to meet a certain academic standard before they are attempted. The situation we had in February where, reportedly, only one student passed the OPS exam in the entire UK would raise a lot of red flags in an academic environment.

paco 10th Aug 2016 12:34

My experience with computing tells me that we should leave the universities out of it, although London Met had a good rep when they started.

What we need is qualified people overseeing the process, not bureaucrats. By all means subject it to quality control, but the schools certainly need references from which to teach, and to what depth. In theory, if the questions matched the syllabuses they could introduce new questions every day and nobody would notice.

Right now, that can't happen, because no school I know has any faith in the database, either because the questions do not match the LOs or because around 20% of the questions are actually wrong (having said that, I know that recently added questions do go through stringent quality control, so perhaps there are some old stragglers around).

So you can't blame people for bashing the database, because otherwise they wouldn't pass at all, but it is not satisfactory by a long way.

Phil

Martin_123 10th Aug 2016 13:29

From what I gather talking to instructors, it's the new questions that lack quality control. What's strange is that it doesn't appear to be equally dramatic across subjects; some are more affected by bad questions than others. For example, I haven't heard a single peep about Gen Nav, despite it having had some new questions added recently.

Instruments seems fine as well, except I heard that the helicopter exam had a question about synchroscopes and N1, which I bet can be very frustrating.

paco 10th Aug 2016 13:36

Not seen synchroscopes on helicopters yet! :)

Part of the problem is that some of the technical reviewers are not up to much, although the UK one for Ops is ex-London Met and knows his stuff. That could explain why there is such a difference between subjects. I will mention it in suitable places...

Phil

Martin_123 10th Aug 2016 14:09

Just because someone knows his stuff doesn't automatically mean he can formulate a question well and provide decent answers. I don't mean to offend anyone; I really don't know much about how these questions are prepared or how they are tested. But perhaps some sanity testing is in order: form a workgroup of competent people and have them attempt the questions first, to see if they can answer them, before releasing them into CAA circulation.

paco 10th Aug 2016 14:37

The tech reviewers don't write the questions, people like me do :)

Their job is to see if they make sense, that the references given are correct and make sure that they follow the LOs - as mentioned, he is pretty good at that. Also, the girls in EASA are sharp cookies too.

So what's the answer? Don't know really. I only know that if I was wanting to get to here, I wouldn't have started from there, as the Irish say.

And it's not any CAA that's the problem - it's EASA.

Phil

Martin_123 10th Aug 2016 17:02

How often would you get your questions rejected, or be asked for re-wording or improvements?

I'm just wondering if the system fails by trusting and respecting authors too much. For example, does the reviewer look at the question only from a hygiene point of view, to make sure it represents the LO, has all the right attachments, and has no spelling errors? Or would they also complain if the question is badly worded or there is more than one correct answer? In other words, are tech reviewers competent in the subject they are reviewing?

I appreciate your positive opinion of the people involved, but somehow these rubbish questions still get through, and lately, reportedly, in huge numbers. I simply don't believe enough is being done to produce good-quality exams. More importantly, we know that there have been lots of complaints and comments logged with the CAA, and their lack of action is raising concerns.

I don't buy the EASA argument. Somehow the guys here doing their ATPLs with the IAA don't see any issues, one or two odd questions aside, but nothing like what I saw in the UK. No complaints from my Dutch or Latvian friends doing exams with their respective authorities either. If it's supposed to be the same QB, how come the feedback is so different? If this keeps up, I don't see how UK training providers will be able to attract further business from the EU. I think it's in your and Alex's best interest to put pressure on the authorities to meet the quality standards this industry demands.

paco 10th Aug 2016 17:46

Quite frequently; they even get a language review, and they must be in a specified format in terms of punctuation, etc. I can tell you that the UK person was very proactive and quite correct in everything he said. His checking was certainly detailed, but others are definitely not so good. As with TREs, there is no consistency, which is one thing you can admire Transport Canada for.

It's a good point you make about authorities, and I have no answer to that one. I've heard nothing from our students with the IAA either, or from our Dutch guys. As the saying goes, I couldn't possibly comment.

Phil

Alex Whittingham 11th Aug 2016 12:37

I'll echo Martin_123's observations. Putting more questions in the CQB would have been a great idea if (a) the syllabus was properly written, (b) the questions followed the syllabus, (c) they were produced to a high standard, and (d) they were properly validated. Sometimes a good question writer can compensate for a poor syllabus and unreliable validation; unfortunately, if the person writing the question is not as good as he or she might be, then we are leaning on an already rickety quality control process, and failures slip through. English-language validation in some subjects is definitely a problem.

In many cases the new questions are fine. In some subjects (particularly the helicopter subjects, I am told) my instructors are actually decidedly impressed by the new output. However, in many other subjects some of the questions, not all, are a long way off 'fine'. The paradox is that unsatisfactory new questions drive candidates to use question banks even more, as they struggle to find answers to questions they find otherwise unanswerable. That's an own goal, EASA. The NPA 29 syllabus they are now working to was once part of a comment-and-response process. The UK CAA told us, 'You've been complaining about the syllabus for years; now is the chance for you to comment.' EASA decided to introduce it without taking account of any of the many comments; they have just passed them on to the next working group.

It may be that new questions have yet to filter into exams in other EASA states. The UK CAA tend to be the most proactive national authority, and I have to say generally well intentioned, but that is little comfort to the affected candidates.



