I remember that one confusing me too!
If I remember correctly, the reason is as follows:
While ISA assumes a 2 degree C drop in temperature per 1000 ft, that's an assumed average.
According to Thom (if I remember correctly), dry air actually drops by 3 degrees per 1000 ft, and saturated (wet) air by 1.5 degrees. The air below the cloud base is assumed to be dry, so you take your temperature drop as 3 degrees per 1000 ft.
However, the dew point also falls, by 0.5 degrees C per 1000 ft. Hence your target temperature is not fixed at 16 degrees... it's falling by half a degree as you go up.
Therefore for every 1000 ft you go up, you get 2.5 degrees C closer to the dew point, i.e. your temperature drops by 3 degrees, but your dew point moves away by 0.5 degrees.
So in your question... you have a difference of 4.5 degrees. Divide this by 2.5 and you get 1.8 (thousand) = 1,800 ft. Answer A is the closest.
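If it helps, the rule of thumb above can be sketched in a few lines of code. This is just my reading of the method (dry air cooling at 3 degrees per 1000 ft, dew point falling at 0.5, so the spread closes at 2.5 per 1000 ft); the surface temperature of 20.5 C is my assumption, working back from the 4.5-degree spread and a 16-degree dew point:

```python
def cloud_base_ft(surface_temp_c, dew_point_c):
    """Estimate cloud base (ft) from the surface temperature/dew point spread.

    Assumes dry air cools at 3 C per 1000 ft while the dew point falls
    at 0.5 C per 1000 ft, so the spread closes at 2.5 C per 1000 ft.
    """
    closing_rate = 3.0 - 0.5  # 2.5 C per 1000 ft
    spread = surface_temp_c - dew_point_c
    return spread / closing_rate * 1000

# The question's 4.5-degree spread (assumed 20.5 C over a 16 C dew point):
print(cloud_base_ft(20.5, 16.0))  # 1800.0
```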
By the way, I agree with IO540 that in practice this is virtually useless, but it's what you need for the exam!
Hope that helps.
dp