When I was in high school, I encountered a standardized test question that went something like this:
A 100 ohm resistor is dissipating 10 watts. If it is replaced by a 50 ohm resistor,
and everything else is unchanged, how much power will be dissipated?
I pondered: you can hold the supply voltage constant, in which case the current (and therefore the power) doubles. Or you can hold the current constant, in which case the voltage (and the power) is cut in half. But by Ohm's law, if the resistance changes,
you cannot keep both the voltage and the current constant!
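The two readings can be checked with a quick calculation (a sketch, assuming the stated values of 10 W at 100 ohms; the derived voltage and current figures follow from P = V²/R and P = I²R):

```python
# Two readings of the question, assuming P = 10 W at R = 100 ohms.
R1, R2, P1 = 100.0, 50.0, 10.0

# Reading 1: the supply voltage is held constant.
V = (P1 * R1) ** 0.5            # V = sqrt(P*R), about 31.6 V
P_const_V = V ** 2 / R2         # P = V^2 / R: power doubles to ~20 W

# Reading 2: the current is held constant.
I = (P1 / R1) ** 0.5            # I = sqrt(P/R), about 0.316 A
P_const_I = I ** 2 * R2         # P = I^2 * R: power halves to ~5 W

print(round(P_const_V, 6), round(P_const_I, 6))  # 20.0 5.0
```

Same question, two defensible answers: 20 W or 5 W, depending on what the test-writer silently assumed was held fixed.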
I answered as if the voltage remained constant, because I was sure the test-writer had never considered any other interpretation.