In the case of a transformer, why does increasing the voltage decrease the current
If an ideal step-up transformer increases the voltage from, say, 10 V to 20 V, why does the current halve?
I know it happens, and I know the equations for it; I also know that saying otherwise would contradict conservation of energy.
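For reference, the equations I mean are the ideal-transformer relations (assuming no losses, so input power equals output power):

$$\frac{V_s}{V_p} = \frac{N_s}{N_p}, \qquad P_{in} = P_{out} \;\Rightarrow\; V_p I_p = V_s I_s$$

So if the voltage doubles ($V_s = 2V_p$), the current must halve ($I_s = I_p/2$) for the power on both sides to match.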
But putting the equations aside, why exactly does it happen?
EDIT: The part I don't understand is this: if the potential difference increases and the resistance in the secondary remains constant (I think?), why does the current decrease?