dolle39
Hi,
I have a dilemma about transformers that I want to discuss. Say I have connected an AC source to a step-up transformer with Np turns in the primary winding and Ns turns in the secondary winding, and that the secondary winding is connected to a resistance R.
We know that the voltage across the secondary winding will be Vs = Vp * Ns/Np, so if Ns > Np the voltage increases. But as I understand it, this voltage increase must come at the cost of a decrease in current in the secondary winding, in order for P = I*V to be the same on both sides of the transformer.
But we also know that I = V/R, so if the voltage in the secondary winding increased and the current decreased, the resistance in the secondary circuit must have increased. But where does this resistance come from? Is it some sort of imaginary resistance?
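To keep track of the numbers I am juggling, here is a minimal sketch (the values Vp = 10 V, Np = 100, Ns = 200, R = 50 ohm are just assumptions I made up for illustration) that applies the three relations above: the turns-ratio formula, I = V/R on the secondary, and equal power on both sides of an ideal transformer.

```python
# Minimal numeric sketch of the quantities in my question.
# The example values below are assumptions, just for illustration.
Vp = 10.0    # primary voltage [V]
Np = 100     # primary turns
Ns = 200     # secondary turns (Ns > Np, so step-up)
R  = 50.0    # load resistance on the secondary [ohm]

Vs = Vp * Ns / Np   # turns-ratio formula        -> 20 V
Is = Vs / R         # I = V/R on the secondary   -> 0.4 A
Ps = Vs * Is        # power delivered to the load -> 8 W
Ip = Ps / Vp        # equal power on both sides (ideal transformer) -> 0.8 A

print(f"Vs = {Vs} V, Is = {Is} A, Ps = {Ps} W, Ip = {Ip} A")
```

With these example numbers the secondary current (0.4 A) does come out lower than the primary current (0.8 A), which is the trade-off I described, yet it is still fixed by R through I = V/R, and that is exactly where I get confused.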
I mean, in my world an increase in voltage should mean an increase in current.
I think I am making some fundamental mistake, so please help me sort it out.