Hi, I have a dilemma about transformers that I want to discuss. Say I have connected an AC source to a step-up transformer with Np turns in the primary winding and Ns turns in the secondary winding, and the secondary winding is connected to a resistance R. We know that the voltage across the secondary winding will be Vp * Ns/Np, so if Ns > Np the voltage increases. But as I understand it, this voltage increase must come at the cost of a decrease in current in the secondary winding, so that P = I*V is the same on both sides of the transformer. But we also know that I = V/R, so if the voltage increased in the secondary winding and the current decreased, the resistance in the secondary circuit must have increased. But where does this resistance come from? Is it some sort of imaginary resistance? In my world an increase in voltage should mean an increase in current. I think I am making some fundamental mistake, so please help me sort it out.
I believe the impedance of the secondary inductor is where the resistance comes from. More turns = more inductance = more impedance, since you have more self-induction generating more counter-EMF in your secondary.
These are ratios (secondary to primary). Voltage on the secondary increases relative to the primary. Current on the secondary decreases relative to the primary. Resistance on the secondary increases (as you said) relative to the primary.

Example: 2:1 step-up ratio, 1 Vac input, 1 ohm secondary load.
Voltage at secondary = 2 Vac (per turns ratio)
Current at secondary = 2 A (Ohm's law)
Load seen by source = 0.25 ohms (turns ratio squared)
Current at source = 4 A (Ohm's law)
Power at source = 4 W
Power at load = 4 W

Note: voltage has gone up from 1 V to 2 V, current has gone down from 4 A to 2 A, and resistance has gone up from 0.25 ohms to 1 ohm.
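The numbers in the example above can be checked with a short Python sketch, assuming an ideal (lossless) transformer; the function name is just for illustration:

```python
# Sketch of the worked example, assuming an ideal transformer.
# The load resistance R on the secondary appears to the source as the
# "reflected" resistance R * (Np/Ns)^2 -- that reflection is where the
# changed resistance "comes from" as seen from the primary side.

def ideal_transformer(vp, np_turns, ns_turns, r_load):
    """Return secondary voltage and current, the resistance seen by the
    source, primary current, and power on both sides."""
    ratio = ns_turns / np_turns      # turns ratio Ns/Np
    vs = vp * ratio                  # secondary voltage steps up by the ratio
    i_s = vs / r_load                # Ohm's law on the secondary
    r_reflected = r_load / ratio**2  # load as seen by the source
    i_p = vp / r_reflected           # primary current, Ohm's law
    return vs, i_s, r_reflected, i_p, vp * i_p, vs * i_s

# 2:1 step-up, 1 Vac input, 1 ohm load (the example above)
vs, i_s, r_ref, i_p, p_in, p_out = ideal_transformer(1.0, 1, 2, 1.0)
print(vs, i_s, r_ref, i_p, p_in, p_out)  # 2.0 2.0 0.25 4.0 4.0 4.0
```

Power in equals power out, as it must for an ideal transformer, and the resistance the source sees (0.25 ohms) differs from the 1 ohm load by the square of the turns ratio.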