Transformers and Ohm's law

Probably a simple question for electrical people, to which I never picked up the answer in my studies. I know I'm missing something simple, or just thinking about it the wrong way.

A transformer increases voltage and decreases current. And yet Ohm's law states that a higher voltage results in a higher current. For some reason this doesn't fit with what a transformer does. So, what gives?

I should know this because a transformer is analogous to a transmission in a mechanical system, and mechanically it somehow makes sense. But electrically it doesn't. I'm missing some key piece, and I need someone to walk me through this please...



Science Advisor
The power passing through a transformer to the secondary is slightly less than the power entering the transformer, because transformers are efficient but not perfect.

Assume for a moment, though, that we did have a perfect transformer.

If we put a load of 500 watts on a 250 volt secondary, this would be 250 volts at 2 amps (to get 500 watts) flowing from the transformer to the load.

Now if the transformer had a 125 volt primary and 125 volts being supplied to it, then the power entering the transformer would have to be 500 watts (to get 500 watts out).
So, how much current is flowing in the primary?
125 volts times 4 amps is 500 watts.

So, the current was reduced from 4 amps at the input to 2 amps at the load. The transformer didn't restrict the current, though; the higher voltage simply made less current necessary to deliver the same power.
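The arithmetic above can be sketched in a few lines of Python. This is a minimal illustration of an ideal (lossless) transformer, where power in equals power out; the function name and variable names are mine, not from the thread.

```python
def primary_current(v_primary, v_secondary, i_secondary):
    """Primary current of an ideal transformer, from power conservation."""
    power = v_secondary * i_secondary  # power delivered to the load (watts)
    return power / v_primary           # the same power must enter the primary

# Values from the worked example: 125 V primary, 250 V secondary, 2 A load.
i_p = primary_current(125, 250, 2)
print(i_p)  # 4.0 amps in the primary, as computed above
```

Note that Ohm's law still holds on each side individually: the load sees 250 V / 2 A = 125 ohms, while the source sees 125 V / 4 A = 31.25 ohms looking into the primary.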

