Transformers and Ohm's law

  • Thread starter Lsos
  • #1
Probably a simple question for electrical people, one I never picked up the answer to in my studies. I know I'm missing something simple, or just thinking about it the wrong way.

A transformer increases voltage and decreases current. And yet... Ohm's law states that higher voltage results in higher current. For some reason this doesn't fit with what a transformer does. So, what gives?

I should know this because a transformer is analogous to a transmission in a mechanical system, and mechanically it somehow makes sense. But electrically it doesn't. I'm missing some key piece, and I need someone to walk me through this, please...

[Attachment: transformers_optimus.jpg]
 

Answers and Replies

  • #2
vk6kro
Science Advisor
The power delivered to the secondary is slightly less than the power entering the primary, because transformers are efficient, but not perfect.

Assume for a moment, though, that we did have a perfect transformer.

If we put a 500 watt load on a 250 volt secondary, then 2 amps would flow from the transformer to the load (250 volts × 2 amps = 500 watts).

Now if the transformer had a 125 volt primary with 125 volts supplied to it, the power entering the transformer would also have to be 500 watts (to get 500 watts out).
So, how much current is flowing in the primary?
125 volts times 4 amps is 500 watts.

So, the current dropped from 4 amps at the input to 2 amps at the load. The transformer didn't restrict the current, though. The higher voltage simply made less current necessary for the same power.
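The arithmetic above can be written out as a quick sketch (assuming an ideal, lossless transformer, so power in equals power out; the voltages and load are the values from the example):

```python
# Numbers from the ideal-transformer example (assumes no losses,
# so power into the primary equals power delivered by the secondary).
P_load = 500.0                      # watts drawn by the load
V_secondary = 250.0                 # volts on the secondary side
V_primary = 125.0                   # volts supplied to the primary

I_secondary = P_load / V_secondary  # 500 W / 250 V = 2 A to the load
I_primary = P_load / V_primary      # 500 W / 125 V = 4 A from the supply

# Ohm's law still holds on each side -- just against different resistances:
R_load = V_secondary / I_secondary        # the load itself: 125 ohms
R_seen_by_supply = V_primary / I_primary  # what the supply "sees": 31.25 ohms
```

Note that the two resistances differ by a factor of 4, which is the square of the 2:1 turns (voltage) ratio; this is the usual impedance-reflection relation for an ideal transformer.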
 
