How Do Transformers Work with Ohm's Law?

SUMMARY

This discussion clarifies the relationship between transformers and Ohm's Law, specifically addressing the misconception that a transformer's higher output voltage must drive a higher output current. A transformer increases voltage while decreasing current, keeping input and output power balanced, as illustrated by a perfect-transformer example with a 500-watt load. While the voltage doubles, the current halves proportionally, so the power delivered stays constant.

PREREQUISITES
  • Understanding of Ohm's Law
  • Basic knowledge of electrical power calculations
  • Familiarity with transformer operation principles
  • Concept of power conservation in electrical systems
NEXT STEPS
  • Study transformer efficiency and losses in real-world applications
  • Learn about different types of transformers and their uses
  • Explore advanced electrical power calculations involving AC circuits
  • Investigate the relationship between voltage, current, and resistance in various electrical components
USEFUL FOR

Electrical engineering students, electricians, and professionals involved in power distribution and transformer design will benefit from this discussion.

Lsos
Probably a simple question for electrical people, one I never picked up the answer to in my studies. I know I'm missing something simple, or just thinking about it the wrong way.

A transformer increases voltage and decreases current. And yet... Ohm's law states that higher voltage results in higher current. For some reason this doesn't fit with what a transformer does. So, what gives?

I should know this, because a transformer is analogous to a transmission in a mechanical system, and mechanically it somehow makes sense. But electrically it doesn't. I'm missing some key piece, and I need someone to walk me through this please...
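
The mechanical analogy mentioned above can in fact be made exact: an ideal transformer trades current for voltage the same way an ideal gearbox trades speed for torque, with power conserved in both. A minimal Python sketch, with transformer numbers chosen to match the worked example below and gearbox figures that are assumptions made up purely for illustration:

```python
# Ideal transformer vs. ideal gearbox: both conserve power while trading
# one factor of the power product for the other.
# Gearbox numbers are illustrative assumptions, not from the thread.

# Transformer: P = V * I. Stepping voltage up 1:2 halves the current.
V_pri, I_pri = 125.0, 4.0        # primary: 500 W in
V_sec, I_sec = 250.0, 2.0        # secondary: 500 W out

# Gearbox: P = torque * angular_speed. A 2:1 reduction doubles torque, halves speed.
tau_in, omega_in = 5.0, 100.0    # N*m, rad/s -> 500 W in
tau_out, omega_out = 10.0, 50.0  # N*m, rad/s -> 500 W out

assert V_pri * I_pri == V_sec * I_sec == 500.0
assert tau_in * omega_in == tau_out * omega_out == 500.0
```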

The power passing through a transformer to the secondary is slightly less than the power entering the primary, because transformers are efficient but not perfect.

Assume for a moment, though, that we did have a perfect transformer.

If we put a load of 500 watts on a 250-volt secondary, this would be 250 volts at 2 amps (to get 500 watts) flowing from the transformer to the load.

Now if the transformer had a 125-volt primary with 125 volts supplied to it, then the power entering the transformer would also have to be 500 watts (to get 500 watts out).
So, how much current is flowing in the primary?
125 volts times 4 amps is 500 watts.
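
The same bookkeeping as a short runnable Python snippet (numbers taken from the example above; a perfect, lossless transformer is assumed):

```python
# Ideal (lossless) transformer: input power must equal output power.
P_load = 500.0          # watts demanded by the load
V_sec = 250.0           # secondary voltage
V_pri = 125.0           # primary voltage

I_sec = P_load / V_sec  # 500 / 250 = 2 A on the secondary
I_pri = P_load / V_pri  # 500 / 125 = 4 A on the primary

print(f"secondary: {V_sec:g} V x {I_sec:g} A = {V_sec * I_sec:g} W")
print(f"primary:   {V_pri:g} V x {I_pri:g} A = {V_pri * I_pri:g} W")
```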

So, the current fell from 4 amps at the input to 2 amps at the load. The transformer didn't restrict the current, though; the higher voltage simply made less current necessary for the same power.
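
And this is where Ohm's law actually lives in the circuit: it applies to the fixed load resistance on the secondary, and to that load as "seen" from the primary through the turns ratio. A short sketch using the same numbers, assuming the standard reflected-impedance relation for an ideal transformer, R_primary = (N_p/N_s)^2 * R_load:

```python
# Ohm's law holds on both sides of an ideal transformer; the transformer
# just changes what resistance the source appears to drive.
V_sec, I_sec = 250.0, 2.0
V_pri, I_pri = 125.0, 4.0

R_load = V_sec / I_sec             # Ohm's law at the load: 250 / 2 = 125 ohms

a = V_pri / V_sec                  # turns ratio N_p/N_s = 0.5 for this 1:2 step-up
R_seen_by_source = a**2 * R_load   # 0.25 * 125 = 31.25 ohms

print(V_pri / R_seen_by_source)    # Ohm's law on the primary: 4.0 A, matching I_pri
```

In other words, raising the voltage didn't violate Ohm's law; the transformer changed the effective resistance the source sees, and Ohm's law is satisfied on both sides.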
 
