Current and Voltage in Transformers

SUMMARY

The discussion centers on the relationship between current and voltage in transformers, specifically addressing misconceptions about step-down transformers. Chris questions how a step-down transformer can increase current while simultaneously reducing voltage, referencing Ohm's Law. The consensus is that the current drawn by a resistor is set by the voltage across it and its resistance, regardless of the supply's capacity to deliver more current. The example of a 110 V AC supply illustrates that the load's impedance dictates the current draw, not the supply's maximum capability.

PREREQUISITES
  • Understanding of Ohm's Law
  • Basic knowledge of transformer operation
  • Familiarity with AC power supply characteristics
  • Concept of load impedance
NEXT STEPS
  • Research the principles of transformer operation and efficiency
  • Study the effects of load impedance on current draw in AC circuits
  • Explore practical applications of step-down transformers in electrical systems
  • Learn about power supply ratings and their implications for circuit design
USEFUL FOR

Electrical engineers, students studying circuit theory, and anyone involved in designing or analyzing transformer applications will benefit from this discussion.

cavis
Hi there,
I've got a fairly simple theoretical question about transformers, and I suspect it stems from a misconception I have about them. I understand that step-down transformers can be used in situations where a high current is desired, but that they also reduce the voltage on the secondary winding relative to the primary.

Here, in a nutshell, is what's causing me grief. I'm envisioning an AC power supply connected to just a single resistor, with a certain current through the resistor as determined by its resistance. If instead the AC power supply were connected to a step-down transformer, with the resistor connected on the secondary side, I'm faced with a bit of a paradox.

On the one hand, I know that the current should be greater as a result of the step-down, but shouldn't a straightforward application of Ohm's law tell us that, since the voltage applied to the resistor was stepped down, the current would also drop? What am I missing here?

Cheers,

Chris
 
The current through a resistor is a direct function (Ohm's Law) of the voltage across it: I = V/R.

The fact that the supply to the resistor is capable of providing more current than the resistor draws is irrelevant. The "extra current" a step-down transformer offers is the secondary current being larger than the primary current (the same power delivered at a lower voltage); the resistor on the secondary still draws only what the stepped-down voltage pushes through it, which is less than it would draw connected directly across the primary.
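
A minimal numeric sketch of that distinction (Python, with assumed illustrative values: an ideal, lossless 2:1 step-down transformer, a 120 V source, and a 10 Ω load; none of these figures come from the thread itself):

```python
# Sketch: resistor current with and without an ideal step-down transformer.
# All values are illustrative assumptions, not from the thread.

V_PRIMARY = 120.0   # source voltage (V), assumed
TURNS_RATIO = 2.0   # primary:secondary turns, i.e. a 2:1 step-down, assumed
R_LOAD = 10.0       # load resistance (ohms), assumed

# Resistor connected directly across the source: Ohm's law.
i_direct = V_PRIMARY / R_LOAD

# Resistor on the secondary: the voltage is stepped down, and
# Ohm's law still sets the current through the resistor.
v_secondary = V_PRIMARY / TURNS_RATIO
i_secondary = v_secondary / R_LOAD

# For an ideal (lossless) transformer, primary power equals secondary
# power, so the primary current is the secondary current scaled down.
i_primary = i_secondary / TURNS_RATIO

print(f"direct connection: {i_direct:.1f} A")    # 12.0 A
print(f"secondary current: {i_secondary:.1f} A") # 6.0 A (less than direct)
print(f"primary current:   {i_primary:.1f} A")   # 3.0 A (less than secondary)
```

The "increased current" is the 6 A on the secondary versus 3 A on the primary; it never exceeds the 12 A the resistor would draw wired straight across the source.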
 
Think of it another way. Your 110 V wall AC outlet can usually supply up to 15 A of current. But if you plug in a 100 W light bulb, it draws only about 1 A and no more. Just because the outlet is capable of supplying 15 A does not mean it will push 15 A through whatever device you plug in; 15 A is only the outlet's capability.

The current draw is governed by the impedance of the load: if you put a 110 kΩ resistor across a 110 V source, you are going to draw 1 mA, even though the source is capable of supplying 15 A.
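
The same arithmetic as a quick sketch (Python chosen just for illustration; the 100 W bulb and 110 kΩ figures are the ones used above):

```python
# The load, not the outlet's 15 A rating, sets the current draw.

V_SUPPLY = 110.0  # wall outlet voltage (V)

# 100 W light bulb: current follows from P = V * I.
bulb_power = 100.0
bulb_current = bulb_power / V_SUPPLY
print(f"100 W bulb draws {bulb_current:.2f} A")  # ~0.91 A, i.e. "about 1 A"

# 110 kOhm resistor: current follows from Ohm's law, I = V / R.
r_load = 110e3
resistor_current = V_SUPPLY / r_load
print(f"110 kOhm resistor draws {resistor_current * 1e3:.1f} mA")  # 1.0 mA
```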
 
