Current and Voltage in Transformers

AI Thread Summary
Step-down transformers reduce voltage while increasing current, adhering to the principle of conservation of energy. When a resistor is connected to a transformer, the current drawn depends on the resistor's resistance and the voltage applied, as dictated by Ohm's Law. The power supply's capacity to provide current does not dictate the actual current drawn by the load; it only indicates the maximum potential. For example, a 110V outlet can supply up to 15A, but a connected device will only draw the current it requires based on its resistance. Understanding the relationship between voltage, current, and resistance clarifies the perceived paradox in transformer operation.
cavis
Hi there,
I've got a fairly simple theoretical question about transformers, and I suspect it stems from a misconception I have about them. I understand that step-down transformers can be used in situations where a high current is desired, but that they also reduce the voltage across the secondary winding relative to the primary winding.

Here, in a nutshell, is what's causing me grief. I'm envisioning an AC power supply connected to just a single resistor, with a certain current through the resistor as determined by the resistance. If instead the AC power supply were connected to a step-down transformer and the resistor were connected to the secondary side of the circuit, I'm faced with a bit of a paradox.

On the one hand, I know that the current should be greater as a result of the voltage being stepped down, but shouldn't a straightforward application of Ohm's law tell us that since the voltage applied to the resistor was stepped down, the current would also drop? What am I missing here?

Cheers,

Chris
 
The current through a resistor is a direct function (Ohm's Law) of the voltage across it.

The fact that the supply to the resistor is capable of providing more current than the resistor draws is irrelevant.
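
To make that concrete, here is a minimal sketch of an ideal step-down transformer feeding a resistor. The 120 V supply, 10:1 turns ratio, and 12 Ω load are made-up example values (not from the thread), and losses are ignored:

Code (Python):
# Minimal sketch: ideal step-down transformer feeding a resistor.
# The 120 V supply, 10:1 turns ratio, and 12 ohm load are hypothetical
# example values chosen only to illustrate the point.

V_primary = 120.0     # RMS volts on the primary
turns_ratio = 10.0    # N_primary / N_secondary
R_load = 12.0         # ohms

V_secondary = V_primary / turns_ratio     # 12 V across the resistor
I_secondary = V_secondary / R_load        # Ohm's law: 1.0 A through the resistor
I_primary = I_secondary / turns_ratio     # 0.1 A drawn from the supply

P_secondary = V_secondary * I_secondary   # 12 W delivered to the load
P_primary = V_primary * I_primary         # 12 W drawn from the supply

print(I_secondary, I_primary, P_secondary, P_primary)

The secondary current (1 A) really is larger than the primary current (0.1 A), which is all that "a step-down transformer increases the current" means. But the secondary current is still just V_secondary / R_load, and power in equals power out.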
 
Think of it another way. Your 110 V wall outlet can usually supply up to 15 A of current. But if you plug in a 100 W light bulb, it only draws about 1 A and no more. Just because the outlet is capable of supplying 15 A does not mean it will push 15 A through whatever device you plug into it. That is only the capability of the outlet.

The current drawn is governed by the impedance of the load: if you put a 110 kΩ resistor across a 110 V source, you are going to draw 1 mA, even though the source is capable of supplying 15 A.
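
A quick numerical check of those two cases, treating the 100 W bulb as a fixed resistance at its rated voltage (a simplification, since a real filament's resistance changes with temperature):

Code (Python):
# Rough check of the wall-outlet examples above. The 15 A figure is the
# outlet's capability, not what any given load actually draws.

V_supply = 110.0              # volts

# 100 W light bulb, modeled as a simple resistance at rated voltage
P_bulb = 100.0                # watts
I_bulb = P_bulb / V_supply    # about 0.9 A
R_bulb = V_supply / I_bulb    # about 121 ohms

# 110 kilo-ohm resistor across the same source
R_big = 110e3                 # ohms
I_big = V_supply / R_big      # 0.001 A = 1 mA

print(I_bulb, R_bulb, I_big)

Both loads draw far less than the 15 A the outlet could supply; each one draws only what its own resistance dictates.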
 