# When transformers decrease voltage, do they increase the current?

by partialfracti
Tags: current, decrease, increase, transformers, voltage
 P: 22 Until a few days ago I always thought that when transformers decrease the voltage, they increase the current. Then on another thread at Physics Forums, people told me that when transformers decrease voltage, they don't increase the current — they increase the current *available*. I think I understand it now. Here's my current theory: when transformers decrease the voltage, they increase the current that is available if a circuit is formed. Before a circuit is formed, there is no current. Is my current theory correct?
 Sci Advisor P: 3,956 Yes, you have it right. If no power is drawn from the lower-voltage side, then the only current is what is wasted in the transformer. IF YOU DRAW THE SAME POWER AT A LOWER VOLTAGE... then the current must be greater, because power = voltage times current.
 P: 3,793 The current capability and the voltage ratio of a transformer are more or less independent. You can only look at the rating of the transformer to determine how much current you can draw. If you are talking about the current ratio: with a 10:1 step-down transformer, if you draw 1 A on the secondary, you draw only about 0.1 A at the primary.
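The 10:1 example above can be checked numerically. This is a minimal sketch of the ideal current ratio only; the function name and the assumption that losses are negligible are mine, and a real transformer draws slightly more primary current than this.

```python
def primary_current(secondary_current, turns_ratio):
    """Ideal primary current for a given secondary (load) current.

    turns_ratio is primary turns / secondary turns (10 for a 10:1 step-down).
    """
    return secondary_current / turns_ratio

# 10:1 step-down: drawing 1 A on the secondary reflects ~0.1 A on the primary.
print(primary_current(1.0, 10))  # 0.1
```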
P: 1,781

This is confusing and wrong.

Put a load on a transformer and energize the primary.

Vprimary*Iprimary = Vsecondary*Isecondary.

If the transformer reduces the voltage by half it will exactly double the current. This is the definition of the ideal transformer.
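The ideal-transformer relation above (Vprimary·Iprimary = Vsecondary·Isecondary) can be sketched as a quick numeric check. The function and values here are illustrative, assuming an ideal, lossless 2:1 step-down:

```python
def secondary_values(v_primary, i_primary, step_down=2):
    """Ideal secondary voltage and current for a given step-down ratio."""
    v_secondary = v_primary / step_down   # voltage divided by the ratio
    i_secondary = i_primary * step_down   # current multiplied by the ratio
    return v_secondary, i_secondary

vp, ip = 120.0, 1.0
vs, is_ = secondary_values(vp, ip)
assert vp * ip == vs * is_                # power is conserved (ideal case)
print(vs, is_)  # 60.0 2.0
```

Halving the voltage exactly doubles the current while the input and output powers stay equal — which is the defining behavior of the ideal transformer described above.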
P: 3,793
 Quote by Antiphon This is confusing and wrong. Put a load on a transformer and energize the primary. Vprimary*Iprimary = Vsecondary*Isecondary. An ideal transformer has a core with zero reluctance and wires with infinite conductance! If the transformer reduces the voltage by half it will exactly double the current. This is the definition of the ideal transformer.
We are talking about real-life transformers. The wires have to support the increase in current. Current capability is limited by the size of the wire and the size of the core. If you push it, the reliability will suffer. Yes, if you only talk about an ideal transformer, everything is possible.

Of course, if you are just doing an experiment and you'll be there while it is running, you can push it however you want and turn it off when you start to smell something. But if you are designing a product, playing this kind of pushing-the-limit game is a no-no.
P: 1,781
 Quote by yungman We are talking about real-life transformers. The wires have to support the increase in current. Current capability is limited by the size of the wire and the size of the core. If you push it, the reliability will suffer. Yes, if you only talk about an ideal transformer, everything is possible. Of course, if you are just doing an experiment and you'll be there while it is running, you can push it however you want and turn it off when you start to smell something. But if you are designing a product, playing this kind of pushing-the-limit game is a no-no.
You may be talking about real life transformers. The original post is not.

A real transformer is a very good approximation of an ideal one when it's loaded.

I have one in my laboratory that's rated for 100 VA. It's a 2:1 step-down made by a real-world company that has no students designing its products.

When I connect a 50 W, 60 V lamp to the secondary and connect the primary to 120 V, the primary current is HALF the secondary current, and the primary voltage is TWICE the secondary voltage, to within 2%. The OP is asking about the 98% that I'm talking about, not the 2% that you're talking about.
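The bench numbers above work out as follows, treating the transformer as ideal (the ~2% deviation is ignored; the variable names are mine):

```python
# 2:1 step-down, 120 V primary, 50 W / 60 V lamp on the secondary.
lamp_power = 50.0                          # W
v_secondary = 60.0                         # V
i_secondary = lamp_power / v_secondary     # ~0.833 A through the lamp

ratio = 2                                  # 2:1 step-down
i_primary = i_secondary / ratio            # ~0.417 A, half the secondary current

print(round(i_secondary, 3), round(i_primary, 3))  # 0.833 0.417
```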
 P: 3,793 I think the original question is confusing. I interpreted it as: if he reduces the input voltage below the rated input voltage, the output voltage decreases and he can get more current out of it at the lower power. I think what he actually meant is a simple step-down transformer! I modified my original post. If you draw 1 A on the secondary of a 10:1 transformer, then you draw about 0.1 A at the primary — if that is what he meant.
P: 22
 Quote by yungman If you are talking about the current ratio, if you have a 10:1 step down transformer, if you draw 1A on the secondary, you draw only about 0.1A at the primary.

Then why do you say that the current capability and voltage ratio of a transformer is kind of independent?
P: 22
 Quote by Antiphon If the transformer reduces the voltage by half it will exactly double the current.
If the transformer reduces the voltage by half, it will exactly double the current when there is a load on the transformer. Is this correct?
P: 1,781
 Quote by partialfracti If the transformer reduces the voltage by half, it will exactly double the current when there is a load on the transformer. Is this correct?
No. It does so always, load or no load.
P: 3,793
 Quote by partialfracti Then why do you say that the current capability and voltage ratio of a transformer is kind of independent?
The original question is very confusing. If all that was asked is about a transformer that STEPS DOWN from 110 V to 11 V, then if the secondary is drawing 1 A, the primary is drawing 0.1 A, since the step-down ratio is 10:1.

I really got thrown off by "the transformer increases the current"! I was thinking you were referring to the current capability of the transformer, which has nothing to do with the step-down ratio; it has to do with what the transformer is rated for.

Usually, when people call a transformer a step-down transformer, the output current is larger than the input current by the turns ratio, like the example I gave above.
 PF Patron Sci Advisor P: 10,020 You can't just "increase the current" into a load without changing the volts or the resistance of the load. Some of this thread is confusing cause and effect.

The turns ratio of the transformer determines the ratio of input volts to output volts: Vs = N·Vp (the first step in the argument). The output volts will cause current to flow in the load (secondary current = secondary volts / load resistance). The secondary current then determines the primary current: Ip = N·Is.

In a real transformer, in which there is finite resistance in the windings and in which the magnetic flux is not 100% linked, Ip may be a bit more and Vs may be a bit less than the above formulae suggest, particularly under load. So there are two answers to the question. In fact, many off-the-shelf transformers behave in a pretty 'ideal' way.
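The causal chain described above — turns ratio sets the voltage, the load sets the current, the secondary current sets the primary current — can be sketched directly. The function name is mine; N is the secondary-to-primary turns ratio (N < 1 steps down), following the Vs = N·Vp and Ip = N·Is notation above:

```python
def transformer_chain(v_primary, n, r_load):
    """Ideal step-down chain: voltage first, then load current, then primary current."""
    v_secondary = n * v_primary           # turns ratio sets the secondary voltage
    i_secondary = v_secondary / r_load    # the load then sets the secondary current
    i_primary = n * i_secondary           # which in turn determines the primary current
    return v_secondary, i_secondary, i_primary

# 10:1 step-down (N = 0.1) from 120 V into a 12-ohm load:
vs, is_, ip = transformer_chain(120.0, 0.1, 12.0)
print(vs, is_, ip)  # 12.0 1.0 0.1
```

Note the direction of causation: the transformer never "forces" current into the load; the load resistance appears only in the second step.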
 P: 4,513 Without reading all the threads, in addition to what vk6kro and sophiecentaur have said: the ideal transformer has the relationship Ip·Vp = Is·Vs — power delivered into the primary is the power out of the secondary, where the V's and I's are functions of time. Real transformers are not perfect:

1) There is a "magnetization current" in the primary even when there is no load on the secondary. This is fairly constant, independent of load.
2) There are series resistive losses in both the primary and secondary, as someone mentioned.
3) There are core losses. It takes energy to magnetize and demagnetize the core, which shows up as heat.
4) Imperfect coupling. Two inductors close to each other may make a transformer, but a poor one. The coupling coefficient says how much of the magnetic flux passing through the primary also passes through the secondary; ideal transformers have a coupling coefficient of one.
5) Transformers radiate energy and electromagnetically couple to surrounding circuits and conductive material.
6) There may be a sixth, seventh, or more I can't think of right now. ...Umm, here's one I forgot: transformer windings have mutual capacitance. This changes how the voltages and currents are related as well.
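To give one concrete feel for item 2 above, here is a deliberately minimal lossy sketch — just an ideal ratio plus series winding resistances, with everything else (magnetizing current, core loss, leakage, capacitance) ignored. The model, names, and numbers are my own illustrative assumptions, not a full equivalent circuit:

```python
def loaded_output(v_in, n, r_primary, r_secondary, r_load):
    """Approximate secondary voltage including resistive winding drops only.

    n is the secondary-to-primary turns ratio (0.5 for a 2:1 step-down).
    """
    # Reflect the primary winding resistance to the secondary side (times n**2),
    # then treat the result as a simple series divider into the load.
    r_series = r_secondary + r_primary * n**2
    v_open = n * v_in                        # open-circuit (ideal) voltage
    return v_open * r_load / (r_load + r_series)

# 120 V in, 2:1 down (n = 0.5), small winding resistances, 10-ohm load:
print(round(loaded_output(120.0, 0.5, 0.4, 0.1, 10.0), 2))  # 58.82, a bit under the ideal 60 V
```

Even with tiny winding resistances, the loaded output sags slightly below the ideal value — the "a bit less under load" behavior mentioned earlier in the thread.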
 Mentor P: 21,648 I also thought the question (and some of the answers) was cumbersome. My take on it is that you can't have a voltage drop across a transformer without a current, so there is no realistic case where they can be independent of each other except due to losses. The only realistic situation I can think of where they'd be different is if you had an energized primary and open secondary. Then you could have a voltage without a current at the secondary. But still, that's just a 100% loss situation.
P: 3,793
 Quote by russ_watters I also thought the question (and some of the answers) was cumbersome. My take on it is that you can't have a voltage drop across a transformer without a current, so there is no realistic case where they can be independent of each other except due to losses. The only realistic situation I can think of where they'd be different is if you had an energized primary and open secondary. Then you could have a voltage without a current at the secondary. But still, that's just a 100% loss situation.
I got thrown off too. I think he is just referring to the step-down ratio and the primary-to-secondary current ratio. The OP needs to clarify his question.
 P: 1,781 All of the confusion stems from treating the transformer as a source or a load. In fact it is neither.

A resistor is a load and defines a ratio of voltage to current at its terminals. No matter what source it's connected to, this ratio will never change. If the voltage goes down, the current also goes down.

A source is different. Whether a voltage or current source, it has no such defined ratio; it depends on the load that is connected. So far so good.

Now consider the ideal 2:1 step-down transformer. If you connect it to a 100-volt source, the secondary voltage is 50 volts. The primary and secondary currents are unknown and could be anything. Once you put a load on the secondary, that load defines the current, not the transformer. What the transformer enforces at all times is that the OUTPUT voltage will be half of the INPUT voltage, and the currents will observe the inverse relationship.

The original post asked whether transformers increase current when they decrease voltage. The answer is yes, and the question is not ambiguous or confusing unless you consider only the secondary current and the secondary voltage. As I have shown, that ratio is completely independent of the transformer and depends only on the load. The confused answers were not considering the primary-to-secondary conversion action of transformers, which is their only defined behavior. Bringing in copper losses and ratings only increased the confusion; they were not referenced, either implicitly or explicitly, by the OP.
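One standard way to see the "neither source nor load" point is impedance reflection: an ideal transformer makes whatever load is attached look larger or smaller from the primary side, but imposes no V/I ratio of its own. This sketch assumes an ideal transformer; the function name is illustrative:

```python
def reflected_impedance(r_load, step_down):
    """Load resistance as seen from the primary of an ideal step-down transformer.

    step_down is primary turns / secondary turns (2 for a 2:1 step-down).
    """
    return r_load * step_down**2

r_load = 25.0                             # ohms on the secondary
r_seen = reflected_impedance(r_load, 2)   # the primary "sees" 4x the load
print(r_seen)  # 100.0
```

With no load connected, the reflected impedance is effectively infinite and the (ideal) primary current is zero — matching the point above that the load, not the transformer, defines the current.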
P: 4,513
 Quote by Antiphon All of the confusion is stemming from treating the transformer as a source or load. In fact it is neither.
A transformer primary with an open secondary is, to a good approximation in most cases, an inductive load with series resistance.

A transformer primary with a loaded secondary is, to a good approximation in most cases, an inductive load with series resistance and a parallel load impedance.
PF Patron
P: 10,020
 Quote by Antiphon All of the confusion is stemming from treating the transformer as a source or load. In fact it is neither. . . . . . . The original post asked whether transformers increase current when they decrease voltage. The answer is yes and the question is not ambiguous or confusing unless you are considering the secondary current and the secondary voltage only. As I have shown, that ratio is completely independent of the transformer and depends only on the load.
The problem with the original question was that it seems to imply that halving the secondary voltage actually doubles the secondary current - which is not true at all - in fact, if the load is the same, the current will also be halved.
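The point above is just Ohm's law, and a two-line check makes it concrete (the load value is an arbitrary example of mine):

```python
# With the SAME load, halving the voltage halves the current too;
# it does not double it.
r_load = 10.0                  # ohms, held fixed
i_at_100v = 100.0 / r_load     # 10.0 A
i_at_50v = 50.0 / r_load       # 5.0 A — also halved
print(i_at_100v, i_at_50v)  # 10.0 5.0
```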

The simplest equation for a transformer,
V1I1 = V2I2
was, I'm sure, the only thing the OP was referring to - nothing fancy about the real behaviour of a transformer with inductance and resistances involved. As could be expected on a PF thread, contributors have introduced loads of sophisticated factors which, while true, valid and interesting, can only confuse someone who just wants to clear up the basics of transformer operation.

That basic equation just tells you that, if the voltage is stepped down, the input current is proportionally less than the secondary current - that is the 'causal' line of events. The transformer doesn't, in some way, 'force' more current out of the secondary. The secondary current will just depend on the secondary volts and the load.

This is analogous to a lever used as a velocity multiplier against a frictional load (short end on the effort side). The load end of the lever will exert a smaller force than the effort and can move further than the effort end - but only if the load friction is small enough to allow it. A rather clunky equivalent, I admit, but I never really get on with analogies.
