
When transformers decrease voltage, do they increase the current?

  1. Jan 12, 2011 #1
    Until a few days ago I always thought that when transformers decrease the voltage, they increase the current. Then, on another thread at Physics Forums, people told me that when transformers decrease voltage, they don't increase the current itself; they increase the current available.

    I think I understand it now.

    Here's my current theory: when transformers decrease the voltage, they increase the current that is available if a circuit is formed. Before a circuit is formed, there is no current. Is my current theory correct?
  3. Jan 12, 2011 #2


    Science Advisor

    Yes, you have it right.

    If no power is drawn from the lower voltage, then the current is only what is wasted in the transformer.

    IF YOU DRAW THE SAME POWER AT A LOWER VOLTAGE...... then the current must be greater because power = Voltage times current.
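    That point can be illustrated numerically (a quick Python sketch of my own; the 60 W, 120 V, and 12 V figures are made up for the example, not from the thread):

```python
# Power = voltage * current, so drawing the same power at a lower voltage
# requires a proportionally larger current.
def current_for_power(power_w, voltage_v):
    """Current (A) required to deliver power_w watts at voltage_v volts."""
    return power_w / voltage_v

print(current_for_power(60.0, 120.0))  # 60 W at 120 V -> 0.5 A
print(current_for_power(60.0, 12.0))   # the same 60 W at 12 V -> 5.0 A
```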
  4. Jan 12, 2011 #3
    The current capability and the voltage ratio of a transformer are more or less independent. You have to look at the rating of the transformer to determine how much current you can draw.

    If you are talking about the current ratio: with a 10:1 step-down transformer, if you draw 1 A on the secondary, you draw only about 0.1 A at the primary.
    Last edited: Jan 13, 2011
  5. Jan 13, 2011 #4
    This is confusing and wrong.

    Put a load on a transformer and energize the primary.

    Vprimary*Iprimary = Vsecondary*Isecondary.

    If the transformer reduces the voltage by half it will exactly double the current. This is the definition of the ideal transformer.
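    A minimal numeric sketch of that definition (my own illustration; the 240 V source and 2:1 ratio are arbitrary example values):

```python
# Ideal transformer with turns ratio a = Np/Ns: Vs = Vp/a and Is = a*Ip,
# so power is conserved: Vp*Ip == Vs*Is.
def ideal_step_down(v_primary, i_primary, a):
    """Return (secondary voltage, secondary current) for an ideal a:1 step-down."""
    return v_primary / a, i_primary * a

v_s, i_s = ideal_step_down(240.0, 1.0, 2.0)
print(v_s, i_s)                  # 120.0 V, 2.0 A: voltage halved, current doubled
print(240.0 * 1.0 == v_s * i_s)  # True: power in equals power out
```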
  6. Jan 13, 2011 #5
    We are talking about real-life transformers. The wires have to support the increase in current. Current capability is limited by the size of the wire and the size of the core. If you push it, the reliability will suffer. Yes, if you only talk about the ideal transformer, everything is possible.

    Of course, if you are just doing an experiment and you'll be there while you are running it, you can push it however you like and turn it off when you start to smell something. But if you are designing a product, playing this kind of push-the-limit game is a no-no.
    Last edited: Jan 13, 2011
  7. Jan 13, 2011 #6
    You may be talking about real life transformers. The original post is not.

    A real transformer is a very good approximation of an ideal one when it's loaded.

    I have one in my laboratory that's rated for 100 VA. It's a 2:1 step-down made by a real-world company that has no students designing its products.

    When I connect a 50 W, 60 V lamp to the secondary and connect the primary to 120 V, the primary current is HALF the secondary current, and the primary voltage is TWICE the secondary voltage, to within 2%. The OP is asking about the 98% that I'm talking about, not the 2% that you're talking about.
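    Those lamp numbers check out for an ideal 2:1 transformer (a quick Python sanity check using only the figures given in the post):

```python
# 50 W lamp on the 60 V secondary of a 2:1 step-down fed from 120 V.
lamp_power = 50.0   # W
v_secondary = 60.0  # V
v_primary = 120.0   # V

i_secondary = lamp_power / v_secondary  # about 0.833 A
i_primary = lamp_power / v_primary      # about 0.417 A

print(i_primary / i_secondary)  # 0.5: primary current is half the secondary current
```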
  8. Jan 13, 2011 #7
    I think the original question is confusing. I interpreted it as: if he reduces the input voltage below the rated input voltage, the output voltage decreases, and he can get more current out of it because the power is lower. I now think what he meant is actually a simple step-down transformer!

    I modified my original post: when you draw 1 A on the secondary of a 10:1 step-down transformer, you draw about 0.1 A at the primary, if that is what he meant.
    Last edited: Jan 13, 2011
  9. Jan 13, 2011 #8

    Then why do you say that the current capability and voltage ratio of a transformer is kind of independent?
  10. Jan 13, 2011 #9
    If the transformer reduces the voltage by half it will exactly double the current when there is a load on the transformer. Is this correct?
  11. Jan 13, 2011 #10
    No, not only when there is a load. Always.
  12. Jan 14, 2011 #11
    The original question is very confusing. If all that was asked was about a transformer that STEPS DOWN from 110 V to 11 V, then if the secondary is drawing 1 A, the primary is drawing 0.1 A, since the step-down ratio is 10:1.

    I really got thrown off by "the transformer increases the current"!!! I was thinking you were referring to the current capability of the transformer, which has nothing to do with the step-down ratio; it has to do with what the transformer is rated for.

    Usually, when people say a transformer is a step-down transformer, the output current is larger than the input current, as in the example I gave above.
  13. Jan 14, 2011 #12


    Science Advisor
    Gold Member

    You can't just "increase the current" into a load without changing the volts or the resistance of the load. Some of this thread is confusing Cause and Effect.

    The turns ratio of the transformer determines the ratio of input volts to output volts.
    Vs = N Vp (the first step in the argument).

    The output volts will cause current to flow in the load (Secondary Current = Secondary Volts / Load resistance)
    The secondary current then determines the primary current.
    Ip = N Is

    In a real transformer, in which there is finite resistance in the windings and in which the magnetic flux is not linked 100%, Ip may be a bit more and Vs may be a bit less than the above formulae suggest, particularly under load.
    So there are two answers to the question. In fact, many off-the-shelf transformers behave in a pretty 'ideal' way.
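    The causal chain above can be written out as a short Python sketch (my own illustration; N = Ns/Np as in the post, and the 10:1 ratio, 120 V source, and 12 ohm load are arbitrary example values):

```python
def transformer_chain(v_primary, n, r_load):
    """Follow the causal chain: Vs = N*Vp, then Is = Vs/R, then Ip = N*Is."""
    v_s = n * v_primary  # the turns ratio sets the secondary voltage
    i_s = v_s / r_load   # the load sets the secondary current
    i_p = n * i_s        # the secondary current sets the primary current
    return v_s, i_s, i_p

# 10:1 step-down (N = 0.1), 120 V primary, 12 ohm load:
# Vs = 12 V, Is = 1 A, Ip = 0.1 A
print(transformer_chain(120.0, 0.1, 12.0))
```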
  14. Jan 14, 2011 #13
    Without reading all the threads, in addition to what vk6kro and sophiecentaur have said:

    The ideal transformer has the relationship

    IpVp = IsVs:

    Power delivered into the primary is the power out of the secondary, where the V's and I's are functions of time.

    Real transformers are not perfect:

    1)There is a "magnetization current" in the primary when there is no load on the secondary. This is fairly constant, independent of load.
    2) There are series resistive losses in both the primary and secondary, as someone mentioned.
    3) There are core losses. It takes energy to magnetize and demagnetize the core that shows up as heat.
    4) Imperfect coupling. Two inductors close to each other may make a transformer, but a poor one. The coupling coefficient says how much of the magnetic flux passing through the primary also passes through the secondary. Ideal transformers have a coupling coefficient of one.
    5) Transformers radiate energy and electromagnetically couple to surrounding circuits and conductive material.
    6) There may be 6, 7, or more I can't think of, right off.

    ...Umm. Here's one I forgot. Transformer windings will have mutual capacitance. This changes how the voltages and currents are related as well.
    Last edited: Jan 14, 2011
  15. Jan 14, 2011 #14



    Staff: Mentor

    I also thought the question (and some of the answers) was cumbersome. My take on it is that you can't have a voltage drop across a transformer without a current, so there is no realistic case where they can be independent of each other except due to losses.

    The only realistic situation I can think of where they'd be different is if you had an energized primary and open secondary. Then you could have a voltage without a current at the secondary. But still, that's just a 100% loss situation.
    Last edited: Jan 14, 2011
  16. Jan 14, 2011 #15
    I got thrown off too. I think he is just referring to the step-down ratio and the primary-to-secondary current ratio. The OP needs to clarify his question.
  17. Jan 14, 2011 #16
    All of the confusion is stemming from treating the transformer as a source or load. In fact it is neither.

    A resistor is a load and defines a ratio of voltage to current at its terminals. No matter what source it's connected to, this ratio will never change. If the voltage goes down, the current also goes down.

    A source is different. Whether a voltage or current source, it has no such defined ratio. It depends on the load that is connected.

    So far so good. Now consider the ideal 2:1 step-down transformer. If you connect it to a 100 volt source, the secondary voltage is 50 volts. The primary and secondary current are unknown and could be anything. Once you put a load on the secondary, then that load defines the current, not the transformer.

    What the transformer enforces at all times is that the OUTPUT voltage will be half of the INPUT voltage, and the currents will observe the inverse relationship.

    The original post asked whether transformers increase current when they decrease voltage. The answer is yes, and the question is not ambiguous or confusing unless you are considering only the secondary current and the secondary voltage. As I have shown, that ratio is completely independent of the transformer and depends only on the load.

    The confused answers were not considering the primary-to-secondary conversion action of transformers, which is their only defined behavior. Bringing in copper losses and ratings only increased the confusion; neither was referenced, implicitly or explicitly, by the OP.
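    To make the "load defines the current" point concrete (a Python sketch of my own; the 2:1 ratio and 100 V source are from the post above, the load values are arbitrary):

```python
# An ideal 2:1 step-down on a 100 V source always gives 50 V out,
# but the secondary current is whatever the load demands.
v_secondary = 100.0 / 2.0  # fixed by the transformer: 50 V

for r_load in (100.0, 50.0, 10.0):
    i_secondary = v_secondary / r_load
    i_primary = i_secondary / 2.0  # the transformer then reflects it to the primary
    print(r_load, i_secondary, i_primary)
# 100 ohms -> 0.5 A, 50 ohms -> 1.0 A, 10 ohms -> 5.0 A on the secondary
```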
  18. Jan 15, 2011 #17
    A transformer primary with an open secondary is an inductive load with series resistance to good approximation in most cases.

    A transformer primary with a loaded secondary is an inductive load with series resistance and a parallel load impedance, to a good approximation in most cases.
  19. Jan 15, 2011 #18


    Science Advisor
    Gold Member

    The problem with the original question was that it seems to imply that halving the secondary voltage actually doubles the secondary current - which is not true at all - in fact, if the load is the same, the current will also be halved.

    The simplest equation for a transformer,
    V1I1 = V2I2
    was, I'm sure, the only thing that the OP refers to - nothing fancy about the real behaviour of a transformer with inductance and resistances involved. As could be expected on a PF thread, contributors have introduced loads of sophisticated factors which, while true, valid and interesting, can only confuse someone who just wants to clear up the basics of transformer operation.

    That basic equation just tells you that, if the voltage is stepped down, the input current is proportionally less than the secondary current - that is the 'causal' line of events. The transformer doesn't, in some way, 'force' more current out of the secondary. The secondary current will just depend on the secondary volts and the load.

    This is analogous to a lever used as a velocity multiplier against a frictional load (short end on the effort side). The load end of the lever will exert a smaller force than the effort and could move further than the effort end - but only if the load friction is small enough to allow it. A rather clunky equivalent, I admit, but I never really get on with analogies.
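    A quick numeric check of that point (my own sketch; the 120 V source and fixed 12 ohm load are arbitrary): stepping the voltage down further reduces the secondary current into the same load rather than forcing more current out.

```python
def loaded_step_down(v_primary, a, r_load):
    """Ideal a:1 step-down driving a resistive load; returns (Vs, Is, Ip)."""
    v_s = v_primary / a   # turns ratio sets the secondary voltage
    i_s = v_s / r_load    # the load sets the secondary current
    i_p = i_s / a         # which in turn sets the primary current
    return v_s, i_s, i_p

print(loaded_step_down(120.0, 2.0, 12.0))  # (60.0, 5.0, 2.5)
print(loaded_step_down(120.0, 4.0, 12.0))  # (30.0, 2.5, 0.625): half the Vs, half the Is
```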
  20. Jan 15, 2011 #19


    User Avatar
    Science Advisor

    The equation
    V1*I1 = V2*I2
    just says that "power in = power out".
    This applies directly if the transformer losses are negligible, as they should be.

    So, suppose you have a transformer with two secondaries. 120 volt primary and 12 volt and 6 volt secondaries.

    Suppose you put a 12 watt load on the 12 volt secondary.
    There will be a current of 1 amp drawn from the 12 V secondary and there will be an extra 12 watts entering the transformer at the primary. 12 watts at 120 volts is 0.1 amps.

    Now put a 12 watt load on the 6 volt secondary.
    There will be a current of 2 amps flowing.
    At the primary of the transformer, there will be 12 watts entering the transformer (or an extra 12 watts if the first load is still in place). So the primary current will be an extra 0.1 amps.

    Note that the current and voltage are inversely related so that the power stays the same in each case.
    Also note that the result is the same whether the transformer is at 5 % of its rated capability or 90%.
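    The arithmetic in that example can be checked directly (a small Python sketch using only the numbers given in the post):

```python
v_primary = 120.0
# (load power in W, secondary voltage in V) for the two secondaries:
loads = [(12.0, 12.0), (12.0, 6.0)]

secondary_currents = [p / v for p, v in loads]          # [1.0, 2.0] A
primary_current = sum(p for p, v in loads) / v_primary  # 24 W / 120 V = 0.2 A

print(secondary_currents, primary_current)
```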
  21. May 19, 2013 #20
    How do I reduce the current?

    I have a transformer rated for 230 V AC input and 12 V AC output, with a 6 V AC tap and a GND between the 12 V and 6 V taps; i.e., the connections read 12, 6, 0, -6, -12. The output is rated at 3 A. I have a stepper motor rated at 1.5 A, so how do I reduce the current drawn from the transformer? Also, can I connect loads to both the 6 V and the 12 V taps of the same transformer? And can I connect multiple circuits as loads, so that the current in each circuit follows Kirchhoff's law and divides the 3 A?