
Resistors vs transformers

  1. Dec 31, 2008 #1
    Hello, I am new here and this looks like a good place to get answers! Wondering if someone could clarify something for me.

    When withdrawing current from a socket to an electrical device, the current is determined by I= V/R. So the manufacturer of the device can add resistors to manipulate the current going through. Correct?

    My second question is that instead of adding resistors to decrease current can the manufacturer just add a transformer instead which will decrease voltage to produce the same current, and if so, does that mean that transformers and resistors can be interchangeable for this particular application? would there a preference for using either?

    Thank you for answering!
  3. Dec 31, 2008 #2
    Hello Ralph...sounds like you may be beginning to study electricity!!
    Sometimes this is done, but in practice it is avoided because the resistor dissipates power....and gives off heat...to no purpose....it is much more efficient to design the circuit for minimum power (minimum current) use....Such impedance matching might be more practical in amplifiers for very low power applications....

    Sure, but that becomes expensive.. a smart designer would avoid such a costly application.
    And again, the solution depends on the application...

    Consider light bulbs, for example... Many materials might be used, but it turns out tungsten in a vacuum has the longevity and resistance characteristics to work pretty well. Yet a fluorescent (gas) type bulb is considerably more efficient and hence can support the additional cost of a "ballast"....see wikipedia, fluorescent lamp, for a more complete discussion. (Fluorescent bulbs do, however, contain toxic mercury.)
  4. Dec 31, 2008 #3


    Science Advisor

    Not quite: using a transformer to reduce the voltage will increase the current proportionately, since the power the load uses is the same (think conservation of energy). Halving the voltage doubles the current in the secondary:

    [tex] \frac{V_p}{V_s} = \frac{I_s}{I_p}[/tex]

    The benefit of a transformer would be to reduce transmission line losses since they are equal to I^2R. Hence, pumping up the voltage and sending the same VA will reduce the current and thus the line loss.
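    A quick numeric sketch of that line-loss point (the 10 kW load and 1 ohm line resistance are assumed, illustrative values):

```python
# Why stepping the voltage up cuts transmission-line loss for the
# same delivered power: the line current falls, and loss goes as I^2 R.
P_load = 10_000.0   # watts of delivered power (same VA in both cases)
R_line = 1.0        # assumed line resistance in ohms

for V in (240.0, 2400.0):
    I = P_load / V              # line current at this voltage
    loss = I ** 2 * R_line      # I^2 R loss heating the line
    print(f"{V:6.0f} V: I = {I:6.2f} A, line loss = {loss:7.1f} W")
```

    Raising the voltage by a factor of ten cuts the line loss by a factor of one hundred, which is why distribution grids run at high voltage.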

    Hope this helps.

  5. Jan 1, 2009 #4
    If he were to use a transformer to lower the voltage applied to a resistive load the current and power would both decrease. Current through the transformer secondary will be determined by the load. (up to a point :wink:)

    I= E/R and P= IE
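    A small sketch of that distinction, with assumed numbers: on a resistive load, stepping the voltage down lowers both the current and the power, because the load (not the transformer) sets the current via I = E/R.

```python
# A 10-ohm resistive load fed through a step-down transformer:
# halving the secondary voltage halves the current and quarters the power.
R_load = 10.0        # assumed resistive load in ohms
V_primary = 120.0

for turns_ratio in (1.0, 2.0):       # Np/Ns: 1:1, then 2:1 step-down
    E = V_primary / turns_ratio      # secondary voltage
    I = E / R_load                   # current set by the load: I = E/R
    P = I * E                        # power delivered: P = IE
    print(f"{E:5.1f} V -> {I:4.1f} A, {P:6.1f} W")
```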
  6. Jan 1, 2009 #5
    Thank you for the responses!

    I'm still not clear on a few things:
    For example: Let's say a manufacturer creates a device and minimizes the resistance of it as much as possible to minimize power losses. Since the resistance of this device is constant and the voltage from the socket is also constant then the current/power through the device are determined by I = V/R and P=VI and are both fixed.
    What if the current/power is too much for the device to work properly? What is done to extract the required amount of power/current from the socket while keeping the resistance at the minimum?
    Also, what determines the amount of power required by the device in the first place?

    Responses are appreciated!

  7. Jan 1, 2009 #6


    Science Advisor

    True, however, the load (or component) is designed to work under specific conditions (i.e. requires a certain amount of power to operate properly). So given that the power is constant in the load (or within some design range), one cannot simply reduce the voltage arbitrarily (and thus current in the secondary and power the load draws) to limit the current. The complete circuit would need to be designed with all of that in mind which probably wouldn't be the most practical way to limit current. Good point though. :approve:

  8. Jan 1, 2009 #7


    Science Advisor

    Happy New Year to you too!

    Most devices work within some power range (i.e. a minimum requirement for it to work and a maximum that it can withstand).

    If the power (some combination of current and voltage) is too little or too much, the device will not operate or it will fail. Hence, the device must be designed with this in mind. In other words, the designer will ensure that the device's resistance is such that at the operating voltage it will not draw too much power (beyond the device's limit), or too little power (not enough to operate).

    Hope this helps.
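    That design rule can be sketched numerically: at a fixed supply voltage, the designer picks a resistance for a target power via R = V^2 / P (the wattage figures below are assumed, illustrative values):

```python
# Choosing a resistance so a device draws a target power at 120 V.
V = 120.0
for P_target in (60.0, 600.0, 1500.0):   # e.g. lamp, small appliance, heater
    R = V ** 2 / P_target                # required resistance in ohms
    print(f"{P_target:6.1f} W target -> R = {R:6.1f} ohms")
```

    Notice the pattern: the bigger the power draw, the lower the resistance must be.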

  9. Jan 1, 2009 #8



    Staff: Mentor

    Welcome to the PF, RalphM!
  10. Jan 1, 2009 #9


    Staff Emeritus
    Science Advisor
    Gold Member

    Electronic devices are designed to perform a task. The accomplishment of that task is the primary concern of the designer. Power consumption by a device is determined by the various electronic components used to accomplish the design goals. A good design will keep power consumption to a minimum while accomplishing the original task.

    A device uses exactly as much current, or power if you wish, as it needs. The power supply does not control the amount of current drawn unless the capabilities of the supply are exceeded. The current draw is determined by the load.

    For example, you can run your tape player on your car's 12 V battery even though it only draws a few hundred milliamps. The same battery is capable of driving the starter motor, which can draw nearly 100 amps.

    Perhaps Ivan Seeking will drop in and discuss some of the methods he has used to extend battery life when the current draw of a circuit is high.
  11. Jan 1, 2009 #10
    Thanks guys you've cleared up a lot. I feel I'm starting to get a deeper understanding.

    I'm still a bit unclear about power consumption.

    For example: Let's say you connect the hot and neutral of your 120VAC socket with a 1 Ohm wire. Then I = 120 Amps from I=V/R. Power = 14400 Watts from P=VI. Does this mean that you are losing/wasting 14400 Watts of power?
    If you use a 10 Ohm wire instead of the 1 Ohm wire, then I = 12 Amps and the power is 1440 Watts.
    It seems that the conclusion to be drawn is that you save on power by using a wire (or device) with higher resistance. This doesn't seem correct.
    Can someone please point out the logical flaw?
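    For reference, the arithmetic in the question above can be checked directly:

```python
# 120 VAC across a 1-ohm and then a 10-ohm resistance:
# I = V/R and P = VI reproduce the 14400 W and 1440 W figures.
V = 120.0
for R in (1.0, 10.0):
    I = V / R          # I = V/R
    P = V * I          # P = VI
    print(f"R = {R:4.1f} ohm: I = {I:5.1f} A, P = {P:7.1f} W")
```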
  12. Jan 1, 2009 #11


    Staff Emeritus
    Science Advisor
    Homework Helper

    There is no flaw, your conclusions are entirely correct.
  13. Jan 2, 2009 #12
    If the wire is intended to be the power consuming device this is correct. You have described as an example a simple resistance heater. All of the electrical power is converted to heat.

    Just for fun you could pull one of the heating elements out of your stove and measure the resistance.

    Your statement may not seem correct to you because you are only considering a wire. If you have a VERY low resistance wire providing electricity to a 10 ohm load then (for the most part) the 10 ohm load is what determines the power consumed. (This is only if the source is adequate of course) The wire is not as significant.

    In real world applications such as the electrical circuits in your house the resistance of the wiring has to be considered. If you have a load that is large (low resistance) then you need lower resistance wire providing the power. Normally this is done with larger gauge wire or using copper instead of aluminum. If the wire is inadequate it will heat because it is consuming power. The term used for the adequacy of wire is ampacity.
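    A minimal sketch of that series-circuit point, with an assumed feed-wire resistance: when the wire's resistance is small compared to the load's, the wire wastes only a tiny share of the power.

```python
# A 10-ohm load fed through a low-resistance wire: the same current
# flows through both (series circuit), so power splits in proportion
# to resistance, and the wire's share is small.
V = 120.0
R_wire = 0.05      # assumed feed-wire resistance, ohms
R_load = 10.0      # the load

I = V / (R_wire + R_load)       # series current through wire and load
P_wire = I ** 2 * R_wire        # heat wasted in the wire
P_load = I ** 2 * R_load        # power consumed by the load
print(f"wire: {P_wire:.1f} W, load: {P_load:.1f} W")
```

    The wire dissipates only R_wire/R_load = 0.5% of what the load does; an undersized (higher-resistance) wire would take a larger share and run hot.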
  14. Jan 2, 2009 #13
    About the electrical supply:
    The hot and neutral wires have a resistance, and that resistance is the minimum the circuit can have, assuming you connect them with a wire having 0 Ohm. Does that mean that the maximum power that can be extracted from an outlet is constant and can be calculated as follows:

    V= 120V
    R = Rmin (resistance of hot and neutral wires)
    I = 120 / Rmin

    P = (120 * 120) / Rmin

    And let's say that you did connect them with the 0 Ohm wire: what happens to all the power? Does it just all get wasted into the earth (except for a small amount that heats the hot and neutral wires)?
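    A numeric sketch of that bound, with an assumed wiring resistance (the real value depends on the installation): the source wiring itself caps the short-circuit current and power, and every watt of it is dissipated as heat in the wires, not "into the earth".

```python
# Short-circuit limit set by the source wiring resistance Rmin:
# I = V / Rmin and P = V^2 / Rmin, all dissipated in the wiring.
V = 120.0
R_min = 0.4        # assumed total hot + neutral resistance, ohms

I_max = V / R_min          # theoretical short-circuit current
P_max = V ** 2 / R_min     # all of it heats the wiring itself
print(f"I_max = {I_max:.0f} A, P_max = {P_max:.0f} W")
```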
  15. Jan 2, 2009 #14
    Yes, but you will trip the circuit's over-current protection before you reach the theoretical maximum current: a 15 amp breaker on 14 AWG copper wire, for example.

    This is called a short circuit and without circuit over-current protection the conductors will be destroyed by the heat thus opening the circuit and perhaps burning down the house.
    Last edited: Jan 2, 2009
  16. Jan 2, 2009 #15
    In the first paragraph of your reply you mention "circuit protectors" and in the second you mention circuit "over-current protection". Is this the same thing?

    Thanks for the replies you've helped a lot.
  17. Jan 2, 2009 #16
    Yes, sorry to have been unclear.

    Circuit over-current protection is often circuit breakers or fuses. There are many different types and applications.

    I am very glad if I have helped, you are welcome.

    Edit: I fixed post 14
    Last edited: Jan 2, 2009
  18. Jan 2, 2009 #17



    Staff: Mentor

    BTW, a general comment on power transfer....

    Do not forget to include the source impedance in your calculations. True, for AC Mains power distribution, the source impedance is pretty low, but it is not negligible. For example, have you ever seen your lights flicker briefly as a high-current motor appliance starts up (like your garage compressor, or your workshop table saw)? That's due to source voltage droop at the high-current part of the motor startup.

    EDIT -- The Rsource I'm mentioning is a combination of the impedance of the power pole distribution transformer (low) plus the resistance of the wiring getting down into the home and out into the various branches in the home.
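    A minimal sketch of that droop, with an assumed source resistance (the real Rsource varies by installation): the voltage at the outlet sags by I times Rsource, which is why the lights dim when a big motor starts.

```python
# Voltage droop across a small but nonzero source impedance:
# V_outlet = V_open - I * R_source.
V_open = 120.0      # open-circuit mains voltage
R_source = 0.2      # assumed total source impedance, ohms

for I_draw in (1.0, 40.0):      # a lamp vs. a motor's startup inrush
    V_outlet = V_open - I_draw * R_source   # droop across R_source
    print(f"{I_draw:4.1f} A draw -> outlet at {V_outlet:5.1f} V")
```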
  19. Jan 2, 2009 #18
    Here is an example of the effects of a short circuit:

    I don't know if I can say where specifically, but a large research facility here in the south uses large motors (about 44,000 hp each). They are synchronous motors, and on shutdown an operator (not me, I don't work there) did not shut down the stator current. The result was, as the motor slowed and the back emf dropped to zero, the current through the stator tried to approach the theoretical limit. As you might imagine the source is large and the stator windings are (well... were) large. The supply transformer has over-current protection, but it was configured improperly, so it did not trip and was destroyed. The line from TVA is 161,000 volts; it was damaged as well before the substation providing the 161k shut down. Large holes were blown in the motor frame (several inches thick) and the windings were vaporized.

    Not your typical pinched extension cord, but yet the same in theory.
  20. Jan 2, 2009 #19


    Staff: Mentor

    You can actually use this fact to understand why a low resistance draws more power in a home circuit. The maximum power transfer is always obtained when the source impedance matches the load impedance. Voltage sources are inherently low impedance, so low load impedances result in high power for a voltage source. On the other hand, current sources are inherently high impedance, so high load impedances give high power for a current source.
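    The maximum-power-transfer theorem in the post above can be sketched by sweeping a few load values against an assumed source (10 V, 2 ohm, illustrative numbers):

```python
# Load power P = I^2 * R_load with I = V / (R_source + R_load)
# peaks when R_load equals R_source.
V = 10.0            # assumed source voltage
R_source = 2.0      # assumed source resistance, ohms

powers = {}
for R_load in (0.5, 1.0, 2.0, 4.0, 8.0):
    I = V / (R_source + R_load)       # series current
    powers[R_load] = I ** 2 * R_load  # power delivered to the load
best = max(powers, key=powers.get)
print(f"peak power {powers[best]:.2f} W at R_load = {best} ohms")
```

    Note that matching maximizes the power transferred, not the efficiency: at the match point, an equal amount of power is lost inside the source, which is why mains distribution deliberately keeps the source impedance far below the load impedance.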