Small wires burn up devices?

  1. Hello,
    I am pondering a topic and I can't quite find the answer I am looking for. It concerns amp draw. I understand Ohm's law and the relationships it entails. I am confused by the idea, which people state, that a device will draw more amps than it is supposed to and cause things to burn up. I understand the idea of putting too much power into something and burning it up, but I just want to make sure my thought pattern is correct for the context of "drawing too much current".
    If you have X amount of voltage and X amount of resistance, then you can only draw a certain amount of current, correct? Just because you have a smaller wire (which will increase the resistance) does not mean that you are pulling any more amps or volts; rather, you are decreasing them, right? So the wire will burn up, and not the device it is connected to?
    That was pretty mushed together because I am having a hard time explaining what I want to know.

    I guess I could just ask directly:
    1.) Can you burn up a device by using too small of a power cord? (other than a motor)
    2.) When considering power (P = IV), how can you possibly put less voltage onto something and have it draw way more current? For example, if something was rated 120 V 400 W, would it burn up if you hooked only 2 V to it? Would the current then be 200 amps instead of 10/3? That goes against my understanding of Ohm's law. (Rough numbers for both readings are sketched just below.)
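    To put rough numbers on that, here is a quick sketch of the two ways of reading a 120 V / 400 W rating. The 36-ohm figure below is only inferred from the rating by assuming the device behaves as a plain fixed resistor, which it may not:

```python
# Two incompatible readings of a "120 V, 400 W" rating at only 2 V.

V_rated = 120.0   # volts
P_rated = 400.0   # watts
V_low = 2.0       # volts actually applied

# Rated operating point
I_rated = P_rated / V_rated          # ~3.33 A, the "10/3 amps" above

# Reading 1: assume the device somehow holds its power at 400 W
I_const_power = P_rated / V_low      # 200 A -- the worrying number

# Reading 2: assume the device is a fixed resistor obeying Ohm's law
R = V_rated ** 2 / P_rated           # 36 ohms, inferred from the rating
I_const_R = V_low / R                # ~0.056 A
P_const_R = V_low * I_const_R        # ~0.11 W

print(I_rated, I_const_power, I_const_R, P_const_R)
```

    Neither reading is automatically the right one; which model applies depends on the device, which is what the replies below get into.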
     
  3. I am going to take a crack at your questions but hopefully someone else can give a better answer or even correct mine.

    1) Using a smaller wire will increase the resistance of that wire (modeling the wire as a simple resistor), as you have said. This will cause the wire to burn out before the device it is connected to does.

    2) If the load is one that draws constant power (meaning P is held constant), then by the equation you provided (P = IV), decreasing the voltage means the current must increase. This may cause the device to burn up.
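    As a rough sketch of that constant-power case (the 48 W figure is an arbitrary example, not from the thread):

```python
# A load that insists on drawing constant power pulls more current
# as its supply voltage falls (I = P / V).

P_load = 48.0  # watts -- arbitrary illustrative value

for V in (24.0, 12.0, 6.0, 3.0):
    I = P_load / V
    print(f"{V:4.0f} V -> {I:4.1f} A")
# 24 V -> 2 A, 12 V -> 4 A, 6 V -> 8 A, 3 V -> 16 A
```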
     
  4. davenn


    Hi there Jeff

    there are two main reasons why a device will draw more current than it's supposed to:

    1) you supply a higher voltage than it was designed for = more current draw - Ohm's Law
    2) there is a low-resistance / short-circuit fault in the device - this results in a higher current being drawn, even though the device is receiving the correct voltage


    Your point that a fixed voltage across a fixed resistance can only produce a certain current: correct.


    And yes, it is the undersized wire rather than the device that burns: you reach a point where the wire cannot carry the current required by the device and it burns up, acting like a fuse.

    That is how a fuse protects something - its diameter/makeup is designed so it can only carry a certain current.


    To your question 1): no - see the comments above.


    To your question 2): no, not for purely resistive loads (a light globe, a heater element etc.), but motors are an exception. When the voltage to a motor is lowered while it is under load, it will draw more current, heat up and burn out the wire windings.
    This is because motors present a much more complex load to the power supply (a crude numerical sketch of this follows below).
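    A very crude way to see that motor behaviour numerically - the winding resistance and back-EMF values below are made up purely for illustration:

```python
# Crude DC motor model: I = (V_supply - back_EMF) / R_winding.
# The winding resistance is small; the back-EMF generated while the
# motor spins is what normally keeps the current modest.

R_winding = 0.5  # ohms (assumed)

def motor_current(v_supply, back_emf):
    return (v_supply - back_emf) / R_winding

print(motor_current(12.0, 11.0))  # spinning freely at 12 V: 2 A
print(motor_current(9.0, 4.0))    # supply sagging, motor slowing: 10 A
print(motor_current(9.0, 0.0))    # stalled, no back-EMF: 18 A -- the windings cook
```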


    cheers
    Dave
     
  5. Undersized wires cause a voltage drop at the appliance under load. You could design a circuit that would burn up if the input voltage gets too low. One would hope that engineers do the opposite and consider what will happen to their design if the voltage drops.

    Undersized wires are a problem unto themselves. They don't need any reason to be avoided other than the fact that they are a fire hazard.
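    To put some numbers on the drop and the self-heating (the gauge, length and current below are assumptions for illustration only):

```python
# Voltage drop and heating in a supply lead: V_drop = I * R_wire,
# P_heat = I^2 * R_wire.  Ohms-per-foot values are approximate for
# copper at room temperature; load and length are made-up examples.

OHMS_PER_FOOT = {
    "AWG 18": 0.0064,
    "AWG 14": 0.0025,
    "AWG 10": 0.0010,
}

I_load = 15.0      # amps (assumed)
length_ft = 12.0   # one-way lead length; current flows out and back
round_trip = 2 * length_ft

for gauge, r_per_ft in OHMS_PER_FOOT.items():
    r_wire = r_per_ft * round_trip
    v_drop = I_load * r_wire
    p_heat = I_load ** 2 * r_wire
    print(f"{gauge}: {v_drop:.2f} V dropped, {p_heat:.1f} W heating the wire")
```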
     
  6. davenn


    Yes, that is correct, particularly where high currents are involved, as with my transceiver radio gear, where I'm hauling >20 amps at 13.8 V.
    BUT the voltage drop will not damage the transmitter; it will just mean the transmitter cannot produce full output power.

    As a result, I use a suitably sized conductor diameter that is still easily flexible, and I keep the lead as short as practical.

    The DC power lead that comes with the radio is usually ~6 - 8 ft long. Most of us cut that down to ~4 ft, and this alone can recover up to a 2 V voltage drop during transmit (rough numbers below).
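    The arithmetic behind that is just lead resistance times transmit current; the ohms-per-foot figure below is an assumed value, not a measurement of the actual lead:

```python
# Halving the DC lead length roughly halves its resistance and
# therefore the voltage drop at ~20 A transmit current.

r_per_ft = 0.0064  # ohms per foot of conductor (assumed)
I_tx = 20.0        # amps on transmit

for length_ft in (8.0, 4.0):
    round_trip = 2 * length_ft  # positive and negative conductors
    v_drop = I_tx * r_per_ft * round_trip
    print(f"{length_ft:.0f} ft lead: {v_drop:.2f} V drop at {I_tx:.0f} A")
# ~2.0 V at 8 ft, ~1.0 V at 4 ft
```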


    Dave
     
  7. Thanks, guys, for your help. The idea is a lot clearer now. I suppose that when something is labeled 120 V 400 W, that is more a ratio (and a rating, I am sure) than anything, right? It means that when you put 120 V on it you get 400 W of work? So if I connected 2 V to it instead, and if it even worked at all, would I only get 20/3 W?

    Thanks again!
     
  8. sophiecentaur


    That sort of simple scaling from the rating is only even roughly right if the device happens to be a heater (i.e. resistive and not operating at high temperature) - and even then the power goes as V²/R, so 2 V across the 36 Ω implied by a 120 V 400 W rating gives about 0.1 W, not 20/3 W. If a motor is supplied with very low volts, it will not manage to turn at all and will therefore produce no back EMF. It will consequently take much more current than the simple formula would suggest. Likewise, for a filament lamp, the low-temperature resistance may be 1/10 of the normal resistance when it is glowing. There are other devices that will take virtually no current from a low-voltage supply (very high resistance).
    Basically, only resistive devices follow Ohm's Law, and you should remember that Ohm's Law describes the behaviour at constant temperature. Ohm's Law is, strictly, not the 'definition' of resistance; it simply states that resistance is constant at constant temperature (subtle difference there!).
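    To make the filament point concrete (the resistances below are rough illustrative values, not measurements of any particular lamp):

```python
# A filament lamp is non-ohmic: its cold resistance is roughly 1/10
# of its hot (glowing) resistance, so a calculation based on the
# rating alone underestimates the current at very low voltage.

V_rated = 120.0
P_rated = 60.0                   # e.g. a 60 W lamp

R_hot = V_rated ** 2 / P_rated   # 240 ohms while glowing
R_cold = R_hot / 10              # ~24 ohms cold (the "1/10" above)

V_low = 2.0
I_from_rating = V_low / R_hot    # ~0.008 A -- what the hot resistance suggests
I_actual_cold = V_low / R_cold   # ~0.083 A -- filament stays cold, draws ~10x more
print(I_from_rating, I_actual_cold)
```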
     