Voltage Droop

  1. Jul 27, 2010 #1
    Dear Forum,

    Hello all, this is my first post. I'm trying to teach myself enough EE to design basic circuits and play with microcontrollers (my background is computers, programming, and, um, music). I've started reading The Art of Electronics and many Internet sources, but I seem to be missing something fundamental. I'm hoping that someone can point me in the right direction.

    If V = IR (Ohm's law), then why does the voltage from a non-ideal voltage source droop/sag when the current draw of the load increases? For example, as the fairly generic graph linked below shows:

    http://en.wikipedia.org/wiki/File:Droop_behaviour.png [Broken]

    Thank you very much in advance,
    Noob
     
    Last edited by a moderator: May 4, 2017
  2. Jul 27, 2010 #2
    Hello,

    Well, you need to understand series circuits and, in particular, voltage dividers.

    An ideal voltage source has zero output resistance in series with it.

    All real, non-ideal sources have some small but non-zero series resistance.

    Let us suppose we start with a real 100 volt source which has a 1 ohm series output resistance.
    We model this as an ideal 100 volt source in series with 1 ohm.
    Let us connect a variable load resistor to it, initially set to 99 ohms. The total resistance seen by the ideal source is then 100 ohms, so the current is 1 amp.
    99 volts appear across the load and 1 volt is dropped across the series resistor (V = IR in both cases).

    As we reduce the load resistance to increase the current draw, the ideal voltage stays at 100 volts and its series resistance stays at 1 ohm. When the load resistance has been reduced to, say, 49 ohms, the total resistance seen by the ideal source is 50 ohms and the current is 2 amps.
    So 2 volts are now dropped across the series 1 ohm resistance and 98 volts across the load resistance (again V = IR in both cases).

    Reduce the load resistance still further, to 1 ohm, and the current increases to 100/2 = 50 amps.
    50 volts are now dropped across the series resistor and only 50 volts across the load.

    So the voltage, as seen by the load, has drooped from 99 volts to 50 volts in this example.
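
    To make the arithmetic above concrete, here is a minimal Python sketch of the same voltage-divider model. (The function name load_voltage and the use of Python are just illustrative, not anything from the thread.)

        # Model a real source as an ideal source in series with a small resistance.
        def load_voltage(v_ideal, r_series, r_load):
            """Voltage seen by the load when an ideal source v_ideal with
            internal series resistance r_series drives a load r_load."""
            current = v_ideal / (r_series + r_load)  # Ohm's law around the whole loop
            return current * r_load                  # V = IR across the load

        # Reproduce the example: 100 V source with 1 ohm internal resistance.
        for r_load in (99, 49, 1):
            v = load_voltage(100, 1, r_load)
            print(f"R_load = {r_load:3d} ohm -> V_load = {v:.0f} V")
        # Prints 99 V, 98 V, and 50 V, matching the figures worked out above.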

    The values quoted are extreme, for demonstration. The series resistance of a typical power supply is measured in milliohms or even microohms.
     