Why Does a Voltage Source Droop/Sag?

SUMMARY

The discussion focuses on the phenomenon of voltage droop or sag in non-ideal voltage sources, specifically when the current draw increases. It explains that while an ideal voltage source maintains a constant voltage, real voltage sources have a finite output resistance that causes voltage drops under load. For instance, a 100V source with a 1-ohm series resistance demonstrates significant voltage drop as the load resistance decreases, illustrating Ohm's Law (V = IR) in action. This behavior is critical for understanding circuit design and the limitations of real-world power supplies.

PREREQUISITES
  • Understanding of Ohm's Law (V = IR)
  • Basic knowledge of series circuits
  • Familiarity with voltage dividers
  • Concept of output resistance in voltage sources
NEXT STEPS
  • Study the characteristics of ideal vs. non-ideal voltage sources
  • Learn about series resistance and its impact on circuit performance
  • Explore voltage divider circuits and their applications
  • Investigate power supply design and specifications
USEFUL FOR

Electronics enthusiasts, electrical engineering students, and circuit designers seeking to understand voltage behavior in practical applications and improve their circuit designs.

EENoob
Dear Forum,

Hello all, this is my first post. I'm trying to teach myself EE enough to design basic circuits and play with microcontrollers (my background is computers, programming, and, um, music). I've started reading The Art of Electronics and many Internet sources, but I seem to be missing something fundamental. I'm hoping that someone can point me in the right direction.

If V = IR (Ohm's law), then why does the voltage from a non-ideal voltage source droop/sag when the current draw of the load increases? For example, as the fairly generic graph linked below shows:

http://en.wikipedia.org/wiki/File:Droop_behaviour.png

Thank you very much in advance,
Noob
 
Hello,

If V = IR (Ohm's law), then why does the voltage from a non-ideal voltage source droop/sag when the current draw of the load increases?

Well, you need to understand series circuits and perhaps voltage dividers.

An ideal voltage source has zero output resistance in series with it.

All real, non-ideal sources have some small but non-zero series resistance.
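Written out as a formula (this is just a restatement of the model above, with Vs the ideal source voltage, Rout the series output resistance, and Rload the load resistance), the source and load form a voltage divider:

V_load = Vs × R_load / (R_out + R_load), and the droop is V_droop = I × R_out

So the more current the load draws, the more voltage is lost across R_out and the lower the terminal voltage seen by the load.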

Let us suppose we start with a real 100 volt source which has a 1 ohm series output resistance.
We model this as an ideal 100 volt source in series with 1 ohm.
Let us connect a variable resistor, set initially to 99 ohms, to it. The total resistance seen by the ideal source is then 100 ohms, so the current is 1 amp.
99 volts appear across the load and 1 volt is dropped across the series resistor (V = IR in both cases).

As we reduce the load resistor to increase the current draw, the ideal voltage stays at 100 volts and the series resistance stays at 1 ohm. When the load resistance has been reduced to, say, 49 ohms, the total resistance seen by the ideal source is now 50 ohms and the current is 2 amps.
So 2 volts is now dropped across the series 1 ohm resistance and 98 volts across the load resistance (again V = IR in both cases).

Reduce the load resistance still further to 1 ohm and the current increases to 100/2 = 50 amps.
50 volts is now dropped across the series resistor and only 50 volts across the load.

So the voltage, as seen by the load, has drooped from 99 volts to 50 volts in this example.
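If it helps to replay the arithmetic, here is a minimal Python sketch (not from the original thread; the values and variable names simply follow the example above) that models the non-ideal source as an ideal source in series with an output resistance and sweeps the load:

```python
# Model a non-ideal source as an ideal voltage source V_S in series with
# an output resistance R_OUT, then sweep the load resistance to watch the
# terminal voltage droop.

V_S = 100.0    # ideal (open-circuit) source voltage, volts
R_OUT = 1.0    # series output resistance, ohms

for r_load in (99.0, 49.0, 1.0):          # load values used in the example above
    current = V_S / (R_OUT + r_load)      # Ohm's law for the whole series loop
    v_load = current * r_load             # voltage actually seen by the load
    v_drop = current * R_OUT              # voltage lost inside the source
    print(f"R_load = {r_load:5.1f} ohm -> I = {current:5.1f} A, "
          f"V_load = {v_load:5.1f} V, dropped internally = {v_drop:5.1f} V")
```

Running it reproduces the 99 V, 98 V, 50 V sequence described above.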

The values quoted here are extreme, for the sake of demonstration. The series resistance of a real power supply is normally measured in milliohms or even microohms.
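For a sense of scale (a back-of-the-envelope figure, not one quoted in the thread): a supply with a 5 milliohm output resistance delivering 10 amps droops only V = IR = 10 × 0.005 = 0.05 volts, which is why the effect usually goes unnoticed until the current is large or the series resistance is unusually high.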
 
