Does Using a Chip in a DC Voltage Supply Reduce Average Voltage Significantly?

  • Context: Undergrad
  • Thread starter: PsychonautQQ
  • Tags: DC Supply Voltage
SUMMARY

The discussion centers on the impact of using a chip, specifically a linear voltage regulator, in a DC voltage supply circuit on the average output voltage. It is established that typical linear regulators require an input voltage 2-3V higher than the desired output voltage, known as the "dropout voltage." Low dropout (LDO) regulators are available that require only 0.5 to 0.7V above the output. Additionally, using a higher input voltage can lead to increased power dissipation in the regulator, necessitating careful consideration of thermal management.

PREREQUISITES
  • Understanding of linear voltage regulators and dropout voltage specifications
  • Familiarity with transformer and bridge rectifier functions
  • Knowledge of power dissipation calculations in electronic circuits
  • Experience with capacitor functions in smoothing DC voltage
NEXT STEPS
  • Research "Linear Voltage Regulator specifications" for detailed dropout voltage requirements
  • Explore "Low Dropout Regulator (LDO) advantages and applications"
  • Investigate "Power dissipation in voltage regulators and thermal management techniques"
  • Learn about "Capacitor sizing for DC voltage smoothing in power supplies"
USEFUL FOR

Electronics engineers, hobbyists designing power supply circuits, and anyone involved in optimizing voltage regulation in DC power systems.

PsychonautQQ
A DC voltage supply that starts with a transformer, into a bridge rectifier, into a big capacitor, into some sort of chip, into a smaller capacitor. Since y'all are smart I'm hoping you know what I'm talking about with this crappy description. I was wondering: does running the signal through the chip to make the output more steady cause a big drop in the average voltage?
 
PsychonautQQ said:
A DC voltage supply that starts with a transformer, into a bridge rectifier, into a big capacitor, into some sort of chip, into a smaller capacitor. Since y'all are smart I'm hoping you know what I'm talking about with this crappy description. I was wondering: does running the signal through the chip to make the output more steady cause a big drop in the average voltage?

Linear Voltage Regulator -- http://en.wikipedia.org/wiki/Linear_regulator

 
PsychonautQQ said:
A DC voltage supply that starts with a transformer, into a bridge rectifier, into a big capacitor, into some sort of chip, into a smaller capacitor. Since y'all are smart I'm hoping you know what I'm talking about with this crappy description. I was wondering: does running the signal through the chip to make the output more steady cause a big drop in the average voltage?

Depends on the regulator.

Typical cheap linear regulators need the input to be 2-3V higher than the output, so a 5V regulator usually needs around 8V on its input. You have to check the specification for your particular regulator to be certain what is required. The relevant parameter is usually called the "dropout voltage", although it may go by different names on different data sheets.
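As a rough sketch of that relationship (the function name and the 3V dropout figure are illustrative, not from any particular datasheet):

```python
# Minimum input voltage for a linear regulator:
# V_in(min) = V_out + V_dropout, where V_dropout comes from the datasheet.
def min_input_voltage(v_out, v_dropout):
    """Smallest input voltage at which the regulator stays in regulation."""
    return v_out + v_dropout

# A 5 V regulator with a worst-case 3 V dropout needs about 8 V at its input.
print(min_input_voltage(5.0, 3.0))  # 8.0
```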

This common regulator family needs the input to be 2V higher than the output...
http://www.fairchildsemi.com/ds/LM/LM7805.pdf

You can buy "low-dropout" (LDO) versions of many linear regulators. For example, this one has the same pinout as the 780X series but only needs the input to be 0.5 to 0.7V higher than the output.

http://www.ti.com/lit/ds/symlink/lm2940-n.pdf

I haven't checked to see if other parameters are different.

In some cases you may choose to use a higher input voltage than the data sheet says you need. For example, in the event of a very brief power cut the input capacitor will continue to supply current to the regulator. This means the regulator circuit will keep working until the input capacitor voltage falls below the output voltage plus the dropout voltage. By that time the power may have returned, meaning your circuit rides through the brief power cut unaffected.
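You can estimate that hold-up time with a simple sketch, assuming the load draws a roughly constant current (all component values below are made-up illustrative numbers):

```python
# Rough hold-up time of the input capacitor during a brief power cut.
# The capacitor sags until it reaches V_out + V_dropout: t = C * dV / I.
def holdup_time(c_farads, v_start, v_out, v_dropout, i_load):
    """Seconds until the regulator drops out, assuming constant load current."""
    dv = v_start - (v_out + v_dropout)
    return c_farads * dv / i_load

# 4700 uF charged to 9 V, feeding a 5 V regulator with 2 V dropout, 0.5 A load:
t = holdup_time(4700e-6, 9.0, 5.0, 2.0, 0.5)
print(round(t * 1000, 1), "ms")  # ≈ 18.8 ms
```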

However, you shouldn't just use a higher input voltage without working out what effect that has on the power dissipated in the regulator. For example, suppose your circuit draws 1A at 5V and you choose a regulator that drops 1V. You could feed it 6V, in which case the power dissipated in the regulator will be 1A * 1V = 1W. If you crank the input voltage up to, say, 5V + 3V = 8V, then the power dissipated will be 3W and a bigger heatsink will be needed (or even a different regulator).
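The arithmetic above is just the headroom times the load current, which is easy to check:

```python
# Power dissipated in a linear regulator: P = (V_in - V_out) * I_load.
def regulator_dissipation(v_in, v_out, i_load):
    """Watts burned in the pass element for a given input/output/load."""
    return (v_in - v_out) * i_load

print(regulator_dissipation(6.0, 5.0, 1.0))  # 1.0 W  (6 V in, 5 V out, 1 A)
print(regulator_dissipation(8.0, 5.0, 1.0))  # 3.0 W  (8 V in, same load)
```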

Remember that the output voltage of the transformer, and hence of the bridge rectifier, won't always be exactly the voltage specified; they have a manufacturing tolerance, and the mains voltage also varies from place to place. So it's normal to design the circuit with a slightly higher voltage at the regulator input to allow for these factors.
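A quick worst-case check ties these ideas together (the 10% sag and the 8 V nominal input are assumed figures for illustration):

```python
# Worst-case check: does the regulator stay in regulation if the mains
# (and hence the transformer output) sags by some fraction?
def worst_case_input(v_nominal, sag_fraction):
    """Lowest expected input voltage given a fractional supply sag."""
    return v_nominal * (1.0 - sag_fraction)

v_min = worst_case_input(8.0, 0.10)      # 8 V nominal, 10% sag -> 7.2 V
still_regulating = v_min >= 5.0 + 2.0    # needs V_out + V_dropout = 7 V
print(v_min, still_regulating)
```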

Lots to consider eh?
 

Similar threads

  • · Replies 8 ·
Replies
8
Views
3K
  • · Replies 7 ·
Replies
7
Views
4K
  • · Replies 63 ·
3
Replies
63
Views
8K
  • · Replies 10 ·
Replies
10
Views
5K
Replies
1
Views
2K
  • · Replies 2 ·
Replies
2
Views
2K
  • · Replies 7 ·
Replies
7
Views
5K
  • · Replies 8 ·
Replies
8
Views
3K
  • · Replies 7 ·
Replies
7
Views
2K
  • · Replies 6 ·
Replies
6
Views
3K