Does Using a Chip in a DC Voltage Supply Reduce Average Voltage Significantly?

In summary, the conversation discusses using a linear voltage regulator to smooth and regulate the output of a DC voltage supply. A typical linear regulator needs its input to be 2-3 V higher than its output, although low-dropout (LDO) versions need much less headroom. Using a higher input voltage than necessary increases power dissipation in the regulator and may require a larger heatsink. Manufacturing tolerances and variations in mains voltage should also be considered when choosing the input voltage to the regulator.
  • #1
PsychonautQQ
A DC voltage supply that starts with a transformer, goes into a bridge rectifier, then a big capacitor, then some sort of chip, then a smaller capacitor. Since y'all are smart, I'm hoping you know what I'm talking about despite this crappy description. I was wondering: does running the signal through the chip to make the output more steady cause a big drop in the average voltage?
 
  • #2
PsychonautQQ said:
A DC voltage supply that starts with a transformer, goes into a bridge rectifier, then a big capacitor, then some sort of chip, then a smaller capacitor. Since y'all are smart, I'm hoping you know what I'm talking about despite this crappy description. I was wondering: does running the signal through the chip to make the output more steady cause a big drop in the average voltage?

Linear Voltage Regulator -- http://en.wikipedia.org/wiki/Linear_regulator

 
  • #3
PsychonautQQ said:
A DC voltage supply that starts with a transformer, goes into a bridge rectifier, then a big capacitor, then some sort of chip, then a smaller capacitor. Since y'all are smart, I'm hoping you know what I'm talking about despite this crappy description. I was wondering: does running the signal through the chip to make the output more steady cause a big drop in the average voltage?

Depends on the regulator.

Typical cheap linear regulators need the input to be 2-3 V higher than the output, so a 5 V regulator usually needs around 8 V at its input. Check the regulator's datasheet to be certain what is required; the relevant figure is usually called the "dropout voltage", although it may go by different names on different datasheets.
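As a small illustration (my own sketch in Python, not from the posts above), the headroom condition boils down to a one-line check:

```python
# A quick headroom check: a linear regulator only regulates while
# Vin >= Vout + Vdropout.

def regulator_has_headroom(v_in, v_out, v_dropout):
    """True if the input voltage clears the regulator's dropout requirement."""
    return v_in >= v_out + v_dropout

# Example with 78xx-class figures: 5 V output, ~2 V dropout
print(regulator_has_headroom(8.0, 5.0, 2.0))  # True  -> regulates
print(regulator_has_headroom(6.0, 5.0, 2.0))  # False -> output sags out of regulation
```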

This common regulator family needs the input to be 2 V higher than the output...
http://www.fairchildsemi.com/ds/LM/LM7805.pdf

You can buy low-dropout (LDO) versions of many linear regulators. For example, this one has the same pinout as the 780X series but only needs the input to be 0.5 to 0.7 V higher than the output.

http://www.ti.com/lit/ds/symlink/lm2940-n.pdf

I haven't checked to see if other parameters are different.

In some cases you may choose to use a higher input voltage than the datasheet says you need. For example, in the event of a very brief power cut, the input capacitor will continue to supply current to the regulator. This means the regulator circuit will keep working until the input capacitor voltage falls below the output voltage plus the dropout voltage. By that time the power might have returned, meaning your circuit rides through the brief power cut unaffected.
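Here is a minimal sketch of that ride-through idea, assuming a roughly constant load current so the capacitor voltage falls linearly (the component values are just examples, not from the thread):

```python
# Ride-through estimate, assuming the load draws a roughly constant current
# so the capacitor voltage falls linearly: dV/dt = I/C.

def holdup_time_s(c_farads, v_start, v_out, v_dropout, i_load):
    """Seconds until the input cap sags from v_start down to v_out + v_dropout."""
    usable_dv = v_start - (v_out + v_dropout)
    if usable_dv <= 0:
        return 0.0
    return c_farads * usable_dv / i_load  # t = C * dV / I

# Example (assumed values): 4700 uF charged to 9 V, 5 V out, 2 V dropout, 0.5 A load
print(holdup_time_s(4700e-6, 9.0, 5.0, 2.0, 0.5))  # ~0.019 s, about one mains cycle
```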

However, you shouldn't just use a higher input voltage without working out what effect that has on the power dissipation in the regulator. For example, suppose your circuit draws 1 A at 5 V and you choose a regulator that drops 1 V. You could feed it 6 V, in which case the power dissipated in the regulator will be 1 A × 1 V = 1 W. If you crank the input voltage up to, say, 5 V + 3 V = 8 V, then the power dissipated will be 3 W and a bigger heatsink will be needed (or even a different regulator).
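The same worked example as a couple of lines of Python, using the usual linear-regulator dissipation estimate P = I_load × (V_in − V_out):

```python
# The worked example above: a linear regulator dissipates roughly
# P = I_load * (V_in - V_out) as heat.

def dissipation_w(v_in, v_out, i_load):
    return i_load * (v_in - v_out)

print(dissipation_w(6.0, 5.0, 1.0))  # 1.0 W -> small heatsink
print(dissipation_w(8.0, 5.0, 1.0))  # 3.0 W -> considerably bigger heatsink
```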

Remember that the output voltage of the transformer, and hence of the bridge rectifier, won't always be exactly the voltage specified: transformers have a manufacturing tolerance, and the mains voltage also varies from place to place. So it's normal to design the circuit with a slightly higher voltage at the regulator input to allow for these factors.
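A rough worst-case estimate along those lines, using standard rule-of-thumb formulas (the component values and tolerances below are assumptions, not from the thread):

```python
# Rule-of-thumb worst case: peak DC after a bridge is about Vrms*sqrt(2) - 2*Vdiode,
# and the cap sags by roughly I / (2 * f_mains * C) between peaks (full-wave ripple).

import math

def min_dc_bus_v(v_sec_rms, mains_tol, v_diode, i_load, f_mains, c_farads):
    v_sec_low = v_sec_rms * (1.0 - mains_tol)           # low-mains / low-tolerance case
    v_peak = v_sec_low * math.sqrt(2) - 2 * v_diode     # bridge: two diode drops
    v_ripple = i_load / (2 * f_mains * c_farads)        # ripple trough below the peak
    return v_peak - v_ripple

# Example (assumed values): 9 Vrms secondary, -10% mains, 0.7 V diodes, 1 A, 50 Hz, 4700 uF
print(min_dc_bus_v(9.0, 0.10, 0.7, 1.0, 50.0, 4700e-6))  # ~7.9 V, barely enough for 5 V out + 2 V dropout
```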

Lots to consider eh?
 

What is a DC voltage supply?

A DC voltage supply is a device that converts an input voltage (typically AC mains) into a constant DC output voltage. It is commonly used in electronic circuits to provide a stable and reliable source of power.

What components are needed to create a DC voltage supply?

The basic components needed to create a DC voltage supply include a transformer, diode bridge, capacitor, voltage regulator, and output capacitor. Additional components may be necessary depending on the specific design and application.

How does a DC voltage supply work?

A DC voltage supply works by converting AC input voltage into a DC output voltage. This is achieved through a series of steps, including transforming the input voltage, rectifying it to convert AC to DC, filtering the output to smooth out any fluctuations, and regulating the voltage to maintain a constant level.
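As an illustrative toy model (idealized, with assumed component values), those steps can be walked through numerically:

```python
# Idealized, step-by-step toy model of the chain: transformer -> bridge rectifier
# -> filter capacitor -> linear regulator. All component values are assumptions.

import math

V_SEC_RMS = 9.0       # transformer secondary voltage (assumed)
V_DIODE   = 0.7       # per-diode drop in the bridge (assumed)
C_FILTER  = 4700e-6   # filter ("big") capacitor
I_LOAD    = 0.5       # load current, assumed constant
V_OUT     = 5.0       # regulator target output
V_DROPOUT = 2.0       # regulator dropout (78xx-class figure)
F_MAINS   = 50.0
DT        = 1e-5      # simulation time step (s)

v_cap = 0.0
lowest_output = float("inf")
for step in range(int(0.1 / DT)):                 # simulate 100 ms
    t = step * DT
    # full-wave rectified secondary minus two diode drops
    v_rect = abs(V_SEC_RMS * math.sqrt(2) * math.sin(2 * math.pi * F_MAINS * t)) - 2 * V_DIODE
    if v_rect > v_cap:
        v_cap = v_rect                            # diodes conduct: cap tracks the peak
    else:
        v_cap = max(v_cap - (I_LOAD / C_FILTER) * DT, 0.0)  # cap discharges into the load
    # regulator: holds V_OUT while it has headroom, otherwise the output sags
    v_reg = V_OUT if v_cap >= V_OUT + V_DROPOUT else max(v_cap - V_DROPOUT, 0.0)
    if t > 0.05:                                  # ignore the start-up transient
        lowest_output = min(lowest_output, v_reg)

print(f"lowest regulated output after start-up: {lowest_output:.2f} V")  # 5.00 V here
```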

What is the difference between a regulated and unregulated DC voltage supply?

A regulated DC voltage supply includes a voltage regulator that ensures the output voltage remains constant, regardless of any variations in the input voltage or load. An unregulated DC voltage supply does not have a voltage regulator and therefore the output voltage may vary with changes in the input or load.

What are some common applications of DC voltage supplies?

DC voltage supplies are used in a wide range of electronic devices and equipment, including computers, cell phones, power tools, and household appliances. They are also commonly used in industrial and commercial settings for powering control systems, motors, and sensors.
