
Creating a DC Voltage Supply

  1. Nov 7, 2013 #1
    A DC voltage supply that starts with a transformer, into a bridge rectifier, into a big capacitor, into some sort of chip, into a smaller capacitor. Since y'all are smart, I'm hoping you know what I'm talking about from this crappy description. I was wondering: does running the signal through the chip to make the output steadier cause a big drop in the average voltage?
  3. Nov 7, 2013 #2



    Staff: Mentor

    Linear Voltage Regulator -- http://en.wikipedia.org/wiki/Linear_regulator

  4. Nov 8, 2013 #3


    Science Advisor
    Homework Helper
    Gold Member

    Depends on the regulator.

    Typical cheap linear regulators need the input to be 2-3V higher than the output. So a 5V regulator usually needs around 8V on its input. You have to look at the specification for the regulator to be certain what is required. Usually this is the "dropout voltage" parameter, although it may have different names on different data sheets.

    This common regulator family needs the input to be 2V higher than the output...

    You can buy "Low drop out" versions of many linear regulators. For example this one has the same pin out as the 780X series but only needs the input to be 0.5 to 0.7V higher than the output.
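    To make the dropout figure concrete, here's a minimal sketch in Python of the minimum-input-voltage check. The dropout values used are illustrative assumptions, not numbers from any particular datasheet:

```python
# Minimal sketch: minimum input voltage for a linear regulator.
# The dropout figures below are illustrative assumptions, not values
# taken from any particular datasheet.

def min_input_voltage(v_out, v_dropout):
    """Lowest input voltage at which the regulator still regulates."""
    return v_out + v_dropout

# Typical ~2 V dropout vs. a low-dropout (~0.6 V) part, both for 5 V out:
print(min_input_voltage(5.0, 2.0))  # -> 7.0
print(min_input_voltage(5.0, 0.6))  # ~ 5.6
```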


    I haven't checked to see if other parameters are different.

    In some cases you may choose to use a higher input voltage than the data sheet says you need. For example in the event of a very brief power cut the input capacitor will continue to supply current to the regulator. This means the regulator circuit will continue to work until the input capacitor voltage falls below the output voltage plus the drop out voltage. By that time the power might have returned meaning your circuit continues to work unaffected by the brief power cut.
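    The hold-up idea above boils down to t = C * (V_start - V_min) / I_load for a roughly constant load current. A small illustrative calculation (component values are assumptions, not a recommendation):

```python
# Sketch of the hold-up calculation: t = C * (V_start - V_min) / I_load,
# assuming the regulator draws a roughly constant current.
# Component values below are illustrative assumptions.

def holdup_time(c_farads, v_start, v_min, i_load):
    """Seconds until the input capacitor sags from v_start to v_min."""
    return c_farads * (v_start - v_min) / i_load

# 4700 uF cap charged to 9 V, regulator needs 5 V out + 2 V dropout = 7 V,
# load draws 0.5 A:
t = holdup_time(4700e-6, 9.0, 7.0, 0.5)
print(f"{t * 1000:.1f} ms")  # ~ 18.8 ms
```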

    However, you shouldn't just use a higher input voltage without working out what effect that has on the power dissipation in the regulator... For example, suppose your circuit draws 1A at 5V and you choose a regulator that drops 1V. You could feed it 6V, in which case the power dissipated in the regulator will be 1A * 1V = 1W. If you crank up the input voltage to, say, 5V + 3V = 8V, then the power dissipated will be 3W and a bigger heatsink will be needed (or even a different regulator).
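    The dissipation arithmetic from that paragraph, written out as a quick sketch:

```python
# P = (V_in - V_out) * I_load for a linear regulator: everything dropped
# across the pass element is burned as heat.

def regulator_dissipation(v_in, v_out, i_load):
    """Power (watts) the linear regulator must dump as heat."""
    return (v_in - v_out) * i_load

print(regulator_dissipation(6.0, 5.0, 1.0))  # -> 1.0 (W)
print(regulator_dissipation(8.0, 5.0, 1.0))  # -> 3.0 (W)
```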

    Remember that the output voltage of the transformer, and hence of the bridge rectifier, won't always be exactly the voltage specified. They have a manufacturing tolerance. The mains voltage also varies from place to place. So it's normal to design the circuit to have a slightly higher voltage on the input to the regulator to allow for these factors.
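    A rough worst-case margin check along those lines. The tolerance figures here are illustrative assumptions, not taken from any standard or datasheet:

```python
# Rough worst-case margin check with assumed tolerances.
v_nominal = 9.0     # nominal DC at the regulator input
mains_tol = 0.10    # assume mains can sag 10% below nominal
xfmr_tol = 0.05     # assume 5% transformer winding tolerance

v_worst = v_nominal * (1 - mains_tol) * (1 - xfmr_tol)
v_required = 5.0 + 2.0  # 5 V output plus an assumed 2 V dropout

print(round(v_worst, 3), v_worst >= v_required)  # 7.695 True
```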

    Lots to consider eh?
