
Simple LED problem.

  1. Oct 16, 2009 #1
    Hi,

    I am trying to link up a series of LEDs and am having a bit of trouble calculating the power supply needed. I am actually doing this for a number of applications and wondered if anyone had a system for working this out?

    Thanks,
    John
     
  3. Oct 16, 2009 #2
    LEDs in parallel? 12 volts always worked well for me. Don't forget the current-limiting resistor of appropriate wattage. Usually around 1500 Ω worked well for current limiting on 12 VDC.

    To be safe, figure the current draw of the LEDs and then double it for the power supply rating.

    P.S. I have seen people use LEDs on 110 VAC; they will "self-rectify", but you still need the appropriate current-limiting resistor, and PLEASE consider safety around 110 V.

    Is this for an experiment?

    Perhaps some formulas will pop up; these are the "rules of thumb" I have used for decades.
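    If it helps, here is a quick sanity check of those rules of thumb in Python. The 2 V forward drop is just an assumed typical red-LED figure, not from any particular datasheet, so substitute your own part's numbers:

        # Rule-of-thumb check: one LED on 12 VDC with a ~1.5 kOhm limiting resistor.
        supply_v = 12.0    # supply voltage (V)
        led_vf = 2.0       # assumed forward drop of a typical red LED (V)
        r_limit = 1500.0   # suggested current-limiting resistor (Ohms)

        i_led = (supply_v - led_vf) / r_limit            # LED current (A)
        p_resistor = (supply_v - led_vf) ** 2 / r_limit  # resistor dissipation (W)

        print(f"LED current: {i_led * 1000:.1f} mA")                # ~6.7 mA, conservative
        print(f"Resistor dissipation: {p_resistor * 1000:.0f} mW")  # ~67 mW, a 1/4 W part is fine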
     
  4. Oct 16, 2009 #3
    Thanks for your post. Let me offer an example: I am linking 10 LEDs in series (I think; they are one after the other).

    Reverse Current VR = 5V
    Power dissipation 120 mW
    DC Forward Current 30 mA
    Peak Forward Current [1] 100 mA
    Reverse Voltage 5 V

    Which of these pieces of information would I use to calculate the required power?
     
  5. Oct 16, 2009 #4
    Maybe cheating, but try one of the many LED calculators available online, e.g. http://led.linear1.org/1led.wiz (link now broken).
     
  6. Oct 16, 2009 #5

    berkeman (Staff: Mentor)

    Your first line of "reverse current" looks to be a typo. There is generally no reverse current spec for LEDs, just a maximum reverse voltage before they pop.

    Also, you generally do not put LEDs in series unless they are well matched. Otherwise the difference in brightness can be seen. If they are well matched, then you should be able to series-connect them.

    What is the typical forward voltage specified when you are running them at 30 mA? Probably around 2 V if they are red LEDs, and a bit higher for other colors. You will choose your power supply voltage and series current-limiting resistor to match up with the forward voltage drops and the target forward current.

    For example, 10 red LEDs would require a total of about 20V at 30mA (series connection). So if you had a 24V power supply, you would have 4V left over to drop across the current-setting resistor. 4V/30mA = 133 Ohms, which just happens to be a standard 1% resistor value.
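    A minimal sketch of that arithmetic, in case a script is handy (the 2 V per LED is the assumed red-LED figure above; use the forward voltage from your own datasheet):

        # Series string: pick a supply, then size the current-setting resistor.
        n_leds = 10
        led_vf = 2.0       # assumed forward voltage per red LED (V)
        i_target = 0.030   # target forward current (A)
        supply_v = 24.0    # chosen supply voltage (V)

        v_string = n_leds * led_vf        # total LED drop: 20 V
        v_resistor = supply_v - v_string  # left over for the resistor: 4 V
        r_series = v_resistor / i_target  # 4 V / 30 mA = 133 Ohms

        print(f"String drop: {v_string:.0f} V")
        print(f"Series resistor: {r_series:.0f} Ohms")
        print(f"Resistor dissipation: {v_resistor * i_target * 1000:.0f} mW")  # 120 mW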
     
  7. Oct 16, 2009 #6
    First, find out the operating voltage of the LEDs. Surely this must be given as part of the manufacturer's data. Let's say it's 1.6 volts.
    Now multiply the 1.6 by however many LEDs you want. If it's 10 LEDs, they'll take 16 volts to get 'em running.
    Now, say you've got a 30 volt supply. That leaves 14 V for the resistor, so R = V/I = 14 V / 30 mA ≈ 467 Ω. If you put that value of R in series with the LEDs they'll get their 30 mA.
    Let me mention what happens if you have only 17 volts to play with. In that case the system becomes very sensitive to the exact value of R, and also to the exact characteristics of the LEDs.

    As for power, use current x voltage.
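    As a rough illustration of that power estimate (all numbers assumed for the example: 10 LEDs at 1.6 V each, a 30 V supply, 30 mA):

        # Total supply power = supply voltage x string current, split between resistor and LEDs.
        n_leds = 10
        led_vf = 1.6       # assumed operating voltage per LED (V)
        supply_v = 30.0    # supply voltage (V)
        i_target = 0.030   # target current (A)

        v_leds = n_leds * led_vf          # 16 V across the LEDs
        v_resistor = supply_v - v_leds    # 14 V across the resistor
        r_series = v_resistor / i_target  # ~467 Ohms

        p_supply = supply_v * i_target      # 0.90 W drawn from the supply
        p_resistor = v_resistor * i_target  # 0.42 W wasted in the resistor
        p_leds = v_leds * i_target          # 0.48 W dissipated across the LEDs

        print(f"R = {r_series:.0f} Ohms, supply power = {p_supply:.2f} W "
              f"(resistor {p_resistor:.2f} W, LEDs {p_leds:.2f} W)")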
     