Determining supply current/voltage to LED

  1. Bearing in mind safety margins, how much current and voltage should I supply to an LED if it's rated as follows:

    Forward current max: 25mA
    Forward voltage max: 2.5V
    Reverse voltage max: 5V
    Light output min.@ 10mA: 70mcd
    Light output typ.@ 10mA: 200mcd
     
  2. jcsd
  3. Mapes

    Looks like <25mA current and <2.5V voltage.
     
  4. berkeman

    In addition to what Mapes said, in order to limit the current to 25mA, you need to know the *minimum* Vf of the LED.

    Quiz Question for hl_world -- Why is this so?
     
  5. Hmm.. I think this was in the analogue electronics part of my course but I left after 6 months and that was a few years ago.

    So I should try 20 mA & 2.0 V then? Like this:
    [image: my proposed circuit, using a voltage divider around the LED]
     
  6. Redbelly98

    It's common practice to do this with just one resistor, in series with the diode:

    [image: LED with a single resistor in series]

    EDIT:
    "Answer me these questions three..."

    Assume the LED voltage is 2 V. What's the voltage across the resistor?
    Assume the LED current is 25 mA. What's the current through the resistor?
    Given the voltage and current from the first two questions, what must the resistance of the resistor be?
     
    I don't understand. I used the voltage divider to ensure 2 V between the two nodes on either side of the LED. Does the LED just do that for me by having a resistance of its own?

    1) 1 V (3 V source minus 2 V across the LED)
    2) 25 mA (same throughout a series circuit)
    3) R = V/I, so 1 V / 0.025 A = 40 Ω

    (To the best of my understanding)
     
  8. Redbelly98

    Yes, that's right. Once you set up the circuit, you can measure the power supply and LED voltages, and adjust the resistance if needed. But 40Ω is a good safe starting value.

    Yes, the LED is essentially guaranteed to have 2 V (or could be as high as 2.5V) -- as long as it is not connected directly to a fixed voltage source, and it draws some minimal amount of current.

    Looking at your voltage divider circuit, here is an observation: because of the 1kΩ resistor, the 3V supply will not produce more than 3 mA of current. So the LED will get less than 3 mA of current.
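    If it helps, here is a minimal Python sketch of that sizing step, with illustrative numbers only (the lower Vf values are assumptions, not from your datasheet). It also hints at berkeman's quiz: the lower the real Vf, the more voltage is left across the resistor, so the *minimum* Vf sets the worst-case current.

    [code]
# Minimal sketch, illustrative numbers (Vf values below are assumptions,
# not taken from the datasheet).

V_SUPPLY = 3.0      # supply voltage in volts, as in this thread
I_TARGET = 0.025    # desired LED current in amps (the 25 mA max rating)

def series_resistor(v_supply, v_led, i_led):
    """Resistor needed so the LED carries i_led when it drops v_led."""
    return (v_supply - v_led) / i_led

# Size the resistor assuming a 2.0 V LED drop, as in the posts above.
r = series_resistor(V_SUPPLY, 2.0, I_TARGET)
print(f"Series resistor for Vf = 2.0 V: {r:.0f} ohms")   # -> 40 ohms

# Check what current actually flows if the real Vf is different.
for vf in (1.8, 2.0, 2.2, 2.5):
    i = (V_SUPPLY - vf) / r
    print(f"Vf = {vf:.1f} V -> current = {1000 * i:.1f} mA")
    [/code]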
     
  9. [​IMG]

    Here, I have labeled the different points in the circuit (except switch & -/+ nodes) for reference and reduced the 100Ω resistor to 40Ω.

    Now, one of the things I don't get about voltage & current is that between points B & D there is a 2 V supply. I'd expect this to push much more of the current through the 40Ω resistor than through the 2kΩ.

    Using the MAD rule, (2000 × 40)/(2000 + 40) = 39.216 Ω. The current should then be 2 V / 39.216 Ω = 51 mA, which divides between the resistors before recombining at the negative node. So the current that flows through the 40Ω & LED should be 0.051 × (2000/(40 + 2000)) = 50 mA.
    I know I've done something wrong here.
     
  10. Redbelly98

    That would be true if there were only 40Ω in parallel with the 2kΩ. However, it is an LED+40Ω series combination in parallel with the 2kΩ. We don't know what the effective resistance of the LED+40Ω is, so we can't say that more current flows through that path.

    Also, thinking of this as a voltage divider: the resistance of the lower section must be less than 2kΩ, due to the LED+40Ω that is in parallel with the 2kΩ resistor. That would mean VBD is less than 2V.

    Another observation: if any appreciable current does flow through the diode, it will have close to 2 V across it, which means close to 2 V between C and D. But there is also 2 V (or close to it) between B and D. Therefore only a very small voltage appears between B and C. Just how small we don't really know, but if the circuit were actually built one could measure VBC and divide it by 40Ω to get the actual current in that path.

    One cannot ignore the effect of the LED on the LED+40Ω branch of the circuit.

    Hope that helps clear things up :smile:. If not, keep posting. You have a pretty good grasp of the basics, so that helps a lot in composing answers to your questions.
     
  11. Redbelly98

    Just thought of a simplified explanation for what's going on in this circuit.

    Assume an idealized 2.0V LED:
    i = 0 for V < 2.0 V
    V = 2.0 V for i > 0​

    If ANY current flows through the LED, it will be at 2.0V. That puts point C at 2.0V above ground ( ground is point D or E).

    The voltage divider will tend to put point B at 2.0V above ground also. That would put 0V across the 40Ω resistor (points B and C at the same potential), hence zero current through the 40Ω and LED (path BCD).

    With zero current going through path BCD, all current must flow straight through the 1kΩ and 2kΩ resistors. This current is
    i = 3V / (1+2)kΩ = 1 mA​

    So we have:
    A at 3V
    B & C at 2V
    D & E at 0V

    0 mA through path BCD
    1 mA through path AE
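
    A quick sanity check of those numbers, as a small Python sketch (it assumes the idealized LED above, i.e. that the LED simply clamps node C at 2.0 V whenever it conducts):

    [code]
# Quick sanity check, assuming the idealized LED clamps node C at 2.0 V.

R1, R2, R3 = 1000.0, 2000.0, 40.0   # 1 k (A-B), 2 k (B-D), 40 ohm (B-C)
V_SUPPLY = 3.0                       # point A, relative to ground (D/E)
V_LED = 2.0                          # idealized LED forward voltage

# KCL at node B, with node C held at V_LED:
#   (V_SUPPLY - Vb)/R1 = Vb/R2 + (Vb - V_LED)/R3
vb = (V_SUPPLY / R1 + V_LED / R3) / (1 / R1 + 1 / R2 + 1 / R3)

i_led = (vb - V_LED) / R3    # current in the B-C-D (LED) path
i_2k = vb / R2               # current in the 2 k resistor

print(f"V_B = {vb:.3f} V")                           # 2.000 V
print(f"LED path current = {1000 * i_led:.3f} mA")   # 0.000 mA
print(f"2 k path current = {1000 * i_2k:.3f} mA")    # 1.000 mA
    [/code]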

    Hope that helps. In reality there will be a small fraction of the current taking path BCD through the LED, and the LED voltage will be a little less than 2.0V.

    Regards,

    Mark
     
  12. berkeman

    hl_world -- I haven't read the last couple posts in detail, but do not put a voltage divider around an LED. If you do that in a job interview, you will be shown the door.

    You put a resistor in series with the LED to determine the LED current, based on the supply voltage and the expected LED forward voltage drop. Nothing else.
     
  13. Redbelly98

    I would add, "Never use a voltage divider to power something; use one only for making a voltage reference." Would you agree?
     
  14. berkeman

    Absolutely. Otherwise, you're just wasting power for no reason.
     
    Thanks, Redbelly & berkeman. It's been a while since college, but recently I've been building LED circuits on a breadboard. Just one more question for now, though: the specs of one of the LEDs I ordered say it should be supplied at a max forward current/voltage of 100 mA / 4 V. Why does it specify both? Wouldn't current be the only relevant factor?
     
  16. MATLABdude

    That's probably just the diode voltage at 100 mA. They supply the (nominal) operating voltage along with the current so you can easily figure out whether or not the power supply you have is appropriate for turning it on.

    EDIT: For example, if you had a 3 V, 1 A (max) supply available, would you spec out LEDs with a 4.5V voltage?
     
  17. Well, I figured it was a simple case of taking the supply voltage, dividing it by the ideal forward current of the LED and applying that resistance to the series circuit.
     
  18. MATLABdude

    Well, that won't give you the current you expect, because it ignores the LED's own voltage drop. It's better to use the following:
    [tex]R_{LED}=\frac{V_{Supply}-V_{LED}}{I_{LED}}[/tex]

    EDIT: More info:
    http://alan-parekh.com/led_resistor_calculator.html
     
  19. berkeman

    Not sure what you mean by that. To bias an LED, you subtract the LED Vf from the supply voltage (and any other voltage drops, like the Vol of a drive gate, or the Vsat of a driving transistor), and divide that resistor voltage Vr by the current you want to have passing through the LED. That determines the value of the series resistor.
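
    In equation form, with V_other standing for any of those additional drops (this is just the earlier formula with the extra terms written out):
    [tex]R_{series}=\frac{V_{Supply}-V_{f}-V_{other}}{I_{LED}}[/tex]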
     
    I mean, I thought that if you connected an LED that needs 4.5 V / 100 mA in a series circuit powered by a 3 V supply, you would add a 30 Ω resistor, it would get 100 mA flowing, and that would be it (ignoring the voltage requirement).
     
  21. Redbelly98

    If the LED needs 4.5 V, then a 3V supply will never be able to power it, no matter what resistor you use.

    As berkeman said, you subtract the LED voltage (4.5) from the supply voltage, and then divide by current to get the resistor.

    The supply voltage must be greater than the LED voltage for this to work.
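
    For example, with a hypothetical 6 V supply (6 V is just an illustrative number; any supply comfortably above 4.5 V works the same way):
    [tex]R=\frac{6\ \mathrm{V}-4.5\ \mathrm{V}}{100\ \mathrm{mA}}=15\ \Omega[/tex]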
     