# Determining supply current/voltage to LED

**hl_world:** Bearing in mind safety margins, how much current and voltage should I supply to an LED if it's rated as such?

- Forward current, max: 25 mA
- Forward voltage, max: 2.5 V
- Reverse voltage, max: 5 V
- Light output, min. @ 10 mA: 70 mcd
- Light output, typ. @ 10 mA: 200 mcd
**Mapes (Science Advisor):** Looks like <25 mA current and <2.5 V voltage.
**berkeman (Mentor):**

> Bearing in mind safety margins, how much current and voltage should I supply to an LED if it's rated as such?

In addition to what Mapes said: in order to limit the current to 25 mA, you need to know the *minimum* Vf of the LED.

Quiz question for hl_world -- why is this so?

**hl_world:**

> In addition to what Mapes said: in order to limit the current to 25 mA, you need to know the *minimum* Vf of the LED. Quiz question for hl_world -- why is this so?

Hmm... I think this was in the analogue electronics part of my course, but I left after six months and that was a few years ago.

So should I try 20 mA and 2.0 V then? Like this:
**Redbelly98 (Mentor):** It's common practice to do this with just one resistor, in series with the diode.

EDIT: "Answer me these questions three..."

1. Assume the LED voltage is 2 V. What's the voltage across the resistor?
2. Assume the LED current is 25 mA. What's the current through the resistor?
3. Given the voltage and current from the first two questions, what's the resistance of the resistor?
**hl_world:** I don't understand. I used the voltage divider to ensure 2 V between the two nodes on either side of the LED. Does the LED just do that for me by having resistance of its own?

1. 1 V (3 V source − 2 V across the LED)
2. 25 mA (same throughout a series circuit)
3. R = V/I, so 1 V / 0.025 A = 40 Ω

(To the best of my understanding.)
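Those three answers reduce to one line of arithmetic. A minimal sketch, using the values discussed in this thread (3 V supply, ~2 V forward drop, 25 mA target):

```python
# Series resistor sizing: the resistor drops the excess supply voltage
# at the target LED current (values from the thread).
def led_series_resistor(v_supply, v_led, i_led):
    return (v_supply - v_led) / i_led

r = led_series_resistor(3.0, 2.0, 0.025)
print(f"{r:.0f} ohms")  # 40 ohms
```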
**Redbelly98 (Mentor):**

> 1) 1 V (3 V source − 2 V across the LED) 2) 25 mA (same throughout a series circuit) 3) R = V/I, so 1 V / 0.025 A = 40 Ω

Yes, that's right. Once you build the circuit, you can measure the power-supply and LED voltages and adjust the resistance if needed, but 40 Ω is a good, safe starting value.

> I don't understand. I used the voltage divider to ensure 2 V between the two nodes on either side of the LED. Does the LED just do that for me by having resistance of its own?

Yes, the LED is essentially guaranteed to sit at about 2 V (possibly as high as 2.5 V), as long as it is not connected directly to a fixed voltage source and it draws some minimal amount of current.

Looking at your voltage-divider circuit, here is an observation: because of the 1 kΩ resistor, the 3 V supply cannot deliver more than 3 mA. So the LED will get less than 3 mA of current.
**hl_world:** Here I have labelled the points in the circuit (except the switch and the −/+ nodes) for reference, and reduced the 100 Ω resistor to 40 Ω. One of the things I don't get about voltage and current: between points B and D there is a 2 V supply. This should make a lot more current flow through the 40 Ω resistor than through the 2 kΩ. Using the MAD (product-over-sum) rule, (2000 × 40)/(2000 + 40) = 39.216 Ω. The current should be 2 V / 39.216 Ω = 51 mA, which divides between the resistors before recombining at the negative node. So the current through the 40 Ω and the LED should be 0.051 × (2000/(40 + 2000)) = 50 mA. I know I've done something wrong here.
**Redbelly98 (Mentor):**

> Between points B and D there is a 2 V supply. This should make a lot more current flow through the 40 Ω resistor than through the 2 kΩ.

That would be true if there were only a 40 Ω resistor in parallel with the 2 kΩ. However, it is an LED + 40 Ω series combination in parallel with the 2 kΩ. We don't know the effective resistance of the LED + 40 Ω branch, so we can't say that more current flows through that path.

Also, thinking of this as a voltage divider: the resistance of the lower section must be less than 2 kΩ, because the LED + 40 Ω branch sits in parallel with the 2 kΩ resistor. That means VBD is less than 2 V.

Another observation: if any appreciable current flows through the diode, the diode will sit close to 2 V, which puts close to 2 V between C and D. But there is also close to 2 V between B and D, so only a very small voltage appears between B and C. Just how small we don't really know, but if the circuit were actually built, one could measure VBC and divide it by 40 Ω to get the actual current in that path.

One cannot ignore the effect of the LED on the LED + 40 Ω branch of the circuit.

Hope that helps clear things up. If not, keep posting. You have a pretty good grasp of the basics, which helps a lot in composing answers to your questions.
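One way to see this numerically is to give the LED a soft exponential i–V curve and solve the divider circuit for its operating point. The diode parameters below are assumptions (chosen so the LED passes 10 mA at 2.0 V, matching the datasheet test point), not datasheet values:

```python
import math

# Divider circuit from the thread: 3 V supply, 1 kOhm on top, and a
# (2 kOhm) || (40 Ohm + LED) lower section. Exponential LED model with
# ASSUMED parameters: scale and slope picked so i(2.0 V) = 10 mA.
V_S, R1, R2, R3 = 3.0, 1000.0, 2000.0, 40.0
NVT = 0.1                           # assumed ideality * thermal voltage
I_S = 10e-3 / math.exp(2.0 / NVT)   # scale factor so i(2.0 V) = 10 mA

def residual(v_led):
    """KCL at node B: current in from R1 minus currents out via R2 and the LED branch."""
    i_led = I_S * (math.exp(v_led / NVT) - 1.0)
    v_b = v_led + R3 * i_led        # voltage at the divider tap (point B)
    return (V_S - v_b) / R1 - v_b / R2 - i_led

# Bisection for the LED voltage where the node currents balance.
lo, hi = 1.0, 2.5                   # residual is positive at lo, negative at hi
for _ in range(100):
    mid = (lo + hi) / 2
    if residual(mid) > 0:
        lo = mid
    else:
        hi = mid

v_led = (lo + hi) / 2
i_led = I_S * (math.exp(v_led / NVT) - 1.0)
print(f"LED voltage ~ {v_led:.2f} V, LED current ~ {i_led * 1e3:.2f} mA")
```

With these assumed parameters the LED settles somewhat below 2 V and draws well under a milliamp, in line with the qualitative argument above: the divider cannot push any useful current through the LED branch.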
**Redbelly98 (Mentor):** Just thought of a simplified explanation for what's going on in this circuit.

Assume an idealized 2.0 V LED:

- i = 0 for V < 2.0 V
- V = 2.0 V for i > 0

If ANY current flows through the LED, it will be at 2.0 V. That puts point C at 2.0 V above ground (ground being point D or E).

The voltage divider will tend to put point B at 2.0 V above ground as well. That puts 0 V across the 40 Ω resistor (points B and C at the same potential), and hence zero current through the 40 Ω and the LED (path BCD).

With zero current through path BCD, all the current must flow straight through the 1 kΩ and 2 kΩ resistors. This current is

i = 3 V / (1 + 2) kΩ = 1 mA

So we have:

- A at 3 V
- B and C at 2 V
- D and E at 0 V
- 0 mA through path BCD
- 1 mA through path AE

Hope that helps. In reality a small fraction of the current will take path BCD through the LED, and the LED voltage will be a little less than 2.0 V.

Regards,

Mark
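The idealized analysis above can be checked with two lines of Ohm's law. Assuming the BCD branch carries no current, the 1 kΩ and 2 kΩ form a plain divider:

```python
# Idealized-LED check: with zero current in the BCD branch, all the
# supply current flows through the 1 kOhm + 2 kOhm divider (path AE).
V_S, R1, R2 = 3.0, 1000.0, 2000.0
i_divider = V_S / (R1 + R2)     # current through path AE
v_b = V_S - i_divider * R1      # divider tap: point B (and C, since the 40 Ohm drops nothing)
print(f"{i_divider * 1e3:.1f} mA through AE, {v_b:.1f} V at B and C")
```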
**berkeman (Mentor):** hl_world -- I haven't read the last couple of posts in detail, but do not put a voltage divider around an LED. If you do that in a job interview, you will be shown the door. You put a resistor in series with the LED to set the LED current, based on the supply voltage and the expected LED forward voltage drop. Nothing else.
**Redbelly98 (Mentor):**

> ...do not put a voltage divider around an LED. If you do that in a job interview, you will be shown the door.

I would add, "Never use a voltage divider to power something; use them only for making a voltage reference." Would you agree?
**berkeman (Mentor):**

> I would add, "Never use a voltage divider to power something; use them only for making a voltage reference." Would you agree?

Absolutely. Otherwise you're just wasting power for no reason.
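To put a number on the waste, using this thread's values: the 1 kΩ + 2 kΩ divider burns power continuously, even with the load disconnected, while a series resistor dissipates only while the LED conducts.

```python
# Divider idle loss: with the LED branch removed, the 3 V supply still
# pushes current through the 1 kOhm + 2 kOhm string around the clock.
V_S = 3.0
p_divider_idle = V_S**2 / (1000.0 + 2000.0)  # watts, burned doing nothing

# Series-resistor loss: the 40 Ohm resistor only dissipates while the
# LED is actually on and drawing its 25 mA.
i_led = 0.025
p_series = i_led**2 * 40.0                    # watts, only while lit
print(f"divider idle: {p_divider_idle * 1e3:.1f} mW, "
      f"series resistor (LED on): {p_series * 1e3:.1f} mW")
```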
**hl_world:** Thanks, Redbelly & berkeman. It's been a while since college, but recently I've been building LED circuits on a breadboard. Just one more question for now: the specs of one of the LEDs I ordered say it should be supplied at a max forward current/voltage of 100 mA / 4 V. Why does it specify both? Wouldn't current be the only relevant factor?
**Another member:**

> The specs of one of the LEDs I ordered say it should be supplied at a max forward current/voltage of 100 mA / 4 V. Why does it specify both? Wouldn't current be the only relevant factor?

That's probably just the diode voltage at 100 mA. The (nominal) operating voltage is supplied along with the current so you can easily figure out whether or not the power supply you have is appropriate for driving it.

EDIT: For example, if you had a 3 V, 1 A (max) supply available, would you spec LEDs with a 4.5 V forward voltage?
**hl_world:** Well, I figured it was a simple case of taking the supply voltage, dividing it by the ideal forward current of the LED, and using that resistance in the series circuit.
**Another member:**

> Well, I figured it was a simple case of taking the supply voltage, dividing it by the ideal forward current of the LED, and using that resistance in the series circuit.

That resistor comes out too large, so you'd get less current (and a dimmer LED) than you intend, because it ignores the LED's own voltage drop. It's better to use the following:

$$R_{LED}=\frac{V_{Supply}-V_{LED}}{I_{LED}}$$
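Applying that formula to the 100 mA / 4 V LED discussed above. The 5 V supply here is an assumed example value, chosen only to show why the naive V/I sizing undershoots:

```python
# Correct sizing: drop only the EXCESS supply voltage at the target current.
def led_resistor(v_supply, v_led, i_led):
    return (v_supply - v_led) / i_led

r = led_resistor(5.0, 4.0, 0.100)   # assumed 5 V supply, 4 V / 100 mA LED

# Naive sizing (supply voltage / target current) ignores the LED's drop,
# so the resistor comes out too large and the current falls short.
r_naive = 5.0 / 0.100
i_actual = (5.0 - 4.0) / r_naive    # current actually delivered
print(f"{r:.0f} ohms vs naive {r_naive:.0f} ohms -> only {i_actual * 1e3:.0f} mA")
```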