# Determining supply current/voltage to LED

#### hl_world

Bearing in mind safety margins, how much current and voltage should I supply an LED if it's rated as such:

Forward current max: 25mA
Forward voltage max: 2.5V
Reverse voltage max: 5V
Light output min.@ 10mA: 70mcd
Light output typ.@ 10mA: 200mcd

#### Mapes

Homework Helper
Gold Member
Looks like <25mA current and <2.5V voltage.

#### berkeman

Mentor
Bearing in mind safety margins, how much current and voltage should I supply an LED if it's rated as such:

Forward current max: 25mA
Forward voltage max: 2.5V
Reverse voltage max: 5V
Light output min.@ 10mA: 70mcd
Light output typ.@ 10mA: 200mcd
In addition to what Mapes said, in order to limit the current to 25mA, you need to know the *minimum* Vf of the LED.

Quiz Question for hl_world -- Why is this so?
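One way to see the answer to the quiz numerically (a sketch only; the supply voltage, resistor value, and the minimum/typical Vf figures below are assumed for illustration, not datasheet values):

```python
# Why minimum Vf matters: the series resistor drops Vsupply - Vf,
# so a LOWER Vf leaves MORE voltage across the resistor and more current.
v_supply = 3.0   # assumed supply voltage
r = 40.0         # series resistor sized for Vf = 2.0 V at 25 mA
vf_typ = 2.0     # assumed typical forward voltage
vf_min = 1.8     # assumed minimum forward voltage

i_typ = (v_supply - vf_typ) / r    # current at the typical Vf
i_worst = (v_supply - vf_min) / r  # worst-case current at the minimum Vf

print(f"typical: {i_typ * 1000:.0f} mA, worst case: {i_worst * 1000:.0f} mA")
```

If the real part's Vf comes in at the low end, the current exceeds the intended limit, which is why the resistor must be sized against the minimum Vf.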

#### hl_world

berkeman said:
In addition to what Mapes said, in order to limit the current to 25mA, you need to know the *minimum* Vf of the LED.

Quiz Question for hl_world -- Why is this so?
Hmm.. I think this was in the analogue electronics part of my course but I left after 6 months and that was a few years ago.

So I should try 20mA & 2.0 V then? Like this:
http://img526.imageshack.us/img526/631/ledvoltdivcircuitcy4.png [Broken]


#### Redbelly98

Staff Emeritus
Homework Helper
It's common practice to do this with just one resistor in series with the diode.

EDIT: "Answer me these questions three..."

1. Assume the LED voltage is 2 V. What's the voltage across the resistor?
2. Assume the LED current is 25 mA. What's the current through the resistor?
3. Given the voltage and current from the first two questions, what must the resistance of the resistor be?

#### hl_world

I don't understand. I used the voltage divider to ensure 2 V between the two nodes on either side of the LED. Does the LED just do that for me by having resistance of its own?

1) 1 volt (3V source - 2V over LED)
2) 25 mA (same throughout circuit)
3) R=V/I so 1/0.025= 40Ω

(To the best of my understanding)

#### Redbelly98

Staff Emeritus
Homework Helper
1) 1 volt (3V source - 2V over LED)
2) 25 mA (same throughout circuit)
3) R=V/I so 1/0.025= 40Ω

(To the best of my understanding)
Yes, that's right. Once you set up the circuit, you can measure the power supply and LED voltages, and adjust the resistance if needed. But 40Ω is a good safe starting value.

I don't understand. I used the voltage divider to ensure 2V between 2 nodes at either side of LED. Does the LED just do that for me by having resistance of its own?
Yes, the LED is essentially guaranteed to have 2 V (or could be as high as 2.5V) -- as long as it is not connected directly to a fixed voltage source, and it draws some minimal amount of current.

Looking at your voltage divider circuit, here is an observation: because of the 1kΩ resistor, the 3V supply will not produce more than 3 mA of current. So the LED will get less than 3 mA of current.
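The series-resistor arithmetic worked out above can be sketched in a few lines (the 2 V forward voltage is the assumed value used in the thread):

```python
# Sizing the series resistor: R = (Vsupply - Vled) / Iled
v_supply = 3.0   # supply voltage from the thread
v_led = 2.0      # assumed LED forward voltage
i_led = 0.025    # 25 mA target current

r = (v_supply - v_led) / i_led
print(f"R = {r:.0f} ohms")  # 40 ohms, matching the value above
```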

#### hl_world

http://img4.imageshack.us/img4/7349/ledvoltdivcircuiths5.png [Broken]

Here, I have labeled the different points in the circuit (except switch & -/+ nodes) for reference and reduced the 100Ω resistor to 40Ω.

Now one of the things I don't get about voltage & current: between points B & D there is a 2 V supply. This will cause a lot more current to flow through the 40Ω resistor than through the 2kΩ.

Using the MAD rule, (2000 × 40)/(2000 + 40) = 39.216Ω. The current should be 2 V / 39.216Ω = 51 mA, which divides between the resistors before recombining at the negative node. So the current which flows through the 40Ω & LED should be 0.051 × (2000/(40 + 2000)) = 50 mA.
I know I've done something wrong here.


#### Redbelly98

Staff Emeritus
Homework Helper
...between points B & D there is a 2 V supply. This will cause a lot more current to flow through the 40Ω resistor than through the 2kΩ.
That would be true if there were only 40Ω in parallel with the 2kΩ. However, it is an LED+40Ω series combination in parallel with the 2kΩ. We don't know what the effective resistance of the LED+40Ω is, so we can't say that more current flows through that path.

Also, thinking of this as a voltage divider: the resistance of the lower section must be less than 2kΩ, because of the LED+40Ω combination in parallel with the 2kΩ resistor. That means V_BD is less than 2V.

Another observation: if any appreciable current does flow through the diode, the diode would have close to 2V across it, which means close to 2V between C and D. But there is also 2V (or close to it) between B and D, so there must be a very small voltage between B and C. Just how small we don't really know, but if the circuit were actually built, one could measure V_BC and divide it by 40Ω to get the actual current in that path.

One cannot ignore the effect of the LED on the LED+40Ω branch of the circuit.

Hope that helps clear things up. If not, keep posting. You have a pretty good grasp of the basics, so that helps a lot in composing answers to your questions.

#### Redbelly98

Staff Emeritus
Homework Helper
http://img4.imageshack.us/img4/7349/ledvoltdivcircuiths5.png [Broken]
Just thought of a simplified explanation for what's going on in this circuit.

Assume an idealized 2.0 V LED:
i = 0 for V < 2.0 V
V = 2.0 V for i > 0

If ANY current flows through the LED, it will be at 2.0V. That puts point C at 2.0V above ground (ground is point D or E).

The voltage divider will tend to put point B at 2.0V above ground also. That would put 0V across the 40Ω resistor (points B and C at the same potential), hence zero current through the 40Ω and LED (path BCD).

With zero current going through path BCD, all current must flow straight through the 1kΩ and 2kΩ resistors. This current is
i = 3V / (1 + 2)kΩ = 1 mA

So we have:
A at 3V
B & C at 2V
D & E at 0V

0 mA through path BCD
1 mA through path AE

Hope that helps. In reality there will be a small fraction of the current taking path BCD through the LED, and the LED voltage will be a little less than 2.0V.

Regards,

Mark
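Redbelly98's idealized analysis above can be checked numerically (a sketch using the thread's values, with points as labeled in hl_world's diagram):

```python
# Node analysis of the divider circuit with an idealized 2.0 V LED
v_supply = 3.0
r1, r2, r_series = 1000.0, 2000.0, 40.0  # 1k upper, 2k lower, 40 ohm LED branch
v_led = 2.0  # idealized: the LED clamps point C at 2.0 V whenever it conducts

# The divider puts B at v_supply * r2 / (r1 + r2) = 2.0 V; the LED holds
# C at 2.0 V, so V(B) - V(C) ~ 0 and the BCD branch carries ~no current.
v_b = v_supply * r2 / (r1 + r2)
i_bcd = (v_b - v_led) / r_series   # ~0 A through the 40-ohm + LED branch
i_divider = v_supply / (r1 + r2)   # current straight through r1 and r2

print(f"V(B) = {v_b:.2f} V, LED branch = {i_bcd * 1e3:.1f} mA, "
      f"divider = {i_divider * 1e3:.1f} mA")
```

This reproduces the post's conclusion: B and C at 2 V, 0 mA through the LED path, 1 mA through the divider.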


#### berkeman

Mentor
hl_world -- I haven't read the last couple posts in detail, but do not put a voltage divider around an LED. If you do that in a job interview, you will be shown the door.

You put a resistor in series with the LED to determine the LED current, based on the supply voltage and the expected LED forward voltage drop. Nothing else.

#### Redbelly98

Staff Emeritus
Homework Helper
... do not put a voltage divider around an LED. If you do that in a job interview, you will be shown the door.
I would add, "Never use a voltage divider to power something, use them only for making a voltage reference". Would you agree?

#### berkeman

Mentor
I would add, "Never use a voltage divider to power something, use them only for making a voltage reference". Would you agree?
Absolutely. Otherwise, you're just wasting power for no reason.

#### hl_world

Thanks, Redbelly & berkeman. It's been a while since college, but recently I've been building LED circuits on a breadboard. Just one more question for now: the specs of one of the LEDs I ordered say it should be supplied at a max forward current/voltage of 100mA/4V. Why does it specify both? Wouldn't current be the only relevant factor?

#### MATLABdude

Thanks, Redbelly & berkeman. It's been a while since college, but recently I've been building LED circuits on a breadboard. Just one more question for now: the specs of one of the LEDs I ordered say it should be supplied at a max forward current/voltage of 100mA/4V. Why does it specify both? Wouldn't current be the only relevant factor?
That's probably just the diode voltage at 100 mA. They supply the (nominal) operating voltage along with the current so you can easily figure out whether or not the power supply you have is appropriate for turning it on.

EDIT: For example, if you had a 3 V, 1 A (max) supply available, would you spec out LEDs with a 4.5V voltage?


#### hl_world

Well, I figured it was a simple case of taking the supply voltage, dividing it by the ideal forward current of the LED and applying that resistance to the series circuit.

#### MATLABdude

Well, I figured it was a simple case of taking the supply voltage, dividing it by the ideal forward current of the LED and applying that resistance to the series circuit.
Well, this would give a higher current than you want. It's better to use the following:
$$R_{LED}=\frac{V_{Supply}-V_{LED}}{I_{LED}}$$

http://alan-parekh.com/led_resistor_calculator.html
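The formula above can be wrapped in a small helper (the function name and example values are illustrative):

```python
def led_resistor(v_supply, v_led, i_led):
    """Series resistor for an LED: R = (V_supply - V_LED) / I_LED."""
    return (v_supply - v_led) / i_led

# e.g. a 5 V supply, a 2.0 V LED, and a 20 mA target current:
print(led_resistor(5.0, 2.0, 0.020))  # about 150 ohms
```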

#### berkeman

Mentor
Well, I figured it was a simple case of taking the supply voltage, dividing it by the ideal forward current of the LED and applying that resistance to the series circuit.
Not sure what you mean by that. To bias an LED, you subtract the LED Vf from the supply voltage (and any other voltage drops, like the Vol of a drive gate or the Vsat of a driving transistor), and divide that resistor voltage Vr by the current you want passing through the LED. That determines the value of the series resistor.

#### hl_world

I mean I thought that if you connected an LED that needs 4.5V / 100mA in a series circuit powered by a 3V supply, you would add a 30Ω resistor, it would get a 100mA flow, and that would be it (ignoring voltage requirements).

#### Redbelly98

Staff Emeritus
Homework Helper
If the LED needs 4.5 V, then a 3V supply will never be able to power it, no matter what resistor you use.

As berkeman said, you subtract the LED voltage (4.5) from the supply voltage, and then divide by current to get the resistor.

The supply voltage must be greater than the LED voltage for this to work.
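That check can be sketched with the 3 V / 4.5 V / 100 mA figures from the thread (a sketch only; applying the resistor formula without the test would produce a meaningless negative resistance):

```python
# The supply must exceed the LED forward voltage; otherwise no series
# resistor value can make the LED conduct at the target current.
v_supply = 3.0   # supply from the thread
v_led = 4.5      # LED forward voltage from the thread
i_led = 0.100    # 100 mA

if v_supply > v_led:
    r = (v_supply - v_led) / i_led
    print(f"use R = {r:.0f} ohms")
else:
    print("supply too low: a 3 V supply cannot drive a 4.5 V LED")
```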
