Bearing in mind safety margins, how much current and voltage should I supply an LED if it's rated as such:
Forward current max: 25mA
Forward voltage max: 2.5V
Reverse voltage max: 5V
Light output min. @ 10mA: 70mcd
Light output typ. @ 10mA: 200mcd
berkeman said:
In addition to what Mapes said, in order to limit the current to 25mA, you need to know the *minimum* Vf of the LED.

Quiz Question for hl_world -- Why is this so?

Hmm.. I think this was in the analogue electronics part of my course, but I left after 6 months and that was a few years ago.
hl_world said:
1) 1 volt (3V source - 2V over LED)
2) 25 mA (same throughout the circuit)
3) R = V/I, so 1/0.025 = 40Ω
(To the best of my understanding)

Yes, that's right. Once you set up the circuit, you can measure the power supply and LED voltages, and adjust the resistance if needed. But 40Ω is a good safe starting value.
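To put numbers on that answer, here's a minimal Python sketch. The 3V supply, 2V drop, and 25mA target are the thread's values; the Vf spread in the loop is a made-up illustration of berkeman's quiz point about needing the *minimum* Vf.

```python
# Series resistor sizing: R = (Vs - Vf) / I_target
V_SUPPLY = 3.0    # volts -- the thread's example source
V_LED = 2.0       # volts -- nominal forward drop used above
I_TARGET = 0.025  # amps -- the 25 mA max forward current

r = (V_SUPPLY - V_LED) / I_TARGET
print(f"series resistor: {r:.0f} ohms")  # -> 40 ohms

# berkeman's quiz: why does the *minimum* Vf matter?  A lower Vf leaves
# more voltage across the resistor, so the actual current runs higher
# than planned.  (These Vf values are a made-up spread for illustration.)
for vf in (1.8, 2.0, 2.5):
    i = (V_SUPPLY - vf) / r
    print(f"Vf = {vf:.1f} V -> I = {1000 * i:.1f} mA")
```

With Vf at only 1.8V the same 40Ω passes 30mA, already over the 25mA rating, which is why the worst-case (minimum) Vf sets the resistor.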
hl_world said:
I don't understand. I used the voltage divider to ensure 2V between the two nodes on either side of the LED. Does the LED just do that for me by having resistance of its own?

Yes, the LED is essentially guaranteed to have about 2V across it (though it could be as high as 2.5V) -- as long as it is not connected directly to a fixed voltage source, and it draws some minimal amount of current.
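A sketch of why the LED "does that for you": its exponential I-V curve is so steep that the forward voltage barely moves even when the series resistance changes tenfold, so no divider is needed. The Shockley-model parameters below are illustrative assumptions, not datasheet values.

```python
import math

# Illustrative (made-up) Shockley diode parameters:
I_S = 1e-13   # saturation current, amps (assumed)
N_VT = 0.075  # emission coefficient times thermal voltage, volts (assumed)

def led_current(v):
    # Shockley equation: I = Is * (exp(V / nVt) - 1)
    return I_S * (math.exp(v / N_VT) - 1.0)

def operating_point(v_supply, r):
    """Solve V_supply = V_led + I(V_led) * R for V_led by bisection."""
    lo, hi = 0.0, v_supply
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if mid + led_current(mid) * r > v_supply:
            hi = mid
        else:
            lo = mid
    return lo, led_current(lo)

# A tenfold change in resistance barely moves the LED voltage:
for r in (40.0, 100.0, 400.0):
    v, i = operating_point(3.0, r)
    print(f"R = {r:5.0f} ohm -> V_led = {v:.2f} V, I = {1000 * i:.1f} mA")
```

The current swings from roughly 25mA down to a few mA, while the LED voltage stays pinned near 2V the whole time.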
hl_world said:
...& current is that between points B & D there is a 2V supply. This will cause a lot more current to flow through the 40Ω resistor than through the 2kΩ.

That would be true if there were only 40Ω in parallel with the 2kΩ. However, it is an LED+40Ω series combination in parallel with the 2kΩ. We don't know what the effective resistance of the LED+40Ω is, so we can't say that more current flows through that path.
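Rough numbers for this point, assuming the LED drops somewhere around 1.7-1.9V at the operating point (assumed values, since the thread doesn't pin them down):

```python
# The LED+40 ohm branch is not simply "40 ohms".  With ~2 V across the
# parallel section, most of it is dropped by the LED, so the branch passes
# far less than 2 V / 40 ohm = 50 mA.  The Vf values below are assumed
# operating-point drops, not datasheet figures.
V_BRANCH = 2.0      # volts between points B and D, from the thread
R_SERIES = 40.0     # ohms, in series with the LED
R_DIVIDER = 2000.0  # ohms, the parallel divider leg

print(f"2k leg: {1000 * V_BRANCH / R_DIVIDER:.1f} mA")
for vf in (1.7, 1.8, 1.9):
    i = (V_BRANCH - vf) / R_SERIES
    print(f"Vf = {vf} V -> LED branch: {1000 * i:.1f} mA, "
          f"effective resistance: {V_BRANCH / i:.0f} ohm")
```

Depending on where the LED sits on its curve, the branch looks like anywhere from ~270Ω to ~800Ω, which is exactly why you can't call it "40Ω in parallel with 2kΩ".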
Just thought of a simplified explanation for what's going on in this circuit.
http://img4.imageshack.us/img4/7349/ledvoltdivcircuiths5.png [Broken]
hl_world said:
Thanks, Redbelly & berkeman. It's been a while since college, but recently I've been building LED circuits on a breadboard. Just one more question for now, though: the specs of one of the LEDs I ordered say it should be supplied at a max forward current/voltage of 100mA/4V. Why does it specify both; wouldn't current be the only relevant factor?

That's probably just the diode voltage at 100 mA. They supply the (nominal) operating voltage along with the current so you can easily figure out whether or not the power supply you have is appropriate for turning it on.
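A small sketch of that check, applied to the 100mA/4V part from the post above (both supply voltages are hypothetical):

```python
# Why the datasheet lists Vf alongside If(max): Vf tells you whether a given
# supply can reach the LED's operating voltage at all, and how much headroom
# is left over for the series resistor.
def series_resistor(v_supply, v_led, i_target):
    """Return the series resistance for i_target, or None if Vs is too low."""
    headroom = v_supply - v_led
    if headroom <= 0:
        return None
    return headroom / i_target

# The 100 mA / 4 V LED, with two hypothetical supplies:
for vs in (3.0, 5.0):
    r = series_resistor(vs, 4.0, 0.100)
    if r is None:
        print(f"{vs} V supply: too low to drive a 4 V LED")
    else:
        print(f"{vs} V supply: ~{r:.0f} ohm for 100 mA")
```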
hl_world said:
Well, I figured it was a simple case of taking the supply voltage, dividing it by the ideal forward current of the LED and applying that resistance to the series circuit.

Well, this would give a higher current than you want. It's better to use the following: R = (Vsupply - Vf) / ILED.

Not sure what you mean by that. To bias an LED, you subtract the LED Vf from the supply voltage (and any other voltage drops, like the Vol of a drive gate, or the Vsat of a driving transistor), and divide that resistor voltage Vr by the current you want passing through the LED. That determines the value of the series resistor.
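That recipe as a few lines of Python (the supply, Vf, and Vsat figures in the example call are assumed for illustration, not values from the thread):

```python
# Subtract every drop in the loop from the supply, then divide the remaining
# resistor voltage Vr by the target current.
def led_resistor(v_supply, v_f, i_led, v_extra_drops=0.0):
    """R = (Vsupply - Vf - other series drops) / I_led."""
    return (v_supply - v_f - v_extra_drops) / i_led

# 5 V supply, 2 V LED, 20 mA target, 0.2 V driving-transistor Vsat (assumed):
print(f"{led_resistor(5.0, 2.0, 0.020, v_extra_drops=0.2):.0f} ohms")  # -> 140
```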