# Choosing a Resistor for LED Circuits

I'm hoping this is the right forum for this question, I've never been to this part of PF before.

I've taken up some interest in working with LED lights but I'm slightly confused about how to build the circuits for them. I'm told you need to put in a resistor to keep the LED safe from high currents, but I'm not sure how to choose one. I read that LEDs have a fixed voltage across them; how do they do that, independent of the source? And if a resistor is meant to reduce the current, it has to be put in parallel with the LED, right?

As you can probably tell, this isn't my strong suit, and my confusion is rather general. I also might have got some technical terms wrong, since English isn't my native language, but I hope someone can understand and guide me in the generally right direction.

Are you connecting more than one LED?

If not, you just subtract the LED voltage from the supply voltage and divide the result by the current you want to use.

Say, for example, you want to power a 1.5V LED at 30mA from a 9V supply. The resistor value would be (9V - 1.5V) / 30mA = 250 ohms.

The power dissipated in the resistor is 7.5V x 30mA = 225mW. I'd suggest that the power rating you choose for the resistor is double this ... i.e. 450mW or more.

Edit: By the way, the resistor needs to be in series with the LED, usually between the positive rail and the anode of the LED.
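The arithmetic above can be sketched in a couple of lines of Python (the function names are just for illustration; the numbers are the 9V / 1.5V / 30mA example from this post):

```python
def series_resistor(v_supply, v_led, i_led):
    """Resistance needed to drop the excess voltage at the target current."""
    return (v_supply - v_led) / i_led

def resistor_power(v_supply, v_led, i_led):
    """Power dissipated in the series resistor at that current."""
    return (v_supply - v_led) * i_led

r = series_resistor(9.0, 1.5, 0.030)   # 250.0 ohms
p = resistor_power(9.0, 1.5, 0.030)    # 0.225 W, so pick at least double that rating
```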

Thanks for your reply. For now, it's just one LED I'm thinking of, but what if I did want more?

From your explanation I think I can distill that since the LED has a fixed voltage, the rest of the voltage just drops across the resistor if you put it in series (is that the correct term? As in, in series as opposed to parallel?), am I right so far? So you just choose the resistor to get the right current through U = I*R.

I'm not sure what you mean with the last part, though. What's a power rating?

berkeman
Mentor
You have it mostly right. See the "considerations in use" section of this wikipedia article about LEDs, for example:

http://en.wikipedia.org/wiki/Led

Yes, the resistor is placed in series with the forward-biased LED (conventional current flowing from anode to cathode). The series resistor drops the supply voltage down to leave just the 1.6V to 2.0V that the LED needs to light up. You adjust the brightness of the LED by how much current you let flow through it, which you set with the value of the series voltage-dropping resistor.

Real resistors have max power ratings. A common little through-hole resistor will typically be about 1/4W, for example. If you look up resistors in the Digikey catalog or wherever you buy your parts, they will be sorted by power rating. In the case that Delta cites, he calculated that his voltage-dropping resistor would be dissipating a steady 225mW, which really is too much for reliable operation of a 1/4W resistor. The next step up would be to use a 1/2W resistor.
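As a sketch of that selection rule: the list of ratings below is just a typical set of through-hole sizes (not from any particular catalog), and the 2x margin is the rule of thumb suggested earlier in the thread:

```python
# Typical through-hole resistor power ratings, in watts (assumed list).
STANDARD_RATINGS_W = [0.125, 0.25, 0.5, 1.0, 2.0]

def pick_power_rating(dissipation_w, margin=2.0):
    """Smallest standard rating that covers the dissipation with a safety margin."""
    needed = dissipation_w * margin
    for rating in STANDARD_RATINGS_W:
        if rating >= needed:
            return rating
    raise ValueError("dissipation too high for a standard small resistor")

pick_power_rating(0.225)  # 0.225 W needs 0.45 W with margin, so a 0.5 W part
```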

Thanks. So basically the power rating is just how much it can take.

I bought this cheap key chain LED flashlight thing, it's made up of three batteries (I think 1.55V ones, but I'm not sure) and a LED (no idea what kind of LED), there doesn't appear to be any resistor in it, unless one of the connecting pieces of metal acts as one, but they're a bit too big for that, I think. So does that mean the LED is unprotected and thus not going to live very long?

Also, does a LED need a minimum amount of voltage (like the 1.5V mentioned in the example)? Because I noticed that with one battery, the LED didn't light up whatsoever, but with two and three, it worked quite well.

ranger
Gold Member
When an LED is forward biased, i.e. given the correct polarity to make it turn on, it has very little resistance. So according to Ohm's law, we'll get a lot of current. Different LEDs have different forward voltage and current requirements which must be met. If you fail to follow the manufacturer's spec for forward current, you'll destroy the LED, hence you need a current-limiting resistor.

In addition to the current and polarity requirements, the LED also needs a minimum (forward) voltage drop across its junction to "turn on" or conduct. Different color LEDs have different forward voltage requirements. For example (I think) 1.7v for a red, 2v for a green, and 2.3v for a yellow. As a comparison, a regular silicon diode needs approx. 0.7v forward voltage to conduct.

Alright, that's what I figured, thanks.

However, the LED in the flashlight, is it just unprotected or is it likely there is some built-in resistor or something? (I'm thinking of stripping the LED and wondering if I need a resistor)

Also, is there any way to figure out what a LED requires in voltage and current?

berkeman
Mentor
It's likely that the LED has a built-in resistor. You can get those, but the resistor value is obviously already set for some supply voltage and desired LED current.

To figure out an LED's "voltage and current", just hook it up to a variable power supply and a resistor. Measure the current versus brightness curve, as well as the V-I characteristic of the LED. That will help you plan how to best use the LED in a circuit.

Thanks. So, just check at what current it lights up well?

I have another question. Given the formula R = (VS - VL) / I, where VS and VL are the voltages of the source and the LED, does that mean that if you just choose the voltages right (VS = VL), you don't need a resistor? Since you end up with R = 0/I = 0. I figure, though, that the current would become too big then. But if there is no voltage left for the resistor, how can we use a resistor to decrease the current?

NoTime
Homework Helper
But if there is no voltage left for the resistor, how will we use a resistor to decrease the current?
In this case it is important to realize that LEDs do have an internal resistance.
For a common red LED this is around 22 ohms.
You can use this in the equation to select a source voltage that does not require an external resistor.

For typical LEDs, it would be impractical to create a supply with just enough voltage to power the LED, remembering that an LED is a discrete device. You would be hovering around a critical area of the LED's characteristic curve. There will already be variations in the LED's performance due to manufacturing tolerances, temperature, etc. So by supplying exactly this voltage, any minor fluctuations may generate dramatic changes in the current running through the LED, and therefore shorten the life of the device.

In essence it is best to provide a supply voltage ample enough to cover the voltage turn-on of the LED, and then control the LED's output via a current limiting resistor in series.

To find the right resistor without knowing the LED characteristics, put, say, a 330 ohm resistor in series and take some measurements of current against an input voltage between 0 and 10V. The curve is exponential and looks a bit like a backward L. The point at which the two lines roughly meet at the corner of the L is the turn-on **voltage[1]. Play with the voltage until the LED is at the ideal brightness (but not so bright that you overdrive it), and take the current[2] reading.

These two values, [1] and [2], will help provide a guide on the LED characteristics and allow you to decide the resistor value for any given supply voltage.

Edit: **Voltage across the LED, NOT the power supply.
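Once you have the two measured values [1] and [2], the resistor choice is the same formula used earlier in the thread. A minimal sketch (the 2.0 V turn-on and 15 mA figures below are made-up example measurements, not from any datasheet):

```python
def resistor_from_measurements(v_supply, v_turn_on, i_measured):
    """Series resistor for a given supply, from a measured turn-on
    voltage [1] and a measured comfortable operating current [2]."""
    return (v_supply - v_turn_on) / i_measured

# e.g. a measured 2.0 V turn-on and 15 mA comfortable current, on a 9 V supply:
resistor_from_measurements(9.0, 2.0, 0.015)  # about 467 ohms
```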

Thanks, I'll have a look some time if I can find the right equipment around.

I went and bought some LEDs today and I also asked the guy at the counter about the resistors. He said that if I just connect a LED to a 3V source (they're 3V LEDs) I wouldn't need any resistors. The packaging seems to agree as it lists what kind of resistors I'd need for the LEDs, but only for 5V and up. You don't seem to agree with this, though. Was the guy wrong?

Also, something else I was wondering, if it's true and the LEDs will run just fine on 3V, could I just take two LEDs in series and connect them to 6V? Or doesn't it work that way? Or should I put them parallel on 3V?

NoTime
Homework Helper
Depends.
Take a typical red LED with a turn-on voltage of 1.8v, an internal resistance of 22 ohms, and a max current of 20mA.

At the max current the internal resistance will develop roughly a 0.44v drop. So a power supply voltage greater than 1.8v and less than about 2.2v will work without damaging the LED.
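That range can be sketched numerically; the 1.8 V / 22 ohm / 20 mA figures are the example red LED from this post, not values for any specific part:

```python
def max_direct_supply(v_turn_on, r_internal, i_max):
    """Highest supply voltage that keeps the LED at or below its max
    rated current with no external resistor: Vf + Imax * R_internal."""
    return v_turn_on + i_max * r_internal

max_direct_supply(1.8, 22.0, 0.020)  # 1.8 + 0.44 = 2.24 V upper limit
```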

I don't know the required parameters (internal resistance, max current and turn-on voltage) for your LEDs.

In general, when you wire LEDs in series, the internal resistances and turn-on voltages add while the max current remains constant.
If one LED will work on 3v, then two in series will work on 6v.
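A quick sketch of that series rule, reusing the example red LED's assumed 1.8 V / 22 ohm figures from above:

```python
def series_chain(leds):
    """leds: list of (v_turn_on, r_internal) pairs for LEDs wired in series.
    Turn-on voltages and internal resistances add; the chain current is shared."""
    v_total = sum(v for v, _ in leds)
    r_total = sum(r for _, r in leds)
    return v_total, r_total

# Two of the example 1.8 V / 22 ohm red LEDs in series:
series_chain([(1.8, 22.0), (1.8, 22.0)])  # 3.6 V turn-on, 44 ohms internal
```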

Be careful to note that a so called 1.5v battery is not necessarily 1.5v

In this case, the low internal resistance of a power supply will let you run them in parallel on 3v.
Normally, parallel operation of LEDs with a single external resistor runs into a problem with small variations in turn-on voltage.

berkeman
Mentor
I agree 100% with NoTime's comments. Please download the datasheet for the LEDs you are using. Always make it a habit to read the datasheet -- otherwise you are just guessing at important points.

In the real world, we hold design reviews, with datasheets for parts on the BOM (bill of materials) included.

I'm not sure where to get the datasheet. I looked on the site of the manufacturer (Velleman), but I couldn't find it. They do have a list of "Info Sheets", but I can't find any number on the packaging or on the page they have for the product that is in the list. Where do you generally get data sheets?

NoTime
Homework Helper
I did find this on the Velleman site

http://www.calcentron.com/PDF_Documents/led_pdf/LED%20Spec.pdf [Broken]

Reading it, it says to call them if you need more detailed info.

You can test the device for turn on voltage and internal resistance.
Max current is more problematic.
