Choosing a Resistor for LED Circuits

In summary, this thread discusses how resistors protect LEDs from excessive current and how to choose the right resistor value for a given LED. The resistor must be placed in series with the LED, and its power rating must be considered. The thread also covers the minimum forward voltage and current requirements of LEDs, how to measure them with a variable power supply and a resistor, the possibility of resistors built into LED devices, and the formula used to calculate the resistor value.
  • #1
NanakiXIII
I'm hoping this is the right forum for this question; I've never been to this part of PF before.

I've taken up some interest in working with LED lights, but I'm slightly confused about how to build the circuits for them. I'm told you need to put in a resistor to keep the LED safe from high currents, but I'm not sure how to choose one. I read that LEDs have a fixed voltage across them; how do they do that, independent of the source? And if a resistor is meant to reduce the current, it has to be placed in parallel with the LED, right?

As you can probably tell, this isn't my strong suit, and my confusion is rather general. I may also have gotten some technical terms wrong, since English isn't my native language, but I hope someone can understand and point me in the generally right direction.
 
  • #2
Are you connecting more than one LED?

If not, you just subtract the LED voltage from the supply voltage and divide the result by the current you want to use.

Say, for example, you want to power a 1.5 V LED at 30 mA from a 9 V supply. The resistor value would be (9 V - 1.5 V) / 30 mA = 250 ohms.

The power dissipated by the resistor is 7.5 V x 30 mA = 225 mW. I'd suggest choosing a power rating for the resistor of double this, i.e. 450 mW.

Edit: By the way, the resistor needs to be in series with the LED, usually between the positive rail and the anode of the LED.
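
Here is the same arithmetic as a short Python sketch, using the example values above and the 2x power-margin rule of thumb:

```python
# A quick sketch of the arithmetic above.
# Values are from the example: 1.5 V LED at 30 mA from a 9 V supply.
v_supply = 9.0   # supply voltage (V)
v_led = 1.5      # LED forward voltage (V)
i_led = 0.030    # desired LED current (A)

r = (v_supply - v_led) / i_led   # series resistance (ohms)
p = (v_supply - v_led) * i_led   # power dissipated in the resistor (W)

print(f"Resistor: {r:.0f} ohms")                   # 250 ohms
print(f"Dissipation: {p * 1000:.0f} mW")           # 225 mW
print(f"Suggested rating: {2 * p * 1000:.0f} mW")  # 450 mW (2x margin)
```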
 
  • #3
Thanks for your reply. For now it's just the one LED I'm thinking of, but what if I did want more?

From your explanation I think I can distill that, since the LED has a fixed voltage, the rest of the voltage just drops across the resistor if you put it in series (is that the correct term, as in "in series" as opposed to "in parallel"?). Am I right so far? So you just choose the resistor to get the right current through U = I*R.

I'm not sure what you mean by the last part, though. What's a power rating?
 
  • #4
You have it mostly right. See the "considerations in use" section of this Wikipedia article about LEDs, for example:

http://en.wikipedia.org/wiki/Led

Yes, the resistor is placed in series with the forward-biased LED (conventional current flowing from anode to cathode). The series resistor drops the voltage down to just leave the 1.6 V to 2.0 V that the LED needs to light up. You adjust the brightness of the LED by how much current you let flow through it, which you set with the value of the series voltage-dropping resistor.

Real resistors have maximum power ratings. A common little through-hole resistor will typically be rated about 1/4 W, for example. If you look up resistors in the Digikey catalog or wherever you buy your parts, they will be sorted by power rating. In the case that Delta cites, he calculated that his voltage-dropping resistor would be dissipating a steady 225 mW, which really is too much to run through a 1/4 W resistor for reliable operation. The next step up would be to use a 1/2 W resistor.
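
As a rough sketch of that selection step, the following picks the smallest standard rating that gives a 2x margin. The list of common through-hole ratings is an assumption; check what your supplier actually stocks:

```python
# Pick the smallest standard resistor power rating giving a 2x margin.
# The list of common through-hole ratings is an assumption; adjust it
# to whatever your parts supplier actually stocks.
STANDARD_RATINGS_W = [0.125, 0.25, 0.5, 1.0, 2.0, 5.0]

def choose_rating(dissipation_w, margin=2.0):
    for rating in STANDARD_RATINGS_W:
        if rating >= dissipation_w * margin:
            return rating
    raise ValueError("dissipation too high for a standard small resistor")

print(choose_rating(0.225))  # 225 mW * 2 = 450 mW -> 0.5 W part
```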
 
  • #5
Thanks. So basically the power rating is just how much power it can handle.

I bought this cheap keychain LED flashlight; it's made up of three batteries (I think 1.55 V ones, but I'm not sure) and an LED (no idea what kind of LED). There doesn't appear to be any resistor in it, unless one of the connecting pieces of metal acts as one, but they're a bit too big for that, I think. So does that mean the LED is unprotected and thus not going to live very long?

Also, does an LED need a minimum amount of voltage (like the 1.5 V mentioned in the example)? I noticed that with one battery the LED didn't light up whatsoever, but with two or three it worked quite well.
 
  • #6
When an LED is forward biased, i.e. connected with the correct polarity to make it turn on, it has very little resistance, so according to Ohm's law we'll get a lot of current. Different LEDs have different forward voltage and current requirements which must be met. If you fail to follow the manufacturer's spec for forward current, you'll destroy the LED; hence you need a current-limiting resistor.

In addition to the current and polarity requirements, the LED also needs a minimum (forward) voltage drop across its junction to "turn on" or conduct. Different color LEDs have different forward voltage requirements, for example (I think) 1.7 V for red, 2 V for green, and 2.3 V for yellow. As a comparison, a regular silicon diode needs approximately 0.7 V of forward voltage to conduct.
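
As a sketch, those rough per-color values can be dropped straight into the resistor formula from earlier in the thread. The 20 mA default current is an assumption, and real parts vary, so treat the numbers as placeholders:

```python
# Rough per-color forward voltages as quoted above; real parts vary,
# so treat these as placeholders and check the datasheet.
TYPICAL_VF = {"red": 1.7, "green": 2.0, "yellow": 2.3}

def series_resistor(v_supply, color, i_led=0.020):
    """Series resistor for a given supply and LED color (20 mA assumed)."""
    return (v_supply - TYPICAL_VF[color]) / i_led

print(f"{series_resistor(9.0, 'red'):.0f} ohms")  # (9 - 1.7) / 0.02 = 365
```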
 
  • #7
Alright, that's what I figured. Thanks.

However, the LED in the flashlight: is it just unprotected, or is it likely there is some built-in resistor or something? (I'm thinking of pulling out the LED and wondering if I'd need a resistor.)

Also, is there any way to figure out what an LED requires in voltage and current?
 
  • #8
It's likely that the LED has a built-in resistor. You can get those, but the resistor value is obviously already set for some particular supply voltage and desired LED current.

To figure out an LED's voltage and current requirements, just hook it up to a variable power supply and a resistor. Measure the current-versus-brightness curve, as well as the V-I characteristic of the LED. That will help you plan how best to use the LED in a circuit.
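
One way to sketch that in Python: fit the simple linear model V = Vf + I*R to a few (voltage, current) points taken above the knee of the curve. The measurement values below are invented for illustration:

```python
# Fit a simple linear LED model V = Vf + I * R to a few measured points
# taken above the knee of the V-I curve. The points below are invented
# for illustration; replace them with your own readings.
points = [(1.85, 0.005), (1.95, 0.010), (2.15, 0.020)]  # (V, A)

n = len(points)
mean_v = sum(v for v, _ in points) / n
mean_i = sum(i for _, i in points) / n
# Least-squares slope = effective series resistance,
# intercept = turn-on voltage.
r_eff = (sum((i - mean_i) * (v - mean_v) for v, i in points)
         / sum((i - mean_i) ** 2 for _, i in points))
v_on = mean_v - r_eff * mean_i
print(f"turn-on ~ {v_on:.2f} V, effective resistance ~ {r_eff:.0f} ohms")
# -> turn-on ~ 1.75 V, effective resistance ~ 20 ohms
```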
 
  • #9
Thanks. So, just check at what current it lights up well?

I have another question. Given the formula R = (VS - VL) / I, where VS and VL are the voltages of the source and the LED, does that mean that if you just choose the voltages right (VS = VL), you don't need a resistor, since you end up with R = 0 / I = 0? I figure, though, that the current would become too big then. But if there is no voltage left for the resistor, how can a resistor decrease the current?
 
  • #10
NanakiXIII said:
But if there is no voltage left for the resistor, how can a resistor decrease the current?

In this case it is important to realize that LEDs do have an internal resistance; for a common red LED this is around 22 ohms. You can use this in the equation to select a source voltage that does not require an external resistor.
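
A small sketch of that idea; the 1.8 V turn-on and 20 mA max current are the red-LED figures quoted later in this thread, and the simple linear model is an approximation:

```python
# With the LED's internal resistance included, the circuit current is
# I = (Vs - Vf) / (R_ext + R_int). Setting R_ext = 0 gives the largest
# supply you can connect directly. The 1.8 V turn-on and 20 mA max are
# the red-LED figures quoted later in this thread.
V_F = 1.8      # turn-on voltage (V)
R_INT = 22.0   # internal resistance (ohms)
I_MAX = 0.020  # maximum forward current (A)

v_supply_max = V_F + I_MAX * R_INT
print(f"Max direct supply: {v_supply_max:.2f} V")  # 1.8 + 0.44 = 2.24 V
```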
 
  • #11
For typical LEDs, it would be impractical to create a supply just large enough to power the LED, remembering that an LED is a discrete device. You would be hovering around a critical area of the LED's characteristic curve. There will already be variations in the LED's performance due to manufacturing tolerances, temperature, etc. So by supplying exactly this voltage, any minor fluctuation may produce dramatic changes in the current running through the LED, and therefore shorten the life of the device.

In essence, it is best to provide a supply voltage ample enough to cover the turn-on voltage of the LED, and then control the LED's output via a current-limiting resistor in series.

To find the right resistor without knowing the LED's characteristics, put, say, a 330 ohm resistor in series and take some measurements of current against an input voltage between 0 and 10 V. The curve is exponential and looks a bit like a backward L. The point at which the two lines roughly form the corner of the L is the turn-on **voltage [1]. Play with the voltage until the LED is at the ideal brightness (but not so bright as to overdrive it), and take the current reading [2].

These two values, [1] and [2], will give you a guide to the LED's characteristics and let you decide the resistor value for any given supply voltage; see the sketch below.

Edit: **Voltage across the LED, NOT the power supply.
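
A sketch of that calculation, with hypothetical placeholder values for the two measurements:

```python
# Turn the two measured values from the procedure above into a resistor
# value for any supply. The measurements here are hypothetical placeholders.
V_ON = 1.9       # [1] voltage across the LED at turn-on (V)
I_IDEAL = 0.015  # [2] current at the ideal brightness (A)

def resistor_for(v_supply):
    if v_supply <= V_ON:
        raise ValueError("supply must exceed the LED turn-on voltage")
    return (v_supply - V_ON) / I_IDEAL

for vs in (5.0, 9.0, 12.0):
    print(f"{vs:>4} V supply -> {resistor_for(vs):.0f} ohms")
```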
 
  • #12
Thanks, I'll have a look sometime, if I can find the right equipment.

I went and bought some LEDs today and also asked the guy at the counter about the resistors. He said that if I just connect an LED to a 3 V source (they're 3 V LEDs), I wouldn't need any resistors. The packaging seems to agree, as it lists what kind of resistors I'd need for the LEDs, but only for 5 V and up. You don't seem to agree with this, though. Was the guy wrong?

Also, something else I was wondering: if it's true and the LEDs will run just fine on 3 V, could I just put two LEDs in series and connect them to 6 V? Or doesn't it work that way? Or should I put them in parallel on 3 V?
 
  • #13
It depends.
Take a typical red LED with a turn-on voltage of 1.8 V, an internal resistance of 22 ohms, and a max current of 20 mA.

At the max current, the internal resistance will develop about a 0.44 V drop (22 ohms x 20 mA). So a power supply voltage of less than about 2.24 V and greater than 1.8 V will work without damaging the LED.

I don't know the required parameters (internal resistance, max current, and turn-on voltage) for your LEDs. You need this information to answer your question.

In general, when you wire LEDs in series, the resistances and turn-on voltages add while the max current remains constant. If one LED will work with 3 V, then two in series will work with 6 V; see the sketch below.

Be careful to note that a so-called 1.5 V battery is not necessarily 1.5 V.

In this case, the low internal resistance of a power supply will let you run them in parallel on 3 V. Normally, parallel operation of LEDs with a single external resistor will run into a problem with small variations in turn-on voltage.
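
A sketch of the series rule, using the red-LED figures from the start of this post; note how the safe supply window simply doubles for two LEDs:

```python
# Series rule from this post: turn-on voltages and internal resistances
# add, while the max current stays the same. Red-LED figures from above.
V_F, R_INT, I_MAX = 1.8, 22.0, 0.020
N = 2  # LEDs in series

v_on = N * V_F                 # 3.6 V: string turn-on voltage
r_int = N * R_INT              # 44 ohms: string internal resistance
v_max = v_on + I_MAX * r_int   # 4.48 V: max direct supply for the string

print(f"{N} in series: works between {v_on:.2f} V and {v_max:.2f} V")
# One LED: 1.80 V to 2.24 V; two in series: 3.60 V to 4.48 V.
# The safe window doubles, which is why 3 V for one implies 6 V for two.
```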
 
  • #14
I agree 100% with NoTime's comments. Please download the datasheet for the LEDs you are using. Always make it a habit to read the datasheet; otherwise you are just guessing at important points.

In the real world, we hold design reviews, with datasheets for parts on the BOM (bill of materials) included.
 
  • #15
I'm not sure where to get the datasheet. I looked on the manufacturer's site (Velleman), but I couldn't find it. They do have a list of "Info Sheets", but I can't find any number on the packaging, or on their product page, that matches anything in the list. Where do you generally get datasheets?
 
  • #16
I did find this on the Velleman site:

http://www.calcentron.com/PDF_Documents/led_pdf/LED%20Spec.pdf

Reading it, it says to call them if you need more detailed info.

You can test the device for turn-on voltage and internal resistance. Max current is more problematic.
 

1. What is the purpose of a resistor in an LED circuit?

A resistor is used to limit the amount of current that flows through an LED. LEDs have a specific voltage drop, and without a resistor, they can draw too much current and burn out. The resistor ensures that the LED receives the correct amount of current to function properly.

2. How do I calculate the value of a resistor for an LED circuit?

To calculate the value of a resistor for an LED circuit, you need to know the supply voltage, the LED's voltage drop, and the desired current. You can use Ohm's law (R = V/I) to calculate the resistance needed. For example, if the supply voltage is 5 V, the LED's voltage drop is 2 V, and you want a current of 20 mA, the resistor value would be (5 V - 2 V) / 0.02 A = 150 ohms.

3. Can I use any resistor for an LED circuit?

No, not all resistors are suitable for LED circuits. The resistor must have a high enough power rating to handle the current flowing through it. It is best to use a resistor with a power rating at least twice the calculated power dissipation. Additionally, resistors have a tolerance, so it is essential to choose one with a tolerance that will not significantly affect the desired current.

4. Is it better to use a higher or lower resistance for an LED circuit?

It depends on the desired current and the LED specifications. Using a higher resistance will result in a lower current, which can increase the LED's lifespan. However, if the resistance is too high, the LED may not light up at all. It is essential to calculate the resistance based on the desired current and the LED's voltage drop to ensure it is not too high or too low.

5. Are there any other factors to consider when choosing a resistor for an LED circuit?

Yes, you should also consider the temperature coefficient and the temperature rating of the resistor. LEDs are sensitive to changes in temperature, so it is crucial to choose a resistor with a low temperature coefficient to minimize the impact of temperature changes on the circuit. Additionally, the resistor's temperature rating should be higher than the expected operating temperature of the circuit to ensure it does not overheat.
