Why do LEDs in a circuit need resistors in series?

AI Thread Summary
LEDs require series resistors to limit current and prevent damage. Using a single resistor for multiple parallel LEDs can lead to uneven brightness: diodes with lower forward voltages draw more current, potentially causing overheating and failure. Well-matched LEDs can be connected in series or parallel effectively, but for typical "jellybean" LEDs, separate resistors are recommended so each LED receives the correct voltage and current. Understanding why separate resistors are needed helps in designing reliable LED circuits.
Borek
I am playing with an Arduino and LEDs at the moment. An LED needs a resistor to limit current, that's clear. However, all the examples I see use a separate resistor for each diode. As far as I can tell, electrically (in terms of limiting current) it shouldn't matter much whether we use a single resistor for all the LEDs or a separate resistor for each one (see the picture). I already tried to discuss it with a friend of mine, and he told me it is better to use separate resistors, but TBH his explanation (different diodes may have different forward voltages, so if they are connected through a single resistor they can have different brightness) wasn't convincing.

So, is there a reason why we use multiple resistors?
[Attached schematic: diody_rezystory.jpg]
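For reference, the usual per-LED sizing is just Ohm's law across the resistor (the numbers here are illustrative, not from the thread: a 5 V Arduino pin, a red LED with ##V_f \approx 2## V, and a target of about 15 mA):

$$R = \frac{V_{supply} - V_f}{I_{LED}} = \frac{5\ \text{V} - 2\ \text{V}}{0.015\ \text{A}} = 200\ \Omega \approx 220\ \Omega\ \text{(nearest standard value above)}$$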
 
You can model a diode as a voltage source (the forward voltage) and a very small resistance in series with the source.

Your friend is right. If you put all the diodes in parallel, then they must have the same applied voltage. If the built-in voltages differ slightly among the diodes, then the ones with the lower forward voltage will draw more current through their internal resistance to make up the difference.

You can even have a runaway current. The diode that draws the most current gets warmer, which lowers its forward voltage drop. That makes it draw more current, which makes it even warmer, and so on.
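Rough numbers show how sensitive this is (illustrative values, not measurements: two parallel LEDs, each modeled as ##V_f## in series with ##r = 1\ \Omega##, with the pair held at ##V = 2.0## V by the shared resistor):

$$I = \frac{V - V_f}{r}: \qquad I_A = \frac{2.0 - 1.9}{1} = 100\ \text{mA}, \qquad I_B = \frac{2.0 - 2.0}{1} = 0\ \text{mA}$$

A 0.1 V spread in forward voltage is enough to steer essentially all of the current through one diode.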
 
If the LEDs are very well matched, then you can use one resistor (or a current driver circuit). But general batches of jellybean LEDs from the local electronics supply are not well matched, so only 1 or 2 of the several LEDs hooked in parallel to a single resistor will be at full brightness. The others will be dimmer. So for jellybean LEDs, it's best to use separate resistors.

Better LEDs, like the ones used in LED lighting fixtures, are well matched, so they can be connected in series or parallel and have the same brightness.

EDIT -- beaten out by Aaron! :smile:
 
Because the voltage seen by the LEDs would vary depending on whether 1, 2, or 3 are lit. You want each LED to get all the voltage it needs no matter what the other LEDs are doing.
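A quick back-of-the-envelope check (illustrative values: 5 V supply, one shared 220 Ω resistor, ##V_f \approx 2## V, and perfectly matched LEDs so the current splits evenly):

$$I_{total} = \frac{5 - 2}{220} \approx 13.6\ \text{mA}$$

With one LED lit, it gets all ≈13.6 mA; with three lit, each gets only ≈4.5 mA. Every LED dims as more of them turn on, which is exactly the coupling a per-LED resistor avoids.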
 
Aaron Crowl said:
You can even have a runaway current.
That's a good point. So it may only be practical to connect well-matched LEDs in series, not in parallel.
 
OK, you have convinced me :wink:

No, seriously, it is not that I didn't believe it; it is that I want to understand why instead of just parroting the solution. And I think I get it now.

Yesterday I managed to measure the PWM from my model receiver; tonight I plan to do the same, but displaying the result using a shift register. Eventually I want to show the pulse width on a display (which I don't have yet).
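For anyone following along, here is a minimal sketch of that kind of measurement (not the thread's actual code; the pin number and baud rate are assumptions):

Code:
// Read a hobby-RC receiver channel with pulseIn().
const int RC_PIN = 2;   // receiver signal wire (assumed wiring)

void setup() {
  pinMode(RC_PIN, INPUT);
  Serial.begin(9600);
}

void loop() {
  // A typical RC servo pulse is 1000-2000 us high, repeating every ~20 ms.
  // pulseIn() blocks until it times one complete HIGH pulse,
  // or returns 0 after the 25 ms timeout.
  unsigned long widthUs = pulseIn(RC_PIN, HIGH, 25000UL);
  if (widthUs > 0) {
    Serial.println(widthUs);   // pulse width in microseconds
  }
}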

 
  • Like
Likes berkeman
anorlunda said:
Because the voltage seen by the LEDs would vary depending on whether 1, 2, or 3 are lit. You want each LED to get all the voltage it needs no matter what the other LEDs are doing.

Is this why it's necessary for LEDs in a circuit to have resistors in series? The answer probably depends on the type of circuit. This is the circuit I'm referring to, on a breadboard:

 