
Voltage required in an LED circuit

  1. Sep 26, 2012 #1
    I am currently working on making two LED eyes for a costume, which needs a top row of 2x LEDs, a middle row of 6x LEDs and a bottom row of 2x LEDs. I am trying to figure out how many volts are needed to light the circuit (at which point I can pick a resistor depending on what batteries I choose). Each LED on this eye is 2.2 V, except one of the middle ones, which is a maximum of 2.4 V as it is a replacement for one that died.

    I am in a hurry to complete this as it must be done by Friday. I have attached a picture of my setup (done in Paint).
     


  3. Sep 26, 2012 #2
    If the working voltage for the other 9 LEDs is 2.2 V, you can just apply the series and parallel circuit rules. For simplicity, it's just 13.2 V. However, how the power divides between that special LED and the one in series with it depends on their relative resistances.

    Is that what you want?
     
  4. Sep 26, 2012 #3
    Could you show me how you calculated that? I have one more set to make, which will vary since I only had 20x of the 2.2 V LEDs (4x of which are dead). Also, will the voltage across the top 2x and bottom 2x equal the voltage across the middle 2x and end 2x?

    I'm trying to see what resistor value I would need for the top and bottom so that they don't blow up.
     
  5. Sep 26, 2012 #4
    So you have 10 LEDs: 9 of them are identical, and the remaining one is special. Because the middle part is a parallel circuit, you are right that the terminals of those 3 series branches share the same potential difference.

    Let's make it simple: suppose all 10 LEDs are identical; then the result is easy to see. Starting from the leftmost LED:

    Add the voltage drop:
    {2.2+2.2}
    2.2 + 2.2 +{2.2+2.2} + 2.2 +2.2
    {2.2+2.2}

    the sum is 13.2 V. Now the special LED makes the parallel section a little more complex because of its different maximum voltage. One thing is certain: without knowing the resistances, we cannot solve the problem exactly (one equation with two unknowns).

    Here's the solution(s)
    1. Use common sense. These LEDs should be similar to each other, so you cannot have 9 LEDs at 200 Ohm and another at 200,000 Ohm. That means the voltage is roughly evenly distributed over the two LEDs in the middle branch, even though they are different, so neither sees a very high voltage: both run at around 2.2 V.

    2. If you are not sure, use a meter to test the LED if you have one. That's the easiest and most reliable method.

    Moreover, you don't want to increase the voltage over the 3 middle branches just because one extra LED can withstand an extra 0.2 V. If you apply 2.4 V there, then all the other LEDs in parallel (at least the four in the middle branches) will be above their 2.2 V working voltage. The general rule is to make sure all of them operate safely.
     
  6. Sep 26, 2012 #5
    Oops, the spaces were stripped. This is what I meant:

    ***********[ 2.2 + 2.2 ]************
    ---2.2+2.2---- [ 2.2 + 2.2 ]----2.2+2.2----
    ***********[ 2.2 + 2.2 ]*************
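    The bookkeeping in the corrected diagram can be sketched in a few lines of Python (a minimal sketch; the 2.2 V forward drops are the ones quoted above, and the key point is that the parallel branches count only once):

    ```python
    # Series/parallel voltage bookkeeping for the eye layout above.
    left = [2.2, 2.2]            # the two series LEDs entering the middle
    middle_branch = [2.2, 2.2]   # one of the three identical parallel branches
    right = [2.2, 2.2]           # the two series LEDs after the middle

    # Parallel branches share the same potential difference, so the middle
    # section contributes the drop of ONE branch, not the sum of all three.
    total = sum(left) + sum(middle_branch) + sum(right)
    print(round(total, 1))  # minimum supply voltage, ignoring any resistor -> 13.2
    ```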
     
  7. Sep 26, 2012 #6
    Ok, thanks. I tried a 9 V battery (9 < 13.2) to see if it would light any of them up, but not even one lit.
     
  8. Sep 26, 2012 #7
    You need to know the current required for the LEDs, which are current-driven devices. An ordinary diode usually has about a 0.7 V drop, so if these are rated at 2.2 V there is probably a series resistor in there to limit the current under some limit and keep the LED from being destroyed.

    You can see how well they work if you can use some kind of variable voltage supply.

    LEDs usually run at about 30 mA or so per bulb, so if they are in series, you have to multiply the voltage by the number in series. The ones in parallel draw a higher total current, but each lamp still uses the same current, about 30 mA, so in the triple circuit there would be about 90 mA flowing.

    You might want to just put those three in series so you only have one path with 30 mA or so flowing, which would simplify the voltage and resistor situation. That is, if you can put them in series; the wiring could be wrapped up inside cloth or somehow hidden, but if you can get to the wires, you can certainly put them in series. Then just add up all the 2.2 V drops: for the total of 10 LEDs, that would need 22 V to run.

    Or you could put the whole lot in parallel and run on just 3 V or so, at 30-odd mA times the number of bulbs: 10 bulbs in parallel would be about 1/3 A (~300 mA), roughly 1 W total, not much power required.

    If they are in parallel, a single bulb opening up will not stop all the lights. That is how Christmas tree lights work: all in parallel, so if one bulb burns out, the rest still light up.

    The downside is that if a bulb shorts out, it would pull down everything, but usually they fail open, with a high resistance.
     
  9. Sep 26, 2012 #8
    Here is the voltage map of the second eye (it's more confusing):
    ***********[ 2.2 + 2.2 ]************
    ---2.6+2.2---- [ 2.4 + 2.2 ]----2.2+2.6----
    ***********[ 2.2 + 2.2 ]*************

    I don't have any way to replace any that break as of now, and I want to power all of these together. What battery combination would be best (typical AA [1.2 V], 9 V)? Also, what value of resistor should I use, and where exactly should it go? I'm probably going to wire the eyes from one directly to the other and return back to the battery, rather than going from the battery directly to each individually (unless individually would be better).
     
  10. Sep 26, 2012 #9
    Ah, so you're saying wire them in a sort of 'S' pattern in the middle. That would make things a lot easier (except for soldering the thing together, but oh well).
     
  11. Sep 26, 2012 #10
    Ok, I connected them in series, so the first eye would be (2.2×9 + 2.4) = 22.2 V and the second eye would be (2.2×7 + 2.4 + 2.6×2) = 23 V. Now, when I connected the first eye to a 19.2 V battery, absolutely nothing lit up. I used a meter to confirm that each LED lights up individually while connected. Do I need to hook it up to 22.2 V in order to see it light up?
     
  12. Sep 26, 2012 #11
    You have to supply at least the rated voltage to an LED or no current will flow at all.
    You also need a resistor to limit the current, because a very large current can flow if the supplied voltage is only a little greater than the rated voltage.
    If you put 2 LEDs in parallel, they each need their own resistor. The voltage ratings of the LEDs won't be exactly identical, and they depend on temperature as well: if one LED gets a tiny bit hotter, its voltage will go down, it will rob current from the other LED, making it even hotter, and so on.
     
  13. Sep 27, 2012 #12

    CWatters (Science Advisor, Homework Helper)

    What willem2 said.

    It's not a great idea to connect a constant voltage source directly to an LED. It's much better to control the current through them using a series resistor. Most 5 mm LEDs need around 10-20 mA, but look up the spec. You don't need 10 resistors; LEDs in series can share a resistor, e.g.

    -led-led-resistor

    If you had a 9V battery you could connect up the LEDs in threes...

    -led-led-led-resistor-

    3 * 2.2 = 6.6
    so
    R = (9-6.6)/0.01 = 240 Ohms

    If you have a 19.2V battery and need 10 Leds then you could make two strings of 5 LEDs in series with a resistor...

    ----led-led-led-led-led-Resistor----
    ----led-led-led-led-led-Resistor----

    5 x 2.2 = 11V

    R = (19.2 - 11) / 0.01 = 820 Ohms
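    Both worked examples above follow the same formula, which can be wrapped in a small helper (a sketch; the function name and the 10 mA default are my own choices, while the numbers come from this post):

    ```python
    def series_resistor(v_battery, v_led, n_leds, i_led=0.01):
        """Resistor for a string of n_leds identical LEDs in series."""
        v_drop = v_battery - n_leds * v_led  # voltage left over for the resistor
        if v_drop <= 0:
            raise ValueError("battery voltage too low for this string")
        return v_drop / i_led

    print(round(series_resistor(9.0, 2.2, 3)))    # 3 LEDs on 9 V    -> 240
    print(round(series_resistor(19.2, 2.2, 5)))   # 5 LEDs on 19.2 V -> 820
    ```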
     
  14. Sep 27, 2012 #13

    sophiecentaur (Science Advisor, Gold Member)

    Not true; there is no series resistor in a standard LED, but the PD across them can be anything up to 4 V (for blue). You need that many electron-volts to produce the visible photons, and the doping is chosen appropriately. The 0.7 V figure is for diodes designed to lose as little PD as possible, hence the specialist diodes (small-signal and very low power) that have much lower voltage drops.
     
  15. Sep 27, 2012 #14
    Ok, I decided that I will wire them in parallel, much like Christmas tree lights. The voltage required for such a small thing is WAY too much to run it in series (I can't be carrying 50 V around with me everywhere for 15 hours, 3 days in a row). Parallel wiring will let me run them off 2.4 V (2x AA batteries). I will ONLY be using the 2.2 V LEDs that are from unitednuclear. Now here's my new question: how do I determine what resistor is required to drop the voltage from 2.4 V to 2.2 V? Using R = V/I with I = 20 mA:
    What is the voltage in this calculation, 2.4 V or 0.2 V?
     
  16. Sep 27, 2012 #15

    sophiecentaur (Science Advisor, Gold Member)

    Work out the total current for the total number of LEDs you want to feed, then the amount you need to drop the volts by from your battery. The resistor value is then (Volts Drop)/Current. You can choose to have an R in series with every LED, all of them in parallel or in groups - the same calculation applies to every case using the appropriate current. A single series resistor feeding all LEDs in parallel might get a bit warm, though.
     
  17. Sep 27, 2012 #16
    So the total current is 7x LEDs per eye × 2x eyes × the current for each (20 mA), which gives 0.28 A. Thus it would be the voltage drop needed (0.2 V) / (0.28 A), which gives a 0.714... Ω resistor. Is this calculation correct?
     
  18. Sep 27, 2012 #17
    Oh, and I'm using only 1x resistor, which will probably be near the batteries since I don't have room to fit it in/near the eyes.
     
  19. Sep 27, 2012 #18
    <repost as I need an answer ASAP so that I don't break the thing — same calculation as in #16 above>
     
  20. Sep 27, 2012 #19

    CWatters (Science Advisor, Homework Helper)

    Running 10 LEDs in parallel from one resistor may work, BUT if the LEDs aren't matched, one or more may hog all the current while the others get none. Consider what happens if the voltage drop of one diode is, say, 2.19 V while the others need 2.2 V.

    As I said before, the best strategy is probably to chain several LEDs in series with one resistor per chain.

    1) Pick a battery voltage (eg 9V)
    2) Work out how many diodes you can put in series while staying below that voltage. For example 3 x 2.2 = 6.6V. If you tried 4 x 2.2V = 8.8V that would be too close to 9V for the current to be well controlled by the resistor.
    3) Pick a resistor to set the current. The resistor is calculated as

    R=(Battery voltage - total LED Voltage)/Current

    For example (9-6.6)/0.01 = 240 Ohms
    4) If using strings of 3 LEDs, then three strings = 9, which is one short of the 10 you wanted. Make a string of one LED with R = (9 − 2.2)/0.01 = 680 Ohms.
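    The four steps above can be sketched as a small planner (an illustrative sketch only; `plan_strings` and the 1 V headroom default are my own naming and assumption, chosen so the resistor still controls the current, per step 2):

    ```python
    def plan_strings(v_batt, v_led, n_total, i_led=0.01, headroom=1.0):
        """Split n_total LEDs into series strings that fit the battery,
        leaving at least `headroom` volts across each string's resistor."""
        per_string = int((v_batt - headroom) // v_led)  # LEDs per full string
        strings = []
        while n_total > 0:
            n = min(per_string, n_total)
            r = (v_batt - n * v_led) / i_led  # same formula as step 3
            strings.append((n, round(r)))
            n_total -= n
        return strings

    # 10 LEDs on a 9 V battery: three strings of 3 plus one single LED.
    print(plan_strings(9.0, 2.2, 10))  # [(3, 240), (3, 240), (3, 240), (1, 680)]
    ```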
     
  21. Sep 27, 2012 #20
    Let's say I am using a parallel circuit (as I have to, given the space and such). I have 14x total LEDs, all rated at 20 mA, giving a maximum of 0.28 A, and a maximum of 2.2 V each. My battery source is a duo pack of 2x rechargeable batteries giving a total output of 2.4 V.
    The formula would follow as:
    R = V/I
    R = (V_b − V_leds)/I
    R = (2.4 − 2.2)/0.28 = 0.714 Ω

    If I were to use a 10 Ω resistor, would this cause any problems? Right now it's the lowest-valued resistor that I have.

    Thanks again.
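    As a quick sanity check on the arithmetic in this post (a sketch; the 10 Ω comparison assumes the 20 mA-per-LED figure quoted in the thread above):

    ```python
    n_leds = 14
    i_led = 0.020               # 20 mA per LED
    v_batt, v_led = 2.4, 2.2

    i_total = n_leds * i_led                # 0.28 A through a single shared resistor
    r_needed = (v_batt - v_led) / i_total
    print(round(r_needed, 3))               # ~0.714 ohm, as calculated above

    # With a 10 ohm resistor, the same 0.2 V of headroom only supports
    # 0.2 V / 10 ohm = 20 mA *total*, i.e. roughly 1.4 mA per LED,
    # far below the 20 mA each one wants.
    i_per_led_mA = (v_batt - v_led) / 10 / n_leds * 1000
    print(round(i_per_led_mA, 2))
    ```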
     