Driving LEDs with 12V

  1. Feb 8, 2017 #1
    Hi all, a bit of a hypothetical for you.

    If I wish to drive a 3V LED at 20 mA from a 12V battery, I can use a dropper resistor. I subtract the LED's forward voltage from the 12V to get 9V, and divide this by the 20 mA, to get the value of resistor needed.

    R=V/I = 9/0.02 = 450 ohm.
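    For reference, here is how I'm computing it, as a small Python sketch (the helper name and values are just my own, for illustration):

    ```python
    # Minimal sketch of the dropping-resistor calculation above
    # (12 V supply, 3 V forward drop, 20 mA target, as in this post).

    def dropper_resistor(v_supply, v_forward_total, i_led):
        """Series resistance that sets the LED current to i_led."""
        v_across_r = v_supply - v_forward_total
        if v_across_r <= 0:
            raise ValueError("no voltage headroom left for the resistor")
        return v_across_r / i_led

    print(dropper_resistor(12.0, 3.0, 0.020))  # 450.0 ohms
    ```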

    However, if I want to drive four 3V LEDs in series, the combined forward voltages will 'use up' the full 12V, so I have none left to drop across the resistor. I know I still need a resistor to limit the current, but how do I calculate its required value, if I want 20 mA?

    Again, a theoretical problem. I realise I could rearrange the LEDs in the real world, but it bothers me that I can't work out this simple 'thought experiment'.
     
  3. Feb 8, 2017 #2
    It's not just a thought experiment; it reflects a real-world issue in driving LEDs (you just state an extreme case).

    (edit, had a slip up in my first example, this should hopefully be correct)...

    First, the 3V forward drop (Vfd) you mention will vary from device to device, over temperature, and may even drift a bit as the device ages. And that Vfd varies with current - your resistor and voltage source will have tolerances as well, so you won't get precisely 20 mA. The drive circuit must account for these real-world variations.

    As the Vfd becomes a higher and higher portion of the source voltage, the problem becomes worse. Consider your one LED versus three series LEDs (instead of your extreme case of four). For your one LED, I'll copy your calculation:

    R=V/I = 9/0.02 = 450 ohm

    Now imagine a 10% increase in source voltage (and assume constant Vfd for a 1st order approximation): I = (13.2 - 3)/450 = 22.67 mA. A 13.3% increase in current for a 10% voltage increase (not bad, and less IRL, as the Vfd will increase a bit).

    But now try again with three LEDs. At 12V source and 3V Vfd assumption, you have (Vfd total = 3*3 = 9V). 12V - 9V = 3V for R:

    R = V/I = 3/.02 = 150 ohm.

    And a 10% increase in source voltage gives: I = (13.2-9)/150 = 28 mA - a 40% increase in LED current.
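    For reference, a small Python sketch that reproduces both cases above (same constant-Vfd assumption; the helper is just illustrative):

    ```python
    # One LED with 450 ohm vs three LEDs with 150 ohm, nominal 12 V supply and a
    # 10% high supply, holding Vfd constant at 3 V per LED (1st order approximation).

    def led_current(v_supply, v_forward_total, r):
        return (v_supply - v_forward_total) / r

    for n_leds, r in [(1, 450.0), (3, 150.0)]:
        vf = 3.0 * n_leds
        i_nom = led_current(12.0, vf, r)
        i_high = led_current(13.2, vf, r)   # supply 10% high
        change = (i_high - i_nom) / i_nom * 100
        print(f"{n_leds} LED(s): {i_nom*1e3:.1f} mA -> {i_high*1e3:.1f} mA (+{change:.0f}%)")
    ```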

    You can see that the lower the voltage across the R, the more sensitive the circuit is to variations. LEDs 'want' to see a current source. A high voltage and high resistance look more like a current source. A low voltage and low R look more like a voltage source. With zero resistance, the LED current depends only on the point on the curve that those LEDs are at, and any drift can throw them into being dim, or into over-current that leaves them forever dim (damage levels).

    Bottom line: account for the variations in the drive circuit so that worst-case conditions still fit your desired output.
     
    Last edited: Feb 8, 2017
  4. Feb 8, 2017 #3
    Hi,

    That's OK, since this 'experiment' has no solution.
    If the sum of forward voltages (at the required current) is higher than the available power source voltage then whatever you do (with resistors), you won't be able to reach the required current. The circuit will operate at lower current. (I mean, if it will operate at all...)

    Of course, if it's not just about resistors, but (for example) about a boost type LED driver (which is recommended for such tasks) then it's a different story entirely.
     
  5. Feb 8, 2017 #4
    I hadn't considered that the voltage across R would affect the current/voltage relationship of the LEDs. No wonder the resistive dropper is frowned upon somewhat.
     
  6. Feb 8, 2017 #5
    I won't be able to reach the required current? I had assumed that if I connected four LEDs in series across a 12V battery they would instantly pop from overcurrent. Or do you think that adding any resistance at all (even the resistance of the wires) would induce a voltage drop and render the circuit unable to light any of the LEDs?
     
  7. Feb 8, 2017 #6
    A resistive dropper is fine for many circuits: simple, reliable, cheap. But that dropping R wastes power, which is not a concern in some cases (indicators, etc.). In a power LED circuit, and if efficiency is important, you will most likely want a switching circuit that converts a voltage source (battery or mains) into a constant current source. A constant current source is ideal - as those LED characteristics shift, the circuit adjusts and maintains your 20 mA (or whatever point you design for).
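    To put a number on the wasted power, here is a rough Python sketch using the same 12 V / 20 mA figures from this thread (illustrative only):

    ```python
    # With a resistive dropper, everything across R is dissipated as heat.
    # Compare one 3 V LED against three in series, both run at 20 mA from 12 V.

    V_SUPPLY = 12.0
    I_LED = 0.020

    for n_leds in (1, 3):
        p_leds = n_leds * 3.0 * I_LED      # power delivered to the LEDs
        p_total = V_SUPPLY * I_LED         # power drawn from the supply
        print(f"{n_leds} LED(s): {p_leds / p_total:.0%} of the power reaches the LEDs")
    ```

    So stacking LEDs improves efficiency, but as noted above it also costs you regulation headroom across R.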
     
  8. Feb 8, 2017 #7
    The problem is, you are on a sharp curve of the characteristics of the LED. It is difficult to predict exactly what will happen due to normal tolerances at that point. You can try to approximate it by looking closely at that point on the curve. But that's just not a good approach.

    If the LEDs actually all draw 20 mA right at 3 V, and are well matched, you might get away with connecting four of them in series across a 12V supply. But take my earlier example to that extreme - any shift in parameters can easily take those LEDs into a destructive over-current condition.

    You should also be aware of "thermal runaway". I think you'll find that the current at a particular voltage increases with temperature. So when those LEDs turn on, they warm up and draw more current, which causes more heating and therefore MORE current - this can escalate until destruction.
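    If it helps to see the feedback loop, here is a toy Python model of that runaway mechanism. All numbers are invented for illustration (not from any datasheet); the point is only that the outcome depends on how strongly the current rises with temperature:

    ```python
    # At a fixed applied voltage, assume current rises by `tempco` (fraction per degC
    # of junction temperature) and the junction runs theta_ja degC above ambient per watt.
    # Iterate the loop: if the feedback is strong enough, the current never settles.

    def run(tempco, theta_ja=400.0, v_led=3.0, i_25=0.020, t_amb=25.0, steps=40):
        t_j = t_amb
        for _ in range(steps):
            i = i_25 * (1 + tempco * (t_j - 25.0))
            if i > 0.2:                     # call anything over 200 mA "destroyed"
                return "thermal runaway"
            t_j = t_amb + theta_ja * v_led * i
        return f"settles near {i * 1e3:.1f} mA"

    print("weak temperature dependence:  ", run(tempco=0.01))
    print("strong temperature dependence:", run(tempco=0.05))
    ```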
     
  9. Feb 8, 2017 #8
    It won't.
    Here is a random forward voltage-current characteristic:
    [attached plot: LED forward current vs. forward voltage]
    In your case, you will get the current (and luminosity) that corresponds to a 3 V forward voltage: some 2-3 mA, no more. All the LEDs will carry the same current, of course.
    They'll barely light.
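    Here is a rough Python sketch of that operating point, using a made-up exponential diode model shaped so the LED drops roughly 3.15 V at 20 mA (none of these parameters come from a real datasheet; they just mimic a steep curve like the one above):

    ```python
    import math

    N_VT = 0.07                              # assumed ideality * thermal voltage, volts
    I_S = 0.020 / math.exp(3.15 / N_VT)      # scale current chosen so I(3.15 V) = 20 mA

    def led_voltage(i):
        return N_VT * math.log(i / I_S)

    def series_current(v_supply, n_leds, r_series, i_lo=1e-9, i_hi=1.0):
        """Bisect for the current where the LED drops plus the resistor drop equal the supply."""
        for _ in range(100):
            i_mid = (i_lo + i_hi) / 2
            if n_leds * led_voltage(i_mid) + i_mid * r_series > v_supply:
                i_hi = i_mid
            else:
                i_lo = i_mid
        return (i_lo + i_hi) / 2

    # Four LEDs straight across 12 V with only an ohm of wiring resistance:
    print(f"{series_current(12.0, 4, 1.0) * 1e3:.1f} mA")   # a few mA, nowhere near 20 mA
    ```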
     
  10. Feb 8, 2017 #9
    Thanks to you both. That's cleared things up nicely!
     