
Total Current Draw Of LED And Resistor Combined

  1. Jun 18, 2012 #1
    I am using a 3.3 volt LED rated at 25 mA, powered from 12 volts with a 1000 ohm resistor in between. Ohm's law says amps = volts / ohms, so 12 / 1000 = 12 mA. Is this all the LED and resistor combined will draw? I know the LED can only handle 25 mA, but if the LED were drawing that much current you would need fewer ohms. Would you still need to add the resistor into the equation?

    John
     
  3. Jun 18, 2012 #2

    vk6kro

    User Avatar
    Science Advisor

    The voltage across the resistor will be 12 volts minus the LED voltage: 12 volts minus 3.3, or 8.7 volts.

    So the current in the resistor would be 8.7 volts divided by 1000 ohms or 8.7 mA. This would be the LED current.

    The LED will not be as bright as it could be.

    To run it at full current, you can calculate a new resistor size like this:

    R = 8.7 volts / 0.025 amps = 348 ohms

    So, you would round up to a readily available size such as 470 ohms. This would give a current of 18.5 mA, which is probably safer anyway unless you really want every last bit of brightness.

    You could get 348 ohms with a 330 and an 18 ohm resistor in series, but it probably isn't worth the risk of damaging your LED.
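
    As a quick check of the arithmetic above, here is a minimal Python sketch (the values are the ones used in this thread):

    Code (Python):
    V_SUPPLY = 12.0   # supply voltage, volts
    V_LED = 3.3       # LED forward voltage drop, volts
    I_MAX = 0.025     # rated LED current, amps

    v_resistor = V_SUPPLY - V_LED    # 8.7 V left across the resistor
    i_led = v_resistor / 1000.0      # 0.0087 A = 8.7 mA with the 1 kohm resistor
    r_full = v_resistor / I_MAX      # 348 ohms for the full 25 mA

    print(f"LED current with 1 kohm: {i_led * 1000:.1f} mA")
    print(f"Resistor for 25 mA: {r_full:.0f} ohms")
    print(f"Current with 470 ohms: {v_resistor / 470 * 1000:.1f} mA")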
     
  4. Jun 19, 2012 #3

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    Good advice for this sort of calculation, in general: write down what you actually know at the start and then gradually fill in the details as they can be deduced. If you are told the voltage drop across the LED, use that information for the next step. This applies even to very complex circuit analysis - just like manually solving sets of simultaneous equations, as a matter of fact - you solve the bits that you can first and then substitute the values into the next ones, and so on.
     
  5. Jun 20, 2012 #4
    I can understand how much current an LED gets when using a resistor, but what I would like to know is a formula for how much current the resistor itself draws as the voltage and resistor values change in different LED circuits. If you had two ammeters, one after the voltage supply and the other after the resistor, you could just subtract the second reading from the first.

    John
     
  6. Jun 20, 2012 #5
    Current through a series circuit is the same everywhere.

    So if you placed an ammeter in series between the voltage source and one end of the resistor, and another in series between the opposite end of the resistor and the LED, you would see the same current on both ammeters.
     
  7. Jun 20, 2012 #6

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    Like I said - do things in the appropriate order. You start with the specified voltage drop across the LED (different from colour to colour and for different powers). Then you find out from the spec sheet how much current it needs for this power. You can calculate how many volts you 'have left' after the drop across the LED. This voltage and the required value of current will tell you what value of resistor is necessary.

    P.S. If you have two ammeters in a simple series circuit, they will both read the SAME, because the current is the same all the way round (where else could it go???). I think you don't need a 'formula', but you do need to start thinking in the right way for circuit calculations. Read and learn what the two Kirchhoff laws tell you about circuits. That is all you need for this sort of problem.
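
    To make the order of those steps concrete, here is a minimal Python sketch (the function name is just illustrative; the numbers are this thread's 12 V supply and 3.3 V, 25 mA LED), with a Kirchhoff voltage law check at the end:

    Code (Python):
    def series_resistor(v_supply, v_led, i_led):
        """Subtract the known LED drop, then divide the volts 'left over'
        by the required current (V = I R rearranged)."""
        v_left = v_supply - v_led
        return v_left / i_led

    r = series_resistor(12.0, 3.3, 0.025)   # 348 ohms
    i = (12.0 - 3.3) / r                    # the one current in the loop
    # Kirchhoff's voltage law: LED drop + resistor drop = supply voltage.
    assert abs(3.3 + i * r - 12.0) < 1e-9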

    [edit: snap !]
     
  8. Jun 21, 2012 #7
    I think I understand. So if you have, say, a 100 volt supply and you use the correct size resistor, but the resistor wattage is not big enough and the resistor gets hot, does this mean the LED will not be getting all the power, since a lot of power is lost in the resistor? Any time there is resistance, as in a piece of wire or a resistor, there is power loss, is there not? The reason I am thinking this way is that if an LED needs 20 mA to run, and you had, say, a 5 mA loss in the resistor (or in the wire, if it were a long one), would you not need a total of 25 mA?

    John
     
  9. Jun 21, 2012 #8

    Averagesupernova

    User Avatar
    Gold Member

    John1937, you need to review some basic electricity/electronics. Post #5 stated that the current in a series circuit is the same in all components. You seem to have ignored that, since you said you would need a total of 25 mA to cover a 5 mA "loss" in the resistor.

    It does not work that way. Current (amps, milliamps, etc.) is a rate: a certain number of electrons per second passing a given point is the definition of an ampere. A series circuit cannot have 20 mA in one component (the resistor) and 15 mA in another component (the LED).

    Oh yeah, one last thing. If the resistor gets hot in a circuit, that doesn't mean the resistor is 'robbing' power from the LED. I may be reading you wrong, but it looks like you are implying that a physically bigger resistor, which doesn't get as hot, would rob less power. That is not the case; it is just better at dissipating heat than a smaller one. Like I said, do some review of basic electricity.
     
  10. Jun 21, 2012 #9
    I think I understand now. These LED resistor calculators do not take into account the resistance of the wire, so if the wire has 1000 ohms of resistance and the calculator tells you you need a 1000 ohm resistor, you really do not need a resistor, as the wire is the resistor.

    John
     
  11. Jun 21, 2012 #10

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    How many km of connecting wire were you expecting to use in a simple LED circuit? Of course, any such calculator assumes that the person building the circuit has a bit of sense.
    But you do not seem to be taking on board that current is not 'used up' on the way round a circuit. Until you understand that, there is no point in continuing with this - it is fundamental to circuit theory (which works every time).
     
  12. Jun 21, 2012 #11

    Averagesupernova

    User Avatar
    Gold Member

    Yeah sophie, I was also wondering how much wire would be needed to get up to 1000 ohms. There are a few applications where the resistance of the wire is relied upon; one I can think of is the shunt in a high-current ammeter. Otherwise, it is usually desirable to get the resistance of the interconnecting wire low enough to ignore it completely.
     
  13. Jun 21, 2012 #12
    Well, I have my LEDs located from my house to poles in the driveway, and some runs are as long as 350 feet of underground telephone wire, which I believe is 24 gauge. I have a book that gives the resistance of different wire sizes per 1000 feet of length. When you apply 12 volts to a wire of that length, it shows 12 volts on the meter at the other end, but I assume a meter is not much of a load compared to a couple of LEDs. I used to be on the electric board of a power company, and they always said they lose 10% of the power in the transmission lines, so I thought a resistor and an LED would be the same type of scenario.

    John
     
  14. Jun 21, 2012 #13

    vk6kro

    User Avatar
    Science Advisor

    You can measure the resistance of that wire loop by joining two wires at one end and then measuring the resistance between the same two wires at the other end.

    A 20 mA LED will not be very bright, though, and is no more than a decoration.

    You probably realize that a 20 mA current will develop one volt across 50 ohms, so you can estimate the voltage drop after you measure the total resistance of the wire.
    (multiply the total resistance by 0.02 amps to get the voltage drop.)
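
    In Python terms the estimate is a one-liner (a minimal sketch; r_loop_ohms is whatever you measure between the joined wires):

    Code (Python):
    def wire_drop_volts(r_loop_ohms, i_amps=0.020):
        # V = I R: the voltage lost in the wire at the LED current
        return r_loop_ohms * i_amps

    print(wire_drop_volts(50.0))   # 50 ohms at 20 mA drops 1.0 V, as above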
     
  15. Jun 22, 2012 #14

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    It would have been a good idea for you to have described the basic problem in more detail, earlier.
    There is no need to feed all the LEDs separately. If you have a string of n of them down a driveway, they can all be connected between the same two wires, with a single resistor feeding the wire. The current will be n × 20 mA, and you can do 'the sums' to work out the required series resistor value (sketched below). If there are a lot of LEDs, the resistor may need to be a high power type, as it will be getting a bit hot, and it should be mounted appropriately.

    If the resistance of the wire is significant, the LEDs may get a bit dimmer towards the far end. This can be dealt with by using the "ring main"** technique: take another pair of wires to the far end of the chain, connected at both ends, in parallel with your two supply wires. This halves the unwanted resistance of the supply wires and helps to keep the volts across all the LEDs more uniform.

    **Used in all UK homes and offices as a cost-effective way of supplying a lot of power outlets. Cheaper and just as good as a 'star' system.

    PS Have you now taken on board this fundamental stuff about currents in circuits?
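
    Here is a rough Python sketch of those sums, assuming the LEDs all sit in parallel across the pair of wires as described (n = 10 is just an assumed example; the LED values are this thread's):

    Code (Python):
    n = 10                        # assumed number of LEDs, for illustration
    i_total = n * 0.020           # n x 20 mA through the single resistor
    v_left = 12.0 - 3.3           # volts left after the shared LED drop
    r = v_left / i_total          # required series resistor, about 44 ohms
    p = i_total ** 2 * r          # power in the resistor, about 1.7 W
    print(f"R = {r:.0f} ohms, dissipating {p:.2f} W - use a high power type")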
     
  16. Jun 22, 2012 #15
    1000 ohms resistance * 0.02 amps = 20. Is that a 20% voltage drop, or 20 volts?
     
  17. Jun 22, 2012 #16

    sophiecentaur

    User Avatar
    Science Advisor
    Gold Member

    Have you come across the formula V=IR?
    You will have great difficulty with any of this if you want to ignore such basic knowledge. The answer to your question is in the formula (which says nothing at all about 'percentage loss').
    btw, that stuff about 10% power loss is not fundamental (how could it be?). It's somebody's idea of what would be acceptable in a typical transmission system, in which cable costs are significant. You can choose any loss you want for your own system.

    Also, 700 ft of 24 gauge wire will have about 20 Ω resistance: nothing like the 1 kΩ that has been mentioned earlier. It is hardly a significant factor in any of the calculations involved in this problem unless you are intending to run a lot of LEDs.
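
    As a sketch of that estimate (the 26 ohms per 1000 ft figure for 24 gauge copper is a typical table value, so treat it as an assumption):

    Code (Python):
    OHMS_PER_1000_FT = 26.0            # approx. resistance of 24 gauge copper
    loop_ft = 2 * 350.0                # 350 ft out and 350 ft back
    r_wire = loop_ft / 1000.0 * OHMS_PER_1000_FT   # about 18 ohms
    print(f"{r_wire:.0f} ohms, dropping {r_wire * 0.020:.2f} V at 20 mA")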
     
  18. Jun 24, 2012 #17
    I can understand that if a resistor gets hot it does not rob power from the LED, but if the resistor gets hot, does this not waste current? It just seems to me that a resistor running hot is like an electric heater drawing current in addition to the LED. For example, take a 100 volt supply with a 5600 ohm resistor driving an LED, versus a 3 volt supply with no resistor driving the LED: would not the LED on 3 volts use less current overall?

    John
     
  19. Jun 24, 2012 #18

    vk6kro

    User Avatar
    Science Advisor

    A resistor getting hot does use power, but the LED current is the same.
    You seem to be using the terms power and current as if they were the same thing.

    LEDs are semiconductor devices which draw almost no current until they have a certain voltage across them and then they can draw enough current to destroy themselves.

    So it is necessary to put a resistor in series with them to limit the current they can draw. In your case, this was 25 mA but we settled on a resistor that would let the LED draw a bit less than this.

    Sometimes, we have a voltage we have to use and this determines the power we will use in the resistor.
    The resistor is there to protect the LED, not to rob power from it.

    You have some choices with your setup. I would have two wires going the full distance down your path and then take a resistor and a LED off at each point where you need a LED.

    At each point you would have 20 mA flowing from the 12 volt line, so there would be 0.24 watts being used (12 volts * 0.02 amps = 0.24 watts).
    0.07 watts of this would be going to the LED, so it isn't very efficient, but there isn't a lot of power being used so it probably doesn't matter.

    One possible choice would be that you could put up to 3 LEDs in series for each (smaller) resistor.
    It may be obvious how you would do this, but I will describe it if you like.
    This would be more efficient, but the wiring would be more messy.

    Another choice that may suit you is to get Christmas lights which are already designed for your mains voltage and then you just string them along your driveway. You would then just find some way to get mains voltage to the string of lights and do it safely.
     
  20. Jun 24, 2012 #19
    I like watts and amps, but I was wondering how you came up with 0.07 watts going to the LED; this seems to be what I want to know. Those numbers work out to about 70% loss, if that is correct?

    John
     
    Last edited: Jun 24, 2012
  21. Jun 24, 2012 #20

    vk6kro

    User Avatar
    Science Advisor

    3.3 volts * 0.02 amps = 0.066 watts, or about 0.07 watts.

    Some of this inefficiency is necessary if you have to use a 12 volt supply.

    If you had a 6 volt supply, the total power used would be 6 volts * 0.02 amps, or 0.12 watts.
    So the efficiency would be about 55% (0.066 watts / 0.12 watts * 100 = 55%).

    These losses are trivial, but larger LEDs can use currents of up to an amp and there are highly efficient regulators that can limit their current to safe levels without wasting a lot of power.
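
    Putting the sums from post #18 and this post together in one minimal Python sketch (assuming the thread's 3.3 V LED run at 20 mA):

    Code (Python):
    V_LED, I = 3.3, 0.020              # the thread's LED, run at 20 mA

    def budget(v_supply):
        p_total = v_supply * I         # power drawn from the supply
        p_led = V_LED * I              # about 0.07 W delivered to the LED
        return p_total, p_led, 100 * p_led / p_total

    for v in (12.0, 6.0):
        p_total, p_led, eff = budget(v)
        print(f"{v} V supply: {p_total:.2f} W total, "
              f"{p_led:.2f} W to the LED, {eff:.0f} % efficient")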
     