## Total Current Draw Of LED And Resistor Combined

I am using a 3.3 volt LED rated 25 mA, powered from 12 volts with a 1000 ohm resistor in between. Ohm's law says amps = volts / ohms, so 12 / 1000 = 12 mA. Is this all the LED and resistor combined will draw? I know the LED can only handle 25 mA, but if the LED were using that much current you would need fewer ohms, so would you still need to add the resistor into the equation?

John

 Recognitions: Science Advisor

The voltage across the resistor will be 12 volts minus the LED voltage: 12 − 3.3 = 8.7 volts. So the current in the resistor would be 8.7 volts divided by 1000 ohms, or 8.7 mA. This is also the LED current. The LED will not be as bright as it could be.

To run it at full current, you can calculate a new resistor size like this: R = 8.7 volts / 0.025 amps = 348 ohms. You would then use the next common stocked size up, such as 470 ohms. This would give a current of 18.5 mA, which is probably safer anyway unless you really want every last bit of brightness. You could get 348 ohms with a 330 ohm and an 18 ohm resistor in series, but it probably isn't worth the risk of damaging your LED.
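The arithmetic in the post above can be checked with a short script (all values are taken from the thread; the 470 Ω figure is just one common stocked size):

```python
# Series-resistor sizing for an LED, using the values from the post above.
SUPPLY_V = 12.0      # supply voltage
LED_V = 3.3          # LED forward voltage drop
LED_I_MAX = 0.025    # rated LED current, 25 mA

# Current with the original 1000 ohm resistor:
r1 = 1000.0
i1 = (SUPPLY_V - LED_V) / r1                 # 8.7 mA
print(f"Current with {r1:.0f} ohm: {i1 * 1000:.1f} mA")

# Resistor needed for the full rated current:
r_full = (SUPPLY_V - LED_V) / LED_I_MAX      # 348 ohm
print(f"Resistor for 25 mA: {r_full:.0f} ohm")

# Current with the next common stocked size up, 470 ohm:
r2 = 470.0
i2 = (SUPPLY_V - LED_V) / r2                 # about 18.5 mA
print(f"Current with {r2:.0f} ohm: {i2 * 1000:.1f} mA")
```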
 Recognitions: Gold Member Science Advisor

Good advice for this sort of calculation, in general: write down what you actually know at the start and then gradually fill in the details as they can be deduced. If you are told the voltage drop of the LED, then use this information for the next step. This applies even to very complex circuit analysis. Just like manually solving sets of simultaneous equations, as a matter of fact: you solve the bits that you can first and then substitute values into the next ones, etc.


I can understand how much current an LED gets using a resistor, but what I would like to know, as the voltage and resistor values change in different LED circuits, is how much current the resistor itself is drawing. If you had two ammeters, one after the voltage supply and the other after the resistor, you could just subtract the last reading from the first.

John

 Quote by John1397 I can understand how much current an LED gets using a resistor, but what I would like to know, as the voltage and resistor values change in different LED circuits, is how much current the resistor itself is drawing. If you had two ammeters, one after the voltage supply and the other after the resistor, you could just subtract the last reading from the first. John
Current through a series circuit is going to be the same.

So if you placed an ammeter in series in between the voltage source and one end of the resistor, and another in series in between the opposite end of the resistor and the LED, you would see the same value for the current on both ammeters.

 Recognitions: Gold Member Science Advisor

Like I said, do things in the appropriate order. You start with the specified voltage drop of the LED (different from colour to colour and for different powers). Then you find out how much current the spec sheet tells you it needs for this power. You can calculate how many volts you 'have left' after the drop across the LED. This value of voltage and the required value of current will tell you what value of resistor is necessary.

P.S. If you have two ammeters in a simple series circuit, they will both read the SAME, because the current is the same all the way round (where else could it go?). I think you don't need a 'formula', but you do need to start thinking in the right way for circuit calculations. Read and learn what the two Kirchhoff laws tell you about circuits. That is all you need for this sort of problem. [edit: snap!]
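The working order described above can be written out as a short sketch, reusing the 12 V / 3.3 V / 25 mA figures from earlier in the thread:

```python
# Kirchhoff-style working order for a series LED circuit:
#  1. start with the known LED forward drop,
#  2. find the voltage "left over" for the resistor (KVL),
#  3. divide by the required current to get the resistor (Ohm's law).
supply_v = 12.0
led_drop_v = 3.3       # from the LED spec sheet
required_i = 0.025     # from the LED spec sheet

volts_left = supply_v - led_drop_v    # KVL: the drops sum to the supply
resistor = volts_left / required_i    # Ohm's law

# KCL in a series loop: the same current flows everywhere, so an
# ammeter before the resistor and one after it read the same value.
i_before = volts_left / resistor
i_after = i_before
print(f"Resistor: {resistor:.0f} ohms, current everywhere: {i_before * 1000:.1f} mA")
```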
 I think I understand. So if you have, say, a 100 volt supply and you use the correct size resistor, but the resistor wattage is not big enough and the resistor gets hot, does this mean the LED will not be getting all the power, since a lot of power is lost in the resistor? Any time there is resistance, as in a piece of wire or a resistor, there is power loss, is there not? The reason I am thinking this way is that if an LED needed 20 mA to run and you had, say, a 5 mA loss in the resistor (or in the wire, if it were a long wire), would you not need a total of 25 mA? John

John1397, you need to do some review of basic electricity/electronics. Post #5 stated that the current in a series circuit is the same in all components. You seem to have just ignored that, since you said:
 The reason I am thinking this way is that if an LED needed 20 mA to run and you had, say, a 5 mA loss in the resistor (or in the wire, if it were a long wire), would you not need a total of 25 mA?
-
It does not work that way. Current (amps, milliamps, etc.) is a rate: the amount of charge passing a given point per second. One coulomb per second passing a given point is the definition of an ampere. A series circuit cannot have 20 mA in one component (the resistor) and 15 mA in another component (the LED).
-
Oh yeah, one last thing. If the resistor gets hot in a circuit, that doesn't mean the resistor is 'robbing' power from the LED. I may read you wrong, but it looks like you are implying that a bigger resistor which doesn't get as hot will rob less power. That is not the case: a physically larger (higher wattage) resistor is just better at dissipating heat than a smaller one. Like I said, do some review of basic electricity.

 I think I understand now. These LED resistor calculators do not take into account the resistance of the wire, so if the wire has 1000 ohms of resistance and the calculator tells you you need a 1000 ohm resistor, you really do not need a resistor, as the wire is the resistor. John
 Recognitions: Gold Member Science Advisor How many km of connecting wire were you expecting to use in a simple LED circuit? Of course, any such calculator assumes that the person building the circuit has a bit of sense. But you do not seem to be taking on board that current is not 'used up' on the way round a circuit. Until you understand that, there is no point in continuing with this - it is fundamental to circuit theory (which works every time).
 Yeah sophie I was also wondering how much wire was going to be used in order to get up to 1000 ohms worth. There are a few applications where the resistance of the wire is relied upon. One I can think of is a high current ammeter. Otherwise it is usually desirable to get the resistance of the interconnecting wire low enough to completely ignore it.
 Well, I have my LEDs located from my house to poles in the driveway, and some runs are as long as 350 feet of underground telephone wire, which I believe is 24 gauge. I have a book that will tell me the resistance of different sizes of wire in 1000 foot lengths. When you apply 12 volts to a wire of that length it shows 12 volts on the meter at the other end, but I assume a meter is not much of a load compared to a couple of LEDs. I used to be on the electric board of a power company, and they always said they lose 10% of the power in the transmission lines, so I thought a resistor and an LED would be the same type of scenario. John
 Recognitions: Science Advisor You can measure the resistance of that wire loop by joining two wires at one end and then measuring the resistance between the same two wires at the other end. A 20 mA LED will not be very bright, though, and no more than a decoration. You probably realize that a 20 mA current will develop one volt across 50 ohms, so you can estimate the voltage drop after you measure the total resistance of the wire. (multiply the total resistance by 0.02 amps to get the voltage drop.)
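The estimate in the post above can be scripted. The 1000 Ω loop resistance here is only an illustrative stand-in for whatever you actually measure:

```python
# Voltage drop in the feed wire at LED current (illustrative numbers).
wire_loop_r = 1000.0   # measured loop resistance in ohms (example value)
led_i = 0.020          # 20 mA LED current

drop_v = wire_loop_r * led_i          # V = I * R
print(f"Drop across the wire: {drop_v:.1f} V")   # 20 volts, not 20 percent
```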
 Recognitions: Gold Member Science Advisor

It would have been a good idea for you to have described the basic problem in more detail, earlier. There is no need to feed all the LEDs separately. If you have a string of n of them down a driveway, they can all be connected between the same two wires, with a single resistor feeding the wire. The current will be n × 20 mA and you can do 'the sums' to work out the required series resistor value. If there are a lot of LEDs then the resistor may need to be a high power type, as it will be getting a bit hot, and it should be mounted appropriately.

If the resistance of the wire is significant then the LEDs may get a bit dimmer towards the far end. This can be dealt with by using the "Ring Main"** technique. This involves taking another pair of wires to the far end of the chain, connected at both ends, in parallel with your two supply wires. It halves the unwanted resistance of the supply wires and helps to keep the volts across all the LEDs more uniform.

**Used in all UK homes and offices as a cost effective way of supplying a lot of power outlets. Cheaper and just as good as a 'star' system.

PS Have you now taken on board this fundamental stuff about currents in circuits?
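The single-resistor feed suggested above works out like this. This is only a sketch: the LED count and the 12 V / 3.3 V figures are assumed example values, not something stated for this driveway:

```python
# One resistor feeding n parallel LEDs down the driveway (example values).
n_leds = 10
per_led_i = 0.020      # 20 mA each
supply_v = 12.0        # assumed supply
led_drop_v = 3.3       # assumed LED forward drop

total_i = n_leds * per_led_i                          # n x 20 mA
resistor = (supply_v - led_drop_v) / total_i          # Ohm's law on the leftover volts
resistor_power = (supply_v - led_drop_v) * total_i    # heat in the resistor: P = V * I

print(f"Total current: {total_i * 1000:.0f} mA")
print(f"Series resistor: {resistor:.1f} ohms, dissipating {resistor_power:.2f} W")
```

With ten LEDs the resistor is already dissipating well over a watt, which is why a high power type may be needed.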

 Quote by vk6kro You can measure the resistance of that wire loop by joining two wires at one end and then measuring the resistance between the same two wires at the other end. A 20 mA LED will not be very bright, though, and no more than a decoration. You probably realize that a 20 mA current will develop one volt across 50 ohms, so you can estimate the voltage drop after you measure the total resistance of the wire. (multiply the total resistance by 0.02 amps to get the voltage drop.)
1000 ohms resistance × 0.02 amps = 20. Is that a 20% voltage drop, or 20 volts?

Recognitions: Gold Member
 Quote by John1397 1000 ohms resistance × 0.02 amps = 20. Is that a 20% voltage drop, or 20 volts?
Have you come across the formula V=IR?
You will have great difficulty with any of this if you want to ignore such basic knowledge. The answer to your question is in the formula (which says nothing at all about 'percentage loss').
btw, that stuff about 10% power loss is not fundamental (how could it be?). It's somebody's idea of what would be acceptable in a typical transmission system, in which cable costs are significant. You can choose any loss you want for your own system.

Also, 700 ft of 24 gauge wire will have about 20 Ω of resistance: nothing like the 1 kΩ that was mentioned earlier. It is hardly a significant factor in any calculations involved in this problem, unless you are intending to run a lot of LEDs.
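The 20 Ω figure can be checked against published wire tables; a typical handbook value for 24 AWG copper is about 25.7 Ω per 1000 ft:

```python
# Loop resistance of the driveway run (24 AWG copper).
OHMS_PER_1000FT_24AWG = 25.7   # typical handbook value for 24 AWG copper
run_ft = 350.0                 # one-way distance from the thread
loop_ft = 2 * run_ft           # current goes out and back: 700 ft of wire

loop_r = loop_ft / 1000.0 * OHMS_PER_1000FT_24AWG
print(f"Loop resistance: {loop_r:.0f} ohms")   # roughly 18 ohms
```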

 Quote by Averagesupernova Oh yeah, one last thing. If the resistor gets hot in a circuit, that doesn't mean the resistor is 'robbing' power from the LED. I may read you wrong, but it looks like you are implying that a bigger resistor which doesn't get as hot will rob less power. That is not the case: a physically larger (higher wattage) resistor is just better at dissipating heat than a smaller one. Like I said, do some review of basic electricity.
I can understand that if a resistor gets hot it does not rob power from the LED, but if the resistor gets hot, does this not waste current? It just seems to me that a resistor running hot is like an electric heater drawing current in addition to the LED. An example would be a 100 volt supply with a 5600 ohm resistor driving an LED, versus a 3 volt supply with no resistor driving the LED. Would the LED on 3 volts not use less current overall?

John
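The comparison in the question above can be worked through with numbers. This is a sketch assuming an LED forward drop of about 3 V; it shows that the LED current (and so the LED power) is the same in both cases, and that what the hot resistor wastes is power, not current:

```python
# Comparing the two arrangements in the question (assumed 3 V LED drop).
led_v = 3.0

# Case 1: 100 V supply, 5600 ohm series resistor.
i1 = (100.0 - led_v) / 5600.0       # about 17.3 mA through everything
p_resistor = (100.0 - led_v) * i1   # about 1.68 W burned as heat in the resistor
p_led1 = led_v * i1                 # about 52 mW delivered to the LED

# Case 2: a 3 V source driving the LED at the same current
# (in practice you would still current-limit it somehow).
i2 = i1
p_led2 = led_v * i2

print(f"Current in both cases: {i1 * 1000:.1f} mA")
print(f"Power wasted in resistor: {p_resistor:.2f} W")
print(f"Power in LED, either case: {p_led1 * 1000:.0f} mW")
```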