Some basics of power loss and voltage drop in cables

AI Thread Summary
The discussion centers on understanding power loss and voltage drop in electrical cables. Power loss can be calculated using the formula P = I^2 * R, and it's clarified that the total power draw includes the load plus any losses in the cable. A specific example involving a 12V system with a 0.75mm² cable indicates a significant voltage drop, raising concerns about the adequacy of the cable size for the application. The calculations reveal that the actual current and voltage at the load will be lower than expected due to resistance in the wire, leading to dimmer lights. Overall, using thicker cables is recommended to minimize voltage drop and maintain performance.
tommy060289
Hey everyone, I could do with a hand checking my understanding of calculating the current loss in cables just to make sure I am doing it right.

First up is power loss. From my understanding of Ohm's law and P = VI, it is possible to calculate the power loss in cables using the formula: power loss = I^2 * R.

So if, for example, I had a 10 W bulb, does this mean my total power draw would be 10 W + loss? Is that correct?

Next up, voltage drop. I found a table which gives the voltage drop for different cables at different voltages, etc., and a 0.75 mm^2 two-core cable running 12 V has a loss of 0.058 V/A/m, which means on a 3 A current spanning 20 m (40 m of conductor out and back) I'd have a voltage drop of 0.058 * 3 * 40, almost 7 V. Doesn't this sound like a lot? Have I worked something out wrong? I know DC loses out over distance, but this still seems like a lot. Obviously the solution is to use thicker cable, but I just want to make sure my understanding is correct before going any further. For the record, 0.75 mm^2 is good for 6 A, so it's not like the cable is unsuitable for the amperage. From my understanding, it would be pretty typical for voltage drop to be no more than 10%.

Thanks for your help
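For a quick sanity check, here is a minimal sketch of the two calculations described above. It assumes, as in the post, that the 0.058 V/A/m table figure is applied per metre of conductor, hence 40 m for the 20 m two-core run.

```python
# Sketch of the voltage drop and power loss described in the post.
# Assumption: the 0.058 V/A/m table figure applies per metre of conductor,
# so a 20 m two-core run counts as 40 m (out and back).

current_a = 3.0                          # load current from the post
run_length_m = 20.0                      # one-way cable run
conductor_length_m = 2 * run_length_m    # out and back
drop_per_amp_per_m = 0.058               # table figure, V/A/m

voltage_drop = drop_per_amp_per_m * current_a * conductor_length_m
power_loss = voltage_drop * current_a    # P = V_drop * I, equivalent to I^2 * R

print(f"Voltage drop in cable: {voltage_drop:.2f} V")    # ~6.96 V
print(f"Power lost in cable:   {power_loss:.1f} W")      # ~20.9 W
```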
 
I have checked your numbers and they seem OK.
The resistivity of copper is 1.7*10^-8 Ωm, so a copper cable with a cross-sectional area of 0.75 mm^2 would have a resistance of 0.022 Ω/m. I don't know where your figure of 0.058 Ω/m comes from, but the two numbers are of the same order of magnitude.
Hope this helps.
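For reference, the same estimate as a short sketch, using R = resistivity * length / area with the figures quoted above:

```python
# Resistance of the cable from resistivity: R per metre = rho / A.
resistivity_copper = 1.7e-8      # ohm-metres
area_m2 = 0.75e-6                # 0.75 mm^2 expressed in m^2

r_per_metre = resistivity_copper / area_m2
print(f"Resistance per metre: {r_per_metre:.4f} ohm/m")     # ~0.023 ohm/m
print(f"Resistance of 20 m:   {20 * r_per_metre:.2f} ohm")  # ~0.45 ohm
```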
 
tommy060289 said:
So if, for example, I had a 10 W bulb, does this mean my total power draw would be 10 W + loss? Is that correct?

This isn't entirely true...

A 12 V 10 watt bulb only dissipates 10 watts if it gets 12 volts.

All you can say is that at 12 volts it has a resistance of 14.4 ohms: 12 * 12 / 14.4 = 10 watts (E^2 / R).

So, if there are losses in the power line or if it gets some other voltage, then you can estimate the power dissipated in the lamp, but you can't calculate it exactly because the resistance of the lamp depends on the temperature of the filament.
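A rough sketch of that estimate: it assumes the filament keeps its nominal hot resistance of 14.4 Ω (a simplification, since a real filament's resistance falls as it cools), and shows how the estimated power drops with the supply voltage.

```python
# Estimate of lamp power at reduced supply voltage, assuming the filament
# keeps its nominal hot resistance of 14.4 ohm (12 V, 10 W rating).
# As noted above, this is only an estimate: a real filament's resistance
# falls as it cools, so the true power is somewhat higher than this.
r_lamp = 12.0**2 / 10.0          # 14.4 ohm at rated voltage

for v in (12.0, 11.0, 10.0):
    p = v**2 / r_lamp            # P = V^2 / R with constant R
    print(f"{v:4.1f} V -> {p:4.1f} W")
```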
 
Here's an example to get you thinking.

Say you have 20 or so lights on a single circuit that draw roughly 10 amps. You would think that a #12 gauge wire would work fine. It does up to a couple hundred feet.

But let's say you have to go 500 ft or so...like outdoor lighting in a parking lot or something like that. Let's say you are using single-phase 277 volts.

The 12 gauge works fine at 200 ft...however...at 500 ft there is too much of a voltage drop and your lights will not operate correctly. If you use a voltage drop chart or do the math yourself...you will see that you need a #8 gauge wire to keep the voltage drop within the proper parameters.

You can use V = IR (multiply the resistance of the wire by the current)...or P = I^2*R...or P = IV...all will get you there, but V = IR seems easiest in my opinion.
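A short sketch of that comparison. The per-1000-ft resistances used here (#12 AWG ≈ 1.59 Ω, #8 AWG ≈ 0.63 Ω) are typical copper values assumed for illustration, not figures from the chart mentioned above; the current flows out and back, so the round-trip length is doubled.

```python
# Voltage drop check for the parking-lot example: 277 V single phase, 10 A.
# Wire resistances are typical copper values assumed for illustration.
supply_v = 277.0
load_a = 10.0
ohms_per_kft = {"#12": 1.59, "#8": 0.63}   # ohm per 1000 ft, one conductor

for gauge, r_kft in ohms_per_kft.items():
    for one_way_ft in (200, 500):
        r_wire = 2 * one_way_ft * r_kft / 1000.0   # round-trip resistance
        drop = load_a * r_wire                     # V = I * R
        pct = 100.0 * drop / supply_v
        print(f"{gauge} at {one_way_ft} ft: drop {drop:.1f} V ({pct:.1f}%)")

# #12 at 500 ft gives roughly a 16 V (~6%) drop; #8 brings it back near 2%.
```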
 
So are you saying I should use V = IR to calculate voltage drop or current loss?

Surely it wouldn't work for voltage drop, as when the current is lowered and the voltage is increased, the voltage drop reduces.

I'm not entirely sure what to do with my current losses though, so here is the setup.

I have a 12 V battery rigged to 6x 6 W light bulbs 20 m away. The bulbs are 12 V, so the current is 3 A.

If I factor in the cable losses, though, wouldn't the current draw be higher than 3 A due to the cable losses, or would the current just be less than 3 A at the light?
 
tommy060289 said:
So are you saying I should use V = IR to calculate voltage drop or current loss?

Surely it wouldn't work for voltage drop, as when the current is lowered and the voltage is increased, the voltage drop reduces.

I'm not entirely sure what to do with my current losses though, so here is the setup.

I have a 12 V battery rigged to 6x 6 W light bulbs 20 m away. The bulbs are 12 V, so the current is 3 A.

If I factor in the cable losses, though, wouldn't the current draw be higher than 3 A due to the cable losses, or would the current just be less than 3 A at the light?

The voltage drop I am referring to is in the cable...not the load. You need to know the resistance of your cable...an ohmmeter will work...or specs. If you know the current through the cable...simply multiply by the resistance...and you will then have the voltage drop through the cable. Or sometimes the specs of the cable will give you ohms per foot or whatever.

Actually...you are going to lower the current...simply because you have added the resistance of the load and the resistance of the wire in series. You now have a greater TOTAL resistance. V = IR. In this situation...you have actually lowered the current (because V stayed the same and R went up) if you look at the math. So your light will be getting less than 3 amps...and also less than 12 volts. It should still work fine...just maybe be a hair dimmer.
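A minimal sketch of that series argument for the 12 V, 6 × 6 W setup, treating the lamp bank as a fixed 4 Ω resistance (real filament resistance varies with temperature) and assuming roughly 0.023 Ω/m per conductor, the resistivity-based figure quoted earlier, over the 20 m run:

```python
# Series circuit: battery, wire out and back, lamp bank as a fixed resistance.
v_batt = 12.0
r_load = 12.0**2 / 36.0           # 6 x 6 W = 36 W at 12 V -> 4 ohm (assumed fixed)
r_wire = 0.023 * 2 * 20.0         # assumed 0.023 ohm/m per conductor, 40 m total

i = v_batt / (r_load + r_wire)    # series current
v_load = i * r_load               # voltage actually reaching the lamps

print(f"Current: {i:.2f} A (less than the nominal 3 A)")
print(f"Voltage at lamps: {v_load:.2f} V (less than 12 V)")
```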
 
tommy060289 said:
Hey everyone, I could do with a hand checking my understanding of calculating the current loss in cables just to make sure I am doing it right.

First up is power loss. From my understanding of Ohm's law and P = VI, it is possible to calculate the power loss in cables using the formula: power loss = I^2 * R.

So if, for example, I had a 10 W bulb, does this mean my total power draw would be 10 W + loss? Is that correct?

Next up, voltage drop. I found a table which gives the voltage drop for different cables at different voltages, etc., and a 0.75 mm^2 two-core cable running 12 V has a loss of 0.058 V/A/m, which means on a 3 A current spanning 20 m (40 m of conductor out and back) I'd have a voltage drop of 0.058 * 3 * 40, almost 7 V. Doesn't this sound like a lot? Have I worked something out wrong? I know DC loses out over distance, but this still seems like a lot. Obviously the solution is to use thicker cable, but I just want to make sure my understanding is correct before going any further. For the record, 0.75 mm^2 is good for 6 A, so it's not like the cable is unsuitable for the amperage. From my understanding, it would be pretty typical for voltage drop to be no more than 10%.

Thanks for your help

There's no way you drop 7 volts if your cable can handle 6 amps. If you have a 10 watt load and a 12 volt battery...you are looking at less than an amp.

I'm not sure what V/A/m means?

Using the handy Excel chart I have...1 amp at 20 meters for 12 volt DC has a voltage drop of 0.8 volts, guessing an 18 gauge wire. Maybe a hair high compared to 12 volts...but not horrible. Maybe consider using a 14 gauge wire...only a 0.3 volt drop, roughly 2%, which is acceptable.

Incidentally...I'm not totally sure about what I'm saying here...just throwing some things around.
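A quick cross-check of those chart numbers. The per-metre resistances used here (18 AWG ≈ 0.021 Ω/m, 14 AWG ≈ 0.0083 Ω/m) are typical copper values assumed for illustration, not taken from the chart:

```python
# Cross-check of the chart figures: 1 A over a 20 m run at 12 V DC.
# Per-metre resistances are typical copper values assumed for illustration.
current_a = 1.0
one_way_m = 20.0

for gauge, r_per_m in (("18 AWG", 0.021), ("14 AWG", 0.0083)):
    drop = current_a * r_per_m * 2 * one_way_m   # out and back
    print(f"{gauge}: {drop:.2f} V drop ({100 * drop / 12:.1f}% of 12 V)")

# ~0.84 V for 18 AWG and ~0.33 V for 14 AWG, in line with the chart values.
```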
 
V/A/m, I would say, means Ω/m, because V/A = resistance.
That is why I looked up the resistivity to calculate the resistance of the cables being talked about.
Copper wire of cross-sectional area 0.75 mm^2 has a resistance of 0.022 Ω/m, so 20 m of wire has a resistance of about 0.45 Ω.
 
technician said:
V/A/m, I would say, means Ω/m, because V/A = resistance.
That is why I looked up the resistivity to calculate the resistance of the cables being talked about.

OK...so 0.058 * 20 meters = 1.16 ohms.

10 watt bulb...P = IV...10 W = I * 12...the current through the lamp would be 0.83 amps with perfect (zero-resistance) wire.

(V^2)/R = P...(12^2)/R = 10...so R = 14.4 ohms for the light. Add in the 1.16 ohms for the wire...total resistance is 15.56 ohms.

V = IR: 12 = I * 15.56...I = 0.77 amps.

P = IV: P = 0.77 * 12 = 9.24 watts drawn from the battery in total. The lamp itself dissipates I^2 * R = 0.77^2 * 14.4, about 8.6 watts...slightly dimmer.

Roughly 0.7 watts (I^2 * 1.16) is lost in the wire.

Do a voltage divider of resistors in series...one resistor of 1.16 ohms (wire) and the other of 14.4 ohms (light):

12 volts * (1.16 / (1.16 + 14.4)) = 0.89 V dropped in the wire. This agrees with the ~0.8 volts I mentioned from the chart above.

Makes sense...I think. With your wire size...your lamp will be getting roughly 86% of its rated power, so a little dimmer.
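The same series calculation as a short sketch, assuming the lamp behaves as a fixed 14.4 Ω resistance and the wire contributes 1.16 Ω:

```python
# Series calculation: 12 V supply, 1.16 ohm wire, lamp treated as fixed 14.4 ohm.
v_supply = 12.0
r_lamp, r_wire = 14.4, 1.16

i = v_supply / (r_lamp + r_wire)        # ~0.77 A
p_total = v_supply * i                  # ~9.25 W drawn from the battery
p_lamp = i**2 * r_lamp                  # ~8.6 W dissipated in the lamp
p_wire = i**2 * r_wire                  # ~0.69 W lost in the wire
v_drop = i * r_wire                     # ~0.89 V dropped across the wire

print(f"I = {i:.2f} A, lamp power = {p_lamp:.2f} W, "
      f"wire loss = {p_wire:.2f} W, wire drop = {v_drop:.2f} V")
```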
 