Hey everyone, I could do with a hand checking my understanding of calculating the losses in cables, just to make sure I'm doing it right.

First up is power loss. From my understanding of Ohm's law and P = VI, it is possible to calculate the power loss in cables using the formula power loss = I^2 * R.
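As a quick sanity check on that formula, here is a minimal sketch (the resistance value is a made-up example, not from the post):

```python
def cable_power_loss(current_a: float, resistance_ohm: float) -> float:
    """Power dissipated as heat in the cable itself, P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

# e.g. 3 A through a cable with a hypothetical 0.5 ohm total conductor resistance:
print(cable_power_loss(3.0, 0.5))  # 4.5 W lost in the cable
```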

So if, for example, I had a 10 W bulb, does this mean my total power draw would be 10 W + loss? Is that correct?
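To first order that is how it adds up. A rough sketch with an assumed cable resistance (and ignoring that the drop also shifts the bulb's operating point):

```python
bulb_power = 10.0        # W drawn by the bulb itself
current = 10.0 / 12.0    # A, for a 10 W bulb on a 12 V supply (I = P / V)
cable_resistance = 0.5   # ohm, made-up total conductor resistance

loss = current ** 2 * cable_resistance   # I^2 * R loss in the cable
total = bulb_power + loss                # power the supply must deliver
print(total)  # ~10.35 W: bulb power plus cable loss
```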

Next, voltage drop. I found a table which gives the voltage drop for different cables at different voltages, etc., and a 0.75 mm^2 two-core cable running 12 V has a loss of 0.058 V/A/m. That means on a 3 A current spanning 20 m (40 m of conductor, out and back) I'd have a voltage drop of 0.058 * 3 * 40, almost 7 V. Doesn't this sound like a lot? Have I worked something out wrong? I know DC loses out over distance, but this still seems like a lot.

Obviously the solution is to use thicker cable, but I just want to make sure my understanding is correct before going any further. For the record, 0.75 mm^2 is good for 6 A, so it's not like the cable is unsuitable for the amperage. From my understanding, it would be pretty typical for voltage drop to be no more than 10%.
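Redoing the arithmetic from the post, and following its assumption that the 20 m two-core run means 40 m of conductor (note: some tables quote mV/A/m per metre of *route*, already including the return conductor, which would halve this):

```python
drop_per_amp_metre = 0.058  # V/A/m from the cable table
current = 3.0               # A load current
conductor_length = 40.0     # m of conductor (2 * 20 m run, out and back)

voltage_drop = drop_per_amp_metre * current * conductor_length
print(voltage_drop)             # 6.96 V, the "almost 7 V" in the post
print(voltage_drop / 12 * 100)  # ~58% of a 12 V supply, far above a 10% target
```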

Thanks for your help

**Physics Forums | Science Articles, Homework Help, Discussion**

# Some basics of power loss and voltage drop in cables