DC transmission: lower loss at higher voltage?

SUMMARY

DC transmission lines exhibit lower losses at higher voltages: for a given power, raising the voltage lowers the current, and with it the I²R losses. The discussion clarifies that power can also be written as V²/R, but in that formula V must be the voltage drop along the line itself, not the supply voltage; plugging in the supply voltage instead gives the short-circuit dissipation. The correct way to analyze the transmitted power is P = VI, where V is the voltage between line and ground. The thread also notes that over long distances DC transmission loses less than AC because of the skin effect, which raises the effective resistance of a conductor carrying AC.

PREREQUISITES
  • Understanding of electrical power formulas: P=VI and P=V²/R
  • Knowledge of DC and AC transmission principles
  • Familiarity with the skin effect in conductors
  • Basic concepts of resistance in electrical circuits
NEXT STEPS
  • Research the skin effect and its impact on AC transmission efficiency
  • Explore advanced power transmission techniques, including HVDC (High Voltage Direct Current)
  • Study the differences in line losses between AC and DC transmission systems
  • Investigate methods to calculate power dissipation in electrical circuits
USEFUL FOR

Electrical engineers, power system designers, and professionals involved in optimizing transmission line efficiency will benefit from this discussion.

Jay_
Does a DC transmission line, for a given amount of power, do better at higher voltage? I know power is the product of the voltage and the current. The logic given to me by one of my peers was that I²R losses are less if the current is low, so for a given power, that would imply V should be higher.

But power can also be given as V²/R, right? So by this logic, wouldn't increasing the voltage increase the power dissipation? Which reasoning is right?
 
Jay_ said:
Does a DC transmission line, for a given amount of power, do better at higher voltage? I know power is the product of the voltage and the current. The logic given to me by one of my peers was that I²R losses are less if the current is low, so for a given power, that would imply V should be higher.

But power can also be given as V²/R, right? So by this logic, wouldn't increasing the voltage increase the power dissipation? Which reasoning is right?

The I²R formula gives the line loss because power is P = I·V, where V = IR is the voltage drop along the line. That is not the same as the supplied voltage, which is V = I(R + R_load).

The V²/R formula with V the supplied voltage assumes the line resistance is the ONLY resistance, i.e. it gives the power dissipation if you have a short circuit.

To express the line dissipation in terms of the supplied voltage, you must know the load resistance: calculate the current, use it to find the voltage drop across the line resistance, and subtract the load power from the total power... or just use the first formula once you know the current.
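
To make the distinction concrete, here is a minimal sketch in Python with made-up example values (V_supply, R_line, and R_load are illustrative numbers, not anything from this thread):

```python
V_supply = 10_000.0   # supply voltage, volts (assumed for illustration)
R_line   = 2.0        # series resistance of the line, ohms (assumed)
R_load   = 98.0       # load resistance, ohms (assumed)

# The current is set by the whole series circuit: line plus load.
I = V_supply / (R_line + R_load)     # 100 A

# Correct line loss: I^2 * R_line with that current.
P_line = I**2 * R_line               # 20 kW

# Equivalently: the voltage dropped along the line itself, squared, over R_line.
V_drop = I * R_line                  # 200 V
P_line_alt = V_drop**2 / R_line      # also 20 kW

# The misuse: plugging the *supply* voltage into V^2/R gives the
# short-circuit dissipation, not the operating line loss.
P_short = V_supply**2 / R_line       # 50 MW, off by a factor of 2500

print(P_line, P_line_alt, P_short)
```

The two correct routes agree because V_drop is just I·R_line; the third number is what you get if you forget which V the formula wants.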
 
Maybe don't think about it in terms of fixing the power.
Take, for example, a 120 watt light bulb. If you apply anything other than its rated voltage, it won't operate at its rated wattage. It draws current and dissipates power based on its resistance at a fixed temperature, not based on some fixed value of power. The power is determined from the product of the other two, not the other way around.
For a fixed load, a decrease in current through a line comes from a decrease in voltage. So talking about decreasing the current while increasing the voltage to keep the power constant isn't really correct in that setting: it changes the ratio V/I, which would imply a change in resistance, and we know the resistance won't change.
Any corrections welcomed.
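
Running rough numbers on the bulb makes the point. This sketch assumes a 120 V rating and, unrealistically, a constant hot resistance; both assumptions are mine, not from the post above:

```python
P_rated = 120.0              # watts, from the bulb example above
V_rated = 120.0              # volts (assumed rating, not stated above)
R = V_rated**2 / P_rated     # 120 ohms of (assumed constant) hot resistance

V_applied = 60.0             # apply half the rated voltage
I = V_applied / R            # 0.5 A
P = V_applied * I            # 30 W, a quarter of the rated 120 W

print(R, I, P)
```

Halving the voltage into a fixed resistance quarters the power, which is exactly the "power follows from V and R, not the other way around" point.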
 
Jay_ said:
Does a DC transmission line, for a given amount of power, do better at higher voltage?
Most likely.

I know power is the product of the voltage and the current. The logic given to me by one of my peers was that I²R losses are less if the current is low, so for a given power, that would imply V should be higher.
Yes.

But power can also be given as V²/R, right?
No, because you have already defined V to be the supply voltage. The equation for line losses will involve the voltage drop across the length of the transmission line, and this is not that V.
 
Applying the formulas

P = VI is the correct formula for power if V is the voltage between line and ground and I is the current through the conductors. P is the power transmitted by the line.

P = V²/R is the correct formula if V is the voltage difference from one end of the line to the other, not the voltage to ground. But in this case P is the power lost during transmission, not the power transmitted. If the line resistance R is small, then the voltage difference from one end to the other is nearly zero and the power losses are nearly zero.

jambaugh mentioned short circuits. It is the same thing. If you have a short circuit at one end of the line, then the voltage to ground is V at one end and zero at the other. In that case the voltage difference along the line is V, and V²/R is the power delivered to the line, but 100% of that power goes to losses and 0% is transmitted. That's one of the reasons why short circuits are bad.
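
Putting both formulas together answers the original question directly. A sketch with illustrative numbers (P and R_line are arbitrary choices, not from the thread):

```python
P = 1_000_000.0      # power to deliver, watts (illustrative)
R_line = 5.0         # line resistance, ohms (illustrative)

for V in (10_000.0, 100_000.0):
    I = P / V                  # line current needed at this voltage
    loss = I**2 * R_line       # I^2 R loss in the line
    print(f"V = {V:9.0f} V   I = {I:5.0f} A   loss = {loss:8.0f} W "
          f"({100 * loss / P:.3f}% of P)")
```

Raising the transmission voltage tenfold cuts the current tenfold and the loss a hundredfold (5% down to 0.05% here), since the loss fraction scales as P·R/V².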
 
Jay_ said:
Does a DC transmission line, for a given amount of power, do better at higher voltage? I know power is the product of the voltage and the current. The logic given to me by one of my peers was that I²R losses are less if the current is low, so for a given power, that would imply V should be higher.

But power can also be given as V²/R, right? So by this logic, wouldn't increasing the voltage increase the power dissipation? Which reasoning is right?

You need to be considering the right "V" in your comparison. The V dropped across the line would be smaller, because the I would be lower at a higher transmission voltage.
 
Jay_ said:
Does a DC transmission line, for a given amount of power, do better at higher voltage? I know power is the product of the voltage and the current. The logic given to me by one of my peers was that I²R losses are less if the current is low, so for a given power, that would imply V should be higher.

This is the case, but it is the case for either AC or DC. The reason why, over very long distances, DC transmission lines have less transmission loss is the skin effect: with AC, most of the current flows close to the surface of the cylindrical conductor. That means the inside of the conductor is not used much, so the skin effect effectively reduces the cross-section of the conductor and increases its resistance per unit length.

So DC does better than AC because of the decreased R in the I²R losses, assuming the same conductor diameter. It's the same I (at least in terms of r.m.s.).

But power can also be given as V²/R, right? So by this logic, wouldn't increasing the voltage increase the power dissipation? Which reasoning is right?

This is the case when the lossy R is in parallel with the load; that is leakage loss. For any parallel path that causes leakage, you want that R to be as high as possible, but for a series R you want it to be as low as possible.
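
For the skin-effect part, here is a back-of-envelope sketch. The copper constants are standard; the conductor radius and the "current flows in an outer annulus one skin depth thick" approximation are my simplifications, and the latter is rough at 60 Hz, where the skin depth is not much smaller than the radius:

```python
import math

rho = 1.68e-8               # resistivity of copper, ohm*m
mu0 = 4 * math.pi * 1e-7    # permeability of free space
f   = 60.0                  # AC frequency, Hz
a   = 0.015                 # conductor radius, m (assumed 3 cm diameter)

# Skin depth: AC current density falls off roughly as exp(-depth / delta).
delta = math.sqrt(rho / (math.pi * f * mu0))   # about 8.4 mm at 60 Hz

# DC uses the full cross-section; approximate AC as using only an
# outer annulus about one skin depth thick.
A_dc = math.pi * a**2
A_ac = math.pi * (a**2 - (a - delta)**2)

R_dc = rho / A_dc    # resistance per metre, DC
R_ac = rho / A_ac    # resistance per metre, AC (rough)

print(f"skin depth: {delta * 1000:.1f} mm, R_ac/R_dc: {R_ac / R_dc:.2f}")
```

Even at 60 Hz the AC resistance per unit length comes out noticeably higher than DC for the same conductor, and the gap widens with frequency and conductor size.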
 
