Why Does Using High Voltages in Power Transmission Reduce Losses?

SUMMARY

Using high voltages in power transmission reduces resistive losses because line loss follows P = I²R: for a fixed power delivered, raising the voltage lowers the current in the line. The apparent contradiction from P = V²/R disappears once you note that the V in that formula is the voltage drop across the line itself, not the full transmission voltage. In practice the line feeds a transformer whose impedance is much higher than the line's, so most of the voltage appears across the transformer and only a small drop appears across the line, keeping the loss low. Understanding this distinction is crucial for reasoning correctly about power transmission systems.

PREREQUISITES
  • Understanding of electrical power formulas, specifically P = I²R and P = V²/R.
  • Knowledge of transformer impedance and its role in power transmission.
  • Familiarity with high voltage power transmission systems.
  • Basic concepts of electrical resistance and current flow.
NEXT STEPS
  • Research the role of transformers in power distribution systems.
  • Learn about the impact of line impedance on voltage drop and power loss.
  • Explore high voltage transmission techniques and their benefits.
  • Study the principles of electrical engineering related to power loss minimization.
USEFUL FOR

Electrical engineers, power system designers, and students studying electrical engineering who are interested in optimizing power transmission efficiency.

paul_harris77
We are constantly told at school that in order to reduce power loss in overhead cables, high voltages and low currents are used, since P = I²R. This seems to make sense until you substitute I = V/R into P = VI and get P = V²/R. Now if the voltage is increased and the resistance is decreased, power loss is at its greatest, which is completely the opposite of the first result. Am I doing something wrong?

Many thanks

Paul Harris
 

What you're saying would be true if the high-voltage line were shorted to ground, so that all of the transmission voltage appeared across the line itself. In practice the line feeds a transformer with an impedance much higher than the line's, so most of the voltage is across the transformer and the drop across the line is very small. The V in P = V²/R has to be that small line drop, not the full transmission voltage, and with that substitution both formulas give the same (small) loss.
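To make this concrete, here is a short numerical sketch. The figures (1 MW delivered, a 5 Ω line) are made up purely for illustration; the point is that for a fixed delivered power, the line current scales inversely with the transmission voltage, and that P = V²/R agrees with P = I²R once V is taken as the drop across the line, I·R, rather than the transmission voltage.

```python
# Illustrative only: compare line losses at two transmission voltages
# for the same delivered power. Values are hypothetical.
P_delivered = 1_000_000.0  # W delivered to the load
R_line = 5.0               # ohms, resistance of the transmission line

for V_transmission in (10_000.0, 100_000.0):
    I = P_delivered / V_transmission   # line current for this voltage
    loss_i2r = I**2 * R_line           # loss from P = I^2 R
    V_drop = I * R_line                # voltage drop across the line itself
    loss_v2r = V_drop**2 / R_line      # loss from P = V^2 / R, V = line drop
    print(f"{V_transmission/1000:.0f} kV: I = {I:.0f} A, "
          f"I^2 R = {loss_i2r:.0f} W, V_drop^2 / R = {loss_v2r:.0f} W")
```

Running this, the 10 kV case loses 50 kW in the line while the 100 kV case loses only 500 W, a hundredfold reduction for a tenfold voltage increase, and the two formulas agree in both cases. Plugging the full transmission voltage into V²/R instead would give a nonsensical 2 GW at 100 kV, which is the trap in the original question.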
 
