Why does high voltage transmission reduce Joule losses?

  1. I'm a bit confused about why electrical power is transmitted at high voltage over long distances. This is generally explained as reducing the Joule losses, P = I^2 R: we minimize the current by maximizing the voltage.

    However, a trivial rearrangement using Ohm's law also gives P = V^2 / R, which appears to say that higher voltage means greater losses.

    What am I missing here? I think I don't understand the statement "minimize the current by maximizing the voltage" -- why are they inversely proportional? Ohm's law says they should be proportional (for fixed R -- perhaps this is the mistaken assumption?).
     
  2. Electrical power can be represented as P = V*I, which rearranges to I = P/V. To deliver the same amount of power, raising the voltage lowers the current that has to flow.

    A big practical factor is that by lowering the current, you don't need as large a wire to avoid overheating, and using smaller wire is more cost-effective when running it over long distances. (A short numerical sketch follows this post.)
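    To put some numbers on that, here is a minimal Python sketch. The 10 MW load, 5-ohm line resistance, and the two voltage levels are made-up illustrative values, not from the thread; the point is only that sending the same power at ten times the voltage cuts the current by ten and the I^2 R loss by a hundred.

```python
# Illustrative comparison with made-up numbers: carry the same power over
# the same line at two different voltages and compare the I^2 * R loss.

P_load = 10e6   # power to deliver to the far end, in watts (assumed value)
R_line = 5.0    # total resistance of the conductors, in ohms (assumed value)

for V_line in (10e3, 100e3):      # 10 kV vs 100 kV transmission
    I = P_load / V_line           # current needed to carry P_load at this voltage
    P_loss = I**2 * R_line        # Joule loss dissipated in the conductors
    print(f"{V_line/1e3:6.0f} kV: I = {I:7.1f} A, loss = {P_loss/1e3:8.1f} kW")
```

    With these made-up numbers, the 10 kV line dissipates about 5 MW in the conductors while carrying 10 MW, whereas the 100 kV line dissipates only about 50 kW.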
     
  3. berkeman

    Staff: Mentor

    In addition to dawin's comments, the problem with the equations you've written is this:

    ** The resistive losses are correctly expressed as P = I^2 R, because the I is the current flowing in the wire.

    ** The equation you wrote, P = V^2 / R, only applies if V is the voltage DROP along the wire, i.e. the small quantity I*R. That V is not the transmission line voltage, which is why you can't plug the transmission line voltage into it to calculate the power losses in the wire (see the sketch after this post).

    Hope that helps to clarify things.
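    To make the distinction concrete, here is a minimal Python sketch reusing the same made-up values as the earlier one (10 MW carried, 5-ohm line, 100 kV transmission). The P = V^2 / R form reproduces the correct loss only when V is taken as the small drop I*R along the wire; plugging in the full line voltage gives a meaningless answer.

```python
# Minimal sketch (same made-up numbers: 10 MW carried, 5 ohm line, 100 kV)
# showing which V belongs in P = V^2 / R.

P_load, R_line, V_line = 10e6, 5.0, 100e3

I = P_load / V_line          # current in the wire: 100 A
V_drop = I * R_line          # voltage DROP along the wire: 500 V, not 100 kV

loss_from_current = I**2 * R_line      # 50 kW
loss_from_drop = V_drop**2 / R_line    # also 50 kW -- the two formulas agree
loss_wrong_V = V_line**2 / R_line      # 2 GW -- wrong, the line voltage was used

print(loss_from_current, loss_from_drop, loss_wrong_V)
```

    The drop along the wire (500 V here) is tiny compared with the 100 kV line voltage, which is why the two formulas only seem to disagree: they agree exactly once V means the drop across the wire's resistance.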
     