I'm a bit confused about why power is transmitted at high voltage over long distances. The usual explanation is that this reduces Joule losses, P = I^2 R, and that we minimize the current by maximizing the voltage. However, a trivial rearrangement using Ohm's law also gives P = V^2 / R, which appears to say that higher voltage means *more* loss. What am I missing here? I think I don't understand the claim that we "minimize the current by maximizing the voltage" -- why would current and voltage be inversely proportional? Ohm's law says they should be directly proportional (for fixed R -- perhaps that is the mistaken assumption?).
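To make the apparent contradiction concrete, here is a quick numerical sketch (all numbers are made up): I fix the power delivered and the line resistance, then evaluate both formulas at two transmission voltages. The two expressions move in opposite directions, which is exactly what confuses me.

```python
P = 1e6   # power delivered to the load, W (hypothetical value)
R = 10.0  # line resistance, ohms (hypothetical value)

for V in (10e3, 100e3):     # two candidate transmission voltages, V
    I = P / V                # current needed to deliver P at voltage V
    loss_i2r = I**2 * R      # P = I^2 R  -> gets SMALLER at higher V
    loss_v2r = V**2 / R      # P = V^2 / R -> gets LARGER at higher V?!
    print(f"V = {V:>8.0f} V: I^2 R = {loss_i2r:>12.1f} W, V^2/R = {loss_v2r:>12.1f} W")
```

At 10 kV the I^2 R figure is 100 kW, while at 100 kV it drops to 1 kW; plugging the same voltages into V^2/R instead gives numbers that grow with V. Clearly the two formulas can't both describe the line loss with V taken to be the transmission voltage, so one of my substitutions must be wrong.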