Okay, I have a question for you engineer types. I teach a little high school physics and something has struck me as slightly odd in what they get taught. They learn about transformers and are told that a major application of them is in electrical power transmission: since the power dissipated in a resistor is P = I^2 R, reducing I (by stepping up V) reduces the power loss in high-voltage lines. My obvious question is why the reverse argument doesn't hold, i.e. using Ohm's law it's clear that P = V^2 / R, and the most obvious form, P = IV, suggests that stepping the voltage up or down makes no difference to the power loss at all.

So, am I wrong at this simple level? Alternatively, what is the real reason for high-voltage transmission? One guess I would have is that the wires are non-ohmic and high currents increase the resistance of the wire more than high voltage does. This could well be bollocks. Alternatively, is the AC nature of the transmission the key? I.e. does it cost more energy to correct the power factor due to the phase shift of the inductive transmission lines when the current is high, but it's easier when the voltage is high? I have no idea why this should be the case; I'm just guessing here. Can anyone clear this up for me?
Ohm's law refers to a single element, and a transmission line is not the load. In P = V^2 / R, V is the voltage drop across that resistor, i.e. the small drop along the line itself, not the line's nominal (transmission) voltage. The I^2 R loss really is the reason for transmission lines being high voltage.
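To make that concrete, here's a quick numerical sketch with made-up numbers (a 1 ohm line delivering 100 MW to the load). It shows that V^2 / R computed with the drop *across the line* agrees with I^2 R, and that both fall rapidly as the transmission voltage rises:

```python
# Hypothetical numbers for illustration: 100 MW delivered over a line
# with 1 ohm of series resistance.
P_load = 100e6   # delivered power, W
R_line = 1.0     # line series resistance, ohms

for V_transmit in (10e3, 100e3, 400e3):   # transmission voltage, V
    I = P_load / V_transmit               # line current (ignoring the small drop)
    V_drop = I * R_line                   # voltage lost ALONG THE LINE, not V_transmit
    loss_i2r = I**2 * R_line              # loss via I^2 R
    loss_v2r = V_drop**2 / R_line         # loss via V^2/R -- V is the drop, not 400 kV
    assert abs(loss_i2r - loss_v2r) < 1e-6
    print(f"{V_transmit/1e3:6.0f} kV: I = {I:8.1f} A, line loss = {loss_i2r/1e3:10.1f} kW")
```

At 10 kV the loss equals the entire delivered power; at 400 kV it is only 62.5 kW. Using the full transmission voltage in V^2 / R would be the mistake in the original question.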
The "R" in your two equations is not the same resistance. There are two loss elements in transmitting power: the conductor loss, I^2 * R, and the insulator loss, V^2 * G, where R is the line series resistance and G is the line shunt conductance.

As an example, consider a power level of 150 MW, i.e. 750 kV at 200 A (one phase, for illustration). Why 750 kV / 200 A, one may ask, and why not 200 V / 750 kA? Because the V^2 * G insulation loss is MUCH, MUCH lower than the I^2 * R conduction loss at voltages up to somewhere in the MV range. Suppose the shunt conductance of a cable is 1/(10 Gohm) = 0.1 nS and the series resistance is 10 milliohm. At 200 V / 750 kA, the insulator loss is (200 V)^2 * 1e-10 S = 4.0 uW, while the conductor loss is (750 kA)^2 * 1e-2 ohm = 5.625 GW! At 750 kV / 200 A, the insulator loss is (750 kV)^2 * 1e-10 S = 56.25 W, and the conductor loss is (200 A)^2 * 1e-2 ohm = 400 W. Quite a difference!

As the voltage is raised, the V^2 * G insulator loss increases, but the I^2 * R conductor loss decreases. Since the conductor losses are enormous in comparison, it pays greatly to transmit at high voltages. In a nutshell, insulators are less lossy than conductors: insulators are "superinsulating" at normal temperatures, whereas conductors are not superconducting unless cooled down to very low temperatures. In the future, when and if high-temperature superconductors are available, things could change. BR.
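The arithmetic above is easy to reproduce. A short script using the same assumed values (10 milliohm series resistance, 0.1 nS shunt conductance, 150 MW either way):

```python
# Assumed line parameters from the example above (illustrative only).
R = 10e-3    # series resistance, ohms
G = 0.1e-9   # shunt conductance, siemens (1 / 10 Gohm)

# Two ways to move 150 MW: low voltage / huge current, or the reverse.
for V, I in ((200.0, 750e3), (750e3, 200.0)):
    insulator_loss = V**2 * G   # loss through the insulation
    conductor_loss = I**2 * R   # loss in the conductor
    print(f"{V:9.0f} V / {I:8.0f} A: "
          f"insulator {insulator_loss:.3g} W, conductor {conductor_loss:.3g} W")
```

The output matches the figures quoted: 4 uW vs 5.625 GW at 200 V, and 56.25 W vs 400 W at 750 kV.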
From the economic standpoint, transmission line wires can be much smaller in diameter when the line voltage is high; to get the same power transfer at a low voltage, the conductors would cost a fortune. About loss: AC transmission also means one must match the line and load impedance to get the best power transfer. If this is not considered, power is lost by radiation. (Ever hear the hum on your radio when driving under a transmission line?)
Yes. The problem with this line of reasoning is that P = IV with the full transmission voltage is the power delivered by the electrical energy source, not the power dissipated in the line. Hope that helps. CS
Thanks all, yep I was mixing up the supplied nominal V of the power lines with the V lost by the resistance. Makes perfect sense now! Cheers