Okay, I have a question for you engineer types. I teach a little high school physics, and something has struck me as slightly odd in what the students get taught. They learn about transformers and are told that a major application is electrical power transmission: since the power dissipated in a resistor is P = I^2 R, reducing I (by stepping up V) reduces the power loss in high voltage lines.

My obvious question is why the reverse argument doesn't work equally well. By Ohm's law it's clear that P = V^2 / R, and the most basic form P = IV seems to show that stepping the voltage up or down makes no difference to the power loss at all. So, am I wrong at this simple level? Alternatively, what is the real reason for high voltage transmission?

One guess I would have is that the wires are non-ohmic, and that high currents increase the resistance of the wire more than high voltages do. This could well be bollocks. Alternatively, is the AC nature of the transmission the key? I.e. does it cost more energy to correct the power factor due to the phase shift of the inductive transmission lines when the current is high, and less when the voltage is high? I have no idea why that should be the case; I'm just guessing here.

Can anyone clear this up for me?
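To put numbers on the question, here is a quick Python sketch of the standard scenario (all values are made up for illustration): a fixed power is delivered to the far end of a line of fixed resistance, and the line current is set by the transmission voltage. It just evaluates the I^2 R loss at two different voltages; whether this is the right way to frame the problem is exactly what I'm asking.

```python
def line_loss(p_delivered, v_transmit, r_line):
    """I^2 R loss in the line, with the current fixed by the
    power being delivered at the chosen transmission voltage."""
    i = p_delivered / v_transmit  # same current flows through the line
    return i**2 * r_line

P = 1_000_000.0   # 1 MW delivered (hypothetical)
R = 10.0          # line resistance in ohms (hypothetical)

loss_low = line_loss(P, 10_000.0, R)    # at 10 kV:  I = 100 A
loss_high = line_loss(P, 100_000.0, R)  # at 100 kV: I = 10 A

print(loss_low, loss_high)  # 100000.0 1000.0 -- 100x less loss at 10x the voltage
```

Note the sketch never plugs the 10 kV or 100 kV figure into V^2 / R; the only V you could legitimately use there would be the voltage drop across the line itself (I times R), which is not the same thing as the transmission voltage.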