From what I have read, electrical power is transmitted at high voltage and low current in order to reduce energy lost as heat in the wires (P_loss = I^2·R). High voltage at low current carries the same power as low voltage at high current.

My question is: how do they actually do it? If they apply a higher voltage, and the resistance of the transmission wire doesn't change, how is the current made smaller? From V = IR, applying a higher voltage should *increase* the current, and therefore the power. I'm confused.
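To show the claim I've read about with concrete numbers, here is a quick sketch of the loss arithmetic (the power, voltages, and line resistance are made-up illustrative values, not real grid figures):

```python
def line_loss(power_w: float, voltage_v: float, resistance_ohm: float) -> float:
    """Resistive loss in a transmission line delivering a fixed power.

    The current needed to deliver `power_w` at `voltage_v` is I = P / V,
    and the heat dissipated in the line is I^2 * R.
    """
    current_a = power_w / voltage_v
    return current_a ** 2 * resistance_ohm

# Deliver 1 MW through a line with 1 ohm of resistance (illustrative values):
low_v_loss = line_loss(1e6, 10_000, 1.0)    # at 10 kV:  I = 100 A, loss = 10,000 W
high_v_loss = line_loss(1e6, 100_000, 1.0)  # at 100 kV: I = 10 A,  loss = 100 W

# Raising the voltage 10x cuts the current 10x and the I^2*R loss 100x.
print(low_v_loss, high_v_loss)
```

This is the arithmetic behind the claim, but it assumes the current is set by the power being delivered (I = P/V) rather than by Ohm's law applied to the line alone, which is exactly the part I don't understand.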