Why is it inefficient to use low voltage when transmitting electricity?
These are all basically the same few formulas, but I wrote down all iterations of them.
Anything from Joule's Law/Joule Heating
The Attempt at a Solution
I know the answer is that when current is high, there is more heat loss, and that low voltage means high current and vice versa, so we transmit at high voltage and low current. But when I look at the formulas and try to see how they fit with this explanation, I simply cannot wrap my head around it. I feel like I'm misunderstanding something fundamental about Joule's law.
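Here is a quick numeric sketch of that explanation as I understand it (the 10 kW delivered power and 1 Ω line resistance are made-up values, just for illustration): for a fixed power delivered to the load, a higher transmission voltage means a lower current, and the heat lost in the line goes as i².

```python
# Sketch of the standard explanation: for a fixed delivered power
# P = V * i, raising the transmission voltage V lowers the current i,
# and the Joule heating in the line is E = R * i^2 * t.
# The numbers below are assumed purely for illustration.

P_delivered = 10_000.0  # W, power to deliver (assumed)
R_line = 1.0            # ohm, resistance of the transmission line (assumed)
t = 1.0                 # s, duration of transmission

for V in (100.0, 10_000.0):
    i = P_delivered / V          # current required at this voltage
    E_loss = R_line * i**2 * t   # Joule heating in the line
    print(f"V = {V:>8.0f} V  ->  i = {i:>6.1f} A,  heat loss = {E_loss:.2f} J")
```

With these made-up numbers, transmitting at 10,000 V instead of 100 V cuts the line loss from 10,000 J to 1 J, which matches the stated explanation, but I still can't reconcile it with the algebra below.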
Here's what I did:
Joule's law states P ∝ i² when we hold resistance constant
∴ E (heat loss) = Ri²t, where we can replace R using Ohm's law (R = V/i)
∴ E = Vi²t/i, and we can cancel one i
∴ E = Vit. However, in this form, increasing voltage will also increase the heat loss, since everything is multiplied together. Why would we transmit at high voltage, then?
Alternatively, we could take E = Ri²t and hold i² and t constant
∴ E ∝ R = V/i, and in this case increasing current would actually decrease heat loss, while increasing voltage would increase it.
What I don't understand is why we transmit at high voltage, given these formulas. I also don't understand where we get the fact that high voltage means low current and vice versa. The logic would fit if E ∝ i/V, but that isn't true. I feel like I've missed something obvious; if someone could explain, I'd be very grateful.