Power lines are designed to deliver power at extremely high voltage to minimize energy dissipation. In lecture this was justified by the fact that I = P/V, so for a given power, high V => low I, and since I^2 R represents the power dissipated in the line, that term is minimized.

That makes sense to me. However, when I look at it from a different perspective, I tangle the ideas and get something incorrect. The power lines are made of a given material and should have a (relatively) fixed resistance (assuming Ohm's law applies). Raising the voltage across a fixed resistance should then raise the current. So in trying to minimize energy dissipation, wouldn't increasing the voltage also increase the current? (Of course, this would also increase the power being delivered.)

It seems to me that if you pick a power level and a given material (with fixed resistance), you really don't have much choice in varying the voltage: you just have to pick the one where V*I equals that power level. Otherwise, you could change the geometry of your power lines, but that is minimizing R, not I.

Could someone please clarify? Thank you.
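For reference, here is the lecture argument as I understand it, written out with concrete numbers. The specific values (a line resistance of R = 1 ohm and a delivered power of P = 1 MW) are just ones I made up for illustration, not from the lecture; R here is the resistance of the line itself and V is the transmission voltage.

```latex
% Standard argument with made-up numbers:
% deliver P = 1 MW through a line of fixed resistance R = 1 ohm.
\[
  I = \frac{P}{V}, \qquad
  P_{\text{loss}} = I^{2} R = \left(\frac{P}{V}\right)^{2} R
\]
\[
  V = 10\ \mathrm{kV}:\ I = 100\ \mathrm{A},\ P_{\text{loss}} = 10\ \mathrm{kW};
  \qquad
  V = 100\ \mathrm{kV}:\ I = 10\ \mathrm{A},\ P_{\text{loss}} = 100\ \mathrm{W}.
\]
```

This is the version that makes sense to me; my question above is about reconciling it with V = IR applied to the line's fixed resistance.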