For a given amount of power, does a DC transmission line do better at higher voltage? I know power is the product of voltage and current. The logic given to me by one of my peers was that I²R losses are lower if the current is low, so for a given power, that would imply V should be higher. But power can also be written as V²/R, right? So by that logic, wouldn't increasing the voltage increase the power dissipation? Which reasoning is right?
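To make the puzzle concrete, here is a quick numerical sketch with made-up values (1 MW delivered and a hypothetical 5 Ω line resistance), evaluating both expressions at two transmission voltages:

```python
# Hypothetical numbers: deliver P = 1 MW through a line of
# resistance R_line = 5 ohms, once at 10 kV and once at 100 kV.
P = 1e6          # power delivered, watts
R_line = 5.0     # line resistance, ohms (assumed for illustration)

for V in (10e3, 100e3):
    I = P / V                  # current needed to deliver the same power
    loss_i2r = I**2 * R_line   # the I²R argument
    other = V**2 / R_line      # the V²/R expression, using the same R
    print(f"V = {V/1e3:.0f} kV: I²R = {loss_i2r:,.0f} W, V²/R = {other:,.0f} W")
```

Running this shows the two expressions move in opposite directions as V rises, which is exactly the apparent contradiction: I²R falls from 50 kW to 500 W, while V²/R grows enormously.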