I would like to understand this problem a bit better. A small city requires 10 MW of power. Suppose that instead of using high-voltage lines to supply the power, the power is delivered at 120 V. Assuming a two-wire line of 0.50 cm diameter copper wire, estimate the cost of the energy lost to heat per hour per meter. Assume the cost of electricity is about 10 cents per kWh.

Here's what I'm wondering: should I find the cross-sectional area of the wire (using A = pi*d^2/4) and then use Ohm's law to solve for the current, knowing that V = 120 V? Thanks for your help.
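One way to sanity-check the approach is a quick numeric sketch. Note that the 120 V appears across the load, not across the transmission wires, so the current comes from I = P/V and the heat loss in the wires from I^2 * R. The sketch below assumes a standard copper resistivity of about 1.68e-8 ohm·m (not given in the problem statement) and counts both wires of the two-wire line:

```python
import math

# Given values from the problem statement
P_delivered = 10e6      # W, power the city requires
V = 120.0               # V, delivery voltage
d = 0.50e-2             # m, wire diameter
rho_cu = 1.68e-8        # ohm*m, assumed resistivity of copper (not given)
cost_per_kwh = 0.10     # $/kWh

# Cross-sectional area of one wire: A = pi * d^2 / 4
A = math.pi * d**2 / 4

# Current drawn by the city: I = P / V (the 120 V is across the
# load, so the wires' loss is I^2 * R with this current, not V^2 / R)
I = P_delivered / V

# Resistance per meter of *line*: two wires, each rho / A per meter
R_per_m = 2 * rho_cu / A

# Power dissipated as heat per meter of line
P_loss_per_m = I**2 * R_per_m             # W per meter

# Energy lost per hour per meter, and its cost
E_kwh_per_hr_per_m = P_loss_per_m / 1000  # kWh per hour per meter
cost_per_hr_per_m = E_kwh_per_hr_per_m * cost_per_kwh

print(f"I = {I:.0f} A")
print(f"R per meter (both wires) = {R_per_m:.2e} ohm/m")
print(f"Heat loss per meter = {P_loss_per_m/1e6:.1f} MW/m")
print(f"Cost = ${cost_per_hr_per_m:.0f} per hour per meter")
```

With these numbers the current is enormous (roughly 8.3e4 A), which is exactly why the loss per meter comes out so large and why real grids transmit at high voltage.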