1. The problem statement, all variables and given/known data

A small city requires about 15 MW of power. Suppose that instead of using high-voltage lines to supply the power, the power is delivered at 120 V. Assuming a two-wire line of 0.50-cm-diameter copper wire, estimate the cost of the energy lost to heat per hour per meter. Assume the cost of electricity is about 12 cents per kWh.

2. Relevant equations

P = V^2 / R, R = ρL / A, P = IV

3. The attempt at a solution

Can I have a hint for solving this? I know the formulas, but I'm not sure how to approach the problem. What does "energy lost to heat" mean here?
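A minimal numerical sketch of how the relevant equations might combine, assuming "energy lost to heat" means the I^2·R dissipation in the two copper wires and taking the resistivity of copper as roughly 1.68 x 10^-8 Ω·m at 20 °C (that value is not given in the problem, and the variable names are just for illustration):

```python
import math

# Given in the problem statement
P_delivered = 15e6        # power the city needs, W
V = 120.0                 # delivery voltage, V
d = 0.50e-2               # wire diameter, m
cost_per_kWh = 0.12       # dollars per kWh

# Assumption: resistivity of copper near 20 C (not stated in the problem)
rho_cu = 1.68e-8          # ohm * m

# Current drawn at 120 V, from P = I V
I = P_delivered / V

# Resistance of 1 m of line: two wires, each 1 m long, from R = rho L / A
A = math.pi * (d / 2) ** 2
R_per_m = 2 * rho_cu * 1.0 / A

# Power dissipated as heat in that 1 m of line, from P = I^2 R
P_loss = I ** 2 * R_per_m

# Energy lost per hour per meter (in kWh), then its cost
E_kWh_per_hour = P_loss / 1000.0   # W running for 1 h -> Wh, /1000 -> kWh
cost_per_hour_per_m = E_kWh_per_hour * cost_per_kWh

print(f"Current: {I:.3g} A")
print(f"Line resistance per meter: {R_per_m:.3g} ohm")
print(f"Power lost per meter: {P_loss:.3g} W")
print(f"Cost per hour per meter: ${cost_per_hour_per_m:,.0f}")
```

If the loss per meter comes out larger than the 15 MW being delivered, that is the expected moral of the estimate: at 120 V the required current, and hence the I^2·R heating, is enormous, which is why power is transmitted at high voltage instead.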