1. The problem statement, all variables and given/known data

A constant current of 3 A for 4 hours is required to charge an automotive battery. The battery's terminal voltage is v(t) = 10 + t/2 V, where t is in hours. Assuming an electricity cost of $0.12 per kWh, what is the cost to charge the battery?

2. Relevant equations

p(t) = i(t)v(t)

3. The attempt at a solution

I thought I could multiply the current (3 A) by the voltage at t = 4 h, (10 + 4/2) V = 12 V, to get a power of 36 W. Dividing 36 by 1000 gives 0.036 kW; multiplying that by the cost ($0.12/kWh) gives a cost per hour, and multiplying that by 4 hours should give the cost to charge the battery. However, this isn't giving me the right answer. I would greatly appreciate it if someone could point me in the right direction. Thank you.
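For reference, here is a short Python sketch of the calculation I attempted above (treating the power as constant, using the voltage evaluated at t = 4 h); the variable names are my own:

```python
# Sketch of the attempted calculation: assume constant power,
# with the terminal voltage evaluated only at t = 4 h.
current_a = 3.0                          # charging current, A
voltage_v = 10 + 4 / 2                   # v(t) at t = 4 h -> 12 V
power_kw = current_a * voltage_v / 1000  # 36 W -> 0.036 kW
cost_per_kwh = 0.12                      # electricity price, $/kWh
hours = 4                                # charging time, h
cost = power_kw * hours * cost_per_kwh   # energy (kWh) times price
print(round(cost, 5))                    # -> 0.01728, about 1.7 cents
```

This reproduces the number my approach gives, which is not the expected answer.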