Now, I think I know how to do this one, but I'm just not sure if I'm using the correct figures. To figure out how much it costs to run an appliance for a certain amount of time per month, you need to know the wattage of the appliance, correct? Does the voltage need to be known too? For example, a guitar amplifier puts out 120 watts and runs on 250 volts. I tried using just the wattage, converting it to kW, and multiplying by 3600 to get kWh. For some reason, I got outrageous numbers for the cost in dollars. Can anybody tell me what I'm doing wrong? By the way, power costs $0.075 per kWh.
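
For comparison, here is a sketch of how the arithmetic usually goes: a kilowatt-hour is power in kW multiplied by hours of use (not by 3600, which converts hours to seconds). The function name, the 60 hours per month, and the use of the amp's 120 W figure are my own assumptions for illustration; voltage isn't needed once you have the wattage.

```python
def monthly_cost(watts, hours_per_month, rate_per_kwh=0.075):
    """Estimated monthly cost in dollars (hypothetical helper)."""
    kw = watts / 1000.0            # convert watts to kilowatts
    kwh = kw * hours_per_month     # energy = power (kW) x time (hours)
    return kwh * rate_per_kwh      # cost at the given $/kWh rate

# e.g. a 120 W amp used 60 hours a month at $0.075/kWh:
# 0.12 kW x 60 h = 7.2 kWh -> about $0.54
print(round(monthly_cost(120, 60), 2))
```

Multiplying by 3600 instead of the hours of use would inflate the result by a factor in the thousands, which would explain the outrageous numbers.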