Calculating Appliance Energy Costs: Wattage vs. Voltage

  • Thread starter: saschouch

Homework Help Overview

The discussion revolves around calculating the energy costs associated with running an appliance, specifically focusing on the relationship between wattage, voltage, and cost per kilowatt-hour. The original poster is attempting to understand the correct figures and calculations needed to determine the cost of running a guitar amplifier.

Discussion Character

  • Exploratory, Assumption checking, Mathematical reasoning

Approaches and Questions Raised

  • The original poster questions whether voltage is necessary for the calculation and expresses confusion over their results when using wattage alone. Participants discuss the importance of converting wattage to kilowatts and the correct multiplication factor for time in hours.

Discussion Status

Participants are actively engaging with the original poster's calculations, providing clarifications and corrections regarding the conversion of wattage to kilowatts and the appropriate time factor to use. There is a productive exchange of ideas, with some participants offering insights into common misconceptions about power consumption.

Contextual Notes

The original poster mentions a specific power cost and the wattage of the appliance, indicating a practical application of the calculations. There is an acknowledgment of potential discrepancies in the original calculations, particularly regarding the time factor used.

saschouch
Now, I think I know how to do this one, but I'm just not sure if I'm using the correct figures. To figure out how much it would cost to run an appliance for a certain amount of time per month, you would need to know the wattage of the appliance, correct? Does the voltage need to be known? For example, a guitar amplifier is rated at 120 watts and 250 volts. I tried to use just the wattage, converted it to kW, and multiplied by 3600 to get kWh. For some reason, I got outrageous numbers for the amount of $ it costs. Can anybody tell me what I'm doing wrong? By the way, power costs $0.075 per kWh.
 
You also need to divide the wattage by 1000 so that you have kilowatts. The voltage doesn't matter; it's already been accounted for in the calculation of the wattage.

As an aside, devices are often labeled with their maximum power consumption, i.e. your amp won't be drawing a full 120 watts if it's just sitting there powered on while you're not playing anything on your guitar.

Edit: Sorry, you said that you had converted to kW in doing your calculation. I get 0.12 kWh × $0.075/kWh ≈ 0.9 cents per hour to run your amp at its maximum power consumption. That doesn't sound so bad, I think.

Edit2: Heh, the corrections just keep coming :wink: Your problem is likely that you're multiplying by 3600 (the number of seconds in an hour, I suppose, was your reasoning) to get the kWh. You should actually be multiplying by 1, the number of hours in an hour.
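To make the arithmetic concrete, here is a minimal sketch of the calculation being described, assuming the 120 W and $0.075/kWh figures from this thread (the function name is just illustrative):

```python
# A minimal sketch of the cost calculation discussed in this thread:
# convert watts to kilowatts, multiply by hours, then by the rate.
def running_cost(watts, hours, rate_per_kwh):
    """Return the cost in dollars of running an appliance."""
    kilowatts = watts / 1000          # convert W to kW
    energy_kwh = kilowatts * hours    # energy used, in kilowatt-hours
    return energy_kwh * rate_per_kwh

# Figures from the thread: a 120 W amp at $0.075 per kWh, run for 1 hour.
print(running_cost(120, 1, 0.075))   # 0.009 -> roughly 0.9 cents per hour
```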
 
Ok, I see now. So if at maximum output it's 0.12 kW, then I take that times 60, and times 60 again to get hours, and I have an answer of 432. So that's in kWh. Then taking that times my $0.075 will give how much it costs to run my amp for 1 hour. Thanks.
 
See my second edit: you should be multiplying by hours, not seconds.
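For comparison, a quick sketch (same figures as above, purely illustrative) of what the ×3600 slip produces versus the correct factor:

```python
watts, rate = 120.0, 0.075            # figures from the thread

# Mistaken version: multiplying by 3600 (seconds per hour)
wrong = (watts / 1000) * 3600 * rate  # treats 432 as the "kWh" figure
print(wrong)                          # 32.4 -> $32.40/hour, the outrageous number

# Correct version: multiply by the number of hours (here, 1)
right = (watts / 1000) * 1 * rate     # 0.12 kWh at $0.075/kWh
print(right)                          # 0.009 -> about a cent per hour
```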
 
Yes! Thanks a ton!
 
