physxGuy
#5
Nov19-12, 11:58 AM
P: 7
Quote by K^2:
The voltage is constant — whatever your outlet supplies. (Yes, fluctuations in that do happen, so that could be a factor.) Under normal operation, the power consumption is determined by the current being drawn, which can change during operation. A cold electric heater will draw more power than a hot one, because a hot coil has higher resistance. A motor will draw less power once it spins up to maximum speed. And so on.

But yes, the power rating is the design maximum. It's what the device shouldn't exceed under normal operation, if nothing goes wrong. In general, it's a little more complicated than simply having fixed resistors. There is more going on, depending on what the device does. But overall, your statement is close to the truth: if nothing is wrong, the device will not draw more. It can certainly draw less.
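A quick sketch of the resistance effect described in the quote: for a resistive heater on a fixed mains voltage, P = V²/R, and R rises roughly linearly with temperature. All numbers below (voltage, temperature coefficient, temperatures) are illustrative assumptions, not values from the thread; real heating elements use alloys like nichrome with a much smaller temperature coefficient than the metal-like value used here.

```python
# Illustrative sketch: power drawn by a resistive heating coil at fixed
# mains voltage, as its resistance rises with temperature.
# All numbers are assumptions chosen for illustration.

V = 230.0       # mains voltage in volts (assumed)
R_COLD = 52.9   # coil resistance in ohms at 20 C (chosen so cold power = 1000 W)
ALPHA = 0.0045  # linear temperature coefficient, 1/C (metal-like; nichrome is far lower)

def power(temp_c):
    """Power in watts drawn at coil temperature temp_c, assuming R(T) is linear."""
    resistance = R_COLD * (1 + ALPHA * (temp_c - 20.0))
    return V * V / resistance

print(power(20.0))   # cold coil: 1000 W
print(power(500.0))  # hot coil: noticeably less
```

With these assumed numbers the cold coil draws exactly 1000 W and the hot coil substantially less, which is the direction of the effect K^2 describes; the size of the drop depends entirely on the coefficient of the actual coil material.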
I would be interested in hearing what there is other than resistors, just out of curiosity.

So, as I understand it, I am right about the power then. For example, a 1000W water heater is likely to draw the full 1000W until it heats up, gradually decreasing its wattage as it warms, and at the *set* temperature it uses just enough power to not cool down, maybe something like 800W.
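One way the "800W at the set temperature" figure could come about in practice: many heaters don't smoothly reduce their draw but instead have a thermostat that switches a fixed-wattage element on and off, so the long-run *average* power is the rated power times the fraction of time the element is on. A minimal sketch, with the duty cycle as an assumed number:

```python
# Sketch: a thermostat-controlled heater typically cycles a fixed-power
# element on and off; the long-run average power is rated power times
# the on-time fraction (duty cycle). The duty cycle here is assumed.

RATED_POWER = 1000.0  # watts while the element is switched on
duty_cycle = 0.8      # assumed: element is on 80% of the time at set temperature

average_power = RATED_POWER * duty_cycle
print(average_power)  # 800.0 W average, matching the 800 W guess
```

So an 800W average is entirely plausible even if the element itself only ever draws 1000W or 0W.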

Another example: a motor will draw the maximum allowed wattage for maybe 5 seconds, until it spins up to full speed, at which point it uses just enough to keep spinning. For example, a 200W motor will use the full 200W for 5 seconds, then maybe 120W to keep operating.
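The motor example above can be turned into a simple energy calculation: full rated power during spin-up, then a lower steady-state draw. The 200W/120W/5s figures are the ones from the example (the 120W steady draw being an assumed guess, as in the original):

```python
# Sketch of the motor example: rated power during spin-up, then a lower
# steady-state draw. Numbers are the (assumed) ones from the example above.

RATED_POWER = 200.0   # W during spin-up
STEADY_POWER = 120.0  # W once at full speed (assumed)
SPIN_UP_TIME = 5.0    # seconds of spin-up

def energy_used(total_seconds):
    """Energy in joules consumed over total_seconds from a cold start."""
    if total_seconds <= SPIN_UP_TIME:
        return RATED_POWER * total_seconds
    return RATED_POWER * SPIN_UP_TIME + STEADY_POWER * (total_seconds - SPIN_UP_TIME)

print(energy_used(5.0))   # 1000 J during the spin-up itself
print(energy_used(60.0))  # 1000 + 120 * 55 = 7600 J over the first minute
```

This is of course an idealized step profile; a real motor's draw ramps down as it approaches full speed rather than dropping instantly.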

Assuming I understood the concept correctly, are my numbers in these examples way off, or are they plausible?