Does it matter if I turn my computer off?

In summary: essentially all of the electrical energy a computer (or monitor) draws ends up as heat in the room, so on a cold winter night it displaces some of the load on the office heating. Whether that is a wash overall depends on how the office is heated: against electric resistance heating it is roughly neutral, but against a heat pump or a gas furnace the computer is the more expensive way to make heat. Even a giant spinning hammer attached to the processor would not change the accounting, since the motor losses, air turbulence, and sound all degrade to heat in the end. How much of that heat actually offsets the heater also depends on practical details, such as where the computer and monitor sit relative to the thermostat.
  • #1
zacharyr
If it's a cold winter night and my office keeps the thermostat set to 65°F, does it really waste any power if I leave my computer running with the monitor off? Isn't my computer just helping to heat the office and taking away load from the heater?

Bonus question: what if my computer has a giant spinning hammer that turns whenever the processor is running? Isn't all that energy (the heat from the motor, the air turbulence around the hammer, any sound from the hammer) eventually just going to heat?

Bonus bonus question: What if I leave my monitor on?
 
  • #2
If your office is being heated all night when there's no one there then that's wasting power already, and in this case it is true that leaving your computer on will help heat the office and therefore reduce the power requirement of your room (space) heating. Whether this is neutral to the overall energy requirement of the office depends on the efficiency of the space heating. If it's reverse-cycle air conditioning then it's a bit more efficient than the direct heating of your computer, and so there's still some energy deficit.

From an energy point of view the situation for the monitor is no different from that of the computer: almost all the energy coming from the monitor is in the form of heat, and any light will be converted to heat in the walls of the office or the objects in the room anyway.
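To put rough numbers on the heat-pump comparison, here is a minimal sketch; the 300 W computer load and the COP of 3 are illustrative assumptions, not figures from the post.

```python
# Compare the electricity needed to put the same heat into the room two ways:
# (a) dissipating it in a computer, (b) pumping it in with a reverse-cycle A/C.
# The 300 W load and the COP of 3 are illustrative assumptions.

computer_power_w = 300.0   # assumed computer draw; essentially all of it becomes heat
heat_pump_cop = 3.0        # assumed coefficient of performance of the heat pump

heat_delivered_w = computer_power_w                 # resistance-style heating: 1 W in, 1 W of heat out
heat_pump_electricity_w = heat_delivered_w / heat_pump_cop

print(f"Computer as heater: {computer_power_w:.0f} W of electricity for {heat_delivered_w:.0f} W of heat")
print(f"Heat pump (COP {heat_pump_cop:.0f}): {heat_pump_electricity_w:.0f} W of electricity for the same heat")
print(f"Energy deficit from heating with the computer: {computer_power_w - heat_pump_electricity_w:.0f} W")
```

With these assumed numbers the computer displaces heat-pump heat at a 3:1 electricity penalty, which is the "energy deficit" described above; against plain resistance heating the penalty would be zero.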
 
  • #3
In addition to the relative heating efficiency, the total energy use would also depend on how well-mixed the office atmosphere is, and on the relative locations of your computer/monitor and the thermostat. In the extreme case where the thermostat is located right next to the monitor, leaving the monitor on could "trick" the heater into staying off...
 
  • #4
First of all, you might consider leaving your computer running simply for electronic reasons, that is to say, electrical equipment undergoes the most stress when it is being turned on and off, and you reduce wear on your equipment by leaving it running. At least, this is what I've heard, though to be honest, I haven't really investigated that question, which is rather interesting, in and of itself.

From an energy standpoint, it IS wasteful to leave your computer running. It is even more wasteful to attach your hammer gobjabber or a parachute or whatever the hell crazy contraption you have hooked up (where do you work, anyways buddy, swinging hammers incorporated?) It is also more wasteful to leave your monitor running. Why? Because anytime you employ a mechanical or electrical device, you are dealing with inefficiencies of energy conversion; no engineering device (so far as I know of) is 100% efficient, and most are much less efficient than 100%, although there are certain devices, such as electrical transformers, which have high efficiencies. Yes, the computer and monitor and whatever else will be generating some heat transfer into the environment, thus reducing the need for the activation of the primary office heater. However, a lot of that energy is also going towards powering the hard drive, moving those electrons about the circuits, overcoming the friction of your hammer-drive apparatus, and so on and so forth.

Thus, the real question is: Which device makes for a more efficient heater, (a.) your computer setup, or (b.) the primary office heater? If it is (b.), you should deactivate (a.). In the astounding case that your computer setup makes for a more efficient heat transfer device, you SHOULD leave your computer running. This is extremely unlikely, considering the design intentions of the respective devices.
 
  • #5
First question already answered...
zacharyr said:
Bonus question: what if my computer has a giant spinning hammer that turns whenever the processor is running? Isn't all that energy (the heat from the motor, the air turbulence around the hammer, any sound from the hammer) eventually just going to heat?
Yes.
Bonus bonus question: What if I leave my monitor on?
What about it? Same answers as above.
 
  • #6
mordechai9 said:
First of all, you might consider leaving your computer running simply for electronic reasons, that is to say, electrical equipment undergoes the most stress when it is being turned on and off, and you reduce wear on your equipment by leaving it running. At least, this is what I've heard, though to be honest, I haven't really investigated that question, which is rather interesting, in and of itself.
That is unlikely to be what kills a computer, given that they are typically discarded after a relatively short lifespan due to obsolescence. Consider that people sometimes keep TVs for more than a decade, turning them on and off several times a day and they don't burn out.
From an energy standpoint, it IS wasteful to leave your computer running.
As others said, that really does depend on whether the room the computer is in is being heated (and how it is being heated). If the room is being heated by regular electric resistance heating, then it is not wasting anything at all. If it is being heated by gas (the worst case), you are wasting about 30%-40% of the energy and probably spending double the money (a rough comparison is sketched at the end of this post).
It is even more wasteful to attach your hammer gobjabber or a parachute or whatever the hell crazy contraption you have hooked up (where do you work, anyways buddy, swinging hammers incorporated?)
I'm pretty sure that that was a hypothetical for the purpose of understanding if mechanical energy in a closed room ends up as heat.
It is also more wasteful to leave your monitor running. Why? Because anytime you employ a mechanical or electrical device, you are dealing with inefficiencies of energy conversion; no engineering device (so far as I know of) is 100% efficient, and most are much less efficient than 100%, although there are certain devices, such as electrical transformers, which have high efficiencies. Yes, the computer and monitor and whatever else will be generating some heat transfer into the environment, thus reducing the need for the activation of the primary office heater. However, a lot of that energy is also going towards powering the hard drive, moving those electrons about the circuits, overcoming the friction of your hammer-drive apparatus, and so on and so forth.
And that's why he asked the question - to answer that question. And you have it wrong: all those inefficiencies, the resistance in the wires, the friction in the hard drive, etc., are where the heat comes from. Added together, those inefficiencies total 100% of the energy use of the computer. I.e., all of the energy used by the computer becomes heat.
Thus, the real question is: Which device makes for a more efficient heater, (a.) your computer setup, or (b.) the primary office heater? If it is (b.), you should deactivate (a.). In the astounding case that your computer setup makes for a more efficient heat transfer device, you SHOULD leave your computer running. This is extremely unlikely, considering the design intentions of the respective devices.
Unfortunately, electric resistance heating is still a pretty common heating method for office buildings that are more than a few years old. So it is quite possible that his office building has it.

Heck, some office buildings don't even have heat for the interior zones. The computers and lights provide more than is needed to keep the building warm in the winter (they use some form of air conditioning year-round).
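As a rough illustration of the gas-versus-electric point above, the sketch below compares the primary (fuel) energy needed to put 1 kWh of heat into the room by each route. The power-plant, grid, and furnace efficiencies are ballpark assumptions for illustration, not figures from this thread.

```python
# Primary (fuel) energy needed to deliver 1 kWh of heat into the room by two routes.
# All efficiency figures are ballpark assumptions for illustration only.

heat_needed_kwh = 1.0

# Route 1: grid electricity dissipated in the room (computer or resistance heater).
power_plant_eff = 0.38   # assumed thermal power-plant efficiency
grid_eff = 0.93          # assumed transmission/distribution efficiency
fuel_for_electric_kwh = heat_needed_kwh / (power_plant_eff * grid_eff)

# Route 2: on-site gas furnace.
furnace_eff = 0.85       # assumed furnace efficiency
fuel_for_gas_kwh = heat_needed_kwh / furnace_eff

print(f"Fuel burned per kWh of electric-resistance heat: {fuel_for_electric_kwh:.2f} kWh")
print(f"Fuel burned per kWh of gas-furnace heat:         {fuel_for_gas_kwh:.2f} kWh")
print(f"Ratio (electric / gas): {fuel_for_electric_kwh / fuel_for_gas_kwh:.1f}")
```

Under these assumptions, making heat from grid electricity (whether in a computer or a resistance heater) burns roughly two to three times as much fuel as a decent gas furnace per unit of heat delivered, which is why a gas-heated building is the least favourable case for leaving the computer on.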
 
  • #7
And you have it wrong: all those inefficiencies, the resistance in the wires, the friction in the hard drive, etc., are where the heat comes from. Added together, those inefficiencies total 100% of the energy use of the computer. I.e., all of the energy used by the computer becomes heat.

Sorry, I disagree. If 100% of the energy use of the computer went to inefficiencies, the computer wouldn't have any energy going into transistor operation, hard drive operation, and so on. Not all of the energy goes into heat. Most of the energy goes into the motion of electrons and physical apparatus. Now, if you want to make the claim that all motion reduces down to heat energy transfer, you can go ahead and make that claim; however, that's degenerating into definitional or semantic argumentation.

All of the energy used by the computer does not become heat; if that were true, you would have a perfectly efficient "energy to heat" conversion source, which again, violates the concept of engineering inefficiency.
 
  • #8
mordechai9 said:
Sorry, I disagree. If 100% of the energy use of the computer went to inefficiencies, the computer wouldn't have any energy going into transistor operation, hard drive operation, and so on.
Not true. Transistors are little switches. When they switch, they dissipate heat - that's waste heat. But they are electronic switches, not mechanical, so there is no other energy besides the electrical and heat energy. But electrical energy doesn't leave the computer, only heat energy leaves the computer.

And hard drives dissipate heat due to friction. That's waste heat too. They don't do anything mechanical that stores or otherwise transfers mechanical energy: there is no energy output besides heat energy.
Most of the energy goes into the motion of electrons and physical apparatus.
That's all waste heat.
All of the energy used by the computer does not become heat; if that were true, you would have a perfectly efficient "energy to heat" conversion source, which again, violates the concept of engineering inefficiency.
It is true, and that is not a violation of any law of thermodynamics. In fact, that's the 2nd law in action.

Try it from another angle: what does a hard drive do that uses energy but does not generate heat? Ie, what do you mean by "hard drive operation"? Inputs must equal outputs. We have electrical energy input and heat output. What's the other output(s)?
 
  • #9
zacharyr said:
If it's a cold winter night and my office keeps the thermostat set to 65°F, does it really waste any power if I leave my computer running with the monitor off? Isn't my computer just helping to heat the office and taking away load from the heater?

Bonus question: what if my computer has a giant spinning hammer that turns whenever the processor is running? Isn't all that energy (the heat from the motor, the air turbulence around the hammer, any sound from the hammer) eventually just going to heat?

Bonus bonus question: What if I leave my monitor on?

Ha! I had this discussion at my old forum. Though I may have been talking to myself. I do that a lot.

My solution was to give all of the 750 W old PCs to old people who needed heat in the winter, running SETI software, thereby making the CPUs use lots of energy and keeping our old folks semi-warm.

My new MacBook, unfortunately, only consumes about 30 watts of energy, about the same amount of energy I'm contributing to the atmosphere by sitting here typing this answer. I can't heat my house by sitting here, and neither can my new PC. But the energy is added to your environment in the winter, so, like incandescent bulbs, unwrapped water heaters, and the like, they all contribute to keeping you comfortable.

As for monitors, my monitor at work has a "very" warm surface temperature. I actually turned it off this summer when I left my desk and the thermometer reached 90°F or above. My MacBook screen is cool to the touch, so I'd say, "It all depends on what type of monitor you have."

In conclusion, I'd say run everything during the winter, and keep the thermostat at 62°F.
 
  • #10
OmCheeto said:
As for monitors, my monitor at work has a "very" warm surface temperature ... My MacBook screen is cool to the touch, so I'd say, "It all depends on what type of monitor you have."
Or, at the very least, where the vent holes are. :rolleyes:
 
  • #11
mordechai9 said:
Sorry, I disagree. If 100% of the energy use of the computer went to inefficiencies, the computer wouldn't have any energy going into transistor operation, hard drive operation, and so on. Not all of the energy goes into heat. Most of the energy goes into the motion of electrons and physical apparatus. Now, if you want to make the claim that all motion reduces down to heat energy transfer, you can go ahead and make that claim; however, that's degenerating into definitional or semantic argumentation.

All of the energy used by the computer does not become heat; if that were true, you would have a perfectly efficient "energy to heat" conversion source, which again, violates the concept of engineering inefficiency.

Russ has it right. What actually happens is that in a computer you do something useful *before* the energy is finally turned into heat, while in a heater you turn it into heat directly. After all, there's conservation of energy, and all the electrical energy that went into the computer had to go somewhere. Do you think that a text file on a disk stores a lot of energy?

Engineering inefficiency is usually considered when one thinks of non-heat energy applications, such as mechanical energy from a motor. Then you put in so much energy, and you only get out so much mechanical energy. This mechanical energy can end up as stored potential energy (lifting weights for instance), or... as heat (friction, air resistance).

Thermodynamically, the internal energy increase equals dU = dQ + dW, where dQ is the received heat, and dW is the received mechanical/electrical energy.

Your computer has received a certain amount of electrical energy dW (over 10 hours of working). Now, it didn't change its internal energy by much, right? (OK, there's a very, very tiny amount of energy that IS actually stored on your hard disk.) So we have:
dU = 0 = dQ + dW, so dQ = -dW. Your computer took up, as heat, the negative of dW; in other words, it gave off heat equal to dW. That's what Russ is saying. Exactly the same applies to a resistive heater.
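Written out with illustrative numbers (the 300 W draw and the 10-hour night are assumptions, not figures from the thread), the same bookkeeping reads:

```latex
% First-law bookkeeping for the computer over one night (illustrative numbers).
\begin{align*}
W_{\text{in}} &= P\,t = 300\ \text{W} \times 10\ \text{h} = 3\ \text{kWh} \approx 10.8\ \text{MJ},\\
\Delta U &\approx 0 \quad\text{(the computer ends the night no hotter, and files store negligible energy)},\\
\Delta U &= Q_{\text{in}} + W_{\text{in}} \;\Rightarrow\; Q_{\text{in}} = -W_{\text{in}},\\
Q_{\text{out}} &= W_{\text{in}} \approx 10.8\ \text{MJ of heat released to the room.}
\end{align*}
```

That is exactly what a 300 W resistive heater would release over the same 10 hours.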

But your reaction is understandable. I couldn't convince my brother in law, who is a psychiatrist, either.
 
  • #12
vanesch said:
But your reaction is understandable. I couldn't convince my brother in law, who is a psychiatrist, either.

My neighbor has a PhD in chemistry and I can't convince him either...
 
  • #13
zacharyr said:
If it's a cold winter night and my office keeps the thermostat set to 65°F, does it really waste any power if I leave my computer running with the monitor off? Isn't my computer just helping to heat the office and taking away load from the heater?

Bonus question: what if my computer has a giant spinning hammer that turns whenever the processor is running? Isn't all that energy (the heat from the motor, the air turbulence around the hammer, any sound from the hammer) eventually just going to heat?

Bonus bonus question: What if I leave my monitor on?

While the computer does convert the electrical energy into thermal energy, it does not transfer that thermal energy to the environment as well as a heater, because that is not what it was designed to do! Heat is not a form of energy; it is a transfer of energy. Theoretically, a computer which is consuming 500 Watts of electrical energy is producing as much thermal energy as a 500 Watt space heater. However, it is not designed to transfer that thermal energy as efficiently as the space heater, so it is producing less heat.

First bonus question: Yes it is, but the same considerations apply here as in the first question; a hammer swinging through space does not transfer the same amount of heat to the environment. The key words here are heat transfer, which surprisingly no one has touched on so far.

Bonus-bonus: For the most part, this makes no difference except now some of the energy is converted to light. Light, contrary to popular notion, contains exactly no heat! If the light strikes a light-absorbent surface, heat will be generated, but in general less heat will be transferred to the atmosphere than before. It is theoretically possible (not likely) that light can bounce around forever without transferring any heat. So leaving the monitor on does contribute to the overall heat, but with a loss of efficiency.

In general, if you have a 500 Watt space heater and a 500 Watt computer (including monitor) the space heater will produce more heat because that is what it is designed to do!
 
  • #14
schroder said:
Theoretically, a computer which is consuming 500 Watts of electrical energy is producing as much thermal energy as a 500 Watt space heater. However, it is not designed to transfer that thermal energy as efficiently as the space heater, so it is producing less heat.

Where does the rest of the energy go?
 
  • #15
Borek said:
Where does the rest of the energy go?

If all the thermal energy from the cpu, for example, were naturally radiated into the environment as heat, we would not need to design heat sinks and cooling fans. But we know the heat sinks and fans are necessary to keep the cpu from cooking itself. Many of the other components can handle the thermal energy build up so there is no need to provide a more efficient way to transfer the heat away from them. The thermal energy for those components builds up as heat which is largely self-contained in those components and not radiated into the environment. It builds up in the components, in the circuit boards, the cabinet itself but is not entirely transferred out to the environment. Once again, space heaters are designed for the specific purpose of transferring thermal energy as heat and use air circulation over heated fins to do this. You can think of the computer as a container of heat and the space heater as a radiator of heat into the external environment. Do you want a can of red paint, or your room painted red?
 
  • #16
schroder said:
The thermal energy for those components builds up as heat which is largely self-contained in those components and not radiated into the environment. It builds up in the components, in the circuit boards, the cabinet itself but is not entirely transferred out to the environment.

So it builds up ad infinitum? And these elements are getting hotter and hotter as time passes by?
 
  • #17
I can see Borek's point, with the following caveat - maybe the PC & monitor draws 500 watts (nameplate) but as they heat up, in an idle condition, maybe they draw less current. Maybe.

The reason all electric heaters don't look like computers is because a resistance element (ie, some wire) is cheaper than transformers and chips and hard drives and cathode ray tubes... The heaters designed to be heaters are "more efficient" in the engineering sense that they are cheaper per watt delivered to the surrounding atmosphere than a computer monitor. Also, they are generally designed to deliver more power (a small space heater at 25,000 Btu/hr is delivering about 7 kW - much more than the monitor).
 
  • #18
schroder said:
If all the thermal energy from the cpu, for example, were naturally radiated into the environment as heat, we would not need to design heat sinks and cooling fans. But we know the heat sinks and fans are necessary to keep the cpu from cooking itself. Many of the other components can handle the thermal energy build up so there is no need to provide a more efficient way to transfer the heat away from them. The thermal energy for those components builds up as heat which is largely self-contained in those components and not radiated into the environment. It builds up in the components, in the circuit boards, the cabinet itself but is not entirely transferred out to the environment. Once again, space heaters are designed for the specific purpose of transferring thermal energy as heat and use air circulation over heated fins to do this. You can think of the computer as a container of heat and the space heater as a radiator of heat into the external environment. Do you want a can of red paint, or your room painted red?


Completely incorrect, the whole point of the multitude of fans and heatsinks in a computer is to disperse as much heat generated by the components out of the case as possible.
 
  • #19
I'm not even going to touch the misconceptions about heat and power dissipation, I think they've been addressed already. But I will comment on the wear and tear issue.

First of all, you might consider leaving your computer running simply for electronic reasons, that is to say, electrical equipment undergoes the most stress when it is being turned on and off, and you reduce wear on your equipment by leaving it running.

I've also heard this claim repeated many times, but I've never actually seen any controlled studies on the subject. My gut feeling is that under the best-case conditions leaving the computer running will make some parts fail sooner and some last longer, but that under less favourable conditions leaving the computer running unattended is a definite risk.

Essentially the favourable conditions are where you have a very reliable UPS and a clean, air-conditioned environment where little or no unforeseen or uncontrolled physical activity occurs. The most unfavourable conditions are where you have a not-so-reliable electrical power supply and where you have limited control over the activities of other people working or moving about in the area where your unattended computer is situated.

At home, for example, I don't run on a UPS, and though we have a reasonably stable power grid here (on the east coast of Australia), I personally consider leaving my computer running unattended, when a thunderstorm could move in at any time and cause havoc on the grid, to be more risky than having to do a cold start the next morning.

I think the main component to be put under stress at start-up is the PSU (power supply unit), which is an easily replaced component whose failure is unlikely to result in data loss. The sort of risks I expose my computer to when leaving it unattended for long periods of time include power surges that could do real damage, or even just physical bumps and shocks, e.g. an earthquake, or a person (cleaner, etc.) or animal (if there are pets in the home) bumping, moving or even knocking over the tower. To me these risks are greater and potentially more damaging than the risk of wear and tear from one or two start-ups per day.
 
Last edited:
  • #20
dst said:
Completely incorrect, the whole point of the multitude of fans and heatsinks in a computer is to disperse as much heat generated by the components out of the case as possible.

But that is exactly the point I was making! We go to all that trouble of designing heat sinks and fans to disperse the heat because it does not disperse naturally. What is “completely incorrect” about that?

Now, if the pc was suspended in the center of the room, and the room was a completely closed system and we allowed sufficient time for the case to heat up so that the entire computer became a radiant body, it would disperse the same amount of heat into the room as the space heater, provided they have equal wattage. I have already alluded to that when I said they have the same amount of thermal energy. But in practice, no computers are suspended in the center of rooms in a closed system. The original question compares the heating capability of a normal computer in a normal environment with the heating capability of a space heater of equal wattage. In that case, the space heater wins because it is designed to transfer the thermal energy as heat into the environment by means of a directed flow of heated air and the computer is not. I am discussing this from the standpoint of applied technology and not pure theory as in a laboratory experiment. In general, I believe it is better to let things do what they were designed to do; let computers compute and heaters heat!
 
  • #21
Schroder, you didn't answer Borek's question: if the heat that is generated isn't dissipated, doesn't that mean that the heat keeps building up in the computer?

You are confusing heat transfer efficiency with heat generation efficiency. Heat transfer efficiency is what makes a 500 W computer with good fans cooler than a 500 W computer with bad fans. But make no mistake: the heat generation (or "production") efficiency of both computers is 100%. All of that 500 W becomes heat and all of it makes it to the environment.

The way this looks in the real world is that when you first turn it on, the measurable heat output of a computer with bad fans is lower than the one with good fans. But eventually (rather quickly, actually), it heats up enough to increase the heat transfer rate, and the heat output is identical to the computer with good heat transfer efficiency. Then on the back end, when you turn the computer off, it stays warm longer and puts out more heat. So the overall heat output of the two computers is the same.
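A minimal lumped-parameter sketch of that warm-up argument follows. The thermal mass, conductance, and power numbers are assumptions chosen only to show the behaviour: output to the room ramps up over a few minutes and then matches the electrical input exactly.

```python
# Lumped thermal model of a computer case: C * dT/dt = P_in - k * (T - T_room).
# At steady state dT/dt = 0, so the heat delivered to the room, k*(T - T_room), equals P_in.
# All numbers below are illustrative assumptions.

P_in = 300.0    # W, electrical input, all converted to heat inside the case
C = 5000.0      # J/K, assumed lumped thermal capacity of case + components
k = 15.0        # W/K, assumed case-to-room heat transfer coefficient
T_room = 20.0   # deg C
T = T_room      # case starts at room temperature
dt = 1.0        # s, integration time step

for minute in range(0, 31, 5):
    print(f"t = {minute:2d} min: case at {T:5.1f} degC, heat to room = {k * (T - T_room):5.1f} W")
    for _ in range(5 * 60):
        T += dt * (P_in - k * (T - T_room)) / C
```

The numbers are invented, but the shape of the result is the point: after a few time constants the case temperature stops rising and every watt going in comes back out, regardless of how good the fans are; better fans only lower the steady-state case temperature, not the steady-state heat output.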
 
  • #22
Schroder, you didn't answer Borek's question: if the heat that is generated isn't dissipated, doesn't that mean that the heat keeps building up in the computer?

I didn’t answer that because the answer is obvious. Of course it does not keep building up infinitely and it must go out into the environment. It is the path and the efficiency of that path which I was addressing.

You are confusing heat transfer efficiency with heat generation efficiency. Heat transfer efficiency is what makes a 500 W computer with good fans cooler than a 500 W computer with bad fans. But make no mistake: the heat generation (or "production") efficiency of both computers is 100%. All of that 500 W becomes heat and all of it makes it to the environment.

If you read over my posts, you will see that I have said several times that the amount of thermal energy generated will be exactly the same for each device, provided the electrical energy consumed is the same. That has never been a point of confusion on my part. I am talking only about heat transfer, as that is what heat is: the transfer of energy. I have been talking of heat efficiency in terms of how it makes it into the environment. I have experience with computer simulations which are performed to account for hot spots, associated thermal stresses, equipment design and process efficiencies that can affect flow and geometry considerations in heat transfer. If it did not matter at all what type of object was radiating, convecting or conducting heat, none of that research would be necessary!
If you should ever design something which is to be used as a heater, you may want to consider flow distribution, header designs, fouling and the best placement of baffles and support systems. The computer sitting on a desk or table will create a hot spot on that desk and conduct into the desk rather than radiate into the space you want heated. Yes, if you have infinite time and a completely closed system, all of the heat will eventually be evenly distributed, but those are hardly “real world” conditions. Take that space heater which has been heating the room so well and turn it so it radiates right into the wall, and suddenly the room is mysteriously not being so well heated. Why is that, you wonder? But if you wait long enough you will have all the heat you want when the wall catches fire! Space heating is all about flow and direction as anything can and does create heat!
 
  • #23
schroder said:
It builds up in the components, in the circuit boards, the cabinet itself but is not entirely transferred out to the environment.

schroder said:
Of course it does not keep building up infinitely and it must go out into the environment.

Either either.
 
  • #24
Borek said:
Either either.

As long as you are going to quote me, you might add this one as well, which you somehow "missed":smile:

The computer sitting on a desk or table will create a hot spot on that desk and conduct into the desk rather than radiate into the space you want heated. Yes, if you have infinite time and a completely closed system, all of the heat will eventually be evenly distributed, but those are hardly “real world” conditions.

But hey, if you want to use your computer as a heater, please be my guest. While you are at it, here are some other suggested uses: door stop; paper weight; step ladder; block to place under your car; work out weight; the list is endless so enjoy!
 
  • #25
schroder said:
If you read over my posts, you will see that I have said several times that the amount of thermal energy generated will be exactly the same for each device, provided the electrical energy consumed is the same. That has never been a point of confusion on my part. I am talking only about heat transfer, as that is what heat is: the transfer of energy. I have been talking of heat efficiency in terms of how it makes it into the environment. I have experience with computer simulations which are performed to account for hot spots, associated thermal stresses, equipment design and process efficiencies that can affect flow and geometry considerations in heat transfer. If it did not matter at all what type of object was radiating, convecting or conducting heat, none of that research would be necessary!

Next round :cool:

Ok, I can agree with you that if you put a heating device on an *external wall* with finite thermal conductivity, then you will lose more heat from the room than if you were dissipating the heat somewhere in the middle. The reason is that you've increased the temperature on the inside of the wall far above what it would normally be, and hence you've increased the heat conduction through the wall because the temperature gradient is higher.

However, come back to your computer. I reckon that a computer is, in your sense, even a slightly better heater than a resistive heater, simply because the air that leaves it is at a lower temperature (hence creates a less intense hot spot) than would a convection heater. The reason for that is that the computer manufacturer doesn't want the components in the computer to become as hot as the resistor in the convector, so he actually builds a better heat transfer system than your convector.
Your desk heating (slightly) through conduction is also not as much of an issue as the convector becoming hot, because the low temperature gradient will 1) make for a very small thermal flow through the desk to the ground (much less than the heat radiated into the ground by the relatively hot surface of your convector) and 2) mostly evacuate the small amount of heat so gathered directly to the surrounding air in the middle of the room, which is about the best heat transfer you can think of.

So we see that a desktop computer as a heater does an almost 100% transfer directly to the room air, somewhere in the middle (as such not increasing the temperature gradient in the outside walls), at a relatively low temperature (no hot spot, at least compared to the convector). When you take these points into consideration, a computer is an even better heater than a convector.
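A rough sketch of the external-wall point above, assuming a one-dimensional conduction model Q = U * A * dT; the U-value, area, and surface temperatures are invented purely for illustration.

```python
# Extra heat pushed straight through an exterior wall when a heater warms its inside surface.
# One-dimensional conduction model: Q = U * A * (T_surface - T_outside). Numbers are invented.

U = 0.5                   # W/(m^2*K), assumed overall wall conductance
A = 2.0                   # m^2, assumed wall area warmed by the heater
T_out = 0.0               # deg C, outdoor temperature
T_surface_normal = 18.0   # deg C, inside surface with nothing against it
T_surface_heated = 40.0   # deg C, inside surface with a convector blowing on it

loss_normal_w = U * A * (T_surface_normal - T_out)
loss_heated_w = U * A * (T_surface_heated - T_out)

print(f"Wall loss with nothing against it:  {loss_normal_w:.0f} W")
print(f"Wall loss with a heater against it: {loss_heated_w:.0f} W")
print(f"Extra heat lost through the wall:   {loss_heated_w - loss_normal_w:.0f} W")
```

Under these made-up numbers only a couple of tens of watts leak straight through the wall, which is why the computer's lukewarm exhaust into mid-room air is, if anything, the slightly better-placed heat source.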
 
  • #26
schroder said:
I didn’t answer that because the answer is obvious. Of course it does not keep building up infinitely and it must go out into the environment. It is the path and the efficiency of that path which I was addressing.
Yes, the answer is obvious, but you're now contradicting yourself. If the heat makes it out to the environment it makes it out to the environment. That's not what you've been arguing. You said, that due to inefficiency some of the heat generated does not make it out to the environment. You can't have it both ways.
If you read over my posts, you will see that I have said several times that the amount of thermal energy generated will be exactly the same for each device, provided the electrical energy consumed is the same. That has never been a point of confusion on my part. I am talking only about heat transfer, as that is what heat is: the transfer of energy. I have been talking of heat efficiency in terms of how it makes it into the environment.
Then you're missing what heat transfer efficiency is and you're still contradicting yourself. If 500 W is generated and 400 W of energy is transferred to the environment (for example), then that remaining 100 W must be staying in the device and making it hotter.
I have experience with computer simulations which are performed to account for hot spots, associated thermal stresses, equipment design and process efficiencies that can affect flow and geometry considerations in heat transfer. If it did not matter at all what type of object was radiating, convecting or conducting heat, none of that research would be necessary!
That's nice. I'm an HVAC engineer. And no one said heat transfer efficiency didn't matter, but if you are running computer simulations, you really ought to see how you're wrong here. Did you write the simulations yourself or are you using someone else's?
If you should ever design something which is to be used as a heater, you may want to consider flow distribution, header designs, fouling and the best placement of baffles and support systems.
And I do. It's my job. But that doesn't have anything to do with how much energy a computer dissipates to the environment, only where the computer dissipates it and where the a/c should be applied.
Yes, if you have infinite time and a completely closed system, all of the heat will eventually be evenly distributed, but those are hardly “real world” conditions.
It doesn't take infinite time - it only takes a few minutes for the computer to reach equilibrium. Please think about this - if it took an infinite time, your computer would stay hot for an infinite amount of time after you shut it down, slowly dissipating its enormous built-up heat into the environment. But working with computers, you must know that a computer takes only a few minutes to reach its operating temperature and once it does, the heat transfer is in a steady state - it does not continue to get hotter and hotter for a long period of time.
Take that space heater which has been heating the room so well and turn it so it radiates right into the wall, and suddenly the room is mysteriously not being so well heated. Why is that, you wonder? But if you wait long enough you will have all the heat you want when the wall catches fire! Space heating is all about flow and direction as anything can and does create heat!
You're contradicting yourself again. You did acknowledge before that a space heater is 100% efficient - are you now saying that a 1500 W space heater will send less than 1500 W of heat out if you point it at a wall? Or are you saying that (for example) 1000 W will go to heating the room and 500 W will go to heating the wall (some of which is then lost to outside)?
 
Last edited:
  • #27
Let's try this with some numbers: What kind of efficiencies do you use when doing your calculations? Typically, I'd use around 300 W for the computer, not including the monitor. How much of that would you say is going into the room and how much is staying in the computer? Is 100 W staying in the computer?

Or maybe it would be easier to talk just about the processor. A processor, these days, generates around 100 W. How much of that 100 W goes to heating up the processor and heat sink and how much is dissipated into the case?

We can easily calculate how hot the computer should get over time with some simple assumptions about its weight and materials. I.e., if you put 30 W into a 100 g block of aluminum (the heat sink), how much temperature will it gain per minute?
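Here is that calculation worked out, taking the specific heat of aluminium to be roughly 900 J/(kg·K); the 30 W and 100 g figures come from the question above.

```python
# If 30 W of the computer's power really "stayed in" a 100 g aluminium heat sink,
# how fast would it warm up? Specific heat of aluminium is roughly 900 J/(kg*K).

power_w = 30.0          # W, from the post above
mass_kg = 0.100         # kg, from the post above
c_aluminium = 900.0     # J/(kg*K), approximate specific heat of aluminium

rate_k_per_s = power_w / (mass_kg * c_aluminium)
print(f"Temperature rise: {rate_k_per_s:.2f} K/s, i.e. about {rate_k_per_s * 60:.0f} K per minute")
```

At roughly 20 K per minute the heat sink would be glowing within the hour if that energy were really staying in the computer, so in practice essentially all of it must be leaving as heat.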
 
  • #28
vanesch said:
Next round :cool:

Ok, I can agree with you that if you put a heating device on an *external wall* with finite thermal conductivity, then you will lose more heat from the room than if you were dissipating the heat somewhere in the middle. The reason is that you've increased the temperature on the inside of the wall far above what it would normally be, and hence you've increased the heat conduction through the wall because the temperature gradient is higher.

However, come back to your computer. I reckon that a computer is, in your sense, even a slightly better heater than a resistive heater, simply because the air that leaves it is at a lower temperature (hence creates a less intense hot spot) than would a convection heater. The reason for that is that the computer manufacturer doesn't want the components in the computer to become as hot as the resistor in the convector, so he actually builds a better heat transfer system than your convector.
Your desk heating (slightly) through conduction is also not as much of an issue as the convector becoming hot, because the low temperature gradient will 1) make for a very small thermal flow through the desk to the ground (much less than the heat radiated into the ground by the relatively hot surface of your convector) and 2) mostly evacuate the small amount of heat so gathered directly to the surrounding air in the middle of the room, which is about the best heat transfer you can think of.

So we see that a desktop computer as a heater does an almost 100% transfer directly to the room air, somewhere in the middle (as such not increasing the temperature gradient in the outside walls), at a relatively low temperature (no hot spot, at least compared to the convector). When you take these points into consideration, a computer is an even better heater than a convector.

You make some good points and you might be right, but I think most respondents are missing the points I am making while dwelling on perceived contradictions. I never made any claim for anything to be 100% efficient, so that claim is false. And as far as infinite build-up of heat goes, that is not something I claimed either. The only contradiction is in my use of the word “environment”. In one case I was including the external environment and in the other I was referring to the environment of the closed system. I apologize for the misunderstanding. I still believe that the computer will not transfer its thermal energy, in the form of heat, into the office environment as well as a well-designed space heater. The reason I give is path losses, such as heating up the desk, the floor under the desk, and ultimately the outside world, as well as uneven heating due to the geometry of the apparatus. However, I am not now in a position to “prove” my claims in a satisfactory way, unless of course Scientific American would care to sponsor an experiment! In the meantime I will end my correspondence on this subject. Many thanks to all for the interesting discussion!
 
  • #29
zacharyr said:
If it's a cold winter night and my office keeps the thermostat set to 65°F, does it really waste any power if I leave my computer running with the monitor off?

schroder said:
I still believe that the computer will not transfer its thermal energy, in the form of heat, into the office environment as well as a well-designed space heater. The reason I give is path losses, such as heating up the desk, the floor under the desk, and ultimately the outside world, as well as uneven heating due to the geometry of the apparatus.

schroder said:
Do you want a can of red paint, or your room painted red?

zacharyr, I'm glad I haven't gotten your question on a test (yet)! It seems like it could depend on how cold the winter is, how long the night is, where you put your computer, where the heater is, and where the thermostat of the heater is (is steady state reached?).

schroder makes thoughtful points and a great analogy! I once left my computer running, but put it in a drawer because I ran out of space on my desk. I left the drawer half open, so the air could circulate and cool my computer. That was a mistake; the computer was way hotter than the room the next morning, something that had never happened when I left it on my desk. Good thing it still worked!
 
  • #30
gmax137 said:
I can see Borek's point, with the following caveat - maybe the PC & monitor draws 500 watts (nameplate) but as they heat up, in an idle condition, maybe they draw less current. Maybe.

The reason all electric heaters don't look like computers is because a resistance element (ie, some wire) is cheaper than transformers and chips and hard drives and cathode ray tubes... The heaters designed to be heaters are "more efficient" in the engineering sense that they are cheaper per watt delivered to the surrounding atmosphere than a computer monitor. Also, they are generally designed to deliver more power (a small space heater at 25,000 Btu/hr is delivering about 7 kW - much more than the monitor).

Not to pick on gmax, but I was just browsing around here seeing a lot of numbers being thrown around that just don't make any sense and thought I'd register and contribute my $.02. :)

For starters, a 500 watt PSU on a computer does not mean it's converting 500 watts AC to 500 watts DC 24/7 if it's left on. That number is the maximum it can potentially convert. Even then you'll only have about 400 watts to work with at most inside the PC. It's still pretty rare to see a power supply with an efficiency rating higher than 80%, but with groups like http://80plus.org/ things are looking up. You lose 20% or more to heat immediately, just from switching.

Even so, it's typical for a PC under moderate load to be using less than 200 watts. Very few PCs use 500 W or more for more than a short period of time, e.g. while playing a game that keeps both the CPU and GPU maxed out and drawing as much as possible.

A small space heater... a very small one... would be in the 500 W range, but your typical one will be about 1500 W. So of course some people will say a heater is a better heater than a computer when they're looking at a ratio of 7.5:1.

"a small space heater at 25,000 Btu/hr is delivering about 7 kW"
You'd be hard pressed to find one that size here (in the US) and have it be considered small. That's nearly 30 amps at 240 volts, or about 4 times the power the average circuit (15 A, 110-125 V is standard) in a new home can handle.
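A quick numerical check of the figures in this post. The 80% supply efficiency and 200 W DC load below are assumed round numbers; the point is that both the PSU loss and the power delivered to the components end up as heat in the room, and that the unit conversions quoted above do check out.

```python
# Sanity-check of the figures above. The 80% efficiency and 200 W DC load are assumed round numbers.

psu_efficiency = 0.80
dc_load_w = 200.0                       # assumed moderate-load DC draw of the components
ac_draw_w = dc_load_w / psu_efficiency  # power drawn from the wall outlet
psu_loss_w = ac_draw_w - dc_load_w      # dissipated directly in the PSU as heat

print(f"AC drawn from the wall:              {ac_draw_w:.0f} W")
print(f"Lost in the PSU as heat:             {psu_loss_w:.0f} W")
print(f"Delivered to components (also heat): {dc_load_w:.0f} W")

# Unit checks for the space-heater figures quoted above.
btu_per_hr = 25_000.0
kw = btu_per_hr * 0.29307 / 1000.0      # 1 Btu/hr is about 0.29307 W
print(f"{btu_per_hr:.0f} Btu/hr is about {kw:.1f} kW")
print(f"Current at 240 V: {kw * 1000 / 240:.1f} A; a 1500 W heater on a 120 V circuit: {1500 / 120:.1f} A")
```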
 
  • #31
The actual power consumption of a computer was covered nicely in the post just previous to this one. Schroder, give it up. ALL of the power consumed by a computer finds its way into the room sooner or later.

I am a working technician and have been for over 30 years. In that time I have maintained and repaired a large variety of electronic and electro-mechanical devices. It is pretty universal that the best mode for an electronic device is to be left running. Failures and trouble come upon start-up. If you are a bit observant you can see this in even the simplest of electric devices: a light bulb. How frequently have you observed a bulb go out after it has been operating at a constant temperature? Compare that to how frequently you see a bulb flash and go out on power-up. The same mechanism that causes a light bulb to fail on start-up is present in EVERY electronic device. Every time your computer is turned on, every component must come to operating temperature; while this is happening they are expanding, putting stress on all mechanical connections between differing materials. This includes every one of the hundreds of solder connections on your motherboard, video card, and power supply.

This said, it is still a reasonable risk to shut your computer down when looking at some hours of idle time. Just how many hours I really cannot say; 8-10 hours seems like a reasonable SWAG number. I do not think it wise to power-cycle a computer multiple times a day; that seems to be encouraging the random failure of one of the many solder connections in the box.

The power consumption of an idle computer will vary with the computer. It is very hard to say since it is determined by the various power saving settings in a Winblows machine.
 
  • #32
Integral said:
If you are a bit observant you can see this in even the simplest of electric devices: a light bulb. How frequently have you observed a bulb go out after it has been operating at a constant temperature? Compare that to how frequently you see a bulb flash and go out on power-up.
How frequently have hospital staff observed babies being born and increases in hospital emergency-room activity during a full moon? Compare that to how frequently staff observe babies being born and spikes in emergency-room activity during a waning gibbous.

Your example and mine are both textbook cases of selective recall.


Not that I'm refuting your entire argument, just the DIY example you cite.
 
Last edited:
  • #33
Integral said:
The actual power consumption of a computer was covered nicely in the post just previous to this one. Schroder, give it up. ALL of the power consumed by a computer finds its way into the room sooner or later.

OK. :smile: The heat transferred by a pc and by a space heater, which are each consuming the exact same amount of electrical power from the grid, must be the same, for all practical purposes. Any differences in the path for the heat from the pc can be offset by any number of differences in the path for the heat for the heater, and are far too complex to be calculated and certainly much too small to be measured.
I have thought of another question in relation to this, but I do not seriously pose it as an argument for a difference in heat transfer, only as a point of interest for research. I think the best science comes about from focusing on these “minor” differences, rather than focusing on the overall similarities. By its very nature, the pc represents an ordered system. From the cpu to the bus, the registers, and the disk storage, everything is clocked and organized, which is in stark contrast to the heater, which is just churning out random, chaotic heat. The heater is perhaps the most efficient entropy generator there is. The pc is certainly generating entropy, and in light of what I said earlier, for all practical purposes just as much as the heater. But, with such a great difference in the way these two “machines” operate, is it totally unreasonable to think that on some microscopic level the entropy of the pc is less than that of the heater? I think that is an interesting concept to look into, although it does go far beyond the intent of the original question.
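One way to put a rough number on the "ordered system" idea is Landauer's bound, which associates about k_B·ln 2 of entropy with each stored bit. The sketch below is only an order-of-magnitude estimate; the drive size, power draw, temperature, and run time are assumptions.

```python
# Order-of-magnitude comparison: entropy associated with the computer's stored bits
# (Landauer bound, k_B * ln 2 per bit) vs. the entropy of the heat it dumps into the room.
# Drive size, power, temperature, and run time are illustrative assumptions.

import math

k_B = 1.380649e-23        # J/K, Boltzmann constant
bits_stored = 8e12        # assumed: a 1 TB drive's worth of bits completely rewritten
power_w = 300.0           # assumed computer power, all released as heat
room_temp_k = 293.0       # about 20 deg C
seconds = 10 * 3600       # one 10-hour night

entropy_information = bits_stored * k_B * math.log(2)   # J/K, Landauer bound
entropy_heat = power_w * seconds / room_temp_k          # J/K, Q/T for the released heat

print(f"Entropy tied to the stored bits:        {entropy_information:.1e} J/K")
print(f"Entropy of the heat released overnight: {entropy_heat:.1e} J/K")
print(f"Ratio (heat / information): {entropy_heat / entropy_information:.1e}")
```

Under these assumptions the information-related entropy comes out roughly fourteen orders of magnitude smaller than the entropy of the heat released, so for heating purposes the computer and the resistor really are indistinguishable.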
 

1. Does turning my computer off frequently damage it?

No, turning your computer off frequently does not damage it. In fact, it is recommended to turn off your computer when it is not in use to save energy and extend its lifespan.

2. Will turning my computer off help save energy?

Yes, turning your computer off when it is not in use will help save energy. Computers use a significant amount of energy, even when in sleep mode, so turning it off completely can make a difference in your energy consumption.

3. Is it better to put my computer in sleep mode or turn it off completely?

It is better to turn your computer off completely when it is not in use. Sleep mode still uses some energy and can cause your computer to run slower when you turn it back on. Turning it off completely will also prevent any potential software or hardware issues.

4. Will turning my computer off every night make it run faster?

Turning your computer off every night will not necessarily make it run faster. However, it can help prevent any software or hardware issues that may slow down your computer over time. Regularly turning off your computer can also help clear its memory and improve its performance.

5. Can turning my computer off and on frequently cause damage?

No, turning your computer off and on frequently will not cause damage. Computers are designed to handle frequent on and off cycles. However, it is important to properly shut down your computer before turning it off to avoid any potential data loss or damage to the system.
