
Does it matter if I turn my computer off?

  1. Sep 3, 2008 #1
    If it's a cold winter night and my office keeps the thermostat set to 65°F, does it really waste any power if I leave my computer running with the monitor off? Isn't my computer just helping to heat the office and taking away load from the heater?

    Bonus question: what if my computer has a giant spinning hammer that turns whenever the processor is running? Isn't all that energy (the heat from the motor, the air turbulence around the hammer, any sound from the hammer) eventually just going to heat?

    Bonus bonus question: What if I leave my monitor on?
     
  3. Sep 4, 2008 #2

    uart

    User Avatar
    Science Advisor

    If your office is being heated all night when there's no one there, then that's wasting power already, and in this case it is true that leaving your computer on will help heat the office and therefore reduce the power requirement of your room (space) heating. Whether this is neutral to the overall energy requirement of the office depends on the efficiency of the space heating. If it's reverse-cycle air conditioning, then it's a bit more efficient than the direct heating of your computer, and so there's still some energy deficit.

    From an energy point of view the situation for the monitor is no different from the computer: almost all the energy coming from the monitor is in the form of heat, and any light will be converted to heat by the walls of the office or objects in the room anyway.
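    To put rough numbers on that, here is a minimal sketch comparing the electricity needed to deliver the same heat from a reverse-cycle unit versus a computer acting as a resistive heater (the COP of 3 is an assumed, illustrative figure):

    ```python
    # Rough comparison: electricity needed to deliver 1 kWh of heat to the room.
    # The COP figures are assumed, illustrative numbers.
    heat_needed_kwh = 1.0

    cop_heat_pump = 3.0  # reverse-cycle A/C: ~3 kWh of heat per kWh of electricity (assumed)
    cop_computer = 1.0   # computer: every kWh of electricity ends up as 1 kWh of heat

    elec_heat_pump = heat_needed_kwh / cop_heat_pump
    elec_computer = heat_needed_kwh / cop_computer

    print(f"Heat pump: {elec_heat_pump:.2f} kWh of electricity")  # 0.33 kWh
    print(f"Computer:  {elec_computer:.2f} kWh of electricity")   # 1.00 kWh
    ```

    With those assumed numbers, heating with the computer uses about three times the electricity the heat pump would need for the same heat.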
     
  4. Sep 4, 2008 #3
    In addition to the relative heating efficiency, the total energy use would also depend on how well-mixed the office atmosphere is, and on the relative locations of your computer / monitor and the thermostat. In the extreme case where the thermostat is located right next to the monitor, leaving the monitor on could "trick" the heating into staying off...
     
  5. Sep 6, 2008 #4
    First of all, you might consider leaving your computer running simply for electronic reasons: electrical equipment is said to undergo the most stress when it is being turned on and off, so you reduce wear on your equipment by leaving it running. At least, that is what I've heard; to be honest, I haven't really investigated that question, which is rather interesting in and of itself.

    From an energy standpoint, it IS wasteful to leave your computer running. It is even more wasteful to attach your hammer gobjabber or a parachute or whatever the hell crazy contraption you have hooked up (where do you work, anyway, buddy, Swinging Hammers Incorporated?). It is also more wasteful to leave your monitor running. Why? Because anytime you employ a mechanical or electrical device, you are dealing with inefficiencies of energy conversion; no engineering device (so far as I know) is 100% efficient, and most are much less efficient than that, although there are certain devices, such as electrical transformers, which have high efficiencies. Yes, the computer and monitor and whatever else will be generating some heat transfer into the environment, thus reducing the need for the activation of the primary office heater. However, a lot of that energy is also going towards powering the hard drive, moving those electrons about the circuits, overcoming the friction of your hammer-drive apparatus, and so on and so forth.

    Thus, the real question is: which device makes for a more efficient heater, (a) your computer setup, or (b) the primary office heater? If it is (b), you should deactivate (a). In the astounding case that your computer setup makes for a more efficient heat transfer device, you SHOULD leave your computer running. This is extremely unlikely, considering the design intentions of the respective devices.
     
  6. Sep 6, 2008 #5

    russ_watters

    User Avatar

    Staff: Mentor

    First question already answered....
    Yes.
    What about it? Same answers as above.
     
  7. Sep 6, 2008 #6

    russ_watters

    User Avatar

    Staff: Mentor

    That is unlikely to be what kills a computer, given that computers are typically discarded after a relatively short lifespan due to obsolescence. Consider that people sometimes keep TVs for more than a decade, turning them on and off several times a day, and they don't burn out.
    As others said, that really does depend on whether the room the computer is in is being heated (and how it is being heated). If the room is being heated by regular electric resistance heating, then it is not wasting anything at all. If it is being heated by gas (the worst case), you are wasting about 30%-40% of the energy and probably spending double the money.
    I'm pretty sure that that was a hypothetical for the purpose of understanding if mechanical energy in a closed room ends up as heat.
    And that's why he asked the question - to answer exactly that. And you have it wrong: all those inefficiencies - the resistance in the wires, the friction in the hard drive, etc. - are where the heat comes from. Added together, those inefficiencies total 100% of the energy use of the computer. I.e., all of the energy used by the computer becomes heat.
    Unfortunately, electric resistance heating is still a pretty common heating method for office buildings that are more than a few years old. So it is quite possible that his office building has it.

    Heck, some office buildings don't even have heat for the interior zones. The computers and lights provide more than is needed to keep the building warm in the winter (they use some form of air conditioning year-round).
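    To illustrate the cost point with made-up but plausible numbers (replace the prices and furnace efficiency with your own utility's figures), a minimal sketch:

    ```python
    # Ballpark cost of delivering 1 kWh of heat to the room.
    # Prices and the furnace efficiency are assumptions; plug in your own utility rates.
    heat_kwh = 1.0

    gas_price_per_kwh = 0.04   # $/kWh of natural gas (assumed)
    elec_price_per_kwh = 0.10  # $/kWh of electricity (assumed)
    gas_furnace_eff = 0.85     # fraction of the gas energy that ends up as room heat (assumed)

    cost_gas = heat_kwh / gas_furnace_eff * gas_price_per_kwh  # ~$0.047
    cost_computer = heat_kwh * elec_price_per_kwh              # ~$0.10 (all of it becomes heat)

    print(f"Gas furnace: ${cost_gas:.3f} per kWh of heat")
    print(f"Computer:    ${cost_computer:.3f} per kWh of heat")
    ```

    With these assumptions, heat from the computer costs roughly twice what the gas furnace delivers it for, which is in line with the "double the money" estimate above.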
     
  8. Sep 8, 2008 #7
    Sorry, I disagree. If 100% of the energy use of the computer went to inefficiencies, the computer wouldn't have any energy going into transistor operation, hard drive operation, and so on. Not all of the energy goes into heat. Most of the energy goes into the motion of electrons and physical apparatus. Now, if you want to make the claim that all motion reduces down to heat transfer, you can go ahead and make that claim; however, that's degenerating into definitional or semantic argumentation.

    All of the energy used by the computer does not become heat; if that were true, you would have a perfectly efficient "energy to heat" conversion source, which again, violates the concept of engineering inefficiency.
     
  9. Sep 8, 2008 #8

    russ_watters

    User Avatar

    Staff: Mentor

    Not true. Transistors are little switches. When they switch, they dissipate heat - that's waste heat. But they are electronic switches, not mechanical, so there is no other energy involved besides the electrical and heat energy. And the electrical energy doesn't leave the computer; only heat energy leaves the computer.

    And hard drives dissipate heat due to friction. That's waste heat too. They don't do anything mechanical that stores or otherwise transfers mechanical energy: there is no energy output besides heat energy.
    That's all waste heat.
    It is true, and that is not a violation of any law of thermodynamics. In fact, that's the 2nd law in action.

    Try it from another angle: what does a hard drive do that uses energy but does not generate heat? I.e., what do you mean by "hard drive operation"? Inputs must equal outputs. We have electrical energy input and heat output. What other output(s) are there?
     
  10. Sep 8, 2008 #9

    OmCheeto

    User Avatar
    Gold Member

    Ha! I had this discussion at my old forum. Though I may have been talking to myself. I do that a lot.

    My solution was to give all of the old 750 watt PCs to old people who needed heat in the winter, running SETI software, thereby making the CPUs use lots of energy and keeping our old folk semi-warm.

    My new MacBook, unfortunately, only consumes about 30 watts of energy - about the same amount I'm contributing to the atmosphere by sitting here typing this answer. I can't heat my house by sitting here, and neither can my new PC. But the energy is added to your environment in the winter, so, like incandescent bulbs, unwrapped water heaters, and the like, it all contributes to keeping you comfortable.

    As for monitors, my monitor at work has a very warm surface temperature. I actually turned it off this summer whenever I left my desk and the thermometer reached 90°F or above. My MacBook screen is cool to the touch, so I'd say: "It all depends on what type of monitor you have".

    In conclusion, I'd say run everything during the winter, and keep the thermostat at 62°F.
     
  11. Sep 8, 2008 #10

    DaveC426913

    User Avatar
    Gold Member

    Or, at the very least, where the vent holes are. :rolleyes:
     
  12. Sep 9, 2008 #11

    vanesch

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    Russ has it right. What actually happens is that you do something useful *before* the energy is finally turned into heat in a computer, while in a heater you turn it into heat directly. After all, there's conservation of energy: all the electrical energy that went into the computer - where did it go? Do you think that a text file on a disk stores a lot of energy?

    Engineering inefficiency is usually considered when one thinks of non-heat energy applications, such as mechanical energy from a motor. Then you put in so much energy, and you only get out so much mechanical energy. This mechanical energy can end up as stored potential energy (lifting weights for instance), or... as heat (friction, air resistance).

    Thermodynamically, the internal energy increase equals dU = dQ + dW, where dQ is the received heat, and dW is the received mechanical/electrical energy.

    Your computer has received a certain amount of electrical energy dW (during 10 hours of work). Now, it didn't change its internal energy by much, right? (OK, there's a very, very tiny amount of energy that IS actually stored on your hard disk.) So we have:
    dU = 0 = dQ + dW, which gives dQ = -dW. Your computer took up, as heat, the NEGATIVE of dW; in other words, it gave off heat equal to dW. That's what Russ is saying. Exactly the same applies to a resistive heater.
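    A minimal numerical sketch of that bookkeeping, with an assumed power draw and the 10-hour figure from above:

    ```python
    # First-law bookkeeping for the computer over a working day: dU = dQ + dW,
    # where dW is the electrical energy received and dQ the heat received.
    # The power draw and run time are assumed numbers for illustration.
    power_watts = 200.0  # assumed average draw of the computer
    hours = 10.0

    dW = power_watts * hours * 3600.0  # electrical energy received, in joules
    dU = 0.0                           # internal energy essentially unchanged

    dQ = dU - dW          # heat *received* by the computer (negative)
    heat_given_off = -dQ  # so the heat given off equals dW

    print(f"Electrical energy in: {dW / 3.6e6:.1f} kWh")
    print(f"Heat given off:       {heat_given_off / 3.6e6:.1f} kWh")
    ```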

    But your reaction is understandable. I couldn't convince my brother in law, who is a psychiatrist, either.
     
  13. Sep 9, 2008 #12

    Borek

    User Avatar

    Staff: Mentor

    My neighbor has a PhD in chemistry and I can't convince him either...
     
  14. Sep 9, 2008 #13
    While the computer does convert the electrical energy into thermal energy, it does not transfer that thermal energy to the environment as well as a heater, because that is not what it was designed to do! Heat is not a form of energy; it is a transfer of energy. Theoretically, a computer which is consuming 500 watts of electrical energy is producing as much thermal energy as a 500 watt space heater. However, it is not designed to transfer that thermal energy as efficiently as the space heater, so it is producing less heat.

    First bonus question: Yes it is, but the same considerations apply here as in the first question; a hammer swinging through space does not transfer the same amount of heat to the environment. The key words here are heat transfer, which surprisingly no one has touched on so far.

    Bonus-bonus: For the most part, this makes no difference, except that now some of the energy is converted to light. Light, contrary to popular notion, contains exactly no heat! If the light strikes a light-absorbent surface, heat will be generated, but in general less heat will be transferred to the atmosphere than before. It is theoretically possible (though not likely) that light can bounce around forever without transferring any heat. So leaving the monitor on does contribute to the overall heat, but with a loss of efficiency.

    In general, if you have a 500 Watt space heater and a 500 Watt computer (including monitor) the space heater will produce more heat because that is what it is designed to do!
     
  15. Sep 9, 2008 #14

    Borek

    User Avatar

    Staff: Mentor

    Where does the rest of the energy go?
     
  16. Sep 9, 2008 #15
    If all the thermal energy from the CPU, for example, were naturally radiated into the environment as heat, we would not need to design heat sinks and cooling fans. But we know the heat sinks and fans are necessary to keep the CPU from cooking itself. Many of the other components can handle the thermal energy build-up, so there is no need to provide a more efficient way to transfer the heat away from them. The thermal energy for those components builds up as heat which is largely self-contained in those components and not radiated into the environment. It builds up in the components, in the circuit boards, and in the cabinet itself, but is not entirely transferred out to the environment. Once again, space heaters are designed for the specific purpose of transferring thermal energy as heat and use air circulation over heated fins to do this. You can think of the computer as a container of heat and the space heater as a radiator of heat into the external environment. Do you want a can of red paint, or your room painted red?
     
  17. Sep 9, 2008 #16

    Borek

    User Avatar

    Staff: Mentor

    So it builds up ad infinitum? And these elements are getting hotter and hotter as time passes by?
     
  18. Sep 9, 2008 #17
    I can see Borek's point, with the following caveat - maybe the PC & monitor draw 500 watts (nameplate), but as they heat up, in an idle condition, maybe they draw less current. Maybe.

    The reason all electric heaters don't look like computers is that a resistance element (i.e., some wire) is cheaper than transformers and chips and hard drives and cathode ray tubes... The heaters designed to be heaters are "more efficient" in the engineering sense that they are cheaper per watt delivered to the surrounding atmosphere than a computer monitor. Also, they are generally designed to deliver more power (a small space heater at 25,000 Btu/hr is delivering about 7 kW - much more than the monitor).
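    For reference, the unit conversion behind that 7 kW figure (nothing assumed beyond the standard BTU-to-joule factor):

    ```python
    # Unit check for the figure above: BTU/hr to kW.
    BTU_TO_JOULES = 1055.06  # 1 BTU in joules

    def btu_per_hr_to_kw(btu_per_hr: float) -> float:
        """Convert a heating rate in BTU/hr to kilowatts."""
        return btu_per_hr * BTU_TO_JOULES / 3600.0 / 1000.0

    print(f"{btu_per_hr_to_kw(25_000):.1f} kW")  # ~7.3 kW
    ```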
     
  19. Sep 9, 2008 #18

    dst

    User Avatar


    Completely incorrect: the whole point of the multitude of fans and heatsinks in a computer is to disperse as much of the heat generated by the components out of the case as possible.
     
  20. Sep 9, 2008 #19

    uart

    User Avatar
    Science Advisor

    I'm not even going to touch the misconceptions about heat and power dissipation, I think they've been addressed already. But I will comment on the wear and tear issue.

    I've also heard this claim repeated many times, but I've never actually seen any controlled studies on the subject. My gut feeling is that under the best conditions leaving the computer running will make some parts fail sooner and some last longer, but that under less favourable conditions leaving the computer running unattended is a definite risk.

    Essentially, the favourable conditions are where you have a very reliable UPS and a clean, air-conditioned environment where little or no unforeseen or uncontrolled physical activity occurs. The most unfavourable conditions are where you have a not-so-reliable electrical power supply and limited control over the activities of other people working or moving about in the area where your unattended computer is situated.

    At home, for example, I don't run on a UPS, and though we have a reasonably stable power grid here (on the east coast of Australia), I personally consider leaving my computer running unattended, when a thunderstorm could move in at any time and cause havoc on the grid, to be riskier than having to do a cold start the next morning.

    I think the main component put under stress at start-up is the PSU (power supply unit), which is an easily replaced component whose failure is unlikely to result in data loss. The sorts of risks I expose my computer to when leaving it unattended for long periods include power surges that could do real damage, or even just physical bumps and shocks, e.g. an earthquake, or a person (a cleaner, etc.) or animal (if there are pets in the home) bumping, moving, or even knocking over the tower. To me these risks are greater and potentially more damaging than the wear and tear from one or two start-ups per day.
     
    Last edited: Sep 9, 2008
  21. Sep 9, 2008 #20
    But that is exactly the point I was making! We go to all that trouble of designing heat sinks and fans to disperse the heat because it does not disperse naturally. What is “completely incorrect” about that?

    Now, if the pc was suspended in the center of the room, and the room was a completely closed system and we allowed sufficient time for the case to heat up so that the entire computer became a radiant body, it would disperse the same amount of heat into the room as the space heater, provided they have equal wattage. I have already alluded to that when I said they have the same amount of thermal energy. But in practice, no computers are suspended in the center of rooms in a closed system. The original question compares the heating capability of a normal computer in a normal environment with the heating capability of a space heater of equal wattage. In that case, the space heater wins because it is designed to transfer the thermal energy as heat into the environment by means of a directed flow of heated air and the computer is not. I am discussing this from the standpoint of applied technology and not pure theory as in a laboratory experiment. In general, I believe it is better to let things do what they were designed to do; let computers compute and heaters heat!
     