
Energy for computation

  1. Jul 29, 2010 #1
    Hi all,

    This question has been intriguing me for a long time, and now that I have discovered this forum I thought it would be a good place to ask.

    Take a typical CPU. To operate, it draws energy from the power supply, does some computation and emits heat. Since energy can only be transformed, not created or destroyed, are the "interesting computational results" somewhere in this equation, or does the emitted heat equal the input energy, with the calculations just a useful side effect?

    I am not a physicist, but if they are a side effect, that would really be interesting to think about. Would it imply that all the answers are already present in the original data, and that we spend energy just to shuffle it around in order to get our answer? Wouldn't it also imply that every computation could be optimised so far that it is not needed at all (spend zero energy to get the answer)?

    On the other hand, if the computation result is not just a side effect but must appear in the energy-transfer equation, what is the relation between the computed results and the energy invested? How is it measured and expressed?
     
  3. Jul 29, 2010 #2

    Cleonis

    User Avatar
    Gold Member

    To my knowledge: the heat that is generated comes from the large currents running through the CPU. The cycles of the clock frequency propagate through the CPU, and transistors switch back and forth between states with current running through them and states with no current. It all adds up to a lot of resistance to the current, so a lot of heat is generated.
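    (To put rough numbers on those switching losses, here is a minimal Python sketch using the standard CMOS dynamic-power relation P ≈ αCV²f. The formula is standard, but every value below is an assumption chosen only for illustration, not a spec of a real chip.)

    Code:
    # Back-of-envelope estimate of dynamic (switching) power in a CPU,
    # using the standard CMOS relation P = alpha * C * V**2 * f.
    # Every value below is an illustrative assumption, not a real chip spec.

    alpha = 0.1   # activity factor: fraction of the capacitance switched per cycle
    C = 1e-8      # effective switched capacitance in farads (assumed)
    V = 1.2       # supply voltage in volts (assumed)
    f = 3e9       # clock frequency in hertz (assumed)

    P = alpha * C * V**2 * f
    print(f"Estimated dynamic power: {P:.1f} W")   # about 4.3 W with these numbers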

    Taking the wider context of your question, I think I agree with you on the following point: there are intriguing parallels between information and entropy. It seems to me that even a 100% efficient computer would require energy to do its information processing.

    Perhaps DNA computing offers clues as to energy costs of the information processing itself.
     
  4. Jul 29, 2010 #3
    I'm not sure about this, but...
    Seems to me that the CPU is simply reading a string of zeroes and ones (the 'input') and writing another string (the 'output'). Any interpretation of those strings takes place in YOUR mind, not in the CPU. The CPU can't tell that, when displayed, that output string (1001011000100.....etc) represents "War and Peace," the prime factors of the input, the Mona Lisa, or any other interesting result.

    I think the energy associated with the interpretation comes from the Wheaties you ate this morning.

    Just my 2 cents.
     
  5. Jul 29, 2010 #4
    More than that; entropy is a subset of information theory.


    Surprisingly, quantum theory tells us it is only the erasure (elimination) of a bit that takes energy....that's the only action that releases heat. It's called Landauer's principle, and it is counterintuitive (like much of physics). The dissipation of stored information is the same as increasing entropy. It turns out that reversible operations don't increase entropy; irreversible ones do.

    Source: Charles Seife, Decoding the Universe (2006), pages 81-85

    also try : http://en.wikipedia.org/wiki/Landauer's_principle
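    (To see what Landauer's principle amounts to numerically, a minimal Python sketch; the constants are standard physical constants, and room temperature is assumed:)

    Code:
    import math

    # Landauer's principle: erasing one bit of information dissipates at
    # least k_B * T * ln(2) of heat. Minimal numeric illustration at room
    # temperature (T is an assumed value).

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # temperature in kelvin (room temperature, assumed)

    E_bit = k_B * T * math.log(2)
    print(f"Landauer limit at {T} K: {E_bit:.3e} J per bit erased")
    # ~2.87e-21 J: many orders of magnitude below what real chips dissipate.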
     
    Last edited: Jul 29, 2010
  6. Jul 29, 2010 #5
    Interesting, but I think you have just shifted my question from computer processing to brain processing, and my original question still holds. In the case of brain processing, is the chemical energy input equal to the heat dissipated, or is there another component in the output that describes the useful work, without which the equation is not balanced?
     
  7. Jul 29, 2010 #6
    I thought mine was a quite innocent question, but now it seems I have managed to get myself into some rather complicated theories. :I

    But from what I read at this link, and at one that followed from it about reversible computing, it seems my original thinking was justified. The way I understand it, it may be possible to carry out computations with much less, possibly even zero, energy. As long as the algorithm is reversible, it should be possible, if I read the theory correctly.

    I still do not understand how energy transfer for computation works today: is all input energy dissipated as heat, so that the "useful work" cannot really be quantified? I do not yet understand this entropy business, so I will be reading more on that.
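    (An aside on what "reversible" means here: the Toffoli, or controlled-controlled-NOT, gate is universal for classical logic and is its own inverse, so running it never erases information, which is exactly the property Landauer's principle cares about. A minimal Python sketch, not taken from the linked article:)

    Code:
    # A reversible logic gate: the Toffoli (controlled-controlled-NOT) gate.
    # It is universal for classical computation and is its own inverse, so
    # no information is erased when it runs.

    def toffoli(a, b, c):
        """Flip c when both controls a and b are 1; a and b pass through."""
        return a, b, c ^ (a & b)

    for bits in [(0, 0, 1), (1, 1, 0), (1, 0, 1)]:
        out = toffoli(*bits)
        assert toffoli(*out) == bits   # applying it twice undoes it
        print(bits, "->", out)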
     
    Last edited: Jul 29, 2010
  8. Jul 29, 2010 #7
    http://en.wikipedia.org/wiki/Random-access_memory



    In general, the solid-state devices which form a lot of the memory in use today work by moving electrons around....they are opposed by resistance...so each electron moved encounters that and creates a tiny amount of heat....the power consumed is P = IE, or I²R (same thing)......
    How EXACTLY this relates to Landauer's principle I don't understand.
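    (For concreteness, the two forms of that power formula agree by Ohm's law; a trivial Python check, with the current and resistance values assumed purely for illustration:)

    Code:
    # Joule heating in a resistive element: P = I*V = I**2 * R.
    # Illustrative assumed values, showing the two forms agree.

    I = 0.5    # current in amperes (assumed)
    R = 2.0    # resistance in ohms (assumed)
    V = I * R  # Ohm's law

    print(f"P = I*V   = {I * V:.2f} W")
    print(f"P = I^2*R = {I**2 * R:.2f} W")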
     
  9. Jul 29, 2010 #8

    K^2

    User Avatar
    Science Advisor

    Note that heat must be produced by a processor. Even if you were to make the whole thing superconducting, it is inevitable. There are very fundamental principles of thermodynamics involved.

    But yes, 100% of the power consumed by a CPU becomes either heat or EM radiation. There is some energy in the "state" of the CPU, which constantly varies, but it is tiny compared to everything else, and it is not something that constantly consumes power; it may store some or release some.
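    (One way to see how far that dissipated power sits above any fundamental floor: compare a CPU's power draw against the Landauer minimum of k_B·T·ln 2 per erased bit mentioned earlier in the thread. A minimal Python sketch; the 100 W figure and the temperature are assumptions, not measurements:)

    Code:
    import math

    # How far is a real CPU from the Landauer limit? Compare an assumed
    # 100 W power draw (all of it ending up as heat, as described above)
    # with the minimum k_B * T * ln(2) cost per erased bit at 300 K.

    k_B = 1.380649e-23        # Boltzmann constant, J/K
    T = 300.0                 # temperature in kelvin (assumed)
    P_cpu = 100.0             # watts (assumed, typical desktop CPU)

    E_bit = k_B * T * math.log(2)       # minimum energy per erased bit
    bits_per_second = P_cpu / E_bit

    print(f"{bits_per_second:.2e} bit erasures/s at the Landauer limit")
    # ~3.5e22 per second: real hardware is nowhere near this efficient.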
     
  10. Jul 30, 2010 #9
    So it is true that the calculations cannot be quantified in the energy equation. Doesn't this imply it is possible to do the same calculations with energy approaching zero, or, even more extreme, that calculations are not needed at all, i.e. all possible results are simultaneously present in the original data and it is just a matter of knowing how to read them?
     
  11. Jul 30, 2010 #10
    As I recall, you expend energy when "erasing" information. Ideas along the lines of "reversible computation" allow you to derive the result that you can expend arbitrarily little energy to perform a computation, but the less energy you expend, the slower it gets.

    A system that is fully reversible, and runs forwards or backwards depending on the random motion of the environment without any way of biasing the direction, will evolve toward the solution in the manner of a random walk. In order to have it move N steps in the forward direction, you must wait longer and longer for that random eventuality.
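    (A toy Monte Carlo sketch of that random-walk picture; the model, with a reflecting barrier at the start of the program, is an illustrative assumption rather than something from a reference. It shows that an unbiased walk needs on the order of N² random steps to first advance N steps forward:)

    Code:
    import random

    # Toy model of a fully reversible computation driven only by thermal
    # noise: the "program counter" does an unbiased random walk on steps
    # 0..N (reflecting at 0) and finishes when it first reaches step N.

    def steps_to_finish(N):
        pos, t = 0, 0
        while pos < N:
            pos += random.choice((-1, 1)) if pos > 0 else 1
            t += 1
        return t

    N = 50
    trials = 2000
    avg = sum(steps_to_finish(N) for _ in range(trials)) / trials
    print(f"N = {N}: average {avg:.0f} random steps (theory: about N^2 = {N*N})")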
     