Is Computation Merely a Side Effect of Energy Transformation in CPUs?

AI Thread Summary
The discussion centers on whether computation is merely a side effect of energy transformation in CPUs, with participants exploring the relationship between energy input, heat output, and computational results. It is noted that while CPUs generate heat due to electrical resistance, the actual computation may not directly correlate with energy consumption, raising questions about the efficiency of information processing. Concepts like Landauer's principle suggest that only the erasure of information requires energy, implying that reversible computations could potentially minimize energy expenditure. Participants also ponder whether all computational answers exist inherently in the data, leading to the idea that energy might only be needed to manipulate this information. Overall, the conversation highlights the complex interplay between computation, energy, and entropy in computing systems.
k354
Hi all,

This question has been intriguing me for a long time, and now that I've discovered this forum I thought it would be a good place to ask.

Take a typical CPU. To operate, it draws energy from the power supply, does some computation, and emits heat. Since energy can only be transformed, never created or destroyed, are the "interesting computational results" somewhere in this equation, or does the emitted heat equal the input energy, with the calculations just a useful side effect?

I am not a physicist, but if they are a side effect, that would really be interesting to think about. Would it imply that all the answers are already present in the original data, and we spend energy just to shuffle it around in order to get our answer? Wouldn't it also imply that every computation could be optimised so far that it is not needed at all? (Spend zero energy to get the answer.)

On the other hand, if the computed result is not just a side effect but must appear in the energy-transfer equation, what is the relation between computed results and energy invested? How is it measured and expressed?
 
k354 said:
Take a typical CPU.
[...] does emitted heat equate to input energy and calculations are just an useful side effect?

To my knowledge, the heat is generated because a lot of current runs through the CPU. The clock cycles propagate through the chip, and transistors switch back and forth between conducting and non-conducting states. It all adds up to a lot of resistance to the current, so a lot of heat is generated.

Taking the wider context of your question, I think I agree with you on the following point: there are intriguing parallels between information and entropy. It seems to me that even a 100% efficient computer would require energy for its information processing.

Perhaps DNA computing offers clues as to energy costs of the information processing itself.
 
I'm not sure about this, but...
Seems to me that the CPU is simply reading a string of zeroes and ones (the 'input') and writing another string (the 'output'). Any interpretation of those strings takes place in YOUR mind, not in the CPU. The CPU can't tell whether that string of output (1001011000100...etc.) represents "War and Peace," the prime factors of the input, the Mona Lisa, or any other interesting result.

I think the energy associated with the interpretation comes from the Wheaties you ate this morning.

Just my 2 cents.
 
there are intriguing parallels between information and entropy

more than that; entropy is a subset of information theory.

It seems to me that even a 100% efficient computer would require energy to do its information processing.


Surprisingly, quantum theory tells us it is only the erasure (elimination) of a bit that takes energy... that's the only action that releases heat. It's called Landauer's principle, and it is counterintuitive (like much of physics). The dissipation of stored information is the same as increasing entropy. It turns out that reversible operations don't increase entropy; irreversible ones do.

Source: DECODING THE UNIVERSE, CHARLES SEIFE, PAGES 81-85, 2006

also try : http://en.wikipedia.org/wiki/Landauer's_principle
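To put a number on Landauer's principle: the minimum heat released when one bit is erased is k·T·ln 2, where k is Boltzmann's constant and T is the temperature. A minimal sketch of that calculation (illustrative only, not tied to any real hardware):

```python
import math

# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2)
# of heat. Below is just the arithmetic, with T = 300 K (room temperature).
K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_limit_joules(temp_kelvin: float) -> float:
    """Minimum energy dissipated to erase one bit at the given temperature."""
    return K_B * temp_kelvin * math.log(2)

print(f"Landauer limit at 300 K: {landauer_limit_joules(300.0):.3e} J per bit")
# → roughly 2.9e-21 J per erased bit
```

That is an astonishingly small amount of energy compared to what real chips dissipate per operation, which is why the principle is of theoretical rather than practical importance today.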
 
gmax137 said:
I'm not sure about this, but...
Seems to me that the CPU is simply reading a string of zeroes and ones (the 'input') and writing another string (the 'output'). Any interpretation of those strings takes place in YOUR mind, not in the CPU. The CPU can't tell whether that string of output (1001011000100...etc.) represents "War and Peace," the prime factors of the input, the Mona Lisa, or any other interesting result.

I think the energy associated with the interpretation comes from the Wheaties you ate this morning.

Just my 2 cents.

Interesting, but I think you have just shifted my question from computer processing to brain processing, so my original question still holds. In the case of brain processing, is the chemical energy input equal to the heat dissipated, or is there another component in the output, describing the useful work, without which the equation is not balanced?
 
Naty1 said:
more than that; entropy is a subset of information theory.

Surprisingly, quantum theory tells us it is only the erasure (elimination) of a bit that takes energy... that's the only action that releases heat. It's called Landauer's principle, and it is counterintuitive (like much of physics). The dissipation of stored information is the same as increasing entropy. It turns out that reversible operations don't increase entropy; irreversible ones do.

Source: DECODING THE UNIVERSE, CHARLES SEIFE, PAGES 81-85, 2006

also try : http://en.wikipedia.org/wiki/Landauer's_principle

I thought mine was a fairly innocent question, but now it seems I have managed to get myself into some rather complicated theories. :I

But from what I read at that link, and at one that followed about reversible computing, it seems my original thinking was justified. As I understand it, it may be possible to carry out computations with much less, and possibly even zero, energy. As long as the algorithm is reversible, it should be possible, if I read the theory correctly.

I still do not understand how energy transfer in computation works today: is all input energy dissipated as heat, so that "useful work" cannot really be quantified? I do not yet understand this entropy business, so I will be reading more on that.
 
I still do not understand how energy transfer for computation works today, is all input energy dissipated as heat, and "useful work" cannot really be quantified

http://en.wikipedia.org/wiki/Random-access_memory



“First of all, as chip geometries shrink and clock frequencies rise, the transistor leakage current increases, leading to excess power consumption and heat...

In general, the solid-state devices that form most memory in use today work by moving electrons around... they are opposed by resistance... so each electron moved encounters that resistance and creates a tiny amount of heat. The power consumed is IE... or I²R (same thing)...
How EXACTLY this relates to Landauer's principle I don't understand.
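The resistive power relations mentioned above can be sketched numerically. The current and voltage figures below are made up for illustration; a real CPU is far more complicated than a single lumped resistor:

```python
# Resistive power dissipation: P = I * E (current times voltage),
# or equivalently P = I^2 * R with R the effective resistance E / I.

def power_iv(current_amps: float, volts: float) -> float:
    """Power as current times voltage."""
    return current_amps * volts

def power_i2r(current_amps: float, resistance_ohms: float) -> float:
    """Power as current squared times resistance."""
    return current_amps ** 2 * resistance_ohms

# A hypothetical chip drawing 50 A at 1.2 V:
print(power_iv(50.0, 1.2), "W")          # → 60.0 W
# The same power via the effective resistance E / I = 0.024 ohm:
print(power_i2r(50.0, 1.2 / 50.0), "W")  # → 60.0 W
```

Both forms give the same number, which is the point of "same thing" above: I·E and I²R are two expressions of the same dissipated power.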
 
Note that heat must be produced by a processor. Even if you were to make the whole thing superconducting, it is inevitable. There are very fundamental principles of thermodynamics involved.

But yes, 100% of power consumed by CPU becomes either heat or EM radiation. There is some energy of "state" of CPU which constantly varies, but it is tiny compared to everything else, and it's not something that constantly consumes power. It may store some or release some.
 
K^2 said:
Note that heat must be produced by a processor. Even if you were to make the whole thing superconducting, it is inevitable. There are very fundamental principles of thermodynamics involved.

But yes, 100% of power consumed by CPU becomes either heat or EM radiation. There is some energy of "state" of CPU which constantly varies, but it is tiny compared to everything else, and it's not something that constantly consumes power. It may store some or release some.

So it is true that the calculations cannot be quantified in the energy equation. Doesn't this imply it is possible to do the same calculations with energy approaching zero, or, even more extreme, that calculations are not needed at all, i.e., all possible results are simultaneously present in the original data and it is just a matter of knowing how to read them?
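One way to see how far real hardware is from any thermodynamic floor is to compare energy per operation against the Landauer erasure bound. The chip figures below are invented, plausible-order-of-magnitude numbers, purely for illustration:

```python
import math

# Landauer erasure bound at room temperature (300 K):
K_B = 1.380649e-23                      # Boltzmann constant, J/K
landauer = K_B * 300 * math.log(2)      # ~2.9e-21 J per erased bit

# Suppose a hypothetical chip dissipates 65 W while executing 1e10
# operations per second (made-up numbers for scale):
energy_per_op = 65.0 / 1e10             # 6.5e-9 J per operation

print(f"ratio: {energy_per_op / landauer:.1e}")
```

The ratio comes out around twelve orders of magnitude, so with these illustrative numbers practically none of a CPU's dissipation is forced by information erasure; almost all of it is ordinary resistive loss.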
 
k354 said:
I am not a physicist, but if they are a side effect, that would really be interesting to think about. Would it imply that all the answers are already present in the original data, and we spend energy just to shuffle it around in order to get our answer? Wouldn't it also imply that every computation could be optimised so far that it is not needed at all? (Spend zero energy to get the answer.)

As I recall, you expend energy when "erasing" information. Ideas along the lines of "reversible computation" allow you to derive the result that you can expend arbitrarily little energy to perform a computation, but the less energy you expend, the slower it gets.

A system that is fully reversible and runs forwards or backwards depending on random motion of the environment, without any way of biasing the direction, will evolve toward the solution in the manner of a random walk. In order to have it move N steps in the forward direction, you must wait longer and longer for that random eventuality.
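The reversible-versus-irreversible distinction can be made concrete with logic gates. A sketch, using the CNOT (controlled-NOT) gate as the standard example of a reversible operation and AND as an irreversible one:

```python
# A reversible gate preserves all input information: CNOT maps
# (a, b) -> (a, a XOR b) and is its own inverse, so no bit is erased
# and, per Landauer, no heat *must* be dissipated.

def cnot(a: int, b: int) -> tuple[int, int]:
    """Controlled-NOT: flip b when a is 1; leave a unchanged."""
    return a, a ^ b

# Applying CNOT twice recovers the original pair for every input:
for a in (0, 1):
    for b in (0, 1):
        assert cnot(*cnot(a, b)) == (a, b)

# AND, by contrast, maps two bits to one. Three distinct inputs all
# produce 0, so the input cannot be reconstructed from the output:
assert (0 & 0) == (0 & 1) == (1 & 0) == 0
```

Because the AND gate's inputs cannot be recovered from its output, computing it destroys information, and that is exactly the kind of step Landauer's principle charges energy for; a circuit built entirely from gates like CNOT (or the three-bit Toffoli gate) avoids that cost in principle.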
 