Is Computation Merely a Side Effect of Energy Transformation in CPUs?

  • #1
k354
Hi all,

This question has been intriguing me for a long time, and now that I have discovered this forum, I thought it would be a good place to ask.

Take a typical CPU. To operate, it draws energy from the power supply, does some computation, and emits heat. Since energy can only be transformed, never created or destroyed, are the "interesting computational results" somewhere in this equation, or does the emitted heat account for all the input energy, with the calculations being just a useful side effect?

I am not a physicist, but if they are a side effect, that would really be interesting to think about. Would it imply that all the answers are already present in the original data, and that we spend energy just to shuffle it around in order to get our answer? Wouldn't it also imply that every computation can be optimized so much that it is not needed at all (spend zero energy to get the answer)?

On the other hand, if the computational result is not just a side effect but has to appear in the energy-transfer equation, what is the relation between the computed results and the energy invested? How is it measured and expressed?
 
  • #2
k354 said:
Take a typical CPU.
[...] does the emitted heat account for all the input energy, with the calculations being just a useful side effect?

To my knowledge, the heat is generated because a lot of current is running through the CPU. The clock cycles propagate through the chip, and transistors switch back and forth between conducting and non-conducting states. It all adds up to a lot of resistance to the current, so a lot of heat is generated.
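
As a rough back-of-the-envelope sketch, the dynamic (switching) power of CMOS logic is often estimated as P ≈ α·C·V²·f. All the numbers below are made-up but plausible assumptions, not measurements of a real chip:

Code:
alpha = 0.1        # activity factor: fraction of gates switching each cycle (assumed)
C = 1e-9           # total switched capacitance in farads (assumed)
V = 1.0            # supply voltage in volts (assumed)
f = 3e9            # clock frequency in hertz (assumed 3 GHz)

P_dynamic = alpha * C * V**2 * f   # classic CMOS dynamic-power estimate
print(f"Estimated dynamic power: {P_dynamic:.2f} W")   # 0.30 W for these toy numbers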

Taking the wider context of your question, I think I agree with you on the following point: there are intriguing parallels between information and entropy. It seems to me that even a 100% efficient computer would require energy to do its information processing.

Perhaps DNA computing offers clues as to the energy costs of the information processing itself.
 
  • #3
I'm not sure about this, but...
Seems to me that the CPU is simply reading a string of zeroes and ones (the 'input') and writing another string (the 'output'). Any interpretation of those strings takes place in YOUR mind, not in the CPU. The CPU can't tell whether, when displayed, that string of output (1001011000100...etc.) represents "War and Peace," the prime factors of the input, the Mona Lisa, or any other interesting result.

I think the energy associated with the interpretation comes from the Wheaties you ate this morning.

Just my 2 cents.
 
  • #4
there are intriguing parallels between information and entropy

More than that: entropy is a subset of information theory.

It seems to me that even a 100% efficient computer would require energy to do its information processing.


Surprisingly, thermodynamics tells us it is only the erasure (elimination) of a bit that must take energy... that's the only action that has to release heat. It's called Landauer's principle, and it is counterintuitive (like much of physics). The dissipation of stored information is the same as increasing entropy. It turns out that reversible operations don't increase entropy; irreversible ones do.

Source: Charles Seife, Decoding the Universe (2006), pp. 81-85

also try : http://en.wikipedia.org/wiki/Landauer's_principle
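
To get a feel for the size of the Landauer limit, here is a quick sketch (room temperature, T = 300 K, is my assumption):

Code:
import math

k_B = 1.380649e-23              # Boltzmann constant, J/K
T = 300.0                       # room temperature in kelvin (assumed)

E_bit = k_B * T * math.log(2)   # minimum heat released per erased bit
print(f"Landauer limit at {T:.0f} K: {E_bit:.3e} J per bit")   # ~2.871e-21 J

For comparison, real CPUs dissipate many orders of magnitude more than this per logic operation, so we are nowhere near the bound in practice.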
 
  • #5
gmax137 said:
I'm not sure about this, but...
Seems to me that the CPU is simply reading a string of zeroes and ones (the 'input') and writing another string (the 'output'). Any interpretation of those strings takes place in YOUR mind, not in the CPU. The CPU can't tell whether, when displayed, that string of output (1001011000100...etc.) represents "War and Peace," the prime factors of the input, the Mona Lisa, or any other interesting result.

I think the energy associated with the interpretation comes from the Wheaties you ate this morning.

Just my 2 cents.

Interesting, but I think you have just shifted my question from computer processing to brain processing, and my original question still holds. In the case of brain processing, is the chemical energy input equal to the heat dissipated, or is there another component of the output that accounts for the useful work, without which the equation is not balanced?
 
  • #6
Naty1 said:
More than that: entropy is a subset of information theory.

Surprisingly, thermodynamics tells us it is only the erasure (elimination) of a bit that must take energy... that's the only action that has to release heat. It's called Landauer's principle, and it is counterintuitive (like much of physics). The dissipation of stored information is the same as increasing entropy. It turns out that reversible operations don't increase entropy; irreversible ones do.

Source: Charles Seife, Decoding the Universe (2006), pp. 81-85

also try : http://en.wikipedia.org/wiki/Landauer's_principle

I thought mine was quite an innocent question, but now it seems I have managed to get myself into some rather complicated theories. :I

But from what I read at that link, and at one that followed from it about reversible computing, it seems my original thinking was justified. As I understand it, it may be possible to carry out computations with much less energy, possibly even zero. As long as the algorithm is reversible, it should be possible, if I am reading the theory correctly.

I still do not understand how the energy transfer for computation works today. Is all input energy dissipated as heat, so that the "useful work" cannot really be quantified? I do not yet understand this entropy business, so I will be reading more on that.
 
  • #7
I still do not understand how the energy transfer for computation works today. Is all input energy dissipated as heat, so that the "useful work" cannot really be quantified?

http://en.wikipedia.org/wiki/Random-access_memory

"First of all, as chip geometries shrink and clock frequencies rise, the transistor leakage current increases, leading to excess power consumption and heat..."

In general, the solid-state devices that form most memory in use today work by moving electrons around... the moving electrons are opposed by resistance, so each electron moved encounters that resistance and creates a tiny amount of heat. The power consumed is IE... or I²R (same thing)...
How EXACTLY this relates to Landauer's principle I don't understand.
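
To make that IE = I²R bookkeeping concrete, here is a toy calculation (the current and resistance values are made-up, not real CPU figures):

Code:
I = 50.0            # current in amperes (assumed)
R = 0.02            # effective resistance in ohms (assumed)

V = I * R           # voltage drop across the resistance (Ohm's law): 1.0 V
P_iv = I * V        # power as I*E (current times voltage): 50.0 W
P_i2r = I**2 * R    # power as I^2*R; identical, since E = I*R
print(P_iv, P_i2r)  # 50.0 50.0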
 
  • #8
Note that heat must be produced by a processor. Even if you were to make the whole thing superconducting, it is inevitable. There are very fundamental principles of thermodynamics involved.

But yes, 100% of the power consumed by a CPU becomes either heat or EM radiation. There is some energy in the "state" of the CPU, which constantly varies, but it is tiny compared to everything else, and it is not something that constantly consumes power; it may store some or release some.
 
  • #9
K^2 said:
Note that heat must be produced by a processor. Even if you were to make the whole thing superconducting, it is inevitable. There are very fundamental principles of thermodynamics involved.

But yes, 100% of the power consumed by a CPU becomes either heat or EM radiation. There is some energy in the "state" of the CPU, which constantly varies, but it is tiny compared to everything else, and it is not something that constantly consumes power; it may store some or release some.

So it is true that the calculations cannot be quantified in the energy equation. Doesn't this imply that it is possible to do the same calculations with energy approaching zero, or, even more extreme, that calculations are not needed at all, i.e. that all possible results are simultaneously present in the original data and it is just a matter of knowing how to read them?
 
  • #10
k354 said:
I am not a physicist, but if they are a side effect, that would really be interesting to think about. Would it imply that all the answers are already present in the original data, and that we spend energy just to shuffle it around in order to get our answer? Wouldn't it also imply that every computation can be optimized so much that it is not needed at all (spend zero energy to get the answer)?

As I recall, you expend energy when "erasing" information. Ideas along the lines of "reversible computation" let you derive the result that you can expend arbitrarily little energy to perform a computation, but the less energy you expend, the slower it gets.

A system that is fully reversible, and that runs forwards or backwards depending on the random motion of its environment without any way of biasing the direction, will evolve toward the solution in the manner of a random walk. In order to have it move N steps in the forward direction, you must wait longer and longer for that random eventuality.
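
A minimal simulation of that picture (the model is my own sketch: unbiased ±1 steps per tick, with a reflecting barrier at the start so the machine cannot back up past its initial state):

Code:
import random

# Toy model of an unbiased reversible computer: each tick the computation
# steps forward or backward at random; a reflecting barrier at 0 keeps it
# from backing up past its starting state. We measure the first-passage
# time to n_steps forward steps.

def ticks_to_finish(n_steps, rng):
    position, ticks = 0, 0
    while position < n_steps:
        if position == 0:
            position = 1                     # reflected: can only go forward
        else:
            position += rng.choice((-1, 1))  # unbiased random step
        ticks += 1
    return ticks

rng = random.Random(0)
for n in (10, 20, 40):
    trials = [ticks_to_finish(n, rng) for _ in range(500)]
    print(n, sum(trials) / len(trials))      # means near n**2: ~100, ~400, ~1600

In this toy model the mean first-passage time is exactly N² ticks, so an unbiased reversible machine pays in time what it saves in energy.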
 

FAQ: Is Computation Merely a Side Effect of Energy Transformation in CPUs?

1. What is energy for computation?

Energy for computation refers to the amount of energy required to perform calculations and process information on a computer or electronic device. This energy is typically supplied by a power source, such as a battery or electricity from an outlet.

2. How is energy for computation measured?

Energy for computation is typically measured in joules (J) or watt-hours (Wh). Both are units of energy; one watt-hour equals 3600 joules. Power, the rate at which energy is consumed, is measured in watts (W).
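
A quick sanity check on the units (the 5 Wh figure and the two-hour window are assumed examples):

Code:
energy_Wh = 5.0                         # assumed energy consumption in watt-hours
energy_J = energy_Wh * 3600.0           # 1 Wh = 3600 J, so 18000.0 J
hours = 2.0                             # assumed time over which it was consumed
power_W = energy_J / (hours * 3600.0)   # average power: 2.5 W
print(energy_J, power_W)                # 18000.0 2.5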

3. How does energy for computation impact the environment?

The production and consumption of energy for computation can have a significant impact on the environment, as it often relies on non-renewable sources of energy such as coal or natural gas. Additionally, the manufacturing and disposal of electronic devices can also contribute to pollution and waste.

4. What are some ways to reduce energy consumption for computation?

There are several ways to reduce energy consumption for computation, such as using energy-efficient devices, optimizing computer settings, and turning off devices when not in use. Additionally, using renewable sources of energy, such as solar or wind power, can also help reduce the environmental impact of energy for computation.

5. How is the energy for computation being improved?

Scientists are constantly researching and developing new technologies to improve the efficiency and sustainability of energy for computation. This includes advancements in renewable energy sources, as well as the development of more energy-efficient computer components and algorithms.
