# Thermodynamic cost of information processing (or lack thereof)

1. Jul 19, 2016

### Nantes

Say you wanted to heat up your room. You could choose between an electric heater that outputs 500W or a PC that draws 500W. In both cases, the amount your room would heat up by would be the same, but the second case is much more advantageous because the computer can process information before generating the heat. You could for example tell your computer to calculate the molecular folding of proteins and help find a cure for cancer and other diseases (a project which already exists, called Folding@Home) before it converts that energy to heat, and it would still be just as efficient as the electric heater (that is, 100% thermally efficient).

But why? In basically every other physical process I can remember, there is a trade-off you have to make. If you wanted to lift a car using just your physical strength, you could use pulleys to drastically lower the force required. But in this case, the trade-off is that you have to pull the pulley for many meters for the car to move a few centimeters upwards. The reduced force required is "transformed" into increased distance required. However, in the case of computers, there seems to be no trade-off besides the fact that the microprocessor is much more complex (and thus more costly) to make than the electric heater plates.

If I didn't know better, I would have said that a computer that outputs 500W of heat probably draws more than 500W of electricity, and the difference would be due to energy being transformed into information processing. But I do know better.

So what's the explanation for information processing's apparent lack of trade-off?

2. Jul 19, 2016

### Truecrimson

You can always do worse than the ultimate limit. I think that theoretically reversible computing which dissipates no heat can be done and there's probably a trade-off with practicality. But there is no barrier to be less efficient than the limit.

3. Jul 19, 2016

### Staff: Mentor

Scientific American published an article on this subject about 30 years ago. Sorry, but I can't find the link.

The subject of the article was the theoretical lower limit on the energy required to process one bit. The link between information and energy is very profound.

Be sure not to confuse knowledge with information. A bit is a bit, regardless of whether it is meaningless or of profound interest to human knowledge.

4. Jul 19, 2016

### Andy Resnick

creating information costs energy- kT ln 2 per bit.

http://www3.imperial.ac.uk/pls/portallive/docs/1/55905.PDF

Yes, your computer draws 500W of electricity, but it's different from a space heater- for one thing, your computer does not have a constant draw of 500W. But let's say you are drawing the full 500W continuously. The computer is less efficient at converting electricity into heat than the space heater. In your computer, some of the electrical energy is converted into changes in state of memory devices (hard disk, RAM, etc)- those changes cost kT ln 2 per bit.
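As a rough sanity check (a sketch, with 300 K as an assumed room temperature), the kT ln 2 figure per bit works out to about 3 × 10⁻²¹ J:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

# Landauer figure: kT ln 2 of energy per bit
E_bit = k_B * T * math.log(2)
print(f"kT ln 2 at {T:.0f} K = {E_bit:.3e} J per bit")  # prints 2.871e-21 J
```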

5. Jul 19, 2016

### Nantes

Are you saying not 100% of the electricity sent to a computer becomes heat?

6. Jul 19, 2016

### Khashishi

That energy cost still becomes heat in the end.

7. Jul 19, 2016

### Delta Kilo

Actually it's erasing information that costs energy. (It even says so in the title of the article you linked)
And as far as I understand Landauer's principle, the work done on a bit to erase information gets converted to heat.

I do not see how it is possible. Say I switch the computer on, compute π to whatever decimal digit and then switch it off. The initial and the final states are the same (assuming I didn't save anything on disk) - where did the energy go?

The actual trade-off is in using electricity to heat the room in the first place. First we burn coal, then we convert the heat to electricity, paying the Carnot-cycle tax, and then we waste it all by converting it back to low-grade heat. It would be (in theory) more efficient to simply heat the room with coal. But we wouldn't be able to compute digits of π with it.

8. Jul 19, 2016

### Nantes

While it makes sense, at the same time it doesn't, because there are other ways to acquire electricity, such as hydro dams and solar panels. You couldn't feasibly use the kinetic energy of water to heat something up without converting it to electricity first. In this case where is the trade-off?

Edit: wait, you can. You could use the kinetic energy of water to move a piston that creates friction with some metal structure, which would heat up and thus give off heat.

9. Jul 19, 2016

### Andy Resnick

Because of the definition of a cyclic process: if you make a copy of information, you have to erase one before you are back to where you started.

In terms of your computing π example, if there's no way for you to tell that your computer did anything, how do you know it in fact computed π?

10. Jul 19, 2016

### Andy Resnick

That is correct- some of the energy (presumably) went into changing the contents of various non-volatile memory components.

11. Jul 19, 2016

### Delta Kilo

A hydro dam is already a high-grade (low-entropy) energy source. In theory, you could compute π on a Babbage Engine running directly off the water wheel (I'd love to see that!). Solar power is a bit trickier, but you can treat solar radiation as a heat source with a 6000 K effective temperature, which makes it (in theory) very high-grade heat.

The trade-off is not in the quantity of energy (it is conserved) but in its quality or "grade". This is an engineering term rather than a physical one, because it refers to the proportion of "useful energy output", and what is useful and what is waste depends on the context. If the desired output is heat in the room, then setting a lump of coal on fire is very efficient. If, OTOH, the desired output is computation, we need to start with high-grade energy such as electrical or mechanical. "Downgrading" energy is trivial; upgrading incurs losses, all according to Carnot and the 2nd law of thermodynamics.
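The "Carnot tax" above can be put in numbers (a sketch; the boiler and room temperatures below are illustrative assumptions, not real plant data):

```python
# Illustrative "Carnot tax" on heating a room via electricity.
# The temperatures are assumptions for the sake of the example.
T_hot = 800.0   # assumed boiler temperature, K
T_cold = 293.0  # assumed room temperature, K

carnot_eff = 1.0 - T_cold / T_hot  # ideal heat-engine efficiency

heat_from_coal = 1000.0                    # J of heat released by burning coal
electricity = heat_from_coal * carnot_eff  # best-case electrical output
heat_via_heater = electricity              # resistive heater returns it all as heat

print(f"Carnot efficiency: {carnot_eff:.1%}")  # 63.4%
print(f"Direct burning:    {heat_from_coal:.0f} J of room heat")
print(f"Via electricity:   {heat_via_heater:.0f} J of room heat, at best")
```

Even with an ideal heat engine, burning the coal directly delivers more heat to the room than the coal → electricity → resistive-heater route.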

Last edited: Jul 19, 2016
12. Jul 19, 2016

### Delta Kilo

With all due respect, I cannot see this happening.

For example, in flash memory, the bits are stored as charges on floating transistor gates. Granted, charged gates store some small amount of electric potential energy. Setting a bit to 1 will raise the amount of potential energy in the system by a small amount; setting it to 0 will decrease it by exactly the same amount. The difference in energy between the start and end states will be proportional to the difference in the total numbers of 1s and 0s stored in memory before and after, irrespective of how many times the bits have been overwritten in the process. And it is trivial to keep the same total number of 1s and 0s before and after if we wanted to.

Now consider quantum dot memory where the bits are stored as spin directions, at the same energy level. There would be no change in the amount of internal energy at all. And this is exactly the kind of memory we are going to be using if we want to push the envelope towards Landauer limit.
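The flash-memory bookkeeping argument can be sketched as follows (the per-gate energy `E_1` is a made-up unit, purely illustrative):

```python
E_1 = 1.0  # assumed potential energy stored per bit set to 1 (arbitrary unit)

def stored_energy(bits):
    """Total potential energy held on charged gates."""
    return E_1 * sum(bits)

before = [0, 1, 0, 0, 1, 1, 0, 1]  # 4 ones
after  = [1, 0, 1, 0, 1, 0, 1, 0]  # bits overwritten many times, still 4 ones

# Net change in stored energy depends only on the change in the count of 1s,
# not on how many overwrites happened along the way.
delta = stored_energy(after) - stored_energy(before)
print(delta)  # → 0.0
```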

13. Jul 20, 2016

### Andy Resnick

I think you are missing the essential point. I suggest reading more of the relevant literature (physics, information theory, thermodynamics) as there are a lot of great reference materials out there that can explain the system much better than I could.

14. Jul 20, 2016

### Delta Kilo

The essential point here is that a 500W resistive heater and a computer drawing 500W of electricity will produce heat at exactly the same rate, namely 500 Joules per second.

I specifically object to the following (emphasis mine):
Specifically, that kT ln 2 of energy per bit that you refer to is wasted as heat and is already included in the 500W balance.

15. Jul 20, 2016

### Andy Resnick

But surely you know that's not true. Consider a 500W audio amplifier- not all 500W is converted to heat, some is converted into sound.

Computers have moving parts (hard drive platters spin and read/write heads move), DVD lasers and displays generate light, information is transferred amongst various components, the list goes on. Maybe some perspective would help: how many bits/sec correspond to 500W (hint: a lot).
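Taking that hint literally (a sketch, again assuming room temperature and the kT ln 2 figure from earlier in the thread), 500 W could pay for roughly 10²³ bit operations per second:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K
P = 500.0           # power draw, W

# Number of kT ln 2 bit operations that 500 W could pay for each second
bits_per_second = P / (k_B * T * math.log(2))
print(f"{bits_per_second:.2e} bits/s")  # ≈ 1.74e+23
```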

16. Jul 20, 2016

### Delta Kilo

And, assuming the room is soundproof, all of it eventually ends up as heat as soon as the last echo of it dies out.

1. The computer I'm typing this on does not have moving parts.
2. All that light gets scattered, reflected, absorbed and re-emitted as thermal radiation, i.e. heat.
3. Yes, a lot of information gets transferred back and forth. How does that affect energy balance?
4. Heat, heat and more heat. Resistive losses, light, sound, mechanical stresses, EM radiation, all eventually dissipated as heat (let's assume we have perfect EM shielding so no EM radiation leaves the room, otherwise it would be a loophole too easy to exploit :)
Seriously, assuming a sustained steady state, where does the energy accumulate if it is not dissipated as heat? What form does it take?

17. Jul 20, 2016

### Andy Resnick

Now you are just being difficult. Yes Virginia, in the end the universe suffers ignominious heat death.

18. Jul 20, 2016

### Delta Kilo

Let's not beat around the bush, shall we?
Yes, I concede that a typical computer produces mechanical and EM waves (sound, vibration, light, RF etc) and if those are allowed to escape the room, they will not contribute to the heat in the room. Well, obviously. But this is a strawman you've brought in at message #15. Let's just drop it, shall we?

The real issue at hand (the one this thread is about) relates to the thermodynamic cost of erasing information, a.k.a. Landauer's principle, which you invoked here:
I say the work expended to erase information is converted to heat and is included in the total amount of heat generated.
You seem to be saying that some of the energy is not converted to heat but instead goes into "changes in state of memory devices", and as a result the total amount of heat generated is less. But you seem to be evading the question of what form this energy takes. Please answer it.