Thermodynamic cost of information processing (or lack thereof)

In summary: it is erasing information, not creating it, that costs energy (the title of the linked article says as much), and under the Landauer principle the work done on a bit to erase information gets converted to heat.
  • #1
Nantes
Say you wanted to heat up your room. You could choose between an electric heater that outputs 500W or a PC that draws 500W. In both cases, the room would heat up by the same amount, but the second case is much more advantageous because the computer can process information before generating the heat. You could, for example, tell your computer to calculate the molecular folding of proteins and help find a cure for cancer and other diseases (a project which already exists, called Folding@Home) before it converts that energy to heat, and it would still be just as efficient as the electrical heater (that is, 100% thermally efficient).

But why? In basically every other physical process I can think of, there is a trade-off you have to make. If you wanted to lift a car using just your physical strength, you could use pulleys to drastically lower the force required. But in that case, the trade-off is that you have to pull the rope for many meters for the car to move a few centimeters upwards. The reduced force required is "transformed" into increased distance required. However, in the case of computers, there seems to be no trade-off besides the fact that the microprocessor is much more complex (and thus more costly) to make than the electric heater plates.

If I didn't know better, I would have said that a computer that outputs 500W of heat probably draws more than 500W of electricity, and the difference would be due to energy being transformed into information processing. But I do know better.

So what's the explanation for information processing's apparent lack of trade-off?
 
  • #2
You can always do worse than the ultimate limit. I think reversible computing, which in theory dissipates no heat, can be done, and the trade-off there is probably with practicality. But there is no barrier to being less efficient than the limit.
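To make the reversible-computing point concrete, here is a minimal Python sketch (purely illustrative, not from the thread): a plain AND gate destroys information about its inputs, while a Toffoli gate, the standard reversible building block, is its own inverse, so in principle no information is erased and no Landauer cost is incurred.

```python
# Illustrative sketch: computation itself can be logically reversible.
# AND(a, b) is irreversible (the inputs cannot be recovered from the output),
# while the Toffoli (controlled-controlled-NOT) gate is reversible: it flips
# the target bit c only when both controls are 1, and it is its own inverse.

def toffoli(a, b, c):
    """Reversible gate: returns (a, b, c XOR (a AND b))."""
    return a, b, c ^ (a & b)

for bits in [(0, 1, 0), (1, 1, 0), (1, 1, 1)]:
    once = toffoli(*bits)
    assert toffoli(*once) == bits  # applying it twice restores the inputs
    # With c = 0, the third output equals AND(a, b) while a and b are kept,
    # so the AND is computed without discarding any information.
print("Toffoli is self-inverse: no information is erased by the computation.")
```

The catch, as noted above, is practicality: keeping all those intermediate bits around (or uncomputing them) is exactly the trade-off reversible computing makes.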
 
  • #3
Scientific American published an article on this subject about 30 years ago. Sorry, but I can't find the link.

The subject of the article was the theoretical lower limit on energy required to process one bit. The link between information and energy is very profound; consider, for example, the famous Schrödinger equation.

Be sure not to confuse knowledge with information. A bit is a bit, regardless of whether it is meaningless or of profound interest to human knowledge.
 
  • #4
Nantes said:
<snip>

If I didn't know better, I would have said that a computer that outputs 500W of heat probably draws more than 500W of electricity, and the difference would be due to energy being transformed into information processing. But I do know better.

So what's the explanation for information processing's apparent lack of trade-off?

Creating information costs energy: kT ln 2 per bit.

http://www3.imperial.ac.uk/pls/portallive/docs/1/55905.PDF

Yes, your computer draws 500W of electricity, but it's different from a space heater; for one thing, your computer does not have a constant draw of 500W. But let's say you are drawing the full 500W continuously. The computer is less efficient at converting electricity into heat than the space heater. In your computer, some of the electrical energy is converted into changes in state of memory devices (hard disk, RAM, etc.); those changes cost kT ln 2 per bit.
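For scale, here is a small Python calculation of that kT ln 2 cost per bit (the room temperature of 300 K is an assumption for illustration, not a figure from the thread):

```python
# Landauer bound: minimum heat dissipated per bit erased, E = k*T*ln(2).
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K

E_bit = k_B * T * math.log(2)
print(f"Landauer cost per bit at {T:.0f} K: {E_bit:.3e} J")
# -> about 2.9e-21 J, many orders of magnitude below what a real
#    transistor dissipates per switching event today.
```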
 
  • #5
Andy Resnick said:
In your computer, some of the electrical energy is converted into changes in state of memory devices (hard disk, RAM, etc.); those changes cost kT ln 2 per bit.

Are you saying not 100% of the electricity sent to a computer becomes heat?
 
  • #6
That energy cost still becomes heat in the end.
 
  • #7
Andy Resnick said:
Creating information costs energy: kT ln 2 per bit.

http://www3.imperial.ac.uk/pls/portallive/docs/1/55905.PDF
Actually, it's erasing information that costs energy. (It even says so in the title of the article you linked.)
And as far as I understand the Landauer principle, the work done on a bit to erase information gets converted to heat.

Andy Resnick said:
The computer is less efficient at converting electricity into heat than the space heater. In your computer, some of the electrical energy is converted into changes in state of memory devices (hard disk, RAM, etc.).
I do not see how that is possible. Say I switch the computer on, compute π to however many decimal digits, and then switch it off. The initial and the final states are the same (assuming I didn't save anything to disk), so where did the energy go?

Nantes said:
So what's the explanation for information processing's apparent lack of trade-off?
The actual trade-off is in using electricity to heat the room in the first place. First we burn coal, then we convert heat to electricity, paying the Carnot-cycle tax, and then we waste it all by converting it back to low-grade heat. It would be (in theory) more efficient to simply heat the room with coal. But we wouldn't be able to compute digits of π with it.
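To put rough numbers on this "Carnot tax", here is a sketch with assumed temperatures (800 K plant steam, 300 K ambient; both illustrative, not from the thread):

```python
# Burning coal to heat the room directly delivers essentially all of the
# combustion heat; routing it through a power plant first caps the
# electricity at the Carnot efficiency, and a resistive heater then returns
# only that fraction of the original heat to the room.

T_hot = 800.0    # assumed steam temperature of the plant, K
T_cold = 300.0   # assumed ambient temperature, K

eta = 1 - T_cold / T_hot        # ideal (Carnot) plant efficiency
heat_in = 1000.0                # J of coal combustion heat

print(f"Carnot limit: {eta:.1%}")                            # 62.5%
print(f"Room heat, direct burning : {heat_in:.0f} J")        # 1000 J
print(f"Room heat, via electricity: {heat_in * eta:.0f} J")  # 625 J
```

(A heat pump could claw some of this back, but for a resistive heater or a computer the comparison above stands.)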
 
  • #8
Delta Kilo said:
The actual trade-off is in using electricity to heat the room in the first place. First we burn coal, then we convert heat to electricity, paying the Carnot-cycle tax, and then we waste it all by converting it back to low-grade heat. It would be (in theory) more efficient to simply heat the room with coal. But we wouldn't be able to compute digits of π with it.

While it makes sense, at the same time it doesn't, because there are other ways to acquire electricity, such as hydro dams and solar panels. You couldn't feasibly use the kinetic energy of water to heat something up without converting it to electricity first. In this case where is the trade-off?

Edit: wait, you can. You could use the kinetic energy of water to move a piston that creates friction with some metal structure, which would heat up and thus give off heat.
 
  • #9
Delta Kilo said:
Actually, it's erasing information that costs energy. (It even says so in the title of the article you linked.)
And as far as I understand the Landauer principle, the work done on a bit to erase information gets converted to heat.

Because of the definition of a cyclic process: if you make a copy of information, you have to erase one before you are back to where you started.

In terms of your computing π example, if there's no way for you to tell that your computer did anything, how do you know it in fact computed π?
 
  • #10
Nantes said:
Are you saying not 100% of the electricity sent to a computer becomes heat?

That is correct: some of the energy (presumably) went into changing the contents of various non-volatile memory components.
 
  • #11
Nantes said:
While it makes sense, at the same time it doesn't, because there are other ways to acquire electricity, such as hydro dams and solar panels.
A hydro dam is already a high-grade (low-entropy) energy source. In theory, you could compute π on a Babbage Engine running directly off the water wheel (I'd love to see that!). Solar power is a bit trickier, but you can treat solar radiation as a heat source with a 6000K effective temperature, which makes it (in theory) a very high-grade heat source.

The trade-off is not in the quantity of energy (it is conserved) but in its quality or "grade". This is an engineering term rather than a physical one, because it refers to the proportion of "useful energy output", and what is useful and what is waste depends on the context. If the desired output is heat in the room, then setting a lump of coal on fire is very efficient. If, OTOH, the desired output is computation, we need to start with high-grade energy such as electrical or mechanical. "Downgrading" energy is trivial; upgrading incurs losses, all according to Carnot and the second law of thermodynamics.
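A small sketch of this notion of "grade", using the Carnot factor 1 - T_cold/T_hot as the fraction of a source's heat that can in principle be turned into work (the source temperatures are assumed for illustration):

```python
# The higher the source temperature relative to ambient, the larger the
# fraction of its heat that is extractable as work, i.e. the higher its grade.

T_cold = 300.0  # assumed ambient temperature, K

sources = {
    "room-temperature heat": 300.0,
    "coal furnace (assumed)": 2000.0,
    "sunlight, 6000 K effective": 6000.0,
}

for name, T_hot in sources.items():
    grade = 1 - T_cold / T_hot
    print(f"{name:>28}: {grade:.0%} extractable as work")
# room-temperature heat -> 0%, coal furnace -> 85%, sunlight -> 95%
```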
 
  • #12
Andy Resnick said:
That is correct: some of the energy (presumably) went into changing the contents of various non-volatile memory components.
With all due respect, I cannot see this happening.

For example, in flash memory, the bits are stored as charges on floating transistor gates. Granted, charged gates store some small amount of electric potential energy. Setting a bit to 1 will raise the amount of potential energy in the system by a small amount; setting it to 0 will decrease it by exactly the same amount. The difference in energy between the start and end states will be proportional to the difference in the total numbers of 1s and 0s stored in memory before and after, irrespective of how many times the bits have been overwritten in the process. And it is trivial to keep the same total number of 1s and 0s before and after if we wanted to.

Now consider quantum-dot memory, where the bits are stored as spin directions at the same energy level. There would be no change in the amount of internal energy at all. And this is exactly the kind of memory we are going to be using if we want to push the envelope towards the Landauer limit.
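Delta Kilo's bookkeeping argument can be made explicit with a toy model (the per-bit energy figure is an arbitrary placeholder, not a real device parameter):

```python
# Toy model of the floating-gate argument: if each stored 1 holds a fixed
# potential energy E1, the net change in stored energy depends only on the
# difference in the number of 1s between the start and end states, no matter
# how many overwrites happened in between.

E1 = 1e-15  # assumed energy per charged gate, J (illustrative only)

def stored_energy(bits):
    return sum(bits) * E1

before = [1, 0, 1, 1, 0, 0, 1, 0]
after  = [0, 1, 1, 0, 1, 0, 0, 1]  # heavily rewritten, same count of 1s

delta = stored_energy(after) - stored_energy(before)
print(f"Net change in stored energy: {delta:.1e} J")  # 0.0 J here
```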
 
  • #13
Delta Kilo said:
With all due respect, I cannot see this happening.

I think you are missing the essential point. I suggest reading more of the relevant literature (physics, information theory, thermodynamics) as there are a lot of great reference materials out there that can explain the system much better than I could.
 
  • #14
Andy Resnick said:
I think you are missing the essential point.
The essential point here is that a 500W resistive heater and a computer drawing 500W of electricity will produce heat at exactly the same rate, namely 500 Joules per second.

I specifically object to the following (emphasis mine):
Andy Resnick said:
The computer is less efficient at converting electricity into heat than the space heater. In your computer, some of the electrical energy is converted into changes in state of memory devices (hard disk, RAM, etc.); those changes cost kT ln 2 per bit.
Specifically, that kT ln 2 of energy per bit that you refer to is wasted as heat and is already included in the 500W balance.
 
  • #15
Delta Kilo said:
The essential point here is that a 500W resistive heater and a computer drawing 500W of electricity will produce heat at exactly the same rate, namely 500 Joules per second.

But surely you know that's not true. Consider a 500W audio amplifier: not all 500W is converted to heat; some is converted into sound.

Computers have moving parts (hard drive platters spin and read/write heads move), DVD lasers and displays generate light, information is transferred amongst various components, the list goes on. Maybe some perspective would help: how many bits/sec correspond to 500W? (Hint: a lot; see the sketch below.)
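Taking the hint literally: a minimal calculation (assuming 300 K and the ideal Landauer cost per bit) of how many bit erasures per second 500 W could pay for:

```python
# Divide the power budget by the Landauer cost per bit at an assumed 300 K.
# Real hardware dissipates vastly more per bit, so actual rates are far lower.
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # assumed room temperature, K
P = 500.0           # power budget, W

bits_per_second = P / (k_B * T * math.log(2))
print(f"{bits_per_second:.2e} bit erasures per second at the Landauer limit")
# -> about 1.7e23 per second: "a lot" indeed.
```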
 
  • #16
Andy Resnick said:
But surely you know that's not true. Consider a 500W audio amplifier: not all 500W is converted to heat; some is converted into sound.
And, assuming the room is soundproof, all of it eventually ends up as heat as soon as the last echo of it dies out.

Andy Resnick said:
Computers have moving parts (hard drive platters spin and read/write heads move), DVD lasers and displays generate light, information is transferred amongst various components, the list goes on.
1. The computer I'm typing this on does not have moving parts.
2. All that light get scattered, reflected, absorbed and re-emitted as thermal radiation, i.e. heat.
3. Yes, a lot of information gets transferred back and forth. How does that affect energy balance?
4. Heat, heat and more heat. Resistive losses, light, sound, mechanical stresses, EM radiation: all eventually dissipated as heat (let's assume we have perfect EM shielding so no EM radiation leaves the room, otherwise it would be a loophole too easy to exploit :))
Seriously, assuming a sustained steady state, where does the energy accumulate if it is not dissipated as heat? What form does it take?
 
  • #17
Delta Kilo said:
And, assuming the room is soundproof, all of it eventually ends up as heat as soon as the last echo of it dies out. <snip>

Now you are just being difficult. Yes, Virginia, in the end the universe suffers an ignominious heat death.
 
  • #18
Let's not beat around the bush, shall we?
Yes, I concede that a typical computer produces mechanical and EM waves (sound, vibration, light, RF, etc.), and if those are allowed to escape the room, they will not contribute to the heat in the room. Well, obviously. But this is a straw man you brought in at message #15. Let's just drop it, shall we?

The real issue at hand (the one this thread is about) relates to the thermodynamic cost of erasing information, a.k.a. Landauer's principle, which you invoked here:
Andy Resnick said:
The computer is less efficient at converting electricity into heat than the space heater. In your computer, some of the electrical energy is converted into changes in state of memory devices (hard disk, RAM, etc.); those changes cost kT ln 2 per bit.

I say the work expended to erase information is converted to heat and is included in the total amount of heat generated.
You seem to be saying that some of the energy is not converted to heat but instead goes into "changes in state of memory devices", and as a result the total amount of heat generated is less. But you seem to be evading the question of what form this energy takes. Please answer it.
 

1. What is the thermodynamic cost of information processing?

The thermodynamic cost of information processing refers to the amount of energy required to carry out a specific computation or process in a system. This cost is determined by the laws of thermodynamics, which govern the transfer and conversion of energy.

2. How does information processing impact thermodynamics?

Information processing involves the manipulation and transmission of data, which requires energy. This energy can have thermodynamic consequences, such as increasing the entropy or disorder in a system. Therefore, information processing can affect the overall thermodynamic cost of a system.

3. Is there a limit to the amount of information processing that can be done without increasing thermodynamic cost?

Yes, there is a fundamental limit known as the Landauer limit, which states that any logically irreversible operation, such as erasing a bit, must generate a minimum amount of heat equal to kT ln 2 per bit erased, where k is the Boltzmann constant and T is the temperature of the system. This means that no irreversible computation can be done without some thermodynamic cost.

4. How does the size of a system impact the thermodynamic cost of information processing?

The Landauer limit itself depends only on temperature, not on the size of the system: it is kT ln 2 per erased bit whether the computer is large or small. In practice, however, real devices dissipate many orders of magnitude more than this bound, and how close a given technology comes to it depends on its engineering rather than on its size alone.

5. Can information processing be done without any thermodynamic cost?

Not entirely. Irreversible operations, such as erasing a bit, always dissipate at least kT ln 2 of heat. In principle, fully reversible computation can avoid even this cost, but practical computers perform irreversible operations constantly and dissipate far more than the Landauer minimum. Advancements in technology and more efficient computing systems can reduce, but not eliminate, the thermodynamic cost of information processing.
