Entropy and information: Do physicists still believe so?

SUMMARY

The discussion centers on the relationship between entropy and information, viewed through the lens of von Neumann entropy. It is established that acquiring information about a system's microstates decreases the number of accessible states, reducing the entropy as described by the equation S1 = k*ln(W-w). The act of erasing information, however, increases entropy by k_B ln 2 per bit (Landauer's principle), which has implications for reversible computing and quantum computing. The "Maxwell's demon" paradox, which turns on this connection, began to be resolved with Landauer's work in 1961, clarifying the link between information and thermodynamic principles.

PREREQUISITES
  • Understanding of thermodynamic principles, particularly entropy
  • Familiarity with von Neumann entropy and its mathematical representation
  • Knowledge of reversible computing concepts
  • Basic principles of quantum computing and its requirements
NEXT STEPS
  • Research the implications of Landauer's principle in information theory
  • Explore the concept of reversible computing and its applications
  • Study the historical context and resolution of the Maxwell's demon paradox
  • Investigate the role of entropy in quantum computing and its operational requirements
USEFUL FOR

Physicists, computer scientists, and researchers in thermodynamics and quantum computing will benefit from this discussion, particularly those interested in the intersection of information theory and entropy.

LTP
The entropy of a system is
S = k*ln W
If we obtain some information about the microstates, e.g. the positions and velocities of some of the molecules, then W is decreased by an amount w, so
S1 = k*ln(W-w)
That is a decrease in entropy, i.e. S1 < S.

Do physicists still "believe" so?
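As a quick numerical sketch of the claim above (the values chosen for W and w are purely illustrative assumptions, not from the thread):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Hypothetical numbers purely for illustration: W accessible microstates,
# of which w are ruled out by the information we acquire.
W = 1e6
w = 4e5

S = k_B * math.log(W)        # entropy before acquiring information
S1 = k_B * math.log(W - w)   # entropy after: only W - w states remain

assert S1 < S  # fewer accessible states means lower entropy
print(f"dS = {S1 - S:.3e} J/K")  # negative: entropy decreased
```

The difference S - S1 = k_B ln(W/(W-w)) only becomes numerically noticeable when w is a sizable fraction of W, which for macroscopic systems would require an astronomical amount of information.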
 
LTP said:
The entropy of a system is
S = k*ln W
If we obtain some information about the microstates, e.g. the positions and velocities of some of the molecules, then W is decreased by an amount w, so
S1 = k*ln(W-w)
That is a decrease in entropy, i.e. S1 < S.

Do physicists still "believe" so?

Yes, at least for von Neumann entropy. The point is that to "know" this microstate information means that you have limited the number of states the system can be in: the system is now restricted to those microstates compatible with your information. The only way to know that information is to force the system, somehow, into a subset of states.
In practice this makes no difference: for a statistically significant number of particles (the point at which doing thermodynamics starts to make sense), the amount of knowledge you would need to gather before it influences the numerical value of the entropy is so huge that gathering it is not feasible.

The reason why this lowers entropy is that you can USE that information to go to a "real" lower-entropy state.

Imagine a gas in a box which is a mixture of two components A and B. Now, you know that the entropy of this mixture is higher than if component A were in the left half of the box and component B in the right half (possibly with a wall in between): the difference is the entropy of mixing.
But imagine that somehow we knew the microstates of motion of all the A and B particles. Imagine the box still has a wall, but this time with a tiny shutter. We could use all that information to pilot a "Maxwell's demon", opening the shutter each time we KNOW that an A molecule is heading left or a B molecule is heading right.
So we could go back to the state before the mixing, without (with an ideal demon) spending any "entropy" elsewhere.
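The piloting described above can be sketched as a toy simulation. The 1-D geometry, particle counts, and speeds below are my own illustrative assumptions, not part of the thread; the point is only that a demon with full microstate knowledge sorts the mixture by doing nothing but opening and closing a shutter.

```python
import random

random.seed(1)

# Toy 1-D box on [-1, 1] with a wall at x = 0 containing a shutter.
# The demon knows every particle's species and velocity (the microstate
# information from the text) and opens the shutter only when that sorts
# the gas: species 'A' passing to the left half, 'B' to the right half.

N = 200
particles = [{"species": random.choice("AB"),
              "x": random.uniform(-1, 1),
              "v": random.choice([-0.01, 0.01])} for _ in range(N)]

def step(p):
    new_x = p["x"] + p["v"]
    crossing = (p["x"] < 0) != (new_x < 0)   # would cross the central wall
    if crossing:
        moving_left = p["v"] < 0
        wants_left = p["species"] == "A"
        if moving_left == wants_left:        # demon opens the shutter
            p["x"] = new_x
        else:                                # shutter stays closed: bounce
            p["v"] = -p["v"]
    else:
        p["x"] = new_x
        if abs(p["x"]) > 1:                  # reflect off the outer walls
            p["v"] = -p["v"]
            p["x"] = max(-1.0, min(1.0, p["x"]))

for _ in range(1000):
    for p in particles:
        step(p)

sorted_ok = all((p["species"] == "A") == (p["x"] < 0) for p in particles)
print("fully sorted:", sorted_ok)
```

Each particle keeps its speed throughout, so the demon never does work on the gas; it only exploits what it already knows, which is exactly why the apparent entropy decrease is paradoxical.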
 
vanesch said:
Yes, at least for von Neumann entropy.
But does the same hold for "classical" entropy?

vanesch said:
Imagine a gas in a box which is a mixture of two components A and B. Now, you know that the entropy of this mixture is higher than if component A were in the left half of the box and component B in the right half (possibly with a wall in between): the difference is the entropy of mixing.
But imagine that somehow we knew the microstates of motion of all the A and B particles. Imagine the box still has a wall, but this time with a tiny shutter. We could use all that information to pilot a "Maxwell's demon", opening the shutter each time we KNOW that an A molecule is heading left or a B molecule is heading right.
So we could go back to the state before the mixing, without (with an ideal demon) spending any "entropy" elsewhere.
So what is preventing us from being such a demon? Are you implying that the act of acquiring information increases the entropy?
 
LTP said:
So what is preventing us from being such a demon? Are you implying that the act of acquiring information increases the entropy?

No; as it turns out, it is the act of ERASING information that increases the entropy, by an amount k_B ln 2 per bit. This can actually be shown experimentally. There is a whole field called "reversible computing" in which people study circuits that never erase information, which actually reduces the energy consumption somewhat.
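The k_B ln 2 figure translates into a concrete energy scale. A minimal sketch (the temperature and the CPU figures in the comment are my own assumed numbers for scale, not from the thread):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K (illustrative choice)

# Landauer's principle: erasing one bit at temperature T dissipates
# at least k_B * T * ln 2 of heat into the environment.
E_min = k_B * T * math.log(2)
print(f"minimum heat per erased bit at {T} K: {E_min:.3e} J")

# For scale (assumed figures): even erasing 1e9 bits per cycle at 3 GHz
# would dissipate only ~1e-2 W at this bound. Real chips dissipate
# orders of magnitude more per bit, which is why ordinary computers sit
# nowhere near the Landauer limit.
```

This is why the bound is hard to observe directly: single-bit experiments with very small, slow systems are needed to get dissipation down to this scale.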
It is also a field that has become increasingly important over the past decade or so, since it turns out that quantum computers must be reversible in order to work; many results from reversible computing therefore also apply to quantum computing.
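The reversibility requirement can be illustrated with the Toffoli (controlled-controlled-NOT) gate, a standard reversible gate. The sketch below is my own illustration, not from the thread: it checks that the gate destroys no information and can still compute AND.

```python
from itertools import product

def toffoli(a, b, c):
    """Controlled-controlled-NOT: flips target bit c iff a and b are both 1."""
    return (a, b, c ^ (a & b))

# Reversible: applying the gate twice restores every input, so no
# information is ever erased (unlike a plain AND gate, which maps four
# input pairs onto two outputs and must discard a bit).
for bits in product((0, 1), repeat=3):
    assert toffoli(*toffoli(*bits)) == bits

# With the target initialized to 0, the gate computes AND reversibly:
# the inputs a and b are carried along instead of being thrown away.
for a, b in product((0, 1), repeat=2):
    assert toffoli(a, b, 0)[2] == (a & b)

print("Toffoli is its own inverse and computes AND reversibly")
```

Because nothing is erased, a circuit built from such gates avoids the mandatory k_B T ln 2 dissipation per bit, and the same structure is what quantum circuits require.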

The "Maxwell's demon" paradox was one of the great unsolved mysteries in physics until it was solved about 80 years ago.
 
f95toli said:
No; as it turns out, it is the act of ERASING information that increases the entropy, by an amount k_B ln 2 per bit.
I agree.

f95toli said:
This can actually be shown experimentally.
Really? How? AFAIK "normal" computers don't come anywhere near that lower limit.
f95toli said:
The "Maxwell's demon" paradox was one of the great unsolved mysteries in physics until it was solved about 80 years ago.
More likely 40 years: Landauer published his paper in 1961, and Bennett published his resolution of Maxwell's demon in 1982.
 
