Entropy and information: Do physicists still believe so?

In summary, physicists hold that obtaining information about the microstates of a system can decrease its entropy, while erasing that information increases it. This apparent paradox stood unresolved for decades until the study of reversible computing settled it (Landauer, 1961; Bennett, 1982), results that later proved important for quantum computing as well.
  • #1
LTP
The entropy of a system is
S = k*ln W
If we obtain some information about the microstates, e.g. learn the locations and velocities of some of the molecules, then W is decreased by an amount w, so
S1 = k*ln(W - w)
That is a decrease in entropy, i.e. S1 < S.

Do physicists still "believe" so?
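A quick numerical illustration of the claimed drop (a sketch only; W and w are arbitrary example counts, not values for any real system):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

W = 10**6         # total number of accessible microstates (arbitrary example)
w = 9 * 10**5     # microstates ruled out by the information we gained

S = k * math.log(W)        # S  = k ln W
S1 = k * math.log(W - w)   # S1 = k ln(W - w)

# Ruling out microstates (w > 0) always shrinks the count, so S1 < S
print(S1 < S)
```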
 
  • #2
LTP said:
The entropy of a system is
S = k*ln W
If we obtain some information about the microstates, e.g. learn the locations and velocities of some of the molecules, then W is decreased by an amount w, so
S1 = k*ln(W - w)
That is a decrease in entropy, i.e. S1 < S.

Do physicists still "believe" so?

Yes, at least for the von Neumann entropy. The point is that to "know" this microstate information means you've limited the number of states the system can be in: only those microstates compatible with your information remain, and the only way to acquire that information is to somehow force the system into such a subset of states.
In practice this makes no difference: for a statistically significant number of particles (the regime where thermodynamics starts to make sense), the amount of knowledge you would need to gather before it measurably affects the numerical value of the entropy is so huge that it is not feasible.

The reason why this lowers entropy is that you can USE that information to go to a "real" lower-entropy state.

Imagine a gas in a box, a mixture of two components A and B. Now, you know that the entropy of this mixture is higher than if component A were confined to the left half of the box and component B to the right half (possibly with a wall in between). The difference is the entropy of mixing.
But imagine that somehow we know the microstates of motion of all the A and B particles. Keep the wall in the box, but this time with a tiny shutter. We could use all that information to play "Maxwell's demon": open the shutter each time we KNOW that a molecule A is heading left, or a molecule B is heading right.
So we could go back to the unmixed state without (with an ideal demon) spending any "entropy" elsewhere.
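The demon's bookkeeping can be sketched in a toy model (hypothetical 1-D setup: each particle is just a type and a side of the box, and the demon, knowing every particle's state, opens the shutter only when passage un-mixes the gas):

```python
import random

random.seed(1)

# Toy model: a particle is a [type, side] pair; everything starts mixed.
particles = [[random.choice("AB"), random.choice("LR")] for _ in range(1000)]

def shutter_open(ptype, side):
    """The demon lets a particle cross only if that moves an A toward
    the left half or a B toward the right half -- never the reverse."""
    return (ptype == "A" and side == "R") or (ptype == "B" and side == "L")

# Each particle eventually approaches the wall; the demon decides each time.
for p in particles:
    if shutter_open(p[0], p[1]):
        p[1] = "L" if p[1] == "R" else "R"

print("all A left, all B right:",
      all(side == ("L" if t == "A" else "R") for t, side in particles))
```

Of course the toy model does no work on the gas at all; the real question, taken up below, is what the demon's information handling costs.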
 
  • #3
vanesch said:
Yes, at least for von Neumann entropy.
But also in "classical" entropy?

vanesch said:
Imagine a gas in a box, a mixture of two components A and B. Now, you know that the entropy of this mixture is higher than if component A were confined to the left half of the box and component B to the right half (possibly with a wall in between). The difference is the entropy of mixing.
But imagine that somehow we know the microstates of motion of all the A and B particles. Keep the wall in the box, but this time with a tiny shutter. We could use all that information to play "Maxwell's demon": open the shutter each time we KNOW that a molecule A is heading left, or a molecule B is heading right.
So we could go back to the unmixed state without (with an ideal demon) spending any "entropy" elsewhere.
So what is preventing us from being such a demon? Do you imply that the act of information acquisition increases the entropy?
 
  • #4
LTP said:
So what is preventing us from being such a demon? Do you imply that the act of information acquisition increases the entropy?

No; as it turns out, it is the act of ERASING information that increases the entropy, by an amount k_B ln 2 per bit. This can actually be shown experimentally. There is a whole field called "reversible computing" in which people study circuits that never erase information, which does reduce energy consumption somewhat.
The field has also become increasingly important over the past decade or so, since it turns out that quantum computers must be reversible in order to work, meaning many results from reversible computing apply to quantum computing as well.
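A minimal sketch of what "never erasing" means, using the Toffoli (controlled-controlled-NOT) gate, the standard reversible replacement for AND (the bit values below are just illustrative):

```python
def toffoli(a, b, c):
    """Reversible: flips c iff a AND b; all three inputs survive in the output,
    so the gate is its own inverse and no information is discarded."""
    return a, b, c ^ (a & b)

def irreversible_and(a, b):
    """Irreversible: two input bits in, one bit out -- a bit is erased,
    which by Landauer's argument costs at least k_B T ln 2 of dissipation."""
    return a & b

# With c = 0 the Toffoli's third output IS a AND b, yet nothing was erased:
a, b, c = toffoli(1, 1, 0)
print(c)                            # 1, i.e. 1 AND 1
print(toffoli(*toffoli(1, 1, 0)))   # applying it twice undoes it: (1, 1, 0)
```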

The "Maxwell's demon" paradox was one of the great unsolved mysteries in physics until it was solved about 80 years ago.
 
  • #5
f95toli said:
No; as it turns out, it is the act of ERASING information that increases the entropy, by an amount k_B ln 2 per bit.
I agree.

f95toli said:
This can actually be shown experimentally.
Really? - How? AFAIK "normal" computers don't come anywhere near that lower limit.
f95toli said:
The "Maxwell's demon" paradox was one of the great unsolved mysteries in physics until it was solved about 80 years ago.
More likely around 40 years: Landauer published his paper in 1961, and Bennett published his resolution of Maxwell's demon in 1982.
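For scale, here is the Landauer bound at room temperature next to a ballpark figure for a conventional logic gate (the CMOS number is an order-of-magnitude illustration, not a measured value):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300              # room temperature, K

# Minimum dissipation for erasing one bit: k_B * T * ln 2
landauer = k_B * T * math.log(2)
print(f"Landauer limit: {landauer:.2e} J/bit")   # ~2.87e-21 J

cmos_switch = 1e-15  # rough switching energy of a conventional gate (illustrative)
print(f"conventional logic is roughly {cmos_switch / landauer:.0e}x above the limit")
```

So "normal" computers are indeed many orders of magnitude away from the bound, which is why single-bit erasure experiments are so delicate.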
 

1. What is entropy and how does it relate to information?

Entropy is a measure of the disorder or randomness in a system. In physics, it is often used to quantify the amount of energy that is unavailable to do work. In information theory, entropy measures the uncertainty in a message or data source: the less we know about the outcome, the higher the entropy. Gaining information about a system therefore lowers its entropy, which is the connection discussed in the thread above.
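The information-theoretic version can be computed directly; a sketch using a hypothetical four-symbol source:

```python
import math

def shannon_entropy(probs):
    """H = -sum p log2 p, in bits; measures uncertainty about the outcome."""
    return 0.0 - sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits: maximal uncertainty
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits: outcome fully known
```

Knowing which symbol will occur (the second case) drives the entropy to zero, mirroring the microstate argument in post #1.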

2. Is entropy considered a fundamental concept in physics?

Yes, entropy is considered a fundamental concept in physics. It is a key principle in the second law of thermodynamics, which states that the total entropy of a closed system will always increase over time. This law has been observed and tested extensively in various physical systems, making entropy a crucial concept in understanding the behavior of the universe.

3. Are there any recent developments or discoveries in the study of entropy and information?

Yes, there have been many recent developments and discoveries in the study of entropy and information. One area of interest is quantum information theory, which explores the relationship between information and quantum mechanics. Another recent development is the discovery of the holographic principle, which suggests that the information in a three-dimensional space can be encoded in a two-dimensional surface.

4. Do all physicists agree on the concept of entropy?

While the concept of entropy is widely accepted among physicists, there are still ongoing debates and discussions about its interpretation and application in different fields. Some physicists argue that entropy is a statistical concept and does not have a physical existence, while others consider it to be a fundamental property of the universe.

5. How does the concept of entropy apply to everyday life?

Entropy can be seen in many aspects of everyday life. For example, in cooking, as heat is applied to a system (such as a pot of water), the molecules become more disordered and the entropy increases. In information technology, the storage and transmission of data are subject to the laws of entropy, as information can be lost or corrupted over time. In general, the concept of entropy helps us understand the natural tendency of systems to become more disordered over time.
