What is the connection between Maxwell's Demon and Liouville's Theorem?

In summary, the argument makes no reference to the size of the air molecules, and assumes that the demon himself can be neglected.
  • #71
Acala said:
I don't see how this could be considered anything other than statistical, though I am certainly no expert.
I think it's fair to say that the second law is purely statistical. It is not an unbreakable law, it's a way to make predictions. The more complex the system, the more reliable the prediction, but the systems that thermodynamics are applied to are so spectacularly complex that the predictions are essentially completely reliable. Good thing too-- our lives depend on that constantly.
 
  • #72
Ah, that is a clever solution, Rap.

And thanks Ken G, that always troubled me about the concept. I can see how it's virtually completely true even if it's possible to be broken in exceedingly rare circumstances.
 
  • #73
Acala said:
And thanks Ken G, that always troubled me about the concept. I can see how it's virtually completely true even if it's possible to be broken in exceedingly rare circumstances.
Indeed, some think that the origin of our universe was just one of those exceedingly rare circumstances where the second law had a hiccup!
 
  • #74
In case anyone is interested, I have posted an attachment of a paper, "The Gibbs Paradox" by E.T. Jaynes. It seems relevant, especially because Rap posted about particle indistinguishability. Jaynes seems to take a viewpoint similar to Ken G's; namely, that entropy is largely related to the observer's knowledge and their intent in setting up an experimental apparatus.

A thermodynamic state is defined by specifying a small number of macroscopic quantities such as temperature, volume, magnetization, stress, etc. Denote them by {X1, X2, ..., Xn}, which are observed and/or controlled by the experimenter, where n is seldom greater than 4.

From this he goes on to give a number of illustrative physical examples relating to the entropy of mixing of indistinguishable and vaguely distinguishable particles, and uses these to support his thesis that entropy is not a property of the microstate. There is no paradox, because the propositions of thermodynamics relate only to pre-defined sets of macrostates and are not meant to be statements about the physical microstates. He says about the second law:

Therefore, the correct statement of the second law is not that an entropy decrease is impossible in principle, or even improbable; rather, that it cannot be achieved reproducibly by manipulating the macrovariables {X1, X2, ..., Xn} that we have chosen to define our macrostate. Any attempt to write a second law stronger than this will put one at the mercy of a trickster, who can produce a violation of it.
 

Attachments

  • gibbs.paradox.pdf
202 KB
  • #75
JDStupi said:
In case anyone is interested, I have posted an attachment of a paper, "The Gibbs Paradox" by E.T. Jaynes. It seems relevant, especially because Rap posted about particle indistinguishability. Jaynes seems to take a viewpoint similar to Ken G's; namely, that entropy is largely related to the observer's knowledge and their intent in setting up an experimental apparatus.
I don't think this is Ken G's viewpoint. Ken G, am I wrong?
 
  • #76
JDStupi said:
In case anyone is interested, I have posted an attachment of a paper, "The Gibbs Paradox" by E.T. Jaynes. It seems relevant, especially because Rap posted about particle indistinguishability. Jaynes seems to take a viewpoint similar to Ken G's; namely, that entropy is largely related to the observer's knowledge and their intent in setting up an experimental apparatus.

From this he goes on to give a number of illustrative physical examples relating to the entropy of mixing of indistinguishable and vaguely distinguishable particles, and uses these to support his thesis that entropy is not a property of the microstate. There is no paradox, because the propositions of thermodynamics relate only to pre-defined sets of macrostates and are not meant to be statements about the physical microstates. He says about the second law:

As far as I am concerned, Jaynes is up there with Boltzmann and Gibbs when it comes to understanding entropy. He is the number one contributor, in my mind, to the understanding of the relationship between thermodynamic entropy and information entropy.

His explanation of the Gibbs paradox gives a deep insight into entropy. If you have two gases separated by a partition, and they consist of identical particles, then removing the partition changes nothing: the resulting gas is in equilibrium, and the entropy is the sum of the entropies of the two gases when the partition was in. If they are different particles, no matter how small the difference, then upon removing the partition you have non-equilibrium, and once equilibrium is reached you have a net increase in entropy: the total entropy is greater than the sum of the two original entropies, and the increase is always the same.

The crucial point is that if they are different but you have no experimental ability to tell that they are different, then removing the partition changes nothing: there is no detectable disequilibrium, and no entropy change. Entropy is not only a function of the system; it's a function of what you happen to know, or choose to know, about the system.
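The partition argument above can be put into numbers with a quick sketch (my own Python illustration, using the textbook result that mixing two distinct ideal gases at equal volume, temperature, and pressure gives ΔS = k(N₁+N₂) ln 2):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def mixing_entropy(n_left, n_right, identical):
    """Entropy change when the partition between two equal-volume,
    equal-temperature gas samples is removed.

    Distinct species: each gas doubles its accessible volume, so
    dS = k * (n_left + n_right) * ln 2, no matter how small the
    difference between the species is.
    Identical (or operationally indistinguishable) particles: the
    macrostate is unchanged, so dS = 0.
    """
    if identical:
        return 0.0
    return k * (n_left + n_right) * math.log(2)

N = 6.022e23  # one mole on each side
print(mixing_entropy(N, N, identical=True))   # 0.0
print(mixing_entropy(N, N, identical=False))  # about 11.5 J/K
```

The jump from roughly 11.5 J/K for any nonzero difference to exactly zero for no detectable difference is precisely the discontinuity Jaynes dissolves by tying entropy to what the experimenter can distinguish.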

This is applicable to the concept of "correct Boltzmann counting": when you calculate the entropy of a gas assuming the particles have separate identities, you wind up with a non-extensive entropy (entropies do not add), and you have to subtract log(N!) to get the right answer. Saying that the particles are distinguishable is equivalent to taking your original gas and, instead of the two boxes of the Gibbs paradox, having N separate boxes, each containing one particle that is different in some way (i.e., distinguishable) from every other particle. Again, as in the Gibbs paradox, the entropies will not add. But you have no experimental ability to tell that the particles are different. Therefore, if you calculate entropy by assuming they are distinguishable, you have to subtract the log(N!) error introduced by that assumption. And now entropies add up (i.e., entropy is extensive).
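A toy lattice-gas model (my own sketch, not Jaynes's calculation) shows the extensivity failure directly: for N "distinguishable" particles in V cells, ln Ω = N ln V, and doubling the system should double an extensive entropy.

```python
import math

def entropy_over_k(n, v, correct_counting):
    """S/k for n particles in v cells (toy lattice gas, v >> n).

    Distinguishable counting gives Omega = v**n, i.e. ln Omega = n ln v.
    "Correct Boltzmann counting" subtracts ln(n!) for the particle
    permutations that no experiment can tell apart.
    """
    s = n * math.log(v)
    if correct_counting:
        s -= math.lgamma(n + 1)  # lgamma(n+1) = ln(n!)
    return s

# Double the system (n -> 2n, v -> 2v): an extensive entropy doubles too.
n, v = 1000, 10_000
for correct in (False, True):
    ratio = entropy_over_k(2 * n, 2 * v, correct) / entropy_over_k(n, v, correct)
    print(correct, round(ratio, 3))
```

Without the correction the ratio overshoots 2; with the ln(N!) subtracted, it comes out at 2 to within finite-size (Stirling) corrections, which is exactly the "entropies now add up" point.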

If you want a really good book on this subject, which goes through it carefully and clearly, giving many examples, check out "A Farewell to Entropy: Statistical Thermodynamics Based on Information" by Arieh Ben-Naim.
 
  • #77
lugita15 said:
I don't think this is Ken G's viewpoint. Ken G, am I wrong?
JDStupi is right-- that is very much my viewpoint. Entropy emerges once we have defined the sets of categories of states that fit what we know and what our goals are. Once we have that, we get entropy from ln(N), and then we have the second law, which simply states that less populated categories will give way to more populated ones over time, a progression that becomes more reliable the larger the systems involved. We also get the condition for processes that occur "spontaneously", which is everything that happens in thermodynamics: Nf > Ni. It might not be quite tautological, but it is certainly simple and logical. The difficulty is in keeping track of what N is for things like Demons, and that's a worthwhile exercise along the lines of what Rap is trying to do. But if we want to know how the Demon works, the issue isn't really whether the second law will hold, but why it must hold, or else the Demon will work in reverse.
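The category-counting picture (entropy from ln(N), the second law as Nf > Ni) can be illustrated with a toy example of my own: N particles, each independently on the left or right half of a box, where the macrostate "n on the left" contains C(N, n) microstates.

```python
from math import comb, log

# Toy model: 100 particles, each on the left or right half of a box.
# A macrostate is "n_left particles on the left"; it contains
# C(N, n_left) microstates, and entropy/k = ln of that count.
N = 100
for n_left in (0, 25, 50):
    omega = comb(N, n_left)           # microstates in this macrostate
    print(n_left, omega, log(omega))  # entropy/k grows toward the balanced split
```

The balanced macrostate utterly dominates the count, so a system started in a lopsided category overwhelmingly evolves toward the balanced one: Nf > Ni, and the larger N is, the more reliable the "law" becomes.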
 
  • #78
According to a March 8, 2012 article in Nature (see citation below), the "idea of a connection between information and thermodynamics can be traced back to Maxwell's 'demon'", and the Landauer principle, which helped to resolve the paradox, has finally been experimentally verified.

According to the Nature article, "The paradox of the apparent violation of the second law can be resolved by noting that during a full thermodynamic cycle, the memory of the demon, which is used to record the coordinates of each molecule, has to be reset to its initial state [11, 12]. Indeed, according to Landauer's principle, any logically irreversible transformation of classical information is necessarily accompanied by the dissipation of at least kT ln(2) of heat per lost bit (about 3 × 10⁻²¹ J at room temperature (300 K)), where k is the Boltzmann constant and T is the temperature."
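As a sanity check on the quoted figure (simple arithmetic, not from the article itself):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def landauer_bound(temperature_kelvin, bits=1):
    """Minimum heat dissipated when erasing `bits` of information,
    per Landauer's principle: Q >= bits * k * T * ln 2."""
    return bits * k * temperature_kelvin * math.log(2)

print(landauer_bound(300.0))  # ~2.87e-21 J per bit at 300 K
```

which matches the "about 3 × 10⁻²¹ J" per bit quoted above for room temperature.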

See: Antoine Bérut, et al., "Experimental verification of Landauer’s principle linking information and thermodynamics" Nature 483, 187–189 (08 March 2012)
http://phys.org/news/2012-03-landauer-dissipated-memory-erased.html
 
  • #79
From JDStupi in post #74 on the possibility of a Maxwell's Demon, originally from "The Gibbs Paradox" by E.T. Jaynes:
Therefore, the correct statement of the second law is not that an entropy decrease is impossible in principle, or even improbable; rather, that it cannot be achieved reproducibly by manipulating the macrovariables {X1, X2, ..., Xn} that we have chosen to define our macrostate. Any attempt to write a second law stronger than this will put one at the mercy of a trickster, who can produce a violation of it.
Simon van der Meer of CERN, in his 1984 Nobel Prize in Physics lecture, explains how he used RF electronics to circumvent Liouville's Theorem (see http://en.wikipedia.org/wiki/Liouville's_theorem_(Hamiltonian)) and reduce the phase-space volume of a relativistic antiproton beam. See van der Meer's lecture (http://ki1.nobel.ki.se/nobel_prizes/physics/laureates/1984/meer-lecture.pdf), following Equation (1) on page 294.
 