Entropy is disorder = outmoded theory?

In summary: the entropy of a system is not a measure of how much disorder or chaos exists in that system, but of the energy dispersal within that system. It is not clear whether Frank Lambert agrees with Peter Atkins on this point.
  • #36
andrebourbaki said:
Prof. Lambert is just confused. As an educator, he has found a distorted, simplified presentation that makes things easier for his students, but it is in itself even more confused than anything he criticises about "disorder". He switches between dispersal of energy over physical volume and dispersal of energy over microstates, apparently unaware that a system can only be entirely in or entirely out of a microstate, so that all of its energy is in that microstate.
Prof. Lambert is not trying to redefine entropy. He is just trying to help students understand it.

The tendency of the universe toward increase in entropy can be thought of as a tendency toward macrostates for which the number of equivalent microstates increases. Unless one defines disorder in a special way, it is difficult to see how this can be described as a tendency toward disorder.
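To make the counting concrete, here is a minimal sketch (mine, not from any post in this thread): N distinguishable particles in a box, with the macrostate taken to be the number of particles in the left half. The multiplicity peaks overwhelmingly at the even split.

```python
# A minimal sketch: multiplicity of the macrostate "k of N particles
# on the left half of the box" is C(N, k), out of 2**N microstates.
from math import comb

N = 100
total = 2 ** N

for k in (0, 25, 50, 75, 100):
    W = comb(N, k)
    print(f"k = {k:3d}   W = {W:.3e}   P = {W / total:.3e}")
# The k = 50 macrostate carries ~1e29 times the weight of k = 0: the
# "tendency toward disorder" is a tendency toward the most numerous
# macrostate, nothing more.
```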

A cup of boiling water poured over an iceberg: the disorder in the cup of boiling water has decreased and we end up with more ice. Has disorder increased? Has energy dispersed? Have the system and surroundings assumed a state in which the number of microstates equivalent to that macrostate has increased?
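To put rough numbers on this (a back-of-the-envelope sketch with textbook constants; the 1 kg of water and treating the iceberg as a reservoir at 273 K are my assumptions):

```python
# Rough sketch: 1 kg of boiling water poured over a large iceberg,
# which we treat as a heat reservoir at 273 K.
from math import log

m  = 1.0       # kg of water
c  = 4186.0    # J/(kg K), specific heat of liquid water
Lf = 3.34e5    # J/kg, latent heat of fusion
T_hot, T_ice = 373.0, 273.0

# Entropy change of the water: cooling 373 K -> 273 K, then freezing.
dS_water = m * c * log(T_ice / T_hot) - m * Lf / T_ice

# Heat absorbed by the iceberg at (approximately) constant 273 K.
Q = m * c * (T_hot - T_ice) + m * Lf
dS_iceberg = Q / T_ice

print(f"dS_water   = {dS_water:8.1f} J/K")   # negative: the water is 'more ordered'
print(f"dS_iceberg = {dS_iceberg:8.1f} J/K") # positive, and larger
print(f"dS_total   = {dS_water + dS_iceberg:8.1f} J/K  (> 0)")
```

The water's entropy drops by about 2530 J/K, but the iceberg gains about 2760 J/K, so the total still increases, as the second law requires.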

And his idea of dispersal over physical position is obviously false: two states can have their local energy content completely, i.e., uniformly dispersed over the same volume but have different entropies.
I don't follow you there. Can you provide an example?

AM
 
  • #37
I looked up Atkins' 8th edition on Amazon, and he links dispersal of energy and disorder. I've never heard of "dispersal of energy", but it seems ok to me.

It seems compatible with this analogy I like from Kardar's notes: "This information, however, is inevitably transported to shorter scales. A useful image is that of mixing two immiscible fluids. While the two fluids remain distinct at each point, the transitions in space from one to the next occur at finer resolution on subsequent mixing. At some point, a finite resolution in any measuring apparatus will prevent keeping track of the two components."

As information is lost because of the limited resolution of our measuring apparatus, many different microstates (the precise positions of the two liquids) will be compatible with the macrostate (the reading indicated by our limited resolution measuring apparatus). So as entropy increases, we have less and less information about the precise position of things, so things seem more "disordered" to us in that sense.
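Here is a toy numerical version of that picture (my own construction; the lattice, cell size, and swap dynamics are invented for illustration, not taken from Kardar): two "fluids" of 0s and 1s on a line, stirred by random local swaps. At full resolution nothing is lost, but an observer who only sees the fraction of 1s in each coarse cell loses information as the interface spreads.

```python
# Toy coarse-graining demo: stirring two 'immiscible fluids' (0s and 1s)
# on a 1-D lattice, then measuring the mean binary entropy of the
# coarse-grained cell compositions.  H rises as mixing proceeds.
import random
from math import log

random.seed(0)
n, cell = 1024, 32
state = [0] * (n // 2) + [1] * (n // 2)     # fully separated to start

def coarse_entropy(s):
    """Mean binary entropy of the coarse-grained cell compositions."""
    H = 0.0
    for i in range(0, n, cell):
        p = sum(s[i:i + cell]) / cell       # fraction of fluid '1' in the cell
        if 0 < p < 1:
            H -= p * log(p) + (1 - p) * log(1 - p)
    return H / (n // cell)

print(f"before stirring: H = {coarse_entropy(state):.3f}")
for r in range(4):
    for _ in range(20000):                  # stirring = random local swaps
        j = random.randrange(n - 1)
        state[j], state[j + 1] = state[j + 1], state[j]
    print(f"after {(r + 1) * 20000} swaps: H = {coarse_entropy(state):.3f}")
```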
 
  • #38
atyy said:
I looked up Atkins' 8th edition on Amazon, and he links dispersal of energy and disorder. I've never heard of "dispersal of energy", but it seems ok to me.

It seems compatible with this analogy I like from Kardar's notes: "This information, however, is inevitably transported to shorter scales. A useful image is that of mixing two immiscible fluids. While the two fluids remain distinct at each point, the transitions in space from one to the next occur at finer resolution on subsequent mixing. At some point, a finite resolution in any measuring apparatus will prevent keeping track of the two components."

As information is lost because of the limited resolution of our measuring apparatus, many different microstates (the precise positions of the two liquids) will be compatible with the macrostate (the reading indicated by our limited resolution measuring apparatus). So as entropy increases, we have less and less information about the precise position of things, so things seem more "disordered" to us in that sense.

Equating "energy dispersal" with entropy has problems as well. It is more accurate to think of an increase in entropy as a change in thermodynamic state (macro state) for which the number of equivalent microstates increases. This better explains the Gibbs paradox regarding mixing of gases which the "energy dispersal" concept does not do so well.

Frank Lambert's opposition to the "disorder" approach is tied to his opposition to linking or equating "thermodynamic entropy" with Shannon's concept of entropy based on information theory. In my view, while there is an interesting mathematical similarity, the two entropies do seem to relate to very different things. However, there are different views on this.

In any event, I think that the "energy dispersal" concept is better and easier to grasp than a concept of "disorder" or a concept based on information theory. It may also be more intuitive than a concept based on microstates.

When energy spreads out, entropy increases. When energy becomes more concentrated, entropy decreases. Fundamentally, however, the second law is a statistical law. It says essentially that nature does not tend toward physically possible but statistically improbable configurations. So while we can think about it as having to do with energy dispersal, it is really about probability. As long as we realize that, it seems to me that "energy dispersal" is one way to conceptualize entropy.
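To see the sort of improbability involved (a back-of-envelope sketch, my numbers): the chance that all N gas molecules spontaneously occupy the left half of their box is (1/2)^N.

```python
# The 'statistically improbable configuration' in question: probability
# that all N molecules happen to be in the left half of the box.
from math import log10

for N in (10, 100, 1000, 6.022e23):
    log_p = -N * log10(2)
    print(f"N = {N:.3g}: P = 10**({log_p:.3g})")
# For a mole of gas, P = 10**(-1.8e23): not forbidden, just never seen.
```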

AM
 
  • #39
Andrew Mason said:
Equating "energy dispersal" with entropy has problems as well. It is more accurate to think of an increase in entropy as a change in thermodynamic state (macro state) for which the number of equivalent microstates increases. This better explains the Gibbs paradox regarding mixing of gases which the "energy dispersal" concept does not do so well.

Frank Lambert's opposition to the "disorder" approach is tied to his opposition to linking or equating "thermodynamic entropy" with Shannon's concept of entropy based on information theory. In my view, while there is an interesting mathematical similarity, the two entropies do seem to relate to very different things. However, there are different views on this.

In any event, I think that the "energy dispersal" concept is better and easier to grasp than a concept of "disorder" or a concept based on information theory. It may also be more intuitive than a concept based on microstates.

When energy spreads out, entropy increases. When energy becomes more concentrated, entropy decreases. Fundamentally, however, the second law is a statistical law. It says essentially that nature does not tend toward physically possible but statistically improbable configurations. So while we can think about it as having to do with energy dispersal, it is really about probability. As long as we realize that, it seems to me that "energy dispersal" is one way to conceptualize entropy.

The Shannon entropy concept is exactly the same as the one based on microstates in statistical physics. Both are basically the "number of states" or "number of possibilities". In information theory, these go by the name of "typical sequences", and the relevant theorem, "asymptotic equipartition", is the rigorous version of what physicists learn. The Shannon mutual information is then basically the change in entropy. Gaining information is a reduction in entropy, which makes sense: if one is certain, the number of possibilities is reduced from many to only one.
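A small numerical illustration of that counting (my sketch; the bias p = 0.1 and n = 1000 are arbitrary choices): for n flips of a biased coin, essentially all the probability sits on roughly 2^(nH) "typical" sequences out of 2^n in total, where H is the Shannon entropy per flip.

```python
# Typical sequences: nearly all probability mass lies on ~2**(n*H)
# sequences, where H is the Shannon entropy of one flip in bits.
from math import comb, log2

p, n = 0.1, 1000
H = -(p * log2(p) + (1 - p) * log2(1 - p))   # ~0.469 bits per flip

# Probability captured by sequences whose head-count k is within about
# two standard deviations of the mean n*p = 100:
mass = sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(80, 121))

print(f"H = {H:.3f} bits -> roughly 2**{n*H:.0f} typical sequences out of 2**{n}")
print(f"probability captured by near-typical sequences: {mass:.4f}")
```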

The microstate definition is important in attempts like Boltzmann's use of kinetic theory to understand how irreversibility can arise from reversible laws of dynamics, and in the basic idea that information is lost either through the limited resolution of our measuring instruments or, in the Landauer explanation, through the erasure of information forced by finite memory.

I do agree that it is important to learn the classical equilibrium thermodynamic concept of entropy, because it comes from the Kelvin and Clausius statements of the second law, which are basically experimental facts. And this definition can be used independently of the microstate definition.

However, if Lambert opposes the Shannon definition, then he is also erroneously opposing the microstate definition.
 
  • #40
Entropy, disorder, and dispersion

Thermodynamic entropy is a precisely defined concept. Informational entropy is the statistical-mechanics concept of entropy, first introduced by Boltzmann, refined by Gibbs, and rediscovered and applied to wider fields by Shannon. It seems to be a different concept, but Boltzmann was able, with difficulty, to prove that it is very closely related to thermodynamic entropy, provided one assumes the Stosszahlansatz or, as it is often called in English, 'molecular chaos', a hypothesis which is approximately true.
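For reference, the two standard forms being identified here (textbook formulas, not quoted from any post):

```latex
% Boltzmann: W microstates compatible with the macrostate, equally weighted
S = k_B \ln W
% Gibbs: a distribution p_i over microstates; Shannon's H is the same
% expression with k_B \to 1 and \ln \to \log_2
S = -k_B \sum_i p_i \ln p_i
```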

The intuitive concept of 'disorder' is the normal way to motivate the definition of entropy, but 'disorder' cannot really be given a precise definition except by using informational entropy: the number of micro-states compatible with our knowledge that the system is in a given macro-state.
The increasing disorder in a deck of cards produced by shuffling is a traditional example used to teach students the statistical mechanical definition of entropy; it goes back at least to Sir James Jeans in 1900. Prof. Lambert and a few other fringe figures seem to be allergic to the deck-of-cards metaphor. It is only a metaphor, but they have carried out a crusade against the use of the concept of 'disorder', even though it has proven useful in the discussion of lambda phase transitions.
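For what the metaphor is worth, the numbers are easy to compute (my sketch): a fully shuffled deck is one of 52! equally likely orderings, so the information needed to specify "which ordering?" is log2(52!) bits, while a sorted deck needs none.

```python
# The deck-of-cards metaphor in numbers: entropy of a full shuffle.
from math import factorial, log2

W = factorial(52)
print(f"52! = {W:.3e} orderings")
print(f"entropy of a full shuffle: {log2(W):.1f} bits")  # ~225.6 bits
```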

The intuitive concept of 'energy dispersal' is preferred by some of these fringe figures, but their claims that it has been adopted by Atkins's eighth edition of his standard textbook are false. (And who knows how many of their other claims about it being adopted in texts are false or exaggerated.) What Atkins actually says is very sensible, so I reproduce it here.

"The concept of the number of microstates makes quantitative the ill-defined qualitative concepts of 'disorder' and 'the dispersal of matter and energy' that are used widely to introduce the concept of entropy: a more 'disorderly' distribution of energy and matter corresponds to a greater number of microstates associated with the same total energy." --- p. 81

"This increase in entropy is what we expect when one gas disperses into the other and the disorder increases." --- p. 143

On the other hand, it would be easy enough to make the intuitive concept of 'dispersal of energy' precise and quantitative, but then, as far as I can tell, it disagrees with the precise and quantitative definition of entropy. Suppose, for the sake of discussion (I do not know whether this fringe group has ever proposed a precise measure), we take 'dispersal' to mean the standard deviation of the spatial distribution of energy (assuming, for simplicity, that the density of matter is fixed, constant, and uniform). (I am well aware this gives it the wrong units...)
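Here is a sketch of why that candidate measure disagrees with entropy (my own construction; the monatomic ideal gas and the two temperatures are assumptions chosen for illustration): take the same gas in the same box at two temperatures. Both states are spatially uniform, so any purely spatial dispersal measure, standard deviation included, gives the same number for both, yet the entropies differ.

```python
# Counterexample to 'entropy = spatial dispersal of energy': an ideal
# gas at two temperatures, both spatially uniform in the same box.
from math import log

n, R = 1.0, 8.314           # mol, J/(mol K)
Cv = 1.5 * R                # monatomic ideal gas heat capacity
T1, T2 = 300.0, 600.0

# Spatially uniform energy density in both states, so the 'standard
# deviation of the spatial energy distribution' is identical for both.
dS = n * Cv * log(T2 / T1)  # entropy difference at fixed volume
print(f"dS = {dS:.1f} J/K, though the 'spatial dispersal' is unchanged")
```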
 
