Entropy is disorder = outmoded theory?

Summary
The discussion centers on the evolving understanding of entropy, particularly the shift from viewing it as a measure of disorder to interpreting it as a measure of energy dispersal. The recent editions of Peter Atkins' "Physical Chemistry" have reduced references to disorder, aligning more closely with Frank Lambert's perspective on entropy. Participants express confusion regarding the implications of this change and the clarity of definitions surrounding entropy. While some argue that equating entropy with disorder is limited, others highlight its relevance in various scientific contexts, including thermodynamics and information theory. The conversation underscores the complexity of entropy and the need for precise definitions in scientific discourse.
  • #31


Andy Resnick said:
I suppose you could solve the problem that way, but it's much easier for this problem to think in terms of the buoyancy force.
I guess you could. My mindset was on micro-states, but you certainly don't have to think in terms of micro-states to solve it.
 
  • #32


Naty1 said:
Whoa!
entropy as a measure of disorder has WIDE application...


From a paper titled "Entropy" by Frank Lambert:

Quote from the paper:

"The definition, "entropy is disorder", used in all US first-year college and university textbooks prior to 2002, has been deleted from 15 of 16 new editions or new texts published since 2002[2]. Entropy is not ‘disorder’ [3] nor is entropy change a change from order to disorder."


Link to cited paper:

http://docs.google.com/viewer?a=v&q...0nO6kA&sig=AHIEtbQt85_upRLNPdIu3SnPB8k7sGkGCg


The terms "computer science" and "information science" are just as amorphous and undefinable as the term "consciousness." Just sayin'. :)

-s
 
  • #33


Andy Resnick said:
Personally, I see the entropy of a state as that amount of energy unavailable to perform useful work.
While the change in entropy is related to the amount of heat energy that is not available to perform useful work, one has to be careful in the language used.

First of all, your statement might confuse people into thinking that entropy is a measure of energy. It isn't.

It also might lead students to believe that the "energy unavailable to perform useful work" is a function of the thermodynamic state of a substance. Rather, the "energy unavailable to perform useful work" is a function of the difference between two states.

Finally, this statement might lead students to believe that "energy unavailable to perform useful work" is proportional to entropy or to a change in entropy. If one defines it as the difference between the maximum work theoretically obtainable from a (reversible) process between two thermodynamic states and the work actually obtained from a process between those same two states (sometimes referred to as Lost Work), the statement may be correct for some processes (those occurring at constant temperature). Even then it would be misleading, because this Lost Work is not all of the energy that is unavailable to perform useful work; it is only part of the energy that cannot be converted to work.
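To make the constant-temperature case concrete: for a process between the same two end states, exchanging heat with a single reservoir at temperature $T_0$, the standard "lost work" relation (a textbook result, stated here in my notation) is

$$W_{lost} = W_{rev} - W_{actual} = T_0 \, \Delta S_{universe},$$

so the lost work is proportional to the entropy generated by the process, not to the entropy of either state by itself.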

AM
 
  • #34


brainstorm said:
My Wan, good post - especially the point about heat still existing as molecular motion/KE in a system in equilibrium. Also, good point about heat itself not being a substance. Heat, volume, and pressure can be described as three dimensions of a system, imo. If volume is constant, the heat expresses itself as pressure, which indicates that heat is molecular momentum, imo. I realize that infrared radiation is also heat, but I think it could be argued that heat results from all radiation, including infrared - but maybe I'm missing something. As for whoever posted that heat has nothing to do with molecular energy, I can't understand how they could say that.
One has to be very careful because the terminology of thermodynamics was developed before we knew about molecules.

The first law of thermodynamics refers to three forms of energy: heat flow (Q), internal energy (U) and mechanical work (W). What you describe as "heat" is U (ie. the energy due to molecular motion), not Q. Q is heat flow: a transfer of energy into or out of a particular body of matter.

Since Q is commonly referred to as heat or heat flow, it might be better to refer to the kinetic energy at the molecular level as thermal energy or just internal energy rather than heat.
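For reference, with $Q$ the heat flowing into the body and $W$ the work done by the body (the usual sign convention), the first law reads

$$\Delta U = Q - W,$$

so $Q$ and $W$ describe energy in transit, while $U$ is the only one of the three that is a property of the body itself.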
If all molecules are moving at the same velocity in a system, would you describe the energy exchanges of collision as transfers or just direction-changes?
If all molecules were moving at the same velocity, it would not be in thermal equilibrium and, hence, would have no temperature.

AM
 
  • #35
Lambert's woolly "definition" of entropy

Prof. Lambert is just confused. As an educator, he has found a distorted, simplified presentation that makes things easier for his students, but in fact it is in itself even more confused than anything he criticises about "disorder". He switches between dispersal of energy over physical volume and dispersal of energy over microstates, apparently unaware that a system can only be entirely in or entirely out of a microstate, so that all of its energy is in the microstate. And his idea of dispersal over physical position is obviously false: two states can have their local energy content completely (i.e., uniformly) dispersed over the same volume yet have different entropies.
I would not criticize his idea of "energy dispersal" merely because he cannot offer a clear and definite definition of "dispersal", since one cannot do that for "disorder" either: the only clear and definite quantitative definition of disorder would be to define it as the entropy itself, which would be circular. So that would be an unfair criticism.

Furthermore, his advocates like to talk about recent chemistry texts, but nearly all of the ones I can find are published by firms like Cengage or Alphascript, which, although not exactly vanity presses or self-publishers, certainly do not engage in the kind of thorough peer review that mainstream textbook publishers do.
 
Last edited:
  • #36
andrebourbaki said:
Prof. Lambert is just confused. As an educator, he has found a distorted, simplified presentation that makes things easier for his students, but in fact it is in itself even more confused than anything he criticises about "disorder". He switches between dispersal of energy over physical volume and dispersal of energy over microstates, apparently unaware that a system can only be entirely in or entirely out of a microstate, so that all of its energy is in the microstate.
Prof. Lambert is not trying to redefine entropy. He is just trying to help students understand it.

The tendency of the universe toward increase in entropy can be thought of as a tendency toward macro states for which the number of equivalent micro states increases. Unless one defines disorder in a special way, it is difficult to see how this can be described as a tendency toward disorder.

A cup of boiling water poured over an iceberg: the disorder in the cup of boiling water has decreased and we end up with more ice. Has disorder increased? Has energy dispersed? Have the system and surroundings assumed a state for which the number of microstates equivalent to that macrostate has increased?

And his idea of dispersal over physical position is obviously false: two states can have their local energy content completely (i.e., uniformly) dispersed over the same volume yet have different entropies.
I don't follow you there. Can you provide an example?

AM
 
  • #37
I looked up Atkins' 8th edition on Amazon, and he links dispersal of energy and disorder. I've never heard of "dispersal of energy", but it seems ok to me.

It seems compatible with this analogy I like from Kardar's notes: "This information, however, is inevitably transported to shorter scales. A useful image is that of mixing two immiscible fluids. While the two fluids remain distinct at each point, the transitions in space from one to the next occur at finer resolution on subsequent mixing. At some point, a finite resolution in any measuring apparatus will prevent keeping track of the two components."

As information is lost because of the limited resolution of our measuring apparatus, many different microstates (the precise positions of the two liquids) will be compatible with the macrostate (the reading indicated by our limited resolution measuring apparatus). So as entropy increases, we have less and less information about the precise position of things, so things seem more "disordered" to us in that sense.
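To make the "limited resolution" point concrete, here is a minimal numerical sketch (my own toy model, not from Kardar's notes): represent the two fluids as a binary string, let the "apparatus" report only the composition of each fixed-size block, and count how many microstates are compatible with that reading.

```python
# Toy coarse-graining: how many microstates are compatible with what a
# finite-resolution instrument reports (the count of fluid A in each block)?
from math import comb, log

def coarse_grained_entropy(config, block):
    """ln(number of microstates) compatible with the per-block counts."""
    omega = 1
    for i in range(0, len(config), block):
        chunk = config[i:i + block]
        omega *= comb(len(chunk), sum(chunk))  # arrangements within this block
    return log(omega)

block = 8
unmixed = [1] * 32 + [0] * 32  # the two fluids fully separated
mixed = [1, 0] * 32            # finely interleaved after stirring

print(coarse_grained_entropy(unmixed, block))  # 0.0: the reading pins down the state
print(coarse_grained_entropy(mixed, block))    # ~34.0: many compatible microstates
```

The separated configuration is determined exactly by the block counts, while the finely mixed one is compatible with an enormous number of microstates: that is precisely the sense in which mixing loses information and increases entropy.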
 
  • #38
atyy said:
I looked up Atkins' 8th edition on Amazon, and he links dispersal of energy and disorder. I've never heard of "dispersal of energy", but it seems ok to me.

It seems compatible with this analogy I like from Kardar's notes: "This information, however, is inevitably transported to shorter scales. A useful image is that of mixing two immiscible fluids. While the two fluids remain distinct at each point, the transitions in space from one to the next occur at finer resolution on subsequent mixing. At some point, a finite resolution in any measuring apparatus will prevent keeping track of the two components."

As information is lost because of the limited resolution of our measuring apparatus, many different microstates (the precise positions of the two liquids) will be compatible with the macrostate (the reading indicated by our limited resolution measuring apparatus). So as entropy increases, we have less and less information about the precise position of things, so things seem more "disordered" to us in that sense.

Equating "energy dispersal" with entropy has problems as well. It is more accurate to think of an increase in entropy as a change in thermodynamic state (macrostate) for which the number of equivalent microstates increases. This better explains the Gibbs paradox regarding the mixing of gases, which the "energy dispersal" concept does not handle so well.

Frank Lambert's opposition to the "disorder" approach is tied to his opposition to linking or equating "thermodynamic entropy" with Shannon's concept of entropy based on information theory. In my view, while there is an interesting mathematical similarity, the two entropies do seem to relate to very different things. However, there are different views on this.

In any event, I think that the "energy dispersal" concept is better and easier to grasp than a concept of "disorder" or a concept based on information theory. It may also be more intuitive than a concept based on micro states.

When energy spreads out, entropy increases. When energy becomes more concentrated, entropy decreases. Fundamentally, however, the second law is a statistical law. It says essentially that nature does not tend toward physically possible but statistically improbable configurations. So while we can think about it as having to do with energy dispersal it is really about probability. As long as we realize that, it seems to me that "energy dispersal" is one way to conceptualize entropy.
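As a minimal numerical illustration of that statistical reading (my own sketch, with illustrative numbers): count the microstates for N distinguishable particles distributed between the two halves of a box.

```python
# Entropy as ln(multiplicity) for N particles split between two halves of a box.
from math import comb, log

N = 100
S_all_left = log(comb(N, 0))   # all particles in the left half: ln(1) = 0
S_even = log(comb(N, N // 2))  # half on each side

print(S_all_left)  # 0.0: a single microstate
print(S_even)      # ~66.8, of order N*ln(2) ~ 69.3: vastly more microstates
```

The evenly spread macrostate dominates not because anything pushes the particles apart, but because it corresponds to vastly more microstates - which is the probabilistic content of the second law described above.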

AM
 
Last edited:
  • #39
Andrew Mason said:
Equating "energy dispersal" with entropy has problems as well. It is more accurate to think of an increase in entropy as a change in thermodynamic state (macrostate) for which the number of equivalent microstates increases. This better explains the Gibbs paradox regarding the mixing of gases, which the "energy dispersal" concept does not handle so well.

Frank Lambert's opposition to the "disorder" approach is tied to his opposition to linking or equating "thermodynamic entropy" with Shannon's concept of entropy based on information theory. In my view, while there is an interesting mathematical similarity, the two entropies do seem to relate to very different things. However, there are different views on this.

In any event, I think that the "energy dispersal" concept is better and easier to grasp than a concept of "disorder" or a concept based on information theory. It may also be more intuitive than a concept based on micro states.

When energy spreads out, entropy increases. When energy becomes more concentrated, entropy decreases. Fundamentally, however, the second law is a statistical law. It says essentially that nature does not tend toward physically possible but statistically improbable configurations. So while we can think about it as having to do with energy dispersal it is really about probability. As long as we realize that, it seems to me that "energy dispersal" is one way to conceptualize entropy.

The Shannon entropy concept is exactly the same as that based on microstates in statistical physics. Both are basically the "number of states" or "number of possibilities". In information theory, these go by the name of "typical sequences" and the relevant theorem is something called "asymptotic equipartition", which is the rigorous version of what physicists learn. Then the Shannon mutual information is basically the change in entropy. Gaining information is a reduction in entropy, which makes sense in that if one is certain, then there is a reduction in the number of possibilities from many to only one.
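A rough numerical check of the asymptotic equipartition idea (my own sketch; the particular numbers are illustrative): for a biased coin, the sequences whose empirical frequency of heads is near p carry almost all the probability, and there are roughly 2^(nH) of them.

```python
# Counting "typical sequences" for a biased coin with P(heads) = p.
from math import comb, log2

p, n, eps = 0.2, 1000, 0.03
H = -p * log2(p) - (1 - p) * log2(1 - p)  # Shannon entropy per symbol, in bits

mass, count = 0.0, 0
for k in range(n + 1):
    if abs(k / n - p) <= eps:  # sequences with an empirically "typical" head count
        count += comb(n, k)
        mass += comb(n, k) * p**k * (1 - p)**(n - k)

print(H)                # ~0.722 bits per symbol
print(mass)             # ~0.98: the typical set is nearly certain
print(log2(count) / n)  # ~0.77: approaches H (up to an eps-dependent slack)
```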

The microstate definition is important in attempts like Boltzmann's use of kinetic theory to understand how irreversibility can arise from reversible laws of dynamics, and in the basic idea that there is a loss of information due to the limited resolution of our measuring instruments - or, in the Landauer explanation, due to the erasure of information forced by finite memory.

I do agree that it is important to learn the classical equilibrium thermodynamic concept of entropy, because it comes from the Kelvin and Clausius statements of the second law, which are basically experimental facts. And this definition can be used independently of the microstate definition.

However, if Lambert opposes the Shannon definition, then he is also erroneously opposing the microstate definition.
 
Last edited:
  • #40
entropy, disorder, and dispersion

Thermodynamic entropy is a precisely defined concept. Informational entropy is the statistical mechanics concept of entropy, first introduced by Boltzmann, refined by Gibbs, and rediscovered and applied to wider fields by Shannon. It seems to be a different concept, but Boltzmann was, with difficulty, able to prove that it is very closely related to thermodynamic entropy, provided one assumes the Stosszahlansatz or, as it is often called in English, 'molecular chaos', a hypothesis which is approximately true.
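For reference, the formulas in question, in standard notation (added here for comparison; $W$ is the number of microstates and $p_i$ the probability of microstate, or message, $i$):

$$S = k_B \ln W \;\;\text{(Boltzmann)}, \qquad S = -k_B \sum_i p_i \ln p_i \;\;\text{(Gibbs)}, \qquad H = -\sum_i p_i \log_2 p_i \;\;\text{(Shannon)}.$$

Up to the constant $k_B$ and the base of the logarithm, the Gibbs and Shannon expressions are the same function of the probabilities.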

The intuitive concept of 'disorder' is the normal way to motivate the definition of entropy, but 'disorder' cannot really be given a precise definition except by using informational entropy: the number of microstates compatible with our knowledge that the system is in a given macrostate.
The increasing disorder in a deck of cards produced by shuffling is a traditional example used to teach students about the statistical mechanical definition of entropy; it goes back at least to Sir James Jeans in 1900. Prof. Lambert and a few other fringe figures seem to be allergic to the deck-of-cards metaphor. It is only a metaphor, but they have carried out a crusade against the use of the concept of 'disorder' even though it has proven useful in the discussion of lambda phase transitions.

The intuitive concept of 'energy dispersal' is preferred by some of these fringe figures, but their claims that it has been adopted by Atkins's eighth edition of his standard textbook are false. (And who knows how many of their other claims about it being adopted in texts are false or exaggerated.) What Atkins actually says is very sensible, so I reproduce it here.

"The concept of the number of microstates makes quantitative the ill-defined qualitative concepts of 'disorder' and 'the dispersal of matter and energy' that are used widely to introduce the concept of entropy: a more 'disorderly' distribution of energy and matter corresponds to a greater number of microstates associated with the same total energy." --- p. 81

"This increase in entropy is what we expect when one gas disperses into the other and the disorder increases." --- p. 143

On the other hand, it would be easier to make the intuitive concept of 'dispersal of energy' precise and quantitative, but then, as far as I can tell, it disagrees with the precise and quantitative definition of entropy. Suppose (though I don't know that this fringe group has ever done so) that for the sake of discussion we define it as the 'standard deviation of the spatial distribution of energy', supposing, for simplicity, that the density of matter is fixed, constant, and uniform. (I am well aware this gives it the wrong units...)
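One way to make that suggestion explicit (my notation, purely for the sake of argument): with energy density $u(\mathbf{r})$ over a fixed volume $V$ and mean $\bar{u} = \frac{1}{V}\int_V u \, dV$, define

$$D = \left[ \frac{1}{V} \int_V \bigl( u(\mathbf{r}) - \bar{u} \bigr)^2 \, dV \right]^{1/2},$$

with smaller $D$ meaning more complete dispersal. The difficulty noted above is then immediate: two states can both have $D = 0$ (energy perfectly uniform over the same volume) and yet have different entropies.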
 
Last edited:
