Entropy is disorder = outmoded theory?

AI Thread Summary
The discussion centers on the evolving understanding of entropy, particularly the shift from viewing it as a measure of disorder to interpreting it as a measure of energy dispersal. The recent editions of Peter Atkins' "Physical Chemistry" have reduced references to disorder, aligning more closely with Frank Lambert's perspective on entropy. Participants express confusion regarding the implications of this change and the clarity of definitions surrounding entropy. While some argue that equating entropy with disorder is limited, others highlight its relevance in various scientific contexts, including thermodynamics and information theory. The conversation underscores the complexity of entropy and the need for precise definitions in scientific discourse.
Ray Eston Smith Jr
"entropy is disorder" = outmoded theory??

The wikipedia article I quote below is confusing me. I followed the links to Frank Lambert's website, where he claims that Peter Atkins, in the 8th edition of his Physical Chemistry, has come around to Lambert's idea that entropy is not related to disorder. Could this be true? I would be less surprised to learn that Atkins had become a born-again Christian. However, according to my old-fashioned understanding of entropy, even very improbable things happen occasionally, given a sufficiently huge number of trials. Seriously, I was about to start reading the 7th edition of Atkins Physical Chemistry, but now I'm wondering if I should look for a more reliable author.

http://en.wikipedia.org/wiki/Introduction_to_entropy
Traditionally, 20th century textbooks have introduced entropy as order and disorder so that it provides "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. A more recent formulation associated with Frank L. Lambert describing entropy as energy dispersal describes entropy as measuring "the spontaneous dispersal of energy — at a specific temperature."

http://entropysite.oxy.edu/
March 2006
Atkins' "Physical Chemistry" has been the best selling text worldwide in this subject for many years.* The new 8th edition was published in the US March 16. However, in previous editions Atkins described systems and entropy change in terms of order and disorder or chaos.* Even though he long has used the phrase, "the dispersal of energy", it was confined to an order-disorder view of thermodynamics — for example, to spontaneous changes being "always accompanied by a dispersal of energy into a more disordered state".** (Or "to chaos" in his book "The Second Law".)
In contrast to the Second Law chapter of the 7th edition, which had some 27 instances of using "order to disorder" as a rationale for change, "disorder" and "disorderly" are mentioned only 3 times in the new 8th edition.* Atkins, with co-author dePaula, now state that their view of entropy "summarized by the Boltzmann formula is consistent with our previous statement [earlier in the chapter, re the dispersal of energy in classical thermodynamics] that the entropy is related to the dispersal of energy.
 
How is dispersal of energy an explanation for why a drop of dye disseminates into a glass of water? Isn't there just a certain amount of molecular motion due to heat-energy, which causes particles to tend to move around randomly such that the dye particles are likely to randomly traverse the particles of clear water and vice versa? Is that energy dispersion or just expression of energy as molecular motion?
 


Originators of new ideas are always claiming support from others...who knows what Atkins believes and whether that REALLY meshes with Lambert's idea.

I would not disregard a proven text because of such a claim. I've never even heard of "dispersal" of energy nor do I know whether Atkins/Lambert agree on what it is.

Entropy is a difficult enough concept that if someone has a better explanation, I'd like to read it.
 


Naty1 said:
Originators of new ideas are always claiming support from others...who knows what Atkins believes and whether that REALLY meshes with Lambert's idea.

I would not disregard a proven text because of such a claim. I've never even heard of "dispersal" of energy nor do I know whether Atkins/Lambert agree on what it is.

Entropy is a difficult enough concept that if someone has a better explanation, I'd like to read it.

Energy "dispersal" is a generally logical concept. It is what happens when a cue-ball breaks a triangle of billiard balls. If you have a room of 40 degree air and you make a fire, convection is the result of high KE among air molecules next to the fire dispersing energy to other molecules they come in contact with. Eventually, the system temperature will reach equilibrium, except the heated air will be pushed up by the denser cool air, which will grow as parts of the warm air disperse their energy through the windows and walls and subsequently sink.

Dispersal is just a pattern of transfer between high-energy particles and lower-energy ones. It's empirical common-sense, no?
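
A minimal numerical sketch of that picture (my own toy model, not anything from the thread or from Atkins/Lambert; all names and numbers are just for illustration): start with all the energy on one particle and let randomly chosen pairs share what they have. The total is conserved, but the concentration spreads out.

```python
import random

def disperse(n_particles=1000, n_steps=20000, seed=0):
    """Toy energy dispersal: random pairwise sharing of a conserved total.

    All energy starts on particle 0; each step picks two particles and
    splits their combined energy at a uniformly random fraction.
    """
    rng = random.Random(seed)
    energy = [0.0] * n_particles
    energy[0] = 1000.0  # one "hot" particle, everything else cold
    for _ in range(n_steps):
        i = rng.randrange(n_particles)
        j = rng.randrange(n_particles)
        if i == j:
            continue
        total = energy[i] + energy[j]
        f = rng.random()
        energy[i], energy[j] = f * total, (1 - f) * total
    return energy

e = disperse()
print(f"total energy (conserved): {sum(e):.1f}")
print(f"largest single share after mixing: {max(e):.2f}")
```

The stationary distribution of this kind of random sharing is exponential (Boltzmann-like), and the initial concentration never spontaneously reassembles in any practical number of steps.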
 


Actually entropy has a one-to-one relation to the probability of that particular state being a final state after a "sufficient amount of time" has passed. The relation is a very steep exponential so that slightly higher entropy is extraordinarily favoured by probability.
But still lower entropy states do have a finite probability and will occur.
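
In symbols, this is the standard Boltzmann relation (added here for reference, since the thread never writes it down): the multiplicity W of a macrostate fixes both its entropy and its relative probability,

$$
S = k_B \ln W, \qquad \frac{P_2}{P_1} = \frac{W_2}{W_1} = \exp\!\left(\frac{S_2 - S_1}{k_B}\right),
$$

and since $k_B \approx 1.38 \times 10^{-23}\,\mathrm{J/K}$ sits in the denominator of the exponent, even a minute entropy difference produces an astronomically lopsided probability ratio.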
 


Ray Eston Smith Jr said:
The wikipedia article I quote below is confusing me.

There are several potentially confusing things in those sites. First, entropy relates to a given state, and so cannot be used to describe a process ('dispersal' of energy). Second, there seems to be loose control over discussion of entropy and changes to entropy. Third, 'entropy', like 'energy' is not that well defined outside of quantitative relationships.

Personally, I see the entropy of a state as that amount of energy unavailable to perform useful work. The free energy is the total amount of energy available to perform useful work.
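
For a closed system held at constant temperature, the standard quantitative version of this statement (textbook thermodynamics, added for reference) is that the Helmholtz free energy bounds the extractable work:

$$
F = U - TS, \qquad W_{\text{by system}} \le -\Delta F,
$$

with equality for a reversible process; the $TS$ term is the portion of the internal energy that cannot be converted to work at that temperature.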
 


I wouldn't say it's an "outmoded theory". It's not a theory and never was. It's just an interpretation that helps explain it. As such it may be an outmoded way of explaining entropy.

Equating entropy with disorder only makes sense in a few cases. Gas is more 'disordered' than liquid, both are more disordered than a solid. Mixtures are more disordered than pure substances.

But beyond that, equating entropy with disorder doesn't have much use. Another, more general definition of entropy is "energy not available to perform work". How this relates to entropy being 'disorder' isn't obvious. (In fact, I'd say that once you reconcile these views, that's when you 'get' entropy)
 


There are relationships between disorder, unavailable energy, and thermodynamic entropy. However, those relationships do not appear to me to serve as tight enough explanations of what thermodynamic entropy is, as discovered and defined by Clausius.

If thermodynamic entropy is a measure of un-usable energy or a measure of disorder then, why are equilibrium conditions necessary to calculate it? In other words: What is the relationship between absorbing and expelling heat under equilibrium conditions, clearly an ideal condition, but, still necessary in the original macroscopic thermodynamic definition, and the amount of heat that leaks away?

There has already been a very detailed, excellent discussion about the development of thermodynamic entropy in another thread. My intention in making this post is only to register an opinion and not to reopen the history of thermodynamic entropy. Even my question is posed without expecting an answer. It is intended to highlight something that I think is of significant importance and that I think is passed over in explanations.

James
 


James A. Putnam said:
If thermodynamic entropy is a measure of un-usable energy or a measure of disorder then, why are equilibrium conditions necessary to calculate it? In other words: What is the relationship between absorbing and expelling heat under equilibrium conditions, clearly an ideal condition, but, still necessary in the original macroscopic thermodynamic definition, and the amount of heat that leaks away?

Heat can only "leak away" as long as there is something in the system that is not in equilibrium with whatever is leaking the heat.
 
  • #10


brainstorm said:
Heat can only "leak away" as long as there is something in the system that is not in equilibrium with whatever is leaking the heat.

Nevertheless, that is a condition of the original definition. Is your point that the original definition is inherently mistaken and can therefore be disregarded in favor of other definitions that do not have to take into account that which Clausius discovered?
 
  • #11


Equating entropy with disorder only makes sense in a few cases...

But beyond that, equating entropy with disorder doesn't have much use.

Whoa!
Entropy as a measure of disorder has WIDE application...to the evolution of the entire universe, for example, as long as one understands the effects of gravity on entropy and disorder.

Entropy is a subset of information theory; it is extremely useful in understanding black hole event horizons, and it is linked to decoherence in quantum mechanics, which, like entropy, lets you know which way time is running. The idea that strings in string theory have entropy goes back to the earliest days of string theory.
 
Last edited:
  • #12


James A. Putnam said:
Nevertheless, that is a condition of the original definition. Is your point that the original definition is inherently mistaken and can therefore be disregarded in favor of other definitions that do not have to take into account that which Clausius discovered?

The genealogy and hermeneutics of "the original definition" may be of consequence if your interest is the history of science, but empirical logic makes it clear that heat can't go anywhere unless there is something in disequilibrium with something else. The fact that you define the system in exclusion of the thing that is in disequilibrium with it is a methodological failure to include everything in the system that is influencing the parts of it you are studying.
 
  • #13


brainstorm said:
The fact that you define the system in exclusion of the thing that is in disequilibrium with it is a methodological failure to include everything in the system that is influencing the parts of it you are studying.

Clausius defined the system.
 
  • #14


James A. Putnam said:
Clausius defined the system.

I don't know which system you're saying Clausius defined, but I doubt it is any specific system that you have access to study empirically. You have to apply the methodology to test it.
 
  • #15


The system is an irreversible Carnot cycle.
 
  • #16


James A. Putnam said:
The system is an irreversible Carnot cycle.

In other words, theoretical rather than empirical.
 
  • #17


Of course. So is a frictionless surface. Still, the question remains. Is your point that the original definition is inherently mistaken and can therefore be disregarded in favor of other definitions that do not have to take into account that which Clausius discovered? This is not my work. Clausius discovered thermodynamic entropy.
 
  • #18


James A. Putnam said:
Of course. So is a frictionless surface. Still, the question remains. Is your point that the original definition is inherently mistaken and can therefore be disregarded in favor of other definitions that do not have to take into account that which Clausius discovered? This is not my work. Clausius discovered thermodynamic entropy.

My point is that if heat is moving in a system that is "in equilibrium," then there is some element that is receiving the heat, which means that the system is not in equilibrium, probably because it has been defined in exclusion of whatever it is that is receiving the heat.
 
  • #19


Ok you have made your point. I posted my first message because my opinion is that Clausius' discovery is of fundamental importance and remains unexplained. It was ideal and impractical, but it worked and gave us knowledge of a very important property. For some reason, the conditions that he defined are necessary to correctly calculate thermodynamic entropy. I think it is important to know why. If you disagree, that is fine.

We have reached a point where you are repeating your message. I understood your message the first time. You are correct in what you say. However, the ideal circumstances can be very closely approximated by having the heat exchanged very very slowly. The use of very very slow actions is common in the derivations of thermodynamic properties since thermodynamics is defined as applying to equilibrium conditions. Temperature is an equilibrium condition, yet we can still make use of the concept of changes in temperature if the process proceeds very very slowly.
 
  • #20
The reason some authors may want to disassociate entropy from disorder has nothing to do with rejecting the idea that entropy increase involves mixing unlikely macro-states into statistically more probable macro-states, or into uniformity of a system. That remains, just as your old-fashioned understanding of entropy indicates. The term "disorder" in thermodynamics has a meaning opposite to its everyday use, and discussions of this are often misunderstood as somehow rejecting the classical interpretation, which is not the case.

The "at a specific temperature" qualifier Lambert's definition used refers only to the total temperature of the system in question, not a uniform temperature of that system.
You can read Lambert's own words here:
http://entropysite.oxy.edu/entropy_isnot_disorder.html
And in J. Chem. Educ., 2002, 79 (2), p 187
http://pubs.acs.org/doi/abs/10.1021/ed079p187

Then there are micro-state issues I find interesting:
(preprint) http://arxiv.org/abs/1005.1683

brainstorm said:
My point is that if heat is moving in a system that is "in equilibrium," then there is some element that is receiving the heat, which means that the system is not in equilibrium, probably because it has been defined in exclusion of whatever it is that is receiving the heat.
Heat does not have an existence separate from the motion and physical state of the system that defines it. It's an abstraction of the state of things, not itself a substance that flows. Take a tank of gas with no heat flowing in or out of it, a perfectly enclosed system in thermal equilibrium. Does this mean heat is not moving in this system? No!

Heat, at the molecular level, is the kinetic energy of the molecules, (1/2)mv^2, a product of molecular mass and velocity. The only way to stop heat from moving in this system is to stop all molecular motion within the tank relative to the tank. This can NEVER happen without heat escaping the tank. The 2nd law makes no claim that motion eventually stops as entropy increases. So what does "in equilibrium" mean?

"In equilibrium" merely means that, on average, the temperature is the same everywhere in the tank, such that if you read the temperature at one place in the tank you know the temperature everywhere in the tank. Yet the molecules are still bouncing around trading energy. Thus heat is perpetually moving in a system, whether in equilibrium or not. The 2nd law, entropy, only requires heat in non-equilibrium to evolve toward equilibrium, not stop moving.
 
Last edited by a moderator:
  • #21


my_wan said:
Heat, at the molecular level, is the kinetic energy of the molecules, (1/2)mv^2, a product of molecular mass and velocity. The only way to stop heat from moving in this system is to stop all molecular motion within the tank relative to the tank. This can NEVER happen without heat escaping the tank. The 2nd law makes no claim that motion eventually stops as entropy increases. So what does "in equilibrium" mean?

"In equilibrium" merely means that, on average, the temperature is the same everywhere in the tank, such that if you read the temperature at one place in the tank you know the temperature everywhere in the tank. Yet the molecules are still bouncing around trading energy. Thus heat is perpetually moving in a system, whether in equilibrium or not. The 2nd law, entropy, only requires heat in non-equilibrium to evolve toward equilibrium, not stop moving.
No. Heat is not energy. Energy is a microscopically conserved quantity. Heat is a macroscopic thermodynamic concept and really no sentence should start "Heat is, at the molecular level..."

And no. Thermodynamic equilibrium is a much stronger condition than 'merely' having the same temperature everywhere. Thermodynamic equilibrium requires that the system's properties be fully determined by specifying the extensive thermodynamic parameters: energy, volume, mole numbers. Any system whose properties depend on what happened in the past is not in equilibrium, although it may be at the same temperature as something else.
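
In the standard formalism this is the statement that an equilibrium state possesses a fundamental relation (again textbook material, added for reference),

$$
S = S(U, V, N_1, \dots, N_r),
$$

so that once the extensive parameters are specified, every other property follows by differentiation, e.g. $1/T = (\partial S / \partial U)_{V,N}$. A system whose properties depend on its history admits no such relation.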
 
  • #22


James A. Putnam said:
Ok you have made your point. I posted my first message because my opinion is that Clausius' discovery is of fundamental importance and remains unexplained. It was ideal and impractical, but it worked and gave us knowledge of a very important property. For some reason, the conditions that he defined are necessary to correctly calculate thermodynamic entropy. I think it is important to know why. If you disagree, that is fine.

We have reached a point where you are repeating your message. I understood your message the first time. You are correct in what you say. However, the ideal circumstances can be very closely approximated by having the heat exchanged very very slowly. The use of very very slow actions is common in the derivations of thermodynamic properties since thermodynamics is defined as applying to equilibrium conditions. Temperature is an equilibrium condition, yet we can still make use of the concept of changes in temperature if the process proceeds very very slowly.

Our concerns are just different. You are more interested in approximating this Clausius model for purposes of validating his model, which I am largely unfamiliar with. My interest was in establishing the fact that system-delineation involves human bias due to the unwillingness to recognize that a system cannot have an "outside" that is wholly isolated from the system itself. In other words, the boundary of the system is just a black-box for everything excluded by system-delineation. No system is ultimately closed except perhaps the universe containing all possible subsets, but in that case closure itself would seem to be a misapplied concept since the universe contains all possible potentialities and is therefore not "closed" to anything existent.
 
  • #23


My Wan, good post - especially the point about heat still existing as molecular motion/KE in a system in equilibrium. Also, good point about heat itself not being a substance. Heat, volume, and pressure can be described as three dimensions of a system, imo. If volume is constant, the heat expresses itself as pressure, which indicates that heat is molecular momentum, imo. I realize that infrared radiation is also heat, but I think it could be argued that heat results from all radiation, including infrared - but maybe I'm missing something. Whoever posted that heat has nothing to do with molecular energy, I can't understand how they could say that.

my_wan said:
"In equilibrium" merely means that, on average, the temperature is the same everywhere in the tank, such that if you read the temperature at one place in the tank you know the temperature everywhere in the tank. Yet the molecules are still bouncing around trading energy. Thus heat is perpetually moving in a system, whether in equilibrium or not. The 2nd law, entropy, only requires heat in non-equilibrium to evolve toward equilibrium, not stop moving.
If all molecules are moving at the same velocity in a system, would you describe the energy exchanges of collision as transfers or just direction-changes?
 
  • #24


brainstorm said:
Our concerns are just different. You are more interested in approximating this Clausius model for purposes of validating his model, which I am largely unfamiliar with. My interest was in establishing the fact that system-delineation involves human bias due to the unwillingness to recognize that a system cannot have an "outside" that is wholly isolated from the system itself. ...

My interest is not in approximating the model. The model is based upon infinitesimally small changes over a very long period of time. That condition brings the model very close to replicating reality. My interest is in learning what it is that Clausius discovered. His definition of thermodynamic entropy is unique. I think it is important not to let that uniqueness go unaddressed or passed over. That is all. None of what I have said is original to me.

With respect to the lack of perfect isolation in this universe, we can do very well in getting close enough to it to learn from even ideal cases. I think that your objection does not effectively detract from the value of Clausius' work. There is no such thing as a perfectly adiabatic wall of separation for the purposes of experimentation, but it can be designed to be sufficiently effective to justify speaking in terms of adiabatic separation. If your point is that any remaining imperfection invalidates ideal theoretical analysis, I do not agree.
 
  • #25


James A. Putnam said:
My interest is not in approximating the model. The model is based upon infinitesimally small changes over a very long period of time. That condition brings the model very close to replicating reality. My interest is in learning what it is that Clausius discovered. His definition of thermodynamic entropy is unique. I think it is important not to let that uniqueness go unaddressed or passed over. That is all. None of what I have said is original to me.

With respect to the lack of perfect isolation in this universe, we can do very well in getting close enough to it to learn from even ideal cases. I think that your objection does not effectively detract from the value of Clausius' work. There is no such thing as a perfectly adiabatic wall of separation for the purposes of experimentation, but it can be designed to be sufficiently effective to justify speaking in terms of adiabatic separation. If your point is that any remaining imperfection invalidates ideal theoretical analysis, I do not agree.

Again, it is not Clausius' or anyone else's work that I'm interested in. Nor is it any kind of near-ideal isolation of systems. My interest is in establishing that no such thing as a closed system can be delineated operationally. The significance of this is that no system can reach equilibrium in practice without there being some extraneous factors that are in disequilibrium with it, which therefore renders total entropy an empirical impossibility.
 
  • #26


My message concerned Clausius' work. I am interested in it. Insignificant extraneous factors are not important and do not detract from it. Your interest concerns something that is separate from mine. I will leave it at that. You may have the last word.
 
  • #27


James A. Putnam said:
My message concerned Clausius' work. I am interested in it. Insignificant extraneous factors are not important and do not detract from it. Your interest concerns something that is separate from mine. I will leave it at that. You may have the last word.

Ok, my last word is that I don't understand how you can call system-extraneous factors insignificant if they fundamentally undermine the possibility of any real system ever reaching actual equilibrium. Now, if you want to talk in terms of relative equilibrium, that could be a different story.
 
  • #28


It should be noted that I specified a tank of gas. That is, effectively an ideal gas at constant volume in an enclosed system. This was specifically to avoid a general treatment of the full range of thermodynamic potentials, including radiation. That would be overly complex for making the point that heat is not a substance in itself, and the conditions were specified such that I could safely ignore them. When I said heat was a product of molecular velocity, it was in the same sentence in which I specified (1/2)mv^2, so particulate mass was not an absent part of that product.

Yes peteratcam, heat is energy. In the ideal gas situation I specified, it is the equipartition of kinetic energy among all the degrees of freedom of the gas molecules (translational and rotational). Just because it makes no sense to call the kinetic energy of an individual particle heat does not mean heat is not defined by the equipartition of kinetic energy among a medium's degrees of freedom. Yet the heating produced by the kinetic energy of an individual particle is perfectly definable. That's why the space shuttle needs heat shields. It's not the air temperature doing that, and both the shuttle and the air are being heated, not just one heating the other. The inability to convert energy to mechanical work is not an indication of a lack of energy either. Increasing the heat in a set number of particles, given relativity, even increases the total mass of the system in direct proportion to the heat energy.

brainstorm said:
If all molecules are moving at the same velocity in a system, would you describe the energy exchanges of collision as transfers or just direction-changes?
Even if somehow you got all the molecules of a gas moving at the same velocity, they would not stay that way, even at thermal equilibrium. The same velocity only has statistical meaning for a sufficiently large sample of the molecules, which is why I specified an average. Equilibrium, in the ideal case specified, only means that it makes no difference which random group of molecules you specify; the average remains the same.

This entails, at the level of individual molecules, that you have a distribution of many different velocities characterized by the Boltzmann distribution. So, at this individual particle level, it is not just constant direction-changes. Higher energy particles are constantly transferring that energy to particles with lower kinetic energy and vice versa. If all the particle masses are the same, then if a fast particle hits a particle sitting motionless relative to the container, the fast particle stops and the motionless particle takes on the velocity and kinetic energy the fast particle had (assuming rotation is unaffected). So it's better described as kinetic energy exchanges than direction-changes.
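
For the head-on, equal-mass case described above, the velocity swap follows directly from conserving momentum and kinetic energy (elementary mechanics, spelled out for completeness):

$$
m v_1 + m v_2 = m v_1' + m v_2', \qquad
\tfrac{1}{2} m v_1^2 + \tfrac{1}{2} m v_2^2 = \tfrac{1}{2} m v_1'^2 + \tfrac{1}{2} m v_2'^2
\;\Longrightarrow\; v_1' = v_2,\; v_2' = v_1.
$$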

Of course once you bring in thermal radiation and other field interactions between the particles, the situation gets more complex. Yet the rules remain exactly the same, only how the degrees of freedom are defined changes.
 
  • #29


There's an interesting thermodynamic effect involving acceleration I've never seen described, but it makes perfect sense.

It's well known that a helium balloon will migrate toward a heat source, because hot air is less dense than cold air. Now tie a helium balloon to the center console of an enclosed car, such that it floats just below the roof. Step on the gas and the balloon will accelerate toward the front of the car. Hit the brake and it accelerates toward the back of the car, as if it were responding to inertial forces in the opposite direction of everything else in the car.

Does the reason this happens not make sense to anybody? Clue: it has to do with the Boltzmann distribution, the range of particle momenta among individual molecules.
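
One way to see it without micro-states (the buoyancy view Andy Resnick takes just below, sketched here under the assumption of a uniform forward acceleration a): in the car's frame the air behaves as if gravity had an extra horizontal component, so the pressure rises toward the rear,

$$
\frac{\partial P}{\partial x} = -\rho_{\text{air}}\, a, \qquad
F_x = \left(\rho_{\text{air}} - \rho_{\text{He}}\right) V a > 0,
$$

where x points forward and V is the balloon's volume: anything less dense than the surrounding air feels a net forward push, the horizontal analogue of a balloon rising.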
 
Last edited:
  • #30


my_wan said:
Does the reason this happens not make sense to anybody? Clue: it has to do with the Boltzmann distribution, the range of particle momenta among individual molecules.

I suppose you could solve the problem that way, but it's much easier for this problem to think in terms of the buoyancy force.
 
  • #31


Andy Resnick said:
I suppose you could solve the problem that way, but it's much easier for this problem to think in terms of the buoyancy force.
I guess you could. My mindset was on micro-states, but you certainly don't have to think in terms of micro-states to solve it.
 
  • #32


Naty1 said:
Whoa!
entropy as a measure of disorder has WIDE application...


From a paper titled "Entropy" by Frank Lambert:

Quote from the paper:

"The definition, "entropy is disorder", used in all US first-year college and university textbooks prior to 2002, has been deleted from 15 of 16 new editions or new texts published since 2002[2]. Entropy is not ‘disorder’ [3] nor is entropy change a change from order to disorder."


Link to cited paper:

http://docs.google.com/viewer?a=v&q...0nO6kA&sig=AHIEtbQt85_upRLNPdIu3SnPB8k7sGkGCg


The terms "computer science", and "information science" are just as amorphous and undefinable as is the term "consciousness." Just sayin'. :)

-s
 
  • #33


Andy Resnick said:
Personally, I see the entropy of a state as that amount of energy unavailable to perform useful work.
While the change in entropy is related to the amount of heat energy that is not available to perform useful work, one has to be careful in the language used.

First of all, your statement might confuse people into thinking that entropy is a measure of energy. It isn't.

It also might lead students to believe that the "energy unavailable to perform useful work" is a function of the thermodynamic state of a substance. Rather, the "energy unavailable to perform useful work" is a function of the difference between two states.

Finally, this statement might lead students to believe that "energy unavailable to perform useful work" is proportional to entropy or to a change in entropy. One can define "energy unavailable to perform useful work" as the difference between the maximum work theoretically obtainable from a (reversible) process between two thermodynamic states and the work actually obtained from the process between those states (sometimes referred to as Lost Work). On that definition the statement may be correct for some processes (those where temperature is constant). Even then it would be misleading, because this "energy unavailable to perform useful work" is not all the energy that is unavailable to perform useful work; it is only part of the energy that cannot be converted to work.
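
One standard way to make "Lost Work" quantitative (the Gouy-Stodola relation, added here for reference rather than taken from the thread) ties it to the entropy generated and the temperature of the surroundings:

$$
W_{\text{lost}} = W_{\text{rev}} - W_{\text{actual}} = T_0\, \Delta S_{\text{universe}},
$$

where $T_0$ is the reservoir temperature. This makes the caveat above explicit: the lost work depends on the process and its surroundings, not on the entropy of a state by itself.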

AM
 
  • #34


brainstorm said:
My Wan, good post - especially the point about heat still existing as molecular motion/KE in a system in equilibrium. Also, good point about heat itself not being a substance. Heat, volume, and pressure can be described as three dimensions of a system, imo. If volume is constant, the heat expresses itself as pressure, which indicates that heat is molecular momentum, imo. I realize that infrared radiation is also heat, but I think it could be argued that heat results from all radiation, including infrared - but maybe I'm missing something. Whoever posted that heat has nothing to do with molecular energy, I can't understand how they could say that.
One has to be very careful because the terminology of thermodynamics was developed before we knew about molecules.

The first law of thermodynamics refers to three forms of energy: heat flow (Q), internal energy (U) and mechanical work (W). What you describe as "heat" is U (ie. the energy due to molecular motion), not Q. Q is heat flow: a transfer of energy into or out of a particular body of matter.

Since Q is commonly referred to as heat or heat flow, it might be better to refer to the kinetic energy at the molecular level as thermal energy or just internal energy rather than heat.
brainstorm said:
If all molecules are moving at the same velocity in a system, would you describe the energy exchanges of collision as transfers or just direction-changes?
If all molecules were moving at the same velocity, it would not be in thermal equilibrium and, hence, would have no temperature.

AM
 
  • #35
Lambert's woolly "definition" of entropy

Prof. Lambert is just confused. As an educator, he has found a distorted, simplified presentation that makes things easier for his students, but in fact it is in itself even more confused than anything he criticises about "disorder". He switches between dispersal of energy over physical volume and dispersal of energy over micro-states, apparently unaware that a system can only be entirely in or entirely out of a microstate so that all of its energy is in the microstate. And his idea of dispersal over physical position is obviously false: two states can have their local energy content completely, i.e., uniformly dispersed over the same volume but have different entropies.
I would not like to criticize his idea of "energy dispersal" merely because he cannot offer a clear and definite definition of "dispersal", since one cannot do that with "disorder" either: the only clear and definite quantitative definition of disorder would be to define it as being the entropy, which would be circular. So that would be an unfair criticism.

Furthermore, his advocates like to talk about recent chemistry texts, but nearly all of those I can find are published by firms like Cengage or Alphascript which, although not exactly vanity presses or self-publishers, certainly do not engage in the kind of thorough peer review that mainstream textbook publishers do.
 
Last edited:
  • #36
andrebourbaki said:
Prof. Lambert is just confused. As an educator, he has found a distorted, simplified presentation that makes things easier for his students, but in fact it is in itself even more confused than anything he criticises about "disorder". He switches between dispersal of energy over physical volume and dispersal of energy over micro-states, apparently unaware that a system can only be entirely in or entirely out of a microstate so that all of its energy is in the microstate.
Prof. Lambert is not trying to redefine entropy. He is just trying to help students understand it.

The tendency of the universe toward increase in entropy can be thought of as a tendency toward macro states for which the number of equivalent micro states increases. Unless one defines disorder in a special way, it is difficult to see how this can be described as a tendency toward disorder.
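
A tiny computation makes this concrete (my own two-state toy example, not Lambert's): for N coins, the macrostate "n heads" has C(N, n) microstates, and the near-50/50 macrostates utterly dominate the count.

```python
from math import comb

N = 100  # number of two-state particles (coins)
for n in (0, 25, 50):
    # multiplicity of the macrostate "n heads out of N"
    print(f"n_heads = {n:3d}: multiplicity = {comb(N, n):.3e}")

# The 50-heads macrostate has ~1e29 microstates; the 0-heads macrostate has 1.
# "Tending toward disorder" just means drifting into the overwhelmingly
# larger macrostates, whatever one's everyday intuition calls "disordered".
```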

A cup of boiling water poured over an iceberg: the disorder in the cup of boiling water has decreased and we end up with more ice. Has disorder increased? Has energy dispersed? Have the system and surroundings assumed a state in which the number of micro states equivalent to that macrostate has increased?

andrebourbaki said:
And his idea of dispersal over physical position is obviously false: two states can have their local energy content completely, i.e., uniformly dispersed over the same volume but have different entropies.
I don't follow you there. Can you provide an example?

AM
 
  • #37
I looked up Atkins' 8th edition on Amazon, and he links dispersal of energy and disorder. I've never heard of "dispersal of energy", but it seems ok to me.

It seems compatible with this analogy I like from Kardar's notes: "This information, however, is inevitably transported to shorter scales. A useful image is that of mixing two immiscible fluids. While the two fluids remain distinct at each point, the transitions in space from one to the next occur at finer resolution on subsequent mixing. At some point, a finite resolution in any measuring apparatus will prevent keeping track of the two components."

As information is lost because of the limited resolution of our measuring apparatus, many different microstates (the precise positions of the two liquids) will be compatible with the macrostate (the reading indicated by our limited resolution measuring apparatus). So as entropy increases, we have less and less information about the precise position of things, so things seem more "disordered" to us in that sense.
 
  • #38
atyy said:
I looked up Atkins' 8th edition on Amazon, and he links dispersal of energy and disorder. I've never heard of "dispersal of energy", but it seems ok to me.

It seems compatible with this analogy I like from Kardar's notes: "This information, however, is inevitably transported to shorter scales. A useful image is that of mixing two immiscible fluids. While the two fluids remain distinct at each point, the transitions in space from one to the next occur at finer resolution on subsequent mixing. At some point, a finite resolution in any measuring apparatus will prevent keeping track of the two components."

As information is lost because of the limited resolution of our measuring apparatus, many different microstates (the precise positions of the two liquids) will be compatible with the macrostate (the reading indicated by our limited resolution measuring apparatus). So as entropy increases, we have less and less information about the precise position of things, so things seem more "disordered" to us in that sense.

Equating "energy dispersal" with entropy has problems as well. It is more accurate to think of an increase in entropy as a change in thermodynamic state (macro state) for which the number of equivalent microstates increases. This better explains the Gibbs paradox regarding mixing of gases which the "energy dispersal" concept does not do so well.

Frank Lambert's opposition the "disorder" approach is tied to his opposition to linking or equating "thermodynamic entropy" with Shannon's concept of entropy based on information theory. In my view, while there is an interesting mathematical similarity, the two entropies do seem to relate to very different things. However, there are different views on this.

In any event, I think that the "energy dispersal" concept is better and easier to grasp than a concept of "disorder" or a concept based on information theory. It may also be more intuitive than a concept based on micro states.

When energy spreads out, entropy increases. When energy becomes more concentrated, entropy decreases. Fundamentally, however, the second law is a statistical law. It says essentially that nature does not tend toward physically possible but statistically improbable configurations. So while we can think about it as having to do with energy dispersal it is really about probability. As long as we realize that, it seems to me that "energy dispersal" is one way to conceptualize entropy.
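
The statistical character is easy to quantify with a standard example (not from the thread): the probability that N independently wandering gas molecules happen to be found all in the left half of a box is

$$
P = 2^{-N},
$$

so for N = 100 molecules P is already about $10^{-30}$, and for a mole of gas it is zero for every practical purpose. Nothing forbids the fluctuation; it is simply statistically invisible.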

AM
 
Last edited:
  • #39
Andrew Mason said:
Equating "energy dispersal" with entropy has problems as well. It is more accurate to think of an increase in entropy as a change in thermodynamic state (macro state) for which the number of equivalent microstates increases. This better explains the Gibbs paradox regarding mixing of gases which the "energy dispersal" concept does not do so well.

Frank Lambert's opposition the "disorder" approach is tied to his opposition to linking or equating "thermodynamic entropy" with Shannon's concept of entropy based on information theory. In my view, while there is an interesting mathematical similarity, the two entropies do seem to relate to very different things. However, there are different views on this.

In any event, I think that the "energy dispersal" concept is better and easier to grasp than a concept of "disorder" or a concept based on information theory. It may also be more intuitive than a concept based on micro states.

When energy spreads out, entropy increases. When energy becomes more concentrated, entropy decreases. Fundamentally, however, entropy is a statistical law. It says essentially that nature does not to tend toward physically possible but statistically improbable configurations. So while we can think about it as having to do with energy dispersal it is really about probability. As long as we realize that, it seems to me that "energy dispersal" is one way to conceptualize entropy.

The Shannon entropy concept is exactly the same as that based on microstates in statistical physics. Both are basically the "number of states" or "number of possibilities". In information theory, these go by the name of "typical sequences", and the relevant theorem is something called "asymptotic equipartition", which is the rigorous version of what physicists learn. The Shannon mutual information is then basically the change in entropy. Gaining information is a reduction in entropy, which makes sense in that if one is certain, then there is a reduction in the number of possibilities from many to only one.
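
For concreteness, here is the Shannon formula in code (a standard definition, nothing thread-specific): for a uniform distribution over W microstates it reduces to log W, i.e., exactly Boltzmann's counting.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log p), in nats."""
    return -sum(p * math.log(p) for p in probs if p > 0)

W = 16
uniform = [1.0 / W] * W
print(shannon_entropy(uniform), math.log(W))  # equal: H = ln W
print(shannon_entropy([1.0]))                 # certainty: H = 0
```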

The microstate definition is important in attempts like Boltzmann's use of kinetic theory to understand how irreversibility can arise from reversible laws of dynamics, and the basic idea that there is a loss of information due to the limited resolution of our measuring instruments - or in the Landauer explanation - due to the erasure of information due to finite memory.

I do agree that it is important to learn the classical equilibrium thermodynamic concept of entropy, because it comes from the Kelvin and Clausius statements of the second law, which are basically experimental facts. And this definition can be used independently of the microstate definition.

However, if Lambert opposes the Shannon definition, then he is also erroneously opposing the microstate definition.
 
Last edited:
  • #40
entropy, disorder, and dispersion

Thermodynamic entropy is a precisely defined concept. Informational entropy is a statistical mechanics concept of entropy, first introduced by Boltzmann, refined by Gibbs, and re-discovered and applied to wider fields by Shannon. It seems to be a different concept, but Boltzmann was, with difficulty, able to prove that it is very closely related to thermodynamic entropy, provided one assumes the Stosszahlansatz or, as it is often called in English, 'molecular chaos', a hypothesis which is approximately true.

The intuitive concept of 'disorder' is the normal way to motivate the definition of entropy, but 'disorder' cannot really be given a precise definition except by using informational entropy: the number of micro-states compatible with our knowledge that the system is in a given macro-state.
The increasing disorder in a deck of cards produced by shuffling is a traditional example used to teach students about the statistical mechanical definition of entropy; it goes back at least to Sir James Jeans in 1900. Prof. Lambert and a few other fringe figures seem to be allergic to the deck of cards metaphor. It is only a metaphor, but they have carried out a crusade against the use of the concept of 'disorder' even though it has been proven useful in the discussion of lambda phase transitions.
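
To put a number on the card metaphor (my own quick calculation): a perfectly shuffled deck corresponds to a uniform distribution over 52! orderings, i.e., about 226 bits of missing information about the exact sequence.

```python
from math import lgamma, log

# log2(52!) computed via the log-gamma function: lgamma(n + 1) = ln(n!)
bits = lgamma(52 + 1) / log(2)
print(f"log2(52!) = {bits:.1f} bits")  # ~225.6 bits
```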

The intuitive concept of 'energy dispersal' is preferred by some of these fringe figures, but their claims that it has been adopted by Atkins's eighth edition of his standard textbook are false. (And who knows how many of their other claims about it being adopted in texts are false or exaggerated.) What Atkins actually says is very sensible, so I reproduce it here.

"The concept of the number of microstates makes quantitative the ill-defined qualitative concepts of 'disorder' and 'the dispersal of matter and energy' that are used widely to introduce the concept of entropy: a more 'disorderly' distribution of energy and matter corresponds to a greater number of microstates associated with the same total energy." --- p. 81

p. 143. "This increase in entropy is what we expect when one gas disperses into the other and the disorder increases."

On the other hand, it would be easier to make the intuitive concept of 'dispersal of energy' precise and quantitative, but then it disagrees with the precise and quantitative definition of entropy, as far as I can tell. I don't know that this fringe group has ever done so, but suppose for the sake of discussion we define dispersal as the 'standard deviation of the spatial distribution of energy' (supposing, for simplicity, that the density of matter is fixed, constant, and uniform). (I am well aware this gives it the wrong units...)
 
Last edited:
