Entropy is disorder = outmoded theory?

In summary: the entropy of a system is not a measure of how much disorder or chaos exists in that system, but of the energy dispersal within that system. It is also not clear whether Frank Lambert and Peter Atkins actually agree on this point.
  • #1
Ray Eston Smith Jr
"entropy is disorder" = outmoded theory??

The Wikipedia article I quote below is confusing me. I followed the links to Frank Lambert's website, where he claims that Peter Atkins, in the 8th edition of his Physical Chemistry, has come around to Lambert's idea that entropy is not related to disorder. Could this be true? I would be less surprised to learn that Atkins had become a born-again Christian. However, according to my old-fashioned understanding of entropy, even very improbable things happen occasionally, given a sufficiently huge number of trials. Seriously, I was about to start reading the 7th edition of Atkins' Physical Chemistry, but now I'm wondering if I should look for a more reliable author.

http://en.wikipedia.org/wiki/Introduction_to_entropy
Traditionally, 20th century textbooks have introduced entropy as order and disorder so that it provides "a measurement of the disorder or randomness of a system". It has been argued that ambiguities in the terms used (such as "disorder" and "chaos") contribute to widespread confusion and can hinder comprehension of entropy for most students. A more recent formulation associated with Frank L. Lambert describing entropy as energy dispersal describes entropy as measuring "the spontaneous dispersal of energy — at a specific temperature."

http://entropysite.oxy.edu/
March 2006
Atkins' "Physical Chemistry" has been the best selling text worldwide in this subject for many years.* The new 8th edition was published in the US March 16. However, in previous editions Atkins described systems and entropy change in terms of order and disorder or chaos.* Even though he long has used the phrase, "the dispersal of energy", it was confined to an order-disorder view of thermodynamics — for example, to spontaneous changes being "always accompanied by a dispersal of energy into a more disordered state".** (Or "to chaos" in his book "The Second Law".)
In contrast to the Second Law chapter of the 7th edition, which had some 27 instances of using "order to disorder" as a rationale for change, "disorder" and "disorderly" are mentioned only 3 times in the new 8th edition.* Atkins, with co-author dePaula, now state that their view of entropy "summarized by the Boltzmann formula is consistent with our previous statement [earlier in the chapter, re the dispersal of energy in classical thermodynamics] that the entropy is related to the dispersal of energy."
 
  • #2


How is dispersal of energy an explanation for why a drop of dye disseminates into a glass of water? Isn't there just a certain amount of molecular motion due to heat-energy, which causes particles to tend to move around randomly such that the dye particles are likely to randomly traverse the particles of clear water and vice versa? Is that energy dispersion or just expression of energy as molecular motion?
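That random-motion picture is easy to check numerically. Below is a minimal sketch (not from any text discussed in this thread; the particle and step counts are arbitrary): unbiased random steps alone make the "dye" spread, with mean squared displacement growing roughly linearly in the number of steps.

```python
import random

def spread_after(n_particles, n_steps, seed=0):
    """Let each 'dye particle' take n_steps unit steps left or right
    from the origin; return the mean squared displacement."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_particles):
        x = 0
        for _ in range(n_steps):
            x += rng.choice((-1, 1))
        total += x * x
    return total / n_particles

# Random thermal motion alone disperses the dye: quadrupling the
# number of steps roughly quadruples the mean squared displacement.
print(spread_after(2000, 100))
print(spread_after(2000, 400))
```

Whether one calls this "energy dispersal" or simply molecular motion, the spreading itself needs nothing more than randomness.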
 
  • #3


Originators of new ideas are always claiming support from others...who knows what Atkins believes and whether that REALLY meshes with Lambert's idea.

I would not disregard a proven text because of such a claim. I've never even heard of "dispersal" of energy nor do I know whether Atkins/Lambert agree on what it is.

Entropy is a difficult enough concept that if someone has a better explanation, I'd like to read it.
 
  • #4


Naty1 said:
Originators of new ideas are always claiming support from others...who knows what Atkins believes and whether that REALLY meshes with Lambert's idea.

I would not disregard a proven text because of such a claim. I've never even heard of "dispersal" of energy nor do I know whether Atkins/Lambert agree on what it is.

Entropy is a difficult enough concept that if someone has a better explanation, I'd like to read it.

Energy "dispersal" is a perfectly logical concept. It is what happens when a cue ball breaks a triangle of billiard balls. If you have a room of 40-degree air and you make a fire, convection results from air molecules next to the fire, with high kinetic energy, dispersing energy to other molecules they come into contact with. Eventually the system temperature will approach equilibrium, except that the heated air is pushed up by the denser cool air, which grows as parts of the warm air disperse their energy through the windows and walls and subsequently sink.

Dispersal is just a pattern of transfer between high-energy particles and lower-energy ones. It's empirical common-sense, no?
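The cue-ball picture can be sketched in a few lines (a toy model with arbitrary numbers, offered only as an illustration): start with all the energy on one particle and let randomly chosen pairs share their energy; the initial concentration spreads out while the total is conserved.

```python
import random

def disperse(n=50, steps=5000, seed=1):
    """Start with all energy on one particle; repeatedly pick a random
    pair and split their combined energy randomly between them.
    Returns (initial max energy, final max energy, mean energy)."""
    rng = random.Random(seed)
    energy = [0.0] * n
    energy[0] = 100.0
    initial_max = max(energy)
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        pool = energy[i] + energy[j]  # each exchange conserves energy
        share = rng.random()
        energy[i], energy[j] = share * pool, (1.0 - share) * pool
    return initial_max, max(energy), sum(energy) / n

init_max, final_max, mean = disperse()
# The initial spike of energy spreads over the whole population,
# while the mean (total / n) stays fixed at 2.0.
print(init_max, final_max, mean)
```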
 
  • #5


Actually, entropy has a one-to-one relation to the probability of that particular state being the final state after a "sufficient amount of time" has passed. The relation is a very steep exponential, so even slightly higher entropy is extraordinarily favoured by probability.
But lower-entropy states still have a finite probability and will occur.
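How steep that exponential is follows from Boltzmann's S = k ln W: the multiplicity (and hence probability) ratio of two macrostates is W2/W1 = exp(ΔS/k). A quick sketch, using an entropy difference chosen purely for illustration:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def log10_probability_ratio(delta_S):
    """From S = k ln W: the multiplicity ratio of two macrostates is
    W2/W1 = exp(delta_S / k). Returned as a base-10 logarithm, since
    the ratio itself overflows any float for macroscopic delta_S."""
    return delta_S / k_B / math.log(10.0)

# An entropy difference of only 1e-21 J/K, far below anything
# measurable, already favours the higher-entropy state by a factor
# of ten to the power of roughly 31.
print(log10_probability_ratio(1e-21))
```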
 
  • #6


Ray Eston Smith Jr said:
The wikipedia article I quote below is confusing me.

There are several potentially confusing things in those sites. First, entropy relates to a given state, and so cannot be used to describe a process ('dispersal' of energy). Second, there seems to be loose control over discussion of entropy and changes to entropy. Third, 'entropy', like 'energy' is not that well defined outside of quantitative relationships.

Personally, I see the entropy of a state as that amount of energy unavailable to perform useful work. The free energy is the total amount of energy available to perform useful work.
 
  • #7


I wouldn't say it's an "outmoded theory". It's not a theory and never was. It's just an interpretation that helps explain it. As such it may be an outmoded way of explaining entropy.

Equating entropy with disorder only makes sense in a few cases. Gas is more 'disordered' than liquid, both are more disordered than a solid. Mixtures are more disordered than pure substances.

But beyond that, equating entropy with disorder doesn't have much use. Another, more general definition of entropy is "energy not available to perform work". How this relates to entropy being 'disorder' isn't obvious. (In fact, I'd say that once you reconcile these views, that's when you 'get' entropy)
 
  • #8


There are relationships between disorder, unavailable energy, and thermodynamic entropy. However, those relationships do not, to me, amount to a tight enough explanation of what thermodynamic entropy is, as discovered and defined by Clausius.

If thermodynamic entropy is a measure of unusable energy, or a measure of disorder, then why are equilibrium conditions necessary to calculate it? In other words: what is the relationship between absorbing and expelling heat under equilibrium conditions (clearly an ideal condition, but still necessary in the original macroscopic thermodynamic definition) and the amount of heat that leaks away?

There has already been a very detailed excellent discussion about the development of thermodynamic entropy in another thread. My intention in making this post is only to register an opinion and not to reopen the history of thermodynamic entropy. Even my question is posed without expecting an answer. It is intended to highlight something that I think is of significant importance, and, that I think is passed over in explanations.

James
 
  • #9


James A. Putnam said:
If thermodynamic entropy is a measure of unusable energy, or a measure of disorder, then why are equilibrium conditions necessary to calculate it? In other words: what is the relationship between absorbing and expelling heat under equilibrium conditions (clearly an ideal condition, but still necessary in the original macroscopic thermodynamic definition) and the amount of heat that leaks away?

Heat can only "leak away" as long as there is something in the system that is not in equilibrium with whatever is leaking the heat.
 
  • #10


brainstorm said:
Heat can only "leak away" as long as there is something in the system that is not in equilibrium with whatever is leaking the heat.

Nevertheless, that is a condition of the original definition. Is your point that the original definition is inherently mistaken and can therefore be disregarded in favor of other definitions that do not have to take into account that which Clausius discovered?
 
  • #11


Equating entropy with disorder only makes sense in a few cases...

But beyond that, equating entropy with disorder doesn't have much use.

Whoa!
entropy as a measure of disorder has WIDE application...to the evolution of the entire universe, for example, so long as one understands the effects of gravity on entropy and disorder.

Entropy is also a subset of information theory; it is extremely useful in understanding black-hole event horizons, and it is linked to decoherence in quantum mechanics, which, like entropy, lets you know which way time is running. The idea that strings in string theory have entropy goes back to the earliest days of string theory.
 
  • #12


James A. Putnam said:
Nevertheless, that is a condition of the original definition. Is your point that the original definition is inherently mistaken and can therefore be disregarded in favor of other definitions that do not have to take into account that which Clausius discovered?

The genealogy and hermeneutics of "the original definition" may be of consequence if your interest is the history of science, but empirical logic makes it clear that heat can't go anywhere unless there is something in disequilibrium with something else. The fact that you define the system in exclusion of the thing that is in disequilibrium with it is a methodological failure to include everything in the system that is influencing the parts of it you are studying.
 
  • #13


brainstorm said:
The fact that you define the system in exclusion of the thing that is in disequilibrium with it is a methodological failure to include everything in the system that is influencing the parts of it you are studying.

Clausius defined the system.
 
  • #14


James A. Putnam said:
Clausius defined the system.

I don't know which system you're saying Clausius defined, but I doubt it is any specific system that you have access to study empirically. You have to apply the methodology to test it.
 
  • #15


The system is an irreversible carnot cycle.
 
  • #16


James A. Putnam said:
The system is an irreversible carnot cycle.

In other words, theoretical rather than empirical.
 
  • #17


Of course. So is a frictionless surface. Still, the question remains. Is your point that the original definition is inherently mistaken and can therefore be disregarded in favor of other definitions that do not have to take into account that which Clausius discovered? This is not my work. Clausius discovered thermodynamic entropy.
 
  • #18


James A. Putnam said:
Of course. So is a frictionless surface. Still, the question remains. Is your point that the original definition is inherently mistaken and can therefore be disregarded in favor of other definitions that do not have to take into account that which Clausius discovered? This is not my work. Clausius discovered thermodynamic entropy.

My point is that if heat is moving in a system that is "in equilibrium," then there is some element that is receiving the heat, which means that the system is not in equilibrium, probably because it has been defined in exclusion of whatever it is that is receiving the heat.
 
  • #19


Ok you have made your point. I posted my first message because my opinion is that Clausius' discovery is of fundamental importance and remains unexplained. It was ideal and impractical, but it worked and gave us knowledge of a very important property. For some reason, the conditions that he defined are necessary to correctly calculate thermodynamic entropy. I think it is important to know why. If you disagree, that is fine.

We have reached a point where you are repeating your message. I understood your message the first time. You are correct in what you say. However, the ideal circumstances can be very closely approximated by having the heat exchanged very very slowly. The use of very very slow actions is common in the derivations of thermodynamic properties since thermodynamics is defined as applying to equilibrium conditions. Temperature is an equilibrium condition, yet we can still make use of the concept of changes in temperature if the process proceeds very very slowly.
 
  • #20
The reason some authors want to dissociate entropy from disorder has nothing to do with rejecting the idea that entropy increase involves mixing unlikely macrostates into statistically more probable macrostates, or with the uniformity of a system. That remains, just as your old-fashioned understanding of entropy indicates. The term "disorder" in thermodynamics carries almost the opposite of its everyday meaning, and discussions of this are often misunderstood as somehow rejecting the classical interpretation, which is not the case.

The "at a specific temperature" qualifier in Lambert's definition refers only to the overall temperature of the system in question, not to a uniform temperature throughout that system.
You can read Lambert's own words here:
http://entropysite.oxy.edu/entropy_isnot_disorder.html
And in J. Chem. Educ., 2002, 79 (2), p 187
http://pubs.acs.org/doi/abs/10.1021/ed079p187

Then there are micro-state issues I find interesting:
(preprint) http://arxiv.org/abs/1005.1683

brainstorm said:
My point is that if heat is moving in a system that is "in equilibrium," then there is some element that is receiving the heat, which means that the system is not in equilibrium, probably because it has been defined in exclusion of whatever it is that is receiving the heat.
Heat does not have an existence separate from the motion and physical state of the system that defines it. It is an abstraction of the state of things, not itself a substance that flows. Take a tank of gas with no heat flowing in or out of it, a perfectly enclosed system in thermal equilibrium. Does this mean heat is not moving in this system? No!

Heat, at the molecular level, is the kinetic energy of the molecules, (1/2)mv^2, a product of molecular mass and velocity. The only way to stop heat from moving in this system is to stop all molecular motion within the tank relative to the tank. This can NEVER happen without heat escaping the tank. The 2nd law makes no claim that motion eventually stops as entropy increases. So what does "in equilibrium" mean?

"In equilibrium" merely means that, on average, the temperature is the same everywhere in the tank, such that if you read the temperature at one place in the tank you know the temperature everywhere in the tank. Yet the molecules are still bouncing around trading energy. Thus heat is perpetually moving in a system, whether in equilibrium or not. The 2nd law, entropy, only requires heat in non-equilibrium to evolve toward equilibrium, not stop moving.
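That reading of "in equilibrium" can be illustrated numerically (the exponential energy distribution and all parameter values here are illustrative assumptions, not from any source): individual molecular energies vary wildly, yet every region of the tank shows practically the same average.

```python
import random

def region_averages(n=100_000, regions=4, mean_ke=1.0, seed=2):
    """Give each molecule a kinetic energy drawn from an exponential
    (Boltzmann-like) distribution, scatter the molecules uniformly
    over regions of the tank, and return each region's average."""
    rng = random.Random(seed)
    sums = [0.0] * regions
    counts = [0] * regions
    for _ in range(n):
        r = rng.randrange(regions)
        sums[r] += rng.expovariate(1.0 / mean_ke)
        counts[r] += 1
    return [s / c for s, c in zip(sums, counts)]

# Each region reads essentially the same average energy (temperature),
# even though the individual molecules never stop trading energy.
print(region_averages())
```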
 
  • #21


my_wan said:
Heat, at the molecular level, is the kinetic energy of the molecules, (1/2)mv^2, a product of molecular mass and velocity. The only way to stop heat from moving in this system is to stop all molecular motion within the tank relative to the tank. This can NEVER happen without heat escaping the tank. The 2nd law makes no claim that motion eventually stops as entropy increases. So what does "in equilibrium" mean?

"In equilibrium" merely means that, on average, the temperature is the same everywhere in the tank, such that if you read the temperature at one place in the tank you know the temperature everywhere in the tank. Yet the molecules are still bouncing around trading energy. Thus heat is perpetually moving in a system, whether in equilibrium or not. The 2nd law, entropy, only requires heat in non-equilibrium to evolve toward equilibrium, not stop moving.
No. Heat is not energy. Energy is a microscopically conserved quantity. Heat is a macroscopic thermodynamic concept and really no sentence should start "Heat is, at the molecular level..."

And no. Thermodynamic equilibrium is a much stronger condition than 'merely' having the same temperature everywhere. Thermodynamic equilibrium requires that the system's properties be fully determined by specifying extensive thermodynamic parameters: energy, volume, mole numbers. Any system whose properties depend on what happened in the past is not in equilibrium, although it may be at the same temperature as something else.
 
  • #22


James A. Putnam said:
Ok you have made your point. I posted my first message because my opinion is that Clausius' discovery is of fundamental importance and remains unexplained. It was ideal and impractical, but it worked and gave us knowledge of a very important property. For some reason, the conditions that he defined are necessary to correctly calculate thermodynamic entropy. I think it is important to know why. If you disagree, that is fine.

We have reached a point where you are repeating your message. I understood your message the first time. You are correct in what you say. However, the ideal circumstances can be very closely approximated by having the heat exchanged very very slowly. The use of very very slow actions is common in the derivations of thermodynamic properties since thermodynamics is defined as applying to equilibrium conditions. Temperature is an equilibrium condition, yet we can still make use of the concept of changes in temperature if the process proceeds very very slowly.

Our concerns are just different. You are more interested in approximating this Clausius model for purposes of validating his model, which I am largely unfamiliar with. My interest was in establishing the fact that system-delineation involves human bias due to the unwillingness to recognize that a system cannot have an "outside" that is wholly isolated from the system itself. In other words, the boundary of the system is just a black-box for everything excluded by system-delineation. No system is ultimately closed except perhaps the universe containing all possible subsets, but in that case closure itself would seem to be a misapplied concept since the universe contains all possible potentialities and is therefore not "closed" to anything existent.
 
  • #23


My Wan, good post - especially the point about heat still existing as molecular motion/KE in a system in equilibrium. Also, good point about heat itself not being a substance. Heat, volume, and pressure can be described as three dimensions of a system, imo. If volume is constant, the heat expresses itself as pressure, which indicates that heat is molecular momentum, imo. I realize that infrared radiation is also heat, but I think it could be argued that heat results from all radiation, including infrared - but maybe I'm missing something. Whoever posted that heat has nothing to do with molecular energy, I can't understand how they could say that.

my_wan said:
"In equilibrium" merely means that, on average, the temperature is the same everywhere in the tank, such that if you read the temperature at one place in the tank you know the temperature everywhere in the tank. Yet the molecules are still bouncing around trading energy. Thus heat is perpetually moving in a system, whether in equilibrium or not. The 2nd law, entropy, only requires heat in non-equilibrium to evolve toward equilibrium, not stop moving.
If all molecules are moving at the same velocity in a system, would you describe the energy exchanges of collision as transfers or just direction-changes?
 
  • #24


brainstorm said:
Our concerns are just different. You are more interested in approximating this Clausius model for purposes of validating his model, which I am largely unfamiliar with. My interest was in establishing the fact that system-delineation involves human bias due to the unwillingness to recognize that a system cannot have an "outside" that is wholly isolated from the system itself. ...

My interest is not in approximating the model. The model is based upon infinitesimally small changes over a very long period of time. That condition brings the model very close to replicating reality. My interest is in learning what it is that Clausius discovered. His definition of thermodynamic entropy is unique. I think it is important not to let that uniqueness go unaddressed or passed over. That is all. None of what I have said is original to me.

With respect to the lack of perfect isolation in this universe, we can do very well in getting close enough to it to learn from even ideal cases. I think that your objection does not effectively detract from the value of Clausius' work. There is no such thing as a perfectly adiabatic wall of separation for the purposes of experimentation, but it can be designed to be sufficiently effective to justify speaking in terms of adiabatic separation. If your point is that any remaining imperfection invalidates ideal theoretical analysis, I do not agree.
 
  • #25


James A. Putnam said:
My interest is not in approximating the model. The model is based upon infinitesimally small changes over a very long period of time. That condition brings the model very close to replicating reality. My interest is in learning what it is that Clausius discovered. His definition of thermodynamic entropy is unique. I think it is important not to let that uniqueness go unaddressed or passed over. That is all. None of what I have said is original to me.

With respect to the lack of perfect isolation in this universe, we can do very well in getting close enough to it to learn from even ideal cases. I think that your objection does not effectively detract from the value of Clausius' work. There is no such thing as a perfectly adiabatic wall of separation for the purposes of experimentation, but it can be designed to be sufficiently effective to justify speaking in terms of adiabatic separation. If your point is that any remaining imperfection invalidates ideal theoretical analysis, I do not agree.

Again, it is not clausius' or anyone else's work that I'm interested in. Nor is it any kind of near-ideal isolation of systems. My interest is in establishing that no such thing as a closed-system can be delineated operationally. The significance of this is that no system can reach equilibrium in practice without there being some extraneous factors that are in disequilibrium with it and therefore render total entropy an empirical impossibility.
 
  • #26


My message concerned Clausius' work. I am interested in it. Insignificant extraneous factors are not important and do not detract from it. Your interest concerns something that is separate from mine. I will leave it at that. You may have the last word.
 
  • #27


James A. Putnam said:
My message concerned Clausius' work. I am interested in it. Insignificant extraneous factors are not important and do not detract from it. Your interest concerns something that is separate from mine. I will leave it at that. You may have the last word.

Ok, my last word is that I don't understand how you can call system-extraneous factors insignificant if they fundamentally undermine the possibility of any real system ever reaching actual equilibrium. Now, if you want to talk in terms of relative equilibrium, that could be a different story.
 
  • #28


It should be noted that I specified a tank of gas. That is effectively an ideal gas at constant volume in an enclosed system. This was specifically to avoid a general treatment of the full range of thermodynamic potentials, including radiation. That would be overly complex for making the point that heat is not a substance in itself, and the conditions were specified such that I could safely ignore them. When I said heat was a product of molecular velocity, it was in the same sentence in which I specified (1/2)mv^2, so particulate mass was not absent from that product.

Yes peteratcam, heat is energy. In the ideal-gas situation I specified, it is the equipartition of kinetic energy among all the degrees of freedom of the gas molecules (translational and rotational). Just because it makes no sense to call the kinetic energy of an individual particle heat does not mean heat is not defined by the equipartition of kinetic energy among a medium's degrees of freedom. Yet the heat production attributable to the kinetic energy of an individual particle is perfectly definable. That's why the space shuttle needs heat shields: it is not the air temperature doing that, and both the shuttle and the air are being heated, not just one heating the other. The inability to convert energy to mechanical work is not an indication of a lack of energy either. Increasing the heat in a fixed number of particles, given relativity, even increases the total mass of the system in direct proportion to the heat energy.

brainstorm said:
If all molecules are moving at the same velocity in a system, would you describe the energy exchanges of collision as transfers or just direction-changes?
Even if you somehow got all the molecules of a gas moving at the same velocity, they would not stay that way, even at thermal equilibrium. The same velocity only has statistical meaning for a sufficiently large sample of the molecules, which is why I specified an average. Equilibrium, in the ideal case specified, only means that it makes no difference which random group of molecules you pick: the average remains the same.

This entails that, at the level of individual molecules, you have a distribution of many different velocities characterized by the Boltzmann distribution. So, at this individual-particle level, it is not just constant direction changes. Higher-energy particles are constantly transferring energy to particles with lower kinetic energy and vice versa. If all the particle masses are the same, then when a fast particle hits a particle sitting motionless relative to the container, the fast particle stops and the motionless particle takes on the velocity and kinetic energy the fast particle had (assuming rotation is unaffected). So it is better described as kinetic-energy exchange than direction changes.

Of course once you bring in thermal radiation and other field interactions between the particles, the situation gets more complex. Yet the rules remain exactly the same, only how the degrees of freedom are defined changes.
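The equal-mass swap described above falls straight out of the standard 1-D elastic-collision formulas (momentum and kinetic energy both conserved); a minimal sketch:

```python
def elastic_1d(m1, v1, m2, v2):
    """Final velocities after a 1-D elastic collision, from
    conservation of momentum and of kinetic energy."""
    v1f = ((m1 - m2) * v1 + 2.0 * m2 * v2) / (m1 + m2)
    v2f = ((m2 - m1) * v2 + 2.0 * m1 * v1) / (m1 + m2)
    return v1f, v2f

# Equal masses: the moving particle stops dead and the stationary one
# takes off with the incoming velocity, a pure energy exchange.
print(elastic_1d(1.0, 5.0, 1.0, 0.0))  # (0.0, 5.0)
```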
 
  • #29


There's an interesting thermodynamic effect involving acceleration I've never seen described, but makes perfect sense.

It's well known that a helium balloon will migrate toward a heat source, because hot air is less dense than cold air. Now tie a helium balloon to the center console of an enclosed car, such that it floats just below the roof. Step on the gas and the balloon accelerates toward the front of the car; hit the brake and it accelerates toward the back, as if it were responding to inertial forces in the opposite direction from everything else in the car.

Can anybody work out the reason this happens? Clue: it has to do with the Boltzmann distribution, the range of momenta among the individual molecules.
 
  • #30


my_wan said:
Does the reason this happens not make sense to anybody? Clue: it has to do with the Boltzmann distribution, the range of particles momentums among individual molecules.

I suppose you could solve the problem that way, but it's much easier for this problem to think in terms of the buoyancy force.
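In the buoyancy picture, the accelerating car gives the cabin air an effective horizontal "gravity" of magnitude a, so the balloon feels a net forward force of about (rho_air - rho_He)*V*a. A rough sketch with assumed, typical numbers (none of these figures come from the thread):

```python
def balloon_horizontal_accel(rho_air, rho_gas, volume, accel):
    """In a car accelerating at `accel`, the air column behaves as if
    gravity had a horizontal component of magnitude `accel`. The
    horizontal buoyant force on the balloon is rho_air*V*a (toward
    the front), opposed by the balloon's own inertia rho_gas*V*a
    (the envelope's mass is neglected)."""
    net_force = (rho_air - rho_gas) * volume * accel
    return net_force / (rho_gas * volume)  # forward acceleration

# Assumed values: air ~1.2 kg/m^3, helium ~0.18 kg/m^3, a 10-litre
# balloon, car accelerating at 3 m/s^2. Positive result: the balloon
# leans toward the front, opposite to everything else in the car.
print(balloon_horizontal_accel(1.2, 0.18, 0.01, 3.0))
```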
 
  • #31


Andy Resnick said:
I suppose you could solve the problem that way, but it's much easier for this problem to think in terms of the buoyancy force.
I guess you could. My mindset was on micro-states, but you certainly don't have to think in terms of micro-states to solve it.
 
  • #32


Naty1 said:
Whoa!
entropy as a measure of disorder has WIDE application...


From a paper titled "Entropy" by Frank Lambert:

Quote from the paper:

"The definition, "entropy is disorder", used in all US first-year college and university textbooks prior to 2002, has been deleted from 15 of 16 new editions or new texts published since 2002[2]. Entropy is not ‘disorder’ [3] nor is entropy change a change from order to disorder."


Link to cited paper:

http://docs.google.com/viewer?a=v&q...0nO6kA&sig=AHIEtbQt85_upRLNPdIu3SnPB8k7sGkGCg


The terms "computer science" and "information science" are just as amorphous and undefinable as is the term "consciousness." Just sayin'. :)

-s
 
  • #33


Andy Resnick said:
Personally, I see the entropy of a state as that amount of energy unavailable to perform useful work.
While the change in entropy is related to the amount of heat energy that is not available to perform useful work, one has to be careful in the language used.

First of all, your statement might confuse people into thinking that entropy is a measure of energy. It isn't.

It also might lead students to believe that the "energy unavailable to perform useful work" is a function of the thermodynamic state of a substance. Rather, the "energy unavailable to perform useful work" is a function of the difference between two states.

Finally, this statement might lead students to believe that "energy unavailable to perform useful work" is proportional to entropy or to a change in entropy. If one defines "energy unavailable to perform useful work" as the difference between the maximum work theoretically obtainable in a (reversible) process between two thermodynamic states and the work actually obtained (sometimes referred to as lost work), this may be correct for some processes (where temperature is constant). Even then it is misleading, because this lost work is not all the energy unavailable for useful work; it is only part of the energy that cannot be converted to work.
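To make the lost-work bookkeeping concrete, here is a hedged numerical sketch (reservoir temperatures and heat values are arbitrary illustrations): compare an actual engine against the reversible (Carnot) limit operating between the same two reservoirs.

```python
def lost_work(q_hot, t_hot, t_cold, w_actual):
    """Work lost relative to a reversible process between the same
    reservoirs; the entropy generated equals the lost work divided
    by the cold-reservoir temperature."""
    w_max = q_hot * (1.0 - t_cold / t_hot)  # reversible (Carnot) limit
    w_lost = w_max - w_actual
    s_gen = w_lost / t_cold  # entropy generated by the irreversibility
    return w_max, w_lost, s_gen

# 1000 J drawn from a 600 K reservoir, rejected to 300 K, with an
# actual engine delivering only 300 J of work:
print(lost_work(1000.0, 600.0, 300.0, 300.0))  # (500.0, 200.0, ~0.667)
```

Note, echoing the post above, this lost work is only part of the heat that cannot become work: even the reversible engine must reject q_hot*t_cold/t_hot to the cold reservoir.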

AM
 
  • #34


brainstorm said:
My Wan, good post - especially the point about heat still existing as molecular motion/KE in a system in equilibrium. Also, good point about heat itself not being a substance. Heat, volume, and pressure can be described as three dimensions of a system, imo. If volume is constant, the heat expresses itself as pressure, which indicates that heat is molecular momentum, imo. I realize that infrared radiation is also heat, but I think it could be argued that heat results from all radiation, including infrared - but maybe I'm missing something. Whoever posted that heat has nothing to do with molecular energy, I can't understand how they could say that.
One has to be very careful because the terminology of thermodynamics was developed before we knew about molecules.

The first law of thermodynamics refers to three forms of energy: heat flow (Q), internal energy (U) and mechanical work (W). What you describe as "heat" is U (ie. the energy due to molecular motion), not Q. Q is heat flow: a transfer of energy into or out of a particular body of matter.

Since Q is commonly referred to as heat or heat flow, it might be better to refer to the kinetic energy at the molecular level as thermal energy or just internal energy rather than heat.
brainstorm said:
If all molecules are moving at the same velocity in a system, would you describe the energy exchanges of collision as transfers or just direction-changes?
If all molecules were moving at the same velocity, the gas would not be in thermal equilibrium and, hence, would have no well-defined temperature.
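That point can be illustrated by sampling an equilibrium velocity distribution (a sketch in arbitrary units; each Cartesian velocity component is Gaussian, which gives Maxwell-Boltzmann speeds in 3-D):

```python
import random
import statistics

def sample_speeds(n=20_000, sigma=1.0, seed=3):
    """At equilibrium each velocity component is Gaussian; the
    resulting 3-D speeds follow the Maxwell-Boltzmann distribution
    and are anything but identical."""
    rng = random.Random(seed)
    speeds = []
    for _ in range(n):
        vx, vy, vz = (rng.gauss(0.0, sigma) for _ in range(3))
        speeds.append((vx * vx + vy * vy + vz * vz) ** 0.5)
    return speeds

speeds = sample_speeds()
# A genuine spread of speeds, not one shared value: the standard
# deviation is a sizeable fraction of the mean.
print(statistics.mean(speeds), statistics.stdev(speeds))
```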

AM
 
  • #35
Lambert's woolly "definition" of entropy

Prof. Lambert is just confused. As an educator, he has found a distorted, simplified presentation that makes things easier for his students, but in fact it is in itself even more confused than anything he criticises about "disorder". He switches between dispersal of energy over physical volume and dispersal of energy over micro-states, apparently unaware that a system can only be entirely in or entirely out of a microstate so that all of its energy is in the microstate. And his idea of dispersal over physical position is obviously false: two states can have their local energy content completely, i.e., uniformly dispersed over the same volume but have different entropies.
To be fair, I would not criticize his idea of "energy dispersal" merely because he cannot offer a clear and definite definition of "dispersal", since one cannot do that for "disorder" either: the only clear quantitative definition of disorder would be to define it as the entropy itself, which would be circular. So that would be an unfair criticism.
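The microstate-counting point above can be made concrete with Boltzmann's S = k·ln Ω. As an illustration (the Einstein-solid model here is my own example, not one used in the thread), two systems can hold the same total energy in the same volume yet have different entropies, because entropy tracks the number of accessible microstates, not spatial spread:

```python
from math import comb, log

kB = 1.380649e-23  # Boltzmann constant, J/K

def einstein_entropy(N, q):
    """Boltzmann entropy S = kB * ln(Omega) for an Einstein solid of N
    oscillators sharing q energy quanta, where Omega = C(q + N - 1, q)."""
    return kB * log(comb(q + N - 1, q))

# Same total energy (20 quanta), same notional volume, but a different
# number of accessible microstates -> different entropy:
S_few  = einstein_entropy(N=10, q=20)
S_many = einstein_entropy(N=100, q=20)
print(S_few < S_many)  # True: entropy counts microstates, not volume
```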

Furthermore, his advocates like to talk about recent chemistry texts, but nearly all of the ones I can find are published by firms like Cengage or Alphascript, which, although not exactly vanity presses or self-publishers, certainly do not engage in the kind of thorough peer review that mainstream textbook publishers do.
 

1. What is entropy?

Entropy is a concept in physics and thermodynamics that refers to the measure of disorder or randomness in a system. It is often described as the tendency of a system to move towards a state of maximum disorder.

2. How is entropy related to disorder?

The concept of entropy is often associated with disorder because as the entropy of a system increases, the amount of disorder or randomness in the system also increases. This is because as a system becomes more disordered, there are more possible arrangements of its components, resulting in a higher entropy.
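The link between "more disorder" and "more arrangements" can be shown by counting. A small sketch (the coin model is my own illustration): for 10 coins, the perfectly ordered all-heads macrostate has exactly one microscopic arrangement, while the half-and-half macrostate has many:

```python
from math import comb

# Number of microscopic arrangements (multiplicity) for each macrostate
# of 10 coins, where a macrostate is "how many heads".
N = 10
multiplicities = {heads: comb(N, heads) for heads in range(N + 1)}

print(multiplicities[10])  # all heads: 1 arrangement (most ordered)
print(multiplicities[5])   # half heads: 252 arrangements (most disordered)
```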

3. Is entropy always increasing?

In a closed system, the total entropy will always tend to increase over time. This is known as the second law of thermodynamics. However, in an open system, the entropy can decrease locally as long as it is balanced by an increase in entropy elsewhere.
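A simple numeric check of the second law (illustrative values of my own choosing): when heat Q flows from a hot body at T_hot to a cold body at T_cold, the hot body loses entropy Q/T_hot but the cold body gains the larger amount Q/T_cold, so the total entropy of the closed system increases:

```python
# Total entropy change when heat Q (J) flows from a reservoir at T_hot
# (K) to one at T_cold (K): dS = -Q/T_hot + Q/T_cold.
def total_entropy_change(Q, T_hot, T_cold):
    return -Q / T_hot + Q / T_cold

dS = total_entropy_change(Q=1000.0, T_hot=400.0, T_cold=300.0)
print(dS > 0)  # True: the cold body gains more entropy than the hot body loses
```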

4. Why is the idea that "entropy is disorder" considered outmoded?

The idea that "entropy is disorder" is considered outmoded because it is an oversimplification of the concept of entropy. While entropy is often associated with disorder, it is a more complex concept that also takes into account the energy and information within a system.

5. How is entropy used in different fields of science?

Entropy is a fundamental concept in many fields of science, including physics, chemistry, biology, and information theory. It is used to understand and predict the behavior of systems, such as chemical reactions, climate patterns, and even the evolution of living organisms.
