Is entropy a measure of disorder?

AI Thread Summary
The discussion centers on the definition of entropy and its relationship to the concept of disorder. It highlights that traditional definitions, such as those by Clausius, do not explicitly link entropy to disorder, which is often viewed as an intuitive rather than a scientific concept. The Boltzmann-Gibbs interpretation connects entropy to the number of microstates but does not equate it with disorder, as demonstrated by examples of disordered systems at low temperatures. Participants express concern that the common portrayal of entropy as a measure of disorder is misleading and oversimplifies complex concepts. The conversation suggests that a more accurate understanding of entropy could involve its relationship to information theory, where disorder might relate to the amount of information needed to describe a system.
Aidyan
Is entropy a measure of "disorder"?

In textbooks I never saw a definition of entropy given in terms of a "measure of disorder", and I am wondering where this idea comes from. Clausius defined it as an "equivalent measure of change", but I do not see the relation to a concept of "order" or "disorder", which, as far as I know, are only intuitive concepts, not defined in physics. In statistical mechanics, Boltzmann and Gibbs told us that entropy can be interpreted as (proportional to) the logarithm of the number of possible microstates, but again, that isn't, for me, a definition of disorder. In fact, we could equally imagine a completely disordered system at (almost) absolute zero temperature and zero entropy, and nevertheless with a completely amorphous (i.e. disordered) structure.
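(For reference, the Boltzmann-Gibbs relation alluded to here is usually written as
$$S = k_\mathrm{B} \ln \Omega,$$
where $\Omega$ is the number of microstates compatible with the given macrostate.)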

And yet lots of people, even renowned physicists, continue to speak of entropy as a measure of disorder when addressing the public. I could never understand where this comes from. I understand that popularizing difficult concepts for an audience of non-experts requires simplified analogies, but everyone could equally well form an intuitive idea of entropy as a "measure of change" or a "measure of all possible configurations". I perceive the identification of entropy with disorder as simply false and misleading. And what is surprising is that it is primarily professional physicists who are responsible for this. Or... am I missing something?
 
maajdl said:
What would you take as a definition of disorder?
And how would you match it to the entropy?
Another path you might follow is based on information theory.
Eventually, you might then reconcile the wording used in this field.
Disorder is then probably equivalent to "little information needed".
See also:

http://en.wikipedia.org/wiki/Entropy
http://en.wikipedia.org/wiki/Information_theory

No, in information theory entropy can be seen as something like a measure of the lack (or amount) of information, which is in line with the Boltzmann-Gibbs interpretation (where, in some sense, it may be seen as a measure of our ignorance of the actual microstate of a system versus all the possible microstates), but that has nothing to do with disorder. I could have all the information on the whereabouts of the molecules and atoms in a frozen substance at almost absolute zero (i.e. with zero entropy), and yet have no crystal structure at all: a completely disordered and amorphous configuration.
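For concreteness, the information-theoretic entropy being referred to is Shannon's $H = -\sum_i p_i \log_2 p_i$. A minimal sketch (the probability distributions are made up purely for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Microstate known with certainty: no missing information.
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))      # 0.0 bits

# Four equally likely microstates: two bits of missing information.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits
```

High entropy means more missing information about which microstate is realized; whether one wants to call that "disorder" is exactly the point under discussion.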
 
Aidyan said:
In statistical mechanics, Boltzmann and Gibbs told us that entropy can be interpreted as (proportional to) the logarithm of the number of possible microstates, but again, that isn't, for me, a definition of disorder.
It depends on what you call disorder. Take a well-shuffled deck of cards. I would call the deck "ordered" if, for instance, the cards were in order of suit, or separated by color, or with all the aces, 2s, 3s, etc. together. If you consider all the possible microstates, you will find that very few of the 52! possible orderings are ones you would call ordered, so an ordered state corresponds to low entropy.
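A rough counting sketch of the card example (which arrangements count as "ordered" is of course a matter of choice; here only the 4! decks in which each suit is sorted by rank are counted):

```python
import math

# Total microstates: every distinct ordering of a 52-card deck.
total_states = math.factorial(52)

# An illustrative "ordered" macrostate: each suit sorted by rank,
# with the four suits appearing in any of the 4! possible orders.
ordered_states = math.factorial(4)

# Boltzmann-style entropy in units of k_B: S = ln(W).
print(f"S(well shuffled)/k_B = {math.log(total_states):.1f}")    # ~156.4
print(f"S(ordered)/k_B       = {math.log(ordered_states):.1f}")  # ~3.2
print(f"Fraction of decks that count as ordered: {ordered_states / total_states:.1e}")
```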

Aidyan said:
In fact, we could equally imagine a completely disordered system at (almost) absolute zero temperature and zero entropy, and nevertheless with a completely amorphous (i.e. disordered) structure.
Which is indeed a higher entropy state. See residual entropy.
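As a concrete example of residual entropy (an aside, using the textbook case of ordinary ice, where the proton arrangement stays disordered as T → 0 and Pauling estimated roughly (3/2)^N allowed configurations):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

# Pauling's estimate for ice: W ~ (3/2)^N allowed hydrogen arrangements,
# giving a molar residual entropy of S = R * ln(3/2).
S_residual = R * math.log(1.5)
print(f"Residual entropy of ice ~ {S_residual:.2f} J/(mol*K)")  # ~3.4, close to the measured value
```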
 
Aidyan said:
In fact, we could equally imagine a completely disordered system at (almost) absolute zero temperature and zero entropy, and nevertheless with a completely amorphous (i.e. disordered) structure. [...] I perceive the identification of entropy with disorder as simply false and misleading. Or... am I missing something?

An amorphous solid has a higher entropy than a crystal that has a regular lattice.

The connection between entropy and disorder is that if a system is disordered, then it requires many parameters to completely describe its state. If you have 10^23 atoms with no particular pattern to their states, then a complete description of the state (classically) would require giving 10^23 positions and momenta. In contrast, if those atoms are arranged neatly into a crystal lattice, then describing the state only requires giving the number of particles and the dimensions of a single cell of the crystal.
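A crude way to make this description-length argument tangible (not part of the original reply) is to use a general-purpose compressor as a stand-in for the number of parameters needed to describe a configuration:

```python
import os
import zlib

N = 100_000  # number of "atoms", purely illustrative

# A regular "lattice": one four-byte unit cell repeated throughout.
lattice = bytes([1, 2, 3, 4]) * (N // 4)

# A "disordered" configuration: independent random bytes.
disordered = os.urandom(N)

# Compressed size is a rough proxy for the length of a complete description.
print("lattice    :", len(zlib.compress(lattice)), "bytes compressed")
print("disordered :", len(zlib.compress(disordered)), "bytes compressed")
```

The periodic configuration compresses down to almost nothing (essentially the unit cell plus a repeat count), while the random one barely compresses at all, mirroring the crystal-versus-amorphous comparison above.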
 
Ahh... ok. Now this sounds better... Thanks.
 