Is entropy a measure of "disorder"?

In summary: an amorphous solid at (almost) absolute zero still carries residual entropy, so it is more disordered, and has higher entropy, than a crystal with a regular lattice.
  • #1
Aidyan

In textbooks I have never seen entropy defined as a "measure of disorder", and I wonder where this idea comes from. Clausius defined it as an "equivalent measure of change", but I do not see the relation to a concept of "order" or "disorder", which, as far as I know, are only intuitive notions, not defined in physics. In statistical mechanics, Boltzmann and Gibbs told us that entropy can be interpreted as the logarithm of the number of possible microstates, but again, this isn't, to me, a definition of disorder. In fact, we could equally imagine a completely disordered system at (almost) absolute zero temperature and zero entropy, and nevertheless with a completely amorphous (i.e. disordered) structure.

And yet lots of people, even renowned physicists, continue to speak of entropy as a measure of disorder when addressing the public. I have never understood where this comes from. I understand that popularizing difficult concepts for an audience of non-experts requires simplified analogies, but everyone could equally well form an intuitive idea of entropy as a "measure of change" or a "measure of all possible configurations". I perceive the identification of entropy with disorder as simply false and misleading. And what is surprising is that it is primarily professional physicists who are responsible for this. Or... am I missing something?
 
  • #2
maajdl
What would you take as a definition of disorder? And how would you match it to entropy?

Another path you might follow is based on information theory. You might then be able to reconcile the wording used in this field: disorder is then probably equivalent to "little information needed".

See also:

http://en.wikipedia.org/wiki/Entropy
http://en.wikipedia.org/wiki/Information_theory

  • #3
maajdl said:
Another path you might follow is based on information theory. You might then be able to reconcile the wording used in this field: disorder is then probably equivalent to "little information needed".

No, in information theory entropy can be seen as a measure of the lack (or amount) of information, which is in line with the Boltzmann-Gibbs interpretation (where, in some sense, it measures our ignorance of the real microstate of a system versus all the possible microstates), but that has nothing to do with disorder. I could have all the information on the whereabouts of the molecules and atoms in a frozen substance at almost absolute zero (i.e. with zero entropy), and yet with no crystal structure, a completely disordered and amorphous configuration.
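To make "amount of missing information" concrete, here is a minimal sketch (in Python, with toy distributions chosen purely for illustration) of Shannon's H = -Σ p·log2(p): it is zero when the microstate is certain and largest when all microstates are equally likely, which matches the Boltzmann-Gibbs picture of entropy as ignorance of the microstate.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2 p) of a distribution, in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# One certain outcome: the microstate is fully known, nothing is missing.
print(shannon_entropy([1.0]))       # 0.0 bits

# Four equally likely microstates: two bits are needed to pin one down.
print(shannon_entropy([0.25] * 4))  # 2.0 bits
```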
 
  • #4
Aidyan said:
In statistical mechanics, Boltzmann and Gibbs told us that entropy can be interpreted as the logarithm of the number of possible microstates, but again, this isn't, to me, a definition of disorder.
It depends on what you call disorder. Take a well-shuffled deck of cards. I would call the deck "ordered" if, for instance, the cards were in order of suit, or separated by color, or grouped with all the aces, 2s, 3s, etc. together. If you consider all the possible microstates, you will find that very few of the 52! possible arrangements are ones you would call ordered, so an ordered state is considered low entropy.
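To put rough numbers on this (a sketch in Python; which decks count as "ordered" is of course an arbitrary illustrative choice):

```python
import math

# Total microstates: every possible ordering of a 52-card deck.
total = math.factorial(52)

# A generous count of "ordered" decks: grouped by suit, any suit order,
# any order within each suit -- far looser than "fully sorted".
ordered = math.factorial(4) * math.factorial(13) ** 4

# Boltzmann-style entropy in units of k: S/k = ln(W).
print(math.log(total))    # ~156.4 for a shuffled deck
print(math.log(ordered))  # ~93.4 for the "ordered" macrostate
```

Even with that generous definition, ordered decks make up only about an e^-63 fraction of all arrangements, which is why the shuffled (disordered) macrostate is the high-entropy one.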

Aidyan said:
In fact, we could equally imagine a completely disordered system at (almost) absolute zero temperature and zero entropy, and nevertheless with a completely amorphous (i.e. disordered) structure.
Which is indeed a higher entropy state. See residual entropy.
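A concrete instance: Pauling's classic estimate for ice, where each H2O molecule keeps roughly 3/2 valid proton configurations even at 0 K, giving a residual molar entropy of S = R ln(3/2). A quick check of the number:

```python
import math

R = 8.314  # gas constant, J/(mol K)

# Pauling's estimate of the residual entropy of ice: S = R * ln(3/2).
print(R * math.log(1.5))  # ~3.37 J/(mol K), close to the measured ~3.4
```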
 
  • #5
Aidyan said:
In fact, we could equally imagine a completely disordered system at (almost) absolute zero temperature and zero entropy, and nevertheless with a completely amorphous (i.e. disordered) structure.

An amorphous solid has a higher entropy than a crystal that has a regular lattice.

The connection between entropy and disorder is that if a system is disordered, then it requires many parameters to completely describe its state. If you have 10^23 atoms with no particular pattern to their states, then a complete description of the state (classically) would require giving 10^23 positions and momenta. In contrast, if those atoms are arranged neatly into a crystal lattice, then describing the state only requires giving the number of particles and the dimensions of a single cell of the crystal.
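As a rough sketch of how that description length scales (in Python; the bit counts and unit-cell size below are arbitrary illustrative choices):

```python
import math

N = 10**23  # number of atoms
BITS = 32   # assumed precision per coordinate

# Disordered: each atom needs its own position and momentum
# (3 + 3 components), so the description grows linearly with N.
disordered_bits = N * 6 * BITS

# Crystal: a few atom positions in one unit cell, three cell
# dimensions, and the particle count describe the whole lattice.
crystal_bits = (4 * 3 + 3) * BITS + math.ceil(math.log2(N))

print(f"disordered: ~{disordered_bits:.2e} bits")  # ~1.9e25 bits
print(f"crystal:    ~{crystal_bits} bits")         # a few hundred bits
```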
 
  • #6
Ahh... ok. Now this sounds better... Thanks.
 

What is entropy?

Entropy is a concept in thermodynamics and statistical mechanics that measures the number of possible microscopic arrangements (microstates) of a system's particles or energy. Informally, it is often described as the amount of disorder or randomness in a system.
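For reference, the two standard definitions discussed in the thread above, in the usual notation (T is temperature, δQ_rev the reversibly exchanged heat, k_B Boltzmann's constant, and W the number of microstates compatible with the macrostate):

```latex
dS = \frac{\delta Q_{\mathrm{rev}}}{T} \quad \text{(Clausius)}
\qquad
S = k_B \ln W \quad \text{(Boltzmann)}
```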

How is entropy related to disorder?

Entropy is often described as a measure of disorder because a "disordered" macrostate is typically compatible with vastly more microscopic arrangements than an "ordered" one: a system with high entropy has many possible arrangements of its particles or energy, while a system with low entropy has few.

Is entropy always increasing?

In an isolated system, entropy tends to increase over time. This is the second law of thermodynamics, which states that the total entropy of an isolated system always increases or stays constant; it never decreases.
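A standard worked example: in the free expansion of an ideal gas into vacuum, no heat or work crosses the boundary of the isolated container, yet the entropy increases by ΔS = nR ln(V2/V1). A quick check in Python:

```python
import math

R = 8.314  # gas constant, J/(mol K)

# One mole of ideal gas doubling its volume in an isolated container:
# Delta S = n * R * ln(V2 / V1) > 0, as the second law requires.
n, V1, V2 = 1.0, 1.0, 2.0
print(n * R * math.log(V2 / V1))  # ~5.76 J/K
```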

Can entropy be negative?

No, the entropy of a system is always zero or positive, because it is a measure of the number of possible arrangements, and there cannot be fewer than one arrangement. Note, however, that the entropy change of a system can be negative, provided the entropy of its surroundings increases by at least as much.

How is entropy used in science?

Entropy is used in various fields of science, such as thermodynamics, information theory, and chemistry. It helps to understand and predict the behavior of systems, and it is also used in calculating the efficiency of energy conversions.
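One classic case of the efficiency calculation is the Carnot bound on heat engines, which follows directly from the second law: no engine operating between a hot and a cold reservoir can exceed η = 1 - T_cold/T_hot. A small sketch:

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of absorbed heat convertible to work."""
    return 1.0 - t_cold_k / t_hot_k

# An engine between 600 K and 300 K converts at most half the
# heat it absorbs into work, regardless of its design.
print(carnot_efficiency(600.0, 300.0))  # 0.5
```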
