Is entropy truly a measure of disorder or is it something else entirely?

  • Thread starter: LEELA PRATHAP KUMAR
  • Tags: Disorder, Entropy
AI Thread Summary
Entropy is not always a measure of disorder, as the definition of disorder can vary significantly in different contexts. While entropy is one way to quantify disorder, other interpretations exist that do not align with the strict physics definition. The discussion highlights the importance of understanding entropy through Shannon's information theory, which views it as a measure of missing information relative to a prior state. This perspective is supported by recent experimental work on quantum Maxwell demons, suggesting a more comprehensive understanding of entropy. Overall, the relationship between entropy and disorder is complex and not universally applicable.
LEELA PRATHAP KUMAR
How correct is it to say that entropy is always a measure of disorder?
Is disorder always related to entropy?
 
LEELA PRATHAP KUMAR said:
How correct is it to say that entropy is always a measure of disorder?
Is disorder always related to entropy?
That topic is often discussed here. I suggest a forum search (always a good idea on basic questions). A good place to start is with the links at the bottom of the page (the forum does a brief search for you, based on your subject line).
 
LEELA PRATHAP KUMAR said:
How correct is it to say that entropy is always a measure of disorder?
Is disorder always related to entropy?
Certainly not "always." How do you define disorder?
 
Most scientific areas take common words and give them a narrower technical meaning. There are many common uses and broader understandings of physics words that may differ from the strict physics definitions: energy, force, momentum, revolution, period, etc. Disorder is one of those words as well.

Entropy is one measure of disorder. There are other uses and non-technical understandings of the word disorder which do not exactly correspond with the physics definition of entropy.
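To make "entropy as one measure of disorder" concrete, here is a minimal sketch (my addition, not from the thread) using the standard Boltzmann counting S = k ln W for N two-state particles: the evenly mixed macrostate has the most microstates W and hence the largest entropy, while the perfectly ordered macrostates (all up or all down) have W = 1 and zero entropy.

```python
from math import comb, log

# Boltzmann entropy (in units of k_B) of a macrostate of N two-state
# particles with n "up" spins: S = ln W, where W = C(N, n) counts the
# microstates compatible with that macrostate.
def boltzmann_entropy(N, n):
    return log(comb(N, n))

N = 100
entropies = [boltzmann_entropy(N, n) for n in range(N + 1)]

# The evenly mixed ("most disordered") macrostate maximizes the entropy.
assert max(range(N + 1), key=lambda n: entropies[n]) == N // 2

# Perfectly ordered macrostates (all up or all down) have zero entropy.
assert entropies[0] == 0.0 and entropies[N] == 0.0
```

This is the sense in which the physics definition tracks one intuitive notion of disorder; other everyday senses of the word need not line up with it.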
 
Well, you often read this, but to be honest I never understood it. The traditional definition, introducing temperature as an integrating factor to make ##\delta Q=T \mathrm{d} S##, which brings entropy into the game, didn't help me much either.

What I find most convincing is Shannon's reinterpretation of entropy as a measure of missing information (relative to a prior state defining complete information) for a given probability distribution, leading to the Shannon-Jaynes-von Neumann entropy in (quantum-)statistical physics. The idea of course goes back to Szilard's famous 1929 paper on the Maxwell demon.
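The missing-information reading can be illustrated with a short sketch (my addition, using made-up distributions): the Shannon entropy ##H(p)=-\sum_i p_i \log_2 p_i## is zero when the outcome is certain (nothing left to learn) and maximal for the uniform distribution (maximal ignorance).

```python
from math import log2

# Shannon entropy in bits: the average missing information about which
# outcome will occur, given the probability distribution p.
def shannon_entropy(p):
    return -sum(pi * log2(pi) for pi in p if pi > 0)

certain = [1.0, 0.0, 0.0, 0.0]   # outcome known: no missing information
uniform = [0.25] * 4             # maximal ignorance over 4 outcomes

assert shannon_entropy(certain) == 0.0
assert shannon_entropy(uniform) == 2.0   # log2(4) bits missing
```

The statistical-physics entropy is this same functional applied to the distribution over microstates (times Boltzmann's constant and with a natural logarithm).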

A good introduction to statistical physics (both classical and quantum) using the information-theoretical approach is

A. Katz, Principles of Statistical Mechanics, W. H. Freeman and Company, San Francisco and London, 1967.

I also think that with the recent experimental work on "quantum Maxwell demon" it's almost empirically proven that the information-theoretical interpretation of entropy is the most comprehensive one.
 
