Is entropy truly a measure of disorder or is it something else entirely?

  • Context: Undergrad
  • Thread starter: LEELA PRATHAP KUMAR
  • Tags: Disorder, Entropy

Discussion Overview

The discussion revolves around the nature of entropy, specifically whether it should be considered a measure of disorder or if it represents something else. Participants explore various interpretations of entropy, including its definitions in different scientific contexts and its relationship to concepts like information theory.

Discussion Character

  • Debate/contested
  • Conceptual clarification
  • Exploratory

Main Points Raised

  • Some participants question the validity of stating that entropy is always a measure of disorder, suggesting that the definition of disorder itself is ambiguous.
  • One participant notes that while entropy can be a measure of disorder, there are other interpretations and uses of the term "disorder" that do not align with the physics definition of entropy.
  • A participant expresses confusion over traditional definitions of entropy, particularly in relation to temperature and its integration into thermodynamic equations.
  • Another participant advocates for an information-theoretical perspective on entropy, referencing Shannon's reinterpretation as a measure of missing information and suggesting this view may be more comprehensive.
  • There is mention of recent experimental work related to the "quantum Maxwell demon," which some argue supports the information-theoretical interpretation of entropy.

Areas of Agreement / Disagreement

Participants do not reach a consensus on whether entropy should be viewed strictly as a measure of disorder. Multiple competing views are presented, with some advocating for a broader interpretation that includes information theory.

Contextual Notes

Participants highlight the limitations of common language in scientific contexts, noting that terms like "disorder" may have different meanings outside of strict physics definitions. The discussion also reflects varying levels of understanding regarding the mathematical and conceptual foundations of entropy.

LEELA PRATHAP KUMAR
How correct is it to say that entropy is always a measure of disorder?
Is disorder always related to entropy?
 
LEELA PRATHAP KUMAR said:
How correct is it to say that entropy is always a measure of disorder?
Is disorder always related to entropy?
That topic is often discussed here. I suggest a forum search (always a good idea on basic questions). A good place to start is with the links at the bottom of the page; the forum does a brief search for you, based on your subject line.
 
  • Likes: DrClaude
LEELA PRATHAP KUMAR said:
How correct is it to say that entropy is always a measure of disorder?
Is disorder always related to entropy?
Certainly not "always." How do you define disorder?
 
Most scientific areas take common words and give them a narrower technical meaning. There are lots of common uses and broader understandings of physics words that may differ from the strict physics definitions: energy, force, momentum, revolution, period, etc. Disorder is one of those words as well.

Entropy is one measure of disorder. There are other uses and non-technical understandings of the word disorder which do not exactly correspond with the physics definition of entropy.
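To make the "measure of disorder" reading concrete, here is a minimal sketch (not part of the original posts) of Boltzmann's statistical definition ##S = k_B \ln W##: a macrostate realized by many microstates ("disordered") has higher entropy than one realized by few ("ordered"). The 100-particle two-box setup is an illustrative assumption, not from the thread.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

def boltzmann_entropy(W):
    """S = k_B * ln(W), where W counts the equally likely microstates."""
    return k_B * math.log(W)

# Illustrative setup: 100 distinguishable particles split between the
# two halves of a box. The macrostate "n particles on the left" is
# realized by W = C(100, n) microstates.
W_even = math.comb(100, 50)      # 50/50 split: the most microstates
W_all_left = math.comb(100, 100) # all on one side: W = 1, so S = 0

print(boltzmann_entropy(W_all_left))                          # 0.0
print(boltzmann_entropy(W_even) > boltzmann_entropy(W_all_left))  # True
```

The evenly mixed ("disordered") macrostate maximizes W, hence S; the perfectly sorted ("ordered") one has a single microstate and zero entropy.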
 
Well, one often reads this, but to be honest I never understood it. The traditional definition, introducing temperature as an integrating factor to make ##\delta Q = T \, \mathrm{d} S##, which brings entropy into the game, didn't help me much either.

What I find most convincing is Shannon's reinterpretation of entropy as a measure of missing information (relative to a prior state defining complete information) for a given probability distribution, leading to the Shannon-Jaynes-von-Neumann entropy in (quantum-)statistical physics. The idea of course goes back to Szilard's famous 1929 paper on the Maxwell demon.
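The missing-information view mentioned above can be sketched in a few lines (my illustration, not from the thread): the Shannon entropy ##H = -\sum_i p_i \log_2 p_i## of a probability distribution is the average number of bits you still need to pin down the outcome.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits: the average
    missing information about which outcome occurs."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin: maximal missing information for two outcomes, 1 bit.
print(shannon_entropy([0.5, 0.5]))  # 1.0
# A certain outcome: nothing left to learn, 0 bits.
print(shannon_entropy([1.0]))       # 0.0
# A biased coin lies in between.
print(shannon_entropy([0.9, 0.1]) < 1.0)  # True
```

A sharply peaked distribution (high prior knowledge) has low entropy; a uniform one (no prior knowledge) has the maximum, mirroring the thermodynamic case.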

A good introduction to statistical physics (both classical and quantum) using the information-theoretical approach is

A. Katz, Principles of Statistical Mechanics, W. H. Freeman and Company, San Francisco and London, 1967.

I also think the recent experimental work on "quantum Maxwell demons" provides strong empirical support for the information-theoretical interpretation of entropy being the most comprehensive one.
 
  • Likes: DrClaude
