Refuting the idea of entropy equalling "disorder"

  • Context: Graduate
  • Thread starter: sshai45
  • Tags: Disorder, Entropy, Idea

Discussion Overview

The discussion revolves around the concept of entropy, specifically addressing the common characterization of entropy as "disorder." Participants explore the differences between thermodynamic and informational perspectives on entropy, questioning how these interpretations relate to one another. The discussion includes theoretical considerations and mathematical inquiries regarding entropy changes in specific scenarios.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants assert that entropy is often mischaracterized as "disorder," suggesting that this view is overly simplistic.
  • One participant questions how to mathematically demonstrate that the entropy change from rearranging identical objects is zero under constant conditions, while also pondering the potential for a non-zero but immeasurable entropy change.
  • Another participant introduces the concepts of macrostates and microstates, noting that the macrostate interpretation of entropy does not involve disorder, whereas the microstate interpretation does, as described by the Boltzmann equation.
  • There is a discussion about different definitions of information content, including Shannon entropy and Kolmogorov complexity, and how these relate to the concept of entropy in various contexts.
  • One participant highlights the subjective nature of entropy in statistical mechanics compared to the objective definitions used in chemistry, raising questions about the implications of not knowing certain state variables.
  • Concerns are raised about the applicability of different entropy definitions to exotic systems, suggesting that the choice of macrostate definition is crucial in such cases.

Areas of Agreement / Disagreement

Participants express differing views on the characterization of entropy and its interpretations. There is no consensus on how to reconcile the thermodynamic and informational perspectives, and the discussion remains unresolved regarding the implications of these differing definitions.

Contextual Notes

Participants acknowledge the complexity of defining entropy, particularly in relation to different systems and the information available. The discussion highlights the limitations of definitions based on specific contexts and the potential for ambiguity in measuring entropy.

sshai45
Hi.

I have heard that entropy is often called "disorder" but _isn't really so_. And I am even more puzzled by the connection and the difference between entropy from the "information theory" POV and from the "thermodynamic" POV. I see stuff like this:

http://arstechnica.com/civis/viewtopic.php?f=2&t=3122

See the posts by "kmellis": he says that entropy in thermodynamics is NOT informational disorder, while simultaneously advocating an "information theoretic" basis for physics. How the f--- do you do that? And if thermo. entropy != inform. entropy, how do they relate (or not relate) in such a framework?

I'm curious. How can one prove mathematically that the entropy change in converting, say, an "ordered" stack of identical objects (a simplified version of an often-given and apparently invalid example) into a "messy"-looking pile is identically zero, assuming all other variables (temperature, etc.) are held mathematically constant, so that literally nothing but the rearrangement is going on? I know this is highly idealized, but that's the point: to isolate the "disordering" in the common man's sense and show that it has absolutely zero effect on the entropy of the whole system of objects. What I am wondering is: why couldn't there be some immeasurably small but nonzero entropy change, since after all you are rearranging the matter in the system, just not by a very great degree when you think of things on a "microscopic" scale?
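One way to sharpen this question, as a sketch (an illustration added here, assuming the Boltzmann form of the entropy and treating the arrangements of the macroscopic objects as distinct microstates): write the total microstate count as a thermal factor times a configurational factor,

$$ S = k_B \ln \Omega, \qquad \Omega = \Omega_{\text{thermal}} \cdot \Omega_{\text{config}} $$

With temperature and everything else held constant, $\Omega_{\text{thermal}}$ is unchanged, so

$$ \Delta S = k_B \ln \frac{\Omega_{\text{config}}^{\text{pile}}}{\Omega_{\text{config}}^{\text{stack}}} \;\le\; k_B \ln N! $$

For $N = 100$ objects, $k_B \ln 100! \approx (1.38 \times 10^{-23}\,\mathrm{J/K}) \times 364 \approx 5 \times 10^{-21}\,\mathrm{J/K}$: nonzero if the arrangements count as distinct states, but roughly twenty orders of magnitude below typical thermal entropies (of order J/K), which is exactly the "immeasurably small but not zero" possibility raised above. If the objects are treated as strictly identical, so that permuted arrangements are the same state, the change is identically zero.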
 
Research macrostates and microstates. The macrostate interpretation of entropy does not include disorder, while the microstate interpretation does via the Boltzmann equation. Both interpretations are equally correct and important, each applying to their respective part of thermodynamics (i.e. macrostate thermodynamics and microstate thermodynamics).
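For reference, the Boltzmann equation mentioned here relates the entropy to the number of microstates $W$ compatible with a given macrostate:

$$ S = k_B \ln W $$

"Disorder" enters only through this counting: a macrostate compatible with more microstates has higher entropy.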
 
sshai45 said:
Hi.

I have heard that entropy is often called "disorder" but _isn't really so_. And I am even more puzzled by the connection and the difference between entropy from the "information theory" POV and from the "thermodynamic" POV. <snip>

Part of the difficulty is that there are multiple ways to define the information content of a system. One way, Shannon entropy, is more suited to communications: how much information is required to digitally communicate (possibly including 'measure') the microstate of a system. In this sense, entropy is related to how random the bit stream is: if you can predict the value of an incoming bit before receiving it, the entropy associated with that bit is zero. Thus, in the Shannon context, 'negentropy' is usually more useful than 'entropy'. Data compression is especially well-suited to this context.
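As a concrete illustration of the predictable-bit point (a minimal sketch in Python; the function name and example probabilities are illustrative, not from the post):

Code:
import math

def shannon_entropy(probs):
    # Shannon entropy in bits: H = -sum(p * log2 p), skipping zero-probability outcomes.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per symbol, maximally unpredictable
print(shannon_entropy([1.0, 0.0]))  # perfectly predictable bit: 0.0 bits, as described above
print(shannon_entropy([0.9, 0.1]))  # biased source: ~0.47 bits, in between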

Another way to define information content is how much information is needed to create the state: the Kolmogorov complexity. In this context, one can pose questions about how much information is required to build a factory that makes things (including other factories). Assigning numbers to the Kolmogorov complexity is not trivial, AFAIK.
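Indeed, Kolmogorov complexity is uncomputable in general, but compressed length gives a crude upper bound on it; a small sketch of that idea (an illustration using zlib as the compressor, not from the post):

Code:
import os
import zlib

regular = b"ab" * 5000       # 10,000 bytes with an obvious short description
random_ = os.urandom(10000)  # 10,000 bytes with (almost surely) no short description

# Compressed size upper-bounds Kolmogorov complexity up to a constant:
print(len(zlib.compress(regular)))  # a few dozen bytes: highly regular, low complexity
print(len(zlib.compress(random_)))  # ~10,000 bytes: incompressible, high complexity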

There are other measures of information, but I am only familiar with those two. Does this help?
 
I guess it depends on how fundamental you want to go. If you are talking about the entropy of a gas or an engine, then you can use a chemistry definition and it's all very objective and we can all agree on what the entropy is. And this is consistent with the laws of thermodynamics. But then you get into statistical mechanics and macrostates and it all sounds very subjective, since a macrostate seems to depend on how much information you have. (I am grouping the statistical mechanics definition and information theoretical definition together, since they seem to be compatible to me.)

For familiar systems like ideal gases, we can use the convention that a macrostate refers to a set of states with known volume, energy, and particle number. And then entropy is just a state variable, S(V,U,N), and the two notions agree. But if we don't know the volume, does that mean we don't know the entropy? Or does it mean the entropy is greater (because we are referring to a larger set of states with known energy and particle number)?
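To make "entropy as a state variable S(V, U, N)" concrete for the ideal-gas case, here is the Sackur-Tetrode formula for a monatomic ideal gas (a sketch; the helium inputs are illustrative):

Code:
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
h   = 6.62607015e-34  # Planck constant, J*s

def sackur_tetrode(U, V, N, m):
    # Entropy S(U, V, N) of N monatomic ideal-gas atoms of mass m (kg).
    term = (V / N) * (4 * math.pi * m * U / (3 * N * h**2)) ** 1.5
    return N * k_B * (math.log(term) + 2.5)

# One mole of helium (m ~ 6.65e-27 kg) near room temperature:
N = 6.022e23
U = 1.5 * N * k_B * 298.0  # U = (3/2) N k_B T for a monatomic ideal gas
V = 0.0248                 # ~24.8 L, molar volume at ~298 K and 1 atm
print(sackur_tetrode(U, V, N, 6.65e-27))  # ~126 J/K, close to helium's tabulated molar entropy

With the macrostate convention of known (V, U, N), S is a definite function of the state, which is the sense in which the two notions agree here.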

Practically speaking, we don't measure the energy. We measure the temperature and infer the energy. But temperature is also defined in terms of the entropy.
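The definition alluded to here is, in the standard convention,

$$ \frac{1}{T} = \left( \frac{\partial S}{\partial U} \right)_{V,N} $$

so inferring $U$ from a measured $T$ already presupposes a particular form for $S(U, V, N)$.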

It seems like the chemistry definition is more useful, but the information-theoretic definition extends more readily to exotic systems that have state variables other than volume, energy, and particle number. For example, in a big bang nucleosynthesis experiment, particle number might not be a useful state variable. When we report a value of entropy for an exotic system, we have to define what we consider a macrostate to be.
 
