Refuting the idea of entropy equalling "disorder"

AI Thread Summary
Entropy is often mischaracterized as "disorder," but this interpretation does not hold in all contexts, particularly when comparing thermodynamic and information theory perspectives. The discussion highlights the differences between Shannon entropy, which relates to information transmission, and Kolmogorov complexity, which pertains to the information needed to create a state. It emphasizes that while macrostates in thermodynamics can be objectively defined, their interpretation can become subjective in statistical mechanics, depending on the information available. The relationship between thermodynamic and informational entropy is complex, with each framework serving distinct purposes in understanding systems. Ultimately, a clear understanding of entropy requires recognizing its multifaceted definitions and applications across different scientific domains.
sshai45
Hi.

I have heard this, that entropy is often called "disorder" but _isn't really so_. And I am even more puzzled by the connection and difference between entropy from "information theory" pov and from "thermodynamic" pov. I see stuff like this:

http://arstechnica.com/civis/viewtopic.php?f=2&t=3122

see the posts of "kmellis": he says that entropy in thermodynamics is NOT informational disorder while simultaneously advocating an "information theoretic" basis for physics. How the f--- do you do that? And if thermodynamic entropy != informational entropy, how do the two relate (or not relate) in such a framework?

I'm curious: how can one prove mathematically that the entropy change in converting an "ordered" stack of identical objects (a simplified version of an often-given and apparently invalid example) into a "messy"-looking pile is identically zero, if you assume all other variables (temperature, etc.) are held constant so that literally nothing but the rearrangement is going on? I know this is highly idealized, but that's the point: to isolate the "disordering" in the common man's sense and show that it has absolutely zero effect on the entropy of the whole system of objects. What I am wondering is why there couldn't be some immeasurably small but nonzero entropy change, since after all you are rearranging the matter in the system, just not by a very great degree when you think of things on a microscopic scale.
 
Research macrostates and microstates. The macrostate interpretation of entropy does not involve disorder, while the microstate interpretation does, via the Boltzmann equation. Both interpretations are equally correct and important, each applying to its respective part of thermodynamics (i.e. macrostate thermodynamics and microstate thermodynamics).
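To make the original question quantitative (a rough sketch, assuming the objects' internal thermodynamic state is unaffected by how they are stacked): the Boltzmann relation is

$$S = k_B \ln \Omega,$$

where $\Omega$ is the number of microstates compatible with the macrostate. If temperature and all other state variables are held fixed, rearranging macroscopic objects from a neat stack into a messy pile does not change the set of atomic-scale configurations available to the matter inside them, so $\Omega$ is unchanged and

$$\Delta S = k_B \ln\frac{\Omega_{\text{pile}}}{\Omega_{\text{stack}}} = k_B \ln 1 = 0.$$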
 
sshai45 said:
Hi.

I have heard this, that entropy is often called "disorder" but _isn't really so_. And I am even more puzzled by the connection and difference between entropy from "information theory" pov and from "thermodynamic" pov. <snip>

Part of the difficulty is that there are multiple ways to define the information content of a system. One way, Shannon entropy, is more suited to communications: how much information is required to digitally communicate (possibly including 'measure') the microstate of a system. In this sense, entropy is related to how random the bit stream is: if you can predict the value of an incoming bit before receiving it, the entropy associated with that bit is zero. Thus, in the Shannon context, 'negentropy' is usually more useful than 'entropy'. Data compression is especially well-suited to this context.
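As a concrete illustration of that point, here is a minimal Python sketch (the function name and the example bit strings are only illustrative) that computes the empirical Shannon entropy of a symbol stream: a perfectly predictable stream scores zero, while an even mix of 0s and 1s scores about one bit per symbol.

```python
import math
from collections import Counter

def shannon_entropy(stream):
    """Empirical Shannon entropy of a symbol stream, in bits per symbol."""
    counts = Counter(stream)
    n = len(stream)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy("0000000000"))  # 0.0 -- every bit is predictable in advance
print(shannon_entropy("0110100110"))  # 1.0 -- five 0s and five 1s, ~1 bit per symbol
```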

Another way to define information content is how much information is needed to create the state, the Kolmogorov complexity. In this context, one can pose questions about how much information is required to build a factory that makes things (including other factories). Assigning numbers to the Kolmogorov complexity is not trivial, AFAIK.
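One common practical workaround (a sketch, not something claimed in the post above): Kolmogorov complexity is uncomputable in general, but the size of a losslessly compressed description gives a crude upper bound. In Python, with zlib:

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """Length of the zlib-compressed data -- a rough upper-bound proxy
    for Kolmogorov complexity (which cannot be computed exactly)."""
    return len(zlib.compress(data, 9))

print(compressed_size(b"ab" * 500))       # small: a highly regular 1000-byte string
print(compressed_size(os.urandom(1000)))  # ~1000 or more: random bytes barely compress
```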

There are other measures of information, but I am only familiar with those two. Does this help?
 
I guess it depends on how fundamental you want to go. If you are talking about the entropy of a gas or an engine, then you can use a chemistry definition and it's all very objective and we can all agree on what the entropy is. And this is consistent with the laws of thermodynamics. But then you get into statistical mechanics and macrostates and it all sounds very subjective, since a macrostate seems to depend on how much information you have. (I am grouping the statistical mechanics definition and information theoretical definition together, since they seem to be compatible to me.)

For familiar systems like ideal gases, we can use the convention that a macrostate refers to a set of states with known volume, energy, and particle number. And then entropy is just a state variable, S(V,U,N), and the two notions agree. But if we don't know the volume, does that mean we don't know the entropy? Or does it mean the entropy is greater (because we are referring to a larger set of states with known energy and particle number)?
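For concreteness, the monatomic ideal gas is one case where S(V, U, N) has a closed form, the Sackur-Tetrode equation. A minimal Python sketch (the constants and the helium example values are just for illustration):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s

def sackur_tetrode(V, U, N, m):
    """S(V, U, N) for a monatomic ideal gas of N atoms of mass m (kg),
    occupying volume V (m^3) with internal energy U (J)."""
    return N * k_B * (math.log((V / N) * (4 * math.pi * m * U / (3 * N * h**2)) ** 1.5) + 2.5)

# One mole of helium near room temperature and atmospheric pressure:
N = 6.022e23
m_He = 6.646e-27             # kg per atom
U = 1.5 * N * k_B * 298.0    # U = (3/2) N k_B T for a monatomic gas
V = 0.0245                   # m^3, roughly the molar volume at 298 K and 1 atm
print(sackur_tetrode(V, U, N, m_He))  # ~126 J/K, close to the tabulated molar entropy of He
```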

Practically speaking, we don't measure the energy. We measure the temperature and infer the energy. But temperature is also defined in terms of the entropy.
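Spelling out the definition alluded to here: temperature is obtained from the entropy via

$$\frac{1}{T} = \left(\frac{\partial S}{\partial U}\right)_{V,N}.$$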

It seems like the chemistry definition is more useful, but the information theoretical definition is more extensible to more exotic systems which have different state variables than volume, energy, and particle number. For example, in a big bang nucleosynthesis experiment, particle number might not be a useful state variable. When we report a value for entropy for an exotic system, we have to define what we consider a macrostate to be.
 