Gold Member
I'm reading Brian Greene's latest book 'Until the End of Time' (I'll pause here while you finish groaning at yet another layperson reading pop-science physics books.) In it, he's describing entropy in a way I've never heard before and it clarifies something that's always stuck in my craw about common descriptions of entropy.

He uses the steam from a hot shower to demonstrate how low entropy is associated with systems that have highly constrained degrees of freedom - such as a small volume of "steam." As the steam expands to fill the bathroom, its volume goes up, relaxing the constraints on where droplets can be.

This volume increase means it reaches a state where there are many, many more configurations the droplets can be in than if it were in a tiny volume. All these configurations "look a lot alike" - i.e. it's hard to tell one from the other - and this has always been where the description of entropy or disorder has stopped.

Which bugs me - because "all these states look alike" is sloppy and ambiguous. The fact that they seem alike does not mean they are alike.

Look at a deck of cards. One shuffled state looks very much like another shuffled state - to the undiscerning eye. But that's not true. Every configuration of a deck is specific and distinct from every other - even if it isn't obvious.

Likewise, while we can't tell which steam droplets are where merely by looking, it's still true that each configuration is distinct.

But Greene took the bathroom scenario one step further. He implies (as I see it, though he doesn't say this explicitly) that it's not the actual configuration itself that's being measured, it's a general property of interest - a statistical property of the system, rather than a particulate property. In the case of the bathroom, the properties might be temperature (or pressure or volume).

To wit: there are many, many configurations of the bathroom system that all result in the same temperature. i.e. as far as temperature of the room goes, all those configurations are indistinguishable from each other. And that's where the crux of disorder lies.

Likewise with a table of pennies. (Let's say there are one hundred pennies - and we only get to spend pennies that land heads up; we cannot spend pennies that land tails up.) There is only one configuration of one hundred pennies where we have a full dollar to spend. The system of pennies is in a configuration that is highly constrained.

But there are many, many, many configurations of pennies where we have 50 cents to spend.
Yes, every penny is unique, and every penny knows its location and its face-up face - so it's not like any given configuration is actually identical to any other. But we are not measuring the state of pennies (particulates) within the system, we are measuring statistics (spending power) of the whole system.
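The penny counts can be checked directly: the number of distinct configurations with exactly k heads out of 100 pennies is the binomial coefficient C(100, k). A quick sketch (just illustrating the argument above, not anything from the book):

```python
from math import comb

n = 100  # pennies on the table

# One macrostate = one value of "spending power" (number of heads).
# Its multiplicity is the count of microstates realizing it.
full_dollar = comb(n, 100)  # all heads: exactly one configuration
fifty_cents = comb(n, 50)   # 50 heads: astronomically many

print(full_dollar)  # 1
print(fifty_cents)  # about 1.009e29 distinct configurations
```

So "50 cents to spend" is compatible with roughly 10^29 microstates while "a full dollar" is compatible with exactly one - which is the whole point about highly constrained versus relaxed configurations.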

Either I'm way off, or I'm telling you guys things you already know.

TLDR: entropy is a statistical property of a whole system, not an accounting of the order of items within the system. Yes? Specifically, entropy is concerned with the metrics used to measure the system, as if it were a black box.

DrStupid
TLDR: entropy is a statistical property of a whole system, not an accounting of the order of items within the system. Yes?

Yes, if the macro state of the system does not depend on the order of items within.

Mentor
It's on Boltzmann's tombstone:

S = k log W — the entropy of a macrostate (S) is Boltzmann's constant (k) times the logarithm of the number of possible microstates (the multiplicity, W) that make up the macrostate.
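Applying that formula to the penny example above (a hypothetical worked case, treating each heads-count as a macrostate):

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W: int) -> float:
    """S = k ln W for a macrostate with multiplicity W."""
    return k_B * log(W)

n = 100
print(boltzmann_entropy(comb(n, n)))       # all heads: W = 1, so S = 0
print(boltzmann_entropy(comb(n, n // 2)))  # 50 heads: W ~ 1e29, so S > 0
```

The uniquely constrained macrostate (W = 1) has zero entropy; the macrostate realized by many microstates has positive entropy, which is exactly the "statistical property of the whole system" reading.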

Gold Member
Yes, if the macro state of the system does not depend on the order of items within.
Right. Like a deck of cards.