Understanding Entropy: A Fresh Perspective from Brian Greene's Latest Book

In summary: a shuffled deck of cards can be in any of a huge number of configurations, but its overall state (number of cards, suits, etc.) remains the same. Entropy is, likewise, a statistical property of a whole system that does not depend on the order of the items within it: it measures the number of possible configurations consistent with the system's overall state, and it is what the informal notion of "disorder" is really getting at. This understanding is illustrated by Brian Greene's example of steam expanding to fill a bathroom, where many different arrangements of the droplets all correspond to the same temperature. The same idea is captured by Boltzmann's equation, which relates entropy to the number of possible microstates within a macrostate.
  • #1
DaveC426913
I'm reading Brian Greene's latest book, 'Until the End of Time'. (I'll pause here while you finish groaning at yet another layperson reading popular physics books.) In it, he describes entropy in a way I've never heard before, and it clarifies something that has always stuck in my craw about common descriptions of entropy.

He uses the steam from a hot shower to demonstrate how low entropy is associated with systems that have highly constrained degrees of freedom - such as a small volume of "steam." As the steam expands to fill the bathroom, its volume goes up, relaxing the constraints on where the droplets can be.

This volume increase means the system reaches a state where there are many, many more configurations the droplets can be in than if they were confined to a tiny volume. All these configurations "look a lot alike" - i.e. it's hard to tell one from another - and this is where descriptions of entropy as disorder have always stopped.
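A rough quantitative anchor for that volume argument (my own addition, treating the steam as an ideal gas at fixed temperature - nothing Greene actually works out): if the gas expands from volume V1 to V2, the entropy change is

$$\Delta S = N k_B \ln\!\left(\frac{V_2}{V_1}\right)$$

so every doubling of the accessible volume adds N k_B ln 2 - one extra binary "which half is it in?" choice per particle.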

Which bugs me - because "all these states look alike" is sloppy and ambiguous. The fact that they seem alike does not mean they are alike.

Look at a deck of cards. One shuffled state looks very much like another shuffled state - to the indiscriminate eye. But that's only superficially true: every configuration of a deck is specific and distinct from every other - even if the difference isn't obvious.

Likewise, while we can't tell which steam droplets are where merely by looking, it's still true that each configuration is distinct.

But Greene took the bathroom scenario one step further. He implies (as I see it, though he doesn't say this explicitly) that it's not the actual configuration itself that's being measured; it's a general property of interest - a statistical property of the system, rather than a particulate property. In the case of the bathroom, the properties might be temperature (or pressure, or volume).

To wit: there are many, many, many configurations of the bathroom system that all result in the same temperature. I.e., as far as the temperature of the room goes, all those configurations are indistinguishable from each other. And that's where the crux of disorder lies.

Likewise with a table of pennies. (Let's say there are one hundred pennies, and we only get to spend pennies that land heads up; we cannot spend pennies that land tails up.) There is only one configuration of one hundred pennies where we have a full dollar to spend. The system of pennies is in a highly constrained configuration.

But there are many, many, many configurations of pennies where we have 50 cents to spend.
Yes, every penny is unique, and every penny knows its location and its face-up face - so it's not like any given configuration is actually identical to any other. But we are not measuring the state of the pennies (particulates) within the system; we are measuring a statistic (spending power) of the whole system.

Either I'm way off, or I'm telling you guys things you already know.

TLDR: entropy is a statistical property of a whole system, not an accounting of the order of items within the system. Yes? Specifically, entropy is concerned with the metrics used to measure the system, as if it were a black box.
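A minimal sketch of the penny arithmetic (my own illustration in Python, not something from the thread): it counts the microstates behind each "spending power" macrostate and takes the Boltzmann-style logarithm.

```python
from math import comb, log

N = 100  # pennies on the table

# Macrostate: "k pennies land heads up", i.e. k cents of spending power.
# Microstates: which particular pennies are the heads; there are C(N, k) of them.
for k in (100, 75, 50):
    W = comb(N, k)   # multiplicity of the macrostate
    S = log(W)       # entropy in units of k_B (Boltzmann: S = k_B ln W)
    print(f"{k:3d} heads: W = {W:.3e}   S/k_B = {S:.1f}")
```

The "full dollar" macrostate (100 heads) has exactly one microstate and zero entropy, while the 50-cent macrostate is backed by roughly 10^29 arrangements that are all equivalent as far as spending power goes.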
 
  • #2
DaveC426913 said:
TLDR: entropy is a statistical property of a whole system, not an accounting of the order of items within the system. Yes?

Yes, if the macro state of the system does not depend on the order of items within.
 
  • #3
It's on Boltzmann's tombstone:
[Image: Boltzmann's tombstone, engraved with S = k · log W]

The entropy of a macrostate (S) is the logarithm of the number of possible microstates (multiplicity, W) that make up the macrostate, multiplied by Boltzmann's constant (k).
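Tying this back to the pennies above (my worked numbers, not part of the original post):

$$S = k \ln W, \qquad W_{100\ \mathrm{heads}} = 1 \;\Rightarrow\; S = 0, \qquad W_{50\ \mathrm{heads}} = \binom{100}{50} \approx 1.0 \times 10^{29} \;\Rightarrow\; S \approx 66.8\,k$$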
 
  • #4
DrStupid said:
Yes, if the macro state of the system does not depend on the order of items within.
Right. Like a deck of cards.
 

1. What is entropy and why is it important?

Entropy is a measure of the disorder or randomness of a system - more precisely, of how many microscopic configurations are consistent with its overall macroscopic state. It is important because it helps us understand how energy flows and how systems change over time.

2. How does Brian Greene's latest book provide a fresh perspective on understanding entropy?

Brian Greene's latest book, "Until the End of Time," explores the concept of entropy in the context of the universe and its ultimate fate. He presents a unique perspective on how entropy plays a role in shaping our world and the universe as a whole.

3. How does entropy relate to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that, left to themselves, systems tend to become more disordered rather than more organized. Entropy is the quantity that measures this disorder.

4. Can entropy be reversed or decreased?

In an isolated system, the total entropy cannot decrease. However, a system that exchanges energy or matter with its surroundings can have its entropy lowered locally, provided the entropy of the surroundings increases by at least as much. This is how living organisms are able to maintain order and keep their own entropy low.

5. How does entropy play a role in the concept of time?

Entropy is closely linked to the arrow of time. The direction in which entropy increases is the direction we experience as "forward": as systems evolve toward more probable, more disordered macrostates, it becomes overwhelmingly unlikely for them to spontaneously return to their earlier, more ordered states. In this sense, entropy can be seen as a marker of the direction of time.
