Is entropy real or just a construct?

  • Context: Graduate
  • Thread starter: ramzerimar
  • Tags: Entropy, Thermodynamics

Discussion Overview

The discussion centers around the nature of entropy, questioning whether it is a real property of systems or merely a construct arising from our inability to fully describe a system's state. Participants explore the implications of complete descriptions of systems, the relationship between entropy and concepts like temperature, and the role of quantum mechanics in understanding entropy.

Discussion Character

  • Exploratory
  • Debate/contested
  • Conceptual clarification

Main Points Raised

  • Some participants suggest that entropy arises from our inability to completely describe a system, questioning how it can be a property if it is linked to our limitations.
  • Others argue that entropy is a measure of the number of ways a system can be reorganized, likening it to other properties such as length.
  • A participant expresses confusion about the implications of being unable to fully describe a system's state, wondering if this relates to quantum mechanics or other factors.
  • There is a suggestion that the classical definition of entropy as thermal energy unavailable to do work may be more intuitive for some participants.
  • One participant introduces the concept of microstates and macrostates, explaining how entropy can be understood in terms of the number of microstates corresponding to a given macrostate.

Areas of Agreement / Disagreement

Participants do not reach a consensus on whether entropy is a real property or a construct. Multiple competing views remain regarding the nature of entropy and its relationship to complete descriptions of systems.

Contextual Notes

There are unresolved questions about the definitions of entropy, the implications of quantum mechanics, and the relationship between entropy and other thermodynamic concepts like temperature and heat.

ramzerimar
When we talk about entropy, we say it comes from our inability to completely describe the state of a system. We also say it is a property of the system (like enthalpy). That's confusing me a lot. If entropy is a property, how can it come from our inability to describe the system? Or is it just a construct? Something that we use to describe reality?

Let's say we had some entity (like Laplace's Demon) that could completely describe the state of a system (knowing exactly the position and velocity of all the particles within it). Would there then be no entropy?

I just need some clarification about this.
 
Entropy is a measure of the # of ways a system can be reorganized (aka its "randomness"), so it's a property of systems, like length. If for some reason you can't measure the length of something, does that mean it doesn't HAVE a length?
 
phinds said:
Entropy is a measure of the # of ways a system can be reorganized (aka its "randomness"), so it's a property of systems, like length. If for some reason you can't measure the length of something, does that mean it doesn't HAVE a length?

Ok, I understand that. The greater the number of ways the system can be reorganized, the greater the entropy.

So it's impossible to completely describe the state of a system. I'm having some trouble grasping what that really means. Does it come from quantum mechanics (Heisenberg uncertainty) or is it something else?
 
How would you create a complete description of, say, the Earth? Things just get too complicated.
 
ramzerimar said:
Ok, I understand that. The greater the number of ways the system can be reorganized, the greater the entropy.

So it's impossible to completely describe the state of a system. I'm having some trouble grasping what that really means. Does it come from quantum mechanics (Heisenberg uncertainty) or is it something else?

What does "completely describe the state of a system" mean?

Do you think the concept of "temperature" or "heat" is sufficient to "completely describe the state of a system"? After all, you made no complaints about those, and thus, presumably, you are happy with those concepts. Yet these are thermodynamic/statistical concepts built on the SAME foundation as entropy.

Zz.
 
ramzerimar said:
Does it come from quantum mechanics (Heisenberg Uncertainty) or is it something else?
My first thought on reading your first post was that you were thinking of the HUP; no, entropy is not due to the limitations of the HUP.

As an engineer, the statistical description doesn't speak to me. The classical definition may work better for you too: it is the thermal energy unavailable to do work. A simple manifestation of it is that when the temperature difference between two reservoirs is lowered, your ability to extract work is reduced by a larger proportion. The difference is the entropy loss.

To put it another way: you are better off having a little bit of hot water than a lot of lukewarm water. As such, this is also referred to as "quality".
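The "quality" point above can be illustrated with the Carnot limit, which bounds the fraction of heat that can be converted to work between two reservoirs. This is a minimal sketch (the specific temperatures are illustrative assumptions, not from the discussion):

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum fraction of heat convertible to work between two
    reservoirs at absolute temperatures t_hot and t_cold (kelvin)."""
    return 1.0 - t_cold / t_hot

# A little hot water vs. a lot of lukewarm water, against a 290 K environment:
# the hotter reservoir lets you convert a much larger fraction of its heat
# into work, even if the lukewarm reservoir holds more total heat.
print(carnot_efficiency(370.0, 290.0))  # ~0.216
print(carnot_efficiency(298.0, 290.0))  # ~0.027
```

As the temperature difference shrinks, the extractable fraction of the heat collapses toward zero, which is the sense in which the remaining thermal energy is "unavailable."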
 
ramzerimar said:
Let's say we had some entity (like the Laplace Demon) that could completely describe the state of a system (knowing exactly the position and velocity of all the particles within it), so there would be no entropy?

The statistical notion of entropy involves two different notions of the state of a system: (1) the microstate, which is the complete description, and (2) the macrostate, which is the "coarse-grained" description. There can be many microstates that correspond to the same macrostate.

For example, suppose you flip 100 coins. The microstate might be the exact record of which of the coins were heads and which were tails. The macrostate might be just the count of the number of heads and tails. Then the entropy for a macrostate with [itex]H[/itex] heads and [itex]T[/itex] tails would be [itex]k \log W(H,T,100)[/itex], where [itex]W(H,T,100)[/itex] is the number of ways to arrange 100 coins so that [itex]H[/itex] of them are heads and [itex]T[/itex] of them are tails. If I remember correctly, the formula for that is: [itex]W(H,T,100) = \frac{100!}{H! \, T!}[/itex]. The zero-entropy states are [itex]H = 100, T = 0[/itex] and [itex]H = 0, T = 100[/itex]. The highest-entropy state is [itex]H = 50, T = 50[/itex].

But the statistical notion of entropy is relative to a particular notion of macrostate. In thermodynamics, the usual notion of the macrostate of a gas of identical particles is the total energy, the number of particles, and the volume.
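The coin-flip example above can be computed directly. This sketch counts microstates per macrostate and evaluates [itex]S = k \log W[/itex] (with [itex]k = 1[/itex] for simplicity):

```python
import math

def n_microstates(heads: int, n: int = 100) -> int:
    """Number of microstates for the macrostate with `heads` heads
    out of n flips: W = n! / (H! T!) = C(n, H)."""
    return math.comb(n, heads)

def entropy(heads: int, n: int = 100, k: float = 1.0) -> float:
    """Boltzmann entropy S = k log W for that macrostate."""
    return k * math.log(n_microstates(heads, n))

# All-heads macrostate: exactly one microstate, so S = 0.
print(entropy(100))  # 0.0
# The 50/50 macrostate has the most microstates, hence maximum entropy.
print(entropy(50))   # ~66.78
```

Note that the entropy depends entirely on how coarse the macrostate description is: if the "macrostate" recorded every coin individually, every W would be 1 and every entropy would be zero, which is exactly the point about Laplace's Demon.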
 
