Is entropy real or just a construct?

  • #1
When we talk about entropy, we say it comes from our inability to completely describe the state of a system. We also say it is a property of the system (like enthalpy). That confuses me a lot. If entropy is a property, how can it come from our inability to describe the system? Or is it just a construct, something we use to describe reality?

Let's say we had some entity (like Laplace's demon) that could completely describe the state of a system, knowing exactly the position and velocity of every particle within it. Would there then be no entropy?

I just need some clarification about this.
 

Answers and Replies

  • #2
phinds
Science Advisor
Insights Author
Gold Member
Entropy is a measure of the # of ways a system can be reorganized (aka its "randomness"), so it's a property of systems, like length. If for some reason you can't measure the length of something, does that mean it doesn't HAVE a length?
 
  • #3
Entropy is a measure of the # of ways a system can be reorganized (aka its "randomness"), so it's a property of systems, like length. If for some reason you can't measure the length of something, does that mean it doesn't HAVE a length?

Ok, I understand that. The greater the number of ways the system can be reorganized, the greater the entropy.

So it's impossible to completely describe the state of a system? I'm having some trouble grasping what that really means. Does it come from quantum mechanics (the Heisenberg uncertainty principle) or is it something else?
 
  • #4
phinds
Science Advisor
Insights Author
Gold Member
How would you create a complete description of, say, the Earth? Things just get too complicated.
 
  • #5
ZapperZ
Staff Emeritus
Science Advisor
Education Advisor
Insights Author
Ok, I understand that. The greater the number of ways the system can be reorganized, the greater the entropy.

So it's impossible to completely describe the state of a system? I'm having some trouble grasping what that really means. Does it come from quantum mechanics (the Heisenberg uncertainty principle) or is it something else?

What does "completely describe the state of a system" mean?

Do you think the concepts of "temperature" or "heat" are sufficient to "completely describe the state of a system"? After all, you made no complaints about those, so presumably you are happy with those concepts. Yet they are thermodynamic/statistical concepts built on the SAME foundation as entropy.

Zz.
 
  • #6
russ_watters
Mentor
Does it come from quantum mechanics (the Heisenberg uncertainty principle) or is it something else?
My first thought on reading your first post was that you were thinking of the HUP; no, entropy is not due to the limitations of the HUP.

As an engineer, I find that the statistical description doesn't speak to me. The classical definition may work better for you too: entropy is a measure of thermal energy that is unavailable to do work. A simple manifestation of it is that when the temperature difference between two reservoirs is lowered, your ability to extract work is reduced by a larger proportion; that lost capacity to do work corresponds to the entropy that has been generated.

To put it another way: you are better off having a little bit of hot water than a lot of lukewarm water. For this reason, this aspect of energy is also referred to as its "quality".
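
To make the "quality" point concrete, here is a minimal sketch (my own illustration, not part of the original post) using the standard exergy expression for a finite body of water cooling to ambient temperature; the masses and temperatures are assumed values chosen so that both samples hold the same thermal energy above ambient:

[code]
# Sketch (my own numbers, not from the post): maximum work obtainable from
# a small amount of hot water vs. a larger amount of lukewarm water that
# holds the same thermal energy above ambient.
#
# For a finite body of water cooling to ambient temperature T0, the
# maximum extractable work (exergy) is
#     W_max = m * c * [(T - T0) - T0 * ln(T / T0)]
import math

c = 4186.0   # specific heat of water, J/(kg*K)
T0 = 293.0   # ambient temperature, K

def heat_above_ambient(m, T):
    """Thermal energy (J) stored above ambient in m kg of water at T kelvin."""
    return m * c * (T - T0)

def max_work(m, T):
    """Exergy (J) of m kg of water at T kelvin, relative to ambient T0."""
    return m * c * ((T - T0) - T0 * math.log(T / T0))

# 1 kg at 373 K and 8 kg at 303 K store the same heat above ambient (~335 kJ),
# but the small hot sample can deliver roughly 7 times more work.
for m, T in [(1.0, 373.0), (8.0, 303.0)]:
    print(f"{m:.0f} kg at {T:.0f} K: heat = {heat_above_ambient(m, T)/1e3:.0f} kJ, "
          f"max work = {max_work(m, T)/1e3:.1f} kJ")
[/code]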
 
  • #7
stevendaryl
Staff Emeritus
Science Advisor
Insights Author
Let's say we had some entity (like Laplace's demon) that could completely describe the state of a system, knowing exactly the position and velocity of every particle within it. Would there then be no entropy?

The statistical notion of entropy involves two different notions of the state of a system: (1) the microstate, which is the complete description, and (2) the macrostate, which is the "coarse-grained" description. There can be many microstates that correspond to the same macrostate.

For example, suppose you flip 100 coins. The microstate might be the exact record of which of the coins were heads and which were tails. The macrostate might be just the count of the number of heads and tails. Then the entropy for a macrostate with [itex]H[/itex] heads and [itex]T[/itex] tails would be [itex]k \log W(H,T,100)[/itex], where [itex]W(H,T,100)[/itex] is the number of ways to arrange 100 coins so that [itex]H[/itex] of them are heads and [itex]T[/itex] of them are tails. If I remember correctly, the formula for that is: [itex]W(H,T,100) = \frac{100!}{H!\,T!}[/itex]. The zero-entropy states are [itex]H = 100, T = 0[/itex] and [itex]H = 0, T = 100[/itex]. The highest-entropy state is [itex]H = 50, T = 50[/itex].
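
A minimal sketch of that counting (my own addition; Boltzmann's constant is set to 1 just to show the shape):

[code]
# Coin-flip entropy from the example above: S(H) = k * log W(H),
# with W(H) = 100! / (H! * (100 - H)!). Here k = 1 for illustration,
# since only the relative values matter for the point.
import math

N = 100  # number of coins

def entropy(heads, k=1.0):
    """Entropy of the macrostate with `heads` heads among N coins."""
    W = math.comb(N, heads)   # number of microstates in this macrostate
    return k * math.log(W)

print(entropy(0))    # 0.0   -- all tails: only one arrangement
print(entropy(100))  # 0.0   -- all heads: only one arrangement
print(entropy(50))   # ~66.8 -- the 50/50 macrostate has the most arrangements
[/code]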

But the statistical notion of entropy is relative to a particular notion of macrostate. In thermodynamics, the usual macrostate for a gas of identical particles is specified by the total energy, the number of particles, and the volume.
 
