Is entropy real or just a construct?

AI Thread Summary
Entropy is a property of systems that quantifies the number of ways a system can be rearranged, reflecting its randomness. The discussion highlights the confusion surrounding entropy as both a measurable property and a construct arising from our inability to fully describe a system's state. It emphasizes that while complete descriptions of systems may be theoretically possible, practical limitations, such as those posed by quantum mechanics, complicate the endeavor. The relationship between microstates and macrostates is crucial: many microstates can correspond to a single macrostate, and that multiplicity is what enters the entropy calculation. Ultimately, entropy remains a fundamental concept in thermodynamics, setting limits on how much work can be extracted from thermal energy.
ramzerimar
When we talk about entropy, we say it comes from our inability to completely describe the state of a system. We also say it is a property of the system (like enthalpy). That's confusing me a lot. If entropy is a property, how can it come from our inability to describe the system? Or is it just a construct, something that we use to describe reality?

Let's say we had some entity (like Laplace's demon) that could completely describe the state of a system, knowing exactly the position and velocity of every particle within it. Would there then be no entropy?

I just need some clarification about this.
 
Entropy is a measure of the # of ways a system can be reorganized (aka its "randomness"), so it's a property of systems, like length. If for some reason you can't measure the length of something, does that mean it doesn't HAVE a length?
 
phinds said:
Entropy is a measure of the # of ways a system can be reorganized (aka its "randomness"), so it's a property of systems, like length. If for some reason you can't measure the length of something, does that mean it doesn't HAVE a length?

Ok, I understand that. The greater the number of ways the system can be reorganized, the greater the entropy.

So it's impossible to completely describe the state of a system. I'm having some trouble trying to grasp what that really means. Does it come from quantum mechanics (Heisenberg uncertainty) or is it something else?
 
How would you create a complete description of, say, the Earth? Things just get too complicated.
 
ramzerimar said:
Ok, I understand that. The greater the number of ways the system can be reorganized, the greater the entropy.

So it's impossible to completely describe the state of a system. I'm having some trouble trying to grasp what that really means. Does it come from quantum mechanics (Heisenberg uncertainty) or is it something else?

What does "completely describe the state of a system" mean?

Do you think the concept of "temperature" or "heat" is sufficient to "completely describe the state of a system"? After all, you made no complaints about those, and thus, presumably, you are happy with those concepts. Yet these are thermodynamic/statistical concepts built on the SAME foundation as entropy.

Zz.
 
ramzerimar said:
Does it come from quantum mechanics (Heisenberg uncertainty) or is it something else?
My first thought on reading your first post was that you were thinking of the HUP; no, entropy is not due to the limitations of the HUP.

As an engineer, the statistical description doesn't speak to me. The classical definition may work better for you too: entropy is a measure of thermal energy that is unavailable to do work. A simple manifestation of it is that when the temperature difference between two reservoirs is lowered, your ability to extract work is reduced by an even larger proportion. The difference is the entropy loss.

To put it another way: you are better off having a little bit of hot water than a lot of lukewarm water. For this reason, this aspect of thermal energy is also referred to as its "quality".
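To put rough numbers on that (my own illustrative sketch, not part of the original post): assuming an ideal, Carnot-limited engine that rejects heat to 20 °C surroundings, the standard exergy formula for an incompressible liquid gives the maximum work extractable from a sample of hot water. The masses and temperatures below are made up so that both samples hold the same amount of heat relative to the surroundings.

```python
import math

def max_work_from_water(mass_kg, T_celsius, T0_celsius=20.0, c=4186.0):
    """Upper bound on work extractable from warm water by an ideal
    (Carnot-limited) engine rejecting heat to surroundings at T0.
    Exergy of an incompressible substance:
        W_max = m * c * [(T - T0) - T0 * ln(T / T0)]
    """
    T, T0 = T_celsius + 273.15, T0_celsius + 273.15
    return mass_kg * c * ((T - T0) - T0 * math.log(T / T0))

# Both samples carry the same heat (70 kg*K) relative to 20 C surroundings,
# but very different "quality":
print(max_work_from_water(1.0, 90.0))   # ~30 kJ from 1 kg at 90 C
print(max_work_from_water(10.0, 27.0))  # ~3 kJ from 10 kg at 27 C
```

Same heat content, but roughly ten times more extractable work from the small amount of hot water, which is the "quality" point above.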
 
ramzerimar said:
Let's say we had some entity (like the Laplace Demon) that could completely describe the state of a system (knowing exactly the position and velocity of all the particles within it), so there would be no entropy?

The statistical notion of entropy involves two different notions of the state of a system: (1) the microstate, which is the complete description, and (2) the macrostate, which is the "coarse-grained" description. There can be many microstates that correspond to the same macrostate.

For example, suppose you flip 100 coins. The microstate might be the exact record of which of the coins were heads and which were tails. The macrostate might be just the count of the number of heads and tails. Then the entropy for a macrostate with H heads and T tails would be S = k \log W(H,T,100), where W(H,T,100) is the number of ways to arrange 100 coins so that H of them are heads and T of them are tails. If I remember correctly, the formula for that is: W(H,T,100) = \frac{100!}{H! \, T!}. The zero-entropy states are H=100, T=0 and H=0, T=100. The highest-entropy state is H=50, T=50.
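As a quick sanity check on that formula (my own sketch, not part of the original post), the following snippet evaluates S = k ln W(H, T, 100) for a few head counts, reporting S in units of k:

```python
import math

def entropy_in_units_of_k(H, N=100):
    """S = k * ln W, with W(H, N-H, N) = N! / (H! (N-H)!),
    returned here in units of Boltzmann's constant k."""
    W = math.comb(N, H)   # number of microstates in this macrostate
    return math.log(W)

for H in (0, 1, 25, 50, 100):
    print(H, entropy_in_units_of_k(H))
# H=0 and H=100 give S=0 (a single microstate); H=50 gives the maximum.
```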

But the statistical notion of entropy is relative to a particular notion of macrostate. In thermodynamics, the usual notion of macrostate for a gas of identical particles is the total energy, the number of particles, and the volume.
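For a concrete illustration of entropy as a function of exactly that (E, N, V) macrostate (again my own sketch, not from the thread), here is the Sackur-Tetrode formula for a monatomic ideal gas; the choice of argon at 298.15 K and 1 atm is just an assumed test case.

```python
import math

k = 1.380649e-23      # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s

def sackur_tetrode(E, V, N, m):
    """Entropy (J/K) of a classical monatomic ideal gas whose macrostate
    is given by total energy E, volume V, and particle number N,
    for particles of mass m."""
    return N * k * (math.log((V / N) * (4 * math.pi * m * E / (3 * N * h**2))**1.5) + 2.5)

# One mole of argon at 298.15 K and 1 atm (illustrative conditions)
N = 6.02214076e23
m = 39.948 * 1.66053907e-27          # kg per argon atom
T = 298.15
E = 1.5 * N * k * T                  # ideal-gas internal energy
V = N * k * T / 101325.0             # ideal-gas volume at 1 atm

print(sackur_tetrode(E, V, N, m))    # ~155 J/K, close to argon's measured molar entropy
```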
 