Statistical Entropy: How Can Entropy Be Zero in View of the Second Law?

  • Context: Graduate
  • Thread starter: touqra
  • Tags: Entropy, Statistical

Discussion Overview

The discussion revolves around the concept of statistical entropy, particularly in relation to the second law of thermodynamics. Participants explore the implications of knowledge and measurement on entropy, using analogies such as cards and particles in a box to illustrate their points. The conversation touches on definitions of entropy, the role of microstates, and the relationship between knowledge and uncertainty.

Discussion Character

  • Debate/contested
  • Conceptual clarification
  • Technical explanation

Main Points Raised

  • One participant suggests that knowing the state of a card reduces entropy to zero, questioning how this aligns with the second law of thermodynamics.
  • Another participant argues that the entropy depends on the definition and that the probability of the card being one of 26 letters is 1/26, implying a misunderstanding of entropy's context.
  • A different viewpoint states that entropy remains the same regardless of whether the card is face-down or face-up, as there are still 26 possible states in the system.
  • One participant emphasizes that statistical mechanics defines entropy as kln W, where W is the number of microstates, and asserts that measuring a microstate does not change the entropy.
  • Another participant challenges the idea that knowledge of a microstate reduces entropy, stating that the number of microstates available to the system remains unchanged regardless of knowledge.
  • Further, a participant explains that knowing the exact state of a system does not affect its entropy, using the example of gas in a container to illustrate that configurations can remain the same despite knowledge of positions.

Areas of Agreement / Disagreement

Participants express differing views on the relationship between knowledge and entropy, with no consensus reached. Some argue that knowledge reduces entropy, while others maintain that entropy is independent of what is known about the system.

Contextual Notes

Participants reference different interpretations of microstates and the definitions of entropy, indicating potential limitations in their arguments. The discussion highlights the complexity of entropy in statistical mechanics and the nuances in understanding its implications in various contexts.

touqra
Suppose I have a face-down card bearing one of the 26 letters of the English alphabet, but I don't know which one. Hence, the entropy is k ln 26.
But if I were to turn the card over, I would know exactly which letter it is. Hence, the entropy is now zero.
How can this be in view of the second law?
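The reasoning in the question can be sketched numerically. This is a sketch of the information-theoretic reading of S = k ln W (not a claim about thermodynamic entropy, which is exactly what the thread goes on to dispute); the only external input is the standard SI value of Boltzmann's constant.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

# Before looking: 26 equally likely letters -> W = 26, S = k ln 26.
W_hidden = 26
S_hidden = k_B * math.log(W_hidden)

# After looking: one letter is certain -> W = 1, so S = k ln 1 = 0.
W_known = 1
S_known = k_B * math.log(W_known)

print(S_hidden)  # ~4.50e-23 J/K
print(S_known)   # 0.0
```

The rest of the thread argues that this identification of "states consistent with my knowledge" with W is precisely where the paradox comes from.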
 
What kind of opening and closing of a card can you see in the components of a system that would bring uncertainty and certainty to the observer?
The entropy you are talking about depends on the definition; you cannot talk about it that way. If you are talking about probability, then that is 1/26.
 
Isn't entropy used in the context of a system? I would say the entropy is still the same, because there are still 26 possible states in the system. Turning a card face-up doesn't change anything.
 
What does having a card face-down on a table have to do with entropy?
 
The entropy defined in statistical mechanics is k ln W, where W is the number of possible microstates for a certain energy E. That's the definition of entropy I am using.
I still think my question is valid.
Let's not use cards. Use particles instead.
Suppose I have some particles in a box with total energy E. Initially, the system has an entropy of k ln W. Now, if I were to perform an experiment to determine which microstate the system is in, then, since I now know with certainty which microstate it is, the entropy becomes zero.
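The particles-in-a-box example can be made concrete with a toy model. This is my own sketch, not part of the thread: N distinguishable particles sharing q energy quanta (an Einstein-solid-style model), where W counts the ways the quanta can be distributed. The point the later replies make is that measuring which distribution is realized does not change W.

```python
import math
from itertools import product

k_B = 1.380649e-23  # Boltzmann constant, J/K

def count_microstates(n_particles, quanta):
    """Count the ways n distinguishable particles can share `quanta`
    indivisible energy units (toy Einstein-solid model)."""
    return sum(1 for occ in product(range(quanta + 1), repeat=n_particles)
               if sum(occ) == quanta)

N, q = 3, 4
W = count_microstates(N, q)  # stars-and-bars: C(q + N - 1, N - 1) = C(6, 2) = 15
S = k_B * math.log(W)

# "Measuring" the microstate picks out one member of the ensemble,
# but the number of states available at total energy q is still W = 15.
print(W, S)
```

The entropy is a property of the count W under the energy constraint, which is the position the replies below defend.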
 
Entropy has nothing to do with your knowledge of the system. W is the number of microstates (under the constraints) available to the system. Your knowledge doesn't change that number; it's totally unrelated.
 
touqra said:
The entropy defined in statistical mechanics is k ln W, where W is the number of possible microstates for a certain energy E. That's the definition of entropy I am using.
I still think my question is valid.
You are using the wrong interpretation of a microstate to the card analogy.

Let's not use cards. Use particles instead.
Suppose I have some particles in a box with total energy E. Initially, the system has an entropy of k ln W. Now, if I were to perform an experiment to determine which microstate the system is in, then, since I now know with certainty which microstate it is, the entropy becomes zero.
No, it does not. Knowing which microstate the system is in (which would require measuring all the positions and conjugate momenta simultaneously) does not reduce the phase space of states the system can sample at the given temperature. And it is this number that is relevant to the entropy.
 
Gokul43201 said:
You are using the wrong interpretation of a microstate to the card analogy.

I used the wrong interpretation of microstates? How?
 
You are assuming that entropy is the probability of knowing the state of an object, or more properly, of knowing a system's properties and how likely that knowledge is to be correct. But it isn't. Knowing the state of a system has nothing to do with its entropy. If it did, a low probability of the object being found near its calculated state would imply high entropy, and vice versa: knowing the state of a system nearly perfectly would make its entropy very small. But even if we know exactly how a system exists, or all of its properties, its entropy does not decrease; it remains as it is. The physical meaning of this is that the theory cannot perfectly predict how the system exists from properties like T, P, V, etc., and the number of ways it can exist does not decrease just because you know it.
Example: suppose you fill some gas into a container. By compressing the volume of the container and increasing the pressure, you decrease the entropy: there is less volume but the same number of molecules, so there are fewer configurations for their arrangement. On the other hand, if you know the exact map of where each gas molecule is in the container, that does not change the number of configurations in which the gas can fill the container.
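The compression example can be put into a formula. This is a sketch under my own assumptions (an ideal gas compressed at constant temperature; the mole count and volumes are made up for illustration): the entropy change is ΔS = nR ln(V₂/V₁), and nothing about the observer's knowledge appears in it.

```python
import math

R = 8.314462618  # molar gas constant, J/(mol K)

def delta_S_isothermal(n_mol, V_initial, V_final):
    """Entropy change of an ideal gas expanded or compressed isothermally:
    dS = n R ln(V_final / V_initial). No term depends on what the observer
    knows about individual molecular positions."""
    return n_mol * R * math.log(V_final / V_initial)

# Halving the volume of one mole: entropy drops by R ln 2, whether or not
# anyone has mapped where the molecules are.
dS = delta_S_isothermal(1.0, 2.0e-3, 1.0e-3)
print(dS)  # -R ln 2, about -5.76 J/K
```

The sign is negative because compression reduces the accessible volume, and hence W, for every molecule; knowledge of positions leaves both V and W untouched.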
 
