Statistical Entropy: How Can the Entropy Be Zero in View of the Second Law?

AI Thread Summary
The discussion centers on the concept of entropy and its relationship to knowledge and microstates. It argues that knowing the state of a system, such as a card or particles in a box, does not reduce the entropy of the system, which is defined by the number of possible microstates (W). Participants emphasize that entropy, calculated as k ln W, remains constant regardless of an observer's knowledge of the system's state. The conversation highlights a misunderstanding of how knowledge interacts with entropy, asserting that knowing the microstate does not alter the system's inherent entropy. Ultimately, the key point is that entropy is a property of the system itself, independent of the observer's knowledge.
touqra
Suppose I have a face-down card bearing one of the 26 letters of the English alphabet, but I don't know which one. Hence, the entropy is k ln 26.
But if I were to turn the card over, I would know exactly which letter it is. Hence, the entropy is now zero.
How can this be, in view of the second law?
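For concreteness, here is a minimal Python sketch of the arithmetic in the question; the W = 26 count comes from the question itself, and k is the standard Boltzmann constant. Everything else is illustrative.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K

# The face-down card hides one of 26 equally likely letters, so the
# question assigns W = 26 possible "microstates".
W = 26
S = k_B * math.log(W)  # S = k ln W
print(f"S = k ln 26 = {S:.3e} J/K")

# After turning the card over there is only one possibility, W = 1,
# so on this reading S = k ln 1 = 0.
print(f"S after looking = {k_B * math.log(1):.1f} J/K")
```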
 
What kind of opening and closing of a system's components could create uncertainty and then certainty in the observer?
The entropy you are talking about depends on the definition; you cannot talk about it that way. If you are talking about probability, then that is 1/26.
 
Isn't entropy used in the context of a system? I would say the entropy is still the same, because there are still 26 possible states in the system. Turning a card face-up doesn't change anything.
 
What does having a card face-down on a table have to do with entropy?
 
The entropy defined in statistical mechanics is k ln W, where W is the number of possible microstates for a certain energy, E. That's the definition of entropy I am using.
I still think my question is valid.
Let's not use cards; use particles instead.
Suppose I have some particles in a box with total energy E. Initially, the system has an entropy of k ln W. Now, if I were to perform an experiment to determine which microstate the system is in, then, since I would know the microstate with certainty, the entropy would become zero.
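To make the claim about W concrete, here is a toy count in Python. The model (N oscillators sharing q indivisible energy quanta) is an illustrative assumption, not something from the thread; its only purpose is to show that W is fixed by the constraints (N, E) alone.

```python
from math import comb

# Toy model (an assumption, not from the thread): N distinguishable
# oscillators share q indivisible energy quanta, fixing the total
# energy E. The microstate count is the stars-and-bars result
# W = C(q + N - 1, q).
def microstate_count(N: int, q: int) -> int:
    return comb(q + N - 1, q)

N, q = 5, 10
W = microstate_count(N, q)
print(f"W for N={N}, E={q} quanta: {W}")  # 1001

# W depends only on the constraints (N, E). Measuring which of the W
# microstates the system occupies changes neither N nor E, so the
# count, and with it S = k ln W, is unchanged.
```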
 
Entropy has nothing to do with your knowledge of the system. W is the number of microstates available to the system under its constraints. Your knowledge doesn't change that number; it's totally unrelated.
 
touqra said:
The entropy defined in statistical mechanics is k ln W, where W is the number of possible microstates for a certain energy, E. That's the definition of entropy I am using.
I still think my question is valid.
You are using the wrong interpretation of a microstate in the card analogy.

Let's not use cards; use particles instead.
Suppose I have some particles in a box with total energy E. Initially, the system has an entropy of k ln W. Now, if I were to perform an experiment to determine which microstate the system is in, then, since I would know the microstate with certainty, the entropy would become zero.
No, it does not. Knowing which microstate the system is in (which would require measuring all the positions and conjugate momenta simultaneously) does not reduce the phase space of states the system can sample at the given temperature, and it is this number that is relevant to the entropy.
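A small simulation can illustrate this point. The setup below (n two-state "coins", with thermal contact modeled as random single flips) is a hypothetical construction, not anything from the thread: start the system in an exactly known microstate and watch it sample the full set of 2^n states anyway.

```python
import random

# Toy illustration (a constructed example): n two-state "coins" give
# 2**n microstates. Suppose a measurement pins down the exact state;
# random single-coin flips stand in for contact with a heat bath.
random.seed(0)
n = 4
state = (0,) * n       # the exactly known microstate after "measurement"
visited = {state}

for _ in range(10_000):
    i = random.randrange(n)
    s = list(state)
    s[i] ^= 1          # one random flip, standing in for the heat bath
    state = tuple(s)
    visited.add(state)

print(f"states sampled: {len(visited)} of {2**n}")  # 16 of 16
```

Knowing the starting microstate exactly does nothing to shrink the set of states the dynamics can reach, which is the number that enters S = k ln W.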
 
Gokul43201 said:
You are using the wrong interpretation of a microstate in the card analogy.

I used the wrong interpretation of microstates? How?
 
You are assuming that entropy is the probability of knowing the state of an object, or more properly a system's properties, and the probability of that knowledge being correct. But it isn't. Knowing the state of a system has nothing to do with its entropy. On that reading, if the probability of the system being near its calculated state were small, the object would have high entropy, and vice versa: knowing the state of a system almost perfectly would make the entropy very small. But even if we know exactly how a system exists, and all of its properties, its entropy does not decrease; it remains as it is. The physical meaning of this is that the theory cannot predict perfectly how the system exists from macroscopic properties like T, P, and V. The number of ways it can exist does not decrease just because you know it.
Example: fill some gas into a container. By compressing the container's volume and increasing the pressure, you decrease the entropy: there is less volume while the number of molecules stays the same, so there are fewer configurations for their arrangement. But if you merely know the exact map of where the gas molecules are in the container, that does not change the number of configurations in which the gas can fill the container.
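The compression example matches the standard ideal-gas result Delta S = N k ln(V2/V1) for an isothermal volume change. A minimal sketch, with one mole as a purely illustrative particle number:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# Standard ideal-gas result for an isothermal volume change:
# Delta S = N k ln(V2 / V1); compression (V2 < V1) gives Delta S < 0.
def delta_S(N: float, V1: float, V2: float) -> float:
    return N * k_B * math.log(V2 / V1)

N = 6.022e23  # one mole of molecules, purely illustrative
print(f"Halving the volume: dS = {delta_S(N, 1.0, 0.5):.2f} J/K")
# about -5.76 J/K. Knowing where every molecule sits changes none of
# N, V, or T, so it leaves the configuration count, and S, alone.
```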
 