
Statistical entropy

  1. Mar 11, 2006 #1
    Suppose I have a face-down card bearing one of the 26 letters of the English alphabet, but I don't know which one. Hence, the entropy is k ln 26.
    But if I turn the card over, I know exactly which letter it is, so the entropy is now zero.
    How can this be in view of the second law?
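    As a quick arithmetic check on the k ln 26 figure (a sketch only: it treats the 26 letters as equally likely "microstates" and uses the Boltzmann constant for k):

    ```python
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    W = 26  # 26 equally likely "microstates", one per letter
    S = k_B * math.log(W)  # S = k ln W

    print(f"ln 26 = {math.log(W):.4f}")   # about 3.258
    print(f"S = k ln 26 = {S:.3e} J/K")
    ```

    The resulting entropy is tiny (on the order of 10^-23 J/K), which is why information-sized entropies are negligible next to thermodynamic ones.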
  3. Mar 11, 2006 #2
    What kind of opening and closing can you see in the components of a physical system that would bring uncertainty or certainty to an observer?
    The entropy you are talking about depends on the definition, so you cannot argue that way. If you are talking about probability, then that is 1/26.
  4. Mar 11, 2006 #3



    Isn't entropy used in the context of a system? I would say the entropy is still the same, because there are still 26 possible states in the system. Turning a card face-up doesn't change anything.
  5. Mar 11, 2006 #4



    Staff: Mentor

    What does having a card face-down on a table have to do with entropy?
  6. Mar 11, 2006 #5
    The entropy defined in statistical mechanics is S = k ln W, where W is the number of possible microstates for a given energy E. That's the definition of entropy I am using.
    I still think my question is valid.
    Let's not use cards. Use particles instead.
    Suppose I have some particles in a box with total energy E. Initially, the system has an entropy of k ln W. Now, if I perform an experiment to determine which microstate the system is in, then, since I know the microstate with certainty, the entropy becomes zero.
    Last edited: Mar 11, 2006
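    The disagreement in this thread can be made concrete with a toy model (a sketch, assuming N distinguishable particles, each free to sit in the left or right half of a box):

    ```python
    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    N = 100      # toy system: 100 distinguishable particles
    W = 2 ** N   # each particle can occupy the left or right half

    # Boltzmann entropy S = k ln W, which reduces to N k ln 2 here
    S = k_B * math.log(W)
    print(f"S = {S:.3e} J/K  (equals N k ln 2 = {N * k_B * math.log(2):.3e} J/K)")

    # Measuring which of the 2**N microstates is actually realized does not
    # change W: W counts the states *available* under the constraints,
    # not the states consistent with the observer's knowledge.
    ```

    This is the point the replies below make: W is fixed by the constraints on the system, not by what any observer happens to know.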
  7. Mar 12, 2006 #6


    Science Advisor
    Homework Helper

    Entropy has nothing to do with your knowledge of the system. W is the number of microstates available to the system under the constraints. Your knowledge doesn't change that number; it's totally unrelated.
  8. Mar 12, 2006 #7


    Staff Emeritus
    Science Advisor
    Gold Member

    You are applying the wrong interpretation of a microstate to the card analogy.

    No, it does not. Your knowing which microstate the system is in (which would require your measuring all the positions and conjugate momenta simultaneously) does not reduce the phase space of states the system can sample at the given temperature. And it is this number that's relevant to the entropy.
  9. Mar 12, 2006 #8
    I used the wrong interpretation of microstates? How?
  10. Mar 13, 2006 #9
    You are assuming that entropy is the probability of knowing the state of an object, or more properly of a system's properties, and the probability of that knowledge being correct. But it isn't. Knowing the state of a system has nothing to do with entropy. On that view, if the probability of the system being found near the calculated state were low, the system would have high entropy, and vice versa: knowing the state of a system almost perfectly would make the entropy very small. But even if we know how a system exists, or all of its properties, its entropy does not decrease; it remains as it is. The physical meaning of this is that the theory cannot perfectly predict how the system should exist from properties like T, P, V, etc. The number of ways it can exist does not decrease just because you know which one is realized.
    Example: suppose you fill some gas into a container. By compressing the volume of the container and increasing the pressure, you decrease the entropy: there is less volume but the same number of molecules, so there are fewer configurations in which they can be arranged. On the other hand, if you know the exact map of where every gas molecule is in the container, that does not change the number of configurations in which the gas can fill the container.
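    This compression example can be made quantitative under standard ideal-gas assumptions (a sketch: one mole compressed to half its volume at constant temperature, so each molecule's accessible volume halves and W scales as (V2/V1)^N):

    ```python
    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    N = 6.022e23         # one mole of molecules (Avogadro's number)
    V1, V2 = 2.0, 1.0    # volume is halved (arbitrary units)

    # Configurational entropy change at constant T:
    # W scales as V**N for the positions, so
    # dS = k ln(W2/W1) = N k ln(V2/V1)
    delta_S = N * k_B * math.log(V2 / V1)
    print(f"dS = {delta_S:.2f} J/K")  # negative: compression lowers entropy
    ```

    For one mole this works out to -R ln 2, about -5.76 J/K, and, as the post says, knowing where each molecule happens to be does not enter the count at all.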