
B Clarifying entropy

  1. May 23, 2016 #1
    Considering a closed system containing an ideal gas (in a low-entropy state), are the following statements correct?

    The gas is in a certain state we can assign an entropy value to.

    Let X be the set of all states of lower entropy than the current state,
    and Y the set of all states of higher entropy.

    We can assign a probability value to each of the states within X and Y, giving us the chance of the gas reaching any of those states from the current state.
    Question: Are there states with exactly zero probability, which cannot be reached at all from the current state (without having to pass through other states)? In general, are all states equally probable, or do the probabilities of reaching a given state depend on the current state of the gas?

    The probability of reaching a state of higher entropy contained in Y is higher than that of reaching one in X.
    However, it is not impossible to reach a state within X, therefore (rarely) decrease entropy. Hence, it is more likely for entropy to increase, rather than decrease but not impossible for it to decrease.

    At some point, as entropy keeps increasing more often than not, we reach the equilibrium state of the gas.

    Question: Is the equilibrium state of the gas really the maximum entropy state? If yes, then do all other states which are contained in the set of X have exactly zero probability of occurring?

    Or is the equilibrium state a state where the probabilities of reaching one of the states contained within X is equal to reaching one of the states contained in Y?

    Therefore, there are states of higher entropy still, which are not equilibrium states and where reaching one of those states from the equilibrium state is just as likely/unlikely as reaching a state contained within X of lower entropy.

    If this were the case, then even the statement "entropy is more likely to increase than decrease" would no longer be true once the equilibrium state is reached.
    Question: If above is not the case, then is there a quantum mechanical proof for it?

    Question: Can the gas being at maximum entropy fall back to its initial state of minimum entropy just out of sheer coincidence? Extremely low chance, but possible?
    Before you close my thread, this is what Susskind explains at 24:44 in this video.

    He also states, at 16:44 in a second video, that entropy ALMOST always increases, but not always.
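A toy model can make the "extremely low chance, but possible" intuition concrete. This is my own illustration, not the OP's exact setup: N molecules, each independently in the left or right half of the box, so the number found in the left half follows a binomial distribution.

```python
from math import comb

# Toy illustration (an assumed model, not the thread's exact setup): N ideal-gas
# molecules, each independently in the left or right half of a box. The
# macrostate "k molecules in the left half" has binomial probability.
N = 50

def prob_k_left(k):
    # Microstates with k molecules on the left, over all 2^N microstates.
    return comb(N, k) / 2 ** N

# Every macrostate, including the lowest-entropy one (all molecules in one
# half), has nonzero probability; it is just astronomically unlikely.
print(prob_k_left(0))       # all 50 in one half: 2^-50, about 8.9e-16
print(prob_k_left(N // 2))  # the even split dominates, about 0.11
```

On this toy model the answer to the last question would be yes: the low-entropy configuration has nonzero probability, just one far too small to expect in practice.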
     
    Last edited: May 23, 2016
  3. May 23, 2016 #2
    Why was this moved? I think my questions imply that I am interested in the quantum mechanical point of view on entropy.

    From what I have been reading, an ideal gas inside a closed system has, according to classical physics, infinitely many states, whereas according to QM the states are finite in number. Unless this is incorrect, I believe my thread should go back to QM.
     
  4. May 23, 2016 #3
    I am not really an expert on this, but highest entropy is absolute zero, isn't it? And that doesn't exist, so entropy 'wobbles' around some equilibrium I guess...
     
  5. May 23, 2016 #4

    Strilanc (Science Advisor)
    Individual states don't have an entropy, probability distributions of states have an entropy. If you came up with an optimal compression scheme for describing a state sampled from the distribution, the entropy of the distribution is the expected length of the description.

    Since individual states don't have an entropy in the sense your question is assuming, I'm not sure how to answer it.
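    Strilanc's compression picture can be sketched in a few lines (Python; the example distributions are made up for illustration): the entropy of a distribution is the expected number of bits an optimal code spends describing a sampled state.

```python
import math

def shannon_entropy(probs):
    # Expected description length, in bits, under an optimal compression
    # scheme for states sampled from this distribution.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 states costs 2 bits per sample on average;
# a sharply peaked distribution costs far less, since the likely state
# gets a very short code word.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # about 0.24
```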
     
  6. May 23, 2016 #5
    There is a misconception in a statement he makes, to the effect that in infinite time, everything happens. That is simply not so. It comes from not recognizing that infinities come in different sizes.

    In infinite time, you can never calculate the entire decimal representation of pi.
    In infinite time, you can never name all the real numbers between 0 and 1.
    In infinite time you can never count all the integers.
    In infinite time, you can never flip a coin an infinite number of times and get all heads.
    In infinite time, you can never calculate the final 3 in the decimal representation of one third, 0.33333333333333333333333333333333...

    Take a single gas molecule in a 1 cubic meter box. How many locations are within that box? An infinite number. That one gas molecule has an infinite number of positions. And in a cubic meter of air, there are 2.5 x 10^25 air molecules.

    People grasp the law of large numbers, and can imagine that in infinite time, flipping a coin in sets of 1 million times, eventually, a million heads will occur. We accept that infinity is larger than that very large improbability. But how likely is it that you can flip a coin an infinite number of times and get all heads? It will take many infinities of infinities to get that result.

    My examples may be somewhat forced, but the idea that there is some probability of the air molecules in a cubic meter of gas collecting at one corner is somewhat incorrect. There is a probability that is infinitely small. And it is not correct to assume that the passage of infinite time is large enough to generate all of the infinite arrangements of the gas, including that infinitely improbable one. Infinity is a bit tricky to deal with, and there are different types of infinity. Infinity^infinity different types.
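    The all-heads example can be checked directly. A minimal sketch (Python, using exact rational arithmetic): the probability of n heads in a row is 2^-n, which tends to zero, so "infinitely many heads" is not merely rare but has probability exactly zero.

```python
from fractions import Fraction

def prob_all_heads(n):
    # Exact probability that n fair coin flips all land heads.
    return Fraction(1, 2) ** n

# The probability halves with every extra flip; the limit as n grows
# without bound is exactly zero, which is the point about infinities above.
print(prob_all_heads(10))          # 1/1024
print(float(prob_all_heads(100)))  # about 7.9e-31
```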
     
  7. May 23, 2016 #6
    Which is why I posted this in the QM forum. According to QM there is NOT an infinite number of states but a finite number, which means that given ENOUGH time, NOT infinite time, the gas molecules will find themselves in the corner of the box once again.
    Some moderator decided however that this is the right place to ask that question and moved it.
     
  8. May 23, 2016 #7
    Hmm. News to me. I can't see why a theory would put a finite number on it. Where did you see that?
     
  9. May 23, 2016 #8
    https://en.wikipedia.org/wiki/Gibbs_paradox
     
  10. May 23, 2016 #9
    I'm not sure I accept that. I still need to think on the math in that a bit more ...

    But even if that were the case ... there is still a disproportionate ratio of states to time. We cannot build a box that lasts very long. Planets like Earth have an atmosphere that changes in composition and temperature daily. And the Earth is only 4 billion years old; the universe, something like 14 billion.

    Going back to my million coin flips of heads: say I can flip and measure once every second. I am looking for 1 out of 2^1,000,000 possible states. 14 billion years is only 4.4x10^17 seconds, and 2^1,000,000 dwarfs 10^17. So even a relatively simple system of 1 million flipped coins makes it prohibitive to find that "low entropy" state in a finite universe of time.

    If I come to accept the finite number of states for a defined box ... I will still be inclined to regard that box as likely to expire before the time necessary to see anything other than the ordinary, which is overwhelmingly probable.
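    The arithmetic in the post above is easy to verify. A quick sketch (Python; one flip-and-measure attempt per second is the post's own assumption):

```python
import math

# Searching for 1 specific outcome among 2^1,000,000 equally likely ones,
# at one attempt per second, against the ~14-billion-year age of the universe.
log10_states = 1_000_000 * math.log10(2)   # 2^1,000,000 has ~301,030 digits
age_universe_s = 4.4e17                    # ~14 billion years in seconds

print(round(log10_states))                 # 301030
print(math.log10(age_universe_s))          # about 17.6: utterly dwarfed
```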
     
  11. May 24, 2016 #10
    The more I think about it, it makes a certain amount of sense that there are a finite number of states. I need to look at more than the Wikipedia reduction, but the gas particles have momentum and location, and the uncertainty in those makes two states indistinguishable unless the particles move by more than some small difference in position.

    Regardless of my current pondering of this Wikipedia reduction ... the final number of states is still so large that it is impractical ... it almost requires the passage of infinite time on a bounded, isolated container. I need a more practical prediction.

    If I take that cubic meter of an ideal gas, with 2.5 x 10^25 molecules. And I distribute them across the momentum-location range they can have, and break that down by the quantum size implied by the Heisenberg uncertainty principle, I have an extremely large number. If I just take the spatial range and divide it into pieces of h/2 in size ... since it is 3 dimensions, I have an h^3 in my denominator.

    Planck's constant, phrased as momentum x distance, is:
    6.626x10^-34 (kg·m/s) x meters

    Let's assume every particle has a different momentum. Then I would back-of-the-envelope estimate the final number of states as 10^(25+34+34+34) = 10^127. If I estimate the time interval between states as the time when any two gas particles have moved a distinguishable distance from each other ... I will use the Planck length, which is about 1.6x10^-35 m. The fastest gas molecule is moving much slower than 10,000 m/s; divide that length by 10,000 and the states change roughly every 10^-39 seconds.

    That leads me to estimate that the most improbable single state can be expected within about 10^88 seconds, somewhat larger than the 10^17-second age of the universe. With that sort of result I come up with the inability to predict anything. I simply cannot make a bounded, isolated cubic meter of ideal gas, with pressure gauges that let me measure the anomalous, and wait that long.

    So the "low chance" seems to be incredibly low. In practical terms, there is no difference between something impossible and something extraordinarily improbable.

    Please double check the sloppy calculations ... I apologize in advance for errors.
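    Since everything in the estimate above is a power of ten, the arithmetic reduces to adding exponents. A quick check (Python, reproducing the post's own rough numbers):

```python
# All quantities as base-10 exponents, reproducing the rough estimate above.
log10_states = 25 + 34 + 34 + 34     # ~10^127 distinguishable states
log10_dt_s = -39                     # ~10^-39 s between distinguishable states
log10_recurrence_s = log10_states + log10_dt_s
log10_age_universe_s = 17            # ~4.4 x 10^17 s

print(log10_recurrence_s)            # 88, i.e. ~10^88 seconds
print(log10_recurrence_s - log10_age_universe_s)  # 71 orders of magnitude longer
```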
     
  12. Jun 17, 2016 #11
    Can a simplistic understanding of entropy also be stated as the natural tendency of a process to fall to a lower energy state?
     
  13. Apr 2, 2017 #12
    It seems to apply.
     
  14. Apr 3, 2017 #13
    I'd have to agree with the mods.

    Maybe it has to do with the phrasing of it? While I realize that a large volume of an ideal gas does have a quantum state, there is a subtle implication, I think, that the question should be solvable (i.e. changing it perhaps to "100-500 molecules in a gaseous state") in order to have application to QM.

    The way you have phrased it lends itself more to classical thermodynamics, I think, than to a quantum mechanical interpretation. No offense. :D
     