Clarifying Entropy: Does it Always Increase?

  • Thread starter Jeronimus
  • Start date
  • Tags
    entropy
In summary: It's a complex question with many variables, but in the simplest case, if you have a gas in a closed system, the more molecules there are, the less probable it is that they will all end up in the same place.
  • #1
Jeronimus
287
9
Considering a closed system with an ideal gas (in a low-entropy state) inside, are the following statements correct?

The gas is in a certain state we can assign an entropy value to.

Let X be the set of all states with a lower entropy value than the current state, and Y the set of states with a higher entropy value.

We can assign a probability value to each of the states within X and Y, giving us the chance of the gas reaching any of those states from the current state.
Question: Are there states with exactly zero probability, which cannot be reached at all from the current state (without having to go through other states)? In general, are all states equally probable, or do the probabilities of reaching a given state depend on the current state of the gas?

The probability of reaching a state of higher entropy contained in Y is higher than the probability of reaching one in X.
However, it is not impossible to reach a state within X, and therefore (rarely) to decrease entropy. Hence, entropy is more likely to increase than to decrease, but it is not impossible for it to decrease.
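To see how lopsided the X/Y probabilities get, here is a toy calculation. The half-box model is my own illustration (not from the thread): treat each molecule as independently sitting in the left or right half of the box, and ask for the chance that all of them are found in the same half, an extreme low-entropy configuration.

```python
import math

def prob_all_in_one_half(n_molecules: int) -> float:
    """Chance that n independent molecules, each equally likely to sit in
    either half of the box, are all found in the same half."""
    return 2 * 0.5 ** n_molecules  # factor 2: could be the left OR right half

# The chance shrinks exponentially with molecule count:
for n in (10, 100, 1000):
    p = prob_all_in_one_half(n)
    print(f"n = {n:4d}: P(all in one half) ~ 10^{math.log10(p):.0f}")
```

Already at 1000 molecules the probability is of order 10^-301; a real gas has ~10^25 molecules, so the low-entropy states in X are not forbidden, just astronomically outnumbered.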

At some point, as entropy keeps increasing more often than not, we reach the equilibrium state of the gas.

Question: Is the equilibrium state of the gas really the maximum entropy state? If yes, then do all other states which are contained in the set of X have exactly zero probability of occurring?

Or is the equilibrium state a state where the probabilities of reaching one of the states contained within X is equal to reaching one of the states contained in Y?

Therefore, there are states of higher entropy still, which are not equilibrium states and where reaching one of those states from the equilibrium state is just as likely/unlikely as reaching a state contained within X of lower entropy.

If this were the case, then even the statement "entropy is more likely to increase than decrease" would no longer be true once the equilibrium state is reached.
Question: If the above is not the case, then is there a quantum mechanical proof for it?

Question: Can the gas being at maximum entropy fall back to its initial state of minimum entropy just out of sheer coincidence? Extremely low chance, but possible?
Before you close my thread, this is what Susskind explains at 24:44 in this video



Also in this video, at 16:44, he states that entropy ALMOST always increases, but not always.
 
  • #2
Why was this moved? I think my questions imply that I am interested in the quantum mechanical point of view on entropy.

From what I have been reading, an ideal gas inside a closed system has infinitely many states according to classical physics, whereas according to QM the number of states is finite. Unless this is incorrect, I believe my thread should go back to QM.
 
  • #3
I am not really an expert on this, but isn't the lowest entropy at absolute zero? And that can't be reached, so entropy 'wobbles' around some equilibrium I guess...
 
  • #4
Individual states don't have an entropy, probability distributions of states have an entropy. If you came up with an optimal compression scheme for describing a state sampled from the distribution, the entropy of the distribution is the expected length of the description.

Since individual states don't have an entropy in the sense your question is assuming, I'm not sure how to answer it.
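The compression view described above can be made concrete. A minimal sketch of Shannon entropy as an expected description length (the specific distributions are my own examples):

```python
import math

def shannon_entropy(probs):
    """Expected description length, in bits, of a state sampled from `probs`."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform distribution over 4 states needs 2 bits per sample;
# a sharply peaked distribution is far more compressible.
print(shannon_entropy([0.25] * 4))                # 2.0
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))  # well under 1 bit
```

The entropy belongs to the distribution, not to any single sampled state, which is exactly the distinction this post is drawing.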
 
  • #5
There is a misconception in a statement he makes, which is to the effect that in infinite time, everything happens. That is simply not so. It comes from not recognizing that infinities come in different sizes.

In infinite time, you can never calculate the entire decimal representation of pi.
In infinite time, you can never name all the real numbers between 0 and 1.
In infinite time, you can never count all the integers.
In infinite time, you can never flip a coin an infinite number of times and get all heads.
In infinite time, you can never calculate the final 3 in the decimal representation of one third, 0.33333333333333333333333333333333...

Take a single gas molecule in a 1 cubic meter box. How many locations are there within that box? Infinitely many. That one gas molecule has an infinite number of possible positions. And in a cubic meter of air, there are 2.5 x 10^25 air molecules.

People grasp the law of large numbers, and can imagine that in infinite time, flipping a coin in sets of 1 million times, eventually, a million heads will occur. We accept that infinity is larger than that very large improbability. But how likely is it that you can flip a coin an infinite number of times and get all heads? It will take many infinities of infinities to get that result.

My examples may be somewhat forced, but the idea that there is some probability of the air molecules in a cubic meter of gas collecting in one corner is somewhat misleading. There is a probability, but it is infinitesimally small. And it is not correct to assume that the passage of infinite time is enough to generate all of the infinite arrangements of the gas, including that infinitely improbable one. Infinity is a bit tricky to deal with, and there are different types of infinity. Infinity^infinity different types.
 
  • #6
votingmachine said:
And it is not correct to assume that the passage of infinite time is large enough to generate all of the infinite arrangements of the gas, including that infinitely improbable one. Infinity is a bit tricky to deal with, and there are different types of infinity. Infinity^infinity different types.

Which is why I posted this in the QM forum. According to QM there is NOT an infinite number of states but a finite number, which means that given ENOUGH time, NOT infinite time, the gas molecules will find themselves in the corner of the box once again.
Some moderator decided, however, that this is the right place to ask that question and moved it.
 
  • #7
Hmm. News to me. I can't see why a theory would put a finite number on it. Where did you see that?
 
  • #8
votingmachine said:
Hmm. News to me. I can't see why a theory would put a finite number on it. Where did you see that?

The entropy is proportional to the logarithm of the number of states that the gas could have while satisfying these constraints. In classical physics, the number of states is infinitely large, but according to quantum mechanics it is finite.

https://en.wikipedia.org/wiki/Gibbs_paradox
 
  • #9
I'm not sure I accept that. I still need to think on the math in that a bit more ...

But even if it were the case ... there is still a disproportionate ratio of states to time. We cannot build a box that lasts very long. Planets like Earth have an atmosphere that changes in composition and changes temperature daily. And the Earth is only 4 billion years old; the universe, something like 14 billion years.

If I go back to my million coin flips of heads: say I can flip and measure every second. I am looking for 1 out of 2^1,000,000 possible states. 14 billion years is only 4.4x10^17 seconds, and 2^1,000,000 dwarfs 10^17. So even a relatively simple system ... 1 million coins flipped ... makes it prohibitive to find that "low entropy" state in a finite universe of time.
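The mismatch above is easy to check in logarithms, since 2^1,000,000 itself is far too large to write out:

```python
import math

# Compare the state count for a million coin flips with the age of the
# universe in seconds (~4.4e17 s, as used above).
log10_states = 1_000_000 * math.log10(2)  # log10 of 2^1,000,000
log10_age = math.log10(4.4e17)            # log10 of seconds available

print(f"2^1,000,000 ~ 10^{log10_states:.0f}")       # ~ 10^301030
print(f"age of universe ~ 10^{log10_age:.1f} seconds")
```

At one flip-set per second, the available time covers a vanishing fraction, roughly 10^(17 - 301030), of the state space.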

If I come to accept the finite number of states for a defined box ... I will still be inclined to regard that box as likely to expire before the time necessary to see anything other than the ordinary, which is overwhelmingly probable.
 
  • #10
The more I think about it, it makes a certain amount of sense that there is a finite number of states. I need to look at more than the Wikipedia summary, but the gas particles have momentum and location, and the uncertainty in those makes two states indistinguishable unless the particles move by more than some small difference in location.

Jeronimus said:
Question: Can the gas being at maximum entropy fall back to its initial state of minimum entropy just out of sheer coincidence? Extremely low chance, but possible?

Regardless of my current pondering of this Wikipedia summary ... the final number of states is still so large that it is impractical ... it almost requires the passage of infinite time in a bounded, isolated container. I need a more practical prediction.

If I take that cubic meter of an ideal gas, with 2.5 x 10^25 molecules, and I distribute them across the momentum-location range they can have, and break that down by the quantum cell size implied by the Heisenberg uncertainty principle, I get an extremely large number. If I just take the spatial range and divide it into pieces of order h in size ... since there are 3 dimensions, I have an h^3 in my denominator.

Planck's constant, when phrased as momentum x distance, is:
6.626x10^-34 (kg m/s) x m

Let's assume every particle has a different momentum. Then I would back-of-the-envelope estimate the final number of states as 10^(25+34+34+34) = 10^127. If I estimate the time interval between states as the time for any two gas particles to move a distinguishable distance from each other ... I will use the Planck length. The fastest gas molecule moves much slower than 10,000 m/s. The Planck length is about 1.6x10^-35 m; divide that by 10,000 and the states change roughly every 10^-39 seconds.

That leads me to estimate that the most improbable single state can be expected within 10^88 seconds, somewhat larger than the 10^17-second age of the universe. I conclude that I cannot predict anything with that sort of result. I simply cannot build a bounded, isolated cubic meter of ideal gas, with pressure gauges that let me measure the anomalous, and wait that long.
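The back-of-the-envelope numbers above can be reproduced in a few lines (all values are the post's rough estimates, not precise physics):

```python
import math

log10_states = 25 + 34 + 34 + 34   # ~10^127 distinguishable states
planck_length = 1.6e-35            # metres, approximate Planck length
v_max = 1e4                        # m/s, generous bound on molecular speed
dt = planck_length / v_max         # seconds between "state changes"

# Expected wait to cycle through the whole state space:
log10_wait = log10_states + math.log10(dt)
print(f"states ~ 10^{log10_states}, expected wait ~ 10^{log10_wait:.0f} s")
```

The result, ~10^88 seconds, confirms the comparison with the ~10^17-second age of the universe.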

So the "low chance" seems to be incredibly low. In practical terms, there is no difference between something impossible and something extraordinarily improbable.

Please double check the sloppy calculations ... I apologize in advance for errors.
 
  • #11
Can a simplistic understanding of entropy also be stated as the natural tendency of a process to fall to a lower energy state?
 
  • #12
It seems to apply.
 
  • #13
Jeronimus said:
Why was this moved? I think my questioning implies that i am interested in the quantum mechanical point of view on entropy.

From what i have been reading, an ideal gas inside a closed system, according to classical physics, has infinite states whereas according to QM the states are finite. Unless this is incorrect i believe my thread should go back to QM.

I'd have to agree with the mods.

Maybe it has to do with the phrasing of it? While I realize that a large volume of an ideal gas does have a quantum state, there is a subtle implication, I think, that the problem should be solvable (i.e. changing the question perhaps to "100-500 molecules in a gaseous state") in order to have application to QM.

The way you have phrased it lends itself more to classical thermodynamics, I think, than to a quantum mechanical interpretation. No offense. :D
 

1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. It is a thermodynamic property that is used to describe the amount of energy that is unavailable for work in a system.
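One standard way to make "disorder" quantitative, not stated explicitly in the answer above, is Boltzmann's relation S = k_B ln W, where W counts the microstates compatible with a macrostate. A minimal sketch:

```python
import math

K_B = 1.380649e-23  # J/K, Boltzmann constant

def boltzmann_entropy(n_microstates: float) -> float:
    """Boltzmann's relation: S = k_B * ln(W)."""
    return K_B * math.log(n_microstates)

print(boltzmann_entropy(1))      # 0.0 -- a single microstate means zero entropy
print(boltzmann_entropy(1e127))  # more microstates, more entropy
```

A macrostate realizable in only one way has zero entropy; the entropy grows (logarithmically) with the number of ways the macrostate can be realized.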

2. Why does entropy always increase?

According to the Second Law of Thermodynamics, the total entropy of an isolated system never decreases over time. This is because natural processes tend to move towards states of greater disorder.

3. Does entropy only apply to physical systems?

No, entropy can also be applied to non-physical systems such as information theory or economic systems. Basically, any system that involves a flow of energy and a tendency towards disorder can be described using entropy.

4. Can entropy ever decrease?

In an isolated system, entropy never decreases. However, in an open system where energy and matter can enter and leave, the entropy of one part of the system can decrease as long as the overall entropy of the entire system increases.

5. How is entropy related to the concept of time?

Entropy is often associated with time because it is a measure of the direction in which natural processes tend to move. As time passes, entropy increases and systems become more disordered. This is why entropy is sometimes referred to as the "arrow of time".
