Help me understand Entropy & P's recurrence theorem

In summary: yes, one would need to sample the system over an adequate period of time to determine whether it was at or near maximum entropy.
  • #1
BWV
Reading the very interesting discussion in the long thread on the second law, I had a couple of much more basic questions regarding entropy and the Poincaré recurrence theorem.

A) Is the reduction of entropy implied by the theorem a real, unresolved paradox in physics?

B) If you take a frictionless pool table and a system that begins with a break, the recurrence theorem says that eventually the system will return to the state where the balls are racked and the cue ball is traveling toward them with its original momentum. How would you quantify the reduction in entropy of this state, if any?

Is maximum entropy achieved when all the balls on the table have the same momentum / kinetic energy?

This seems odd, since at random there would be elastic collisions in which all the momentum of one ball was transferred to another, leaving the first ball at rest. Having this happen to every ball but the cue ball, with the balls ending up re-racked, seems unlikely, though.
 
  • #2
BWV said:
Reading the very interesting discussion in the long thread on the second law, I had a couple of much more basic questions regarding entropy and the Poincaré recurrence theorem.

A) Is the reduction of entropy implied by the theorem a real, unresolved paradox in physics?
No. For a system of even a small number of objects, the time required for the system to return to its original state vastly exceeds the lifetime of the universe. We don't have unlimited amounts of time, any more than we have unlimited amounts of space or energy.
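
To get a feel for the numbers, here is a crude back-of-the-envelope sketch in Python (the tolerances, the ball count, and the collision rate are all assumptions chosen for illustration, not a rigorous calculation):

[code=python]
# Suppose a "recurrence" requires each of 16 balls on a 2D table to return
# to within 1/1000 of the table's extent in each position coordinate and
# within 1/1000 of the velocity range in each velocity component
# (4 phase-space coordinates per ball).
p_per_ball = (1e-3) ** 4              # fraction of one ball's phase space
p_all = p_per_ball ** 16              # all 16 balls simultaneously
rate = 10.0                           # assumed collisions per second (the "clock")

t_recur = 1.0 / (p_all * rate)        # expected waiting time, in seconds
age_of_universe = 4.3e17              # seconds, roughly

print(f"estimated recurrence time ~ {t_recur:.1e} s")
print(f"age of the universe       ~ {age_of_universe:.1e} s")
[/code]

Even with these very loose tolerances the waiting time comes out around 10^191 seconds, which is the sense in which the recurrence is irrelevant in practice.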

B) If you take a frictionless pool table and a system that begins with a break, the recurrence theorem says that eventually the system will return to the state where the balls are racked and the cue ball is traveling toward them with its original momentum. How would you quantify the reduction in entropy of this state, if any?
The concept of entropy is associated with the dispersion of energy. Energy naturally disperses. The initial state with all the energy concentrated in the cue ball is a state of low entropy. In the post-break state that energy is shared among all the balls, a state of higher entropy.

Is maximum entropy achieved when all the balls on the table have the same momentum / kinetic energy?
No. Maximum entropy occurs when the distribution of speeds of the balls follows the Maxwell-Boltzmann speed distribution. That will happen fairly quickly.
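
As a minimal numerical sketch of that relaxation (assuming equal-mass, frictionless disks that collide pairwise along random impact directions; a caricature, not a full pool-table simulation):

[code=python]
import numpy as np

rng = np.random.default_rng(0)
N = 5000
v = np.zeros((N, 2))
v[0] = [10.0, 0.0]          # the "break": all kinetic energy in one ball

for _ in range(100_000):
    i, j = rng.choice(N, size=2, replace=False)
    theta = rng.uniform(0.0, 2.0 * np.pi)
    n = np.array([np.cos(theta), np.sin(theta)])   # random line of centers
    # Equal-mass elastic collision: the velocity components along the line
    # of centers are exchanged; kinetic energy and momentum are conserved.
    vi_n, vj_n = v[i] @ n, v[j] @ n
    v[i] += (vj_n - vi_n) * n
    v[j] += (vi_n - vj_n) * n

speeds = np.linalg.norm(v, axis=1)
# The speed histogram relaxes toward the 2D Maxwell-Boltzmann form,
# p(s) proportional to s * exp(-s^2 / <s^2>), not toward equal speeds.
print("mean speed:", speeds.mean(), " rms speed:", np.sqrt((speeds**2).mean()))
[/code]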

AM
 
  • #3
BWV said:
B) If you take a frictionless pool table and a system that begins with a break, the recurrence theorem says that eventually the system will return to the state where the balls are racked and the cue ball is traveling toward them with its original momentum. How would you quantify the reduction in entropy of this state, if any?

The recurrence theorem says that the initial state will be revisited. The question is then: how long will it take? I don't know if there is a satisfactory answer to this. Obviously very long, since thermodynamics works. But in a way, just saying it'll take a very long time is unsatisfying. Is it possible to find some type of classical system with an initial state for which the recurrence time is short?
 
  • #4
Actually, Poincaré's theorem says that it will return arbitrarily close to the initial state. The trajectories cannot intersect, since the equations of motion are deterministic, so the system never actually returns to the exact same state.
 
  • #5
To your question, RedX: it all depends on what you mean by "short" and "near".

The Wikipedia article on "recurrence plots" gives a sketch of the quantitative analysis and applications bearing on your question.
 
  • #6
Pythagorean said:
Actually, Poincaré's theorem says that it will return arbitrarily close to the initial state. The trajectories cannot intersect, since the equations of motion are deterministic, so the system never actually returns to the exact same state.

Periodic trajectories are not forbidden, even for complex systems.
However, periodic trajectories are not "generic".

Periodic trajectories occur only by chance, when the various frequencies in the system are resonant:

[tex]f_1/f_2 = n_1/n_2[/tex], where [itex]n_1[/itex] and [itex]n_2[/itex] are integers

(the same condition, involving many frequencies, applies to larger systems)

When [itex]n_1[/itex] and [itex]n_2[/itex] are large numbers, the recurrence time becomes large as well.
And when the frequency ratio is irrational, the exact recurrence time is infinite,
even though one can come arbitrarily close to recurrence in a finite time.

So, in a sense, the second law is about irrational numbers!
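
A small numerical illustration of this point, for two uncoupled phases winding around a 2-torus (the frequencies and the closeness threshold are arbitrary choices):

[code=python]
import numpy as np

def distance_from_start(f1, f2, t):
    """Phase-space distance from the initial state (both phases zero)."""
    d1 = np.abs(np.angle(np.exp(2j * np.pi * f1 * t)))  # phase 1 offset, in [0, pi]
    d2 = np.abs(np.angle(np.exp(2j * np.pi * f2 * t)))  # phase 2 offset, in [0, pi]
    return np.hypot(d1, d2)

t = np.linspace(0.0, 200.0, 2_000_001)
for f2, label in [(1.5, "rational ratio f1/f2 = 2/3"),
                  (np.sqrt(2.0), "irrational ratio f1/f2 = 1/sqrt(2)")]:
    d = distance_from_start(1.0, f2, t)
    hits = t[(d < 0.05) & (t > 0.5)]      # near-recurrences, excluding t ~ 0
    first = f"t = {hits[0]:.4f}" if hits.size else "none up to t = 200"
    print(f"{label}: first near-return at {first}")

# With the rational ratio the orbit is exactly periodic (period t = 2), so the
# first near-return shows up at t ~ 2. With the irrational ratio the orbit
# never recurs exactly, only approximately, and tightening the 0.05 threshold
# pushes the first near-return out rapidly.
[/code]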
 
  • #7
Thanks for the responses.

Is it fair to say, in regard to Andrew's comment,

Maximum entropy occurs when the distribution of speeds of the balls follows the Maxwell-Boltzmann speed distribution.

that one would need to sample the system over some adequate period of time to see whether it was at or near maximum entropy? With a Boltzmann distribution, one could have arbitrarily small decreases in entropy from time t to t+1, since for a system at equilibrium there would be some fluctuation proportional to the variance of the distribution. In the pool-table example, while a return to the original state would be an incredibly unlikely event, one would expect elastic collisions from time to time to leave one ball at rest, which would, I guess, be a trivial and temporary reduction in the dispersal of energy in the system.

So, in essence, the second law is a statistical concept, not a deterministic law? Like saying that the law of large numbers cannot be violated? One could only say that an apparent violation, a reduction of entropy, would have to fall outside of some significance level?
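
That statistical reading can be made concrete with a toy model, the Ehrenfest urn, a standard stand-in for gas molecules hopping between the two halves of a box (the numbers here are arbitrary):

[code=python]
import numpy as np
from math import lgamma

rng = np.random.default_rng(1)
N = 100                  # "molecules" in a box with two halves
left = N                 # start far from equilibrium: everything on the left

def S_over_k(n_left):
    # Boltzmann entropy S/k = ln W, with W = C(N, n_left) microstates
    return lgamma(N + 1) - lgamma(n_left + 1) - lgamma(N - n_left + 1)

trace = []
for _ in range(5000):
    # Ehrenfest urn step: a randomly chosen molecule hops to the other half
    if rng.random() < left / N:
        left -= 1
    else:
        left += 1
    trace.append(S_over_k(left))

# Entropy climbs quickly toward its maximum (at n_left = N/2) and then keeps
# making small transient dips below it; any single dip is an "apparent
# violation", and deep dips are exponentially rare.
print(f"max S/k = {max(trace):.2f}, final S/k = {trace[-1]:.2f}")
[/code]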
 
  • #8
lalbatros said:
Periodic trajectories are not forbidden, even for complex systems.
However, periodic trajectories are not "generic".

Periodic trajectories occur only by chance, when the various frequencies in the system are resonant:

[tex]f_1/f_2 = n_1/n_2[/tex], where [itex]n_1[/itex] and [itex]n_2[/itex] are integers

(the same condition, involving many frequencies, applies to larger systems)

When [itex]n_1[/itex] and [itex]n_2[/itex] are large numbers, the recurrence time becomes large as well.
And when the frequency ratio is irrational, the exact recurrence time is infinite,
even though one can come arbitrarily close to recurrence in a finite time.

So, in a sense, the second law is about irrational numbers!

Right, I should be more accurate and say that you can always get arbitrarily close to the initial state, rather than that you can always get back to the initial state (for example, Bertrand's theorem says that among spherically symmetric potentials, only the harmonic oscillator and gravity have closed orbits).

So, for example, if you have a box with the gas split half and half between its two sides, then a fraction of [tex]0.5 \pm 0.000000002[/tex] on one side is very close to the initial state, and Poincaré's theorem says it will happen, eventually.
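
To put a rough scale on that tolerance (an illustrative calculation; the particle number is an assumption): for an ideal gas of N molecules, the fraction found in one half of the box fluctuates about 1/2 with standard deviation

[tex]\sigma = \frac{1}{2\sqrt{N}}[/tex]

so for [itex]N \approx 10^{19}[/itex] (very roughly a cubic centimetre of gas at atmospheric pressure), [itex]\sigma \approx 1.6 \times 10^{-10}[/itex], and a deviation of [itex]2 \times 10^{-9}[/itex] is only about [itex]12\sigma[/itex]: fantastically rare on laboratory timescales, yet immeasurably more common than an exact recurrence.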

Yes, there's that theorem that any bounded classical system (I forget the exact conditions) is isomorphic to motion on an n-torus, where the torus coordinates oscillate at constant frequencies, and exact recurrence comes from the frequencies being rational multiples of each other. But that just leaves the question: how do you prove that n1 and n2 are always large numbers?

I don't believe ergodicity has ever been proven in general, so I think this is an open question, but I'll defer to people more knowledgeable about ergodicity than I am.
 
  • #9
Thanks for the comment, RedX.

If you could find the name of the theorem or some related material, I would be interested.
What I said is only a recollection from some old popular readings, as well as some old reading about ergodic magnetic field lines in tokamaks.

In tokamaks, the structure of the magnetic field lines is much simpler than in the general problem of statistical mechanics: the tori are two-dimensional. But it is nevertheless interesting, at least because of the possibility of ergodic or chaotic field lines.
 
  • #10
RedX said:
So, for example, if you have a box with the gas split half and half between its two sides, then a fraction of [tex]0.5 \pm 0.000000002[/tex] on one side is very close to the initial state, and Poincaré's theorem says it will happen, eventually.
It seems to me that a corollary of Poincaré's recurrence theorem is that the exact same state will never recur. Since, according to the PRT, a recurrence to within [itex]\Delta p_i[/itex] requires a finite (albeit very large) amount of time, the recurrence time [itex]T(\Delta p_i)[/itex] satisfies [itex]\lim_{\Delta p_i \to 0} T(\Delta p_i) = \infty[/itex].

AM
 
  • #11
You are right, Andrew.

But the PRT also means that you can observe an almost arbitrary entropy decrease after a finite time, since you can choose the initial state on the cycle so as to make the recurrence time shorter.
The smaller the decrease, the more likely it is, and inversely.
Fluctuations are transient decreases of entropy.
Thermal noise in resistors is a well-known example.
During a thermal voltage glitch, the entropy is momentarily decreased.
 
  • #12
RedX said:
The recurrence theorem says that the initial state will be revisited. [...] is it possible to find some type of classical system with an initial state for which the recurrence time is short?
Yes. Take a pendulum, or any other integrable, bounded system. Then all orbits are periodic and stable, and the recurrence time is just that of one period.

But the N-particle systems considered in statistical mechanics are already chaotic for N > 2. And thermodynamics is usually derived in the thermodynamic limit N → ∞. Then recurrence is impossible, and the entropy increases very quickly to very close to its equilibrium value. For large but finite N, the approximation is very good as long as your time interval isn't astronomically large.

Note that all models are only approximate; thermodynamics is no exception. Thus mathematical theorems that depend on extreme accuracies or extreme times are irrelevant for real physics.
 
  • #13
A. Neumaier said:
Yes. Take a pendulum, or any other integrable, bounded system. Then all orbits are periodic and stable, and the recurrence time is just that of one period.

But the N-particle systems considered in statistical mechanics are already chaotic for N > 2. And thermodynamics is usually derived in the thermodynamic limit N → ∞. Then recurrence is impossible, and the entropy increases very quickly to very close to its equilibrium value. For large but finite N, the approximation is very good as long as your time interval isn't astronomically large.

Note that all models are only approximate; thermodynamics is no exception. Thus mathematical theorems that depend on extreme accuracies or extreme times are irrelevant for real physics.

Sounds right. So even when the N particles are very weakly interacting, you get chaos? Often you treat N particles as completely independent, with a separable Hamiltonian. Obviously they have to interact somehow, or you won't get any distributions, but I always thought the justification was that there is an interaction term in the Hamiltonian whose coupling is so small that letting it vanish won't change the distributions you get.

Anyways, I found an easy article on ergodicity breaking:

http://www.evolant.org/assets/0/15/19/65/83/563/0FF95DD6-A4B2-4633-8268-36518E62E218.pdf [Broken]
 
  • #14
RedX said:
So even when the N particles are very weakly interacting, you get chaos? Often you treat N particles as completely independent, with a separable Hamiltonian. Obviously they have to interact somehow,
They interact (as in billiards) with the boundary of the box that keeps the gas together. This is enough to make the system chaotic. Without the box the atoms would move away in different directions (as in billiards), and there would be nothing left to consider. The interaction between the particles themselves can be exactly zero (which gives the ideal gas).
 
  • #15
A. Neumaier said:
They interact (as in billiards) with the boundary of the box that keeps the gas together. This is enough to make the system chaotic. Without the box the atoms would move away in different directions (as in billiards), and there would be nothing left to consider. The interaction between the particles themselves can be exactly zero (which gives the ideal gas).

Most books make the assumption that collisions with the wall don't change the speed of the particles. So in order for all the molecules of the gas to have the same speed (the equilibrium condition), the gas molecules have to collide with each other, so that the super-fast molecules give up some of their speed to the super-slow molecules, perhaps under a 6-12 potential. I've never calculated how often they collide, but one way might be to take the cross-sectional area of the atom, roughly the radius squared (assuming square atoms), multiply by a length to get a volume, and set this equal to the volume per molecule. Once you have this length, you can divide it by the velocity to get the time between collisions, and take the reciprocal to get the frequency of collisions.
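
For what it's worth, here is a standard kinetic-theory version of that estimate, with a circular cross-section in place of the square atoms (the numbers, for nitrogen at room temperature, are illustrative assumptions):

[code=python]
import numpy as np

k_B = 1.380649e-23        # J/K, Boltzmann constant
T = 300.0                 # K, assumed temperature
P = 1.0e5                 # Pa, assumed pressure
d = 3.7e-10               # m, rough diameter of an N2 molecule (assumed)
m = 4.65e-26              # kg, mass of an N2 molecule

n = P / (k_B * T)                       # number density from the ideal-gas law
sigma = np.pi * d**2                    # collision cross-section
mfp = 1.0 / (np.sqrt(2.0) * n * sigma)  # mean free path (sqrt(2): relative speeds)
v_mean = np.sqrt(8.0 * k_B * T / (np.pi * m))  # Maxwell-Boltzmann mean speed
freq = v_mean / mfp                     # collision frequency per molecule

print(f"number density      n   ~ {n:.2e} /m^3")
print(f"mean free path      mfp ~ {mfp:.2e} m")
print(f"collision frequency     ~ {freq:.2e} /s")
[/code]

With these inputs the mean free path comes out near 7 x 10^-8 m, and each molecule suffers on the order of 10^9 to 10^10 collisions per second, so such a gas equilibrates essentially instantly on human timescales.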

You're absolutely right that if the volume went to infinity, there would be nothing to consider. Which gives a paradox: in thermodynamics the volume can go to infinity, but the speed should still be the same for every particle, scaling as the square root of the temperature. But if the volume is infinite, then there is no interaction, so I don't understand how the speeds can all become equal: those that start super-fast stay that way, and those that are super-slow stay that way.
 
  • #16
I am not sure why you think the molecules should all have the same speed. They have a broad range of speeds, in accordance with the Maxwell-Boltzmann speed distribution for that temperature.

AM
 
  • #17
Andrew Mason said:
I am not sure why you think the molecules should all have the same speed. They have a broad range of speeds, in accordance with the Maxwell-Boltzmann speed distribution for that temperature.

AM

Yeah, you're right. It's a bit counter-intuitive, since you'd expect that whenever molecules collide, the slower one gets faster and the faster one gets slower, so that eventually everything ends up at the same speed. But that's not actually true of collisions in general.
 
  • #18
RedX said:
Most books make the assumption that collisions with the wall don't change the speed of the particles. So in order for all the molecules of the gas to have the same speed (the equilibrium condition), the gas molecules have to collide with each other, so that the super-fast molecules give up some of their speed to the super-slow molecules, perhaps under a 6-12 potential.
But particles interacting with a nonzero potential are no longer an ideal gas. Thermodynamics is, however, valid even for the latter, and that is because no wall is perfectly reflecting. (It doesn't really matter where the entropy is produced, through collisions with the wall or through collisions between the particles; in both cases one reaches equilibrium.)
RedX said:
You're absolutely right that if the volume went to infinity, there would be nothing to consider. Which gives a paradox: in thermodynamics the volume can go to infinity,
The thermodynamic limit is taken at the very end, when equilibrium has already been attained. In practice the volume is finite, but large enough that the error made by using the much simpler limiting formulas is negligible.
 

1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. In statistical terms, it quantifies the number of possible microstates a system can occupy while presenting the same macrostate: the more such microstates, the higher the entropy and the more disordered the system.

2. How is entropy related to the second law of thermodynamics?

The second law of thermodynamics states that the total entropy of an isolated system never decreases over time. This means that, left to itself, the disorder in such a system naturally increases until equilibrium is reached. Entropy is the quantity used to measure this increase in disorder.

3. Can you explain P's recurrence theorem?

P's recurrence theorem, more commonly called the Poincaré recurrence theorem, states that a bounded system whose dynamics conserve phase-space volume will, after a sufficiently long (typically enormous) time, return arbitrarily close to its initial state. This means that even systems that appear chaotic or random retain a degree of long-term regularity.

4. Why is understanding entropy and P's recurrence theorem important?

Understanding entropy and the Poincaré recurrence theorem matters because they help us predict the long-term behavior of complex systems, with practical applications in fields such as physics, chemistry, and engineering.

5. How can we calculate entropy?

Entropy can be calculated using the formula S = k ln(W), where S is the entropy, k is the Boltzmann constant, and W is the number of possible microstates. In real-world systems, however, entropy is usually estimated rather than computed exactly, because W is far too large to enumerate.
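
As a toy illustration of the formula (a sketch using a made-up two-state system, not a realistic gas):

[code=python]
import math

k_B = 1.380649e-23   # J/K, Boltzmann constant

# Toy system: N two-state "particles" (think coins). W counts the microstates
# compatible with a macrostate specified by the number in the "heads" state.
N = 100
for heads in (0, 25, 50):
    W = math.comb(N, heads)        # number of microstates for this macrostate
    S = k_B * math.log(W)          # Boltzmann entropy S = k ln W
    print(f"{heads:3d} heads: W = {W:.3e}, S = {S:.3e} J/K")
[/code]

The all-heads macrostate has a single microstate and hence zero entropy, while the 50/50 macrostate has about 10^29 microstates, the maximum; this is the quantitative sense in which the most "disordered" macrostate has the highest entropy.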
