Help me understand entropy & Poincaré's recurrence theorem

AI Thread Summary
The discussion revolves around the implications of the Poincaré Recurrence Theorem on entropy, questioning whether the theorem presents a paradox in physics. It clarifies that while the theorem suggests a system will eventually return to its initial state, the time required for this to occur far exceeds the universe's lifespan, making such recurrences practically irrelevant. The conversation also highlights that maximum entropy is achieved not when all particles have the same momentum, but when their speeds follow a Maxwell-Boltzmann distribution. Additionally, it emphasizes that fluctuations can lead to temporary decreases in entropy, reinforcing that the second law of thermodynamics is fundamentally statistical rather than deterministic. Overall, the complexities of chaotic systems and their behavior under the recurrence theorem are central to the discussion.
BWV
Reading the very interesting discussion in the long thread on the 2nd law, I had a couple of much more basic questions regarding entropy and the Poincaré Recurrence Theorem.

A) Is the reduction of entropy implied by the theorem a real, unresolved paradox in physics?

B) If you take a frictionless pool table and a system that begins with a break, the recurrence theorem would say that eventually the system will return to the point where the balls are racked and the cue ball is traveling toward them with its original momentum. How would you quantify the reduction in entropy of this state, if any?

Is maximum entropy achieved when all the balls on the table have the same momentum / kinetic energy?

This seems odd, since at random there would be elastic collisions where all the momentum of one ball is transferred to another, leaving the first ball at rest. Having this happen to every ball but the cue ball, with the balls re-racked, seems unlikely, though.
 
BWV said:
Reading the very interesting discussion in the long thread on the 2nd law, I had a couple of much more basic questions regarding entropy and the Poincaré Recurrence Theorem.

A) Is the reduction of entropy implied by the theorem a real, unresolved paradox in physics?
No. For a system of even a small number of objects, the time required for the system to return to its original state greatly exceeds the lifetime of the universe. We don't have unlimited amounts of time any more than we have unlimited amounts of space or energy etc.
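
To get a feel for why, here is a rough back-of-the-envelope sketch (my own illustration; the collision time and the two-halves model are assumptions, not from this thread): treat recurrence as waiting for all N molecules of a gas to show up in the left half of a box at once.

```python
# Rough recurrence-time estimate (toy numbers): how long until all N gas
# molecules happen to be in the left half of a box at once? Checks happen
# about once per assumed collision time.
t_check = 1e-10             # assumed seconds between independent configurations
age_of_universe = 4.35e17   # seconds, roughly 13.8 billion years

for N in (10, 100, 1000):
    t_rec = t_check * 2.0 ** N   # expected wait for a probability 2**-N event
    print(f"N = {N:4d}: ~{t_rec:.2e} s ({t_rec / age_of_universe:.2e} universe ages)")
```

Even N = 100 already dwarfs the age of the universe, and a macroscopic gas has N on the order of 10^23.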

B) If you take a frictionless pool table and a system that begins with a break, the recurrence theorem would say that eventually the system will return to the point where the balls are racked and the cue ball is traveling toward them with its original momentum. How would you quantify the reduction in entropy of this state, if any?
The concept of entropy is associated with the dispersion of energy. Energy naturally disperses. The initial state with all the energy concentrated in the cue ball is a state of low entropy. In the post-break state that energy is shared among all the balls, a state of higher entropy.

Is maximum entropy achieved when all the balls on the table have the same momentum / kinetic energy?
No. The maximum entropy occurs when the distribution of speeds of the balls follows a Maxwell-Boltzmann speed distribution. That will occur fairly quickly.
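
To see this numerically, here is a minimal sketch (my own construction; the ball count, collision count, and the random-pairing shortcut that ignores positions and geometry are all assumptions): equal-mass balls undergo random pairwise elastic collisions in 2D, starting with all the kinetic energy in one "cue ball". On a 2D table the Maxwell-Boltzmann speed distribution takes the Rayleigh form.

```python
import numpy as np

rng = np.random.default_rng(0)
N, steps = 1000, 200_000
v = np.zeros((N, 2))        # 2D velocities; all kinetic energy starts in ball 0
v[0] = [10.0, 0.0]

for _ in range(steps):
    i, j = rng.choice(N, size=2, replace=False)
    theta = rng.uniform(0.0, 2.0 * np.pi)
    n = np.array([np.cos(theta), np.sin(theta)])   # random line of centres
    vi_n, vj_n = v[i] @ n, v[j] @ n
    # Equal-mass elastic collision: components along n swap, conserving
    # total momentum and total kinetic energy.
    v[i] += (vj_n - vi_n) * n
    v[j] += (vi_n - vj_n) * n

speeds = np.linalg.norm(v, axis=1)
# In 2D, Maxwell-Boltzmann reduces to a Rayleigh speed distribution with
# mean speed sigma * sqrt(pi/2), where sigma^2 = <speed^2> / 2.
sigma = np.sqrt(np.mean(speeds**2) / 2.0)
print("simulated mean speed:", speeds.mean())
print("Rayleigh prediction :", sigma * np.sqrt(np.pi / 2.0))
```

After enough collisions the simulated mean speed matches the Rayleigh prediction to within a few percent; histogramming `speeds` against the Rayleigh density makes the same point graphically.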

AM
 
BWV said:
B) If you take a frictionless pool table and a system that begins with a break, the recurrence theorem would say that eventually the system will return to the point where the balls are racked and the cue ball is traveling toward them with its original momentum. How would you quantify the reduction in entropy of this state, if any?

The recurrence theorem says that the initial state will be revisited. The question is then: how long will it take? I don't know if there is a satisfactory answer to this. Obviously very long, since thermodynamics works. But in a way, just saying it'll take a very long time is unsatisfactory - is it possible to find some type of classical system with an initial state for which recurrence will be short?
 
Actually, Poincaré's theorem says that it will return close to the state. The trajectories cannot intersect, since you're using deterministic equations, so it never actually goes back to the exact state.
 
To your question, RedX: it all depends on what you mean by "short" and "near".

The Wikipedia article on "recurrence plots" gives a sketch of the quantitative analysis and applications related to your question.
 
Pythagorean said:
Actually, Poincaré's theorem says that it will return close to the state. The trajectories cannot intersect, since you're using deterministic equations, so it never actually goes back to the exact state.

Periodic trajectories are not forbidden, even for complex systems.
However, periodic trajectories are not "generic".

Periodic trajectories occur only by chance, when the various frequencies in the system are resonant:

f_1/f_2 = n_1/n_2, where n_1 and n_2 are integers
(and likewise when many frequencies are involved)
When n_1 and n_2 are large numbers, the recurrence time becomes large as well.
And when the frequency ratio is irrational, the -exact- recurrence time is infinite,
even though one can come arbitrarily close to recurrence in a finite time.

So, the second principle is about irrational numbers!
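
A quick numerical illustration of that last point (a toy construction of mine, not from the post above): take two phases advancing at frequencies f_1 = 1 and f_2 = sqrt(2). Because the ratio is irrational there is no exact joint recurrence, but near-recurrences show up in finite time.

```python
import numpy as np

f2 = np.sqrt(2.0)     # second frequency; f1 = 1, so phase 1 recurs at integer t
eps = 1e-3            # tolerance defining a "near" recurrence

t = np.arange(1, 1_000_000, dtype=float)  # times when phase 1 is exactly back
frac = (f2 * t) % 1.0                     # phase 2 (mod 1) at those times
dist = np.minimum(frac, 1.0 - frac)       # distance of phase 2 from zero
hits = np.flatnonzero(dist < eps)
if hits.size:
    print(f"first near-recurrence: t = {int(t[hits[0]])}, phase error = {dist[hits[0]]:.2e}")
else:
    print("no near-recurrence within this window")
```

Tightening eps pushes the first near-recurrence out rapidly; the good hits come from the continued-fraction approximations to sqrt(2). This two-frequency case is the smallest version of why recurrence times explode with many degrees of freedom.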
 
Thanks for the responses.

Is it fair to say, with regard to Andrew's comment,

The maximum entropy occurs when the distribution of speeds of the balls follows a Maxwell-Boltzmann speed distribution.

that one would need to sample the system over some adequate period of time to see whether it was at or near maximum entropy? With a Boltzmann distribution, one could have arbitrarily small decreases in entropy from time t to t+1, since for a system at equilibrium there would be some fluctuation proportional to the variance of the distribution. In the pool table example, while a return to the original state would be an incredibly unlikely event, one would expect elastic collisions from time to time to leave one ball at rest - which would, I guess, be a trivial and temporary reduction in the dispersal of energy in the system.

So in essence, the second law is a statistical concept, not a deterministic law? Like saying that the law of large numbers cannot be violated? One could only say that an apparent violation, i.e. a reduction of entropy, would have to fall outside of some significance level?
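
That statistical reading can be made concrete with the classic Ehrenfest urn model; the sketch below is my own (model choice and parameters assumed, not from the thread). N particles hop one at a time between two halves of a box, and the Boltzmann entropy S = ln W, with W = C(N, n_left), fluctuates down as well as up.

```python
import numpy as np
from math import lgamma

rng = np.random.default_rng(1)
N, steps = 100, 100_000
n_left = N                     # low-entropy start: all particles on the left

def entropy(n):
    # ln of the number of microstates with n particles on the left: ln C(N, n)
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

S = np.empty(steps)
for k in range(steps):
    # a uniformly random particle hops to the other half of the box
    n_left += -1 if rng.random() < n_left / N else 1
    S[k] = entropy(n_left)

print("fraction of steps where S decreased:", np.mean(np.diff(S) < 0))
print("final n_left (hovers near N/2):", n_left)
```

Near equilibrium roughly half of the individual steps lower S, yet the long-run trend from the all-left start is strongly upward: statistical, not deterministic, exactly as asked.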
 
lalbatros said:
Periodic trajectories are not forbidden, even for complex systems.
However, periodic trajectories are not "generic".

Periodic trajectories occur only by chance, when the various frequencies in the system are resonant:

f_1/f_2 = n_1/n_2, where n_1 and n_2 are integers
(and likewise when many frequencies are involved)
When n_1 and n_2 are large numbers, the recurrence time becomes large as well.
And when the frequency ratio is irrational, the -exact- recurrence time is infinite,
even though one can come arbitrarily close to recurrence in a finite time.

So, the second principle is about irrational numbers!

Right, I should be more accurate and say that you can always get arbitrarily close to the initial state, rather than that you can always get back to the initial state (for example, Bertrand's theorem says that for spherically symmetric potentials, only the harmonic oscillator and gravity give closed orbits).

So, for example, if you have a box half full of gas, then 0.5 \pm 0.000000002 is very close to the initial state, and Poincaré's theorem says it'll happen, eventually.

Yes, there's that theorem that any bounded classical system (I forget the exact conditions) is isomorphic to an n-torus, with the coordinates on the torus oscillating at constant frequencies, and exact recurrence comes from the frequencies being rational multiples of each other. But that just leaves the question: how do you prove that n_1 and n_2 are always large numbers?

I don't believe ergodicity has ever been proven, so I think this is an open question, but I'll defer to people more knowledgeable than me about ergodicity.
 
Thanks for the comment, RedX.

If you could find the name of the theorem or some related material, I would be interested.
What I said is only a recollection from some old popular readings, as well as some old readings about ergodic magnetic field lines in tokamaks.

In tokamaks, the structure of magnetic field lines is much simpler than the general problem of statistical mechanics. The tori are two-dimensional. But it is nevertheless interesting, at least because of the possibility of ergodic / chaotic field lines.
 
  • #10
RedX said:
So, for example, if you have a box half full of gas, then 0.5 \pm 0.000000002 is very close to the initial state, and Poincaré's theorem says it'll happen, eventually.
It seems to me that a corollary of Poincaré's recurrence theorem is that the same state will never recur: since, according to the PRT, a recurrence to within \Delta p_i requires a finite (albeit very large) amount of time, the recurrence time diverges in the limit \Delta p_i \to 0.

AM
 
  • #11
You are right, Andrew,

But the PRT also means that you can observe an almost arbitrary entropy decrease after a finite time, since you can choose the initial state on the cycle to make the recurrence time shorter.
The smaller the decrease, the more likely it is, and conversely.
Fluctuations are transient decreases of entropy.
Thermal noise in resistors is a well-known example.
During a thermal voltage glitch, the entropy is transiently decreased.
 
  • #12
RedX said:
The recurrence theorem says that the initial state will be revisited. [...] is it possible to find some type of classical system with an initial state for which recurrence will be short?
Yes. Take a pendulum, or any other integrable, bounded system. Then all orbits are periodic and stable, and the recurrence time is just that of one period.

But the N-particle systems considered in statistical mechanics are already chaotic for N > 2. And thermodynamics is usually derived in the thermodynamic limit N \to \infty. Then recurrence is impossible, and entropy increases very quickly to very close to its equilibrium value. For large but finite N, the approximation is very good as long as your time interval isn't astronomically large.

Note that all models are approximate only; thermodynamics is no exception. Thus mathematical theorems that depend on extreme accuracies or extreme times are irrelevant for real physics.
 
  • #13
A. Neumaier said:
Yes. Take a pendulum, or any other integrable, bounded system. Then all orbits are periodic and stable, and the recurrence time is just that of one period.

But the N-particle systems considered in statistical mechanics are already chaotic for N > 2. And thermodynamics is usually derived in the thermodynamic limit N \to \infty. Then recurrence is impossible, and entropy increases very quickly to very close to its equilibrium value. For large but finite N, the approximation is very good as long as your time interval isn't astronomically large.

Note that all models are approximate only; thermodynamics is no exception. Thus mathematical theorems that depend on extreme accuracies or extreme times are irrelevant for real physics.

Sounds right. So even when the N particles are very weakly interacting, you get chaos? Because oftentimes you treat N particles as completely independent, with a separable Hamiltonian. Obviously they have to interact somehow, or you won't get any distributions, but I always thought the justification was that there is an interaction term in the Hamiltonian whose coupling is so small that letting it vanish won't change the distributions you get.

Anyway, I found an accessible article on ergodicity breaking:

http://www.evolant.org/assets/0/15/19/65/83/563/0FF95DD6-A4B2-4633-8268-36518E62E218.pdf
 
  • #14
RedX said:
So even when the N particles are very weakly interacting, you get chaos? Because oftentimes you treat N particles as completely independent, with a separable Hamiltonian. Obviously they have to interact somehow,
They interact (as in billiards) with the boundary of the box that keeps the gas together. This is enough to make the system chaotic (as in billiards). Without the box, the atoms would move away in different directions (as in billiards), and nothing would be left to consider. The interaction between the particles themselves can be exactly zero (which gives the ideal gas).
 
  • #15
A. Neumaier said:
They interact (as in billiards) with the boundary of the box that keeps the gas together. This is enough to make the system chaotic (as in billiards). Without the box, the atoms would move away in different directions (as in billiards), and nothing would be left to consider. The interaction between the particles themselves can be exactly zero (which gives the ideal gas).

Most books make the assumption that collisions with the wall don't change the speed of the particles. So in order to have all the molecules of the gas end up with the same speed (the equilibrium condition), the gas molecules have to collide with each other, so that the super-fast molecules give up some of their speed to the super-slow molecules, perhaps under a 6-12 potential. I've never calculated how often they collide, but one way might be to take the cross-sectional area of the atom, which would be the radius squared (assuming square atoms), multiply by a length to get a volume, and set this equal to the volume per molecule. Once you have this length, you can divide by the velocity to get the time between collisions, and take the reciprocal to get the collision frequency.
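
For what it's worth, here is that estimate carried through with the standard kinetic-theory formula (the sqrt(2) factor and the N2 numbers near room conditions are my assumptions, not from the post):

```python
import numpy as np

kB = 1.380649e-23         # Boltzmann constant, J/K
T, p = 293.0, 101325.0    # assumed temperature (K) and pressure (Pa)
d = 3.7e-10               # approximate effective diameter of an N2 molecule, m
m = 4.65e-26              # mass of an N2 molecule, kg

n = p / (kB * T)                              # number density, 1/m^3
sigma = np.pi * d**2                          # collision cross-section, m^2
mfp = 1.0 / (np.sqrt(2.0) * n * sigma)        # mean free path, m
v_mean = np.sqrt(8.0 * kB * T / (np.pi * m))  # Maxwell-Boltzmann mean speed, m/s
print(f"mean free path      ~ {mfp:.1e} m")                    # about 7e-8 m
print(f"collision frequency ~ {v_mean / mfp:.1e} per second")  # about 7e9
```

So at ordinary densities each molecule collides on the order of 10^9 times per second, which is why relaxation to the Maxwell-Boltzmann distribution is so fast.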

You're absolutely right that if the volume went to infinity, then there would be nothing to consider. Which gives a paradox, since in thermodynamics the volume can go to infinity, but the speed should still be the same for every particle, proportional to the square root of the temperature. But if the volume is infinite, then there is no interaction, so I don't understand how the speeds can all be equal: those that start with super-high speeds stay that way, and those that are super-slow stay that way.
 
  • #16
I am not sure why you think the molecules should all have the same speed. They have a broad range of speeds, in accordance with the Maxwell-Boltzmann speed distribution for that temperature.

AM
 
  • #17
Andrew Mason said:
I am not sure why you think the molecules should all have the same speed. They have a broad range of speeds, in accordance with the Maxwell-Boltzmann speed distribution for that temperature.

AM

Yeah, you're right. It's a bit counter-intuitive, as you'd expect that whenever molecules collide, the slower one gets faster and the faster one gets slower, so that eventually everything ends up at the same speed. But actually that's not necessarily true of collisions.
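
A concrete counterexample may help (numbers constructed here, not from the thread): in an equal-mass 2D elastic collision the velocity components along the line of centres swap, and a glancing geometry can make the fast ball faster and the slow ball slower.

```python
import numpy as np

n = np.array([1.0, 0.0])        # line of centres at the moment of impact
v_fast = np.array([1.0, 4.0])   # speed ~4.12
v_slow = np.array([2.0, 0.0])   # speed  2.00

# Equal-mass elastic collision: the velocity components along n swap.
vf_n, vs_n = v_fast @ n, v_slow @ n
v_fast_out = v_fast + (vs_n - vf_n) * n   # -> (2, 4), speed ~4.47: faster!
v_slow_out = v_slow + (vf_n - vs_n) * n   # -> (1, 0), speed  1.00: slower!
print(np.linalg.norm(v_fast_out), np.linalg.norm(v_slow_out))
```

Momentum and kinetic energy are both conserved (17 + 4 = 21 = 20 + 1), yet the speed gap widens; it is only the distribution over many collisions, not each individual collision, that relaxes toward Maxwell-Boltzmann.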
 
  • #18
RedX said:
Most books make the assumption that collisions with the wall don't change the speed of the particles. So in order to have all the molecules of the gas end up with the same speed (the equilibrium condition), the gas molecules have to collide with each other, so that the super-fast molecules give up some of their speed to the super-slow molecules, perhaps under a 6-12 potential.
But particles interacting through a nonzero potential are no longer an ideal gas. Thermodynamics is, however, valid for the latter. And that is because no wall is perfectly reflecting. (It doesn't really matter where entropy is produced, whether through collisions with the wall or through collisions with each other; in both cases, one reaches equilibrium.)
RedX said:
You're absolutely right that if the volume went to infinity, then there would be nothing to consider. Which gives a paradox, since in thermodynamics the volume can go to infinity,
The thermodynamic limit is taken at the very end, when equilibrium is already attained. In practice, the volume is finite but large enough that the error made by using the much simpler limiting formula is negligible.
 