About Entropy and Poincaré Recurrence

In summary, by the Poincaré recurrence theorem the system will always eventually return arbitrarily close to its initial state, e.g. one where all the gas is in the same half of the container.
  • #1
hilbert2
I was reading about entropy, the Poincaré recurrence theorem, and the arrow of time yesterday, and I'd like to share some ideas and questions here...

Let's think about a system that is a classical ideal gas of point particles, confined in a cubic box. Suppose that at time ##t=0## all the particles are in the same half of the box. The gas will quickly spread to fill the whole box uniformly, consistent with the second law of thermodynamics. However, from the Poincaré recurrence theorem we know that after some huge time interval, the system will again temporarily be in approximately the initial situation, with all the gas in the same half of the container.
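A quick back-of-the-envelope check of how unlikely such a fluctuation is (my sketch, assuming independent, uniformly distributed particle positions): the probability that all ##N## particles sit in the same half of the box at a given instant is ##2 \cdot (1/2)^N##.

Code:
(* Probability that all n independent, uniformly distributed       *)
(* particles are in the same half of the box: 2*(1/2)^n.           *)
Table[{n, N[2*(1/2)^n]}, {n, {10, 100, 1000}}]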

Or let's take an even simpler example: a set of uncoupled harmonic oscillators, each with one degree of freedom. The time evolution of the oscillators can be described by a set of sine functions of ##t##, say:

##\sin (t), \sin (\alpha t), \sin (\alpha^2 t), \sin (\alpha^3 t), ... , \sin (\alpha^n t)##

where ##\alpha## is an irrational number larger than ##1##, chosen to prevent the phase space trajectory of the system from being periodic. To keep this a purely mathematical toy, I'm not even giving the functions a dimensional amplitude.

At time ##t=0##, all the functions have the value zero. We can intuitively call this initial state a "state of low entropy", even though there really isn't such a thing as the entropy of a single microstate. When ##t## increases, the sine functions quickly take on all kinds of values between ##-1## and ##1##, and you wouldn't immediately recognize any correlation between the values unless you knew the form of the functions above. The system has now gone to a "state of high entropy".

After some time, the length of which depends on ##n## (the number of oscillators), the system will be almost back in the initial state, because of Poincaré recurrence.

Now, I have two questions:

1. How would one define an entropy-like variable for a single microstate (i.e. the set of values of those sine functions at some ##t##), one with the intuitively right properties, like having a small value if all the values are equal and a high value if they appear pseudorandom?

2. Is it possible to assign the oscillators a set of phase velocities, initial phases, and amplitudes in such a way that an accidental state of "low entropy" never occurs, at any value of ##t##? I.e. the values of the sine functions would be an apparently pseudorandom mix of values between ##-1## and ##1## at every moment of time. Or is the phase space trajectory of such a system a "space filling curve" that eventually explores all possible states with the same total energy, even the low entropy ones?

This is just a thought experiment where I'm trying to construct a system in which nothing ever happens that would give the arrow of time a preferred direction...
 
  • #2
hilbert2 said:
1. How would one define an entropy-like variable for a single microstate (i.e. the set of values of those sine functions at some ##t##), one with the intuitively right properties, like having a small value if all the values are equal and a high value if they appear pseudorandom?
For me, this is like asking whether you could devise a test to show that the sequence 2, 3, 5 is random or not.

hilbert2 said:
2. Is it possible to assign the oscillators a set of phase velocities, initial phases, and amplitudes in such a way that an accidental state of "low entropy" never occurs, at any value of ##t##? I.e. the values of the sine functions would be an apparently pseudorandom mix of values between ##-1## and ##1## at every moment of time.
The only way to do that is to have the frequencies of the different oscillators differ by irrational factors. Otherwise, you are guaranteed that the system will return to the same state after a while.
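A quick symbolic check of the commensurable case (a sketch with an assumed frequency ratio of ##3/2##): both oscillators share the common period ##4\pi##, so the joint state recurs exactly.

Code:
(* With a rational frequency ratio (here 3/2), the joint motion is *)
(* exactly periodic: 4 Pi is a common period of both oscillators.  *)
Simplify[{Sin[t + 4 Pi] == Sin[t], Sin[3/2 (t + 4 Pi)] == Sin[3/2 t]}]
(* -> {True, True} *)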
hilbert2 said:
This is just a thought experiment where I'm trying to construct a system in which nothing ever happens that would give the arrow of time a preferred direction...
A small-enough isolated system doesn't have an arrow of time. For instance, if you have only two particles, there is no obvious difference between time moving forward or backward.
 
  • #3
^ Thanks for the reply. Maybe something heuristically similar to entropy would be the so-called "diversity index" of a set of numbers; I don't know whether that would require discretizing time and the degrees of freedom to get a finite set of possible microstates. Then I could calculate that index as a function of ##t## and see whether, by choosing the phase velocities and initial phases appropriately, it can be constrained to never fall below some limit (see the sketch below). The number of oscillators doesn't have to be small, by the way.
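A minimal sketch of one such index, assuming we bin the oscillator positions and use the Shannon form ##-\sum_i p_i \ln p_i##; the helper function diversity and the default of 20 bins are illustrative choices, not anything standard:

Code:
(* Shannon diversity index of a list of positions in [-1, 1]:      *)
(* bin the values, estimate the bin probabilities, and compute     *)
(* -Sum[p Log[p]]. Low when the values cluster, high when spread.  *)
diversity[values_, nBins_: 20] := Module[{probs},
  probs = Select[N[BinCounts[values, {-1, 1, 2/nBins}]/Length[values]], # > 0 &];
  -Total[probs Log[probs]]
]

(* Example: tightly clustered values vs. a pseudorandom spread.    *)
{diversity[ConstantArray[0.5, 1000]], diversity[RandomReal[{-1, 1}, 1000]]}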
 
  • #4
The Poincaré cycles take too long. You cannot assume the system remains Hamiltonian for that long.
 
  • #5
^ My system is not meant to describe anything from the real world; I'm assuming an ideal, completely isolated system that obeys Newtonian mechanics. I'm not sure what you mean by the system remaining Hamiltonian. Does this have something to do with external perturbations?
 
  • #6
I think that for a Hamiltonian system one must have ##dS=0##, or something like that.
 
  • #7
I wrote some Mathematica code to play with these oscillator systems... First I define a set of oscillators as in the OP, with ##\alpha = 1 + \frac{e}{1000}## and exponents ##n## running from ##0## to ##10000##:

Code:
(* Positions of the 10001 oscillators Sin[alpha^n t], n = 0, ..., 10000, *)
(* with alpha = 1 + E/1000, evaluated at time t.                         *)
fun[t_] = Table[Sin[t*(1 + E/1000)^n], {n, 0, 10000}]

Next I draw a histogram of the oscillator positions at ##t=10##, a time when the system has probably "reached equilibrium":

Code:
(* Histogram of the oscillator positions at t = 10, with 200 bins. *)
Histogram[fun[10], 200]

[Histogram of the oscillator positions at ##t=10##; the values pile up near ##x=\pm 1##.]


The oscillators are most likely to be found near the points ##x=-1## and ##x=1##, because their speed is lowest near those extrema; in fact, the position of a single oscillator with a uniformly random phase follows the arcsine density ##p(x) = \frac{1}{\pi\sqrt{1-x^2}}##. One way to quantify the closeness of a microstate to equilibrium would be to compare the distribution of the oscillator locations to this ideal distribution (this would not take the momentum distribution into account, though).
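One quick way to make that comparison visually (a sketch, assuming the definition of fun above): overlay the arcsine density on the normalized histogram.

Code:
(* Overlay the arcsine density 1/(Pi Sqrt[1 - x^2]), which is the  *)
(* distribution of Sin[phase] for a uniformly random phase, on the *)
(* normalized ("PDF") histogram of the positions at t = 10.        *)
Show[
 Histogram[fun[10], 200, "PDF"],
 Plot[1/(Pi Sqrt[1 - x^2]), {x, -0.999, 0.999}, PlotStyle -> Red]
]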

There is a trivial example where the distribution of the oscillators stays like this at all values of ##t##: the situation where all the oscillators have the same angular velocity and their initial phases are evenly distributed on the interval ##[0,2\pi)##:

##\{f_n(t) = \sin(t+\frac{2n\pi}{1000}) \mid n=0,1,2,...,999 \}##.
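A quick check of this claim (a sketch; the histogram comes out the same, up to binning, for any ##t## you plug in, because the phases remain ##1000## equally spaced points on the circle at every instant):

Code:
(* With equal angular velocities and evenly spaced initial phases, *)
(* the phases stay equally spaced on the circle at every t, so the *)
(* empirical distribution of positions is the same at all times.   *)
equal[t_] := Table[Sin[t + 2 n Pi/1000], {n, 0, 999}]
Histogram[equal[5.0], 100]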

It's hard to say whether this trivial example is the only situation in which accidental, second-law-violating fluctuations from the equilibrium distribution never occur. I think this problem has something to do with number theory, because the special case of periodic dynamics occurs when the angular velocities are commensurable.
 
  • #8
I've been thinking about proving by mathematical induction that if a set of ##N## harmonic oscillators with amplitude ##1## have angular velocities that are rationally independent (linearly independent over the rationals, which is stronger than pairwise incommensurability: for example ##1##, ##\sqrt{2}## and ##1+\sqrt{2}## are pairwise incommensurable but rationally dependent), then the trajectory of the system is dense in ##[-1,1]^N## (the Cartesian product of ##N## intervals ##[-1,1]##), i.e. a "space filling curve" in the loose sense of the OP, and the dynamics of the system are almost periodic (periodic to any level of accuracy, if you let the length of a period be large enough).

It is easy to see that this holds true in the case ##N=1##, because the range of function ##\sin(\alpha_{1} t)## is ##[-1,1]## and it is exactly periodic with period ##\frac{2\pi}{\alpha_{1}}##.

The case ##N=2## is also easy. We choose some point ##(x_{1}, x_{2}) \in [-1,1]^2## and a real number ##\epsilon > 0##, and we want to find a time ##t## when the distance between the points ##(x_{1}, x_{2})## and ##(\sin(\alpha_{1} t), \sin(\alpha_{2} t))## is smaller than ##\epsilon##. It is easy to find a time ##t## for which ##\sin(\alpha_1 t) = x_{1}##. The equality also holds at the times ##t+\Delta t, t+2\Delta t, t+3\Delta t, \dots##, where ##\Delta t = \frac{2\pi}{\alpha_{1}}##. Because ##\Delta t## is incommensurable with the period of the second oscillator, you can find a number ##n## such that ##|\sin(\alpha_2 (t+n\Delta t))-x_2|<\epsilon##. Also, the dynamics of this two-oscillator system are almost periodic, because you can find a number ##n## such that ##n\Delta t## is arbitrarily close to some integer multiple of ##\frac{2\pi}{\alpha_{2}}##.
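A numerical illustration of this argument (a sketch with assumed values ##\alpha_1 = 1##, ##\alpha_2 = \sqrt{2}##, target point ##(0.3, -0.7)## and tolerance ##\epsilon = 0.01##): step forward in units of the first oscillator's period until the second oscillator is also within ##\epsilon## of its target.

Code:
(* Pick t with Sin[a1 t] = x1, then advance in steps of the first  *)
(* oscillator's period dt = 2 Pi (so the first coordinate is fixed) *)
(* until the second oscillator is within eps of the target x2.      *)
Module[{x1 = 0.3, x2 = -0.7, eps = 0.01, a2 = Sqrt[2.], t, dt, n = 0},
  t = ArcSin[x1]; dt = 2. Pi;
  While[Abs[Sin[a2 (t + n dt)] - x2] >= eps, n++];
  {n, Sin[t + n dt], Sin[a2 (t + n dt)]}
]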

But now we would have to prove that if the statement holds for ##N=k##, it also holds for ##N=k+1##. The problem here is that the system of the first ##k## oscillators is only almost periodic. We would have to choose a time ##t## and an approximate period ##\Delta t## such that at sufficiently many times ##t + n\Delta t## the first ##k## oscillators are close enough to the points ##x_1,x_2,...,x_k##, while ##\Delta t## is incommensurable with ##\frac{2\pi}{\alpha_{k+1}}## in such a way that for some ##n##, ##|\sin(\alpha_{k+1}(t+n\Delta t))-x_{k+1}|<\epsilon##. Does anyone have ideas on how exactly this should be done?

If this can be proven, it would tell us that any such system of harmonic oscillators comes arbitrarily close to every phase space point available to it, even the crazy low entropy configurations.
 

1. What is entropy?

Entropy is a measure of the disorder or randomness in a system. In thermodynamic terms, it is related to the amount of energy in a system that is unavailable for doing work.

2. How is entropy related to Poincaré recurrence?

Poincaré recurrence is a theorem about dynamical systems which states that a bounded, closed system will return arbitrarily close to almost any initial state, and will do so infinitely many times. Entropy is related to this concept because as a closed system evolves, it tends towards a state of maximum entropy, where energy is evenly distributed and no work can be done.

3. Can entropy be reversed?

In a closed system, no: the second law of thermodynamics states that the total entropy of a closed system never decreases over time. However, in open systems, where energy can be exchanged with the surroundings, local decreases in entropy are possible.

4. How is entropy calculated?

Entropy can be calculated using Boltzmann's formula ##S = k \ln W##, where ##S## is the entropy, ##k## is the Boltzmann constant, and ##W## is the number of microstates compatible with the system's macrostate. In simpler terms, entropy is a measure of the number of ways that a system's particles can be arranged or distributed while looking macroscopically the same.
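As a toy illustration of the formula (my sketch, assuming ##N## two-state particles, so that the macrostate with ##m## particles "up" has ##W = \binom{N}{m}## microstates):

Code:
(* Boltzmann entropy S = k Log[W] for n = 100 two-state particles. *)
(* The all-up macrostate has W = 1 and S = 0; the half-and-half    *)
(* macrostate has the largest W and hence the maximum entropy.     *)
With[{n = 100, k = 1.380649*^-23},
  N[{k Log[Binomial[n, n]], k Log[Binomial[n, n/2]]}]
]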

5. What are some real-world examples of entropy and Poincaré recurrence?

An example of entropy is ice melting into water: the ice, which has low entropy because of its ordered molecular structure, melts, and its entropy increases as the molecules become more randomly arranged in the liquid. A simple illustration of recurrence is a planet orbiting the sun: the system repeatedly returns arbitrarily close to configurations it has occupied before, and will keep doing so as long as no external influence disturbs it.
