
About Entropy and Poincare Recurrence

  1. Jun 8, 2016 #1

    hilbert2 (Science Advisor, Gold Member)

    I was reading about entropy, Poincare recurrence theorem and the arrow of time yesterday and I got some ideas/questions I'd like to share here...

    Let's think about a system that is a classical ideal gas made of point particles, confined in a cubic box. Suppose that at time ##t=0## all the particles are in the same half of the box. The gas will quickly spread out to fill the whole box uniformly, consistent with the second law of thermodynamics. However, from the Poincare recurrence theorem we know that after some huge time interval the system will again temporarily be in approximately the initial situation, where all the gas occupies the same half of the container.

    Or let's take an even simpler example: a set of uncoupled harmonic oscillators, each with one degree of freedom. The time evolution of the oscillators can be described with a set of sine functions of ##t##, say:

    ##\sin (t), \sin (\alpha t), \sin (\alpha^2 t), \sin (\alpha^3 t), ... , \sin (\alpha^n t)##

    where ##\alpha## is an irrational number larger than ##1##, chosen to prevent the phase space trajectory of the system from being periodic. To keep this a purely mathematical toy, I'm not even giving the functions a dimensional amplitude.

    At time ##t=0##, all the functions have value zero. We can intuitively call this initial state a "state of low entropy", even though there really isn't such a thing as the entropy of a single microstate. When ##t## increases, the sine functions quickly take on all kinds of values between ##-1## and ##1##, and you wouldn't immediately recognize any correlation between the values unless you knew the form of the functions above. The system has now gone to a "state of high entropy".

    After some time, the length of which depends on ##n##, the number of oscillators, the system will be almost back in the initial state because of Poincare recurrence.

    Now, I have two questions:

    1. How could one define an entropy-like variable for a single microstate (i.e. the set of values of those sine functions at some ##t##) that would have intuitively right properties, like having a small value if all the values are equal and a high value if they appear pseudorandom? (A rough sketch of one possibility is at the end of this post.)

    2. Is it possible to assign the oscillators a set of phase velocities, initial phases, and amplitudes in such a way that an accidental state of "low entropy" never occurs at any value of ##t##, i.e. the values of the sine functions would be an apparently pseudorandom mix of values between ##-1## and ##1## at every moment of time? Or is the phase space trajectory of such a system a "space filling curve" that eventually explores all possible states with the same total energy, even low entropy ones?

    This is just a thought experiment in which I'm trying to construct a system where nothing ever happens that would give the arrow of time a preferred direction...
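    For concreteness, here is a rough Mathematica sketch of what I have in mind for question 1: bin the oscillator values on ##[-1,1]## and take the Shannon entropy of the empirical bin distribution. The helper name and the choice of 50 bins are just placeholders; the measure is zero when all values are identical and large when they are spread over many bins.

    Code (Text):
    (* entropy-like measure for a single microstate: Shannon entropy of the
       empirical distribution of the values over equal-width bins on [-1, 1] *)
    microEntropy[values_List, bins_: 50] := Module[{p},
      p = N[BinCounts[N[values], {-1, 1, 2/bins}]];
      p = p/Total[p];
      -Total[If[# > 0, # Log[#], 0] & /@ p]
    ]

    microEntropy[Table[0, {1000}]]                             (* 0.: all values identical *)
    microEntropy[Table[Sin[10 (1 + E/1000)^k], {k, 0, 1000}]]  (* large: values spread over many bins *)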
     
  2. Jun 8, 2016 #2

    DrClaude (Staff Mentor)

    For me, this is like asking if you could devise a test to show whether the sequence 2,3,5 was random or not.

    The only way to do that is to have the frequencies of the different oscillators differ by irrational factors. Otherwise, you are guaranteed that the system will return to the same state after a while.


    A small enough isolated system doesn't have an arrow of time. For instance, if you have only two particles, there is no obvious difference between time moving forward and time moving backward.
     
  3. Jun 8, 2016 #3

    hilbert2 (Science Advisor, Gold Member)

    ^ Thanks for the reply. Maybe something heuristically similar to entropy would be the so-called "diversity index" of a set of numbers; I don't know if that would require discretizing time and the degrees of freedom to get a finite set of possible microstates. Then I could calculate that index as a function of ##t## and see whether it can be constrained to never fall below some limit by choosing the phase velocities and initial phases appropriately. The number of oscillators doesn't have to be small, by the way.
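    For example, with an inverse-Simpson-type index computed from the binned positions (the helper names, bin count, and time grid below are arbitrary choices), a Mathematica sketch could look like this. The index is ##1## when all oscillators share a single bin and rises toward the bin count as the positions spread out:

    Code (Text):
    (* inverse Simpson diversity index of the binned oscillator positions, as a function of t *)
    oscillators[t_] := Table[Sin[t (1 + E/1000)^k], {k, 0, 1000}];
    diversity[t_, bins_: 50] := Module[{p},
      p = N[BinCounts[N[oscillators[t]], {-1, 1, 2/bins}]];
      p = p/Total[p];
      1/Total[p^2]
    ]

    ListLinePlot[Table[{t, diversity[t]}, {t, 0, 20, 0.1}], AxesLabel -> {"t", "diversity"}]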
     
  4. Jun 8, 2016 #4
    The Poincare cycles take too long a time. You cannot assume the system remains Hamiltonian for that long.
     
  5. Jun 8, 2016 #5

    hilbert2 (Science Advisor, Gold Member)

    ^ My system is not meant to describe anything from the real world. I'm assuming it's an ideal, completely isolated system that obeys Newtonian mechanics. I'm not sure what you mean by a system remaining Hamiltonian; does this have something to do with external perturbations?
     
  6. Jun 8, 2016 #6
    I think that for a Hamiltonian system it must be that ##dS=0##, or something like that.
     
  7. Jun 8, 2016 #7

    hilbert2 (Science Advisor, Gold Member)

    I wrote some Mathematica code to play with these oscillator systems... First I define a set of oscillators as in the OP, with ##\alpha = 1 + \frac{e}{1000}## and ##n=10000##:

    Code (Text):
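    (* positions of the 10001 unit-amplitude oscillators with frequencies (1 + E/1000)^n, n = 0, ..., 10000, at time t *)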
    fun[t_] = Table[Sin[t*(1 + E/1000)^n], {n, 0, 10000}]
    Next I draw a histogram of the oscillator positions at ##t=10##, a time when the system has probably "reached equilibrium":

    Code (Text):
    Histogram[fun[10], 200]
    [Attached image: histogram_osc.jpg — histogram of the oscillator positions at ##t=10##]

    The oscillators are most likely to be near points ##x=-1## and ##x=1##, because their speed is lowest near those extrema. One way to quantify the closeness of a microstate to equilibrium would be to somehow compare the distribution of the oscillator locations to this ideal distribution (this would not take into account the momentum distribution, though).
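    If I remember correctly, that ideal distribution is the arcsine distribution with density ##\frac{1}{\pi\sqrt{1-x^2}}## (the position distribution of a unit-amplitude oscillator sampled at a uniformly random phase). A rough Mathematica sketch of such a comparison, using a total-variation distance over the same kind of bins (helper names and the bin count are again arbitrary), could be:

    Code (Text):
    (* distance of the instantaneous position distribution from the arcsine
       ("equilibrium") distribution, as a total-variation distance over bins *)
    arcsineCDF[x_] := 1/2 + ArcSin[x]/Pi;
    equilibriumProbs[bins_] :=
      Table[arcsineCDF[-1 + 2 i/bins] - arcsineCDF[-1 + 2 (i - 1)/bins], {i, bins}];
    distFromEq[t_, bins_: 50] := Module[{p},
      p = N[BinCounts[N[fun[t]], {-1, 1, 2/bins}]];
      p = p/Total[p];
      Total[Abs[p - equilibriumProbs[bins]]]/2
    ]

    distFromEq[10]   (* fairly small: the t = 10 microstate has nearly the equilibrium shape *)
    distFromEq[0]    (* close to 1: the all-zero initial microstate is far from equilibrium *)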

    There is a trivial example where the distribution of the oscillators remains like this at all values of ##t##, namely the situation where all the oscillators have the same angular velocity and their initial phases are evenly distributed on the interval ##[0,2\pi]##:

    ##\{\sin(t+\tfrac{2n\pi}{1000}) \mid n=0,1,2,\dots,1000 \}##.

    It's hard to say whether this trivial example is the only situation where accidental, 2nd-law-violating fluctuations away from the equilibrium distribution never occur. I think this problem has something to do with number theory, because the special case of periodic dynamics occurs when the angular velocities are commensurable.
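    A quick numerical check of the trivial example (comparing the position histograms at two arbitrarily chosen times; the helper name is a placeholder):

    Code (Text):
    (* the evenly-phased ensemble of identical oscillators keeps essentially the same
       arcsine-shaped position histogram at every t *)
    phased[t_] := Table[Sin[t + 2 Pi n/1000], {n, 0, 1000}];
    GraphicsRow[{Histogram[phased[0.], 50], Histogram[phased[7.3], 50]}]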
     
  8. Jun 10, 2016 #8

    hilbert2 (Science Advisor, Gold Member)

    I've been thinking about proving by mathematical induction that if a set of ##N## harmonic oscillators with amplitude ##1## have angular velocities that are all incommensurable with each other, then the trajectory of the system is "space filling" in ##[-1,1]^N## (the Cartesian product of ##N## intervals ##[-1,1]##) in the sense that it comes arbitrarily close to every point of that set, and the dynamics of the system are almost periodic (periodic to any level of accuracy, if you let the length of the period be large enough).

    It is easy to see that this holds true in the case ##N=1##, because the range of the function ##\sin(\alpha_{1} t)## is ##[-1,1]## and it is exactly periodic with period ##\frac{2\pi}{\alpha_{1}}##.

    The case ##N=2## is also easy. We choose some point ##(x_{1} ,x_{2})\in [-1,1]^2## and a real number ##\epsilon > 0##, and we want to find a time ##t## when the distance between the points ##(x_{1} ,x_{2})## and ##(\sin(\alpha_{1} t), \sin(\alpha_{2} t))## is smaller than ##\epsilon##. It is easy to find a time ##t## for which ##\sin(\alpha_1 t) = x_{1}##. The equality also holds at times ##t+\Delta t, t+2\Delta t, t + 3\Delta t, \dots##, where ##\Delta t = \frac{2\pi}{\alpha_{1}}##. Because ##\Delta t## is incommensurable with the period of the second oscillator, you can find an integer ##n## such that ##|\sin(\alpha_2 (t+n\Delta t))-x_2|<\epsilon##. Also, the dynamics of this two-oscillator system are almost periodic, because you can find an integer ##n## such that ##n\Delta t## is arbitrarily close to some integer multiple of ##\frac{2\pi}{\alpha_{2}}##.
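    A crude numerical illustration of the ##N=2## case, with the incommensurable pair ##\alpha_1 = 1##, ##\alpha_2 = \sqrt{2}## and an arbitrarily chosen target point (the minimum distance found keeps shrinking as the scanned time interval is made longer):

    Code (Text):
    (* how close does (Sin[a1 t], Sin[a2 t]) get to an arbitrary target point
       in [-1,1]^2 when t is scanned over a long time grid? *)
    a1 = 1; a2 = Sqrt[2];
    target = {0.3, -0.8};
    First@MinimalBy[
      Table[{t, Norm[{Sin[a1 t], Sin[a2 t]} - target]}, {t, 0., 2000., 0.01}],
      Last]   (* {t, distance}; the distance can be made as small as desired by scanning further *)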

    But now we would have to prove that if the statement holds true for ##N=k##, it also holds for ##N=k+1##. The problem here is that the system of the first ##k## oscillators is only almost periodic. We would have to choose a time ##t## and an approximate period ##\Delta t## such that at sufficiently many times ##t + n\Delta t## the first ##k## oscillators are close enough to the points ##x_1,x_2,\dots,x_k##, and ##\Delta t## is incommensurable with ##\frac{2\pi}{\alpha_{k+1}}## in such a way that for some ##n##, ##|\sin(\alpha_{k+1}(t+n\Delta t))-x_{k+1}|<\epsilon##. Does anyone have ideas on how exactly this should be done?

    If this can be proven, it would tell us that any such system of harmonic oscillators comes arbitrarily close to every phase space point available to it, even the crazy low-entropy configurations.
     
    Last edited: Jun 10, 2016