Are cycles and entropy compatible?

  • Thread starter: bill alsept
  • Tags: Cycles, Entropy
Summary:
The discussion centers on the compatibility of cycles and entropy within the framework of the second law of thermodynamics. It explores the idea that while entropy tends to increase in isolated systems, there may be scenarios, particularly over long timescales or in small systems, where entropy can both increase and decrease. Participants argue that if the universe operates in cycles, it could theoretically return to a low-entropy state, challenging the notion that entropy must always increase. The conversation also touches on the complexities of defining entropy in the context of gravitational fields and quantum effects during cosmic events like bounces. Ultimately, the dialogue highlights ongoing debates in physics regarding the nature of entropy and its implications for cosmology.
  • #31
Chalnoth said:
How is that a second arrow of time?

I get that definition from wikipedia and it makes sense to me.

http://en.wikipedia.org/wiki/Arrow_of_time

Quote from that page:
"The particle physics (weak) arrow of time
Certain subatomic interactions involving the weak nuclear force violate the conservation of both parity and charge conjugation, but only very rarely. An example is the kaon decay [1]. According to the CPT Theorem, this means they should also be time irreversible, and so establish an arrow of time. Such processes should be responsible for matter creation in the early universe.
This arrow is not linked to any other arrow by any proposed mechanism, and if it would have pointed to the opposite time direction, the only difference would have been that our universe would be made of anti-matter rather than from matter. More accurately, the definitions of matter and anti-matter would just be reversed."
 
  • #32
Constantin said:
I get that definition from wikipedia and it makes sense to me.

http://en.wikipedia.org/wiki/Arrow_of_time

Quote from that page:
"The particle physics (weak) arrow of time
Certain subatomic interactions involving the weak nuclear force violate the conservation of both parity and charge conjugation, but only very rarely. An example is the kaon decay [1]. According to the CPT Theorem, this means they should also be time irreversible, and so establish an arrow of time. Such processes should be responsible for matter creation in the early universe.
This arrow is not linked to any other arrow by any proposed mechanism, and if it would have pointed to the opposite time direction, the only difference would have been that our universe would be made of anti-matter rather than from matter. More accurately, the definitions of matter and anti-matter would just be reversed."
Yes, CP violation leads to a time asymmetry. However, you still have the CPT symmetry. So it's not that this is a "new" arrow of time, it's just that at a microscopic level, the time symmetry looks a bit different, in that you have to use the whole of the CPT symmetry, which so far as we know is absolute.
 
  • #33
Chalnoth said:
All you need to do is demonstrate that the number of possible configurations within a specific horizon is finite.

I don't see how this helps you. If you have an expanding universe, and you have matter that goes outside of the event horizon then it's gone forever. If you calculate the time for things to recur versus the time it takes for it to wander outside of the observable universe and be lost forever, I think you'll find the first number is much higher.
 
  • #34
twofish-quant said:
I don't see how this helps you. If you have an expanding universe, and you have matter that goes outside of the event horizon then it's gone forever. If you calculate the time for things to recur versus the time it takes for it to wander outside of the observable universe and be lost forever, I think you'll find the first number is much higher.
Doesn't matter. All of the possible degrees of freedom are represented within the single event horizon. Every observer will be within some event horizon. And that observer will see one of the (finite) possible configurations. Worrying about multiple event horizons just means you're counting the same configurations multiple times.
 
  • #35
Chalnoth said:
Yes. The Boltzmann Brain argument demonstrates that in order to explain our own existence, states like our own must occur much more frequently than one would naively expect from the trivial statistical recurrence argument.

I have this suspicion that you can show through Boltzmann Brain arguments that any comprehensible universe must have some period of inflationary expansion, but I haven't worked out the details.
 
  • #36
twofish-quant said:
I have this suspicion that you can show through Boltzmann Brain arguments that any comprehensible universe must have some period of inflationary expansion, but I haven't worked out the details.
Well, I don't think that you can prove that. However, inflation has a number of features that make it seem a likely explanation of the problem.

Edit: Just to clarify, I don't think it's possible to rule out the possibility of somebody else coming up with some other creative solution.
 
  • #37
Chalnoth said:
All of the possible degrees of freedom are represented within the single event horizon. Every observer will be within some event horizon. And that observer will see one of the (finite) possible configurations.

Still don't see how this is going to work. If the universe is expanding then the amount of matter within a given finite event horizon is going to tend to zero as t goes to infinity. If you have a horizon that is moving with the expansion of the universe then you go back to having infinite configurations.

The other thing is that these recurrence arguments assume a closed system. The problem is that the universe itself is gradually cooling to 0K, so I don't see how this assumption is going to work.

Also, if this is just repeating arguments, then feel free to point me to a review paper.
 
  • #38
twofish-quant said:
Still don't see how this is going to work. If the universe is expanding then the amount of matter within a given finite event horizon is going to tend to zero as t goes to infinity. If you have a horizon that is moving with the expansion of the universe then you go back to having infinite configurations.

The other thing is that these recurrence arguments assume a closed system. The problem is that the universe itself is gradually cooling to 0K, so I don't see how this assumption is going to work.

Also, if this is just repeating arguments, then feel free to point me to a review paper.
But as long as the cosmological constant is non-zero, you don't have an event horizon that is expanding with the matter. So as I was saying earlier, you can potentially produce an infinite universe if there is no cosmological constant. But not if there is one.
 
  • #39
Hi TwoFish. You probably will want to google for Poincaré recurrence time and eternal de Sitter space to get a gander at the literature. It's quite an active field of research and far from settled.

The punchline is that classically there must be a recurrence time for any observer in a particular causal diamond, but there is an issue and a controversy surrounding the quantum mechanics of de Sitter space. Essentially a lot of processes will tend to occur before the recurrence time (like Boltzmann brains as well as vacuum decay), so the exact operational meaning of the time is unclear.

Further there are issues on what observables a long lived observer actually can use to discern the physics.
 
  • #40
bill alsept said:
If it truly cycles then its entropy will always come back to where it started...

Bill, I am not certain what you mean by universe "truly cycles".

There are cosmological models (some of them studied quite a lot) where the U contracts and then expands---some do that repeatedly in a regular cycle. There are other models which have some other kinds of regular cyclic behavior.

I thought at first that this is what you were talking about. But in this thread people seem to have gotten into discussing random recurrence. In that case the model has no built-in cycle. I wouldn't call it cyclic. It just comes back to some earlier configuration by accident after a very extremely long time.

Which "truly cycles" were you talking about? Random recurrence or was it one of the actual cyclic universe models?
 
  • #41
twofish-quant said:
Except that you can't have a harmonic oscillator with a period of sqrt(2). The period of a harmonic oscillator has got to be some integer multiple of Planck's constant/2.

Nope. First, look at the units: a period is an interval of time, which doesn't have the units of Planck's constant.
What is quantized is action; since, for example, action = time × energy, a HO with a fixed period will have its energy quantized in units of h over the period.

Second, the cyclic assertion made no prescription about quantization of the period. That wasn't used in the argument, which is why I brought it up. The argument doesn't prove the assertion.
 
  • #42
From a quantum perspective, the entropy of a composite system can be less than the entropy of its components, and the more entangled those components are, the larger that gap. One way to define entropy in QM is as the degree of entanglement of a system with its environment. One can then assert that the entropy of the universe as a whole is zero, insofar as you can define a quantum system called "the universe as a whole".
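This can be illustrated with a small numpy sketch (my own toy example, not from the thread): for a maximally entangled two-qubit Bell pair, the von Neumann entropy of the whole system is zero (it is a pure state), while each qubit on its own has the maximal entropy ln 2.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # treat 0 * ln 0 as 0
    return float(-np.sum(evals * np.log(evals)))

# Bell state (|00> + |11>) / sqrt(2): maximally entangled pair of qubits.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())           # density matrix of the whole pair

# Partial trace over the second qubit -> reduced state of the first qubit.
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(von_neumann_entropy(rho))    # whole system: pure state, S = 0
print(von_neumann_entropy(rho_A))  # single component: S = ln 2
```

The whole has less entropy than its parts precisely because the missing information is stored in the correlations between them.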
 
  • #43
jambaugh said:
Take a system consisting of two harmonic oscillators,

Hi Jambaugh. You are referring to a damped system, I think. It is precisely there that the assumptions of the recurrence theorem are violated, since you are no longer describing a system with a volume-preserving phase space, and you can no longer, strictly speaking, bound recurrent orbits into small epsilon balls (which could be made arbitrarily small).

It is irreversible since states will undergo evolution, get damped, and lose their identity permanently. Of course, in the real world the future boundary conditions will restore the reversibility (or unitarity) in some way.

In fact, interestingly, classical field theory (including GR) is also, strictly speaking, an example of a system that violates Liouville's theorem, since it includes an infinite number of degrees of freedom. Consequently, the system will equipartition a finite amount of energy into infinitely many Fourier modes, and you will end up with a recurrence time that tends to infinity.

The reason this is not the case in practice is twofold.

1) There is a finite number of degrees of freedom in our causal patch (and Hilbert space), as a peculiarity of de Sitter space.
2) Quantum mechanics exists! It essentially acts as a cutoff that regulates the IR physics of the problem, just as it does for the UV catastrophe.
 
  • #44
jambaugh said:
Nope. First, look at the units: a period is an interval of time, which doesn't have the units of Planck's constant.
What is quantized is action; since, for example, action = time × energy, a HO with a fixed period will have its energy quantized in units of h over the period.

Second, the cyclic assertion made no prescription about quantization of the period. That wasn't used in the argument, which is why I brought it up. The argument doesn't prove the assertion.
Well, if you go to an infinite phase space, the Poincaré recurrence theorem states that the system will become arbitrarily close to your starting point in finite time (though typically a very large amount of time).
 
  • #45
When I asked "Are cycles and entropy compatible?" I thought it could be a yes-or-no answer. I see now that maybe I asked the question wrong. I should have asked "How can the interpretation of the 2nd law of thermo which states that (the entropy of an isolated system always increases or remains constant) be compatible with a system that cycles?" Most of the analogies used in the responses of this thread so far seem to support the conclusion that in a system that cycles, entropy stays equal and will decrease just as much as it will increase; therefore it cycles.
 
  • #46
Haelfix said:
Hi TwoFish. You probably will want to google for Poincaré recurrence time and eternal de Sitter space to get a gander at the literature. It's quite an active field of research and far from settled.

Thanks. Will do.

Also just a note here. It's really, really important when there are "civilians" present to clearly mark what is the settled consensus view, what is speculation, and what is active research. It's also important when there are non-civilians here, because if you give me references to five or six papers that clearly establish that the Poincaré recurrence theorem has been applied to the big bang, I'm going to react differently than if people are still arguing.

The punchline is that classically there must be a recurrence time for any observer in a particular causal diamond, but there is an issue and a controversy surrounding the quantum mechanics of de Sitter space. Essentially a lot of processes will tend to occur before the recurrence time (like Boltzmann brains as well as vacuum decay), so the exact operational meaning of the time is unclear.

I can see here why the black hole information paradox becomes important. The time it takes for everything to turn into black holes is likely to be a lot less than the recurrence time.

Also, I'm not in a hurry to figure this out. I have about forty years left, and if I die and wake up, my first reaction is likely to be "well, it looks like the second law of thermodynamics doesn't hold," and then I'll find either some big guy with a beard or some person with horns and a pitchfork to explain it to me.
 
  • #47
bill alsept said:
"How can the interpritation of the 2nd law of thermo which states that (the entropy of an isolated system always increases or remains constant) be compatible with a system that cycles?"

That's easy.

It can't. :-)
 
  • #48
Chalnoth said:
Well, if you go to an infinite phase space, the Poincaré recurrence theorem states that the system will become arbitrarily close to your starting point in finite time (though typically a very large amount of time).

I don't think that's true (and if it is feel free to point me to a reference).

If you have a ball moving in one direction through infinite space, it's never going to repeat. The proof of the PRT depends critically on phase space being finite. If you have infinite phase space, then it doesn't work.
 
  • #49
Which "truly cycles" were you talking about? Random recurrence or was it one of the actual cyclic universe models?[/QUOTE]

I was talking about "one of the actual ones". I just used the word "truly" instead of "actual".

I just made the statement to distinguish between the analogies some people were using: analogies that described systems in bottles that may cycle, and I suppose in some quantum equation an argument is made to support that. What I meant by a true cycle was a system that cycles back to an original state such as a singularity, then inflates to its maximum, and then condenses back to the same singularity again. A true cycle would also have a time signature, such as at the atomic level. As for the original question about entropy, it seems that everyone agrees that whatever kind of cycle it is, the entropy does not always gain. Far from it; it seems to stay equal after each cycle.
 
  • #50
twofish-quant said:
I don't think that's true (and if it is feel free to point me to a reference).

If you have a ball moving in one direction through infinite space, it's never going to repeat. The proof of the PRT depends critically on phase space being finite. If you have infinite phase space, then it doesn't work.
I think you've misunderstood me. You still need a finite space. But you can have an infinite configuration space (such as is the case if you have finite space but no quantum mechanics), and the Poincaré recurrence theorem still applies. Read up on it here:
http://en.wikipedia.org/wiki/Poincaré_recurrence_theorem

And when you take the quantum mechanics into account, the finite horizon of de Sitter space is sufficient to allow recurrence.
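The core of the recurrence argument for a finite number of configurations can be seen in a toy model (my own illustration, not the de Sitter case): any invertible map on a finite set decomposes into cycles, so every state must return to itself exactly. Here Arnold's cat map on an N×N grid stands in for the finitely many configurations within a horizon.

```python
# Toy Poincaré recurrence: an invertible map on a FINITE state space.
# Arnold's cat map (x, y) -> (2x + y, x + y) mod n is a bijection on the
# n x n grid, so the orbit of any state is a cycle that returns to it.

def cat_map(state, n):
    x, y = state
    return ((2 * x + y) % n, (x + y) % n)

def recurrence_time(start, n):
    """Number of steps until `start` recurs. Bounded above by n*n."""
    state = cat_map(start, n)
    t = 1
    while state != start:
        state = cat_map(state, n)
        t += 1
    return t

print(recurrence_time((1, 0), 50))   # finite, and the return is exact
```

The grid size and starting point here are arbitrary; the point is only that finiteness plus invertibility forces exact recurrence, whereas on an infinite state space (twofish's ball flying off forever) no such bound exists.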
 
  • #51
bill alsept said:
I was talking about "one of the actual ones". I just used the word "truly" instead of "actual".

I just made the statement to distinguish between the analogies some people were using: analogies that described systems in bottles that may cycle, and I suppose in some quantum equation an argument is made to support that. What I meant by a true cycle was a system that cycles back to an original state such as a singularity, then inflates to its maximum, and then condenses back to the same singularity again. A true cycle would also have a time signature, such as at the atomic level. As for the original question about entropy, it seems that everyone agrees that whatever kind of cycle it is, the entropy does not always gain. Far from it; it seems to stay equal after each cycle.
I don't see how there's any use in specifying that some kinds of recurrence are "actual" while other kinds are not. This kind of recurrence is highly unlikely, however.
 
  • #52
OMG, I'm reading the papers on de Sitter space and Poincare recurrence. It's weird stuff...

OMG. OMG. OMG.

http://arxiv.org/abs/hep-th/0208013

I see what the issue is. Susskind has proposed a solution to the black hole information paradox in which information never gets destroyed, and it turns out that if information never gets destroyed by tossing it down a black hole event horizon, then it doesn't get destroyed in a de Sitter universe when objects move outside the cosmological event horizon. If the amount of information stays the same, then eventually things will repeat.

The alternative is that Hawking is right, and information does get destroyed when you toss it into a black hole or when it leaves the event horizon. If that happens, then things won't repeat. What will happen is that once something goes outside of the cosmological event horizon, it's gone for good. That means that the laws of physics are not unitary.

What I didn't understand was that I was imagining an expanding universe with an event horizon, and then when something goes outside of the event horizon, it's "gone" so over time things will get more and more lonely with no recurrence. Susskind is arguing that this won't happen. The event horizon of the universe is mathematically identical to the event horizon of a black hole so that you will get Hawking radiation from the cosmological horizon just like you will get Hawking radiation from the black hole, and if that Hawking radiation contains any information, then things will reboot.

The paper is called "Disturbing Implications of a Cosmological Constant"

I find the second option less disturbing, but it's still plenty disturbing.
 
  • #53
bill alsept said:
Which "truly cycles" were you talking about? Random recurrence or was it one of the actual cyclic universe models?

I was talking about "one of the actual ones". I just used the word "truly" instead of "actual".

I just made the statement to distinguish between the analogies some people were using: analogies that described systems in bottles that may cycle, and I suppose in some quantum equation an argument is made to support that. What I meant by a true cycle was a system that cycles back to an original state such as a singularity, then inflates to its maximum, and then condenses back to the same singularity again. A true cycle would also have a time signature, such as at the atomic level. As for the original question about entropy, it seems that everyone agrees that whatever kind of cycle it is, the entropy does not always gain. Far from it; it seems to stay equal after each cycle.

So it seems some of the posts in this thread are not relevant. Many of them are about RANDOM RECURRENCE. Stuff that happens by accident after an indefinite wait of jillion gazillion years.

In cosmology research the thing about contracting, rebounding, and expanding is often called a "bounce".
A lot of papers these days study bounce cosmologies. That's different from random recurrence.

The simplest case of it need not even repeat---might just consist of a single bounce.

That is a good test case to study. One can ask whether the U we see resulted from a bounce, whether or not it was one of an infinite series of bounces.

There might be some traces of a bounce in the CMB that we can observe. It makes sense to ask if there was a bounce---are we in the rebound from a collapsing classical U?---without trying to answer the question right away about whether it's an infinite series.

And with any bounce cosmology (cyclic or not) you can ask about entropy. That's what I was trying to get at in my earlier posts #5 and #7 in this thread.
 
  • #54
bill alsept said:
When I asked "Are cycles and entropy compatible?" I thought it could be a yes-or-no answer. I see now that maybe I asked the question wrong. I should have asked "How can the interpretation of the 2nd law of thermo which states that (the entropy of an isolated system always increases or remains constant) be compatible with a system that cycles?" Most of the analogies used in the responses of this thread so far seem to support the conclusion that in a system that cycles, entropy stays equal and will decrease just as much as it will increase; therefore it cycles.

OK, so let's consider it with respect to a specific cyclic system... say a single simple harmonic oscillator. Can you define an entropy for this system? Answer Yes!

Again this goes back to understanding that entropy is not about disorder vs order but about empirically defined ignorance vs knowledge about system state.

Classically: An SHO's phase-space diagram [x, p] will show the orbit of the oscillator state following an ellipse centered on the origin, the size of which is set by the energy. Relative entropy will correspond to logarithms of areas in phase space.
Given that you know a range of energies for the SHO, and only that, you know the state is a point inside the area between two ellipses in phase space. This area defines a class of possible systems, in that it defines a range of possible states of a given system. Note that as the system evolves, you also know that the state stays within the defined region, so over time the entropy is unchanged.

Alternatively, if you know the initial conditions up to some error bars x1 < x(0) < x2, p1 < p(0) < p2, you can define its initial state to within a given area A (with S = k_B \ln(A)). By Liouville's theorem you can watch each point in the initial area evolve; its area will not change, so neither will the entropy.

One can go further and more general and define a probability distribution over phase space. Liouville's theorem will manifest as conservation of entropy for the evolution of the distribution over time:
S = -k_B \int f(x,p) \ln(f(x,p))\, dx\, dp, where f is the probability density. Try it with a uniform (constant) density over an area A of phase space and see that you recover S = k_B \ln(A).

Now this example isn't very interesting or useful, but it shows how entropy is defined based on knowledge about the system state. Now consider many such oscillators, and combine the phase spaces into a single composite space. One then works with "hyper-volumes" instead of areas, but it works out the same. Start with an uncertain initial condition and the entropy is defined, and Liouville's theorem still applies: the future volume of phase space in which we know the system resides is of fixed volume (though you'll note it gets stretched out and wrapped around many times). Couple the oscillators to each other in a definite way and still the entropy remains constant.

But if you couple the system to an external source or allow random coupling between oscillators then this randomness adds uncertainty to the future state and the area or distribution spreads out. Entropy increases. No amount of random(=unknown) coupling to the outside world or internally will reduce our ignorance about where the system state will be and thus entropy cannot be decreased this way. That's the 2nd law when one is considering random internal coupling.

One can however couple the system to the outside world in a specific way to reduce the entropy (refrigeration). In particular we can observe the system state...which starts us down the road of QM where we must insist that an observation is a physical interaction even if in the classical case the interaction has infinitesimal effect on the system per se.

The cyclic nature of the system is immaterial to the entropy because entropy is not about the actual system state but about our knowledge of it.
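The Liouville step above can be checked in a few lines (a sketch of my own, with arbitrary illustrative values for the mass and frequency): the time-t flow of a simple harmonic oscillator is a linear map of (x, p), and its determinant is exactly 1 at every t, so any phase-space area A, and with it S = k ln A, is conserved.

```python
import numpy as np

# SHO flow: x(t) = x0 cos(wt) + (p0 / (m w)) sin(wt),
#           p(t) = -m w x0 sin(wt) + p0 cos(wt).
# This is linear in (x0, p0); its matrix has det = cos^2 + sin^2 = 1,
# which is Liouville's theorem (area preservation) for this system.
m, w = 1.3, 2.7   # illustrative mass and angular frequency

def sho_flow(t):
    c, s = np.cos(w * t), np.sin(w * t)
    return np.array([[c, s / (m * w)],
                     [-m * w * s, c]])

for t in (0.1, 1.0, 17.5):
    print(t, np.linalg.det(sho_flow(t)))   # det = 1 => S = k ln A constant
```

Damping would multiply this matrix by a factor e^{-γt} with determinant less than one, which is exactly where, as Haelfix notes above, the recurrence theorem's assumptions fail.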
 
  • #55
jambaugh said:
The cyclic nature of the system is immaterial to the entropy because entropy is not about the actual system state but about our knowledge of it.
Except that in the quantum mechanical sense, the entropy of the system is directly related to the number of possible configurations of that system. Being related to the number of possible configurations, the maximum entropy and the maximum recurrence time are closely linked.
 
  • #56
Of course we can always define entropy and sometimes measure it. Any system will have some changing level of entropy. My dispute is with those who believe entropy will only increase. A cycle will return to its starting point, or back to its lowest level of entropy. So if you believe in cycles, how can entropy always increase?
 
  • #57
bill alsept said:
Of course we can always define entropy and sometimes measure it. Any system will have some changing level of entropy. My dispute is with those who believe entropy will only increase. A cycle will return to its starting point, or back to its lowest level of entropy. So if you believe in cycles, how can entropy always increase?
Again, as I've said before, as long as you're dealing with time scales much shorter than the recurrence time, this is a valid statement (though one caveat: it's only valid for closed systems...open systems like the Earth can have their entropy decrease quite easily).
 
  • #58
bill alsept said:
Of course we can always define entropy and sometimes measure it. Any system will have some changing level of entropy. My dispute is with those who believe entropy will only increase. A cycle will return to its starting point, or back to its lowest level of entropy. So if you believe in cycles, how can entropy always increase?

Be careful Bill :biggrin:
It sounds mighty naive to assert (without explanation) that entropy can always be defined.

You need things in order to be able to define the entropy----like microstates and a map of the macrostate regions that corresponds to what someone can measure.

The mathematical resources you require in order for the entropy to be well defined are precisely the resources you lack at the Loop cosmology bounce.
 
  • #59
Yes, I forgot there are other ways to define entropy. For this cosmology thread and the original question I just assumed everyone was talking about a measure of the randomness. However it's defined, I still don't understand how entropy can increase any more than it decreases unless something is being added to the system.
 
  • #60
bill alsept said:
Yes, I forgot there are other ways to define entropy. For this cosmology thread and the original question I just assumed everyone was talking about a measure of the randomness. However it's defined, I still don't understand how entropy can increase any more than it decreases unless something is being added to the system.

Well you know just saying "measure of randomness" does not say anything. In order to have a definite number you need a mathematical definition. This usually involves a "state space".
A collection of possible states the system can be in.
There is usually an observer in the picture who is able to do certain measurements, and he tends to lump together large collections of detailed "micro" states, all of which look the same to him in terms of what matters to him, like temperature, pressure etc.
Depending on who is defining entropy, there may be probability measures on the states, or on the macro collections of states that are lumped together as equivalent from the observers point of view.

Anyway no matter how you choose to mathematically define entropy, you need some math junk to do it. By itself a word like "randomness" or "disorder" does not mean anything quantitative.

So think about a function of time that is always increasing but fails to be defined at t=0

Like f(t) = -1/t

This is not meant to be the entropy of some system, it is just an example of a function, to illustrate.
The function is always increasing wherever it is defined. And yet its value at positive times t>0 is always less than what its value was at any negative time t<0.

You can construct more realistic looking entropy functions. The point is:
In order to return to an earlier value the entropy function never has to decrease. It can always be increasing, wherever it is defined, and yet it can pass through the same value again and again.

So you CAN imagine entropy decreasing on a regular basis (you were talking "cyclic") but you do not HAVE to imagine it decreasing. There simply need to be moments in time when it is impossible to define. (Or to define correctly, in a consistent unambiguous way.)
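The f(t) = -1/t example above can be checked numerically (a quick sketch, not from the thread): the function is strictly increasing on each side of t = 0, where it is undefined, yet every value it takes at positive times lies below every value it took at negative times, so it "resets" without ever decreasing.

```python
# f(t) = -1/t: increasing wherever defined, undefined at t = 0,
# and lower after the gap than before it.
def f(t):
    return -1.0 / t

neg = [f(-5 + 0.1 * k) for k in range(49)]   # samples on t in [-5, -0.2]
pos = [f(0.2 + 0.1 * k) for k in range(49)]  # samples on t in [0.2, 5.0]

print(all(a < b for a, b in zip(neg, neg[1:])))  # increasing for t < 0
print(all(a < b for a, b in zip(pos, pos[1:])))  # increasing for t > 0
print(max(pos) < min(neg))                       # resets below its old values
```

A more realistic entropy function in the same spirit would be undefined only at the bounce itself, increasing everywhere else, yet passing through low values again after each bounce.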
 
