Are cycles and entropy compatible?

In summary, the second law of thermodynamics is an approximation of the true underlying behavior and applies only to systems that begin in a low-entropy state. Although it states that entropy always increases, this does not strictly hold over very long timescales or in small, closed systems. The concept of entropy is also ill-defined at the bounce of a universe, and the entropy of the gravitational field and geometry is still an open research question. Additionally, in LQG "bounce" cosmology it is argued that the entropy of the universe may not always increase, because quantum effects cause gravity to repel at high densities.
  • #1
bill alsept
Entropy is the tendency of nature to go from order to disorder in an isolated system. If a cycle always comes back to where it started, wouldn't entropy decrease as well as increase, over and over again? With that in mind, and given that there really are no examples of a truly isolated system, can the 2nd law of thermo hold up?
 
  • #2
bill alsept said:
Entropy is the tendency of nature to go from order to disorder in an isolated system. If a cycle always comes back to where it started, wouldn't entropy decrease as well as increase, over and over again? With that in mind, and given that there really are no examples of a truly isolated system, can the 2nd law of thermo hold up?
Yes. When dealing with very long timescales (as in many, many times the current age of our universe), or with very small, closed systems (as in a handful of atoms), entropy both increases and decreases over time.

The basic way to understand this is that the second law of thermodynamics is an approximation of the true underlying behavior. It can be derived, in fact, from the underlying behavior, by assuming that the underlying particles are randomized, and assuming that the timescales are much shorter than the repetition timescale. If you do that, you get the second law of thermodynamics pretty trivially assuming you start in a low-entropy state. However, you also get that entropy is non-decreasing into the past from the same low-entropy starting state.
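The statistical derivation sketched above can be illustrated with a toy model. The following is a minimal sketch (an Ehrenfest-style urn model; the particle count and step count are arbitrary illustrative choices, not from this thread) in which particles hop randomly between the two halves of a box, starting from the low-entropy state with everything on one side:

```python
import math
import random

random.seed(1)
N = 200        # particles in a two-sided box
left = N       # low-entropy start: every particle on the left

def entropy(k, n=N):
    # Coarse-grained (Boltzmann) entropy: log of the number of
    # microstates compatible with "k particles on the left".
    return math.log(math.comb(n, k))

S0 = entropy(left)
for _ in range(5000):
    # Pick a particle uniformly at random and move it to the other side.
    if random.random() < left / N:
        left -= 1
    else:
        left += 1
S_final = entropy(left)
print(S0, S_final)   # entropy climbs from 0 toward its maximum near k = N/2
```

Run forward from the all-left state, entropy almost always rises; run from an equilibrium state, the same dynamics fluctuates both up and down, which is the point being made here.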

So fundamentally what this means is that whenever you have a system that has a second law that holds, that system had to begin with a low-entropy starting state. Precisely how our own universe was set up in this low-entropy starting state is one of the unsolved problems in physics. There are many ideas, but it is not yet clear which (if any) is the correct one.
 
  • #3
Thanks, but I'm not sure if I follow. I am suggesting that a system such as our universe does not always increase its entropy. It may increase and decrease in the short run, but it stays the same in the long run. If it truly cycles, then its entropy will always come back to where it started. Whether it started in a low state or a high state should not matter as far as the 2nd law is concerned, which states that entropy always increases. It seems to me that many good ideas in the past have been abandoned because people are locked into thinking entropy can only increase.
 
  • #4
Quoted from:
http://en.wikipedia.org/wiki/Entropy_(arrow_of_time)

"An example of apparent irreversibility

Consider the situation in which a large container is filled with two separated liquids, for example a dye on one side and water on the other. With no barrier between the two liquids, the random jostling of their molecules will result in them becoming more mixed as time passes. However, if the dye and water are mixed then one does not expect them to separate out again when left to themselves. A movie of the mixing would seem realistic when played forwards, but unrealistic when played backwards.
If the large container is observed early on in the mixing process, it might be found to be only partially mixed. It would be reasonable to conclude that, without outside intervention, the liquid reached this state because it was more ordered in the past, when there was greater separation, and will be more disordered, or mixed, in the future.
Now imagine that the experiment is repeated, this time with only a few molecules, perhaps ten, in a very small container. One can easily imagine that by watching the random jostling of the molecules it might occur — by chance alone — that the molecules became neatly segregated, with all dye molecules on one side and all water molecules on the other. That this can be expected to occur from time to time can be concluded from the fluctuation theorem; thus it is not impossible for the molecules to segregate themselves. However, for a large number of molecules it is so unlikely that one would have to wait, on average, many times longer than the age of the universe for it to occur. Thus a movie that showed a large number of molecules segregating themselves as described above would appear unrealistic and one would be inclined to say that the movie was being played in reverse."

---

I believe that's what Chalnoth meant by "When dealing with very long timescales (as in many, many times the current age of our universe), or with very small, closed systems (as in a handful of atoms), entropy both increases and decreases over time."
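The ten-molecule thought experiment in the quoted passage is small enough to simulate directly. A minimal sketch (random shuffling stands in for thermal jostling; the step count is an arbitrary choice):

```python
import random
from math import comb

random.seed(0)
molecules = ["dye"] * 5 + ["water"] * 5   # a tiny container: 10 molecules
STEPS = 100_000

segregated = 0
for _ in range(STEPS):
    random.shuffle(molecules)             # random jostling of the molecules
    if all(m == "dye" for m in molecules[:5]):
        segregated += 1                   # all dye back on one side

print(segregated)   # on the order of STEPS / comb(10, 5), i.e. a few hundred events
```

With 10 molecules, full segregation recurs routinely; the same check with, say, 100 molecules would essentially never fire in any feasible run, which is the large-N point the quote is making.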
 
  • #5
bill alsept said:
Thanks, but I'm not sure if I follow. I am suggesting that a system such as our universe does not always increase its entropy. It may increase and decrease in the short run, but it stays the same in the long run. If it truly cycles, then its entropy will always come back to where it started...

I think what you say here is absolutely right. There are several arguments to be made justifying LQG "bounce" cosmology on entropy grounds.

I am not a Loop cosmology expert (by any means!) and don't have time to write a long post right now. I will just try to suggest some of the reasoning, and get back to this later today.
This is a really interesting question!

1. One idea is that entropy of U is not well-defined right at the bounce.

2. The whole 2nd law business is meaningless unless you can define the entropy of the gravitational field, because that is a BIG part of the total. The grav field means the geometry of the U. What is the entropy of geometry? People are still working on that.

3. LQG quantizes GR (the modern law of gravity) and when it does so it turns out that at very high density quantum effects take over and gravity repels instead of attracting!
This means that uniformly spread out geometry is favored.
But at ordinary density, gravity attracts, and clumpy geometry is favored.
Therefore at the bounce the entropy of U geometry cannot possibly be well defined.

4. Right at that moment, density changes abruptly from the usual (attractive gravity) range up to the extreme (Planck scale) density range, and then in a split second changes back down into the usual range again. No consistent definition of the geometrical (i.e. grav. field) entropy is possible.

5. A fundamental requirement of the 2nd Law is an observer, who defines the macrostate regions of the phase space. What defines the observer's map of phase space is what the observer can see and measure. A region of microstates which all look the same to that observer is lumped together into a single macrostate. This necessarily assumes an observer. Coarse-graining requires a point of view.

But at the bounce there is no well-defined observer! There is only the "before" observer who looks forward to the bounce in his future, and the "after" observer for whom the bounce is the big bang beginning of his era, who looks back to it in the past.
At the very moment of the bounce it is not possible to define an observer.

6. I suspect that in the quantum regime right around the bounce, time itself may not be well defined. Normal distinctions between what is matter and what is geometry may be difficult to make. But that is just a matter for speculation at this point. In any case I think it would be a naive/simple-minded point of view to expect the 2nd to apply there in a straightforward manner. One still has not even defined what the terms and quantities mean there.
 
  • #6
bill alsept said:
Thanks, but I'm not sure if I follow. I am suggesting that a system such as our universe does not always increase its entropy. It may increase and decrease in the short run, but it stays the same in the long run.
The "short run" here is many times the current age of the universe. Entropy will be increasing in our observable universe until all matter has decayed, which will take on the order of [itex]10^{100}[/itex] years, while the recursion time is easily [itex]10^{10^{100}}[/itex] years, if not longer.

bill alsept said:
If it truly cycles, then its entropy will always come back to where it started. Whether it started in a low state or a high state should not matter as far as the 2nd law is concerned, which states that entropy always increases. It seems to me that many good ideas in the past have been abandoned because people are locked into thinking entropy can only increase.
The recursion time has been known about for over a hundred years. But for most purposes, the recursion time is only relevant for very, very small systems (as Constantin notes).
 
  • #7
bill alsept said:
Thanks, but I'm not sure if I follow. I am suggesting that a system such as our universe does not always increase its entropy. It may increase and decrease in the short run, but it stays the same in the long run. If it truly cycles, then its entropy will always come back to where it started...

So yes! The LQG bounce cosmology, as far as we know, is not inconsistent with the 2nd.
What you say is right: that entropy would return to an earlier low value (under classical conditions with matter spread out approx uniformly.)

And there would necessarily be moments when there is no meaningful definition of the entropy. No meaningful map of phase space, no idea of coarse-graining.

You get people who do not realize that the dominant part of entropy in the early U is the entropy of geometry---as long as grav. is attractive, that means that low entropy corresponds to uniformly spread out matter.

If you get someone who talks like he thinks (by analogy with gas in a box) that uniformly spread out is a picture of HIGH entropy, that's a sign the person has not thought about it much.

And the relationship is reversed briefly during the bounce when grav. is repellent. Then spread out matter is high entropy. A natural period of faster than exponential inflation (superinflation) occurs, without needing any exotic "inflaton" field. So there's a fascinating bunch of issues involved here!

We are not talking about rare classical "recurrence" events, like the extremely rare classical event where all the gas molecules accidentally happen to be gathered in one corner of the box. :biggrin: They are off topic/irrelevant as far as the Loop cosmology bounce goes.

I think that is just a distracting "red herring" or irrelevancy here. The interesting issues have to do with quantum cosmology, not with the 100-year-old stuff about molecules in a box. Just my two cents, though. Whatever people want to talk about.
 
  • #8
Not sure if I'm following. It seems to me that if a system can keep cycling back to its lowest level of entropy, then isn't that going against the 2nd law of thermo? How can both be right? I agree that an analogy of gas in a box does not work, but that's not a system that cycles anyway. Entropy probably does always increase in that type of system, but not in a system that cycles.
Just wondering.
Thanks
 
  • #9
bill alsept said:
Not sure if I'm following. It seems to me that if a system can keep cycling back to its lowest level of entropy, then isn't that going against the 2nd law of thermo? How can both be right? I agree that an analogy of gas in a box does not work, but that's not a system that cycles anyway. Entropy probably does always increase in that type of system, but not in a system that cycles.
Just wondering.
Thanks
Strictly speaking, yes. But then ever since statistical mechanics was developed around a hundred and forty years ago or so, it's been known that the second law of thermodynamics was not an absolute law.

At the same time, the exact same argument that says that the second law of thermodynamics is not absolute demonstrates that it is a useful way of thinking about the world, and that this understanding is completely fundamental, for the reason that it's just down to probability: a low-entropy state is a low-probability state, while a high-entropy state is a high-probability state. Systems in low-probability states will, over time, transition to high-probability states.
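The probability statement above is just microstate counting. A minimal sketch with 100 two-state particles (the number is an arbitrary illustrative choice):

```python
from math import comb

N = 100
# Microstates per macrostate for N two-state particles:
all_one_side = comb(N, 0)       # perfectly ordered macrostate: 1 microstate
half_half = comb(N, N // 2)     # evenly split macrostate: ~1e29 microstates

# A system wandering at random through microstates is overwhelmingly
# more likely to be found in the high-entropy (evenly split) macrostate.
print(half_half / all_one_side)
```

This ratio is why "entropy decreases" is not impossible, merely absurdly improbable for anything much larger than a handful of particles.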
 
  • #10
So can we conclude that entropy does NOT always increase? Sorry, but I'm new to these forum discussions and I have some ideas I'm trying to work through. If there is an answer to something, I like to confirm it before moving to the next question. If there isn't, then I can't stop wondering why.
Thanks again for your help
 
  • #11
It is a good question, and this caused a controversy in the early twentieth century. It's known as the Zermelo-Poincaré paradox.

But the correct way to think about it (similar to the arguments Chalnoth gave), is to recall the arguments in the Feynman lectures, where you have two liquids mixing. You start out with a very low entropy condition (half red, half blue) and as you wait awhile they will mix. Clearly, one sees the second law of thermodynamics holds as they tend to a higher entropy state.

Now, if you wait an astronomically long time (basically [itex]e^{S}[/itex]), just by random chance you can see (and can prove) that the two liquids will go back to the original configuration. This essentially follows b/c the phase space is a compact manifold, and it's pretty intuitive that cycles will eventually return back to certain points multiple times.

Contradiction? Not really! At one point, the two liquids will be in a state that is very close to the maximum entropy possible (basically complete mixture). Just by chance, you can imagine a quick fluctuation back into the more ordered state. There is nothing wrong with that, b/c just as quickly the liquids will mix again, and for the immense duration of the system's history they will be tending towards more chaos. The same thing holds in quantum mechanics, where evolution under the Schrödinger equation evolves initial states back into the same state after some ridiculously long time interval.

Further, there is a problem of definition here too. When we talk about the entropy of a liquid, we are tacitly assuming that we *DON'T* know everything about the particular configuration, that is, we don't know the exact positions and velocities of all its individual atoms to perfect accuracy. If we did know, then we couldn't use the word entropy.

However, the recurrence time theorem is the opposite. It says that after a recurrence time the exact microstates must go back into themselves perfectly, implying exact knowledge of the positions and velocities. So it's a little weird to then ask questions about what the entropy of the system is. You can only ask that question at some later time, when you have lost track of the perfect information.
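The exact-microstate recurrence being described can be seen in any invertible dynamics on a finite state space. A minimal sketch using Arnold's cat map on a finite grid (the grid size and starting point are arbitrary illustrative choices):

```python
def cat_map(x, y, n):
    # Arnold's cat map: an invertible map on an n x n grid of states,
    # a finite stand-in for reversible (unitary) dynamics.
    return (2 * x + y) % n, (x + y) % n

N = 101
state0 = (1, 0)
state = cat_map(*state0, N)
steps = 1
while state != state0:
    state = cat_map(*state, N)
    steps += 1

# Because the map is a bijection on finitely many states, the exact
# microstate is guaranteed to return, here after `steps` iterations.
print(steps)
```

Note that tracking the exact pair (x, y) is precisely the "perfect information" situation described above, for which it is awkward to speak of an entropy at all.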
 
  • #12
bill alsept said:
So can we conclude that entropy does NOT always increase?
Yes. The second law of thermodynamics, properly understood, is a statistical law. So a more detailed understanding is that it says that entropy almost always increases, but not quite always.

But because for decent-sized systems (say, the order of a box you can hold in your hands), the amount of time you have to wait to see any noticeable decrease in entropy is typically longer than the current age of the universe, for most purposes we can take the second law to be absolute.
 
  • #13
The analogy of the two liquids mixing is not a system that is intended to cycle, and the chances of the two colors dividing themselves again right down the middle are far worse than the chances of them never doing it. On the other hand, a system that really cycles will cycle, and most likely right on time. But even in your own example you say that if you wait an astronomically long time, the two liquids will go back to their original configuration, which I think you said was low entropy. So entropy does not always increase.
 
  • #14
You could even make the argument that entropy always decreases, but I would bet that in the long run we find that it ends up equal, other than a positive or negative kick the system may get from outside at some phase of its cycle.
 
  • #15
bill alsept said:
The analogy of the two liquids mixing is not a system that is intended to cycle, and the chances of the two colors dividing themselves again right down the middle are far worse than the chances of them never doing it.
This is incorrect. If you wait long enough, it is guaranteed to happen. It's actually a proof within physics that such closed systems must recur given enough time. It's just that it takes so long that we don't need to worry about it nearly all the time.
 
  • #16
bill alsept said:
The analogy of the two liquids mixing is not a system that is intended to cycle, and the chances of the two colors dividing themselves again right down the middle are far worse than the chances of them never doing it. On the other hand, a system that really cycles will cycle, and most likely right on time. But even in your own example you say that if you wait an astronomically long time, the two liquids will go back to their original configuration, which I think you said was low entropy. So entropy does not always increase.

That's the point! All physical systems cycle on various timescales. They have to, basically as a direct consequence of Liouville's theorem (alternatively, unitarity in quantum mechanics).

As Chalnoth says, the 2nd law is a statement about statistics, whereas you are picturing it like some sort of monotonic function. Don't! It's like throwing a sequence of coins. You can say that the large N expectation value of the system is zero. However there will be moments where you will have some enormous disparity between heads and tails. This will go away when viewed on even longer time frames, but you will have periods where random fluctuations will make the statistics look crazy.
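The coin analogy can be checked numerically. A minimal sketch (the flip count and seed are arbitrary choices):

```python
import random

random.seed(42)
FLIPS = 1_000_000
lead = 0             # running surplus of heads over tails
max_disparity = 0

for _ in range(FLIPS):
    lead += 1 if random.random() < 0.5 else -1
    max_disparity = max(max_disparity, abs(lead))

print(lead / FLIPS)    # sample mean near zero, as the expectation says
print(max_disparity)   # yet the surplus wanders far, typically ~ sqrt(FLIPS)
```

The long-run average looks tame, while along the way there are stretches with an enormous disparity between heads and tails, which is exactly the non-monotonic picture described above.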

Further, as I tried to mention, there is a subtlety about what you call 'entropy' when you are talking about the recurrence theorem!
 
  • #17
Also these arguments work only for cyclical universes. If the universe is open, then everything will fly away from each other before things can repeat. There's also this big debate as to how black holes fit into all of this, because it seems that black holes destroy information.
 
  • #18
twofish-quant said:
Also these arguments work only for cyclical universes. If the universe is open, then everything will fly away from each other before things can repeat. There's also this big debate as to how black holes fit into all of this, because it seems that black holes destroy information.
Not quite. The recurrence argument holds as long as the laws of physics are unitary*. All currently-known laws of physics are unitary, and it is largely suspected that this is a fundamental property.

*For the uninitiated, the laws of physics are unitary if you can take the exact state of a system and correctly predict its behavior either in the past or the future. In classical mechanics, the exact state would mean the exact position and momentum of every particle in the system. In quantum mechanics, the exact state means the full wavefunction including all particles in the system.
 
  • #19
As time is unidirectional, and it is an integral part of the Universe, the Universe cannot cycle.
Also, the hypothesis of Big Crunch and Big Bang cycles is not supported by evidence (expansion at an accelerated rate).

Various examples, like molecules in a box, don't include the time dimension itself.
 
  • #20
Constantin said:
As time is unidirectional, and it is an integral part of the Universe, the Universe cannot cycle.
Also, the hypothesis of Big Crunch and Big Bang cycles is not supported by evidence (expansion at an accelerated rate).

Various examples, like molecules in a box, don't include the time dimension itself.
As I've said before, this just isn't true. You don't need to rely upon any specific model of the universe. All you have to know is:
1. The number of possible configurations of the universe is finite (which we know to be true based upon quantum mechanics).
2. The laws of physics are unitary (which all of our current laws are, and which is generally suspected to be true of the most fundamental laws).

Given those two statements, any place in the parameter space which a universe reaches once, it must necessarily reach an infinite number of times. In other words, it has to cycle in some sense.
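The two assumptions above give recurrence by a pigeonhole argument: deterministic, reversible evolution on finitely many configurations is a bijection, and every orbit of a bijection is a pure cycle. A minimal sketch with a random permutation standing in for the dynamics (the state count is an arbitrary choice):

```python
import random

random.seed(3)
n_states = 1000
step = list(range(n_states))
random.shuffle(step)      # a random bijection: state i evolves to step[i]

state = 0
visited = {state}
t = 0
while True:
    state = step[state]
    t += 1
    if state in visited:
        break
    visited.add(state)

# The first repeated state is the starting state itself: bijections have
# no "rho-shaped" orbits, only closed cycles, so every state reached once
# is reached again and again.
print(state, t)
```

A non-reversible map, by contrast, can fall into a cycle that never revisits its starting state, which is why unitarity matters for the argument.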

There's also the point to be made that at the microscopic level, time is not unidirectional. The direction of time only appears at the macroscopic level, as a result of the overall increase in entropy of our universe.
 
  • #21
Chalnoth said:
As I've said before, this just isn't true. You don't need to rely upon any specific model of the universe. All you have to know is:
1. The number of possible configurations of the universe is finite (which we know to be true based upon quantum mechanics).

We don't know this to be true, and in fact the latest data suggests that it is not true. If the universe is infinite, then the number of possible configurations of the universe is itself infinite. Imagine a universe in which all of the galaxies are 1 Mpc from each other. Then 2 Mpc. Then 3 Mpc. Then 4 Mpc. You end up with an infinite number of configurations. If the universe in fact is accelerating, then you will never have a recurrence.

I have this deep feeling that all this has something to do with the accelerating universe, and that someone can come up with an argument that any comprehensible universe has to be accelerating.
 
  • #22
As far as I know, there's also a second arrow of time: the matter/anti-matter imbalance and the processes that create it, which is also unidirectional.

How do you handle this additional arrow of time and still have cycles ?
 
  • #23
Chalnoth said:
*For the uninitiated, the laws of physics are unitary if you can take the exact state of a system and correctly predict its behavior either in the past or the future. In classical mechanics, the exact state would mean the exact position and momentum of every particle in the system. In quantum mechanics, the exact state means the full wavefunction including all particles in the system.

Which is why people are wondering how black holes fit into all of this. Once you have a black hole, then the information in it becomes unrecoverable once you've crushed everything into a singularity.
 
  • #24
Take a system consisting of two harmonic oscillators, one with period 1 and the other with period sqrt(2). Set them in motion and they will never repeat their initial state exactly.
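That incommensurability claim is easy to probe numerically. A minimal sketch checking, at each full period of the first oscillator, how close the second one is to restarting (the time horizon is an arbitrary choice):

```python
import math

# Oscillators with periods 1 and sqrt(2): the joint state recurs exactly
# only if both phases hit zero together, which would make sqrt(2) rational.
best_miss = float("inf")
for t in range(1, 20001):              # integer multiples of period 1
    phase2 = (t / math.sqrt(2)) % 1.0
    miss = min(phase2, 1.0 - phase2)   # distance of oscillator 2 from restart
    best_miss = min(best_miss, miss)

print(best_miss)   # near-recurrences get ever closer, but never reach zero
```

So the classical system has arbitrarily good near-recurrences without any exact recurrence, which is the distinction the follow-up replies argue about.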

This is immaterial to questions of entropy. Entropy is not a measure of disorder vs order as these terms have no physical meaning (they are subjective). Entropy is a measure of relative uncertainty about the state of a system.

To understand classical entropy it is important to understand that an actual system has no specific entropy. Entropy is not a physical observable. What we do is classify actual systems based on what we know about them in the form of constraints... e.g. the particle in a box, or system of particles with a given fixed total energy.

Entropy is a number we associate with classes of physical systems representing the degree to which this class might be further resolved.

That seems to imply that entropy is not physically meaningful but that too is a mistake. We identify an actual physical system as being in some class by virtue of the physical constraints and measurements we make. To know a system is remaining in a class of systems with a specific entropy requires an ongoing physical constraint. Varying the constraint has physical effects we can then quantify by considering how the entropy for the defined class of systems changes.
 
  • #25
"Take a system consisting of two harmonic oscillators, one with period 1 and the other with period sqrt(2). Set them in motion and they will never repeat their initial state exactly."

In real life you can't do that, because everything is quantised and you can't have a sqrt(2) period. Both periods will be integer multiples of some quantum, so they can and will repeat the initial state after a while.
 
  • #26
jambaugh said:
Take a system consisting of two harmonic oscillators, one with period 1 and the other with period sqrt(2). Set them in motion and they will never repeat their initial state exactly.

Except that you can't have a harmonic oscillator with a period of sqrt(2). The period of a harmonic oscillator has got to be some integer multiple of Planck's constant/2.
 
  • #27
And then there is Boltzmann's brain. In a closed, finite universe, everything will happen eventually, but some things will happen before others. In particular, it's far more likely that your brain will pop into being in a sea of nothingness than it will pop into a "sensible universe."
 
  • #28
twofish-quant said:
We don't know this to be true, and in fact the latest data suggests that it is not true. If the universe is infinite, then the number of possible configurations of the universe is itself infinite. Imagine a universe in which all of the galaxies are 1 Mpc from each other. Then 2 Mpc. Then 3 Mpc. Then 4 Mpc. You end up with an infinite number of configurations. If the universe in fact is accelerating, then you will never have a recurrence.

I have this deep feeling that all this has something to do with the accelerating universe, and that someone can come up with an argument that any comprehensible universe has to be accelerating.
All you need to do is demonstrate that the number of possible configurations within a specific horizon is finite. You could potentially get out of this by supposing that the cosmological constant is identically zero. But that doesn't appear to be the case.
 
  • #29
Constantin said:
As far as I know, there's also a second arrow of time, the matter/anti-matter imbalance and the processes that create it, which is also unidirectional.

How do you handle this additional arrow of time and still have cycles ?
How is that a second arrow of time?
 
  • #30
twofish-quant said:
And then there is Boltzmann's brain. In a closed, finite universe, everything will happen eventually, but some things will happen before others. In particular, it's far more likely that your brain will pop into being in a sea of nothingness than it will pop into a "sensible universe."
Yes. The Boltzmann Brain argument demonstrates that in order to explain our own existence, states like our own must occur much more frequently than one would naively expect from the trivial statistical recurrence argument.
 
  • #31
Chalnoth said:
How is that a second arrow of time?

I got that definition from Wikipedia and it makes sense to me.

http://en.wikipedia.org/wiki/Arrow_of_time

Quote from that page:
"The particle physics (weak) arrow of time
Certain subatomic interactions involving the weak nuclear force violate the conservation of both parity and charge conjugation, but only very rarely. An example is the kaon decay [1]. According to the CPT Theorem, this means they should also be time irreversible, and so establish an arrow of time. Such processes should be responsible for matter creation in the early universe.
This arrow is not linked to any other arrow by any proposed mechanism, and if it would have pointed to the opposite time direction, the only difference would have been that our universe would be made of anti-matter rather than from matter. More accurately, the definitions of matter and anti-matter would just be reversed."
 
  • #32
Constantin said:
I get that definition from wikipedia and it makes sense to me.

http://en.wikipedia.org/wiki/Arrow_of_time

Quote from that page:
"The particle physics (weak) arrow of time
Certain subatomic interactions involving the weak nuclear force violate the conservation of both parity and charge conjugation, but only very rarely. An example is the kaon decay [1]. According to the CPT Theorem, this means they should also be time irreversible, and so establish an arrow of time. Such processes should be responsible for matter creation in the early universe.
This arrow is not linked to any other arrow by any proposed mechanism, and if it would have pointed to the opposite time direction, the only difference would have been that our universe would be made of anti-matter rather than from matter. More accurately, the definitions of matter and anti-matter would just be reversed."
Yes, CP violation leads to a time asymmetry. However, you still have the CPT symmetry. So it's not that this is a "new" arrow of time, it's just that at a microscopic level, the time symmetry looks a bit different, in that you have to use the whole of the CPT symmetry, which so far as we know is absolute.
 
  • #33
Chalnoth said:
All you need to do is demonstrate that the number of possible configurations within a specific horizon are finite.

I don't see how this helps you. If you have an expanding universe, and you have matter that goes outside of the event horizon then it's gone forever. If you calculate the time for things to recur versus the time it takes for it to wander outside of the observable universe and be lost forever, I think you'll find the first number is much higher.
 
  • #34
twofish-quant said:
I don't see how this helps you. If you have an expanding universe, and you have matter that goes outside of the event horizon then it's gone forever. If you calculate the time for things to recur versus the time it takes for it to wander outside of the observable universe and be lost forever, I think you'll find the first number is much higher.
Doesn't matter. All of the possible degrees of freedom are represented within the single event horizon. Every observer will be within some event horizon. And that observer will see one of the (finite) possible configurations. Worrying about multiple event horizons just means you're counting the same configurations multiple times.
 
  • #35
Chalnoth said:
Yes. The Boltzmann Brain argument demonstrates that in order to explain our own existence, states like our own must occur much more frequently than one would naively expect from the trivial statistical recurrence argument.

I have this suspicion that you can show through Boltzmann's Brain arguments that any comprehensible universe must have some period of inflationary expansion, but I haven't worked out the details.
 
