Are cycles and entropy compatible?

Summary
The discussion centers on the compatibility of cycles and entropy within the framework of the second law of thermodynamics. It explores the idea that while entropy tends to increase in isolated systems, there may be scenarios, particularly over long timescales or in small systems, where entropy can both increase and decrease. Participants argue that if the universe operates in cycles, it could theoretically return to a low-entropy state, challenging the notion that entropy must always increase. The conversation also touches on the complexities of defining entropy in the context of gravitational fields and quantum effects during cosmic events like bounces. Ultimately, the dialogue highlights ongoing debates in physics regarding the nature of entropy and its implications for cosmology.
  • #61
Entropy increase is not an absolute, as several people have stated. The existence of cycles in states demonstrates that entropy can increase and decrease.
But entropy is still a useful and important concept on smaller time scales.
 
  • #62
jambaugh said:
Again this goes back to understanding that entropy is not about disorder vs order but about empirically defined ignorance vs knowledge about system state.

Maybe. That's why the black hole information paradox is interesting. You toss some stuff into a black hole. Within a finite time, classical GR says that it gets crushed to a singularity. Now the question is: is the information still there or not? Some people (namely Hawking) believe that black holes in fact destroy information, so it's not merely a matter of being ignorant of the internal state of the black hole; the black hole simply has no internal states. Others (namely Susskind) disagree.

This matters now that it appears we have a positive cosmological constant because the event horizon at the edge of the universe has the same issues as the event horizon at the edge of a black hole.

One reason I find this fascinating is that it turns out that you can figure out a lot about quantum mechanics from classical thermodynamics, and it turns out that QM resolves the "Gibbs paradox." Trying to figure out whether the universe can really destroy information gives us some clues as to what quantum gravity looks like.

The cyclic nature of the system is immaterial to the entropy because entropy is not about the actual system state but about our knowledge of it.

With springs and pendulums, you can argue this. With black holes, though, what entropy means is not clear. It has to mean something. One thing that just won't work is a black hole that is really black: if you threw something into a black hole and nothing came out except gravity, you could show that this violates thermodynamics.

You can also get anthropic. One thing that you can argue (and I think Max Tegmark argues this) is that in order to have a comprehensible universe, you need an "arrow of time." It could very well be that there are an infinite number of universes in which the laws of physics are such that the second law of thermodynamics does not hold, but it's difficult to see how you can have intelligence without an arrow of time.
 
  • #63
bill alsept said:
However it's defined, I still don't understand how entropy can increase any more than it decreases unless something is being added to the system.

You can think of entropy as anti-information. I have a 500 MB CD-ROM with old pictures on it. If I take a hammer to that CD-ROM, I've destroyed those pictures and increased the entropy of the world by 500 MB (and yes, you can measure entropy and do thermodynamics in megabytes). Even if I just leave that CD-ROM in a cupboard by itself, it will spontaneously decay, and if I leave it long enough, the pictures on it will be lost.

The opposite doesn't happen. If I leave a blank CD in the cupboard, I don't expect my photo album to spontaneously appear on it. If you turn off your computer, you expect to lose whatever work you had on it, but you don't expect that starting from a blank computer you'll end up with the complete works of Shakespeare.

Also, the thermodynamics of information is a very active area of physics research. One thing about computers is that they get hot, which is annoying when you are trying to run a laptop. It turns out that some of the limits on how cool you can run a laptop come from a fundamental connection between heat and information: erasing data increases the entropy of the world, which produces heat. So part of the reason laptops run hot is that they do a lot of calculations; every time something gets erased in the CPU or memory, entropy increases, and that increase in entropy corresponds to an increase in heat.

Conversely, one tried and true way of erasing information is to burn it.
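The erase-data-produce-heat connection described above is Landauer's principle: erasing one bit dissipates at least kT ln 2 of heat. A minimal sketch reusing the 500 MB figure from the post (room temperature is an assumption here; real hardware dissipates vastly more than this floor):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def landauer_heat(n_bits: float, temperature: float = 300.0) -> float:
    """Minimum heat in joules dissipated by erasing n_bits at the given temperature."""
    return n_bits * K_B * temperature * math.log(2)

# Erasing the 500 MB CD-ROM's worth of pictures at room temperature:
bits = 500e6 * 8          # 500 MB expressed in bits
q_min = landauer_heat(bits)
print(f"{q_min:.2e} J")   # on the order of 1e-11 J
```

The Landauer floor is tiny compared to what a real CPU dissipates, which is why the heat-information link only starts to bind at the extreme low-power end of computing.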
 
  • #64
marcus said:
You need things in order to be able to define the entropy----like microstates and a map of the macrostate regions that corresponds to what someone can measure.

You can define entropy "bottom up." You can also define entropy "top down": define temperature as what you measure when you put a thermometer in the system, and define entropy as a function of the energy you put into the system versus how much the temperature changes.

That's entropy.

Now it's not obvious that this has anything to do with randomness, but the cool thing about physics is how some non-obvious things are actually related.
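That "top down" recipe can be sketched numerically. For a process at constant heat capacity C, tracking how much energy goes in versus how the temperature changes gives ΔS = ∫ C dT/T = C ln(T2/T1). The heat-capacity value below is a standard tabulated figure for liquid water, used purely for illustration:

```python
import math

def entropy_change(heat_capacity: float, t_initial: float, t_final: float) -> float:
    """Delta S = integral of (C/T) dT for a constant heat capacity, in J/K."""
    return heat_capacity * math.log(t_final / t_initial)

# Warming 1 mol of liquid water (C_p ~ 75.3 J/(mol K)) from 280 K to 300 K:
dS = entropy_change(75.3, 280.0, 300.0)
print(f"{dS:.2f} J/K")  # ~5.2 J/K
```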
 
  • #65
bill alsept said:
Any system will have some changing level of entropy. My dispute is with those who believe entropy will only increase. A cycle will return to its starting point, or back to its lowest level of entropy. So if you believe in cycles, how can entropy always increase?

I don't think it can.

But I think a lot of this has to do with conflicting definitions of entropy. There's a theoretical statistical-mechanical definition and an observational thermodynamic definition, and I think they end up in conflict.
 
  • #66
bill alsept said:
Of course we can always define entropy and sometimes measure it. Any system will have some changing level of entropy. My dispute is with those who believe entropy will only increase. A cycle will return to its starting point, or back to its lowest level of entropy. So if you believe in cycles, how can entropy always increase?

Your reasoning here is the same reasoning that goes into "a system's entropy is always zero since it is always in some single fixed state"... even if we don't know what that state is.

The fact that the system cycles to its initial condition does not imply the return to lower entropy. Consider... if you allow a system to evolve from a low entropy state in the deterministic way required by the assumptions of cyclic behavior, you know the entropy cannot change. The whole cycle is by virtue of being a cycle reversible. You are in particular thinking in terms of zero entropy systems.

Remember also that when you are thinking in terms of such cyclic systems the dynamics itself is an external constraint. All I need to break the cyclic assumption is some non-periodic time variation in the dynamics. How that relates to the entropy then is in the fact that via coupling to the dynamics the system couples to the external world...
for a gas in a box there's the box's walls, for a mass and spring there is the spring's mounting point. For a freely falling vibrating elastic body there's still gravitational coupling to the rest of the universe...

Now for the universe as a whole (in so far as such a thing can be meaningfully defined as a physical system), you're perfectly free to say it cycles over some hyper-astronomical period, and I'll assert its entropy is zero, invoking QM, the sub-additivity of entropy, and a definition of entropy as entanglement with one's environment.
 
  • #67
twofish-quant said:
But I think a lot of this has to do with conflicting definitions of entropy. There's a theoretical statistical-mechanical definition and an observational thermodynamic definition, and I think they end up in conflict.

The conflict is only apparent. If you carefully parse the operational meaning of each definition you find compatibility (provided of course you use consistent physical assumptions.)
 
  • #68
The point of my original question was not so much to try and define entropy. There are some who say that the universe cannot cycle because they believe entropy (no matter how you define it) will increase so much that the cycle disintegrates.
 
  • #69
bill alsept said:
The point of my original question was not so much to try and define entropy. There are some who say that the universe cannot cycle because they believe entropy (no matter how you define it) will increase so much that the cycle disintegrates.

I'm curious who those people are since the consensus in this discussion seems to be that this statement is incorrect.
 
  • #70
bill alsept said:
The point of my original question was not so much to try and define entropy. There are some who say that the universe cannot cycle because they believe entropy (no matter how you define it) will increase so much that the cycle disintegrates.

Given the sub-additivity of quantum entropy, this objection needn't be applicable. We may observe parts of the universe increasing in entropy without the entropy of the whole changing... this occurs as the parts entangle over time.

Along those same lines trying to define "the entropy of the universe" by adding the entropies of parts (e.g. by integrating an entropy density over the spatial universe) is not appropriate as it does not take into account spatially separated quantum correlations.

Again you can understand entropy of a system as the amount to which that system is entangled with the rest of the universe... and define the entropy of the whole universe as fixed and equal to zero since there is nothing external to which it is entangled. Now the visible universe, on the other hand... (i.e. the universe outside the interiors of the many BH's floating around).
 
  • #71
jambaugh said:
Given the sub-additivity of quantum entropy, this objection needn't be applicable. We may observe parts of the universe increasing in entropy without the entropy of the whole changing... this occurs as the parts entangle over time.

This doesn't make sense to me. I have a cup of ice. There is an entropy associated with that cup of ice (41 J/(mol K)). Now if I watch a cup of ice melt into water and the entropy is now 70 J/(mol K), are you trying to tell me that because of some weird quantum entanglement effect the entropy of the universe is constant? That doesn't make sense to me. Now you may be able to define some quantity that does stay constant, but that doesn't seem to have any connection with what an engineer would call entropy.

Along those same lines trying to define "the entropy of the universe" by adding the entropies of parts (e.g. by integrating an entropy density over the spatial universe) is not appropriate as it does not take into account spatially separated quantum correlations.

Doesn't make sense to me. I'm watching ice with an entropy of 41 J/(mol K) turn into water with an entropy of 70 J/(mol K). What's it quantum-entangled with? Where is the quantum correlation? I'm watching ice melt, with no quantum entanglements that I can see.

What you seem to be saying is that anytime ice melts, then there is some weird quantum mechanical effect that causes something weird to happen in some other part of universe.

I really don't think this makes sense.

Again you can understand entropy of a system as the amount to which that system is entangled with the rest of the universe...

I'm just watching ice melt. Are you saying that I can't understand ice melting without quantum entanglements?
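As a cross-check on the numbers in this exchange: the fusion step by itself follows from the latent heat via ΔS = L/T_m, and the quoted jump from 41 to 70 J/(mol K) is larger than that because those two tabulated values refer to different temperatures. A quick sketch (the latent heat is a standard tabulated value, not from the thread):

```python
# Entropy of fusion of ice from the latent heat, Delta S = L / T_m.
L_FUSION = 6010.0   # J/mol, latent heat of fusion of ice (standard value)
T_MELT = 273.15     # K

dS_fusion = L_FUSION / T_MELT
print(f"{dS_fusion:.1f} J/(mol K)")  # ~22.0 J/(mol K)
```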
 
  • #72
twofish-quant said:
This doesn't make sense to me. I have a cup of ice. There is an entropy associated with that cup of ice (41 J/(mol K)). Now if I watch a cup of ice melt into water and the entropy is now 70 J/(mol K), are you trying to tell me that because of some weird quantum entanglement effect the entropy of the universe is constant?
Yes
What you seem to be saying is that anytime ice melts, then there is some weird quantum mechanical effect that causes something weird to happen in some other part of universe.
Not that something weird happens in another part of the universe; in fact the rest of the universe excluding the melting ice also has its entropy go up. But as the ice melts, it is interacting with everything else, and there are correlations between the ice's state and the "rest-of-the-universe"'s state. These correlations, when included in the calculation of the entropy of the universe-as-a-whole, make that entropy smaller than what you get by just adding up the calculated entropy of each part.
I'm just watching ice melt. Are you saying that I can't understand ice melting without quantum entanglements?
Of course not. Likewise we don't need QM to track the Moon's orbit. And as long as one is talking about some piece of the universe (which necessarily excludes the mechanisms used to observe that piece) then you needn't worry about this.

Now you find this idea perplexing and counter-intuitive. Well it is, as is so much in QM. But you can "do the math". Take a system of two particles which are maximally entangled, say two spin-1/2 particles in a sharp spin-0 composite state. Now let one of the particles reside in your "system" and shoot the other one into space, or better yet into a Black Hole.

Since the composite is sharply defined it has zero entropy.
To describe one of the particles alone, you must do a partial trace over the other, and you get the maximum-entropy density matrix diag(1/2, 1/2). Its entropy = k ln(2). (k = Boltzmann's constant.)

Likewise, if you were to determine the entropy of the other half of the entangled pair you'd get k ln(2). Add them and you get 2k ln(2), but that's not the entropy of the composite, which is rather 0. Entropy doesn't "add up" in QM.
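The partial-trace arithmetic above is easy to verify numerically. A minimal NumPy sketch (entropy in units of k, so the expected answer is ln 2):

```python
import numpy as np

# Singlet state |psi> = (|01> - |10>)/sqrt(2) in the basis {|00>, |01>, |10>, |11>}.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rho = np.outer(psi, psi)  # pure composite state: its von Neumann entropy is 0

# Partial trace over the second particle gives the first particle's density matrix.
rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

eigs = np.linalg.eigvalsh(rho_A)
S_A = -sum(p * np.log(p) for p in eigs if p > 1e-12)  # entropy in units of k
print(rho_A)   # diag(1/2, 1/2): the maximally mixed single-particle state
print(S_A)     # ln 2 ~ 0.693
```

Tracing out the partner turns a zero-entropy pure state into a maximally mixed one, which is exactly the non-additivity being described.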

Finally let me mention that when you have an entangled pair, it typically undergoes decoherence due to interaction with the environment. But that's just the entanglement "swapping" to other systems. The particle in hand soon becomes entangled with photons (which have interacted with the entangled partner) shooting off into space at speed c, never to be recovered. That's where the irreversibility comes into play.

Similarly if you consider a high entropy system and wish to "refrigerate" it to lower entropy you are basically swapping the entanglements between your lab system and far flung photons to entanglement between your heat sink and far flung photons. Your system's entropy goes down... and remarkably the entropy of all but your system goes down since there are now more correlations within the exosystem.

Ultimately, according to this view of entropy, at any given time the entropy of a system is equal to the entropy of all the universe minus that system, which is to say both represent the same quantity: the amount of entanglement across the boundary between them.

Weird indeed! No?
 
  • #73
I don't really see how this view works. It seems to be relying upon a tautology: when I define the macrostate as the microstate, the entropy is zero. What we do in reality is very different. The macrostate is defined as a set of observables that are due to the collective behavior of a large number of degrees of freedom. And when you have that sort of situation, you very much can talk about overall changes in entropy, whether you're talking about a quantum-mechanical system or not.

A simple example here is that of an evaporating black hole in de Sitter space-time: due to the horizons, we have definite definitions of the entropy of the black hole as it is evaporating and after the evaporation. This is a fully-quantum system, and the total entropy definitively increases. It increases because the macrostates we're considering (the horizon areas) are composed of a tremendous number of quantum degrees of freedom.
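For a sense of scale, the horizon entropy being invoked here is stupendous. A sketch of the Bekenstein-Hawking formula, S/k = 4πGM²/(ħc), evaluated for a solar-mass black hole (the constants are standard values; the choice of a solar mass is an illustrative assumption):

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
HBAR = 1.0546e-34  # reduced Planck constant, J s
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg

def bh_entropy_over_k(mass: float) -> float:
    """Bekenstein-Hawking entropy S/k = 4*pi*G*M^2 / (hbar*c), dimensionless."""
    return 4 * math.pi * G * mass**2 / (HBAR * C)

print(f"{bh_entropy_over_k(M_SUN):.2e}")  # ~1e77, dwarfing any lab-scale entropy
```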
 
  • #74
bill alsept said:
Entropy is that nature tends from order to disorder in an isolated system. If a cycle always comes back to where it started, wouldn't entropy decrease as well as increase, over and over again? With that in mind, and the fact that there really are no examples of an isolated system, can the 2nd law of thermo hold up?

Entropy does not say that nature tends from order to disorder.

A non-dissipative cycle is in agreement with the second law, because the second law also covers isentropic evolutions.

Moreover, the second law is independent of the boundary conditions and also holds for non-isolated systems. For open and closed systems the second law predicts a non-negative production of (average) entropy, which is perfectly verified.
 
  • #75
I can agree with average. What goes up must come down.
 
  • #76
bill alsept said:
I can agree with average. What goes up must come down.
There are situations where it's true, situations where it isn't. That particular aphorism most certainly does not apply to everything.
 
  • #77
bill alsept said:
I can agree with average. What goes up must come down.

I mean that, contrary to a common misconception (also on display in this thread), the second law is not a statistical law that is violated in fluctuations, because it makes predictions about the average entropy <S>, not about the fluctuating entropy.

The same is true in other parts of physics: Maxwell's laws make predictions about the average electric and magnetic fields <E>, <B>, not about their fluctuations, just as Einstein's G_ab = T_ab is in reality a statement about the averages <G_ab> = <T_ab>.
 
  • #78
juanrga said:
I mean that, contrary to a common misconception (also on display in this thread), the second law is not a statistical law that is violated in fluctuations, because it makes predictions about the average entropy <S>, not about the fluctuating entropy.

The same is true in other parts of physics: Maxwell's laws make predictions about the average electric and magnetic fields <E>, <B>, not about their fluctuations, just as Einstein's G_ab = T_ab is in reality a statement about the averages <G_ab> = <T_ab>.
I don't know what you're trying to say here, but the second law of thermodynamics, as we understand it from statistical mechanics, is an approximate law that is accurate except on very small scales or for very long timescales.

That is to say, no matter the size of your system, if you wait for long enough you will see significant deviations from the second law. Similarly, if you are only willing to wait a fixed amount of time, you will see deviations from the second law if you look at very small systems.
 
  • #79
Chalnoth said:
I don't know what you're trying to say here, but the second law of thermodynamics, as we understand it from statistical mechanics, is an approximate law that is accurate except on very small scales or for very long timescales.

That is to say, no matter the size of your system, if you wait for long enough you will see significant deviations from the second law. Similarly, if you are only willing to wait a fixed amount of time, you will see deviations from the second law if you look at very small systems.

I was correcting a misconception that appears too often in misguided literature by non-experts.

Using statistical mechanics we can show that the laws of Newton, of Maxwell, of kinetics, of hydrodynamics, and the Hilbert-Einstein 'field' equations are only valid in an average sense. Evidently nobody would say that those laws are «only statistical laws violated by fluctuations», because all those laws refer only to the average and, therefore, say nothing about the fluctuations.

Introducing fluctuations we can generalize Newton, Maxwell, kinetics, hydrodynamics, and even GR. For example, fluctuating hydrodynamics extends the equations of hydrodynamics by incorporating fluctuations in density, pressure, velocity...

The same goes for thermodynamics: the variation of the fluctuating entropy S in an isolated system is given by

$$\frac{dS}{dt} = \frac{d\langle S\rangle}{dt} + \frac{d\delta S}{dt}$$

The second law is a statement about ##d\langle S\rangle/dt##, not about the fluctuation. The fluctuation term is studied using the thermodynamic theory of fluctuations, which says that ##d\delta S/dt## can be positive, negative, or zero. Therefore a measurement of a fluctuation does not invalidate thermodynamics.

Misguided literature by non-experts confounds ##S## with ##\langle S\rangle## and makes the incorrect claim that you repeat.

You should also take a look at http://arxiv.org/abs/cond-mat/0207587 and how fluctuations are in perfect agreement with thermodynamic laws.
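The <S>-versus-fluctuating-S distinction can be illustrated with a toy Ehrenfest-urn model (a sketch of my own construction, not taken from the cited preprint): particles hop randomly between two halves of a box, the Boltzmann entropy ln Ω is tracked each step, and after the average settles near its equilibrium value the instantaneous entropy still fluctuates down as well as up.

```python
import math
import random

random.seed(0)

def boltzmann_entropy(n_left: int, n_total: int) -> float:
    """S/k = ln of the number of microstates with n_left particles in the left half."""
    return math.log(math.comb(n_total, n_left))

N = 50
n_left = N  # low-entropy start: every particle on the left
entropies = []
for _ in range(2000):
    # Ehrenfest urn: each step, one randomly chosen particle hops to the other side.
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
    entropies.append(boltzmann_entropy(n_left, N))

late = entropies[1000:]  # after equilibration
print(sum(late) / len(late))   # average entropy, near the maximum ln C(50, 25)
print(max(late) - min(late))   # nonzero: the instantaneous S keeps fluctuating
```

The downward wiggles in the second half of the run are exactly the ##d\delta S/dt < 0## fluctuations being discussed; the running average stays put.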
 
  • #80
Whoever said the contrary?

It was just emphasized that there is nothing inconsistent between the 2nd law and having negative fluctuations in entropy.

I believe there are rigorous theorems about fluctuations as well, to the extent that one can show (for a variety of types of systems) that the positive fluctuations are far more probable than the negative ones (thus proving the 2nd law).
 
  • #81
juanrga said:
Using statistical mechanics we can show that the laws of Newton, of Maxwell, of kinetics, of hydrodynamics, and the Hilbert-Einstein 'field' equations are only valid in an average sense. Evidently nobody would say that those laws are «only statistical laws violated by fluctuations», because all those laws refer only to the average and, therefore, say nothing about the fluctuations.
Why not? It's true.

And anyway, statistical mechanics is a bit different here, at least on very long timescales, because of Poincare recurrence.

juanrga said:
The same goes for thermodynamics: the variation of the fluctuating entropy S in an isolated system is given by

$$\frac{dS}{dt} = \frac{d\langle S\rangle}{dt} + \frac{d\delta S}{dt}$$

The second law is a statement about ##d\langle S\rangle/dt##, not about the fluctuation. The fluctuation term is studied using the thermodynamic theory of fluctuations, which says that ##d\delta S/dt## can be positive, negative, or zero. Therefore a measurement of a fluctuation does not invalidate thermodynamics.
When ##d\delta S/dt## is of the same order as ##d\langle S \rangle/dt##, which happens on very long timescales, most would consider that a violation of the second law.

I think the thing that you're doing here is redefining the second law of thermodynamics, which was originally devised empirically with no connection whatsoever to statistical mechanics, into a different form that is consistent with statistical mechanics. But this is a new second law, it is not the second law that most everybody writes down.

Of course, expanding the second law of thermodynamics to be consistent with statistical mechanics is a relatively small change. But it is still a change.
 
  • #82
Chalnoth said:
Why not?

Explained in the same quote that you cite.

Chalnoth said:
When ##d\delta S/dt## is of the same order as ##d\langle S \rangle/dt##, which happens on very long timescales, most would consider that a violation of the second law.

I think the thing that you're doing here is redefining the second law of thermodynamics, which was originally devised empirically with no connection whatsoever to statistical mechanics, into a different form that is consistent with statistical mechanics. But this is a new second law, it is not the second law that most everybody writes down.

Of course, expanding the second law of thermodynamics to be consistent with statistical mechanics is a relatively small change. But it is still a change.

When ##d\delta S/dt## is of the same order as ##d\langle S \rangle/dt##, the second law is not violated, because its prediction for ##d\langle S \rangle/dt## continues unchanged. The second law says nothing about ##d\delta S/dt##; therefore any measurement of that term cannot violate the law.

Classical thermodynamics and its second law were always about average quantities (just as GR, Maxwell, hydrodynamics... were always about averages as well). The thermodynamic treatment of fluctuations was initiated around 1920.

The introduction of fluctuations is not «expanding the second law of thermodynamics» as you claim. Of course, the second law remains unchanged by the thermodynamic theory of fluctuations.

It is only people who have never studied thermodynamics beyond a basic course (or even less than that) who are seriously confused about thermodynamics and make misguided claims. Read the arXiv preprint again, especially the part:

It remains to stress that none of the formulations of the second law known to us ever claimed that unaveraged entropy production or unaveraged work must be positive; see e.g. Refs. 4-10.
 
  • #83
juanrga said:
Explained in the same quote that you cite.



When d\delta S/dt is of the same order as d\langle S \rangle/dt, the second law is not violated because its prediction for d\langle S \rangle/dt continues unchanged. The second law says nothing about d\delta S/dt, therefore any measurement of that term cannot violate the law.
If you take that stance, then nothing can possibly violate the second law.
 
