Are cycles and entropy compatible?

In summary, the second law of thermodynamics is an approximation to the true underlying behavior and can only be applied to systems that begin in a low-entropy state. While it states that entropy always increases, this need not hold over very long timescales or for small, closed systems. The concept of entropy also becomes ill-defined at the bounce of a universe, and the entropy of the gravitational field and of geometry is still being studied. Additionally, in LQG "bounce" cosmology it is argued that the entropy of the universe may not always increase, because quantum effects cause gravity to become repulsive at high densities.
  • #71
jambaugh said:
Given the sub-additivity of quantum entropy, this objection needn't be applicable. We may observe parts of the universe increasing in entropy without the entropy of the whole changing... this occurs as the parts entangle over time.

This doesn't make sense to me. I have a cup of ice. There is an entropy associated with that cup of ice (41 J/(mol K)). Now if I watch the cup of ice melt into water and the entropy is now 70 J/(mol K), are you trying to tell me that because of some weird quantum entanglement effect the entropy of the universe is constant? That doesn't make sense to me. Now you may be able to define some quantity that does stay constant, but that doesn't seem to have any connection with what an engineer would call entropy.
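(As a rough sanity check on those numbers, using the handbook values [itex]\Delta H_{fus} \approx 6.0[/itex] kJ/mol and [itex]C_p \approx 75[/itex] J/(mol K) for liquid water, which aren't quoted anywhere in this thread:

[tex]\Delta S_{fus} = \frac{\Delta H_{fus}}{T_{fus}} \approx \frac{6010\ \text{J/mol}}{273\ \text{K}} \approx 22\ \text{J/(mol K)}, \qquad \Delta S_{warm} \approx C_p \ln\frac{298}{273} \approx 7\ \text{J/(mol K)},[/tex]

so ice at about 41 J/(mol K) near 273 K ends up at roughly 41 + 22 + 7 ≈ 70 J/(mol K) as room-temperature water.)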

Along those same lines trying to define "the entropy of the universe" by adding the entropies of parts (e.g. by integrating an entropy density over the spatial universe) is not appropriate as it does not take into account spatially separated quantum correlations.

Doesn't make sense to me. I'm watching ice with an entropy of 41 J/(mol K) turn into water with an entropy of 70 J/(mol K). What's it quantum entangled with? Where is the quantum correlation? I'm watching ice melt. No quantum entanglements that I can see.

What you seem to be saying is that anytime ice melts, there is some weird quantum mechanical effect that causes something weird to happen in some other part of the universe.

I really don't think this makes sense.

Again, you can understand the entropy of a system as the degree to which that system is entangled with the rest of the universe...

I'm just watching ice melt. Are you saying that I can't understand ice melting without quantum entanglements?
 
  • #72
twofish-quant said:
This doesn't make sense to me. I have a cup of ice. There is an entropy associated with that cup of ice (41 J/(mol K)). Now if I watch the cup of ice melt into water and the entropy is now 70 J/(mol K), are you trying to tell me that because of some weird quantum entanglement effect the entropy of the universe is constant?
Yes
What you seem to be saying is that anytime ice melts, there is some weird quantum mechanical effect that causes something weird to happen in some other part of the universe.
Not that something weird happens in the other part of the universe; in fact the rest of the universe, excluding the melting ice, also has its entropy go up. But as the ice melts, it is interacting with everything else and there are correlations between the ice's state and the rest-of-the-universe's state. These correlations, when included in the calculation of the entropy of the universe-as-a-whole, reduce it below what you get by just adding the calculated entropy of each part.
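In symbols, this is just the subadditivity mentioned above: for any split of a quantum state into the ice A and the rest of the universe B,

[tex]S(A) + S(B) - S(AB) = I(A\!:\!B) \geq 0,[/tex]

where [itex]I(A\!:\!B)[/itex] is the mutual information measuring the correlations (the notation is mine, not quoted from anyone above). The sum of the parts' entropies exceeds the entropy of the whole by exactly the amount of correlation between them.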
I'm just watching ice melt. Are you saying that I can't understand ice melting without quantum entanglements?
Of course not. Likewise we don't need QM to track the Moon's orbit. And as long as one is talking about some piece of the universe (which necessarily excludes the mechanisms used to observe that piece) then you needn't worry about this.

Now you find this idea perplexing and counter-intuitive. Well it is, as is so much in QM. But you can "do the math". Take a system of two particles which are maximally entangled, say two spin-1/2 particles in a sharp spin-0 composite state. Now let one of the particles reside in your "system" and shoot the other one into space, or better yet into a Black Hole.

Since the composite is sharply defined it has zero entropy.
To describe one of the particles alone you must do a partial trace over the other, and you get the maximum-entropy density matrix diag(1/2, 1/2). Its entropy is k ln(2) (k = Boltzmann's constant).

Likewise, if you were to determine the entropy of the other half of the entangled pair you'd get k ln(2). Add them and you get 2k ln(2), but that's not the entropy of the composite, which is 0. Entropy doesn't "add up" in QM.
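If you want to check this arithmetic numerically, here is a minimal sketch in plain NumPy (my own illustration of the calculation just described, not anyone's official code):

[code]
import numpy as np

def von_neumann_entropy(rho):
    """Entropy S = -Tr(rho ln rho) in units of Boltzmann's constant k."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop zero eigenvalues (0 ln 0 = 0)
    return -np.sum(evals * np.log(evals))

# Singlet (spin-0) state of two spin-1/2 particles: (|01> - |10>)/sqrt(2)
psi = np.zeros(4)
psi[1], psi[2] = 1/np.sqrt(2), -1/np.sqrt(2)
rho_AB = np.outer(psi, psi.conj())        # pure composite state

# Reduced state of particle A: partial trace over particle B
rho_A = rho_AB.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

print(von_neumann_entropy(rho_AB))  # ~0 : the composite is pure, zero entropy
print(rho_A)                        # diag(1/2, 1/2)
print(von_neumann_entropy(rho_A))   # ~0.693 = ln 2 : each part alone has entropy k ln 2
[/code]

The composite pure state has (numerically) zero entropy, while each reduced half carries ln 2 ≈ 0.693 in units of k.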

Finally let me mention that when you have an entangled pair, it typically undergoes decoherence due to interaction with the environment. But that's just the entanglement "swapping" to other systems. The particle in hand soon becomes entangled with photons (which have interacted with the entangled partner) shooting off into space at speed c, never to be recovered. That's where the irreversibility comes into play.

Similarly if you consider a high entropy system and wish to "refrigerate" it to lower entropy you are basically swapping the entanglements between your lab system and far flung photons to entanglement between your heat sink and far flung photons. Your system's entropy goes down... and remarkably the entropy of all but your system goes down since there are now more correlations within the exosystem.

Ultimately, according to this view of entropy, at any given time the entropy of a system is equal to the entropy of the rest of the universe (the universe minus that system), which is to say both represent the same quantity: the amount of entanglement across the boundary between them.

Weird indeed! No?
 
  • #73
I don't really see how this view works. It seems to be relying upon a tautology: when I define the macrostate as the microstate, the entropy is zero. What we do in reality is very different. The macrostate is defined as a set of observables that are due to the collective behavior of a large number of degrees of freedom. And when you have that sort of situation, you very much can talk about overall changes in entropy, whether you're talking about a quantum-mechanical system or not.

A simple example here is that of an evaporating black hole in de Sitter space-time: because of the horizons, we have well-defined entropies for the black hole as it evaporates and after the evaporation. This is a fully quantum system, and the total entropy definitively increases. It increases because the macrostates we're considering (the horizon areas) are composed of a tremendous number of quantum degrees of freedom.
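(For reference, the standard horizon-area entropy being invoked here is

[tex]S = \frac{k_B c^3 A}{4 G \hbar},[/tex]

i.e. a quarter of the horizon area in Planck units, applied to the black hole horizon before and during evaporation and to the de Sitter horizon afterwards; the formula is the usual Bekenstein-Hawking/Gibbons-Hawking one, not something specific to this argument.)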
 
  • #74
bill alsept said:
Entropy is that nature tends from order to disorder in an isolated system. If a cycle always comes back to where it started, wouldn't entropy decrease as well as increase over and over again? With that in mind, and given that there really are no examples of an isolated system, can the 2nd law of thermo hold up?

The second law does not say that nature tends from order to disorder.

A non-dissipative cycle is in agreement with the second law, because the second law also allows for isentropic evolutions.

Moreover, the second law is independent of the boundary conditions and also holds for non-isolated systems. For open and closed systems the second law predicts a non-negative production of (average) entropy, which is perfectly verified.
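Explicitly, in the usual de Donder/Prigogine notation, the entropy change of a non-isolated system splits into an exchange term and a production term,

[tex]dS = d_e S + d_i S, \qquad d_i S \geq 0,[/tex]

where [itex]d_e S[/itex] is the entropy exchanged with the surroundings (which can have either sign) and [itex]d_i S[/itex] is the entropy produced inside the system. The second law constrains only the production term.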
 
  • #75
I can agree with average. What goes up must come down.
 
  • #76
bill alsept said:
I can agree with average. What goes up must come down.
There are situations where it's true, situations where it isn't. That particular aphorism most certainly does not apply to everything.
 
  • #77
bill alsept said:
I can agree with average. What goes up must come down.

I mean that, contrary to a common misconception (also shown in this thread), the second law is not a statistical law that is violated by fluctuations, because it makes predictions about the average entropy <S>, not about the fluctuating entropy.

This is true for any other part of physics: Maxwell's laws make predictions about the average electric and magnetic fields <E>, <B>, not about their fluctuations, just as Einstein's G_ab = T_ab is in reality a statement about the averages <G_ab> = <T_ab>.
 
  • #78
juanrga said:
I mean that, contrary to a common misconception (also shown in this thread), the second law is not a statistical law that is violated by fluctuations, because it makes predictions about the average entropy <S>, not about the fluctuating entropy.

This is true for any other part of physics: Maxwell's laws make predictions about the average electric and magnetic fields <E>, <B>, not about their fluctuations, just as Einstein's G_ab = T_ab is in reality a statement about the averages <G_ab> = <T_ab>.
I don't know what you're trying to say here, but the second law of thermodynamics, as we understand it from statistical mechanics, is an approximate law that is accurate except on very small scales or for very long timescales.

That is to say, no matter the size of your system, if you wait for long enough you will see significant deviations from the second law. Similarly, if you are only willing to wait a fixed amount of time, you will see deviations from the second law if you look at very small systems.
 
  • #79
Chalnoth said:
I don't know what you're trying to say here, but the second law of thermodynamics, as we understand it from statistical mechanics, is an approximate law that is accurate except on very small scales or for very long timescales.

That is to say, no matter the size of your system, if you wait for long enough you will see significant deviations from the second law. Similarly, if you are only willing to wait a fixed amount of time, you will see deviations from the second law if you look at very small systems.

I was correcting a misconception that appears too often in misguided literature by non-experts.

Using statistical mechanics we can show that the laws of Newton, of Maxwell, of kinetics, of hydrodynamics, and the Hilbert-Einstein 'field' equations are only valid in an average sense. Evidently nobody would say that those laws are «only statistical laws violated by fluctuations», because all those laws refer only to the averages and, therefore, say nothing about the fluctuations.

Introducing fluctuations we can generalize Newton, Maxwell, kinetics, hydrodynamics, and even GR. For example, fluctuating hydrodynamics extends the equations of hydrodynamics by incorporating fluctuations in density, pressure, velocity...

The same holds for thermodynamics: the variation of the fluctuating entropy [itex]S[/itex] in an isolated system is given by

[tex]\frac{dS}{dt} = \frac{d\langle S\rangle}{dt} + \frac{d\delta S}{dt}[/tex]

The second law is a statement about [itex]{d\langle S\rangle}/{dt}[/itex] not about the fluctuation. The fluctuation term is studied using the thermodynamic theory of fluctuations which says that [itex]{d\delta S}/{dt}[/itex] can be positive, negative, or zero. Therefore a measurement of a fluctuation does not invalidate thermodynamics.
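As a toy numerical illustration of that distinction (a sketch of my own, not taken from any reference in this thread): in the classic Ehrenfest urn model, the Boltzmann entropy [itex]S/k = \ln \Omega[/itex] of the two-urn macrostate fluctuates down as well as up along a single trajectory, while the ensemble average [itex]\langle S \rangle[/itex] climbs steadily toward its equilibrium value.

[code]
import numpy as np
from math import lgamma

def boltzmann_entropy(n, N):
    """S/k = ln C(N, n): log-multiplicity of the macrostate 'n of N balls in urn A'."""
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

def trajectory(N=50, steps=400, rng=None):
    """One run of the Ehrenfest urn model, starting with all N balls in urn A."""
    if rng is None:
        rng = np.random.default_rng()
    n, S = N, []
    for _ in range(steps):
        # pick one of the N balls at random and move it to the other urn
        n += -1 if rng.random() < n / N else 1
        S.append(boltzmann_entropy(n, N))
    return np.array(S)

rng = np.random.default_rng(0)
runs = np.array([trajectory(rng=rng) for _ in range(2000)])

S_single = runs[0]          # fluctuating entropy S of one trajectory
S_avg = runs.mean(axis=0)   # ensemble average <S> over 2000 trajectories

drops_single = -np.diff(S_single)
drops_avg = -np.diff(S_avg)
print("steps where S decreases in one run :", int((drops_single > 0).sum()), "of", len(drops_single))
print("largest one-step drop, single run  :", round(float(drops_single.max()), 3))
print("largest one-step drop, <S>         :", round(float(drops_avg.max()), 4))
# The single run's entropy dips at many steps (sometimes by a sizable fraction of k),
# while any dips in <S> are at the level of Monte Carlo sampling noise:
# the second law constrains <S>, not the fluctuating S.
[/code]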

Misguided literature by non-experts confounds [itex]S[/itex] with [itex]\langle S\rangle[/itex] and makes the incorrect claim that you repeat.

You should also take a look at http://arxiv.org/abs/cond-mat/0207587 to see how fluctuations are in perfect agreement with the thermodynamic laws.
 
  • #80
Whoever said the contrary?

It was just emphasized that there is nothing inconsistent between the 2nd law and having negative fluctuations in entropy.

I believe there are rigorous theorems about fluctuations as well, to the extent that one can show (for a variety of types of systems) that the positive fluctuations are far more probable than the negative ones (thus proving the 2nd law).
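I believe the cleanest such statement is the detailed fluctuation theorem of stochastic thermodynamics for the total entropy produced over a time interval (quoting it from memory):

[tex]\frac{P(\Delta S_{tot} = +A)}{P(\Delta S_{tot} = -A)} = e^{A/k_B},[/tex]

so negative fluctuations are exponentially suppressed relative to positive ones of the same size, and the integral version [itex]\langle e^{-\Delta S_{tot}/k_B}\rangle = 1[/itex] gives [itex]\langle \Delta S_{tot}\rangle \geq 0[/itex] by Jensen's inequality.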
 
  • #81
juanrga said:
Using statistical mechanics we can show that the laws of Newton, of Maxwell, of kinetics, of hydrodynamics, and the Hilbert-Einstein 'field' equations are only valid in an average sense. Evidently nobody would say that those laws are «only statistical laws violated by fluctuations», because all those laws refer only to the averages and, therefore, say nothing about the fluctuations.
Why not? It's true.

And anyway, statistical mechanics is a bit different here, at least on very long timescales, because of Poincaré recurrence.

juanrga said:
The same holds for thermodynamics: the variation of the fluctuating entropy [itex]S[/itex] in an isolated system is given by

[tex]\frac{dS}{dt} = \frac{d\langle S\rangle}{dt} + \frac{d\delta S}{dt}[/tex]

The second law is a statement about [itex]{d\langle S\rangle}/{dt}[/itex] not about the fluctuation. The fluctuation term is studied using the thermodynamic theory of fluctuations which says that [itex]{d\delta S}/{dt}[/itex] can be positive, negative, or zero. Therefore a measurement of a fluctuation does not invalidate thermodynamics.
When [itex]d\delta S/dt[/itex] is of the same order as [itex]d\langle S \rangle/dt[/itex], which happens on very long timescales, most would consider that a violation of the second law.

I think what you're doing here is redefining the second law of thermodynamics, which was originally devised empirically with no connection whatsoever to statistical mechanics, into a different form that is consistent with statistical mechanics. But this is a new second law; it is not the second law that nearly everybody writes down.

Of course, expanding the second law of thermodynamics to be consistent with statistical mechanics is a relatively small change. But it is still a change.
 
  • #82
Chalnoth said:
Why not?

Explained in the same quote that you cite.

Chalnoth said:
When [itex]d\delta S/dt[/itex] is of the same order as [itex]d\langle S \rangle/dt[/itex], which happens on very long timescales, most would consider that a violation of the second law.

I think what you're doing here is redefining the second law of thermodynamics, which was originally devised empirically with no connection whatsoever to statistical mechanics, into a different form that is consistent with statistical mechanics. But this is a new second law; it is not the second law that nearly everybody writes down.

Of course, expanding the second law of thermodynamics to be consistent with statistical mechanics is a relatively small change. But it is still a change.

When [itex]d\delta S/dt[/itex] is of the same order as [itex]d\langle S \rangle/dt[/itex], the second law is not violated because its prediction for [itex]d\langle S \rangle/dt[/itex] continues unchanged. The second law says nothing about [itex]d\delta S/dt[/itex], therefore any measurement of that term cannot violate the law.

Classical thermodynamics and its second law were always about average quantities (just as GR, Maxwell, hydrodynamics... were always about averages as well). The thermodynamic treatment of fluctuations was initiated around 1920.

The introduction of fluctuations is not «expanding the second law of thermodynamics» as you claim. Of course, the second law remains unchanged by the thermodynamic theory of fluctuations.

It is only people who have never studied thermodynamics beyond a basic course (or even less than that) who are seriously confused about thermodynamics and make misguided claims. Read the arXiv preprint again, especially the part:

It remains to stress that none of formulations of the second law known to us ever claimed that unaveraged entropy production or unaveraged work must be positive; see e.g. 4,5,6,7,8,9,10.
 
  • #83
juanrga said:
Explained in the same quote that you cite.



When [itex]d\delta S/dt[/itex] is of the same order as [itex]d\langle S \rangle/dt[/itex], the second law is not violated because its prediction for [itex]d\langle S \rangle/dt[/itex] continues unchanged. The second law says nothing about [itex]d\delta S/dt[/itex], therefore any measurement of that term cannot violate the law.
If you take that stance, then nothing can possibly violate the second law.
 
