Are cycles and entropy compatible?

  • Thread starter: bill alsept
  • Tags: Cycles, Entropy
AI Thread Summary
The discussion centers on the compatibility of cycles and entropy within the framework of the second law of thermodynamics. It explores the idea that while entropy tends to increase in isolated systems, there may be scenarios, particularly over long timescales or in small systems, where entropy can both increase and decrease. Participants argue that if the universe operates in cycles, it could theoretically return to a low-entropy state, challenging the notion that entropy must always increase. The conversation also touches on the complexities of defining entropy in the context of gravitational fields and quantum effects during cosmic events like bounces. Ultimately, the dialogue highlights ongoing debates in physics regarding the nature of entropy and its implications for cosmology.
  • #51
bill alsept said:
I was talking about "one of the actual ones". I just used the word "truly" instead of "actual".

I just made the statement to distinguish it from the analogies some people were using: analogies that described systems in bottles that may cycle, and I suppose in some quantum equation an argument is made to support that. What I meant by a true cycle was a system that cycles back to an original state, such as a singularity, then inflates to its maximum and condenses back to the same singularity again. A true cycle would also have a time signature, as at the atomic level. As for the original question about entropy, it seems that everyone agrees that whatever kind of cycle it is, the entropy does not always gain. Far from it; it seems to stay equal after each cycle.
I don't see how there's any use in specifying that some kinds of recurrence are "actual" while other kinds are not. This kind of recurrence is highly unlikely, however.
 
  • #52
OMG, I'm reading the papers on de Sitter space and Poincare recurrence. It's weird stuff...

OMG. OMG. OMG.

http://arxiv.org/abs/hep-th/0208013

I see what the issue is. Susskind has proposed a solution to the black hole information paradox in which information never gets destroyed, and it turns out that if information never gets destroyed by tossing it down a black hole event horizon, then it doesn't get destroyed in a de Sitter universe when objects move outside the cosmological event horizon. If the amount of information stays the same, then eventually things will repeat.

The alternative is that Hawking is right, and information does get destroyed when you toss it into a black hole or when it leaves the event horizon. If that happens, then things won't repeat. What will happen is that once something goes outside of the cosmological event horizon, it's gone for good. That means that the laws of physics are not unitary.

What I hadn't understood: I was imagining an expanding universe with an event horizon, where once something goes outside of the event horizon it's "gone", so over time things would get more and more lonely with no recurrence. Susskind is arguing that this won't happen. The event horizon of the universe is mathematically identical to the event horizon of a black hole, so you will get Hawking radiation from the cosmological horizon just as you get Hawking radiation from a black hole, and if that Hawking radiation contains any information, then things will reboot.

The paper is called "Disturbing Implications of a Cosmological Constant"

I find the second option less disturbing, but it's still plenty disturbing.
 
Last edited:
  • #53
bill alsept said:
Which "truly cycles" were you talking about? Random recurrence or was it one of the actual cyclic universe models?

I was talking about "one of the actual ones". I just used the word "truly" instead of "actual".

I just made the statement to distinguish it from the analogies some people were using: analogies that described systems in bottles that may cycle, and I suppose in some quantum equation an argument is made to support that. What I meant by a true cycle was a system that cycles back to an original state, such as a singularity, then inflates to its maximum and condenses back to the same singularity again. A true cycle would also have a time signature, as at the atomic level. As for the original question about entropy, it seems that everyone agrees that whatever kind of cycle it is, the entropy does not always gain. Far from it; it seems to stay equal after each cycle.

So it seems some of the posts in this thread are not relevant. Many of them are about RANDOM RECURRENCE. Stuff that happens by accident after an indefinite wait of jillion gazillion years.

In cosmology research the thing about contracting, rebounding, and expanding is often called a "bounce".
A lot of papers these days study bounce cosmologies. That's different from random recurrence.

The simplest case of it need not even repeat---might just consist of a single bounce.

That is a good test case to study. One can ask whether the U we see resulted from a bounce, whether or not it was one of an infinite series of bounces.

There might be some traces of a bounce in the CMB that we can observe. It makes sense to ask if there was a bounce---are we in the rebound from a collapsing classical U?---without trying to answer the question right away about whether it's an infinite series.

And with any bounce cosmology (cyclic or not) you can ask about entropy. That's what I was trying to get at in my earlier posts #5 and #7 in this thread.
 
Last edited:
  • #54
bill alsept said:
When I asked "Are cycles and entropy compatible?" I thought it could be a yes or no answer. I see now that maybe I asked the question wrong. I should have asked "How can the interpretation of the 2nd law of thermo which states that (the entropy of an isolated system always increases or remains constant) be compatible with a system that cycles?" Most of the analogies used in the responses to this thread so far seem to support the conclusion that in a system that cycles, entropy stays equal and will decrease just as much as it will increase; therefore it cycles.

OK, so let's consider it with respect to a specific cyclic system... say a single simple harmonic oscillator. Can you define an entropy for this system? Answer Yes!

Again this goes back to understanding that entropy is not about disorder vs order but about empirically defined ignorance vs knowledge about system state.

Classically: an SHO's phase space diagram [x,p] shows the orbit of the oscillator state following an ellipse centered on the origin, whose size is set by the energy. Relative entropy will correspond to logarithms of areas in phase space.
Given that you know a range of energies for the SHO, and only that, you know the state is a point inside the area between two ellipses in phase space. This area defines a class of possible systems, in that it defines a range of possible states of a given system. Note that as the system evolves you also know that the state stays within the defined region, so over time the entropy is unchanged.

Alternatively, if you know the initial conditions up to some error bars, x1 < x(0) < x2, p1 < p(0) < p2, you can define its initial state to within a given area A (with S = k_B\ln(A)). By Liouville's theorem you can watch each point in the initial area evolve, and its area will not change, so neither will the entropy.

One can go further and be more general by defining a probability distribution over phase space. Liouville's theorem then manifests as conservation of entropy for the evolution of the distribution over time:
S = -k_B \int f(x,p)\,\ln(f(x,p))\, dx\, dp, where f is the probability density. Try it with a uniform (constant) density over an area A of phase space and you recover S = k_B\ln(A).
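As a minimal numerical sketch of that last claim (the unit mass, unit frequency, energy window, and grid below are illustrative assumptions, not from the post), one can check that a uniform density over a region of area A reproduces S = k_B ln(A):

```python
# Minimal check: a uniform phase-space density f = 1/A over the region between
# two SHO energy ellipses gives S = -k_B * integral(f ln f) = k_B * ln(A).
# Mass, frequency, and the energy window below are illustrative assumptions.
import numpy as np

k_B = 1.380649e-23                       # J/K
m, w, E1, E2 = 1.0, 1.0, 1.0, 2.0        # assumed SHO parameters and energy range

x = np.linspace(-3.0, 3.0, 2001)
p = np.linspace(-3.0, 3.0, 2001)
X, P = np.meshgrid(x, p)
E = P**2 / (2 * m) + 0.5 * m * w**2 * X**2
dx, dp = x[1] - x[0], p[1] - p[0]

mask = (E >= E1) & (E <= E2)             # region between the two ellipses
A = mask.sum() * dx * dp                 # its phase-space area
f = np.zeros_like(E)
f[mask] = 1.0 / A                        # uniform, normalized density

S = -k_B * np.sum(f[mask] * np.log(f[mask])) * dx * dp
print(A, k_B * np.log(A), S)             # last two numbers agree: S = k_B ln(A)
```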

Now this example isn't very interesting or useful, but it shows how entropy is defined based on knowledge about the system state. Now consider many such oscillators and combine the phase spaces into a single composite space. One then works with "hyper-volumes" instead of areas, but it works out the same. Start with an uncertain initial condition and the entropy is defined, and Liouville's theorem still applies: the future region of phase space in which we know the system resides is of fixed volume (though you'll note it gets stretched out and wrapped around many times). Couple the oscillators to each other in a definite way and still the entropy remains constant.

But if you couple the system to an external source or allow random coupling between oscillators then this randomness adds uncertainty to the future state and the area or distribution spreads out. Entropy increases. No amount of random(=unknown) coupling to the outside world or internally will reduce our ignorance about where the system state will be and thus entropy cannot be decreased this way. That's the 2nd law when one is considering random internal coupling.

One can however couple the system to the outside world in a specific way to reduce the entropy (refrigeration). In particular we can observe the system state...which starts us down the road of QM where we must insist that an observation is a physical interaction even if in the classical case the interaction has infinitesimal effect on the system per se.

The cyclic nature of the system is immaterial to the entropy because entropy is not about the actual system state but about our knowledge of it.
 
Last edited:
  • #55
jambaugh said:
The cyclic nature of the system is immaterial to the entropy because entropy is not about the actual system state but about our knowledge of it.
Except that in the quantum mechanical sense, the entropy of the system is directly related to the number of possible configurations of that system. Being related to the number of possible configurations, the maximum entropy and the maximum recurrence time are closely linked.
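To put a rough number on that link (purely illustrative; the entropy value below is an assumption of the order sometimes quoted for the de Sitter horizon, not a derived figure), the recurrence time scales like the number of accessible configurations, exp(S/k_B):

```python
# Rough sketch of the entropy / recurrence-time link described above. The number
# of accessible configurations is ~ exp(S / k_B), and the Poincare recurrence
# time scales with that count. The entropy value (10^120 in units of k_B) is an
# illustrative assumption.
import math

S_over_kB = 1e120
log10_N = S_over_kB / math.log(10)       # number of configurations N, in log10

print(f"N ~ 10^({log10_N:.2e})")
# The recurrence time is ~ N microscopic time steps, so it has the same absurd
# order of magnitude whatever unit of time you choose.
```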
 
  • #56
Of course we can always define entropy and sometimes measure it. Any system will have some changing level of entropy. My dispute is with those who believe entropy will only increase. A cycle will return to its starting point, or back to its lowest level of entropy. So if you believe in cycles, how can entropy always increase?
 
  • #57
bill alsept said:
Of course we can always define entropy and sometimes measure it. Any system will have some changing level of entropy. My dispute is with those who believe entropy will only increase. A cycle will return to its starting point, or back to its lowest level of entropy. So if you believe in cycles, how can entropy always increase?
Again, as I've said before, as long as you're dealing with time scales much shorter than the recurrence time, this is a valid statement (though one caveat: it's only valid for closed systems...open systems like the Earth can have their entropy decrease quite easily).
 
  • #58
bill alsept said:
Of course we can always define entropy and sometimes measure it. Any system will have some changing level of entropy. My dispute is with those who believe entropy will only increase. A cycle will return to its starting point, or back to its lowest level of entropy. So if you believe in cycles, how can entropy always increase?

Be careful Bill :biggrin:
It sounds mighty naive to assert (without explanation) that entropy can always be defined.

You need things in order to be able to define the entropy----like microstates and a map of the macrostate regions that corresponds to what someone can measure.

The mathematical resources you require in order for the entropy to be well defined are precisely the resources you lack at the Loop cosmology bounce.
 
Last edited:
  • #59
Yes, I forgot there are other ways to define entropy. For this Cosmology thread and the original question I just assumed everyone was talking about a measure of the randomness. However it's defined, I still don't understand how entropy can increase any more than it decreases unless something is being added to the system.
 
  • #60
bill alsept said:
Yes, I forgot there are other ways to define entropy. For this Cosmology thread and the original question I just assumed everyone was talking about a measure of the randomness. However it's defined, I still don't understand how entropy can increase any more than it decreases unless something is being added to the system.

Well you know just saying "measure of randomness" does not say anything. In order to have a definite number you need a mathematical definition. This usually involves a "state space".
A collection of possible states the system can be in.
There is usually an observer in the picture who is able to do certain measurements, and he tends to lump together large collections of detailed "micro" states, all of which look the same to him---in terms of what matters to him, like temperature, pressure, etc.
Depending on who is defining entropy, there may be probability measures on the states, or on the macro collections of states that are lumped together as equivalent from the observer's point of view.

Anyway no matter how you choose to mathematically define entropy, you need some math junk to do it. By itself a word like "randomness" or "disorder" does not mean anything quantitative.

So think about a function of time that is always increasing but fails to be defined at t=0

Like f(t) = -1/t

This is not meant to be the entropy of some system, it is just an example of a function, to illustrate.
The function is always increasing wherever it is defined. And yet its value at positive times t>0 is always less than what its value was at any negative time t<0.

You can construct more realistic looking entropy functions. The point is:
In order to return to an earlier value the entropy function never has to decrease. It can always be increasing, wherever it is defined, and yet it can pass through the same value again and again.

So you CAN imagine entropy decreasing on a regular basis (you were talking "cyclic") but you do not HAVE to imagine it decreasing. There simply need to be moments in time when it is impossible to define. (Or to define correctly, in a consistent unambiguous way.)
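A quick numerical illustration of that point (the periodic tan(t) example is my own addition, just another function with the same property, not something from the post):

```python
# f(t) = -1/t is increasing everywhere it is defined (undefined at t = 0), yet
# its values for t > 0 sit below its values for t < 0; g(t) = tan(t) is a
# periodic example that keeps increasing on every branch while passing through
# the same values again and again.
import numpy as np

f = lambda t: -1.0 / t
g = np.tan

print(f(-2.0) < f(-1.0), f(1.0) < f(2.0))   # True True: increasing on each branch
print(f(1.0) < f(-1.0))                     # True: the value "resets" across t = 0
print(g(0.3), g(0.3 + np.pi))               # same value repeated one period later
```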
 
Last edited:
  • #61
Entropy increase is not an absolute, as several people have stated. The existence of cycles in states demonstrates that entropy can increase and decrease.
But entropy is still a useful and important concept on smaller time scales.
 
Last edited:
  • #62
jambaugh said:
Again this goes back to understanding that entropy is not about disorder vs order but about empirically defined ignorance vs knowledge about system state.

Maybe. That's why the black hole information paradox is interesting. You toss some stuff into a black hole. Within a finite time, classical GR says that it gets crushed to a singularity. Now the question is: "is the information still there or not?" Some people (namely Hawking) believe that black holes in fact destroy information, so it's not merely a matter of being ignorant of the internal state of the black hole, as the black hole has no internal states. Others disagree (Susskind).

This matters now that it appears we have a positive cosmological constant because the event horizon at the edge of the universe has the same issues as the event horizon at the edge of a black hole.

One reason I find this fascinating is that it turns out that you can figure out a lot about quantum mechanics from classical thermodynamics, and it turns out that QM resolves the "Gibbs paradox." Trying to figure out whether or not the universe can really destroy information gives us some clues as to what quantum gravity looks like.

jambaugh said:
The cyclic nature of the system is immaterial to the entropy because entropy is not about the actual system state but about our knowledge of it.

With springs and pendulums, you can argue this. Now with black holes, what entropy means is not clear. It has to mean something. One thing that just won't work is to have a black hole that is really black. If it were the case that you throw something into a black hole and nothing comes out except gravity, then you can show that this violates thermodynamics.

You can also get anthropic. One thing that you can argue (and I think Max Tegmark argues this) is that in order to have a comprehensible universe, you need an "arrow of time." It could very well be that there are an infinite number of universes in which the laws of physics are such that the second law of thermodynamics does not hold, but it's difficult to see how you can have intelligence without an arrow of time.
 
  • #63
bill alsept said:
However it's defined, I still don't understand how entropy can increase any more than it decreases unless something is being added to the system.

You can think of entropy as anti-information. I have a 500-MB CD-ROM with old pictures on it. If I take a hammer to that CD-ROM, I've destroyed those pictures and I've increased the entropy of the world by 500 MB (and you can measure entropy and do thermodynamics using megabytes). Now if I leave that CD-ROM in a cupboard by itself, what will happen is that it will spontaneously decay, and if I leave it long enough, the pictures on it will decay.

The opposite doesn't happen. If I leave a blank old CD, I won't expect my photo album to spontaneously appear. If you turn off your computer, you expect to lose whatever work you had on the computer; however, you don't expect that if you start off with a blank computer you end up with the complete works of Shakespeare.

Also, the thermodynamics of information is a very active area of physics research. One thing about computers is that they end up getting hot, and that's annoying when you are trying to run a laptop. It turns out that some of the limits on how cool you can run a laptop result from fundamental connections between heat and information. Erasing data increases the entropy in the world, which produces heat. So the reason a laptop runs hot is that it's doing a lot of calculations: every time something gets erased in the CPU or memory, this increases entropy, and an increase in entropy corresponds to an increase in heat.
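For a back-of-the-envelope sense of that heat-information link, here is a sketch using Landauer's bound of k_B T ln 2 per erased bit; the 500 MB echoes the CD example above, and room temperature is an assumption:

```python
# Landauer's principle: erasing one bit dissipates at least k_B * T * ln(2) of
# heat. Room temperature and the 500 MB figure are assumptions for illustration.
import math

k_B = 1.380649e-23          # J/K
T = 300.0                   # K, assumed room temperature
bits = 500e6 * 8            # 500 MB in bits

E_bit = k_B * T * math.log(2)      # ~2.9e-21 J per erased bit
print(E_bit, bits * E_bit)         # total ~1e-11 J: the fundamental floor is tiny;
                                   # real chips dissipate vastly more per operation
```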

Conversely, one tried and true way of erasing information is to burn it.
 
Last edited:
  • #64
marcus said:
You need things in order to be able to define the entropy----like microstates and a map of the macrostate regions that corresponds to what someone can measure.

You can define entropy "bottom up." You can also define entropy "top down". Define temperature as what you measure when you put a thermometer in it. Define entropy as a function of the energy you put into a system versus how much the temperature changes.

That's entropy.

Now it's not obvious that this has anything to do with randomness, but the cool thing about physics is how some non-obvious things are actually related.
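As a concrete "top down" example (the numbers are standard textbook values I'm assuming, not from the post): melting one mole of ice at its melting point absorbs heat at constant temperature, and dividing gives the entropy change with no mention of microstates.

```python
# Thermodynamic ("top down") entropy: dS = dQ_rev / T. For melting ice the
# temperature stays fixed, so Delta S = Delta H_fus / T_melt.
H_fus = 6010.0        # J/mol, heat of fusion of ice (assumed textbook value)
T_melt = 273.15       # K

print(H_fus / T_melt)  # ~22 J/(mol K) of entropy gained on melting
```

Nothing in that calculation mentions randomness; any statistical definition has to reproduce numbers like this one.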
 
  • #65
bill alsept said:
Any system will have some changing level of entropy. My dispute is with those who believe entropy will only increase. A cycle will return to its starting point, or back to its lowest level of entropy. So if you believe in cycles, how can entropy always increase?

I don't think it can.

But I think a lot of this has to do with conflicting definitions of entropy. There's a theoretical statistical-mechanical definition and an observational thermodynamic definition, and I think they end up in conflict.
 
  • #66
bill alsept said:
Of course we can always define entropy and sometimes measure it. Any system will have some changing level of entropy. My dispute is with those who believe entropy will only increase. A cycle will return to its starting point, or back to its lowest level of entropy. So if you believe in cycles, how can entropy always increase?

Your reasoning here is the same reasoning that goes into "a system's entropy is always zero since it is always in some single fixed state"... even if we don't know what that state is.

The fact that the system cycles to its initial condition does not imply a return to lower entropy. Consider... if you allow a system to evolve from a low-entropy state in the deterministic way required by the assumptions of cyclic behavior, you know the entropy cannot change. The whole cycle is, by virtue of being a cycle, reversible. You are in particular thinking in terms of zero-entropy systems.

Remember also that when you are thinking in terms of such cyclic systems the dynamics itself is an external constraint. All I need to break the cyclic assumption is some non-periodic time variation in the dynamics. How that relates to the entropy then is in the fact that via coupling to the dynamics the system couples to the external world...
for a gas in a box there's the box's walls, for a mass and spring there is the spring's mounting point. For a freely falling vibrating elastic body there's still gravitational coupling to the rest of the universe...

now for the universe as a whole (in so far as such can be defined meaningfully as a physical system) you're perfectly free to say it cycles over some hyper-astronomical period and I'll assert its entropy is zero by invoking QM and sub-additivity of entropy and defining entropy as entanglement with one's environment.
 
  • #67
twofish-quant said:
But I think a lot of this has to do with conflicting definitions of entropy. There's a theoretical statistical-mechanical definition and an observational thermodynamic definition, and I think they end up in conflict.

The conflict is only apparent. If you carefully parse the operational meaning of each definition you find compatibility (provided of course you use consistent physical assumptions.)
 
  • #68
The point of my original question was not so much to try and define entropy. There are some who say that the universe cannot cycle because they believe entropy (no matter how you define it) will increase so much that the cycle disintegrates.
 
  • #69
bill alsept said:
The point of my original question was not so much to try and define entropy. There are some who say that the universe cannot cycle because they believe entropy (no matter how you define it) will increase so much that the cycle disintegrates.

I'm curious who those people are since the consensus in this discussion seems to be that this statement is incorrect.
 
  • #70
bill alsept said:
The point of my original question was not so much to try and define entropy. There are some who say that the universe cannot cycle because they believe entropy (no matter how you define it) will increase so much that the cycle disintegrates.

Given the sub-additivity of quantum entropy, this objection needn't be applicable. We may observe parts of the universe increasing in entropy without the entropy of the whole changing... this occurs as the parts entangle over time.

Along those same lines trying to define "the entropy of the universe" by adding the entropies of parts (e.g. by integrating an entropy density over the spatial universe) is not appropriate as it does not take into account spatially separated quantum correlations.

Again you can understand entropy of a system as the amount to which that system is entangled with the rest of the universe... and define the entropy of the whole universe as fixed and equal to zero since there is nothing external to which it is entangled. Now the visible universe, on the other hand... (i.e. the universe outside the interiors of the many BH's floating around).
 
  • #71
jambaugh said:
Given the sub-additivity of quantum entropy, this objection needn't be applicable. We may observe parts of the universe increasing in entropy without the entropy of the whole changing... this occurs as the parts entangle over time.

This doesn't make sense to me. I have a cup of ice. There is an entropy associated with that cup of ice (41 J/(mol K)). Now if I watch a cup of ice melt into water and the entropy is now 70 J/(mol K), are you trying to tell me that because of some weird quantum entanglement effect the entropy of the universe is constant? That doesn't make sense to me. Now you may be able to define some quantity that does stay constant, but that doesn't seem to have any connection with what an engineer would call entropy.

jambaugh said:
Along those same lines trying to define "the entropy of the universe" by adding the entropies of parts (e.g. by integrating an entropy density over the spatial universe) is not appropriate as it does not take into account spatially separated quantum correlations.

Doesn't make sense to me. I'm watching ice with an entropy of 41 J/(mol K) turn into water with an entropy of 70 J/(mol K). What's it quantum entangled with? Where is the quantum correlation? I'm watching ice melt. No quantum entanglements that I can see.

What you seem to be saying is that anytime ice melts, then there is some weird quantum mechanical effect that causes something weird to happen in some other part of universe.

I really don't think this makes sense.

jambaugh said:
Again you can understand entropy of a system as the amount to which that system is entangled with the rest of the universe...

I'm just watching ice melt. Are you saying that I can't understand ice melting without quantum entanglements?
 
  • #72
twofish-quant said:
This doesn't make sense to me. I have a cup of ice. There is an entropy associated with that cup of ice (41 J/(mol K)). Now if I watch a cup of ice melt into water and the entropy is now 70 J/(mol K), are you trying to tell me that because of some weird quantum entanglement effect the entropy of the universe is constant?
Yes
twofish-quant said:
What you seem to be saying is that anytime ice melts, then there is some weird quantum mechanical effect that causes something weird to happen in some other part of universe.
No: nothing weird happens in the other part of the universe, and in fact the rest of the universe excluding the melting ice also has its entropy go up. But as the ice melts, it is interacting with everything else and there are correlations between the ice's state and the "rest-of-the-universe"'s state. These correlations, when included in the calculation of the entropy of the universe-as-a-whole, reduce it below what you get if you just add the calculated entropy of each part.
twofish-quant said:
I'm just watching ice melt. Are you saying that I can't understand ice melting without quantum entanglements?
Of course not. Likewise we don't need QM to track the Moon's orbit. And as long as one is talking about some piece of the universe (which necessarily excludes the mechanisms used to observe that piece) then you needn't worry about this.

Now you find this idea perplexing and counter-intuitive. Well it is, as is so much in QM. But you can "do the math". Take a system of two particles which are maximally entangled, say two spin-1/2 particles in a sharp spin-0 composite state. Now let one of the particles reside in your "system" and shoot the other one into space, or better yet into a Black Hole.

Since the composite is sharply defined it has zero entropy.
To describe one of the particles alone you must do a partial trace over the other, and you get a maximum-entropy density matrix diag(1/2, 1/2). Its entropy = k ln(2). (k = Boltzmann's constant.)

Likewise if you were to determine the entropy of the other half of the entangled pair you'd get k ln(2). Add them and you get 2k ln(2), but that's not the entropy of the composite. It is rather 0. Entropy doesn't "add up" in QM.
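That claim is easy to check numerically; here is a small sketch (mine, not from the post), with entropy in units of k:

```python
# Von Neumann entropy check for the spin-0 (singlet) pair: the composite pure
# state has entropy 0, while each reduced density matrix is diag(1/2, 1/2) with
# entropy ln(2) (in units of k).
import numpy as np

# |psi> = (|01> - |10>) / sqrt(2) in the basis {|00>, |01>, |10>, |11>}
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rho = np.outer(psi, psi)                          # composite density matrix

def entropy(r):
    """-Tr(r ln r) from eigenvalues, treating 0*ln(0) as 0."""
    lam = np.linalg.eigvalsh(r)
    lam = lam[lam > 1e-12]
    return float(-np.sum(lam * np.log(lam)))

# Partial trace over the second particle: rho_A[i, j] = sum_k rho[(i,k), (j,k)]
rho_A = np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))

print(entropy(rho))      # ~0.0      : the entangled pair as a whole
print(entropy(rho_A))    # ~0.693... : ln(2) for one particle alone
```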

Finally let me mention that when you have an entangled pair, it typically undergoes decoherence due to interaction with the environment. But that's just the entanglement "swapping" to other systems. The particle in hand soon becomes entangled with photons (which have interacted with the entangled partner) shooting off into space at speed c, never to be recovered. That's where the irreversibility comes into play.

Similarly if you consider a high entropy system and wish to "refrigerate" it to lower entropy you are basically swapping the entanglements between your lab system and far flung photons to entanglement between your heat sink and far flung photons. Your system's entropy goes down... and remarkably the entropy of all but your system goes down since there are now more correlations within the exosystem.

Ultimately according to this view of entropy, at any given time the entropy of a system is equal to the entropy of all the universe minus that system which is to say both represent the same quantity. It is the amount of entanglement across the boundary between them.

Weird indeed! No?
 
Last edited:
  • #73
I don't really see how this view works. It seems to be relying upon a tautology: when I define the macrostate as the microstate, the entropy is zero. What we do in reality is very different. The macrostate is defined as a set of observables that are due to the collective behavior of a large number of degrees of freedom. And when you have that sort of situation, you very much can talk about overall changes in entropy, whether you're talking about a quantum-mechanical system or not.

A simple example here is that of an evaporating black hole in de Sitter space-time: due to the horizons, we have definite definitions of the entropy of the black hole as it is evaporating and after the evaporation. This is a fully-quantum system, and the total entropy definitively increases. It increases because the macrostates we're considering (the horizon areas) are composed of a tremendous number of quantum degrees of freedom.
 
  • #74
bill alsept said:
Entropy is that nature tends from order to disorder in an isolated system. If a cycle always comes back to where it started, wouldn't entropy decrease as well as increase over and over again? With that in mind, and the fact that there really are no examples of an isolated system, can the 2nd law of thermo hold up?

Entropy does not say that nature tends from order to disorder.

A non-dissipative cycle is in agreement with the second law, because the second law also covers isentropic evolutions.

Moreover, the second law is independent of the boundary conditions and also holds for non-isolated systems. For open and closed systems the second law predicts a non-negative production of (average) entropy, which is perfectly verified.
 
  • #75
I can agree with average. What goes up must come down.
 
  • #76
bill alsept said:
I can agree with average. What goes up must come down.
There are situations where it's true, situations where it isn't. That particular aphorism most certainly does not apply to everything.
 
  • #77
bill alsept said:
I can agree with average. What goes up must come down.

I mean that, contrary to a common misconception (also shown in this thread), the second law is not a statistical law being violated in fluctuations, because it makes predictions about the average entropy <S>, not about the fluctuating entropy.

This is true for any other part of physics, just as Maxwell's laws make predictions about the average electric and magnetic fields <E>, <B>, not about their fluctuations, just as Einstein's G_ab = T_ab is in reality a statement about the averages <G_ab> = <T_ab>.
 
  • #78
juanrga said:
I mean that, contrary to a common misconception (also shown in this thread), the second law is not a statistical law being violated in fluctuations, because it makes predictions about the average entropy <S>, not about the fluctuating entropy.

This is true for any other part of physics, just as Maxwell's laws make predictions about the average electric and magnetic fields <E>, <B>, not about their fluctuations, just as Einstein's G_ab = T_ab is in reality a statement about the averages <G_ab> = <T_ab>.
I don't know what you're trying to say here, but the second law of thermodynamics, as we understand it from statistical mechanics, is an approximate law that is accurate except on very small scales or for very long timescales.

That is to say, no matter the size of your system, if you wait for long enough you will see significant deviations from the second law. Similarly, if you are only willing to wait a fixed amount of time, you will see deviations from the second law if you look at very small systems.
 
  • #79
Chalnoth said:
I don't know what you're trying to say here, but the second law of thermodynamics, as we understand it from statistical mechanics, is an approximate law that is accurate except on very small scales or for very long timescales.

That is to say, no matter the size of your system, if you wait for long enough you will see significant deviations from the second law. Similarly, if you are only willing to wait a fixed amount of time, you will see deviations from the second law if you look at very small systems.

I was correcting a misconception that appears too often in misguided literature by non-experts.

Using statistical mechanics we can show that the laws of Newton, of Maxwell, of kinetics, of hydrodynamics, and the Hilbert-Einstein 'field' equations are only valid in an average sense. Evidently nobody would say that those laws are «only statistical laws violated by fluctuations», because all those laws only refer to the average and, therefore, say nothing about the fluctuations.

Introducing fluctuations we can generalize Newton, Maxwell, kinetics, hydrodynamics, and even GR. For example, fluctuating hydrodynamics extends the equations of hydrodynamics by incorporating fluctuations in density, pressure, speed...

The same goes for thermodynamics: the variation of the fluctuating entropy S in an isolated system is given by

\frac{dS}{dt} = \frac{d\langle S\rangle}{dt} + \frac{d\delta S}{dt}

The second law is a statement about {d\langle S\rangle}/{dt} not about the fluctuation. The fluctuation term is studied using the thermodynamic theory of fluctuations which says that {d\delta S}/{dt} can be positive, negative, or zero. Therefore a measurement of a fluctuation does not invalidate thermodynamics.

Misguided literature by non-experts confounds S with \langle S\rangle and makes the incorrect claim that you repeat.

You should also take a look at http://arxiv.org/abs/cond-mat/0207587 and how fluctuations are in perfect agreement with thermodynamic laws.
 
Last edited:
  • #80
Whoever said the contrary?

It was just emphasized that there is nothing inconsistent between the 2nd law and having negative fluctuations in entropy.

I believe there are rigorous theorems about fluctuations as well, to the extent that one can show (for a variety of types of systems) that the positive fluctuations are far more probable than the negative ones (thus proving the 2nd law).
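As a toy numerical sketch of that kind of theorem (the Gaussian model below is my own assumption, chosen only because it satisfies the detailed relation P(s)/P(-s) = e^s exactly; entropy is in units of k_B):

```python
# Toy fluctuation-theorem demo: if entropy production s over some interval is
# Gaussian with variance equal to twice its mean, then P(s)/P(-s) = exp(s).
# Negative fluctuations happen, but the mean stays positive (the 2nd-law part).
import numpy as np

rng = np.random.default_rng(0)
mean = 2.0                                        # assumed mean entropy production
s = rng.normal(mean, np.sqrt(2 * mean), 1_000_000)

print(s.mean())            # ~2.0  : <s> >= 0
print((s < 0).mean())      # ~0.16 : fraction of negative fluctuations
# Analytic check of P(s)/P(-s) = exp(s) for this Gaussian at s0 = 1.3:
s0 = 1.3
log_ratio = ((s0 + mean)**2 - (s0 - mean)**2) / (4 * mean)
print(np.isclose(log_ratio, s0))                  # True
```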
 
  • #81
juanrga said:
Using statistical mechanics we can show that the laws of Newton, of Maxwell, of kinetics, of hydrodynamics, and the Hilbert-Einstein 'field' equations are only valid in an average sense. Evidently nobody would say that those laws are «only statistical laws violated by fluctuations», because all those laws only refer to the average and, therefore, say nothing about the fluctuations.
Why not? It's true.

And anyway, statistical mechanics is a bit different here, at least on very long timescales, because of Poincare recurrence.

juanrga said:
The same goes for thermodynamics: the variation of the fluctuating entropy S in an isolated system is given by

\frac{dS}{dt} = \frac{d\langle S\rangle}{dt} + \frac{d\delta S}{dt}

The second law is a statement about {d\langle S\rangle}/{dt} not about the fluctuation. The fluctuation term is studied using the thermodynamic theory of fluctuations which says that {d\delta S}/{dt} can be positive, negative, or zero. Therefore a measurement of a fluctuation does not invalidate thermodynamics.
When d\delta S/dt is of the same order as d\langle S \rangle/dt, which happens on very long timescales, most would consider that a violation of the second law.

I think the thing that you're doing here is redefining the second law of thermodynamics, which was originally devised empirically with no connection whatsoever to statistical mechanics, into a different form that is consistent with statistical mechanics. But this is a new second law; it is not the second law that most everybody writes down.

Of course, expanding the second law of thermodynamics to be consistent with statistical mechanics is a relatively small change. But it is still a change.
 
  • #82
Chalnoth said:
Why not?

Explained in the same quote that you cite.

Chalnoth said:
When d\delta S/dt is of the same order as d\langle S \rangle/dt, which happens on very long timescales, most would consider that a violation of the second law.

I think the thing that you're doing here is redefining the second law of thermodynamics, which was originally devised empirically with no connection whatsoever to statistical mechanics, into a different form that is consistent with statistical mechanics. But this is a new second law; it is not the second law that most everybody writes down.

Of course, expanding the second law of thermodynamics to be consistent with statistical mechanics is a relatively small change. But it is still a change.

When d\delta S/dt is of the same order as d\langle S \rangle/dt, the second law is not violated because its prediction for d\langle S \rangle/dt continues unchanged. The second law says nothing about d\delta S/dt, therefore any measurement of that term cannot violate the law.

Classical thermodynamics and its second law were always about the average quantities (just as GR, Maxwell, hydrodynamics... were always about the averages as well). The thermodynamic treatment of fluctuations was initiated around 1920.

The introduction of fluctuations is not «expanding the second law of thermodynamics» as you claim. Of course, the second law remains unchanged by the thermodynamic theory of fluctuations.

It is only people who have never studied thermodynamics beyond a basic course (or even less than that) who are seriously confused about thermodynamics and make misguided claims. Read the arXiv preprint again, especially the part:

It remains to stress that none of formulations of the second law known to us ever claimed that unaveraged entropy production or unaveraged work must be positive; see e.g.4,5,6,7,8,9,10.
 
  • #83
juanrga said:
Explained in the same quote that you cite.



When d\delta S/dt is of the same order as d\langle S \rangle/dt, the second law is not violated because its prediction for d\langle S \rangle/dt continues unchanged. The second law says nothing about d\delta S/dt, therefore any measurement of that term cannot violate the law.
If you take that stance, then nothing can possibly violate the second law.
 