What is the real second law of thermodynamics?

SUMMARY

The second law of thermodynamics states that the entropy of a closed system will increase over time, as articulated through the Clausius and Kelvin statements. The discussion highlights that while entropy generally increases, the Poincaré recurrence theorem implies that a closed system can return arbitrarily close to a previous state, albeit over an impractically long timescale. The conversation also emphasizes the distinction between classical and statistical thermodynamics, noting that the second law is fundamentally a statistical statement rather than an absolute one. Additionally, the fluctuation theorem indicates that entropy can transiently decrease in small systems under specific conditions.

PREREQUISITES
  • Understanding of classical thermodynamics principles
  • Familiarity with statistical mechanics concepts
  • Knowledge of the Poincaré recurrence theorem
  • Basic grasp of quantum mechanics and its implications on thermodynamics
NEXT STEPS
  • Study the implications of the Poincaré recurrence theorem in thermodynamic systems
  • Explore the fluctuation-dissipation theorem and its applications in statistical mechanics
  • Investigate the differences between classical and quantum mechanical interpretations of entropy
  • Learn about the Boltzmann distribution and its significance in thermodynamic equilibrium
USEFUL FOR

Physicists, thermodynamics students, and researchers interested in the foundational principles of entropy and its implications in both classical and quantum systems.

  • #61


lalbatros said:
Is the second principle a physical law at all?
It is a law because it successfully and consistently predicts how things will actually behave. It is never violated. It is not that it is physically impossible for the second law to be violated. It is just that it is statistically impossible for it to be violated.

Even if you are dealing with a relatively small number of molecules, the second law will not be violated. Suppose you had a billionth of a trillionth of a mole (10^-21 mol, about 600 molecules) of a gas at temperature T in a container. Since the motions are random, could the gas spontaneously separate into fast molecules in one half of the container and slow ones in the other for a period long enough to detect it?

The probability of even that occurring is so infinitesimally small that you would have to wait longer than the age of the universe before it happened anywhere in the universe. So the answer is: "no".

AM
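That estimate is easy to sanity-check with a rough back-of-the-envelope sketch (mine, not Andrew Mason's; the model that each molecule independently lands in its "sorted" half, and the sampling rate, are assumptions):

```python
import math

N = 600                  # molecules, ~1e-21 mol of gas
p = 0.5 ** N             # chance each molecule lands in its "sorted" half
# p ~ 2.4e-181

# Be generous: sample 1e12 independent configurations per second
# over the age of the universe (~4.3e17 s):
expected = p * 1e12 * 4.3e17

print(f"log10(p) = {math.log10(p):.1f}")
print(f"expected sorted configurations ever observed: {expected:.1e}")
```

Even with absurdly generous sampling, the expected number of spontaneous sortings over the age of the universe is vanishingly far below one, which is the sense in which the violation is "statistically impossible".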
 
  • #62


A. Neumaier said:
Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known.

That's not true- my parents sure did, and I suspect yours did as well.
 
  • #63


Pythagorean said:
But to the point with thermodynamics in general, I was interested in a physical motivation for something I've only seen motivated from mathematics when I took undergrad thermodynamics (Reif). I was totally turned off by it at the time, and am just now getting back into thermodynamics.

Susskind provided the physical motivation pedagogically, atyy continues to provide research motivation for the question in his links. I'm not insistent on invoking QM, the question still remains without it.
<snip>

The original physical motivation for 'entropy' was simply developing more efficient engines, and the realization that the only determining factor is the temperature difference between the inlet and outlet.

In fact, temperature has been conspicuously missing from this thread.
 
  • #64


Andy Resnick said:
That's not true- my parents sure did, and I suspect yours did as well.
Biological processes are not exempt from the second law in the form of nonnegative local mean entropy production.
 
  • #65


Andy Resnick said:
In fact, temperature has been conspicuously missing from this thread.
Once you keep temperature constant, entropy becomes essentially irrelevant.
The governing extremal principle is then one of minimizing a free entropy rather than maximizing entropy.
 
  • #66


atyy said:
(i) Does classical statistical mechanics require QM for its justification - no. We can stick to canonical variables in phase space. The conservation of phase space volume is not at odds with the apparent increase in volume that we call an increase in entropy - see eg. http://www.necsi.edu/projects/baranger/cce.pdf

There's a lot in that document that really bugs me- for example:

"But there is another name for statistical mechanics: Thermodynamics." (p. 11). That is not true- they are quite different. AFAIK, SM is currently able to only generate results for time-independent (equilibrium) systems. The partition function, in particular, does not depend on time. Thermo*dynamics* is not thermo*statics*. Thermodynamics does allow for time-dependent non-equilibrium systems, but many of the concepts from thermostatics (temperature, for example) become more difficult to define.

http://pre.aps.org/abstract/PRE/v49/i2/p1040_1

Also, mechanics uses even-dimensional symplectic spaces, while thermodynamics uses odd-dimensional contact spaces.

"But entropy is simply a fancy word for “disorder”." (p.12). Also not true.

There is also an insistence that there is a mechanical theory of heat, which AFAIK, is not true. Dissipative mechanisms cannot be explained in terms of conservative forces.

"The conclusion is that our dimensionless entropy, which measures our lack of knowledge, is a purely subjective quantity." (p. 17). Not true- there are 'objective' measures of information (Kolmogorov). Also, because there is an absolute scale for temperature there is an absolute scale for entropy. That's besides the fact that entropy *changes* are objective.
 
  • #67


A. Neumaier said:
Once you keep temperature constant, entropy becomes essentially irrelevant.
The governing extremal principle is then one of minimizing a free entropy rather than maximizing entropy.

That's not the point: we can't be so cavalier when discussing nonequilibrium systems. Entropy can change if the temperature is constant: chemical reactions, mixing/diffusion, etc.

I'm not sure what you mean by 'free entropy'- the opposite of free energy? Negentropy?
 
  • #68


A. Neumaier said:
Biological processes are not exempt from the second law in the form of nonnegative local mean entropy production.

Eh? That's the whole point of living systems- I keep my entropy low at the expense of raising the entropy somewhere else.
 
  • #69


Andy Resnick said:
Eh? That's the whole point of living systems- I keep my entropy low at the expense of raising the entropy somewhere else.
Yes, but this doesn't involve a violation of the second law.

There are two processes affecting the entropy distribution: 1. local entropy production, and 2. flow of entropy. The former is nonnegative at all places (this is the second law), whereas the latter redistributes the entropy between places in a conserved fashion.
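In the standard notation of nonequilibrium thermodynamics (textbook form, not quoted from the thread), this split reads:

```latex
\partial_t s + \nabla \cdot \vec{j}_s = \sigma, \qquad \sigma \ge 0,
```

where $s$ is the entropy density, $\vec{j}_s$ the entropy current, and $\sigma$ the local entropy production; the second law constrains only $\sigma$.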

For example, in fluid flow, local entropy production is associated with dissipative terms in the Navier-Stokes equations. Neglecting the entropy production results in the Euler equations. The latter are conservative but have an entropy current, which - unlike with Navier-Stokes - together with the entropy density satisfies the continuity equation.

The low entropy of biochemical substances (or, for that matter, inanimate crystals) is a result of an entropy current during their formation that dominates the (still positive) entropy production.

All this can be read in books on nonequilibrium thermodynamics. My favorite book is Reichl's Modern Course in Statistical Mechanics.
 
  • #70


Andy Resnick said:
There's a lot in that document that really bugs me- for example:

"But there is another name for statistical mechanics: Thermodynamics." (p. 11). That is not true- they are quite different. AFAIK, SM is currently able to only generate results for time-independent (equilibrium) systems. The partition function, in particular, does not depend on time. Thermo*dynamics* is not thermo*statics*. Thermodynamics does allow for time-dependent non-equilibrium systems, but many of the concepts from thermostatics (temperature, for example) become more difficult to define.

http://pre.aps.org/abstract/PRE/v49/i2/p1040_1

Also, mechanics uses even-dimensional symplectic spaces, while thermodynamics uses odd-dimensional contact spaces.

"But entropy is simply a fancy word for “disorder”." (p.12). Also not true.

There is also an insistence that there is a mechanical theory of heat, which AFAIK, is not true. Dissipative mechanisms cannot be explained in terms of conservative forces.

"The conclusion is that our dimensionless entropy, which measures our lack of knowledge, is a purely subjective quantity." (p. 17). Not true- there are 'objective' measures of information (Kolmogorov). Also, because there is an absolute scale for temperature there is an absolute scale for entropy. That's besides the fact that entropy *changes* are objective.

I'll just discuss your last point, since the previous points seem mainly about terminology.

The last point is in the context of a Hamiltonian system, where Liouville's theorem guarantees that the volume of phase space remains the same. So for a Hamiltonian system, entropy increase is subjective.

If the system is not Hamiltonian, then there can be objective increases in entropy.
 
  • #71


Andy Resnick said:
SM is currently able to only generate results for time-independent (equilibrium) systems. The partition function, in particular, does not depend on time. Thermo*dynamics* is not thermo*statics*. Thermodynamics does allow for time-dependent non-equilibrium systems
So does statistical mechanics (SM in your post); it is not restricted to computing partition functions and what can be deduced from it.

See the later chapters of Reichl's Modern Course in Statistical Mechanics, where, for example, the Boltzmann equation (and from that, the Navier-Stokes equations) are derived from statistical mechanics. At least the Navier-Stokes equation counts as nonequilibrium thermodynamics, as it relates thermodynamic quantities varying in space and time.
 
  • #72


Andrew Mason said:
It is a law because it successfully and consistently predicts how things will actually behave. It is never violated. It is not that it is physically impossible for the second law to be violated. It is just that it is statistically impossible for it to be violated. ...

This doesn't prove it is a physical law.
It just tells us how we play dice.

This is my post nr 1111, for no physical reason.
 
  • #73


A. Neumaier said:
Yes, but this doesn't involve a violation of the second law.

Right- because living systems are *open*.

Clearly, the entropy associated with me is much lower than that of a puddle containing all of the atoms in me. Your comment "Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known" is not true in the context of biological systems- biological systems do indeed, locally, decrease entropy by constructing things from simpler elements. From amino acids, we produce functional proteins. From glucose, we produce ATP well in excess of equilibrium concentrations.
 
  • #74


atyy said:
The last point is in the context of a Hamiltonian system, where Liouville's theorem guarantees that the volume of phase space remains the same. So for a Hamiltonian system, entropy increase is subjective.

If the system is not Hamiltonian, then there can be objective increases in entropy.

Fair enough.
 
  • #75


Andy Resnick said:
Your comment "Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known" is not true in the context of biological systems- biological systems do indeed, locally, decrease entropy by constructing things from simpler elements. From amino acids, we produce functional proteins. From glucose, we produce ATP well in excess of equilibrium concentrations.
And where is the evidence that this decrease of entropy is not due to entropy currents associated with the processes in the cell? I don't think there is any.

Just saying that the entropy is lower doesn't say anything about whether the second law is violated. The second law also governs the freezing of ice - where lots of entropy is transported to the surrounding but the local entropy production is still positive.
 
  • #76


A. Neumaier said:
See the later chapters of Reichl's Modern Course in Statistical Mechanics, where, for example, the Boltzmann equation (and from that, the Navier-Stokes equations) are derived from statistical mechanics. At least the Navier-Stokes equation counts as nonequilibrium thermodynamics, as it relates thermodynamic quantities varying in space and time.

Thanks, I'll check it out.

I have Feynman's "Statistical Mechanics" text and a few others, and they all start off on page 1 with something like "The fundamental principle of statistical mechanics is that if a system in equilibrium can be in one of N states |n>, each with energy E_n, the expected value of an observable is <A> = (1/Z) Sum_n <n|A|n> e^(-E_n/kT)".

So right off the bat, the impression is given that only equilibrium systems can be treated, and that time cannot be considered.
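For what it's worth, the quoted formula is straightforward to implement for a toy system (a hypothetical two-level example of my own, diagonal in the energy basis, with Boltzmann's constant absorbed into kT):

```python
import math

def thermal_average(energies, A_diag, kT):
    """Canonical-ensemble expectation <A> = (1/Z) * sum_n A_nn * exp(-E_n/kT),
    for an observable diagonal in the energy basis."""
    weights = [math.exp(-E / kT) for E in energies]
    Z = sum(weights)  # partition function
    return sum(a * w for a, w in zip(A_diag, weights)) / Z

# Two-level system with energies 0 and 1; observable = the energy itself,
# so this returns the mean energy:
U = thermal_average([0.0, 1.0], [0.0, 1.0], kT=1.0)
print(U)  # 1/(1 + e) ≈ 0.269
```

Note that, exactly as Andy Resnick says, time appears nowhere: the recipe only makes sense once the system is in equilibrium at a fixed temperature.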
 
  • #77


A. Neumaier said:
And where is the evidence that this decrease of entropy is not due to entropy currents associated with the processes in the cell? I don't think there is any.

Just saying that the entropy is lower doesn't say anything about whether the second law is violated. The second law also governs the freezing of ice - where lots of entropy is transported to the surrounding but the local entropy production is still positive.

I'm a little confused- are you saying I claim living systems violate the second law of thermodynamics?
 
  • #78


Andy Resnick said:
That's not the point: we can't be so cavalier when discussing nonequilibrium systems. Entropy can change if the temperature is constant: chemical reactions, mixing/diffusion, etc.

I'm not sure what you mean by 'free entropy'- the opposite of free energy? Negentropy?
Sorry, that was a typo - I meant free energy.

Yes, entropy can change when the temperature is constant - it can both increase and decrease without violating the second law. Whereas the free energy is forced to decrease (or stay constant) by the second law.
 
  • #79


Andy Resnick said:
I'm a little confused- are you saying I claim living systems violate the second law of thermodynamics?

You seemed to claim it; but perhaps there is a misunderstanding about the meaning of the terms.

The second law (in its most general, nonequilibrium form) says that the local mean entropy production (a well-defined term in the entropy balance equation) is everywhere nonnegative.

Your remark
Andy Resnick said:
Your comment "Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known" is not true in the context of biological systems- biological systems do indeed, locally, decrease entropy by constructing things from simpler elements.
claims the opposite for biological systems. The same seems to be the case in
Andy Resnick said:
A. Neumaier said:
Biological processes are not exempt from the second law in the form of nonnegative local mean entropy production.
Eh? That's the whole point of living systems- I keep my entropy low at the expense of raising the entropy somewhere else.
On the other hand, your comment
Andy Resnick said:
A. Neumaier said:
Yes, but this doesn't involve a violation of the second law.
Right- because living systems are *open*.
pays lip service to the validity of the second law. So the overall picture is quite confusing.

Indeed, your last-quoted statement is correct. Although the local entropy production is positive (and the second law holds), entropy can decrease in an open system precisely because the system is open - so that entropy can leave the boundaries of the system. But the latter is due to an entropy current, which is not restricted by the second law.
 
  • #80


Andy Resnick said:
I have Feynman's "Statistical Mechanics" text and a few others, and they all start off on page 1 with something like "The fundamental principle of statistical mechanics is that if a system in equilibrium can be in 1 of N states |n>, each with energy E_n, the expected value of an observable <A> is 1/Z Sum(<n|A|n>e^(-E_n/kT)"

So right off the bat, the impression is given that only equilibrium systems can be treated, and that time cannot be considered.

Well, that's the traditional introduction a la Gibbs 1902, who shaped several generations of physicists. Modern books on statistical mechanics often have a nonequilibrium part (especially when this word is in the title). In my 1980 edition of Reichl, the last third (starting in Chapter 13) is about nonequilibrium.
 
  • #81


A. Neumaier said:
Your remark

claims the opposite for biological system. The same seems to be the case in

On the other hand, your comment

pays lip service to the validity of the second law. So the overall picture is quite confusing.

Then let me clarify. Biological systems (or any system) do not violate the second law of thermodynamics. The second law of thermodynamics is perhaps the most fundamental law of nature we have.

Biological systems are *open*. They exchange both matter and energy with their surroundings. Thus, they can *locally* decrease their entropy (for example, by constructing proteins, by synthesizing ATP, by the chemiosmotic mechanism) at the expense of increasing the entropy of their surroundings- waste products.
 
  • #82


Andy Resnick said:
The original physical motivation for 'entropy' was simply developing more efficient engines, and the realization that the only determining factor is the temperature difference between the inlet and outlet.

In fact, temperature has been conspicuously missing from this thread.

I'm not sure if that counts as a physical motivation. It sounds like engineering motivation to me.

It's quite possible that entropy, like conservation of energy, has no physical motivation (i.e., we can't explain why it is mechanistically; we can only state that it's what we observe).

Many other phenomena can be explained by conservation of energy itself, for instance: many things must be so because of conservation of energy, but we (I, more accurately) don't know why conservation of energy must be so.

It appears to me, that entropy has the same such "causal dead end" to it, so far.

addendum:

I see you stated in your last post that the second law is perhaps the most fundamental law of nature we have. I think there's a correlation between feelings like this and "dead end causality" (i.e. most physicists feel the same about conservation laws).
 
  • #83


Pythagorean said:
I'm not sure if that counts as a physical motivation. It sounds like engineering motivation to me.

<snip>

I see you stated in your last post that the second law is perhaps the most fundamental law of nature we have. I think there's a correlation between feelings like this and "dead end causality" (i.e. most physicists feel the same about conservation laws).

To the first sentence, I have no idea what you mean- what exactly do you mean by 'physical' motivation, if not physical objects?

To the last, I think that's overstating the case- I mean that the second law has been verified more times, under more varied circumstances, than any other theory and has so far not once ever been observed to fail.
 
  • #84


Andy Resnick said:
To the first sentence, I have no idea what you mean- what exactly do you mean by 'physical' motivation, if not physical objects?

The concept has only been introduced to me explicitly by my German physics teachers (any good lecturer motivates their lectures, but it's generally an implicit process). Graduates from Germany tend to motivate explicitly, using the word "motivated" and sometimes specifying: educational, biological, physical, or applied (engineering/medical). Then they like to follow with the "ansatz" (which the US doesn't explicitly point out either), etc., through to the conclusion of their argument.

It's basically a "why do we care"? And your answer was, in short, "for engineering reasons" (speaking of "making" things "more efficient"). It's not bad or wrong to not have a physical motivation.

To the last, I think that's overstating the case- I mean that the second law has been verified more times, under more varied circumstances, than any other theory and has so far not once ever been observed to fail.

I don't see the contradiction with what I said. The same can be said for conservation of energy, yet we have no physical mechanism that explains conservation of energy. I merely pointed out that there's a correlation between these two things (no mechanistic explanation and a tendency for them to be pervasive, and for scientists to call them "fundamental".)
 
  • #85


A. Neumaier said:
See the later chapters of Reichl's Modern Course in Statistical Mechanics, where, for example, the Boltzmann equation (and from that, the Navier-Stokes equations) are derived from statistical mechanics. At least the Navier-Stokes equation counts as nonequilibrium thermodynamics, as it relates thermodynamic quantities varying in space and time.

I looked through the table of contents on Amazon; the hydrodynamic coverage appears very similar to Chaikin and Lubensky's "Principles of Condensed Matter".

Out of curiosity, what does Reichl write down as the statistical mechanical result for viscosity?
 
  • #86


Pythagorean said:
The concept has only been introduced to me explicitly by my German physics teachers (any good lecturer motivates their lectures, but it's generally an implicit process). Graduates from Germany tend to motivate explicitly, using the word "motivated" and sometimes specifying: educational, biological, physical, or applied (engineering/medical). Then they like to follow with the "ansatz" (which the US doesn't explicitly point out either), etc., through to the conclusion of their argument.

It's basically a "why do we care"? And your answer was, in short, "for engineering reasons" (speaking of "making" things "more efficient"). It's not bad or wrong to not have a physical motivation.

I guess I still don't understand what you mean by 'physical motivation'. If it's not a mathematical/conceptual model or a physical object, what is left?

Pythagorean said:
I don't see the contradiction with what I said. The same can be said for conservation of energy, yet we have no physical mechanism that explains conservation of energy. I merely pointed out that there's a correlation between these two things (no mechanistic explanation and a tendency for them to be pervasive, and for scientists to call them "fundamental".)

No, there's a difference: conservation of energy (or any conservation law)- recall that 'conservation' is a *mathematical* statement- follows from elementary balance laws: the amount of something in a control volume 'now' equals the amount previously present, plus however much was produced within the volume, plus the net flux entering through its boundary.
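In control-volume form (standard notation, not quoted from the post), that balance statement reads:

```latex
\frac{d}{dt}\int_V \rho \, dV
  = \int_V \pi \, dV - \oint_{\partial V} \vec{j} \cdot d\vec{A},
```

where $\rho$ is the density of the quantity, $\pi$ its local production rate, and $\vec{j}$ its outward flux; a "conservation law" is the special case $\pi = 0$.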
 
  • #87


Pythagorean said:
Whereas many other phenomena can be explained by conservation of energy, itself, for instance; many things must be so because of conservation of energy, but we (I, more accurately) don't know why conservation of energy must be so.
We know exactly WHY energy is conserved. Conservation of energy follows from the definition of force and the equations for electric and gravitational force.

$$\vec F = \frac{d\vec p}{dt}$$

$$\vec F_{\text{grav}} = -\frac{GmM}{r^2}\hat r$$

$$\vec F_{\text{elec}} = \frac{qQ}{4\pi\epsilon_0 r^2}\hat r$$

If gravity force and the electric force are the only forces that apply, the quantity KE + PE will be conserved. We know why that is, given those forces. (I am not saying that we know why these forces exist).

The problem only arises when we introduce mechanical forces. But that is not because mechanical forces are different. They are entirely the result of gravity and/or the electric force. The problem is the complexity of the interaction. It is not a matter of the interaction of two masses or two charges. Rather it is the interaction of enormous numbers of little masses and charges and it becomes impossible to calculate the effects of all those little interactions. But we know that since all the interactions involve either gravity or electric force that the total KE+PE of all the matter involved will be the same before and after the interaction(s). That is the first law of thermodynamics.

AM
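Andrew Mason's claim that KE + PE stays constant under such a force can be checked numerically; here is a minimal sketch (my own toy example, with assumed units GM = 1 and a symplectic leapfrog integrator):

```python
import math

# Toy check: a particle on a circular orbit under an inverse-square
# force, integrated with kick-drift-kick leapfrog, which keeps the
# energy error bounded over long times.
GM = 1.0
dt = 1e-3

x, y = 1.0, 0.0      # position
vx, vy = 0.0, 1.0    # velocity chosen so the orbit is circular

def accel(x, y):
    r3 = (x * x + y * y) ** 1.5
    return -GM * x / r3, -GM * y / r3

def energy(x, y, vx, vy):
    # kinetic plus gravitational potential energy (per unit mass)
    return 0.5 * (vx * vx + vy * vy) - GM / math.hypot(x, y)

E0 = energy(x, y, vx, vy)          # -0.5 for these initial conditions
ax, ay = accel(x, y)
for _ in range(10_000):            # about 1.6 orbits at this step size
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick
    x += dt * vx; y += dt * vy                 # drift
    ax, ay = accel(x, y)
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay   # half kick

print(f"energy drift: {energy(x, y, vx, vy) - E0:.2e}")
```

The drift stays tiny because the force is conservative; dissipation never enters, which is exactly why the hard part (the second law) lives elsewhere.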
 
  • #88


Andy Resnick said:
I guess I still don't understand what you mean by 'physical motivation'. If it's not a mathematical/conceptual model or a physical object, what is left?

1. observations don't match theory (this is how Planck motivated discretizing the energy in the blackbody integral)
2. finding the interface between two physical theories (i.e. quantum chaos)
3. finding a mechanism for an observation ("why is there night?" and no, not the proverbial "why"; in this case, the answer would be the motion of the planetary bodies)

I was asking if there was a 3. Why is there entropy?

No, there's a difference: conservation of energy (or any conservation law)- recall that 'conservation' is a *mathematical* statement- follows from elementary balance laws: the amount of something in a control volume 'now' is equal to the amount previous, in addition to however much was produced within the volume and the net flux entering the volume.

This isn't where my confusion is, my confusion must be: I'm still under the impression the second law is also a *mathematical* statement (derived from statistics). Is that wrong?

Andrew Mason said:
But we know that since all the interactions involve either gravity or electric force that the total KE+PE of all the matter involved will be the same before and after the interaction(s). That is the first law of thermodynamics.

AM

Oh wow, I can't believe I never realized the 1st law of thermodynamics was equivalent to energy conservation. That's a little embarrassing...
 
  • #89


Pythagorean said:
This isn't where my confusion is, my confusion must be: I'm still under the impression the second law is also a *mathematical* statement (derived from statistics). Is that wrong?

The second law is not derived from classical statistical mechanics. Classical statistical mechanics is a somewhat unprincipled way to predict equilibrium quantities, but the fact that it reproduces equilibrium thermodynamics from a microscopic viewpoint shows that there is something to it.

The definition of a state variable called entropy, and its non-decrease in an isolated system is derived from the Kelvin statement of the second law of thermodynamics, or equivalently the Clausius statement, which is almost a "plain English" sentence. In contrast to the starting point of classical statistical mechanics, the Kelvin and Clausius statements are supported by lots of data. It is one of the most amazing deductions of physics, certainly on par with Einstein's deduction of special relativity from 2 postulates.

So basically entropy exists because, as Andrew Mason said early in this thread:

Andrew Mason said:
The easiest form of the second law to understand is the Clausius statement of the second law:

"No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature."

The Kelvin statement can be shown to be equivalent:

"No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work."
 
  • #90


Ok, this is becoming a lot more physically tangible to me now.

My next step is really to find and look over the Clausius derivation; thank you everyone for your input.
 
