What is the real second law of thermodynamics?

SUMMARY

The second law of thermodynamics states that the entropy of an isolated system does not decrease over time, as articulated through the Clausius and Kelvin statements. The discussion highlights that while entropy generally increases, the Poincaré recurrence theorem suggests that an isolated system can return arbitrarily close to a previous state, albeit over an impractically long timescale. The conversation also emphasizes the distinction between classical and statistical thermodynamics, noting that the second law is fundamentally a statistical statement rather than an absolute one. Additionally, the fluctuation-dissipation theorem indicates that entropy can decrease in small systems under specific conditions.

PREREQUISITES
  • Understanding of classical thermodynamics principles
  • Familiarity with statistical mechanics concepts
  • Knowledge of the Poincaré recurrence theorem
  • Basic grasp of quantum mechanics and its implications for thermodynamics
NEXT STEPS
  • Study the implications of the Poincaré recurrence theorem in thermodynamic systems
  • Explore the fluctuation-dissipation theorem and its applications in statistical mechanics
  • Investigate the differences between classical and quantum mechanical interpretations of entropy
  • Learn about the Boltzmann distribution and its significance in thermodynamic equilibrium
USEFUL FOR

Physicists, thermodynamics students, and researchers interested in the foundational principles of entropy and its implications in both classical and quantum systems.

  • #91


An exposition of the route from the Kelvin and Clausius statements to the existence of a state variable called entropy and its non-decrease in an isolated system can be found at http://ocw.mit.edu/courses/physics/8-333-statistical-mechanics-i-statistical-mechanics-of-particles-fall-2007/lecture-notes/lec2.pdf

Wikipedia has a fascinating article on the history of entropy: http://en.wikipedia.org/wiki/History_of_entropy . I would be interested to know what people think of Jaynes's view.
 
Last edited by a moderator:
  • #92


Andy Resnick said:
Then let me clarify. Biological systems (or any system) do not violate the second law of thermodynamics. The second law of thermodynamics is perhaps the most fundamental law of nature we have.

Biological systems are *open*. They exchange both matter and energy with their surroundings. Thus, they can *locally* decrease their entropy (for example, by constructing proteins, by synthesizing ATP, by the chemiosmotic mechanism) at the expense of increasing the entropy of their surroundings- waste products.

OK. I agree. The point is that open systems not only exchange matter and energy but also entropy, and this explains why they can locally decrease their entropy (or energy or matter density).

Our misunderstanding was about the meaning of the term ''local entropy production''. This is a technical term and describes only that part of the local entropy change that cannot be explained by entropy flow. It is this term that the second law requires to be nonnegative (in a classical setting, interpretable as the amount of information irreversibly lost to the neglected microscopic degrees of freedom), no matter how much entropy flows into or out of the system.
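For concreteness, this can be written as the standard local entropy balance (a sketch in conventional notation):

∂s/∂t + div(j_s) = σ, with σ ≥ 0,

where s is the entropy density, j_s the entropy flux (which may have either sign), and σ the local entropy production. The second law constrains only σ; a region of an open system can still lower its entropy whenever the net entropy outflow exceeds σ.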
 
  • #93


Andy Resnick said:
I looked through the table of contents on Amazon; the hydrodynamic coverage appears very similar to Chaikin and Lubensky's "Principles of Condensed Matter".

Out of curiosity, what does Reichl write down as the statistical mechanical result for viscosity?
It is expressed at the end of Section 13F (in my edition) as a momentum integral involving the inverse of the Lorentz-Boltzmann collision operator - too complex to write down.
 
  • #95


A. Neumaier said:
It is expressed at the end of Section 13F (in my edition) as a momentum integral involving the inverse of the Lorentz-Boltzmann collision operator - too complex to write down.

That's what I expected- one of my criticisms of statistical mechanics (as in all mechanical theories) is that dissipative processes cannot currently be described in any useful way.
 
  • #96


Andy Resnick said:
That's what I expected- one of my criticisms of statistical mechanics (as in all mechanical theories) is that dissipative processes cannot currently be described in any useful way.
No, you misunderstood: they can be used for quantitative predictions, and Reichl gives explicit formulas (though not down to the point where the viscosity coefficient was numerically evaluated for a particular gas). With ''too complex to be written down'' I only meant that I didn't want to transcribe them into LaTeX for the post.
 
  • #97


A. Neumaier said:
No, you misunderstood: they can be used for quantitative predictions, and Reichl gives explicit formulas (though not down to the point where the viscosity coefficient was numerically evaluated for a particular gas). With ''too complex to be written down'' I only meant that I didn't want to transcribe them into LaTeX for the post.

Yeah, but that's just for linear viscosity - Chaikin and Lubensky provide calculations against experiment for liquid argon. And that's perhaps the simplest case. I'd like to see how statistical mechanics handles creep. Or fracture. Or water.

The reality is, in order to generate useful physical models, multiple conceptual approaches must be used- elements of both thermodynamics and statistical mechanics are often used to understand protein folding, for example.
 
Last edited:
  • #98


lalbatros said:
This doesn't prove it is a physical law.
It just tells us how we play dice.

This is my post nr 1111, for no physical reason.
One does not prove physical laws. Mathematics is about proof. Science is about disproof.

In order to debate the issue, one has to start with a definition of a physical law.

I would suggest the following definition: a physical law is:

a) a statement about some physical property or physical phenomenon that purports to be either universally true or true over a certain range of conditions,

b) that leads to predictions of physical behaviour that can be observed, so it would be falsifiable by the existence of contrary physical evidence and

c) for which no physical evidence has yet been found that falsifies it.​

I suggest that the second law of thermodynamics fits this definition.

AM
 
Last edited:
  • #99


I agree with you, Andrew.

However, precisely, I have often asked myself how the second law could be disproved.
That's probably why it has been given different formulations, apparently equivalent.

When we observe entropy-decreasing fluctuations, we are supposed to "say" that the second law is irrelevant on the microscopic level.
Of course, if we wait longer -say 1 billion years- we increase the chance to observe entropy-decreasing fluctuations on scales that we -today- would call "macroscopic".
Looks like, then, that the second law is about "small-scale" and "short-times" phenomena.
Is then "small" and "short" to be understood on a human-scale?
Does that fit in the definition of a physical law?
Probably yes according to your definition list.
But that doesn't wipe out my problem.
It just means that I have to re-phrase my question.

Consider now the Clausius statement (http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Clausius_statement) of the second principle:

No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature​
Should I conclude, based on your definition of a physical law, that any fluctuation disproves the second principle? It would then be a physical law, but a wrong one!

Further, can't we also take the point of view that the Clausius statement simply defines what "temperature" means? Even empirical temperatures would fit the Clausius definition of temperature according to the Clausius statement of the second principle.
Defining that heat (energy) goes from hot to cold, is that a law of physics?
Can a definition become a law?

However, I would be a very ungrateful engineer if I did not recognize the value of the second principle, at least for engineering. But I can hardly consider it a law of physics. I would rather consider it a pre-law of physics: a law that still needs to be discovered.
 
Last edited by a moderator:
  • #100


Andy Resnick said:
Yeah, but that's just for linear viscosity - Chaikin and Lubensky provide calculations against experiment for liquid argon. And that's perhaps the simplest case. I'd like to see how statistical mechanics handles creep. Or fracture. Or water.
Of course, as the models get more complex and the questions asked more detailed, calculations from first principles become too unwieldy to be actually performed. But they suffice to give a proof of concept. And the same holds for equilibrium thermodynamics.

Engineers will always find it simpler to measure the thermodynamic properties of water/steam than to derive them from a partition function. But if one wants to understand water in extreme situations where it can't be measured directly, the ab initio methods are useful.

The same holds for nonequilibrium statistical mechanics. It is far easier to measure the viscosity of water than to calculate it from first principles. But, for example, people use the theory to understand inflation after the big bang, where it is impossible to make experiments.
Andy Resnick said:
The reality is, in order to generate useful physical models, multiple conceptual approaches must be used- elements of both thermodynamics and statistical mechanics are often used to understand protein folding, for example.
You might be interested in my survey paper
A. Neumaier,
Molecular modeling of proteins and mathematical prediction of protein structure,
SIAM Rev. 39 (1997), 407-460.
http://arnold-neumaier.at/papers/physpapers.html#protein
 
  • #101


lalbatros said:
When we observe entropy-decreasing fluctuations, we are supposed to "say" that the second law is irrelevant on the microscopic level.
Entropy requires a temperature. Temperature requires a sufficiently large number of molecules to have a Maxwell-Boltzmann distribution of energies. So it is not that entropy is irrelevant at the small scale. It is really that it is not defined at the small scale.
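For reference, the Maxwell-Boltzmann speed distribution being referred to has the standard form

f(v) = 4 pi n (m/(2 pi k T))^(3/2) v^2 exp(-m v^2 / (2 k T)),

with n the number density and m the molecular mass; T enters only as the parameter fixing the width of this distribution, so with too few molecules the distribution, and hence T, is not well defined.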
Of course, if we wait longer -say 1 billion years- we increase the chance to observe entropy-decreasing fluctuations on scales that we -today- would call "macroscopic".
No. If there were an event for which one only had to wait a billion years, there would be a good chance of observing it happening somewhere in the universe all the time. The second law is not being violated anywhere in the universe.
Looks like, then, that the second law is about "small-scale" and "short-times" phenomena.
Is then "small" and "short" to be understood on a human-scale?
The second law will not be violated on any time scale with respect to any collection of molecules for which a temperature can be defined.

Should I conclude, based on your definition of a physical law, that any fluctuation disproves the second principle? It would then be a physical law, but a wrong one!
Do you mean that if you witnessed an event expected to occur only once in 10^billion universe lifetimes, you would have disproved the second law?

Suppose you thought you witnessed such an event, that heat flowed spontaneously from cold to hot all by itself for a brief instant. Since you could not repeat the result and no one else can repeat it because it never occurs again, what have you proven? You could never really determine whether your result was a mistake or a real observation. The chances are much better that it was a mistake than a real event.
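As a rough illustration of the odds involved (using the standard estimate that the probability of a fluctuation with entropy change ΔS scales as exp(ΔS/k)): if 1 J of heat passed spontaneously from a body at 299 K to one at 300 K, then ΔS = (1 J) x (1/300 K - 1/299 K) ≈ -1.1 x 10^-5 J/K, so ΔS/k ≈ -8 x 10^17, and the probability is of order exp(-8 x 10^17), negligible on any conceivable timescale.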

Further, can't we also take the point of view that the Clausius statement simply defines what "temperature" means? Even empirical temperatures would fit the Clausius definition of temperature according to the Clausius statement of the second principle.
Defining that heat (energy) goes from hot to cold, is that a law of physics?
Can a definition become a law?
No. The temperature of matter - a collection of molecules - is defined by the Maxwell-Boltzmann distribution. One cannot just make up a definition of temperature. It is a real law based on science not an arbitrary definition.

AM
 
  • #102


Andrew Mason said:
Entropy requires a temperature. Temperature requires a sufficiently large number of molecules
The Boltzmann equation shows that entropy does not require a temperature.

The deterministic version of thermodynamics is valid only for systems in local equilibrium, which requires a macroscopic system with enough molecules to reliably average over them.

Fluctuating thermodynamics (the version defined by statistical mechanics, which is valid for any system size) has little predictive power for small systems, since for these it predicts fluctuations of a size that may erase any informative content in the mean values.
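As a rough illustration: for a classical ideal gas in the canonical ensemble, the relative energy fluctuation is

ΔE_rms / <E> = sqrt(2/(3N)),

which is about 10^-12 for a mole of gas but of order one for a handful of molecules, so for very small systems the mean values indeed carry almost no information.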
 
  • #103


Dear All,

I feel much more comfortable considering the second law to be an engineering heuristic.
And I don't even feel totally confident in this view!
Sorry for that!

To go back again to the Clausius formulation, I believe the full construct of thermodynamics can be derived from this statement. It can lead to the existence of an entropy state function and a thermodynamic temperature scale. But this is all a direct consequence of defining the notions of "hot" and "cold", a consequence of defining hot and cold as the direction in which heat flows. Heat was already known before the second principle.
Where is there real physics in the Clausius statement?
Where is there real information on our physical world?
This Clausius statement is more like instructions for engineers to construct thermodynamic tables from experiment in an organized way, and first of all it instructed engineers to build a consistent thermometer!
What could you do with the Clausius principle if you had no thermodynamic tables?
You reap the results of the Clausius statement only when a huge number of experiments has been tabulated (recorded) in a rational way.
That's a huge achievement, but I don't really see any physical law in the Clausius statement.

For me the real physics comes with the statistical thermodynamics, with Boltzmann and Gibbs.
Thermodynamic tables can be calculated ab-initio thanks to their work.
And their work acknowledges clearly the existence of fluctuations.

Michel
 
Last edited:
  • #104


A. Neumaier said:
Of course, as the models get more complex and the questions asked more detailed, calculations from first principles become too unwieldy to be actually performed. But they suffice to give a proof of concept.

Spoken like a true mathematician! :)
 
  • #105


A. Neumaier said:
The Boltzmann equation shows that entropy does not require a temperature.
Perhaps you could explain why, in the Boltzmann equation for entropy, the Boltzmann constant has units of joules/kelvin. Or why not just rewrite the second law of thermodynamics without reference to temperature?

AM
 
  • #106


Andrew Mason said:
Perhaps you could explain why, in the Boltzmann equation for entropy, the Boltzmann constant has units of joules/kelvin. Or why not just rewrite the second law of thermodynamics without reference to temperature?

AM

I think he's referring to the case of ground state degeneracy.
 
  • #107


Andrew Mason said:
Perhaps you could explain why, in the Boltzmann equation for entropy, the Boltzmann constant has units of joules/kelvin. Or why not just rewrite the second law of thermodynamics without reference to temperature?

AM

The Boltzmann constant doesn't appear in the H-function or in the H-theorem, and it doesn't even appear in the Boltzmann equation.

The Boltzmann equation can, within its range of validity, predict the evolution of any distribution. The distribution doesn't need to be a Maxwellian, and therefore the temperature simply doesn't play any role in the Boltzmann equation, the H-function, or the H-theorem. The same applies to other, indeed all, master equations in statistical mechanics, except when they are specialized to near-equilibrium solutions. The Maxwellian distribution comes into play as a special stationary solution, and for this special solution the temperature can be taken into consideration. The Boltzmann constant is introduced in the Boltzmann S-function for mere convenience. By doing so the results for (near-)equilibrium distributions can be compared with thermodynamics.
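For reference, in standard textbook form (conventions differ slightly between books):

H(t) = integral of f ln f over d^3x d^3v, with dH/dt ≤ 0

for any distribution f evolving under the Boltzmann equation, and the entropy is recovered as S = -kH (up to an additive constant). Neither the Boltzmann constant nor the temperature appears in the H-theorem itself; they enter only when the stationary Maxwellian solution is matched to classical thermodynamics.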

This illustrates my point that the second law is more like an "engineering heuristic".
In addition, it also shows that the classical statements of the second law (http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Description) are only a very special case of a much broader "second law". The H-theorem applies to the thermalisation of particle distributions, which is not in the scope of the "second law" of thermodynamics as formulated by Clausius or Kelvin or Carathéodory.

Concerning the second law (of thermodynamics): it is the basis for constructing the thermodynamic temperature scale.
Having defined this scale, the recipe to build entropy tables is the famous relation dS = dQ/T, where the temperature factor testifies to an assumption of equilibrium. Note that two of the classical statements of the second law (http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Description) make no explicit reference to temperature.
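As a concrete illustration of that recipe: for a substance heated reversibly at constant pressure, tabulated heat-capacity data give the entropy directly,

S(T) = S(T0) + integral from T0 to T of Cp(T')/T' dT',

with an extra ΔH/T term at each phase transition; this is essentially how steam-table entropies are compiled from calorimetric measurements.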

In summary, there are many overlapping definitions of entropy!
 
Last edited by a moderator:
  • #108


lalbatros said:
The Boltzmann constant doesn't appear in the H-function or in the H-theorem, and it doesn't even appear in the Boltzmann equation.

I'm not sure that's a fair criticism- the quantity 'H' (H = Sum(p log p) + const.) isn't the entropy either, S = -kH.
 
  • #109


Andy Resnick said:
I'm not sure that's a fair criticism- the quantity 'H' (H = Sum(p log p) + const.) isn't the entropy either, S = -kH.

Up to its sign, H is the entropy in units where the Boltzmann constant is set to 1. That the constant has a value different from 1 is due to historical accidents in the definition of the temperature scale, similar to the accidents that make the speed of light or the Planck constant not exactly 1.
 
  • #110


lalbatros said:
It just tells us how we play dice.

Life is a game, what else do you ask for besides a rational strategy? ;)

You can see a THEORY as a descriptive model, or, as an INTERACTION TOOL in an inference perspective.

A descriptive model is falsified or corroborated. Corroborated theories live on. Falsified theories drop dead, leaving no clue as to how to improve.

An interaction tool for inference is different: it either adapts and learns, or it doesn't. Here falsification corresponds to "failure to learn". The hosting inference system will be outcompeted by a more clever competitor. This may be one reason to suspect that the laws of physics we actually find in nature do have inferential status.

It's much more than playing dice.

/Fredrik
 
  • #111


To think that we can DESCRIBE the future is, IMHO, a very irrational illusion.

All we have, are expectations of the future, based on the present (including present records of the past), and based upon this we have to throw our dice. There is no other way.

In this respect, the second law is one of the few "laws" that are cast in a proper inferential form as is.

Can anyone seriously suggest, as you say, that we understand Newton's law of gravity but do not understand the second law? If one of them is mysterious, I can't see how it's the second law.

/Fredrik
 
  • #112


Fra said:
To think that we can DESCRIBE the future is, IMHO, a very irrational illusion.
Both future and past are illusions - real is only the present (if anything at all).

Nevertheless, it is often easier to predict the future than the past. The past of a stirred fluid coming slowly to rest is far more unpredictable than the final rest state.
 
  • #113


Andy Resnick said:
I'm not sure that's a fair criticism- the quantity 'H' (H = Sum(p log p) + const.) isn't the entropy either, S = -kH.

I simply don't see what could be obtained from a dimensional argument.
The thermodynamic dimension of entropy is purely conventional.

The factor is there as a connection between a measure of disorder and a measure of energy.
Nevertheless, disorder can be defined without any relation to energy.
The historical path to entropy doesn't imply that entropy requires any concept of thermodynamics.

The widespread use of entropy today has clearly shown that it is not an exclusively thermodynamic concept. We also know that entropy finds a wide range of application in thermodynamics.
It should be no surprise that the use of entropy in thermodynamics requires a conversion factor. This factor converts a measure of disorder to the width of a Maxwellian distribution.
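Stated explicitly, the conversion is just the standard identity

S_thermo = k S_info, with S_info = -Sum(p ln p) (in nats),

so the disorder measure itself carries no units; the factor k merely matches it to the conventional kelvin scale used in dS = dQ/T.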
 
  • #115


A. Neumaier said:
Up to its sign, H is the entropy in units where the Boltzmann constant is set to 1. That the constant has a value different from 1 is due to historical accidents in the definition of the temperature scale, similar to the accidents that make the speed of light or the Planck constant not exactly 1.

Not exactly- you may set the numerical value of k to 1, but there are still units. Temperature can be made equivalent to energy. One is not primary over the other, just as length and time are equivalent.

Even so, that only holds for equilibrium: thermostatics or thermokinetics. It does not hold in the fully nonequilibrium case. Jou, Casas-Vazquez, and Lebon's "Extended Irreversible Thermodynamics" and Truesdell's "Rational Thermodynamics" both have good discussions about this.
 
  • #117


Andy Resnick said:
That is true- but then what is the temperature of a sandpile?

http://rmp.aps.org/abstract/RMP/v64/i1/p321_1
http://pra.aps.org/abstract/PRA/v42/i4/p2467_1
http://pra.aps.org/abstract/PRA/v38/i1/p364_1

it's not a simple question to answer.

I have at least two reasons to enjoy this type of sandpile physics:

  1. I work for the cement industry, and there are many granular materials there:
    limestone, chalk (made of Coccoliths!), sand, clay, slag, fly ashes, ground coal, rubber chips, plastic pellets, ...
  2. I enjoyed reading the book by Pierre-Gilles de Gennes on https://www.amazon.com/dp/0387986561/?tag=pfamazon01-20
It is true that the (excited) avalanche phenomenon near the critical angle of repose has an analogy with a barrier-crossing phenomenon that can be associated with an equivalent temperature (fig. 85.c in https://www.amazon.com/dp/0387986561/?tag=pfamazon01-20). I guess this temperature might indeed represent the probability distribution of the grain energies acquired from the external excitation.

Obviously, this is again an example taken from (statistical) mechanics.
Therefore, the entropy that one might consider here is again related to the distribution of energy.
And therefore this is one more energy-related entropy.

If we consider that any information, in the end, needs a physical substrate to be stored, then effectively the whole world is mechanical and, in the end, any entropy could be related to an energy distribution.
As long as there are no degenerate states, of course ...
So the question about entropy and energy could be translated into:

How much information is stored in degenerate states compared to how much is stored on energy levels? (in our universe)

My guess goes for no degeneracy.
Meaning that the history of physics has been right on this point since Boltzmann: it would make sense to give energy dimensions to entropy!
 
Last edited by a moderator:
  • #118


lalbatros said:
I have at least two reasons to enjoy this type of sandpile physics:

Glad to hear it- I find it interesting as well. I was involved with a few soft-matter groups when I worked at NASA.


lalbatros said:
Obviously, this is again an example taken from (statistical) mechanics.
Therefore, the entropy that one might consider here is again related to the distribution of energy.
And therefore this is one more energy-related entropy.

I think you still misunderstand me- what you say above is of course correct, but I think you missed my main point, which is that temperature and entropy should not be simply equated with energy. Entropy is energy/degree, and so there is an essential role for temperature in the entropy.

How about this example- laser light. Even though laser light has an exceedingly well-defined energy, it has *no* temperature:

http://arxiv.org/abs/cond-mat/0209043

They specifically address the difficulty in assigning a temperature and an entropy to an out-of-equilibrium system:

"Out of equilibrium, the entropy S lacks a clear functional dependence on the total energy
E, and the definition of T becomes ambiguous."

Again, laser light is a highly coherent state, is resistant to thermalization in spite of interacting with the environment, has a well defined energy and momentum, and yet has no clear entropy or temperature.
 
  • #119


Andy Resnick said:
They specifically address the difficulty in assigning a temperature and an entropy to an out-of-equilibrium system:

"Out of equilibrium, the entropy S lacks a clear functional dependence on the total energy
E, and the definition of T becomes ambiguous."

I think there should be no problem to define the entropy, even though the temperature might be totally undefined.

It is clear that entropy is not a function of energy in general.
Just consider the superposition of two bell-shaped distributions.
What is the "temperature" of this distribution?
Even when the two distributions are Maxwellians, you would still be forced to describe the global distribution by three numbers: two temperatures and the percentage of each distribution in the total.
This is a very common situation.
Very often there are several populations that do not thermalize even when reaching a steady state (open system).
For example, the electron and ion temperatures are generally very different in a tokamak.
Even different ion species might have different distributions in a tokamak, especially heavy ions compared to light ions.
There might even be two populations of electrons, not to mention runaway electrons in extreme situations.
It is quite clear that in all these non-equilibrium situations the entropy is perfectly well defined, as is the energy, but the entropy is no longer a function of the energy. Therefore, a temperature cannot be defined.
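As a sketch of the two-population example: a mixture of two Maxwellians,

f(v) = w f_M(v; T1) + (1 - w) f_M(v; T2),

has a perfectly well-defined entropy S = -k integral of f ln f d^3v (up to the usual normalization conventions) and a well-defined mean energy, but it needs the three numbers (T1, T2, w); no single temperature reproduces it.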

I will read the paper later.
However, the introduction suggests that temperature could be sometimes defined in non-equilibrium situations.
I agree with that, with the temporary naive idea that this will be the case when, at least approximately, S = S(E).
One can easily build artificial examples.
For example, one could constrain a distribution to be Lorentzian instead of Maxwellian, or any other suitable one-parameter distribution. Within this constraint, S would be a function of E via the one parameter defining the distribution. Temperature would then be defined in this situation.
I am curious to see a more physical example in the paper.
I am also curious to think about which "thermodynamic relations" would still hold and which should be removed, if any.

Thanks for the reference,

Michel
 
  • #120


Andy Resnick said:
Not exactly- you may set the numerical value of k to 1, but there are still units. Temperature can be made equivalent to energy. One is not primary over the other, just as length and time are equivalent.
The units are arbitrary, since the kelvin is an independent unit defined only by an experimental procedure. If you set k=1, temperature has the units of energy, and entropy is unitless.
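In formulas, this is just the standard dimensional argument: with k = 1 the entropy S = ln W is a pure number, and

1/T = ∂S/∂E, so T has the dimensions of E;

temperature is then quoted directly in energy units (as in particle physics, where temperatures are given in eV), and the kelvin survives only as a historical convention, with k as the conversion factor.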
Andy Resnick said:
Even so, that only holds for equilibrium: thermostatics or thermokinetics. It does not hold in the fully nonequilibrium case. Jou, Casas-Vazquez, and Lebon's "Extended Irreversible Thermodynamics" and Truesdell's "Rational Thermodynamics" both have good discussions about this.
The hydrodynamic level of nonequilibrium thermodynamics is characterized by local equilibrium (in position space). On this level, dynamics is governed by hydrodynamics, variants of the Navier-Stokes equations. Here temperature exists, being defined as the equilibrium temperature of an (in the limit infinitesimal) cell. Or, formally, as the inverse of the quantity conjugate to energy.

A more detailed level is the kinetic level, characterized by microlocal equilibrium (in phase space). On this level, dynamics is governed by kinetic equations, variants of the Boltzmann equation. Entropy still exists, being defined even on the more detailed quantum level. Temperature does not exist on this level, but appears as an implicit parameter field in the hydrodynamic limit: The kinetic dynamics is approximated in a local equilibrium setting, by assuming that the local momentum distribution is Maxwellian. The temperature is a parameter in the Gaussian local momentum distribution, and makes no sense outside the Gaussian approximation.
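For concreteness, the Gaussian (local Maxwellian) ansatz referred to here has the standard form

f(x,v,t) ≈ n(x,t) (m/(2 pi k T(x,t)))^(3/2) exp(-m |v - u(x,t)|^2 / (2 k T(x,t))),

where n, u, and T are the local density, velocity, and temperature fields; outside this ansatz the parameter T simply has no place in the kinetic description.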
 
