What is the real second law of thermodynamics?

SUMMARY

The second law of thermodynamics states that the entropy of an isolated system does not decrease over time, as articulated through the Clausius and Kelvin statements. The discussion highlights that while entropy generally increases, the Poincaré recurrence theorem implies that a closed system can return arbitrarily close to a previous state, albeit over an impractically long timescale. The conversation also emphasizes the distinction between classical and statistical thermodynamics, noting that the second law is fundamentally a statistical statement rather than an absolute one. Additionally, the fluctuation theorem indicates that entropy can transiently decrease in small systems under specific conditions.

PREREQUISITES
  • Understanding of classical thermodynamics principles
  • Familiarity with statistical mechanics concepts
  • Knowledge of the Poincaré recurrence theorem
  • Basic grasp of quantum mechanics and its implications for thermodynamics
NEXT STEPS
  • Study the implications of the Poincaré recurrence theorem in thermodynamic systems
  • Explore the fluctuation-dissipation theorem and its applications in statistical mechanics
  • Investigate the differences between classical and quantum mechanical interpretations of entropy
  • Learn about the Boltzmann distribution and its significance in thermodynamic equilibrium
USEFUL FOR

Physicists, thermodynamics students, and researchers interested in the foundational principles of entropy and its implications in both classical and quantum systems.

  • #31


I guess this is the problem with watching lectures and not doing the homework. Thanks to physicsforums for discussion.
 
  • #32


Pythagorean said:
So who do I believe? Somebody on physics forums or a Stanford professor of quantum statistics?

Believe who you want- science cares not a whit about credentials.
 
  • #33


Pythagorean said:
One of my physics professors conveyed a definition of thermodynamics to me that has stuck. I believe it was originally presented by Susskind (who has lectures available online).

The second law is only really justifiable theoretically with quantum mechanics (the classical mechanics description is limited by the Planck constant and is more of an experimental fact).

But let's start with the classical view, using phase space (a plot of the position vs. the momentum of a particle or set of particles). You pick a point and that represents a particle and you trace it through phase space. Since they're deterministic equations in the classical view, you can trace them back to their origin with no problem, even chaotic systems.

Now, if we consider quantum mechanics, we suddenly have an issue when we trace the particle back to its origin on the phase plot. Namely, it could have come from any arbitrary point within a region the size of Planck's constant (which is an area on the phase plot).

That is, due to indistinguishability and Heisenberg uncertainty, we have an inherent loss of information in the universe about the state of the particles whose motion (characterized by position and momentum) is directly related to energy and this loss of information is entropy.

(i) Does classical statistical mechanics require QM for its justification - no. We can stick to canonical variables in phase space. The conservation of phase space volume is not at odds with the apparent increase in volume that we call an increase in entropy - see eg. http://www.necsi.edu/projects/baranger/cce.pdf

(ii) Does classical statistical mechanics of identical particles require QM for its justification -yes. Because classically, identical particles have distinct trajectories, and so cannot be really identical.

Your professor was probably referring to the second idea, not the first.
 
  • #34


Pythagorean said:
Which is why I liked Susskind's treatment, but now I'm being told Susskind was wrong by AM (or at least my interpretation of it).
When did I say or even suggest that Susskind was wrong? I simply suggested that as a pedagogical matter the concept of entropy would be easier to understand if he started at the beginning rather than at the end of the history of that concept. His lecture, after all, is supposed to be an introduction of the second law of thermodynamics.

I don't pretend to understand what your interpretation of entropy is or what your interpretation of Susskind is.

So who do I believe? Somebody on physics forums or a Stanford professor of quantum statistics?
Science is not about "belief". It is about understanding so as to be able to describe and predict the behaviour of the physical world. So you should use the resources that best help you acquire that understanding. I would recommend Feynman's lectures on Physics Vol 1, Ch. 39-46.

AM
 
  • #35


A subdivision of theoretical physics is in order.

One sort involves objectification.

The other is subjectification. Thermodynamics falls under this category: it is about what can be inferred from incomplete knowledge. It is concerned with how ignorance is reduced as information is acquired, with the converse process, and with how both evolve in time.
 
  • #36


Phrak,

I almost agree with you, except for what I don't understand.

However, how can we conceive that "knowledge" would be part of physics?
The world evolves according to the laws of physics, without taking our knowledge of it into account in any way.
The sun heats up the earth, whether we know it or not, and whether there are humans on Earth or not.

Well, right ... except that the world doesn't even know physics!

Michel
 
  • #37


Andy Resnick said:
Believe who you want- science cares not a whit about credentials.

Andrew Mason said:
Science is not about "belief".

I'm kind of offended that you guys replied with this banal pedantry, especially when I was asking for rationalization (see RedX's and atyy's replies). Do I have to lecture you about the word belief and how it doesn't imply that science is a religion when scientists talk about beliefs? I don't think so.

atyy said:
(i) Does classical statistical mechanics require QM for its justification - no. We can stick to canonical variables in phase space. The conservation of phase space volume is not at odds with the apparent increase in volume that we call an increase in entropy - see eg. http://www.necsi.edu/projects/baranger/cce.pdf

(ii) Does classical statistical mechanics of identical particles require QM for its justification -yes. Because classically, identical particles have distinct trajectories, and so cannot be really identical.

Your professor was probably referring to the second idea, not the first.

The discussion with my professor was about the cause of thermodynamics (a physical description of why it must be so). The conflict you're talking about was my own misunderstanding of a lecture (I rewatched it. It's right where a student asks a question that I can't hear and I took Susskind's answer in the context of the lecture). This conflict was resolved by both your and Redx's replies.

I'm still curious if there is such a mechanistic description in classical thermodynamics? Would the statistical definition really satisfy that?
 
  • #38


moonman239 said:
Title says it all.

The most general second law says that the local entropy production at every point in space is nonnegative, and vanishes only in equilibrium.

The usual formulations of the second law are special cases of this.
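In symbols (a sketch with assumed standard notation, not from the post): writing $s$ for the entropy density, $\mathbf{J}_s$ for the entropy flux, and $\sigma$ for the local entropy production rate, the balance law reads

```latex
\partial_t s + \nabla \cdot \mathbf{J}_s = \sigma, \qquad \sigma \ge 0 ,
```

with $\sigma = 0$ exactly at local equilibrium; the textbook formulations can then be recovered as integrated special cases.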
 
  • #39


Pythagorean said:
I'm kind of offended that you guys replied with this banal pedantry, especially when I was asking for rationalization (see RedX's and atyy's replies). Do I have to lecture you about the word belief and how it doesn't imply that science is a religion when scientists talk about beliefs? I don't think so.



The discussion with my professor was about the cause of thermodynamics (a physical description of why it must be so). The conflict you're talking about was my own misunderstanding of a lecture (I rewatched it. It's right where a student asks a question that I can't hear and I took Susskind's answer in the context of the lecture). This conflict was resolved by both your and Redx's replies.

I'm still curious if there is such a mechanistic description in classical thermodynamics? Would the statistical definition really satisfy that?

I apologize- I was having a bad day yesterday and tossed off a snippy comment. I know you make an honest effort to understand things.

But getting back to the point, the concept of 'entropy' admits many interpretations (thermodynamic, statistical, information, etc), and while these interpretations are (as they must be) equivalent, one interpretation may be more 'useful' to describe a situation than another.

Claims that QM (or SM) is *required* to *understand* or *explain* thermodynamics are not founded on good science. To ask why physical reality *must* be the way it is leads to the anthropic principle, which I am personally uncomfortable with.
 
  • #40


Pythagorean said:
I'm still curious if there is such a mechanistic description in classical thermodynamics? Would the statistical definition really satisfy that?

The second law follows from classical statistical mechanics in the same way as from quantum statistical mechanics. The treatment by Gibbs 1902 was classical but survived the quantum revolution without any qualitative changes, and without quantitative changes bigger than O(hbar).

What does not follow from classical statistical mechanics is that mixing identical substances does not increase the entropy. But this follows by extending the state space to one with a variable number of particles (for the grand canonical ensemble) and weighting the Liouville measure for the N-particle space by 1/N!, corresponding to Boltzmann counting. The factor can be interpreted as accounting for indistinguishability of the particles. Nothing quantum is needed for that.

However, quantum field theory explains indistinguishability in a very natural way.
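A toy calculation (my sketch, not from the post; the temperature-dependent constants are dropped because they cancel in the comparison) makes the effect of the 1/N! factor concrete:

```python
import math

# Entropies in units of k_B, dropping T-dependent constants (they cancel
# in the mixing comparison below).

def entropy_no_factorial(N, V):
    # Classical counting WITHOUT the 1/N! factor: S/k ~ N ln V + const.
    return N * math.log(V)

def entropy_boltzmann(N, V):
    # With the 1/N! Boltzmann factor: ln(1/N!) ~ -N ln N + N (Stirling),
    # so S/k ~ N ln(V/N) + N + const.
    return N * math.log(V / N) + N

# Two samples of the SAME gas: N particles in volume V on each side of a
# partition; remove the partition, so 2N particles share volume 2V.
N, V = 1.0e4, 1.0

dS_no_fact = entropy_no_factorial(2 * N, 2 * V) - 2 * entropy_no_factorial(N, V)
dS_boltz = entropy_boltzmann(2 * N, 2 * V) - 2 * entropy_boltzmann(N, V)

print("Mixing entropy without 1/N!:", dS_no_fact)  # spurious 2N ln 2
print("Mixing entropy with 1/N!:", dS_boltz)       # ~0, as it must be
```

Without the factor, "mixing" two identical gases spuriously produces 2N ln 2 (the Gibbs paradox); with Boltzmann counting it vanishes, and nothing quantum was used.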
 
  • #41


Just an FYI: I learned thermodynamics from Reif, and we started with the binomial distribution. Most of that class was mathematics with absolutely zero physical motivation (only statistical motivation). I didn't care about or absorb any of it; I just pushed through for good marks.

So I was quite refreshed by Susskind's treatment.

AM, I will check out Feynman's lecture, I hope it won't be like Reif's treatment.

Andy Resnick said:
I apologize- I was having a bad day yesterday and tossed off a snippy comment. I know you make an honest effort to understand things.

No problem, I'm glad to see you return to the discussion.

But getting back to the point, the concept of 'entropy' admits many interpretations (thermodynamic, statistical, information, etc), and while these interpretations are (as they must be) equivalent, one interpretation may be more 'useful' to describe a situation than another.

Claims that QM (or SM) is *required* to *understand* or *explain* thermodynamics are not founded on good science. To ask why physical reality *must* be the way it is leads to the anthropic principle, which I am personally uncomfortable with.

I was trying to be careful to make the distinction between anthropomorphism and physical mechanisms. Returning to the spring discussion, a block of concrete can have a spring constant, but it's nothing you'd realize unless you understood compressibility and solid state physics. The micro explanation is more general and covers more cases of "springiness" than the macro treatment of Hooke's law. But it also explains why we have springiness in all macro materials.

As another example, conservation of charge says KCL must be so.

I was under the impression that HUP said entropy must be so (though I realize now that that's unfounded).

Neumaier said:
The second law follows from classical statistical mechanics in the same way as from quantum statistical mechanics. The treatment by Gibbs 1902 was classical but survived the quantum revolution without any qualitative changes, and without quantitative changes bigger than O(hbar).

What does not follow from classical statistical mechanics is that mixing identical substances does not increase the entropy. But this follows by extending the state space to one with a variable number of particles (for the grand canonical ensemble) and weighting the Liouville measure for the N-particle space by 1/N!, corresponding to Boltzmann counting. The factor can be interpreted as accounting for indistinguishability of the particles. Nothing quantum is needed for that.

However, quantum field theory explains indistinguishability in a very natural way.

Thank you for your reply. I'm beginning to recognize the importance of learning the history of science along with the science.
 
  • #42


I guess strictly speaking, neither classical statistical mechanics nor its quantum counterpart explains the second law. An explanation should start from the exact equations of motion. I don't know what his reference is, but the author of these MIT lecture notes says

"Reconciling the reversibility of laws of physics governing the microscopic domain with the observed irreversibility of macroscopic phenomena is a fundamental problem. Of course, not all microscopic laws of physics are reversible: weak nuclear interactions violate time reversal symmetry, and the collapse of the quantum wave-function in the act of observation is irreversible. The former interactions in fact do not play any significant role in everyday observations that lead to the second law. The irreversible collapse of the wave-function may itself be an artifact of treating macroscopic observers and microscopic observables distinctly. There are proponents of the view that the reversibility of the currently accepted microscopic equations of motion (classical or quantum) is indicative of their inadequacy. However, the advent of powerful computers has made it possible to simulate the evolution of collections of large numbers of particles, governed by classical, reversible equations of motion. Although simulations are currently limited to relatively small numbers of particles (10^6), they do exhibit the irreversible macroscopic behaviors similar to those observed in nature (typically involving 10^23 particles). For example, particles initially occupying one half of a box proceed to irreversibly,and uniformly,occupy the whole box. (This has nothing to do with limitations of computational accuracy; the same macroscopic irreversibility is observed in exactly reversible integer based simulations, such as with cellular automata.) Thus the origin of the observed irreversibilities should be sought in the classical evolution of large collections of particles." http://ocw.mit.edu/courses/physics/8-333-statistical-mechanics-i-statistical-mechanics-of-particles-fall-2007/lecture-notes/lec9.pdf
 
  • #44


Pythagorean said:
Returning to the spring discussion, a block of concrete can have a spring constant, but it's nothing you'd realize unless you understood compressibility and solid state physics. The micro explanation is more general and covers more cases of "springiness" than the macro treatment of Hooke's law. But it also explains why we have springiness in all macro materials.

That implies we can't understand a phenomenon until there exists a detailed microscopic model from which the phenomenon emerges.

I disagree with that, because when I say I understand something, it means I have a predictive model and the predictions match with experiment. The model can use any set of constructs I choose- so long as it makes testable predictions that match with reality.

I understand Hooke's law in terms of how far something stretches under an applied force- that's plenty sufficient for most applications. Certainly, it's sufficient for Physics I. Alternatively, I understand Hooke's law as a linear 1-D stress-strain relationship. I understand strain because I can measure the amount of deformation in an object with a ruler. I understand stress because Newton's law F = dp/dt and Cauchy's law ∇·T = Dp/Dt let me understand stress in the context of force (or energy). Now, I can see Hooke's law as a special case of a more general phenomenon (nonlinear 3-D stress-strain relationships). Notice, I didn't need to know about atoms (or any microscopic details) in order to understand Hooke's law.
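For instance (my sketch, with assumed standard notation: $E$ is Young's modulus, $A$ the cross-section, $L_0$ the rest length of a uniform bar), the spring constant follows from the 1-D stress-strain form with no microscopic input:

```latex
\sigma = E\,\varepsilon
\quad\Longleftrightarrow\quad
\frac{F}{A} = E\,\frac{\Delta L}{L_0}
\quad\Longrightarrow\quad
F = \underbrace{\frac{EA}{L_0}}_{k}\,\Delta L .
```

Everything on the right-hand side is measurable in the lab without mentioning atoms.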

To be fair, I can't predict the specific constitutive relationship for a given object with this model; but since constitutive relationships can be measured in the lab, I think that's a weak criticism.

If I wanted to 'play dumb', I could continue to ask "but how do I know what *that* means?" over and over again until I run out of explanations- some people may choose to interpret that as the limit of my knowledge. I, OTOH, interpret that as denying the possibility of knowledge.

Again, requiring that entropy be explained in terms of a microscopic theory in order to understand entropy is overstating the case. Sure, a microscopic theory is useful- but since such a theory does not exist, we should not then say we know nothing about entropy.

Is 'KCL' Kirchhoff's circuit law? Yes, Kirchhoff's law I_in = I_out follows from conservation of charge. But what is charge? (See? We can keep playing that game. Do you really have to understand QED in order to master Physics I?)
 
  • #45


I certainly don't disagree with you, for all practical purposes; but it's allowed, in scientific discussion to look at the frontier. I don't model neurons as quantum particles... I mean, seriously! I don't even have to model each potassium ion. I model them as an isopotential electrochemical circuit (more or less) like all the other ion models do. But it doesn't mean I'm going to act like I have a bucket over my head and only see circuits when I talk about cell behavior in open discussion.

You can't deny that understanding the micro-details of a spring has helped us to understand the macro-view of it. That it goes the other way is already a given (which is why more people are laymen than scientists: macro is closer to everyday experience, and layman explanations are even closer, so it's natural that we're more comfortable with macro explanations... maybe... once you start talking about cosmology, perhaps it's easier to intuit as a micro-member of the universe).

But to the point with thermodynamics in general, I was interested in a physical motivation for something I've only seen motivated from mathematics when I took undergrad thermodynamics (Reif). I was totally turned off by it at the time, and am just now getting back into thermodynamics.

Susskind provided the physical motivation pedagogically, atyy continues to provide research motivation for the question in his links. I'm not insistent on invoking QM, the question still remains without it.

addendum:

I didn't actually mean "micro was better than macro", I just meant that you don't have to box yourself into one or the other in a discussion setting.
 
  • #46


Pythagorean said:
I'm kind of offended that you guys replied with this banal pedantry. Especially when I was asking for rationalization (see Redx's ant atyy's replies). Do I have to lecture you about the word belief and how it doesn't imply that science is a religion when scientists talk about beliefs? I don't think so.
Sorry. I certainly didn't mean to offend, but I can see how you might take it that way. I didn't mean to be banal or pedantic. Sometimes we need to rein in our rhetoric. Again sorry for that.

AM
 
  • #47


Andrew Mason said:
Sorry. I certainly didn't mean to offend, but I can see how you might take it that way. I didn't mean to be banal or pedantic. Sometimes we need to rein in our rhetoric. Again sorry for that.

AM

In hindsight, my post could have been less provoking. I was seriously not intending to compare credentials, I was just frustrated with my intuition conflict.
 
  • #48


atyy said:
(i) Does classical statistical mechanics require QM for its justification - no. We can stick to canonical variables in phase space. The conservation of phase space volume is not at odds with the apparent increase in volume that we call an increase in entropy - see eg. http://www.necsi.edu/projects/baranger/cce.pdf

I only had time to read pages 15-17.

I vaguely recall objects such as the Peano curve that can fill the entire plane. So is it possible for phase-space tendrils to be so close together that every area of hbar contains at least one phase-space point in it? I think that's probably what Susskind is thinking (although I admit I didn't watch Susskind's lectures (lack of time)), using quantum mechanics to explain how an apparent increase can be taken as a real increase, since there is no such thing as filling half a pixel of hbar - the pixel is either filled or unfilled. But certainly how far apart the phase-space tendrils become is independent of quantum mechanics, and hence whether the system has hbar resolution is to be decided not by quantum mechanics but by the equations of classical mechanics. In that sense entropy is independent of quantum mechanics.

Still, how do you normalize phase space without hbar?
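The "tendrils at finite resolution" picture can be made concrete with a toy area-preserving chaotic map (my sketch; the Arnold cat map stands in for Hamiltonian stretching and folding, and the grid cells play the role of hbar-sized pixels):

```python
import math
import random

random.seed(1)

NGRID = 16      # "pixel" resolution: cells of size 1/NGRID play hbar's role
NPTS = 20000

# An ensemble initially confined to a small 0.1 x 0.1 patch of the unit torus.
pts = [(random.uniform(0.0, 0.1), random.uniform(0.0, 0.1)) for _ in range(NPTS)]

def cat_map(p):
    # Arnold cat map: area-preserving (so the Liouville volume is conserved)
    # but chaotic, so the patch stretches into ever-thinner tendrils.
    x, y = p
    return ((2.0 * x + y) % 1.0, (x + y) % 1.0)

def coarse_entropy(points):
    """-sum p ln p over occupied grid cells at the fixed resolution."""
    counts = {}
    for x, y in points:
        cell = (int(x * NGRID), int(y * NGRID))
        counts[cell] = counts.get(cell, 0) + 1
    n = len(points)
    return -sum(c / n * math.log(c / n) for c in counts.values())

S = [coarse_entropy(pts)]
for _ in range(10):
    pts = [cat_map(p) for p in pts]
    S.append(coarse_entropy(pts))

print([round(s, 2) for s in S])
# The fine-grained volume never changes, yet the coarse-grained entropy
# climbs until it saturates near ln(NGRID**2).
```

How fast the tendrils thin out is pure classical dynamics; the pixel size only decides when the apparent volume saturates.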
 
  • #49


RedX said:
Still, how do you normalize phase space without hbar?

The usual trick is to say, eg. in the microcanonical ensemble, instead of just considering all states with exactly energy E (surface in phase space), we consider all states with energy E plus some slack (volume in phase space). Then you can use the volume as the normalization factor. One has to stick to canonical variables, since it's only under those that phase space volume is invariant.

I'm reading from http://ocw.mit.edu/courses/physics/8-333-statistical-mechanics-i-statistical-mechanics-of-particles-fall-2007/lecture-notes/lec12.pdf , and also the ideal gas example in http://ocw.mit.edu/courses/physics/8-333-statistical-mechanics-i-statistical-mechanics-of-particles-fall-2007/lecture-notes/lec13.pdf
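In symbols (my sketch of the trick described in the linked notes; $h_0$ is an arbitrary reference action, not Planck's constant):

```latex
\Omega(E) \;=\; \frac{1}{N!\,h_0^{3N}}
\int_{E \,<\, H(q,p) \,<\, E+\Delta} d^{3N}q\, d^{3N}p ,
\qquad
S = k_B \ln \Omega(E) .
```

Changing $h_0$ or the slack $\Delta$ only shifts $S$ by additive constants (at most $O(\ln N)$), so classical entropy differences are independent of both choices.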
 
  • #50


RedX said:
I think that's probably what Susskind is thinking (although I admit I didn't watch Susskind's lectures (lack of time)), using quantum mechanics to explain how an apparent increase can be taken as a real increase, since there is no such thing as filling half a pixel of hbar -

for context:

Susskind was reluctant to talk about QM; the students kept pushing him with questions until he admitted that there was an h-bar limit, but I couldn't hear the questions, so the students themselves could have already invoked QM and that's what he conceded to.

My professor's words were that the classical description of entropy was hand-wavy (physically, because it's described statistically) and that you really needed QM (he didn't rationalize this). My natural assumption was that with QM, statistics IS physical motivation, because QM is fundamentally statistical.

But I'm still kind of curious why we would have to coarse-grain the volume in the first place? Why should there be a constant associated with the phase volume? Shouldn't we expect continuity in the classical treatment?
 
  • #51


thank you atyy, that answers my question too
 
  • #52


atyy said:
The usual trick is to say, eg. in the microcanonical ensemble, instead of just considering all states with exactly energy E (surface in phase space), we consider all states with energy E plus some slack (volume in phase space). Then you can use the volume as the normalization factor.

That makes sense. Can't believe I never thought of that before.

Susskind was reluctant to talk about QM; the students kept pushing him with questions until he admitted that there was an h-bar limit, but I couldn't hear the questions, so the students themselves could have already invoked QM and that's what he conceded to.

Susskind is brilliant. But you can tell that he gets frustrated with the students and will sometimes just nod so that he can move on. I think his ideas and overall picture are what's best about those lectures, rather than getting every word or detail right. Not saying that's what happened (since I didn't watch it), but in general I trust books more than lectures for fine detail because of the possibility of real-time mistakes.

The textbook "Feynman Lectures on Physics" are even more brilliant, but since it's a book and not a lecture, the details are right too. I think it's a brilliant piece of work how he derives conservation of gravitational potential energy, principle of virtual work, and mechanical advantage from the observation that there is no perpetual machine, and then the cherry on top is the 345 triangle with a chain of beads surrounding it.
 
  • #53


I started to read "Surely you must be joking..." when I was first studying physics and I was totally turned off by his arrogant tone and haven't picked up any Feynman since. I guess it wouldn't hurt to go back now and read something that has some actual content instead.
 
  • #54


Pythagorean said:
But I'm still kind of curious why we would have to coarse-grain the volume in the first place? Why should there be a constant associated with the phase volume? Shouldn't we expect continuity in the classical treatment?
In the grand canonical ensemble, one must weight the contributions of the N-particle integrals appropriately by factors whose unit is action^{-N}, in order to get something dimensionless.
This naturally introduces the Planck constant even in the classical case.
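Written out (my sketch of the standard classical expression in three dimensions), the grand canonical partition function is

```latex
Z(\mu, V, T) \;=\; \sum_{N=0}^{\infty} \frac{e^{\beta\mu N}}{N!\,h^{3N}}
\int d^{3N}q\, d^{3N}p \;\, e^{-\beta H_N(q,p)} ,
```

where each factor $h^{3N}$ (one unit of action per degree of freedom) makes the $N$-particle term dimensionless. Some constant with the dimension of action is therefore forced even classically; quantum mechanics merely fixes its value to Planck's $h$.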
 
  • #55


There are numerous numerical experiments demonstrating increasing entropy in classical dynamical systems.
This alone shows that QM is not needed to justify the second law.

Nevertheless, the Loschmidt paradox remains in these experiments.
Reversing all the velocities midway on the path to equilibrium leads to a temporary decrease of entropy, followed by a new entropy-increasing history.

The Loschmidt paradox remains exactly as sharp in quantum mechanics as long as no measurement is performed on the system.
Therefore, QM could be investigated as a basis for the second principle only from the side of the famous measurement process.

Yet the reverse point of view, that QM measurement irreversibility is a consequence of the second principle, is far more sensible ... since the second principle can live without QM. In addition, this hypothesis would relieve quantum mechanics of a useless postulate and make it a physical consequence of a complex evolution.

In this last point of view, one crucial question remains then.
What is the meaning of the wave-function?
Is the wave function a consequence of the second law?

I can be arrogant sometimes, even when parroting the giants.
 
  • #56


Historically, quantum mechanics emerged from the need to reconcile statistical thermodynamics with electrodynamics and observations.
Therefore it makes sense to argue that QM is absolutely needed for the consistency of:

1) either statistical thermodynamics
2) or electrodynamics
3) or both

We know that QM is certainly necessary for the consistency of electrodynamics.
I don't know any reason why QM is needed in statistical thermodynamics, and in particular to justify the second principle.
The normalisation of the phase space is no more than a convenience.
 
  • #57


lalbatros said:
Nevertheless, the Loschmidt paradox remains in these experiments. Reversing all the velocities midway on the path to equilibrium leads to a temporary decrease of entropy, followed by a new entropy-increasing history.
For a macroscopic body, it is impossible to reverse all these velocities. Thus the paradox has no experimental relevance.
 
  • #58


A. Neumaier said:
For a macroscopic body, it is impossible to reverse all these velocities. Thus the paradox has no experimental relevance.

I agree that preparing an entropy-decreasing system is technically difficult.
Does that mean that the second principle should be rephrased as:

"Preparing an entropy-decreasing system is too difficult to be part of physics?"

In addition, microfluctuations by themselves are deviations from the second principle.
Would that imply that there is a no man's land in physics, somewhere between microscopic and macroscopic, where there is no law and no understanding?

Furthermore, I am not sure at all that entropy-decreasing experiments have not already been performed, not only on computer simulations, but also in the lab. And even if it is not the case, why could you absolutely exclude such experiments?

Where do you put the boundary between microscopic and macroscopic?
And is there a new physics that pops up just when crossing this boundary?

Is the second principle a physical law at all?
 
  • #59


lalbatros said:
"Preparing an entropy-decreasing system is too difficult to be part of physics?"
The entropy can decrease easily in open nonequilibrium systems. The precise formulation of the second law is that the local mean entropy production is always nonnegative.

Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known.
lalbatros said:
In addition, microfluctuations by themselves are deviations from the second principle. Would that imply that there is a no man's land in physics, somewhere between microscopic and macroscopic, where there is no law and no understanding?
No. It just means that the uncertainty in checking things gets bigger and bigger as your systems get smaller and smaller. So the second law means less and less.
lalbatros said:
Where do you put the boundary between microscopic and macroscopic?
It is like the split between observer and observed: one can put it wherever one likes, without changing the physics.
 
  • #60


A. Neumaier said:
The entropy can decrease easily in open nonequilibrium systems.

My teaser is that, based on the fundamental laws of physics, entropy can decrease even in a closed system.
That's indeed the Loschmidt paradox.
Is the second law not then inconsistent with the other laws of physics?

I like to tell that in another way.
The second law is a theory about what we are used to observing.
The fundamental laws are theories about everything that could be observed.
(Theories, not absolute truths, of course.)

The interesting point is: how can these be reconciled?
This is of course an old question that has already received many answers.
What has been done since Boltzmann and Poincaré?

This thread has already referred to many ideas.
One of them, which I do not believe, is a role for QM.
I believe just the opposite: the measurement postulate is a consequence of the second principle.
 
