What is the real second law of thermodynamics?

AI Thread Summary
The second law of thermodynamics states that the entropy of a closed system tends to increase over time, reflecting a tendency toward disorder. While classical interpretations suggest that entropy cannot decrease, discussions highlight the Poincaré Recurrence theorem, which implies that systems may return to a previous state given sufficient time, albeit over impractically long durations. The conversation also emphasizes the statistical nature of entropy, where fluctuations can lead to temporary decreases in entropy, though these events are exceedingly rare. Additionally, the role of quantum mechanics introduces complexities in understanding entropy, as it involves inherent uncertainties that contribute to information loss. Overall, the second law serves as a fundamental guideline for physical processes, with entropy being a key measure of system behavior.
  • #51


thank you atyy, that answers my question too
 
  • #52


atyy said:
The usual trick is to say, eg. in the microcanonical ensemble, instead of just considering all states with exactly energy E (surface in phase space), we consider all states with energy E plus some slack (volume in phase space). Then you can use the volume as the normalization factor.

That makes sense. Can't believe I never thought of that before.
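In symbols, the shell construction atyy describes is the standard microcanonical definition (standard textbook form, my paraphrase, not a quote from the lecture):

\Omega(E) = \frac{1}{h^{3N} N!} \int_{E \le H(q,p) \le E+\Delta} d^{3N}q \, d^{3N}p, \qquad S(E) = k_B \ln \Omega(E)

The slack \Delta turns the energy surface into a thin shell of nonzero phase-space volume, and for large N the entropy is insensitive to the choice of \Delta.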

Susskind was reluctant to talk about QM; the students kept pushing him with questions until he admitted that there was an h-bar limit, but I couldn't hear the questions, so the students themselves could have already invoked QM and that's what he conceded to.

Susskind is brilliant. But you can tell that he gets frustrated with the students and will sometimes just nod so that he can move on. I think his ideas and overall picture are what's best about those lectures, rather than getting every word or detail right. Not saying that's what happened (since I didn't watch it), but in general I trust books more than lectures for fine detail because of the possibility of real-time mistakes.

The "Feynman Lectures on Physics" is even more brilliant, but since it's a book and not a lecture, the details are right too. I think it's a brilliant piece of work how he derives conservation of gravitational potential energy, the principle of virtual work, and mechanical advantage from the observation that there is no perpetual motion machine, and then the cherry on top is the 3-4-5 triangle with a chain of beads surrounding it.
 
  • #53


I started to read "Surely You're Joking, Mr. Feynman!" when I was first studying physics and I was totally turned off by his arrogant tone, and I haven't picked up any Feynman since. I guess it wouldn't hurt to go back now and read something that has some actual content instead.
 
  • #54


Pythagorean said:
But I'm still kind of curious why we would have to coarse-grain the volume in the first place? Why should there be a constant associated with the phase volume? Shouldn't we expect continuity in a classical treatment?
In the grand canonical ensemble, one must weight the contributions of the N-particle integrals appropriately by factors whose unit is action^{-N}, in order to get something dimensionless.
This naturally introduces the Planck constant even in the classical case.
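Concretely (a standard textbook form, added here for illustration), the grand partition function reads

\Xi(T,V,\mu) = \sum_{N=0}^{\infty} \frac{e^{\beta \mu N}}{N! \, h^{3N}} \int d^{3N}q \, d^{3N}p \; e^{-\beta H_N(q,p)}

and the factor h^{3N}, an action per degree of freedom, is exactly what renders each N-particle phase-space integral dimensionless.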
 
  • #55


There are numerous numerical experiments that prove the increasing entropy in classical dynamical systems.
This alone proves that QM is not needed to justify the second law.

Nevertheless, the Loschmidt paradox remains in these experiments.
Reversing all the velocities midway on the path to equilibrium leads to a temporary decrease of entropy, followed by a new entropy-increasing history.
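A minimal numerical sketch of this reversal (my own toy example, not one of the numerical experiments referred to above; the box size, particle number, and binning are arbitrary choices):

```python
# N non-interacting particles in a 1D box with specular walls, started near
# the left wall. The coarse-grained Shannon entropy of the positions rises as
# the gas spreads out; flipping every velocity midway makes the trajectory
# retrace itself and the entropy fall back: the Loschmidt paradox.
import numpy as np

rng = np.random.default_rng(0)
N, L, nbins, dt, steps = 10_000, 1.0, 20, 1e-3, 2_000
x = rng.uniform(0.0, 0.1 * L, N)      # initial low-entropy configuration
v = rng.normal(0.0, 1.0, N)           # Maxwellian-like velocity spread

def coarse_entropy(x):
    counts, _ = np.histogram(x, bins=nbins, range=(0.0, L))
    p = counts[counts > 0] / N
    return -np.sum(p * np.log(p))     # coarse-grained Shannon entropy

def step(x, v):
    x = x + v * dt
    lo, hi = x < 0.0, x > L
    x[lo], x[hi] = -x[lo], 2.0 * L - x[hi]   # specular reflection off walls
    v[lo | hi] *= -1.0                        # reverse bounced particles
    return x, v

S = []
for t in range(2 * steps):
    if t == steps:
        v = -v                                # Loschmidt velocity reversal
    x, v = step(x, v)
    S.append(coarse_entropy(x))

print(f"S(0)={S[0]:.2f}  S(reversal)={S[steps-1]:.2f}  S(end)={S[-1]:.2f}")
# Typical behaviour: entropy climbs toward ln(nbins) ~ 3.0, then returns to
# roughly its initial value after the reversal.
```

Because this dynamics is exactly time-reversible, the coarse-grained entropy retraces its rise after the flip; nothing quantum is involved.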

The Loschmidt paradox remains exactly as sharp in quantum mechanics as long as no measurement is performed on the system.
Therefore, QM could be investigated as a basis for the second principle only from the side of the famous measurement process.

Yet, the reverse point of view, that the QM measurement irreversibility is a consequence of the second principle, is far more sensible ... since the second principle can live without QM. In addition, this hypothesis would relieve quantum mechanics of a useless postulate and make it a physical consequence of a complex evolution.

In this last point of view, one crucial question remains then.
What is the meaning of the wave-function?
Is the wave function a consequence of the second law?

I can be arrogant sometimes, even when parroting the giants.
 
Last edited:
  • #56


Historically, quantum mechanics emerged from the need to reconcile statistical thermodynamics with electrodynamics and observations.
Therefore it makes sense to argue that QM is absolutely needed for the consistency of:

1) either statistical thermodynamics
2) or electrodynamics
3) or both

We know that QM is certainly necessary for the consistency of electrodynamics.
I don't know any reason why QM is needed in statistical thermodynamics, and in particular to justify the second principle.
The normalisation of the phase space is no more than a convenience.
 
  • #57


lalbatros said:
Nevertheless, the Loschmidt paradox remains in these experiments. Reversing all the velocities midway on the path to equilibrium leads to a temporary decrease of entropy, followed by a new entropy-increasing history.
For a macroscopic body, it is impossible to reverse all these velocities. Thus the paradox has no experimental relevance.
 
  • #58


A. Neumaier said:
For a macroscopic body, it is impossible to reverse all these velocities. Thus the paradox has no experimental relevance.

I agree that preparing an entropy-decreasing system is difficult, technically.
Does that mean that the second principle should be rephrased as:

"Preparing an entropy-decreasing system is too difficult to be part of physics?"

In addition, microfluctuations by themselves are deviations from the second principle.
Would that imply that there is a no man's land in physics, somewhere between microscopic and macroscopic, where there is no law and no understanding?

Furthermore, I am not sure at all that entropy-decreasing experiments have not already been performed, not only on computer simulations, but also in the lab. And even if it is not the case, why could you absolutely exclude such experiments?

Where do you put the boundary between microscopic and macroscopic?
And is there a new physics that pops up just when crossing this boundary?

Is the second principle a physical law at all?
 
  • #59


lalbatros said:
"Preparing an entropy-decreasing system is too difficult to be part of physics?"
The entropy can decrease easily in open nonequilibrium systems. The precise formulation of the second law is that the local mean entropy production is always nonnegative.

Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known.
lalbatros said:
In addition, microfluctuations by themselves are deviations from the second principle. Would that imply that there is a no man's land in physics, somewhere between microscopic and macroscopic, where there is no law and no understanding?
No. It just means that the uncertainty in checking things gets bigger and bigger as your systems get smaller and smaller. So the second law means less and less.
lalbatros said:
Where do you put the boundary between microscopic and macroscopic?
It is like the split between observer and observed: one can put it wherever one likes without changing the physics.
 
  • #60


A. Neumaier said:
The entropy can decrease easily in open nonequilibrium systems.

My teaser is that, based on the fundamental laws of physics, entropy can decrease even in a closed system.
That's indeed the Loschmidt paradox.
Is the second law not inconsistent with the other laws of physics?

I like to put that another way.
The second law is a theory about what we are used to observing.
The fundamental laws are theories about everything that could be observed.
(Theories, not absolute truths, of course.)

The interesting point is: how can these be reconciled?
This is of course an old question that has already received many answers.
What has been done since Boltzmann and Poincaré?

This thread has already referred to many ideas.
One of them, which I do not believe, is a role for QM.
I believe just the opposite: the measurement postulate is a consequence of the second principle.
 
  • #61


lalbatros said:
Is the second principle a physical law at all?
It is a law because it successfully and consistently predicts how things will actually behave. It is never violated. It is not that it is physically impossible for the second law to be violated. It is just that it is statistically impossible for it to be violated.

Even if you are dealing with a relatively small number of molecules, the second law will not be violated. Suppose you had a billionth of a trillionth of a mole (10^-21 mol, or about 600 molecules) of a gas at temperature T in a container. Since the motions are random, could the gas spontaneously separate out into fast molecules on one half of the container and slow ones on the other for a period long enough to detect it?

The probability of even that occurring is so infinitesimally small that you would have to wait longer than the age of the universe before it would happen anywhere in the universe. So the answer is: "no".
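A quick back-of-envelope check of that claim (my own estimate, assuming each molecule must independently sit in its designated half, each with probability 1/2):

```python
from math import log10

n = 600                                  # molecules, as in the example above
log10_p = n * log10(0.5)                 # log10 of (1/2)^600
print(f"(1/2)^{n} ~ 10^{log10_p:.0f}")   # ~ 10^-181
# Even checking a snapshot every picosecond over the ~4e17 s age of the
# universe gives only ~4e29 chances, leaving odds of order 10^-151.
```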

AM
 
  • #62


A. Neumaier said:
Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known.

That's not true- my parents sure did, and I suspect yours did as well.
 
  • #63


Pythagorean said:
But to the point with thermodynamics in general, I was interested in a physical motivation for something I've only seen motivated from mathematics when I took undergrad thermodynamics (Reif). I was totally turned off by it at the time, and am just now getting back into thermodynamics.

Susskind provided the physical motivation pedagogically, atyy continues to provide research motivation for the question in his links. I'm not insistent on invoking QM, the question still remains without it.
<snip>

The original physical motivation for 'entropy' was simply developing more efficient engines, and the realization that the only determining factor is the temperature difference between the inlet and outlet.

In fact, temperature has been conspicuously missing from this thread.
 
  • #64


Andy Resnick said:
That's not true- my parents sure did, and I suspect yours did as well.
Biological processes are not exempt from the second law in the form of nonnegative local mean entropy production.
 
  • #65


Andy Resnick said:
In fact, temperature has been conspicuously missing from this thread.
Once you keep temperature constant, entropy becomes essentially irrelevant.
The governing extremal principle is then one of minimizing a free entropy rather than maximizing entropy.
 
  • #66


atyy said:
(i) Does classical statistical mechanics require QM for its justification - no. We can stick to canonical variables in phase space. The conservation of phase space volume is not at odds with the apparent increase in volume that we call an increase in entropy - see eg. http://www.necsi.edu/projects/baranger/cce.pdf

There's a lot in that document that really bugs me- for example:

"But there is another name for statistical mechanics: Thermodynamics." (p. 11). That is not true- they are quite different. AFAIK, SM is currently able to only generate results for time-independent (equilibrium) systems. The partition function, in particular, does not depend on time. Thermo*dynamics* is not thermo*statics*. Thermodynamics does allow for time-dependent non-equilibrium systems, but many of the concepts from thermostatics (temperature, for example) become more difficult to define.

http://pre.aps.org/abstract/PRE/v49/i2/p1040_1

Also, mechanics uses even-dimensional symplectic spaces, while thermodynamics uses odd-dimensional contact spaces.

"But entropy is simply a fancy word for “disorder”." (p.12). Also not true.

There is also an insistence that there is a mechanical theory of heat, which AFAIK, is not true. Dissipative mechanisms cannot be explained in terms of conservative forces.

"The conclusion is that our dimensionless entropy, which measures our lack of knowledge, is a purely subjective quantity." (p. 17). Not true- there are 'objective' measures of information (Kolmogorov). Also, because there is an absolute scale for temperature there is an absolute scale for entropy. That's besides the fact that entropy *changes* are objective.
 
  • #67


A. Neumaier said:
Once you keep temperature constant, entropy becomes essentially irrelevant.
The governing extremal principle is then one of minimizing a free entropy rather than maximizing entropy.

That's not the point: we can't be so cavalier when discussing nonequilibrium systems. Entropy can change if the temperature is constant: chemical reactions, mixing/diffusion, etc.

I'm not sure what you mean by 'free entropy'- the opposite of free energy? Negentropy?
 
  • #68


A. Neumaier said:
Biological processes are not exempt from the second law in the form of nonnegative local mean entropy production.

Eh? That's the whole point of living systems- I keep my entropy low at the expense of raising the entropy somewhere else.
 
  • #69


Andy Resnick said:
Eh? That's the whole point of living systems- I keep my entropy low at the expense of raising the entropy somewhere else.
Yes, but this doesn't involve a violation of the second law.

There are two processes affecting the entropy distribution: 1. local entropy production, and 2. flow of entropy. The former is nonnegative at all places (this is the second law), whereas the latter redistributes the entropy between places in a conserved fashion.
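In local form (standard nonequilibrium notation, added for concreteness), this balance reads

\frac{\partial s}{\partial t} + \nabla \cdot \vec{j}_s = \sigma, \qquad \sigma \ge 0

where s is the entropy density, \vec{j}_s the entropy current doing the redistribution, and \sigma the local entropy production that the second law constrains.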

For example, in fluid flow, local entropy production is associated with the dissipative terms in the Navier-Stokes equations. Neglecting the entropy production yields the Euler equations. The latter are conservative; they still have an entropy current, which - unlike in the Navier-Stokes case - satisfies, together with the entropy density, a continuity equation.

The low entropy of biochemical substances (or, for that matter, inanimate crystals) is a result of an entropy current during their formation that dominates the (still positive) entropy production.

All this can be read in books on nonequilibrium thermodynamics. My favorite book is Reichl's Modern Course in Statistical Mechanics.
 
  • #70


Andy Resnick said:
There's a lot in that document that really bugs me- for example:

"But there is another name for statistical mechanics: Thermodynamics." (p. 11). That is not true- they are quite different. AFAIK, SM is currently able to only generate results for time-independent (equilibrium) systems. The partition function, in particular, does not depend on time. Thermo*dynamics* is not thermo*statics*. Thermodynamics does allow for time-dependent non-equilibrium systems, but many of the concepts from thermostatics (temperature, for example) become more difficult to define.

http://pre.aps.org/abstract/PRE/v49/i2/p1040_1

Also, mechanics uses even-dimensional symplectic spaces, while thermodynamics uses odd-dimensional contact spaces.

"But entropy is simply a fancy word for “disorder”." (p.12). Also not true.

There is also an insistence that there is a mechanical theory of heat, which AFAIK, is not true. Dissipative mechanisms cannot be explained in terms of conservative forces.

"The conclusion is that our dimensionless entropy, which measures our lack of knowledge, is a purely subjective quantity." (p. 17). Not true- there are 'objective' measures of information (Kolmogorov). Also, because there is an absolute scale for temperature there is an absolute scale for entropy. That's besides the fact that entropy *changes* are objective.

I'll just discuss your last point, since the previous points seem mainly about terminology.

The last point is in the context of a Hamiltonian system, where Liouville's theorem guarantees that the volume of phase space remains the same. So for a Hamiltonian system, the entropy increase is subjective.

If the system is not Hamiltonian, then there can be objective increases in entropy.
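The underlying statement (a standard result, added here for reference) is Liouville's theorem for a Hamiltonian flow,

\frac{d\rho}{dt} = \frac{\partial \rho}{\partial t} + \{\rho, H\} = 0

which implies that the fine-grained Gibbs entropy S = -k_B \int \rho \ln\rho \, d\Gamma is exactly constant in time, so any increase must come from coarse-graining, i.e. from the resolution the observer chooses.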
 
  • #71


Andy Resnick said:
SM is currently able to only generate results for time-independent (equilibrium) systems. The partition function, in particular, does not depend on time. Thermo*dynamics* is not thermo*statics*. Thermodynamics does allow for time-dependent non-equilibrium systems
So does statistical mechanics (SM in your post); it is not restricted to computing partition functions and what can be deduced from them.

See the later chapters of Reichl's Modern Course in Statistical Mechanics, where, for example, the Boltzmann equation (and, from that, the Navier-Stokes equations) is derived from statistical mechanics. At least the Navier-Stokes equation counts as nonequilibrium thermodynamics, as it relates thermodynamic quantities varying in space and time.
 
  • #72


Andrew Mason said:
It is a law because it successfully and consistently predicts how things will actually behave. It is never violated. It is not that it is physically impossible for the second law to be violated. It is just that it is statistically impossible for it to be violated. ...

This doesn't prove it is a physical law.
It just tells us how we play dice.

This is my post nr 1111, for no physical reason.
 
  • #73


A. Neumaier said:
Yes, but this doesn't involve a violation of the second law.

Right- because living systems are *open*.

Clearly, the entropy associated with me is much lower than that of a puddle containing all of the atoms in me. Your comment "Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known" is not true in the context of biological systems- biological systems do indeed, locally, decrease entropy by constructing things from simpler elements. From amino acids, we produce functional proteins. From glucose, we produce ATP well in excess of equilibrium concentrations.
 
  • #74


atyy said:
The last point is in the context of a Hamiltonian system, where Liouville's theorem guarantees that the volume of phase space remains the same. So for a Hamiltonian system, the entropy increase is subjective.

If the system is not Hamiltonian, then there can be objective increases in entropy.

Fair enough.
 
  • #75


Andy Resnick said:
Your comment "Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known" is not true in the context of biological systems- biological systems do indeed, locally, decrease entropy by constructing things from simpler elements. From amino acids, we produce functional proteins. From glucose, we produce ATP well in excess of equilibrium concentrations.
And where is the evidence that this decrease of entropy is not due to entropy currents associated with the processes in the cell? I don't think there is any.

Just saying that the entropy is lower doesn't say anything about whether the second law is violated. The second law also governs the freezing of ice - where lots of entropy is transported to the surroundings but the local entropy production is still positive.
 
  • #76


A. Neumaier said:
See the later chapters of Reichl's Modern Course in Statistical Mechanics, where, for example, the Boltzmann equation (and, from that, the Navier-Stokes equations) is derived from statistical mechanics. At least the Navier-Stokes equation counts as nonequilibrium thermodynamics, as it relates thermodynamic quantities varying in space and time.

Thanks, I'll check it out.

I have Feynman's "Statistical Mechanics" text and a few others, and they all start off on page 1 with something like: "The fundamental principle of statistical mechanics is that if a system in equilibrium can be in 1 of N states |n>, each with energy E_n, the expected value of an observable A is <A> = (1/Z) \sum_n <n|A|n> e^{-E_n/kT}."

So right off the bat, the impression is given that only equilibrium systems can be treated, and that time cannot be considered.
 
  • #77


A. Neumaier said:
And where is the evidence that this decrease of entropy is not due to entropy currents associated with the processes in the cell? I don't think there is any.

Just saying that the entropy is lower doesn't say anything about whether the second law is violated. The second law also governs the freezing of ice - where lots of entropy is transported to the surroundings but the local entropy production is still positive.

I'm a little confused- are you saying I claim living systems violate the second law of thermodynamics?
 
  • #78


Andy Resnick said:
That's not the point: we can't be so cavalier when discussing nonequilibrium systems. Entropy can change if the temperature is constant: chemical reactions, mixing/diffusion, etc.

I'm not sure what you mean by 'free entropy'- the opposite of free energy? Negentropy?
Sorry, that was a typo - I meant free energy.

Yes, entropy can change when the temperature is constant - it can both increase and decrease without violating the second law. Whereas the free energy is forced to decrease (or stay constant) by the second law.
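In symbols (a standard argument, added for concreteness): for a system held at constant temperature T and volume by a heat bath, dS_{\rm sys} \ge \delta Q / T and dU = \delta Q combine to give

dF = d(U - TS) \le 0

so it is the free energy F, not the system entropy alone, that behaves monotonically.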
 
  • #79


Andy Resnick said:
I'm a little confused- are you saying I claim living systems violate the second law of thermodynamics?

You seemed to claim it; but perhaps there is a misunderstanding about the meaning of the terms.

The second law (in its most general, nonequilibrium form) says that the local mean entropy production (a well-defined term in the entropy balance equation) is everywhere nonnegative.

Your remark
Andy Resnick said:
Your comment "Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known" is not true in the context of biological systems- biological systems do indeed, locally, decrease entropy by constructing things from simpler elements.
claims the opposite for biological systems. The same seems to be the case in
Andy Resnick said:
A. Neumaier said:
Biological processes are not exempt from the second law in the form of nonnegative local mean entropy production.
Eh? That's the whole point of living systems- I keep my entropy low at the expense of raising the entropy somewhere else.
On the other hand, your comment
Andy Resnick said:
A. Neumaier said:
Yes, but this doesn't involve a violation of the second law.
Right- because living systems are *open*.
pays lip service to the validity of the second law. So the overall picture is quite confusing.

Indeed, your last-quoted statement is correct. Although the local entropy production is positive (and the second law holds), entropy can decrease in an open system precisely because the system is open - so that entropy can leave the boundaries of the system. But the latter is due to an entropy current, which is not restricted by the second law.
 
  • #80


Andy Resnick said:
I have Feynman's "Statistical Mechanics" text and a few others, and they all start off on page 1 with something like: "The fundamental principle of statistical mechanics is that if a system in equilibrium can be in 1 of N states |n>, each with energy E_n, the expected value of an observable A is <A> = (1/Z) \sum_n <n|A|n> e^{-E_n/kT}."

So right off the bat, the impression is given that only equilibrium systems can be treated, and that time cannot be considered.

Well, that's the traditional introduction a la Gibbs 1902, who shaped several generations of physicists. Modern books on statistical mechanics often have a nonequilibrium part (especially when this word is in the title). In my 1980 edition of Reichl, the last third (starting in Chapter 13) is about nonequilibrium.
 
  • #81


A. Neumaier said:
Your remark

claims the opposite for biological system. The same seems to be the case in

On the other hand, your comment

pays lip service to the validity of the second law. So the overall picture is quite confusing.

Then let me clarify. Biological systems (or any system) do not violate the second law of thermodynamics. The second law of thermodynamics is perhaps the most fundamental law of nature we have.

Biological systems are *open*. They exchange both matter and energy with their surroundings. Thus, they can *locally* decrease their entropy (for example, by constructing proteins, by synthesizing ATP, by the chemiosmotic mechanism) at the expense of increasing the entropy of their surroundings- waste products.
 
  • #82


Andy Resnick said:
The original physical motivation for 'entropy' was simply developing more efficient engines, and the realization that the only determining factor is the temperature difference between the inlet and outlet.

In fact, temperature has been conspicuously missing from this thread.

I'm not sure if that counts as a physical motivation. It sounds like engineering motivation to me.

It's quite possible that entropy, like conservation of energy, has no physical motivation (i.e., we can't explain why it is, mechanistically; we can only state that it's what we observe).

Many other phenomena can be explained by conservation of energy itself, for instance; many things must be so because of conservation of energy, but we (I, more accurately) don't know why conservation of energy must be so.

It appears to me that entropy has the same sort of "causal dead end" to it, so far.

addendum:

I see you stated in your last post that the second law is perhaps the most fundamental law of nature we have. I think there's a correlation between feelings like this and "dead end causality" (i.e. most physicists feel the same about conservation laws).
 
  • #83


Pythagorean said:
I'm not sure if that counts as a physical motivation. It sounds like engineering motivation to me.

<snip>

I see you stated in your last post that the second law is perhaps the most fundamental law of nature we have. I think there's a correlation between feelings like this and "dead end causality" (i.e. most physicists feel the same about conservation laws).

To the first sentence, I have no idea what you mean- what exactly do you mean by 'physical' motivation, if not physical objects?

To the last, I think that's overstating the case- I mean that the second law has been verified more times, under more varied circumstances, than any other theory and has so far not once ever been observed to fail.
 
  • #84


Andy Resnick said:
To the first sentence, I have no idea what you mean- what exactly do you mean by 'physical' motivation, if not physical objects?

The concept has only been introduced to me explicitly by my German physics teachers. (Any good lecturer motivates their lectures, but it's generally an implicit process. Graduates from Germany tend to motivate explicitly, using the word "motivated" and sometimes specifying the kind: educational, biological, physical, or applications (engineering/medical). Then they like to follow with the "Ansatz" (which the US doesn't explicitly point out either), etc., through to the conclusion of their argument.)

It's basically a "why do we care"? And your answer was, in short, "for engineering reasons" (speaking of "making" things "more efficient"). It's not bad or wrong to not have a physical motivation.

Andy Resnick said:
To the last, I think that's overstating the case- I mean that the second law has been verified more times, under more varied circumstances, than any other theory and has so far not once ever been observed to fail.

I don't see the contradiction with what I said. The same can be said for conservation of energy, yet we have no physical mechanism that explains conservation of energy. I merely pointed out that there's a correlation between these two things (no mechanistic explanation and a tendency for them to be pervasive, and for scientists to call them "fundamental".)
 
  • #85


A. Neumaier said:
See the later chapters of Reichl's Modern Course in Statistical Mechanics, where, for example, the Boltzmann equation (and, from that, the Navier-Stokes equations) is derived from statistical mechanics. At least the Navier-Stokes equation counts as nonequilibrium thermodynamics, as it relates thermodynamic quantities varying in space and time.

I looked through the table of contents on Amazon; the hydrodynamic coverage appears very similar to Chaikin and Lubensky's "Principles of Condensed Matter".

Out of curiosity, what does Reichl write down as the statistical mechanical result for viscosity?
 
  • #86


Pythagorean said:
The concept has only been introduced to me explicitly by my German physics teachers. (Any good lecturer motivates their lectures, but it's generally an implicit process. Graduates from Germany tend to motivate explicitly, using the word "motivated" and sometimes specifying the kind: educational, biological, physical, or applications (engineering/medical). Then they like to follow with the "Ansatz" (which the US doesn't explicitly point out either), etc., through to the conclusion of their argument.)

It's basically a "why do we care"? And your answer was, in short, "for engineering reasons" (speaking of "making" things "more efficient"). It's not bad or wrong to not have a physical motivation.

I guess I still don't understand what you mean by 'physical motivation'. If it's not a mathematical/conceptual model or a physical object, what is left?

Pythagorean said:
I don't see the contradiction with what I said. The same can be said for conservation of energy, yet we have no physical mechanism that explains conservation of energy. I merely pointed out that there's a correlation between these two things (no mechanistic explanation and a tendency for them to be pervasive, and for scientists to call them "fundamental".)

No, there's a difference: conservation of energy (or any conservation law) - recall that 'conservation' is a *mathematical* statement - follows from elementary balance laws: the amount of something in a control volume 'now' is equal to the amount previously present, plus however much was produced within the volume, plus the net flux entering the volume.
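Written out (the generic balance law, my formulation of the sentence above):

\frac{\partial \rho}{\partial t} + \nabla \cdot \vec{j} = \pi

where \rho is the density of the quantity, \vec{j} its flux, and \pi its production rate. "Conservation" is the special case \pi = 0; the second law, by contrast, is the inequality \pi \ge 0 for entropy.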
 
  • #87


Pythagorean said:
Many other phenomena can be explained by conservation of energy itself, for instance; many things must be so because of conservation of energy, but we (I, more accurately) don't know why conservation of energy must be so.
We know exactly WHY energy is conserved. Conservation of energy follows from the definition of force and the equations for electric and gravitational force.

\vec F = \frac{d\vec{p}}{dt}, \qquad \vec F_{\rm grav} = -\frac{GmM}{r^2}\hat r, \qquad \vec F_{\rm elec} = \frac{qQ}{4\pi\epsilon_0 r^2}\hat r

If the gravitational force and the electric force are the only forces that apply, the quantity KE + PE will be conserved. We know why that is, given those forces. (I am not saying that we know why these forces exist.)

The problem only arises when we introduce mechanical forces. But that is not because mechanical forces are different. They are entirely the result of gravity and/or the electric force. The problem is the complexity of the interaction. It is not a matter of the interaction of two masses or two charges. Rather it is the interaction of enormous numbers of little masses and charges, and it becomes impossible to calculate the effects of all those little interactions. But we know that, since all the interactions involve either gravity or the electric force, the total KE+PE of all the matter involved will be the same before and after the interaction(s). That is the first law of thermodynamics.
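The derivation is one line (standard, added for completeness): for any force derived from a potential, \vec F = -\nabla V, Newton's second law gives

\frac{d}{dt}\left(\tfrac12 m v^2\right) = \vec F \cdot \vec v = -\nabla V \cdot \vec v = -\frac{dV}{dt}

so \frac{d}{dt}(KE + PE) = 0 along every trajectory. Both force laws quoted above are of this form.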

AM
 
  • #88


Andy Resnick said:
I guess I still don't understand what you mean by 'physical motivation'. If it's not a mathematical/conceptual model or a physical object, what is left?

1. observations don't match theory (this is how Planck motivated discretizing the energy in the blackbody integral)
2. finding the interface between two physical theories (i.e. quantum chaos)
3. finding a mechanism for an observation ("why is there night?" and no, not the proverbial "why"; in this case, the answer would be the motion of the planetary bodies)

I was asking if there was a 3. Why is there entropy?

Andy Resnick said:
No, there's a difference: conservation of energy (or any conservation law) - recall that 'conservation' is a *mathematical* statement - follows from elementary balance laws: the amount of something in a control volume 'now' is equal to the amount previously present, plus however much was produced within the volume, plus the net flux entering the volume.

This isn't where my confusion is; my confusion must be this: I'm still under the impression that the second law is also a *mathematical* statement (derived from statistics). Is that wrong?

Andrew Mason said:
But we know that, since all the interactions involve either gravity or the electric force, the total KE+PE of all the matter involved will be the same before and after the interaction(s). That is the first law of thermodynamics.

AM

Oh wow, I can't believe I never realized the 1st law of thermodynamics was equivalent to energy conservation. That's a little embarrassing...
 
  • #89


Pythagorean said:
This isn't where my confusion is; my confusion must be this: I'm still under the impression that the second law is also a *mathematical* statement (derived from statistics). Is that wrong?

The second law is not derived from classical statistical mechanics. Classical statistical mechanics is a somewhat unprincipled way to predict equilibrium quantities, but the fact that it reproduces equilibrium thermodynamics from a microscopic viewpoint shows that there is something to it.

The definition of a state variable called entropy, and its non-decrease in an isolated system is derived from the Kelvin statement of the second law of thermodynamics, or equivalently the Clausius statement, which is almost a "plain English" sentence. In contrast to the starting point of classical statistical mechanics, the Kelvin and Clausius statements are supported by lots of data. It is one of the most amazing deductions of physics, certainly on par with Einstein's deduction of special relativity from 2 postulates.
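The chain of steps (the standard textbook route, summarized here): the Kelvin/Clausius statements imply the Clausius inequality

\oint \frac{\delta Q}{T} \le 0

with equality for reversible cycles. Equality makes \int \delta Q_{\rm rev}/T path-independent, which licenses the definition of a state function dS = \delta Q_{\rm rev}/T; closing an arbitrary process with a reversible path then gives \Delta S \ge \int \delta Q/T, and hence \Delta S \ge 0 for an isolated system.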

So basically entropy exists because, as Andrew Mason said early in this thread:

Andrew Mason said:
The easiest form of the second law to understand is the Clausius statement of the second law:

"No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature."​

The Kelvin statement can be shown to be equivalent:

"No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work."​
 
Last edited:
  • #90


Ok, this is becoming a lot more physically tangible to me now.

My next step is really to find and look over the Clausius derivation; thank you everyone for your input.
 
  • #91


An exposition of the route from the Kelvin and Clausius statements to the existence of a state variable called entropy and its non-decrease in an isolated system can be found at http://ocw.mit.edu/courses/physics/8-333-statistical-mechanics-i-statistical-mechanics-of-particles-fall-2007/lecture-notes/lec2.pdf

Wikipedia has a fascinating history of entropy (http://en.wikipedia.org/wiki/History_of_entropy). I would be interested to know what people think of Jaynes's view.
 
Last edited by a moderator:
  • #92


Andy Resnick said:
Then let me clarify. Biological systems (or any system) do not violate the second law of thermodynamics. The second law of thermodynamics is perhaps the most fundamental law of nature we have.

Biological systems are *open*. They exchange both matter and energy with their surroundings. Thus, they can *locally* decrease their entropy (for example, by constructing proteins, by synthesizing ATP, by the chemiosmotic mechanism) at the expense of increasing the entropy of their surroundings- waste products.

OK. I agree. The point is that open systems not only exchange matter and energy but also entropy, and this explains why they can locally decrease their entropy (or energy or matter density).

Our misunderstanding was about the meaning of the term ''local entropy production''. This is a technical term and describes only that part of the local entropy change that cannot be explained by entropy flow. It is this term that the second law requires to be nonnegative (in a classical setting, interpretable as the amount of information irreversibly lost to the neglected microscopic degrees of freedom), no matter how much entropy flows into or out of the system.
 
  • #93


Andy Resnick said:
I looked through the table of contents on Amazon; the hydrodynamic coverage appears very similar to Chaikin and Lubensky's "Principles of Condensed Matter".

Out of curiosity, what does Reichl write down as the statistical mechanical result for viscosity?
It is expressed at the end of Section 13F (in my edition) as a momentum integral involving the inverse of the Lorentz-Boltzmann collision operator - too complex to write down.
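For comparison (the standard hard-sphere Chapman-Enskog result, added here; this is not Reichl's general formula), the first approximation for a dilute gas of spheres of diameter d is

\eta = \frac{5}{16 d^2} \sqrt{\frac{m k_B T}{\pi}}

The general collision-operator expression reduces to closed forms like this only after the inverse is evaluated in a truncated basis of polynomials.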
 
  • #95


A. Neumaier said:
It is expressed at the end of Section 13F (in my edition) as a momentum integral involving the inverse of the Lorentz-Boltzmann collision operator - too complex to write down.

That's what I expected- one of my criticisms of statistical mechanics (as with all mechanical theories) is that dissipative processes cannot currently be described in any useful way.
 
  • #96


Andy Resnick said:
That's what I expected- one of my criticisms of statistical mechanics (as with all mechanical theories) is that dissipative processes cannot currently be described in any useful way.
No, you misunderstood: they can be used for quantitative predictions, and Reichl gives explicit formulas (though not down to the point where the viscosity coefficient is numerically evaluated for a particular gas). With ''too complex to be written down'' I only meant that I didn't want to transcribe them into LaTeX for the post.
 
  • #97


A. Neumaier said:
No, you misunderstood: they can be used for quantitative predictions, and Reichl gives explicit formulas (though not down to the point where the viscosity coefficient is numerically evaluated for a particular gas). With ''too complex to be written down'' I only meant that I didn't want to transcribe them into LaTeX for the post.

Yeah, but that's just for linear viscosity - Chaikin and Lubensky provide calculations against experiment for liquid argon. And that's perhaps the simplest case. I'd like to see how statistical mechanics handles creep. Or fracture. Or water.

The reality is, in order to generate useful physical models, multiple conceptual approaches must be used- elements of both thermodynamics and statistical mechanics are often used to understand protein folding, for example.
 
Last edited:
  • #98


lalbatros said:
This doesn't prove it is a physical law.
It just tells us how we play dice.

This is my post nr 1111, for no physical reason.
One does not prove physical laws. Mathematics is about proof. Science is about disproof.

In order to debate the issue, one has to start with a definition of a physical law.

I would suggest the following definition: a physical law is:

a) a statement about some physical property or physical phenomenon that purports to be either universally true or true over a certain range of conditions,

b) that leads to predictions of physical behaviour that can be observed, so it would be falsifiable by the existence of contrary physical evidence and

c) for which no physical evidence has yet been found that falsifies it.

I suggest that the second law of thermodynamics fits this definition.

AM
 
Last edited:
  • #99


I agree with you, Andrew.

However, precisely, I have often asked myself how the second law could be disproved.
That's probably why it has been given different formulations, apparently equivalent.

When we observe entropy-decreasing fluctuations, we are supposed to "say" that the second law is irrelevant on the microscopic level.
Of course, if we wait longer - say, 1 billion years - we increase the chance of observing entropy-decreasing fluctuations on scales that we - today - would call "macroscopic".
Looks like, then, that the second law is about "small-scale" and "short-time" phenomena.
Are "small" and "short" then to be understood on a human scale?
Does that fit in the definition of a physical law?
Probably yes according to your definition list.
But that doesn't wipe out my problem.
It just means that I have to re-phrase my question.

Consider now the Clausius statement (http://en.wikipedia.org/wiki/Second_law_of_thermodynamics#Clausius_statement) of the second principle:

No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature.

Should I conclude, based on your definition of a physical law, that any fluctuation disproves the second principle? It would then be a physical law, but a wrong one!

Further, can't we also take the point of view that the Clausius statement simply defines what "temperature" means? Even empirical temperatures would fit the Clausius definition of temperature, according to the Clausius statement of the second principle.
Defining that heat (energy) goes from hot to cold, is that a law of physics?
Can a definition become a law?

However, I would be a very ungrateful engineer if I did not recognize the value of the second principle, at least for engineering. But I can hardly consider it a law of physics. I would rather consider it a pre-law of physics: a law that still needs to be discovered.
 
Last edited by a moderator:
  • #100


Andy Resnick said:
Yeah, but that's just for linear viscosity - Chaikin and Lubensky provide calculations against experiment for liquid argon. And that's perhaps the simplest case. I'd like to see how statistical mechanics handles creep. Or fracture. Or water.
Of course, as the models get more complex and the questions asked more detailed, calculations from first principles become too unwieldy to actually be performed. But they suffice to give a proof of concept - and this also holds for equilibrium thermodynamics.

Engineers will always find it simpler to measure the thermodynamic properties of water/steam than to derive them from a partition function. But if one wants to understand water in extreme situations where it can't be measured directly, the ab initio methods are useful.

The same holds for nonequilibrium statistical mechanics. It is far easier to measure the viscosity of water than to calculate it from first principles. But, for example, people use the theory to understand inflation after the big bang, where it is impossible to make experiments.
Andy Resnick said:
The reality is, in order to generate useful physical models, multiple conceptual approaches must be used- elements of both thermodynamics and statistical mechanics are often used to understand protein folding, for example.
You might be interested in my survey paper
A. Neumaier,
Molecular modeling of proteins and mathematical prediction of protein structure,
SIAM Rev. 39 (1997), 407-460.
http://arnold-neumaier.at/papers/physpapers.html#protein
 