What is the real second law of thermodynamics?

In summary: assuming you're asking about the entropy of an isolated system that is not yet in equilibrium, then it is true that the entropy will increase over time.
  • #36


Phrak,

I almost agree with you, except for what I don't understand.

However, how can we conceive that "knowledge" would be part of physics?
The world evolves according to the laws of physics, without taking our knowledge of it into account in any way.
The sun heats the Earth whether we know it or not, and whether there are humans on Earth or not.

Well, right ... except that the world doesn't even know physics!

Michel
 
  • #37


Andy Resnick said:
Believe who you want- science cares not a whit about credentials.

Andrew Mason said:
Science is not about "belief".

I'm kind of offended that you guys replied with this banal pedantry, especially when I was asking for rationalization (see RedX's and atyy's replies). Do I have to lecture you about the word belief and how it doesn't imply that science is a religion when scientists talk about beliefs? I don't think so.

atyy said:
(i) Does classical statistical mechanics require QM for its justification - no. We can stick to canonical variables in phase space. The conservation of phase space volume is not at odds with the apparent increase in volume that we call an increase in entropy - see eg. http://www.necsi.edu/projects/baranger/cce.pdf

(ii) Does classical statistical mechanics of identical particles require QM for its justification - yes. Because classically, identical particles have distinct trajectories, and so cannot be really identical.

Your professor was probably referring to the second idea, not the first.

The discussion with my professor was about the cause of thermodynamics (a physical description of why it must be so). The conflict you're talking about was my own misunderstanding of a lecture (I rewatched it. It's right where a student asks a question that I can't hear and I took Susskind's answer in the context of the lecture). This conflict was resolved by both your and Redx's replies.

I'm still curious: is there such a mechanistic description in classical thermodynamics? Would the statistical definition really satisfy that?
 
  • #38


moonman239 said:
Title says it all.

The most general second law says that the local entropy production at every point in space is nonnegative, and vanishes only in equilibrium.

The usual formulations of the second law are special cases of this.
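
In symbols (a standard local formulation in my own notation, not a quote): with entropy density [itex]s[/itex], entropy flux [itex]\mathbf{J}_s[/itex], and entropy production [itex]\sigma[/itex],

[tex]\frac{\partial s}{\partial t} + \nabla\cdot\mathbf{J}_s = \sigma \ge 0,[/tex]

with [itex]\sigma = 0[/itex] only in equilibrium. The familiar statements (Clausius, Kelvin, dS ≥ 0 for isolated systems) follow by integrating this balance over a suitable region.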
 
  • #39


Pythagorean said:
I'm kind of offended that you guys replied with this banal pedantry, especially when I was asking for rationalization (see RedX's and atyy's replies). Do I have to lecture you about the word belief and how it doesn't imply that science is a religion when scientists talk about beliefs? I don't think so.



The discussion with my professor was about the cause of thermodynamics (a physical description of why it must be so). The conflict you're talking about was my own misunderstanding of a lecture (I rewatched it. It's right where a student asks a question that I can't hear and I took Susskind's answer in the context of the lecture). This conflict was resolved by both your and Redx's replies.

I'm still curious: is there such a mechanistic description in classical thermodynamics? Would the statistical definition really satisfy that?

I apologize- I was having a bad day yesterday and tossed off a snippy comment. I know you make an honest effort to understand things.

But getting back to the point, the concept of 'entropy' admits many interpretations (thermodynamic, statistical, information, etc), and while these interpretations are (as they must be) equivalent, one interpretation may be more 'useful' to describe a situation than another.

Claims that QM (or SM) is *required* to *understand* or *explain* thermodynamics are not founded on good science. To ask why physical reality *must* be the way it is leads to the anthropic principle, which I am personally uncomfortable with.
 
  • #40


Pythagorean said:
I'm still curious: is there such a mechanistic description in classical thermodynamics? Would the statistical definition really satisfy that?

The second law follows from classical statistical mechanics in the same way as from quantum statistical mechanics. The treatment by Gibbs 1902 was classical but survived the quantum revolution without any qualitative changes, and without quantitative changes bigger than O(hbar).

What does not follow from classical statistical mechanics is that mixing identical substances does not increase the entropy. But this follows by extending the state space to one with a variable number of particles (for the grand canonical ensemble) and weighting the Liouville measure for the N-particle space by 1/N!, corresponding to Boltzmann counting. The factor can be interpreted as accounting for indistinguishability of the particles. Nothing quantum is needed for that.

However, quantum field theory explains indistinguishability in a very natural way.
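
To make the 1/N! point concrete (a standard textbook calculation, not part of the original post): with Boltzmann counting, the classical ideal gas entropy takes the extensive Sackur-Tetrode form

[tex]S = Nk\left[\ln\frac{V}{N\lambda^3} + \frac{5}{2}\right], \qquad \lambda = \frac{h}{\sqrt{2\pi m k T}},[/tex]

so removing a partition between two identical samples changes nothing. Without the 1/N!, the logarithm contains V rather than V/N, the entropy fails to be extensive, and mixing two identical gases spuriously yields [itex]\Delta S = 2Nk\ln 2[/itex] (the Gibbs paradox).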
 
  • #41


Just an FYI: I learned thermodynamics from Reif, and we started with the binomial distribution. Most of that class was mathematics with absolutely zero physical motivation (only statistical motivation). I didn't care about or absorb any of it; I just pushed through for good marks.

So I was quite refreshed by Susskind's treatment.

AM, I will check out Feynman's lecture; I hope it won't be like Reif's treatment.

Andy Resnick said:
I apologize- I was having a bad day yesterday and tossed off a snippy comment. I know you make an honest effort to understand things.

No problem, I'm glad to see you return to the discussion.

But getting back to the point, the concept of 'entropy' admits many interpretations (thermodynamic, statistical, information, etc), and while these interpretations are (as they must be) equivalent, one interpretation may be more 'useful' to describe a situation than another.

Claims that QM (or SM) is *required* to *understand* or *explain* thermodynamics are not founded on good science. To ask why physical reality *must* be the way it is leads to the anthropic principle, which I am personally uncomfortable with.

I was trying to be careful to make the distinction between anthropomorphism and physical mechanisms. Returning to the spring discussion, a block of concrete can have a spring constant, but it's nothing you'd realize unless you understood compressibility and solid state physics. The micro explanation is more general and covers more cases of "springiness" than the macro treatment of Hooke's law. But it also explains why we have springiness in all macro materials.

As another example, conservation of charge says KCL must be so.

I was under the impression that HUP said entropy must be so (though I realize now that that's unfounded).

Neumaier said:
The second law follows from classical statistical mechanics in the same way as from quantum statistical mechanics. The treatment by Gibbs 1902 was classical but survived the quantum revolution without any qualitative changes, and without quantitative changes bigger than O(hbar).

What does not follow from classical statistical mechanics is that mixing identical substances does not increase the entropy. But this follows by extending the state space to one with a variable number of particles (for the grand canonical ensemble) and weighting the Liouville measure for the N-particle space by 1/N!, corresponding to Boltzmann counting. The factor can be interpreted as accounting for indistinguishability of the particles. Nothing quantum is needed for that.

However, quantum field theory explains indistinguishability in a very natural way.

Thank you for your reply. I'm beginning to recognize the importance of learning the history of science along with the science.
 
  • #42


I guess strictly speaking, neither classical statistical mechanics nor its quantum counterpart explains the second law. An explanation should start from the exact equations of motion. I don't know what his reference is, but this guy says

"Reconciling the reversibility of laws of physics governing the microscopic domain with the observed irreversibility of macroscopic phenomena is a fundamental problem. Of course, not all microscopic laws of physics are reversible: weak nuclear interactions violate time reversal symmetry, and the collapse of the quantum wave-function in the act of observation is irreversible. The former interactions in fact do not play any significant role in everyday observations that lead to the second law. The irreversible collapse of the wave-function may itself be an artifact of treating macroscopic observers and microscopic observables distinctly. There are proponents of the view that the reversibility of the currently accepted microscopic equations of motion (classical or quantum) is indicative of their inadequacy. However, the advent of powerful computers has made it possible to simulate the evolution of collections of large numbers of particles, governed by classical, reversible equations of motion. Although simulations are currently limited to relatively small numbers of particles (10^6), they do exhibit the irreversible macroscopic behaviors similar to those observed in nature (typically involving 10^23 particles). For example, particles initially occupying one half of a box proceed to irreversibly,and uniformly,occupy the whole box. (This has nothing to do with limitations of computational accuracy; the same macroscopic irreversibility is observed in exactly reversible integer based simulations, such as with cellular automata.) Thus the origin of the observed irreversibilities should be sought in the classical evolution of large collections of particles." http://ocw.mit.edu/courses/physics/8-333-statistical-mechanics-i-statistical-mechanics-of-particles-fall-2007/lecture-notes/lec9.pdf
 
  • #44


Pythagorean said:
Returning to the spring discussion, a block of concrete can have a spring constant, but it's nothing you'd realize unless you understood compressibility and solid state physics. The micro explanation is more general and covers more cases of "springiness" than the macro treatment of hooke's law. But it also explains why we have springiness in all macro materials.

That implies we can't understand a phenomenon until there exists a detailed microscopic model from which the phenomenon emerges.

I disagree with that, because when I say I understand something, it means I have a predictive model and the predictions match with experiment. The model can use any set of constructs I choose- so long as it makes testable predictions that match with reality.

I understand Hooke's law in terms of how far something stretches under an applied force- that's plenty sufficient for most applications. Certainly, it's sufficient for Physics I. Alternatively, I understand Hooke's law as a linear 1-D stress-strain relationship. I understand strain because I can measure the amount of deformation in an object with a ruler. I understand stress because Newton's law F = dp/dt and Cauchy's law [itex]\nabla\cdot T = Dp/Dt[/itex] let me understand stress in the context of force (or energy). Now, I can see Hooke's law as a special case of a more general phenomenon (nonlinear 3-D stress-strain relationships). Notice, I didn't need to know about atoms (or any microscopic details) in order to understand Hooke's law.

To be fair, I can't predict the specific constitutive relationship for a given object with this model; but since constitutive relationships can be measured in the lab, I think that's a weak criticism.
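
To make that macroscopic chain of reasoning concrete (my illustration, using only continuum quantities): for a uniform rod of cross-section A, length L, and lab-measured Young's modulus E, the linear 1-D stress-strain law [itex]\sigma = E\epsilon[/itex] gives Hooke's law directly,

[tex]F = \sigma A = EA\,\frac{\Delta L}{L} = k\,\Delta L, \qquad k = \frac{EA}{L},[/tex]

with no reference to atoms - the constitutive constant E comes from the lab, as above.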

If I wanted to 'play dumb', I could continue to ask "but how do I know what *that* means?" over and over again until I run out of explanations- some people may choose to interpret that as the limit of my knowledge. I, OTOH, interpret that as denying the possibility of knowledge.

Again, requiring that entropy be explained in terms of a microscopic theory in order to understand entropy is overstating the case. Sure, a microscopic theory is useful- but since such a theory does not exist, we should not then say we know nothing about entropy.

Is 'KCL' Kirchhoff's circuit law? Yes, Kirchhoff's law I_in = I_out follows from conservation of charge. But what is charge? (See? We can keep playing that game- do you really have to understand QED in order to master Physics I?)
 
  • #45


I certainly don't disagree with you, for all practical purposes; but it's allowed, in scientific discussion, to look at the frontier. I don't model neurons as quantum particles... I mean, seriously! I don't even have to model each potassium ion. I model them as an isopotential electrochemical circuit (more or less), like all the other ion models do. But it doesn't mean I'm going to act like I have a bucket over my head and only see circuits when I talk about cell behavior in open discussion.

You can't deny that understanding the micro-details of a spring has helped us understand the macro-view of it. That it goes the other way is already a given (which is why more people are laymen than scientists - macro is closer to everyday experience, and layman explanations are even closer to everyday experience, so it's natural that we're more comfortable with macro explanations... maybe... once you start talking about cosmology, perhaps it's easier to intuit as a micro-member of the universe).

But to the point with thermodynamics in general, I was interested in a physical motivation for something I've only seen motivated from mathematics when I took undergrad thermodynamics (Reif). I was totally turned off by it at the time, and am just now getting back into thermodynamics.

Susskind provided the physical motivation pedagogically, atyy continues to provide research motivation for the question in his links. I'm not insistent on invoking QM, the question still remains without it.

addendum:

I didn't actually mean "micro was better than macro", I just meant that you don't have to box yourself into one or the other in a discussion setting.
 
  • #46


Pythagorean said:
I'm kind of offended that you guys replied with this banal pedantry, especially when I was asking for rationalization (see RedX's and atyy's replies). Do I have to lecture you about the word belief and how it doesn't imply that science is a religion when scientists talk about beliefs? I don't think so.
Sorry. I certainly didn't mean to offend, but I can see how you might take it that way. I didn't mean to be banal or pedantic. Sometimes we need to rein in our rhetoric. Again sorry for that.

AM
 
  • #47


Andrew Mason said:
Sorry. I certainly didn't mean to offend, but I can see how you might take it that way. I didn't mean to be banal or pedantic. Sometimes we need to rein in our rhetoric. Again sorry for that.

AM

In hindsight, my post could have been less provoking. I was seriously not intending to compare credentials, I was just frustrated with my intuition conflict.
 
  • #48


atyy said:
(i) Does classical statistical mechanics require QM for its justification - no. We can stick to canonical variables in phase space. The conservation of phase space volume is not at odds with the apparent increase in volume that we call an increase in entropy - see eg. http://www.necsi.edu/projects/baranger/cce.pdf

I only had time to read pages 15-17.

I vaguely recall objects such as the Peano curve that can fill the entire plane. So is it possible for phase-space tendrils to be so close together that every cell of area hbar contains at least one phase-space point? I think that's probably what Susskind is thinking (although I admit I didn't watch Susskind's lectures (lack of time)): using quantum mechanics to explain how an apparent increase can be taken as a real increase, since there is no such thing as filling half a pixel of hbar - the pixel is either filled or unfilled. But certainly how far apart the phase-space tendrils become is independent of quantum mechanics, and hence whether the system has hbar resolution is decided not by quantum mechanics but by the equations of classical mechanics. In that sense entropy is independent of quantum mechanics.

Still, how do you normalize phase space without hbar?
 
  • #49


RedX said:
Still, how do you normalize phase space without hbar?

The usual trick is to say, eg. in the microcanonical ensemble, instead of just considering all states with exactly energy E (surface in phase space), we consider all states with energy E plus some slack (volume in phase space). Then you can use the volume as the normalization factor. One has to stick to canonical variables, since it's only under those that phase space volume is invariant.

I'm reading from http://ocw.mit.edu/courses/physics/8-333-statistical-mechanics-i-statistical-mechanics-of-particles-fall-2007/lecture-notes/lec12.pdf , and also the ideal gas example in http://ocw.mit.edu/courses/physics/8-333-statistical-mechanics-i-statistical-mechanics-of-particles-fall-2007/lecture-notes/lec13.pdf
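
In formulas (my rendering of the standard construction; treat it as a sketch): the microcanonical count for N particles in three dimensions is taken over a shell of width [itex]\Delta[/itex],

[tex]\Omega(E) = \frac{1}{N!\,h^{3N}}\int_{E<\mathcal{H}(q,p)<E+\Delta} d^{3N}q\; d^{3N}p .[/tex]

Classically, any constant with the dimension of action works in place of h; it only shifts the entropy by an additive constant, so entropy differences - and the second law - don't depend on the choice.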
 
  • #50


RedX said:
I think that's probably what Susskind is thinking (although I admit I didn't watch Susskind's lectures (lack of time)), using quantum mechanics to explain how an apparent increase can be taken as a real increase, since there is no such thing as filling half a pixel of hbar -

for context:

Susskind was reluctant to talk about QM; the students kept pushing him with questions until he admitted that there was an h-bar limit, but I couldn't hear the questions, so the students themselves could have already invoked QM and that's what he conceded to.

My professor's words were that the classical description of entropy was hand-wavy (physically, because it's described statistically) and that you really needed QM (he didn't rationalize this). My natural assumption was that with QM, statistics IS physical motivation, because QM is fundamentally statistical.

But I'm still kind of curious why we would have to coarse-grain the volume in the first place. Why should there be a constant associated with the phase volume? Shouldn't we expect continuity in a classical treatment?
 
  • #51


thank you atyy, that answers my question too
 
  • #52


atyy said:
The usual trick is to say, eg. in the microcanonical ensemble, instead of just considering all states with exactly energy E (surface in phase space), we consider all states with energy E plus some slack (volume in phase space). Then you can use the volume as the normalization factor.

That makes sense. Can't believe I never thought of that before.

Pythagorean said:
Susskind was reluctant to talk about QM; the students kept pushing him with questions until he admitted that there was an h-bar limit, but I couldn't hear the questions, so the students themselves could have already invoked QM and that's what he conceded to.

Susskind is brilliant. But you can tell that he gets frustrated with the students and will sometimes just nod so that he can move on. I think his ideas and overall picture are what's best about those lectures, rather than getting every word or detail right. Not saying that's what happened (since I didn't watch it), but in general I trust books more than lectures for fine detail because of the possibility of real-time mistakes.

The textbook "Feynman Lectures on Physics" are even more brilliant, but since it's a book and not a lecture, the details are right too. I think it's a brilliant piece of work how he derives conservation of gravitational potential energy, principle of virtual work, and mechanical advantage from the observation that there is no perpetual machine, and then the cherry on top is the 345 triangle with a chain of beads surrounding it.
 
  • #53


I started to read "Surely You're Joking, Mr. Feynman!" when I was first studying physics, and I was totally turned off by his arrogant tone and haven't picked up any Feynman since. I guess it wouldn't hurt to go back now and read something that has some actual content instead.
 
  • #54


Pythagorean said:
But I'm still kind of curious why we would have to coarse-grain the volume in the first place. Why should there be a constant associated with the phase volume? Shouldn't we expect continuity in a classical treatment?
In the grand canonical ensemble, one must weight the contributions of the N-particle integrals appropriately by factors whose unit is action^{-3N} (in three dimensions), in order to get something dimensionless.
This naturally introduces the Planck constant even in the classical case.
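
Spelled out (my rendering of the standard formula, as a sketch): the classical grand canonical partition function

[tex]Z(T,\mu,V) = \sum_{N=0}^{\infty}\frac{e^{\beta\mu N}}{N!\,h^{3N}}\int d^{3N}q\; d^{3N}p\; e^{-\beta H_N(q,p)}[/tex]

is dimensionless precisely because each N-particle phase-space integral, with units of action^{3N}, is divided by h^{3N}. Classically any action constant would do; the value h is singled out only by matching the quantum result at high temperature.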
 
  • #55


There are numerous numerical experiments demonstrating increasing entropy in classical dynamical systems.
This alone shows that QM is not needed to justify the second law.

Nevertheless, the Loschmidt paradox remains in these experiments.
Reversing all the velocities midway on the path to equilibrium leads to a temporary decrease of entropy followed by a new increasing-entropy history.

The Loschmidt paradox remains exactly as sharp in Quantum Mechanics as long as no measurement is performed on the system.
Therefore, QM could be investigated as a basis for the second principle only from the side of the famous measurement process.

Yet, the reverse point of view, that the QM measurement irreversibility is a consequence of the second principle, is far more sensible ... since the second principle can live without QM. In addition, this hypothesis would relieve Quantum Mechanics of a useless postulate and make it a physical consequence of a complex evolution.

In this last point of view, one crucial question remains then.
What is the meaning of the wave-function?
Is the wave function a consequence of the second law?

I can be arrogant sometimes, even when parroting the giants.
 
  • #56


Historically, Quantum Mechanics emerged from the need to reconcile statistical thermodynamics with electrodynamics and observations.
Therefore it makes sense to argue that QM is absolutely needed for the consistency of:

1) either statistical thermodynamics
2) or electrodynamics
3) or both

We know that QM is certainly necessary for the consistency of electrodynamics.
I don't know any reason why QM is needed in statistical thermodynamics, and in particular to justify the second principle.
The normalisation of the phase space is no more than a convenience.
 
  • #57


lalbatros said:
Nevertheless, the Loschmidt paradox remains in these experiments. Reversing all the velocities midway on the path to equilibrium leads to a temporary decrease of entropy followed by a new increasing-entropy history.
For a macroscopic body, it is impossible to reverse all these velocities. Thus the paradox has no experimental relevance.
 
  • #58


A. Neumaier said:
For a macroscopic body, it is impossible to reverse all these velocities. Thus the paradox has no experimental relevance.

I agree that preparing an entropy-decreasing system is difficult, technically.
Does that mean that the second principle should be rephrased as:

"Preparing an entropy-decreasing system is too difficult to be part of physics?"

In addition, microfluctuations by themselves are deviations from the second principle.
Would that imply that there is a no man's land in physics, somewhere between microscopic and macroscopic, where there is no law and no understanding?

Furthermore, I am not at all sure that entropy-decreasing experiments have not already been performed, not only in computer simulations, but also in the lab. And even if they have not, why could you absolutely exclude such experiments?

Where do you put the boundary between microscopic and macroscopic?
And is there a new physics that pops up just when crossing this boundary?

Is the second principle a physical law at all?
 
  • #59


lalbatros said:
"Preparing an entropy-decreasing system is too difficult to be part of physics?"
The entropy can decrease easily in open nonequilibrium systems. The precise formulation of the second law is that the local mean entropy production is always nonnegative.

Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known.
lalbatros said:
In addition, microfluctuations by themselves are deviations from the second principle. Would that imply that there is a no man's land in physics, somewhere between microscopic and macroscopic, where there is no law and no understanding?
No. It just means that the uncertainty in checking things gets bigger and bigger as your systems get smaller and smaller. So the second law means less and less.
lalbatros said:
Where do you put the boundary between microscopic and macroscopic?
It is like the split between observer and observed: one can put it wherever one likes without changing the physics.
 
  • #60


A. Neumaier said:
The entropy can decrease easily in open nonequilibrium systems.

My teaser is that, based on the fundamental laws of physics, entropy can decrease even in a closed system.
That's indeed the Loschmidt paradox.
Is the second law then not inconsistent with the other laws of physics?

I like to put it another way.
The second law is a theory about what we usually observe.
The fundamental laws are theories about everything that could be observed.
(theories, not absolute truths of course)

The interesting point is: how can these be reconciled?
This is of course an old question that has already received many answers.
What has been done since Boltzmann and Poincaré?

This thread has already referred to many ideas.
One idea I do not believe in is a role for QM.
I believe just the opposite: the measurement postulate is a consequence of the second principle.
 
  • #61


lalbatros said:
Is the second principle a physical law at all?
It is a law because it successfully and consistently predicts how things will actually behave. It is never violated: not because a violation is physically impossible, but because a violation is statistically impossible.

Even if you are dealing with a relatively small number of molecules, the second law will not be violated. Suppose you had a billionth of a trillionth of a mole (10^-21 mole, or about 600 molecules) of a gas at temperature T in a container. Since the motions are random, could the gas spontaneously separate out into fast molecules in one half of the container and slow ones in the other for a period long enough to detect it?

The probability of even that occurring is so infinitesimally small that you would have to wait longer than the age of the universe before it happened anywhere in the universe. So the answer is: "no".
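
A quick back-of-the-envelope check (my numbers, and for an even cruder event - spatial rather than fast/slow separation): the chance that 600 independently moving molecules all sit in an assigned half of the container at a given instant is about

[tex]P \approx \left(\tfrac{1}{2}\right)^{600} \approx 10^{-181},[/tex]

so even observing the gas every picosecond for the age of the universe (~10^17 s, i.e. ~10^29 looks) gives an expected number of sightings of order 10^-152.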

AM
 
  • #62


A. Neumaier said:
Nobody has the slightest idea about how to construct something with negative local mean entropy production, consistent with what is known.

That's not true- my parents sure did, and I suspect yours did as well.
 
  • #63


Pythagorean said:
But to the point with thermodynamics in general, I was interested in a physical motivation for something I've only seen motivated from mathematics when I took undergrad thermodynamics (Reif). I was totally turned off by it at the time, and am just now getting back into thermodynamics.

Susskind provided the physical motivation pedagogically, atyy continues to provide research motivation for the question in his links. I'm not insistent on invoking QM, the question still remains without it.
<snip>

The original physical motivation for 'entropy' was simply the development of more efficient engines, and the realization that the limiting efficiency is set entirely by the operating temperatures of the hot and cold reservoirs.

In fact, temperature has been conspicuously missing from this thread.
 
  • #64


Andy Resnick said:
That's not true- my parents sure did, and I suspect yours did as well.
Biological processes are not exempt from the second law in the form of nonnegative local mean entropy production.
 
  • #65


Andy Resnick said:
In fact, temperature has been conspicuously missing from this thread.
Once you keep temperature constant, entropy becomes essentially irrelevant.
The governing extremal principle is then one of minimizing a free entropy rather than maximizing entropy.
 
  • #66


atyy said:
(i) Does classical statistical mechanics require QM for its justification - no. We can stick to canonical variables in phase space. The conservation of phase space volume is not at odds with the apparent increase in volume that we call an increase in entropy - see eg. http://www.necsi.edu/projects/baranger/cce.pdf

There's a lot in that document that really bugs me- for example:

"But there is another name for statistical mechanics: Thermodynamics." (p. 11). That is not true- they are quite different. AFAIK, SM is currently able to only generate results for time-independent (equilibrium) systems. The partition function, in particular, does not depend on time. Thermo*dynamics* is not thermo*statics*. Thermodynamics does allow for time-dependent non-equilibrium systems, but many of the concepts from thermostatics (temperature, for example) become more difficult to define.

http://pre.aps.org/abstract/PRE/v49/i2/p1040_1

Also, mechanics uses even-dimensional symplectic spaces, while thermodynamics uses odd-dimensional contact spaces.

"But entropy is simply a fancy word for “disorder”." (p.12). Also not true.

There is also an insistence that there is a mechanical theory of heat, which, AFAIK, is not true. Dissipative mechanisms cannot be explained in terms of conservative forces.

"The conclusion is that our dimensionless entropy, which measures our lack of knowledge, is a purely subjective quantity." (p. 17). Not true- there are 'objective' measures of information (Kolmogorov). Also, because there is an absolute scale for temperature there is an absolute scale for entropy. That's besides the fact that entropy *changes* are objective.
 
  • #67


A. Neumaier said:
Once you keep temperature constant, entropy becomes essentially irrelevant.
The governing extremal principle is then one of minimizing a free entropy rather than maximizing entropy.

That's not the point: we can't be so cavalier when discussing nonequilibrium systems. Entropy can change even if the temperature is constant: chemical reactions, mixing/diffusion, etc.

I'm not sure what you mean by 'free entropy'- the opposite of free energy? Negentropy?
 
  • #68


A. Neumaier said:
Biological processes are not exempt from the second law in the form of nonnegative local mean entropy production.

Eh? That's the whole point of living systems- I keep my entropy low at the expense of raising the entropy somewhere else.
 
  • #69


Andy Resnick said:
Eh? That's the whole point of living systems- I keep my entropy low at the expense of raising the entropy somewhere else.
Yes, but this doesn't involve a violation of the second law.

There are two processes affecting the entropy distribution: 1. local entropy production, and 2. flow of entropy. The former is nonnegative at all places (this is the second law), whereas the latter redistributes the entropy between places in a conserved fashion.

For example, in fluid flow, local entropy production is associated with the dissipative terms in the Navier-Stokes equations. Neglecting the entropy production gives the Euler equations. The latter are conservative: their entropy current, together with the entropy density, satisfies a continuity equation, which is not the case for Navier-Stokes.
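
For reference (a standard result from nonequilibrium thermodynamics, neglecting bulk viscosity; my rendering): for a Newtonian fluid with shear viscosity [itex]\mu[/itex] and thermal conductivity [itex]\kappa[/itex], the local entropy production density is

[tex]\sigma = \frac{2\mu}{T}\,\mathsf{S}\!:\!\mathsf{S} + \frac{\kappa(\nabla T)^2}{T^2} \;\ge\; 0,[/tex]

where [itex]\mathsf{S}[/itex] is the traceless rate-of-strain tensor. Setting [itex]\mu=\kappa=0[/itex] (Euler) kills the production term and leaves only the conservative entropy current.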

The low entropy of biochemical substances (or, for that matter, inanimate crystals) is a result of an entropy current during their formation that dominates the (still positive) entropy production.

All this can be read in books on nonequilibrium thermodynamics. My favorite book is Reichl's Modern Course in Statistical Mechanics.
 
  • #70


Andy Resnick said:
There's a lot in that document that really bugs me- for example:

"But there is another name for statistical mechanics: Thermodynamics." (p. 11). That is not true- they are quite different. AFAIK, SM is currently able to only generate results for time-independent (equilibrium) systems. The partition function, in particular, does not depend on time. Thermo*dynamics* is not thermo*statics*. Thermodynamics does allow for time-dependent non-equilibrium systems, but many of the concepts from thermostatics (temperature, for example) become more difficult to define.

http://pre.aps.org/abstract/PRE/v49/i2/p1040_1

Also, mechanics uses even-dimensional symplectic spaces, while thermodynamics uses odd-dimensional contact spaces.

"But entropy is simply a fancy word for “disorder”." (p.12). Also not true.

There is also an insistence that there is a mechanical theory of heat, which, AFAIK, is not true. Dissipative mechanisms cannot be explained in terms of conservative forces.

"The conclusion is that our dimensionless entropy, which measures our lack of knowledge, is a purely subjective quantity." (p. 17). Not true- there are 'objective' measures of information (Kolmogorov). Also, because there is an absolute scale for temperature there is an absolute scale for entropy. That's besides the fact that entropy *changes* are objective.

I'll just discuss your last point, since the previous points seem mainly about terminology.

The last point is in the context of a Hamiltonian system, where Liouville's theorem guarantees that the volume of phase space remains the same. So for a Hamiltonian system, the entropy increase is subjective.

If the system is not Hamiltonian, then there can be objective increases in entropy.
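
The underlying statement (standard, my rendering): Liouville's theorem says the fine-grained phase-space density is constant along trajectories,

[tex]\frac{d\rho}{dt} = \frac{\partial\rho}{\partial t} + \{\rho, H\} = 0,[/tex]

so the fine-grained Gibbs entropy [itex]S = -k\int \rho\ln\rho\, d\Gamma[/itex] is exactly conserved. Only a coarse-grained entropy, defined relative to a chosen partition of phase space, can increase - which is the sense in which the increase is observer-dependent.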
 
