What is the real second law of thermodynamics?

In summary: assuming you're asking about the entropy of an isolated system that is not yet in equilibrium, then for all practical purposes the entropy will increase over time.
  • #1
moonman239
What is the "real" second law of thermodynamics?

Title says it all.
 
  • #2


The second law of thermodynamics says that a closed system will tend to maximize its entropy.
 
  • #3


Multiplicity increases, or equivalently, information about a system decreases. As much as it can...
 
  • #4


The easiest form of the second law to understand is the Clausius statement of the second law:

"No process is possible whose sole result is the transfer of heat from a body of lower temperature to a body of higher temperature."​

The Kelvin statement can be shown to be equivalent:

"No process is possible in which the sole result is the absorption of heat from a reservoir and its complete conversion into work."​

AM
 
  • #5


I'd have to go with entropy in a closed system increases over time.

However, just to give you some food for thought, is it possible for the entropy of a closed system to decrease?

Say you have an insulated box of gas, but the gas is only occupying half of the box. If you wait awhile the gas will eventually occupy all of the box, and the entropy will have increased.

But maybe if we wait awhile longer, the gas will return to just occupying half the box?

Keep in mind that all the molecules of the gas obey Newton's laws, and in Newton's laws there is no uncertainty: in principle if we have a strong enough computer, we can calculate where all the particles will be in the box exactly.

In fact, there is even a theorem in classical mechanics, called the Poincaré recurrence theorem, which states that for a bounded system obeying Newton's laws, if you wait long enough, the system will return arbitrarily close to its initial state. This is a 100% certainty.

So there is a 100% certainty that the gas will be in only half the box again at some really long time.

For more information take a look at Wikipedia:

http://en.wikipedia.org/wiki/Poincare_recurrence#Recurrence_theorem_and_entropy
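
To put rough numbers on "awhile longer", here is a small Python sketch (the molecule count and the sampling rate are assumed values purely for illustration). The probability that every molecule happens to be in the left half at a given instant is (1/2)^N, since each molecule is independently on either side:

[code]
import math

N = 1e22  # assumed number of gas molecules (a few hundredths of a mole)

# Probability that all N molecules are in the left half at one instant: (1/2)^N
log10_probability = -N * math.log10(2)
print(f"log10 P(all in left half) ~ {log10_probability:.3e}")   # about -3.01e+21

# Even checking the box once per picosecond for the age of the universe
# (~4.3e17 s) gives only ~4e29 snapshots -- nowhere near the ~10^(3e21)
# snapshots you would need, on average, to catch such a fluctuation.
snapshots = 4.3e17 / 1e-12
print(f"snapshots since the Big Bang at one per picosecond: {snapshots:.1e}")
[/code]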
 
  • #6


RedX said:
However, just to give you some food for thought, is it possible for the entropy of a closed system to decrease?
...
So there is a 100% certainty that the gas will be in only half the box again at some really long time.
This is why I personally think the second law is best phrased not as a forcing law, but as an expectation and rational constraint on decisions.

Given no further information besides the entropy and the macroscopic constraints, the entropy of a system is by construction more likely to increase than to decrease during any change.

This certainly does not mean that it WILL or MUST increase. Strictly speaking, I'd say that form of the second law is not quite correct. The fact that it is true in the FAPP sense comes down to the saying that "events with sufficiently low probability simply never happen".

So the second law is one of the rare few things in physics which is clear. The only not-so-trivial thing about probability theory and entropic reasoning is the role of the CHOICE of the microstructure (and thus the prior). Because there is no denying, if you look closely enough, that the measure of disorder (entropy) is actually relative, in the sense that it depends on an ergodic hypothesis. And generally, at a first level of analysis, there is nothing preventing different observers from making DIFFERENT ergodic hypotheses, yielding them inequivalent entropic expectations. This latter point is the only thing that is not so trivial here. Essentially this means that each observer has their own distorted direction suggested by the second law.

But in CLASSICAL nonrelativistic physics this latter issue is not there, and entropy is objective. In both relativity and observer-dependent theories, though, these things are far more intricate. In QM one can define different entropy measures, like the von Neumann entropy. But this is in fact NOT the same measure, even though both are called entropy. Here again there is lots more food for thought.

/Fredrik
 
  • #7


RedX said:
But maybe if we wait awhile longer, the gas will return to just occupying half the box?

Yes, where "awhile" means much much [...] much much longer than the age of the universe. As I recall, estimating this time is an exercise in the book that I used the last time I taught thermodynamics.

This reflects the difference between classical thermodynamics and statistical thermodynamics, in which the Second Law is a statistical statement, not an absolute one.
 
  • #8


All this is a little over my head. I always found it amazing that if you set up a system of differential equations together with boundary conditions, then if you let time go to infinity you'll get a Boltzmann distribution as your asymptotic behavior, no matter what your differential equations are. But I've also seen the proof of the Poincaré recurrence theorem before, so I'm a little confused, since the recurrence theorem seems to disprove that asymptotic solutions of a system of differential equations even exist.

Now of course a Boltzmann distribution is the same as an equal distribution over states with the same energy (i.e. the canonical and microcanonical ensembles are equivalent). This is the result that's derived in every class: the fluctuations in energy of the Boltzmann distribution go as the specific heat, which is negligible.

So are there asymptotic solutions to (phase-volume preserving) differential equations with boundary conditions, and is the asymptotic solution that all states with the same energy are equally probable?

As for adding relativity to thermodynamics, this is really confusing to me. You don't expect the partition function to be relativistically invariant. However, the path integral is relativistically invariant, and the partition function is merely the path integral (using imaginary time), except you sum over all possible states for the endpoints. If the partition function is Lorentz invariant, then so should be the entropy, since temperature is a scalar (partition function invariant -> helmholtz energy invariant -> entropy invariant).

As always I suspect that it's because the boundary conditions change when converting the partition function to the path integral, and this is the cause of the breaking of Lorentz invariance. But I thought Lorentz invariance could be restored by introducing the velocity of the heat bath as a 4-vector.

Anyways, in an actual calculation of entropy, you do sum over all microstates, even microstates of the box that are only half-occupied. It changes the final result by so little that it can be ignored.
 
  • #9


Going along with what Andrew Mason said, the second law is a fundamental statement about what kinds of physical processes can occur in nature. It does this through the concept of entropy. Most other statements of the second law amount to different ways of defining the entropy.
 
  • #11


Andy Resnick said:
According to the fluctuation-dissipation theorem, it regularly does:

http://prl.aps.org/abstract/PRL/v89/i5/e050601

If you took Avogadro's number([tex]N_{A}[/tex]) of fair coins and tossed them all at once, the overwhelming probability is that the results would be proportionately very close to 0.5 heads (the point of maximal entropy). However, there exists a finite probability that they all could be heads, specifically [tex](1/2)^{N_{A}}[/tex]. With much smaller numbers (or scales), the probability of larger deviations from the expected values (maximal entropy) increases.
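
To make the concentration around 0.5 quantitative, here is a small Python sketch using the normal approximation to the binomial (the deviation size eps is an arbitrary illustrative choice):

[code]
import math

N = 6.022e23   # Avogadro's number of fair coins
eps = 1e-9     # ask how likely a deviation of one part in a billion from 50% is

# log10 of the probability that ALL coins land heads: (1/2)^N
print(f"log10 P(all heads) ~ {-N * math.log10(2):.3e}")

# Normal approximation: the fraction of heads has standard deviation 1/(2*sqrt(N)),
# so a deviation eps corresponds to z = 2*eps*sqrt(N) standard deviations.
z = 2 * eps * math.sqrt(N)
# Gaussian tail: 2*Phi(-z) ~ exp(-z^2/2) * sqrt(2/pi) / z; work with the log to avoid underflow
log10_tail = (-z * z / 2 - math.log(z) - 0.5 * math.log(math.pi / 2)) / math.log(10)
print(f"z = {z:.0f}; log10 P(|fraction - 0.5| > {eps}) ~ {log10_tail:.0f}")
[/code]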
 
  • #12


SW VandeCarr said:
If you took Avogadro's number([tex]N_{A}[/tex]) of fair coins and tossed them all at once, the overwhelming probability is that the results would be proportionately very close to 0.5 heads (the point of maximal entropy). However, there exists a finite probability that they all could be heads, specifically [tex](1/2)^{N_{A}}[/tex]. With much smaller numbers (or scales), the probability of larger deviations from the expected values (maximal entropy) increases.
You can say that there is a finite probability. But the chances are still infinitesimally small that it has ever occurred anywhere in the history of the universe or that it ever will. That is about the same as the probability in quantum mechanics that an apple will not obey Newton's law of gravity. So we call it a law.

AM
 
  • #13


One of my physics professors conveyed a definition of thermodynamics to me that has stuck. I believe it was originally presented by Susskind (who has lectures available online).

The second law is only really justifiable theoretically with quantum mechanics (the classical mechanics description is limited by the Planck constant and is more of an experimental fact).

But let's start with the classical view, using phase space (a plot of the position vs. the momentum of a particle or set of particles). You pick a point and that represents a particle and you trace it through phase space. Since they're deterministic equations in the classical view, you can trace them back to their origin with no problem, even chaotic systems.

Now, if we consider quantum mechanics, we suddenly have an issue when we trace the particle back to its origin on the phase plot. Namely, it could have come from any arbitrary point within a circle the size of Planck's constant (which is an area on the phase plot).

That is, due to indistinguishability and Heisenberg uncertainty, we have an inherent loss of information in the universe about the state of the particles, whose motion (characterized by position and momentum) is directly related to energy, and this loss of information is entropy.
 
  • #14


Pythagorean said:
One of my physics professors conveyed a definition of thermodynamics to me that has stuck. I believe it was originally presented by Susskind (who has lectures available online).

The second law is only really justifiable theoretically with quantum mechanics (the classical mechanics description is limited by the Planck constant and is more of an experimental fact).

But let's start with the classical view, using phase space (a plot of the position vs. the momentum of a particle or set of particles). You pick a point and that represents a particle and you trace it through phase space. Since they're deterministic equations in the classical view, you can trace them back to their origin with no problem, even chaotic systems.

Now, if we consider quantum mechanics, we suddenly have an issue when we trace the particle back to its origin on the phase plot. Namely, it could have come from any arbitrary point within a circle the size of Planck's constant (which is an area on the phase plot).

That is, due to indistinguishability and Heisenberg uncertainty, we have an inherent loss of information in the universe about the state of the particles, whose motion (characterized by position and momentum) is directly related to energy, and this loss of information is entropy.
I don't think that entropy is really that esoteric. It does not depend at all on quantum effects. When I break 15 balls on a pool table (one that has no friction, has perfectly elastic collisions between balls and cushions and no pockets), the energy of the cue ball will disperse into the other 15 and no matter how long I wait, the motion of the cue ball prior to impact will never be recovered.

Energy tends to disperse from more concentrated forms to less concentrated forms. That is the principle behind the second law of thermodynamics.

AM
 
  • #15


SW VandeCarr said:
If you took Avogadro's number([tex]N_{A}[/tex]) of fair coins and tossed them all at once, the overwhelming probability is that the results would be proportionately very close to 0.5 heads (the point of maximal entropy). However, there exists a finite probability that they all could be heads, specifically [tex](1/2)^{N_{A}}[/tex]. With much smaller numbers (or scales), the probability of larger deviations from the expected values (maximal entropy) increases.

I'm not sure how that pertains to the reference: a *small* system, with *short* time scales.
 
  • #16


Andrew Mason said:
I don't think that entropy is really that esoteric. It does not depend at all on quantum effects. When I break 15 balls on a pool table (one that has no friction, has perfectly elastic collisions between balls and cushions and no pockets), the energy of the cue ball will disperse into the other 15 and no matter how long I wait, the motion of the cue ball prior to impact will never be recovered.

Energy tends to disperse from more concentrated forms to less concentrated forms. That is the principle behind the second law of thermodynamics.

AM

The point, I think, is how you qualified "tendency". Classically, thermodynamics must be defined this way (statistically) because the equations are deterministic. We observe entropy classically, but there's no way to predict it theoretically without QM.

Susskind gives the classical description in his entropy lecture (available online) in which he uses chaos (fractalization of phase space) to reconcile this apparent inconsistency between entropy and determinism by partitioning the initial distribution. But this isn't valid if your partitions have an area less than Planck's constant.

It doesn't appear so esoteric to me, but I admittedly don't know the quantum formalism of entropy and have taken my interpretation of my professor's words on good faith. It makes sense to me qualitatively that the HUP would contribute to entropy.

addendum:

Perhaps you've heard the basis of the arguments before and already rejected them. I'm not sure; I know there have been several discussions motivated by information theory here on Physics Forums before:

http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html

The entropy S is a state function of a thermodynamic system, but it can't be directly measured like pressure and temperature. There is no entropy-meter; entropy must be inferred by varying the state of a system near equilibrium and observing how other thermodynamic variables (pressure, temperature, etc.) respond. This is one reason why the statistical mechanics interpretation of entropy is so important:

"[The] ability to make macroscopic predictions based on microscopic properties is the main asset of statistical mechanics over thermodynamics. Both theories are governed by the second law of thermodynamics through the medium of entropy. However, entropy in thermodynamics can only be known empirically, whereas in statistical mechanics, it is a function of the distribution of the system on its microstates." (from statistical mechanics)

It might seem like this statistical interpretation of matter can cause matter to be "influenced" by our knowledge, or lack of knowledge, of its microstates. What does information or knowledge about microstates have to do with how a steam engine works! But this train of thought is a result of a misperception of microscopic states in nature. Which microstate a particle system is in is irreducibly (inherently) uncertain, in the same sense that the position and momentum of individual particles are uncertain (Heisenberg's uncertainty principle). All we know about a steam engine is the possible microstates for any given macrostate, and we know nothing of the particular microstate.

The fact that entropy almost always increases or stays the same (the second law of thermodynamics) is a statistical statement about the uncertainty of a particle system's microstate.
 
  • #17


Pythagorean said:
The point, I think, is how you qualified "tendency". Classically, thermodynamics must be defined this way (statistically) because the equations are deterministic. We observe entropy classically, but there's no way to predict it theoretically without QM.
I am not sure what you mean by predicting entropy. We simply predict that the entropy of a closed system will always increase. A closed system tends to equilibrium. That is what is observed. That is the second law.

Susskind gives the classical description in his entropy lecture (available online) in which he uses chaos (fractalization of phase space) to reconcile this apparent inconsistency between entropy and determinism by partitioning the initial distribution. But this isn't valid if your partitions have an area less than Planck's constant.

It doesn't appear so esoteric to me, but I admittedly don't know the quantum formalism of entropy and have taken my interpretation of my professor's words on good faith. It makes sense to me qualitatively that the HUP would contribute to entropy.
I think it would be better if Susskind started at the beginning of the history of the concept of entropy. Instead he starts at the end.

I don't see an inconsistency between determinism and entropy at all. Entropy is a macroscopic concept that has a statistical explanation. The cue ball does not regain all its lost energy due to some quantum concept. It does not regain it because there are an infinite number of states that the balls can have and only one will result in the balls regaining their initial state.

At the quantum level, there is a different kind of statistics operating. Entropy at that level is conceptually different and perhaps should not be called entropy.

AM
 
  • #18


Pythagorean said:
http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html

The entropy S is a state function of a thermodynamic system, but it can't be directly measured like pressure and temperature. There is no entropy-meter; entropy must be inferred by varying the state of a system near equilibrium and observing how other thermodynamic variables (pressure, temperature, etc.) respond

I'm not sure I agree with this - calorimeters directly measure changes in the enthalpy and/or the Gibbs free energy, and these are related to changes in the entropy: ΔG = ΔH – TΔS.

http://www.microcal.com/technology/dsc.asp
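
For what it's worth, the arithmetic behind that kind of measurement is just an integral of Cp/T over the scanned temperature range. A minimal Python sketch, with invented heat-capacity data rather than anything from the linked instrument:

[code]
import numpy as np

# Hypothetical calorimetry data (made up for illustration): heat capacity Cp(T)
T = np.linspace(300.0, 350.0, 51)        # temperature in K
Cp = 75.3 + 0.01 * (T - 300.0)           # Cp in J/(mol K)

# For reversible heating at constant pressure, dS = (Cp / T) dT
delta_S = np.trapz(Cp / T, T)
print(f"Delta S (300 K -> 350 K) = {delta_S:.2f} J/(mol K)")
[/code]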
 
  • #19


Andrew Mason said:
I am not sure what you mean by predicting entropy. We simply predict that the entropy of a closed system will always increase. A closed system tends to equilibrium. That is what is observed. That is the second law.

What I mean, and now it's a question set, since I'm not sure anymore is:

Can you predict the second law without the observation (i.e. from other "first principles") in classical mechanics? What about in QM?

But now that you've said they're completely different things, then the idea that QM explains the mechanisms behind classical entropy is faulty? What's the classical mechanism, then? I've only seen it defined by observation (what systems tend to do)?

I think it would be better if Susskind started at the beginning of the history of the concept of entropy. Instead he starts at the end.

How do you mean? I'd be interested in a different pedagogical approach. I wasn't interested in thermodynamics when I went through my undergrad, I just did what I had to for good marks, and now I'm going back and watching lectures online with a renewed interest.

I don't see an inconsistency between determinism and entropy at all.

Perhaps it was purely pedagogical and there is only an intuition block between determinism and entropy (did you watch the lecture, by chance?). He discusses it from a phase space diagram (and it's for systems in which the energy is conserved, so the area of phase space stays constant, but still spreads out, becoming more porous).

Newtonian determinism (I thought) says the process is reversible, so there's nothing that disallows all the particles from condensing back to a volume in phase-space and if you wait long enough, they just might.

But if you fractalize the phase space as finely as you can, you actually measure a volume difference after the system has reached equilibrium (a volume difference on the phase map, meaning an energy gain/loss, which violates conservation). But you can fix this problem by, as I said, partitioning the initial volume (when it's in a low entropy state) so that none of the fractalized branches have volumes smaller than the Planck area (phase-map area).

Was that more coherent? If there's anything incorrect about what I've said here (in the last paragraph, the previous was more the interpretation) please let me know.

Andy Resnick said:
I'm not sure I agree with this - calorimeters directly measure changes in the enthalpy and/or the Gibbs free energy, and these are related to changes in the entropy: ΔG = ΔH – TΔS.

Well, I'm not sure either anymore. I hear different things about entropy from every professional I ask.

But there's a problem with the website's language that you quoted. It says "inferred". Could it be argued that your counterpoint is an "inference"? Possibly, but then so can the whole empirical method, so it's kind of difficult to understand what is meant without a rigorous definition of inference.

addendum

It's occurred to me that I might be harboring silent assumptions from the field of quantum chaos, too, which studies how chaos can arise from quantum systems to emerge in classical systems. I guess I've had a feeling that HUP is significant when looking at perturbations in a chaotic system, but please kill this belief now if it's faulty.
 
  • #20


Pythagorean said:
But there's a problem with the website's language that you quoted. It says "inferred". Could it be argued that your counterpoint is an "inference"? Possibly, but then so can the whole empirical method, so it's kind of difficult to understand what is meant without a rigorous definition of inference.

I'm not trying to be dense, but I didn't see the word 'inferred' on the page- can you be a little more specific?

In any case, do you perhaps mean something analogous to "measuring" a spring constant by hanging weights off a spring and measuring the change in length is, in fact, not directly measuring the spring constant?
 
  • #21


Andy Resnick said:
I'm not trying to be dense, but I didn't see the word 'inferred' on the page- can you be a little more specific?

In any case, do you perhaps mean something analogous to "measuring" a spring constant by hanging weights off a spring and measuring the change in length is, in fact, not directly measuring the spring constant?

It was in the quote you provided that you disagreed with.

As to your example, I think so, kind of, but it doesn't get to the heart of it. My argument would be:

The spring constant doesn't tell us anything physically meaningful (that we don't already know via dimensional analysis) about what it describes. If we really want to know what causes a spring constant to be what value it is, we want to look closer at it (to the micro level) so that we can explain the spring constant causally.

If you just want to explain Hooke's law, you can hand-wave what the spring constant means with dimensional analysis, but it's not rigorous if somebody started a thread asking "what's the 'real' meaning of the spring constant?" I'd be annoyed if someone just showed me the dimensional analysis and said, "Look! It shows you how badly it wants (force) to go back to equilibrium (distance)."

That's great as a tool for understanding a bigger system that consists of a spring (like a set of masses coupled through springs), but not all springs really work with a constant; we can construct all kinds of springs with time and space dependencies, based on the more general model provided by solid state physics, to get a better description of how springs really work, and then we realize that all macroscopic objects in the world actually have a "springy" quality.
 
  • #22


Pythagorean said:
It was in the quote you provided that you disagreed with.

Now I'm confused- that quote was originally from *your* post:

Pythagorean said:
The point, I think, is how you qualified "tendency". Classically, thermodynamics must be defined this way (statistically) because the equations are deterministic. We observe entropy classically, but there's no way to predict it theoretically without QM.

<snip>
addendum:

Perhaps you've heard the basis of the arguments before and already rejected them. I'm not sure; I know there have been several discussions motivated by information theory here on Physics Forums before:

http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html

My comment was simply that a calorimeter can measure the entropy. Kind of. Maybe.

Pythagorean said:
As to your example, I think so, kind of, but it doesn't get to the heart of it. My argument would be:

The spring constant doesn't tell us anything physically meaningful (that we don't already know via dimensional analysis) about what it describes. If we really want to know what causes a spring constant to be what value it is, we want to look closer at it (to the micro level) so that we can explain the spring constant causally.

If you just want to explain Hooke's law, you can hand-wave what the spring constant means with dimensional analysis, but it's not rigorous if somebody started a thread asking "what's the 'real' meaning of the spring constant?" I'd be annoyed if someone just showed me the dimensional analysis and said, "Look! It shows you how badly it wants (force) to go back to equilibrium (distance)."

That's great as a tool for understanding a bigger system that consists of a spring (like a set of masses coupled through springs), but not all springs really work with a constant; we can construct all kinds of springs with time and space dependencies, based on the more general model provided by solid state physics, to get a better description of how springs really work, and then we realize that all macroscopic objects in the world actually have a "springy" quality.

I'm not sure I follow you.
 
  • #23


Why did you think I was saying it was from your post? I was pointing out the loose language of the author of the site...

If you don't understand the gist of my last post, then perhaps your interests in entropy are different from mine. I want to know the mechanism of entropy; that is a "real" definition to me.

Which is why I liked Susskind's treatment, but now I'm being told by AM that Susskind was wrong (or at least that my interpretation of him is).

So who do I believe? Somebody on Physics Forums or a Stanford professor of quantum statistics?

So I'd at least like to hear a rebuttal to Susskind's reasoning before I invest any more motor skills.
 
  • #24


Phase space volume is always conserved, even if the Hamiltonian is dependent on time. This is a property of Hamilton's equations and is related to the fact that extended phase space is odd dimensional (an equal number of position variables and momentum variables, plus one time variable, for 2n+1). It's a bit esoteric, but basically in an odd-dimensional space there is always a special direction in which a two-form such as d(pdq-Hdt) vanishes, and it so happens that the direction in which that particular 2-form vanishes corresponds to the trajectories that are solutions of Hamilton's equations. So if you take a tube whose sides are along these special directions in phase space and whose ends are in different t=constant hyperplanes, then by Stokes' theorem:

[tex]\int pdq=\int PdQ [/tex]
[tex]\int \int dp \wedge dq= \int \int dP \wedge dQ [/tex]

so that the phase volume is the same at two different times.

Anyways that's a gorgeous proof that I learned from Arnold's Classical Mechanics book, but there are proofs that don't invoke differential geometry.
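
As a quick numerical companion to that statement, here is a small Python sketch (my own illustration: a harmonic oscillator integrated with a symplectic leapfrog step) showing that the area of a patch of initial conditions in phase space is unchanged by the evolution:

[code]
import numpy as np

def leapfrog(q, p, dt, steps):
    """Symplectic integration of dq/dt = p, dp/dt = -q (harmonic oscillator)."""
    for _ in range(steps):
        p = p - 0.5 * dt * q
        q = q + dt * p
        p = p - 0.5 * dt * q
    return q, p

def polygon_area(q, p):
    """Shoelace formula for the area enclosed by the closed curve (q_i, p_i)."""
    return 0.5 * abs(np.dot(q, np.roll(p, -1)) - np.dot(p, np.roll(q, -1)))

# boundary of a small circular patch of initial conditions in phase space
theta = np.linspace(0.0, 2.0 * np.pi, 400, endpoint=False)
q0 = 1.0 + 0.1 * np.cos(theta)
p0 = 0.1 * np.sin(theta)

qf, pf = leapfrog(q0, p0, dt=0.01, steps=5000)

print(f"area before: {polygon_area(q0, p0):.6f}")
print(f"area after : {polygon_area(qf, pf):.6f}")   # the same, to rounding error
[/code]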

So, phase volume is always conserved, even if the Hamiltonian depends on time. But as you mention things can get fractalized, where you get thin fingers of phase volume.

I believe you are right and this fractalization can be taken as a measure of entropy. Of course phase volume is preserved, but you basically get fingers that spread out so much that it looks like phase volume increased to fill the entire box.

Also note that the classical formula for the partition function:

[tex]\mathcal Z=\int \int dp dq e^{-H/kT}[/tex]

is not dimensionless. To make it dimensionless you have to divide by hbar.

This is called graining, and corresponds to Heisenberg's uncertainty principle that you can only know dpdq to hbar, so dividing phase volume by hbar counts the states according to quantum mechanics.

Of course such a constant to the partition function doesn't affect the thermodynamics.

So it could be the case that entropy is related to quantum mechanics, but one thing for sure is that even if you take the limit as Planck's constant goes to zero (the classical limit), then the thermodynamics of:

[tex]\mathcal Z=\frac{1}{\hbar} \int \int dp dq e^{-H/kT}[/tex]

is not affected since the hbar contributes nothing to the thermodynamics. So in that sense entropy doesn't depend on quantum mechanics.

addendum:

maybe introducing the partition function complicates everything.

Instead, just count the number of microstates:

[tex]\Omega=\int \int dp dq [/tex]

where the integral say is taken over the volume of the box for the integral over q, and over a hypersphere p^2/(2m) = E for the integral over p.

This integral does not have the proper dimensions, so you still have to divide by hbar, so that the number of microstates is:

[tex]\Omega=\frac{1}{\hbar} \int \int dp dq [/tex]

So hbar is needed to make the number of microstates dimensionless, and it has the interpretation that dp dq can only be known to within hbar, so you are counting the states when you take the phase space volume and divide by hbar.

But then of course the hbar doesn't affect the thermodynamics, as it only adds a constant to the entropy.
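
Just to make that last point explicit (writing [tex]\Phi=\int \int dp\, dq[/tex] for the bare phase-space volume, my own shorthand here), with [tex]\Omega=\Phi/\hbar[/tex] and [tex]S=k\ln\Omega[/tex], any process between macrostates 1 and 2 gives

[tex]\Delta S = k\ln\frac{\Phi_2/\hbar}{\Phi_1/\hbar} = k\ln\frac{\Phi_2}{\Phi_1},[/tex]

so the factor of hbar only shifts the entropy by the constant [tex]-k\ln\hbar[/tex] and drops out of every entropy difference, hence out of the thermodynamics.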
 
  • #25


RedX, thank you for your post!
 
  • #26


The formula for entropy, [tex]\sum (-p\ln p)[/tex], is not coordinate invariant if the variables are continuous, since p becomes a probability density with units, and only [tex]p\,dx[/tex] is unitless. So one could argue that things are really discrete and we always do the sum, and perhaps this involves quantum mechanics. There is no problem if classical reality is fundamentally discrete. An alternative to invoking discreteness is to restrict ourselves to canonical changes of variables, and say things always happen only in the phase space of Hamiltonian systems.
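
To see the non-invariance explicitly: under a change of variables [tex]y=f(x)[/tex] the density transforms as [tex]p_Y(y)=p_X(x)/|f'(x)|[/tex], so

[tex]-\int p_Y \ln p_Y \, dy = -\int p_X \ln p_X \, dx + \int p_X \ln|f'(x)| \, dx,[/tex]

and the extra Jacobian term vanishes only for volume-preserving transformations with [tex]|f'(x)|=1[/tex], such as canonical transformations in phase space.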

I think the place where everyone agrees quantum stuff helps is in justifying the Gibbs correction for identical particles, since one really doesn't have distinguishable trajectories.
 
  • #27


Redx, et al:

I still have a question though, you said:

if you take the limit as Planck's constant goes to zero (the classical limit), then the thermodynamics of:

(equation)

is not affected since the hbar contributes nothing to the thermodynamics.

But from the Susskind lecture, I thought that this violated energy conservation (i.e. your before and after snapshots don't have the same volume because you've fractalized the "after" snapshot into infinitesimally thin lengths, so now the volume is less).
 
  • #28


Pythagorean said:
But from the Susskind lecture, I thought that this violated energy conservation (i.e. your before and after snapshots don't have the same volume because you've fractalized the "after" snapshot into infinitesimally thin lengths, so now the volume is less).

The volume of phase space for a Hamiltonian system is constant, no matter how thinly drawn out it becomes. The apparent increase in volume (entropy) is due to your inability to measure with infinite precision (nothing to do with quantum mechanics, just lousy instruments), and so although the system's true volume has not increased, it appears to you that entropy has increased.
 
  • #29


Pythagorean said:
Redx, et al:

I still have a question though, you said:

But from the Susskind lecture, I thought that this violated energy conservation (i.e. your before and after snapshots don't have the same volume because you've fractalized the "after" snapshot into infinitesimally thin lengths, so now the volume is less).

In classical physics, phase space volume is preserved.

It is true that the fingers will be thin. But they also get longer. So the volume is preserved. This is what classical physics tells us, at least. Now if you want to invoke quantum mechanics, then as you say, this might not be right and the volumes won't be equal.

My guess is that Susskind is saying what you're saying, that although the volumes are the same before and after, the fingers are so spread out and since you can only know phase space volumes to hbar, for all practical purposes the phase space volume has increased and filled the entire box. Quite simply your resolution is not capable of saying that there are holes or spaces between your fingers. According to classical mechanics, the volumes are the same before and after. But if you invoke quantum then because of the hbar limitation, you can't observe the empty space in between fingers.
 
  • #30


Ok, thanks for the clarification.
 
  • #31


I guess this is the problem with watching lectures and not doing the homework. Thanks to physicsforums for discussion.
 
  • #32


Pythagorean said:
So who do I believe? Somebody on Physics Forums or a Stanford professor of quantum statistics?
.

Believe who you want- science cares not a whit about credentials.
 
  • #33


Pythagorean said:
One of my physics professors conveyed a definition of thermodynamics to me that has stuck. I believe it was originally presented by Susskind (who has lectures available online).

The second law is only really justifiable theoretically with quantum mechanics (the classical mechanics description is limited by the Planck constant and is more of an experimental fact).

But let's start with the classical view, using phase space (a plot of the position vs. the momentum of a particle or set of particles). You pick a point and that represents a particle and you trace it through phase space. Since they're deterministic equations in the classical view, you can trace them back to their origin with no problem, even chaotic systems.

Now, if we consider quantum mechanics, we suddenly have an issue when we trace the particle back to its origin on the phase plot. Namely, it could have come from any arbitrary point within a circle the size of Planck's constant (which is an area on the phase plot).

That is, due to indistinguishability and Heisenberg uncertainty, we have an inherent loss of information in the universe about the state of the particles, whose motion (characterized by position and momentum) is directly related to energy, and this loss of information is entropy.

(i) Does classical statistical mechanics require QM for its justification - no. We can stick to canonical variables in phase space. The conservation of phase space volume is not at odds with the apparent increase in volume that we call an increase in entropy - see eg. http://www.necsi.edu/projects/baranger/cce.pdf

(ii) Does classical statistical mechanics of identical particles require QM for its justification -yes. Because classically, identical particles have distinct trajectories, and so cannot be really identical.

Your professor was probably referring to the second idea, not the first.
 
  • #34


Pythagorean said:
Which is why I liked Susskind's treatment, but now I'm being told by AM that Susskind was wrong (or at least that my interpretation of him is).
When did I say or even suggest that Susskind was wrong? I simply suggested that as a pedagogical matter the concept of entropy would be easier to understand if he started at the beginning rather than at the end of the history of that concept. His lecture, after all, is supposed to be an introduction to the second law of thermodynamics.

I don't pretend to understand what your interpretation of entropy is or what your interpretation of Susskind is.

So who do I believe? Somebody on Physics Forums or a Stanford professor of quantum statistics?
.
Science is not about "belief". It is about understanding so as to be able to describe and predict the behaviour of the physical world. So you should use the resources that best help you acquire that understanding. I would recommend Feynman's lectures on Physics Vol 1, Ch. 39-46.

AM
 
  • #35


A subdivision of theoretical physics is in order.

One sort involves objectification.

The other is subjectification. Thermodynamics falls under this category, and is about what information can be obtained from incomplete knowledge. It is concerned with the reduction of ignorance upon the acquisition of information, its converse, and its evolution with time.
 
