What is the "real" second law of thermodynamics?

moonman239 said: Title says it all.
RedX said: However, just to give you some food for thought, is it possible for the entropy of a closed system to decrease?
...
So there is a 100% certainty that the gas will be in only half the box again after some sufficiently long time.

This is why I personally think the second law is best phrased not as a forcing law, but as an expectation and a rational constraint on decisions.
RedX said: But maybe if we wait a while longer, the gas will return to just occupying half the box?
RedX said: However, just to give you some food for thought, is it possible for the entropy of a closed system to decrease?
Andy Resnick said: According to the fluctuation theorem, it regularly does:
http://prl.aps.org/abstract/PRL/v89/i5/e050601
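For a feel of how such momentary decreases scale with system size, here is a toy model of my own (not the experiment in the linked paper): each of N non-interacting particles is equally likely to be found in either half of a box, and we count how often the left-half fraction strays far from 1/2. Macroscopic dips are common for small N and essentially never seen for large N.

[code]
# Toy model (my own, not from the thread or the paper): N non-interacting
# particles are each found in the left half of a box with probability 1/2.
# A "macroscopic" entropy dip means the left-half fraction strays far from
# 1/2. Such dips are common for small N and vanish for large N; that is
# the statistical content of the second law.
import random

def fraction_of_big_dips(n_particles, steps=10000, threshold=0.1, seed=1):
    rng = random.Random(seed)
    dips = 0
    for _ in range(steps):
        n_left = sum(rng.random() < 0.5 for _ in range(n_particles))
        if abs(n_left / n_particles - 0.5) > threshold:
            dips += 1
    return dips / steps

for n in (10, 100, 1000):
    print(n, fraction_of_big_dips(n))
# Roughly: dips in ~35% of snapshots for N=10, ~4% for N=100,
# essentially 0 for N=1000.
[/code]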
SW VandeCarr said: If you took Avogadro's number ([tex]N_{A}[/tex]) of fair coins and tossed them all at once, the overwhelming probability is that the results would be proportionately very close to 0.5 heads (the point of maximal entropy). However, there exists a finite probability that they all could be heads, specifically [tex](1/2)^{N_{A}}[/tex]. With much smaller numbers (or scales), the probability of larger deviations from the expected values (maximal entropy) increases.

You can say that there is a finite probability. But the chances are still infinitesimally small that it has ever occurred anywhere in the history of the universe or that it ever will. It is comparable to the quantum-mechanical probability that an apple will fail to obey Newton's law of gravity. So we call it a law.
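For scale, the all-heads probability is far too small for floating-point arithmetic, but its logarithm is easy to evaluate (a quick computation of my own):

[code]
# Quick computation (my own): the probability that all of Avogadro's
# number of fair coins land heads is (1/2)**N_A. That underflows any
# float, so we work with its base-10 logarithm instead.
import math

N_A = 6.022e23
log10_p = N_A * math.log10(0.5)
print(f"P(all heads) = 10^({log10_p:.3e})")
# ~ 10^(-1.8e23): a 1 preceded by about 1.8e23 zeros after the decimal point.

# For comparison, with only 100 coins the same probability is already tiny
# but representable:
print(0.5 ** 100)   # ~ 7.9e-31
[/code]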
Pythagorean said: One of my physics professors conveyed a definition of thermodynamics to me that has stuck. I believe it was originally presented by Susskind (who has lectures available online).

The second law is only really justifiable theoretically with quantum mechanics (the classical-mechanics description is limited by the Planck constant and is more of an experimental fact).

But let's start with the classical view, using phase space (a plot of the position vs. the momentum of a particle or set of particles). You pick a point, which represents a particle, and you trace it through phase space. Since the equations are deterministic in the classical view, you can trace them back to their origin with no problem, even for chaotic systems.

Now, if we consider quantum mechanics, we suddenly have an issue when we trace the particle back to its origin on the phase plot. Namely, it could have come from any arbitrary point within a circle whose area is Planck's constant (an area on the phase plot).

That is, due to indistinguishability and Heisenberg uncertainty, we have an inherent loss of information in the universe about the state of the particles, whose motion (characterized by position and momentum) is directly related to energy, and this loss of information is entropy.

I don't think that entropy is really that esoteric. It does not depend at all on quantum effects. When I break 15 balls on a pool table (one that has no friction, has perfectly elastic collisions between balls and cushions, and has no pockets), the energy of the cue ball will disperse into the other 15, and no matter how long I wait, the motion of the cue ball prior to impact will never be recovered.

Energy tends to disperse from more concentrated forms to less concentrated forms. That is the principle behind the second law of thermodynamics.

AM
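Both halves of that exchange lean on classical reversibility, which is easy to demonstrate numerically. Here is a minimal sketch of my own, with one particle in a quartic potential standing in for the frictionless pool table: integrate forward with a time-symmetric scheme, flip the momentum, integrate again, and the initial state comes back to within round-off.

[code]
# Sketch (my own, idealized): classical dynamics is time-reversible.
# Integrate with velocity Verlet (itself time-symmetric), reverse the
# momentum, integrate the same number of steps, and we land back on the
# initial condition. No information is lost, which is why classical
# entropy growth needs a statistical (coarse-grained) reading.

def force(x):
    return -x**3          # particle in the potential V(x) = x**4 / 4

def verlet(x, p, steps, dt=1e-3, m=1.0):
    f = force(x)
    for _ in range(steps):
        p += 0.5 * dt * f
        x += dt * p / m
        f = force(x)
        p += 0.5 * dt * f
    return x, p

x0, p0 = 1.0, 0.3
x1, p1 = verlet(x0, p0, steps=100000)   # run forward
x2, p2 = verlet(x1, -p1, steps=100000)  # flip momentum, run again
print(x2, -p2)  # recovers (x0, p0) up to floating-point round-off
[/code]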
The entropy S is a state function of a thermodynamic system, but it can't be directly measured like pressure and temperature. There is no entropy-meter; entropy must be inferred by varying the state of a system near equilibrium and observing how other thermodynamic variables (pressure, temperature, etc.) respond. This is one reason why the statistical mechanics interpretation of entropy is so important:
"[The] ability to make macroscopic predictions based on microscopic properties is the main asset of statistical mechanics over thermodynamics. Both theories are governed by the second law of thermodynamics through the medium of entropy. However, entropy in thermodynamics can only be known empirically, whereas in statistical mechanics, it is a function of the distribution of the system on its microstates." (from statistical mechanics)
It might seem like this statistical interpretation of matter can cause matter to be "influenced" by our knowledge, or lack of knowledge, of its microstates. What does information or knowledge about microstates have to do with how a steam engine works? But this train of thought results from a misperception of microscopic states in nature. Which microstate a particle system is in is irreducibly (inherently) uncertain, in the same sense that the position and momentum of individual particles are uncertain (Heisenberg's uncertainty principle). All we know about a steam engine is the set of possible microstates for any given macrostate; we know nothing of the particular microstate.
The fact that entropy almost always increases or stays the same (the second law of thermodynamics) is a statistical statement about the uncertainty of a particle system's microstate.
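As a small illustration of that last point (my own example, in natural units with [tex]k_{B} = 1[/tex]): the statistical entropy [tex]S = -k_{B}\sum_i p_i \ln p_i[/tex] is a property of the probability distribution over microstates, and it is largest exactly when we know least about which microstate is occupied.

[code]
# Illustration (my own): entropy as a function of the distribution over
# microstates, S = -k_B * sum(p_i * ln p_i). Sharper knowledge of the
# microstate means lower entropy; the uniform distribution maximizes it.
import math

def gibbs_entropy(probs, k_B=1.0):  # k_B = 1 in natural units
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

n = 8  # number of microstates
uniform = [1 / n] * n                 # know nothing: any of the 8 is equally likely
peaked  = [0.93] + [0.01] * (n - 1)   # almost sure which microstate it is

print(gibbs_entropy(uniform))  # ln 8 ~ 2.079, the maximum
print(gibbs_entropy(peaked))   # ~ 0.39, much smaller
[/code]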
Pythagorean said: The point, I think, is how you qualified "tendency". Classically, thermodynamics must be defined this way (statistically) because the equations are deterministic. We observe entropy classically, but there's no way to predict it theoretically without QM.

I am not sure what you mean by predicting entropy. We simply predict that the entropy of a closed system will always increase. A closed system tends to equilibrium. That is what is observed. That is the second law.

Pythagorean said: Susskind gives the classical description in his entropy lecture (available online) in which he uses chaos (the fractalization of phase space) to reconcile this inconsistency between entropy and determinism by partitioning the initial distribution. But this isn't valid if your partitions have an area less than Planck's constant. It doesn't appear so esoteric to me, but I admittedly don't know the quantum formalism of entropy and have taken my interpretation of my professor's words on good faith. It makes sense to me qualitatively that the HUP would contribute to entropy.

I think it would be better if Susskind started at the beginning of the history of the concept of entropy. Instead he starts at the end. I don't see an inconsistency between determinism and entropy at all.
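The quoted coarse-graining picture can be made concrete with a toy model (my construction, not Susskind's actual demonstration; the chaotic Arnold cat map on the unit torus stands in for the real dynamics): a tight blob of phase-space points gets stretched and folded, and the entropy measured on a fixed grid partition rises until it saturates at the logarithm of the number of cells.

[code]
# Toy version (my construction) of the coarse-graining argument: a tight
# blob of phase-space points is stretched and folded by the chaotic Arnold
# cat map, and the entropy of a fixed grid partition rises until the blob
# is spread over all cells. Partitions finer than Planck's constant would
# be physically meaningless, which is the quantum caveat quoted above.
import math
import random
from collections import Counter

def cat_map(x, y):
    # Arnold's cat map on the unit torus: area-preserving and chaotic
    return (x + y) % 1.0, (x + 2 * y) % 1.0

def coarse_entropy(points, cells=20):
    counts = Counter((int(x * cells), int(y * cells)) for x, y in points)
    n = len(points)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

rng = random.Random(0)
pts = [(0.5 + 0.01 * rng.random(), 0.5 + 0.01 * rng.random()) for _ in range(5000)]

for step in range(10):
    print(step, round(coarse_entropy(pts), 3))
    pts = [cat_map(x, y) for x, y in pts]
# The entropy starts near 0 and saturates near ln(20*20) ~ 5.99.
[/code]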
Pythagorean said: http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html

The entropy S is a state function of a thermodynamic system, but it can't be directly measured like pressure and temperature. There is no entropy-meter; entropy must be inferred by varying the state of a system near equilibrium and observing how other thermodynamic variables (pressure, temperature, etc.) respond.
Andy Resnick said: I'm not sure I agree with this: calorimeters directly measure changes in the enthalpy and/or the Gibbs free energy, and these are related to changes in the entropy: ΔG = ΔH − TΔS.
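Spelled out with roughly the textbook values for forming liquid water (my arithmetic, purely illustrative): at constant temperature, ΔG = ΔH − TΔS rearranges to ΔS = (ΔH − ΔG)/T, so a calorimetric ΔH plus a measured ΔG determines ΔS.

[code]
# Illustrative arithmetic (my own; values are roughly the textbook ones
# for forming liquid water): at constant T, dG = dH - T*dS, so
# dS = (dH - dG) / T.
dH = -285.8e3   # J/mol, enthalpy of formation
dG = -237.1e3   # J/mol, Gibbs free energy of formation
T = 298.15      # K
dS = (dH - dG) / T
print(f"dS = {dS:.1f} J/(mol K)")   # ~ -163.3 J/(mol K)
[/code]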
Pythagorean said: But there's a problem with the website's language that you quoted. It says "inferred". Could it be argued that your counterpoint is an "inference"? Possibly, but then so could the whole empirical method, so it's kind of difficult to understand what is meant without a rigorous definition of inference.
Andy Resnick said: I'm not trying to be dense, but I didn't see the word "inferred" on the page; can you be a little more specific?

In any case, do you perhaps mean something analogous to this: that "measuring" a spring constant by hanging weights off a spring and measuring the change in length is, in fact, not directly measuring the spring constant?
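That analogy, spelled out with invented numbers (a sketch of my own): the spring constant is never read off a dial; it is the fitted slope of force against extension.

[code]
# Sketch of the spring analogy (my own, with invented data): we never read
# k off a dial; we infer it as the slope of force vs. extension.
masses = [0.1, 0.2, 0.3, 0.4, 0.5]                # kg, hung from the spring
extensions = [0.049, 0.101, 0.148, 0.202, 0.251]  # m, measured (made up)
g = 9.81

# least-squares slope through the origin: k = sum(F*x) / sum(x*x)
forces = [m * g for m in masses]
k = sum(f * x for f, x in zip(forces, extensions)) / sum(x * x for x in extensions)
print(f"k = {k:.1f} N/m")   # ~ 19.6 N/m for these numbers
[/code]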
Pythagorean said: It was in the quote that you provided and disagreed with.
Pythagorean said: The point, I think, is how you qualified "tendency". Classically, thermodynamics must be defined this way (statistically) because the equations are deterministic. We observe entropy classically, but there's no way to predict it theoretically without QM.
<snip>
addendum:
Perhaps you've heard the basis of the arguments before and already rejected them; I'm not sure. I know there have been several discussions motivated by information theory here on Physics Forums before:
http://lcni.uoregon.edu/~mark/Stat_mech/thermodynamic_entropy_and_information.html
Pythagorean said: As to your example, I think so, kind of, but it doesn't get to the heart of it. My argument would be:

The spring constant doesn't tell us anything physically meaningful (that we don't already know via dimensional analysis) about what it describes. If we really want to know what causes a spring constant to have the value it has, we want to look closer at it (to the micro level) so that we can explain the spring constant causally.

If you just want to explain Hooke's law, you can hand-wave what the spring constant means with dimensional analysis, but that's not rigorous if somebody starts a thread asking "What's the 'real' meaning of the spring constant?" I'd be annoyed if someone just showed me the dimensional analysis and said, "Look! It shows you how badly it wants (force) to go back to equilibrium (distance)."

That's great as a tool for understanding a bigger system that contains a spring (even a set of masses coupled through springs), but not all springs really work with a constant. We can construct all kinds of springs with time and space dependencies, based on the more general model provided by solid-state physics, to get a better description of how springs really work; then we realize that all macroscopic objects in the world actually have a "springy" quality.
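In the same micro-level spirit (a sketch of my own, using a Lennard-Jones pair potential as a stand-in for a real solid-state calculation): near its minimum, any smooth bond potential is approximately harmonic, and the effective spring constant is just the curvature there.

[code]
# Micro-level sketch (my own; a standard Lennard-Jones potential stands in
# for a real solid-state calculation): near the minimum, every smooth bond
# potential is harmonic, and the "spring constant" is simply the curvature
# k = V''(r_min). The macroscopic constant emerges from microscopic structure.

def lennard_jones(r, eps=1.0, sigma=1.0):
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

r_min = 2 ** (1 / 6)   # location of the LJ minimum (sigma = 1)
h = 1e-5
# second derivative by central finite difference
k = (lennard_jones(r_min + h) - 2 * lennard_jones(r_min)
     + lennard_jones(r_min - h)) / h**2
print(f"k = V''(r_min) = {k:.2f} eps/sigma^2")   # ~ 57.15
[/code]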
Pythagorean said: Redx, et al:

I still have a question though, you said:

But from the Susskind lecture, I thought that this violated energy conservation (i.e. your before and after snapshots don't have the same volume, because you've fractalized the "after" snapshot into infinitesimally thin lengths, so now the volume is less).

Pythagorean said: But from the Susskind lecture, I thought that this violated energy conservation (i.e. your before and after snapshots don't have the same volume, because you've fractalized the "after" snapshot into infinitesimally thin lengths, so now the volume is less).

If you take the limit as Planck's constant goes to zero (the classical limit), then the thermodynamics of

[tex]Z = \frac{1}{N!\,h^{3N}} \int e^{-H(p,q)/k_{B}T} \, d^{3N}p \, d^{3N}q[/tex]

is not affected, since h contributes only an additive constant to the free energy and hence nothing to the thermodynamics.
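This h-independence can be checked symbolically (a sketch of my own in sympy, using the standard classical ideal-gas free energy that follows from the partition function above): Planck's constant enters F only through an additive term, so a measurable quantity like the pressure comes out with no h in it.

[code]
# Symbolic check (my own sketch; standard classical ideal-gas free energy):
# h enters the Helmholtz free energy only additively, so derivatives of F,
# here the pressure, contain no h at all.
import sympy as sp

N, V, T, m, h, kB = sp.symbols('N V T m h k_B', positive=True)
lam = h / sp.sqrt(2 * sp.pi * m * kB * T)          # thermal de Broglie wavelength
F = -N * kB * T * (sp.log(V / (N * lam**3)) + 1)   # from Z above, with Stirling's approximation
P = sp.simplify(-sp.diff(F, V))

print(P)         # N*k_B*T/V
print(P.has(h))  # False: Planck's constant has dropped out
[/code]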
Pythagorean said: So who do I believe? Somebody on Physics Forums or a Stanford professor of quantum statistics?
Pythagorean said: Which is why I liked Susskind's treatment, but now I'm being told Susskind was wrong by AM (or at least my interpretation of it).

When did I say or even suggest that Susskind was wrong? I simply suggested that, as a pedagogical matter, the concept of entropy would be easier to understand if he started at the beginning rather than at the end of the history of that concept. His lecture, after all, is supposed to be an introduction to the second law of thermodynamics.
Pythagorean said: So who do I believe? Somebody on Physics Forums or a Stanford professor of quantum statistics?

Science is not about "belief". It is about understanding so as to be able to describe and predict the behaviour of the physical world. So you should use the resources that best help you acquire that understanding. I would recommend Feynman's Lectures on Physics, Vol. 1, Ch. 39-46.