Is there an equation to prove the Zeroth law of thermodynamics?

1. Feb 29, 2012

Ralphonsicus

Is there a formula/equation that proves the zeroth law, one that can be confirmed by experiments?

And another quick question. If I mix together three bodies of water of the same volume, say at 10 degrees Celsius, 5 degrees Celsius and 15 degrees Celsius, is there a way of working out what the final temperature of the mix will be?

Thanks.

2. Feb 29, 2012

anigeo

The zeroth law is a practical observation that is fundamental to many processes. Many phenomena can be explained by pre-assuming the zeroth law. Just as all gravitational phenomena can be explained if we assume the gravitational force is directly proportional to the masses of the bodies and inversely proportional to the square of the distance between them, so it is with the zeroth law.
For the second question, I would suggest you first apply calorimetry to two of the volumes of liquid, and then apply calorimetry again with the resultant and the third volume of water.

3. Feb 29, 2012

Staff: Mentor

Let Q1, Q2 and Q3 be the heat gained (positive) or lost (negative) by each volume of water. Assume no heat is gained or lost due to outside sources (i.e. the system is insulated from the outside). Then

Q1 + Q2 + Q3 = 0

where

Q1 = m1c1ΔT1

and similarly for Q2 and Q3. The m's are the masses of each volume of water (the same for each, in your problem), the c's are the specific heats (again the same, 1 cal/(g-°C) for each), and the ΔT's are the changes in temperature: ΔT1 = T1,final - T1,initial, etc. You know the initial temperatures. The final temperature is the same for each, and unknown. Set up the equation and solve for Tfinal.
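The recipe above can be sketched in a few lines of Python (a toy illustration of my own; the function name and structure are not from the thread). With the heat-balance equation Q1 + Q2 + Q3 = 0 and Qi = mi·ci·ΔTi, the final temperature works out to a weighted average of the initial temperatures:

```python
def final_temperature(masses, specific_heats, temps):
    """Solve sum_i m_i c_i (T_final - T_i) = 0 for T_final,
    assuming the system is insulated from its surroundings."""
    numer = sum(m * c * t for m, c, t in zip(masses, specific_heats, temps))
    denom = sum(m * c for m, c in zip(masses, specific_heats))
    return numer / denom

# Three equal volumes of water (equal masses, c = 1 cal/(g.degC)):
print(final_temperature([1, 1, 1], [1, 1, 1], [10, 5, 15]))  # -> 10.0
```

With equal masses and equal specific heats, this is just the arithmetic mean of the starting temperatures, which is why the two-step calorimetry suggested earlier gives the same answer as doing all three at once.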

4. Feb 29, 2012

lugita15

I've wondered about the proof of the zeroth law as well. Here's a question I asked a couple of years ago and never got an answer to.

5. Mar 1, 2012

JDStupi

I mean, operationally it is a pretty sound observation/principle of thermal equilibrium. If you want an equation, you can use the general definition of temperature, namely $T = (\partial S/\partial U)^{-1}$. From this it is fairly easy to show.

You can say that $n$ systems are in thermal equilibrium if, and only if,

$$\frac{\partial S_1}{\partial U_1} = \frac{\partial S_2}{\partial U_2} = \frac{\partial S_3}{\partial U_3} = \cdots = \frac{\partial S_n}{\partial U_n},$$

where each partial derivative is that of system $i$'s entropy with respect to its own internal energy.

From this it is easy to see that if a system A is in thermal equilibrium with a system B, and system B is in thermal equilibrium with a system C, then $\partial S_A/\partial U_A = \partial S_B/\partial U_B$ and $\partial S_B/\partial U_B = \partial S_C/\partial U_C$.

Then, by the transitivity of equality, we can see that systems A and C are in thermal equilibrium.

*  I don't know that this constitutes a complete proof, as I am sure there are some technical subtleties of which I am unaware. However, it basically originates by considering the operational definition of temperature and the empirical observation of the phenomenon of equilibrium: that point at which no further macroscopic changes are observed.

The equilibrium concept itself is not entirely physical, that is to say it is related to our purposes, for a system defined to be in thermal equilibrium may be so for some period of time, yet the equilibrium concept/approximation may fail to hold over a longer amount of time. However, using the statistical definition of temperature stated before, and doing some considerations about equilibrium, you can see that the necessary condition for equilibrium is the equalization of the rates of change of the entropies with respect to the internal energies of the systems, which is to say the equality of T (of 1/T, but if those quantities are equal then their inverses are as well). This defines the equilibrium conditions and agrees perfectly with our intuitive characterization of equilibrium. Generalizing to n systems and using transitivity yields the "proof".

Last edited: Mar 1, 2012
6. Mar 1, 2012

Ken G

Yes, I agree: the zeroth law is easier if one takes T in its fundamental meaning, as you say. Perhaps Feynman is referring to other, more practical meanings of T, like average energy per particle or some such thing (though there is no obvious connection there; it has to be derived in every case).

7. Mar 1, 2012

lugita15

Yes, what Feynman is doing is assuming the definition of thermal equilibrium as the state two gases get into if they stay in thermal contact long enough. So under that definition, the zeroth law of thermodynamics becomes a nontrivial statement in statistical mechanics that has to be proved, and then you use this result to prove that thermal equilibrium corresponds to sameness of temperature, i.e. average kinetic energy of molecules. But Feynman omits the proof of the zeroth law because it's too advanced. Does anyone know this proof or where I can find it?

8. Mar 1, 2012

JDStupi

lugita, be wary of characterizing temperature as "the average kinetic energy of molecules". I know you said "for two ideal gases", and this is decent for ideal gases and for many physical systems, but not all. You may be aware of this already and I may just be telling you what you already know, but I figured I would say it in case other people did not. Temperature is more accurately described as the tendency of an object to spontaneously exchange energy with its surroundings. For example, the ideal two-state paramagnet in an external magnetic field (or any system with a bounded energy) has some unintuitive temperatures: because of the entropic definition of temperature, adding energy near the top of the energy range decreases the entropy, so the temperature there comes out negative.
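The paramagnet example can be checked numerically. The sketch below is my own illustration (units of the level spacing, $k_B = 1$): it computes $S = \ln\binom{N}{n}$ for $N$ two-state spins with $n$ excited, then estimates $T = (dS/dU)^{-1}$ by a finite difference. The temperature is positive at low energy and negative at high energy, where entropy falls as energy is added:

```python
from math import lgamma

def log_multiplicity(N, n):
    """ln of the binomial coefficient C(N, n): the number of microstates
    of N two-state spins with n of them in the excited level."""
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

def temperature(N, n):
    """Finite-difference estimate of T = (dS/dU)^(-1), with k_B = 1
    and internal energy U = n in units of the level spacing."""
    dS = log_multiplicity(N, n + 1) - log_multiplicity(N, n - 1)
    return 2.0 / dS  # dU = 2 between n-1 and n+1

N = 100
print(temperature(N, 10))  # low energy: entropy rising with U, so T > 0
print(temperature(N, 90))  # high energy: entropy falling with U, so T < 0
```

By the symmetry of the binomial coefficient, the states at n and N - n have equal and opposite temperatures, which is exactly the kind of behavior the "average kinetic energy" picture cannot accommodate.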

9. Mar 1, 2012

Rap

The zeroth law is an expression of experience, it cannot be "proved".

The zeroth law is called the zeroth law because it does not introduce any of the concepts of the first and second laws. The laws are an axiomatic system. The zeroth law takes as axiomatic the concept of equilibrium. (Wait a long time, things stop changing; that's equilibrium.) It is not stated in terms of temperature; it lays the foundation for the definition of temperature, which does not occur until the second law. (The first law, properly expressed, will therefore make no mention of temperature either.)

The zeroth law allows you to divide the set of all thermodynamic systems into a bunch of "disjoint subsets". Every system is a member of only one subset. Every system that is a member of a given subset is in equilibrium with every other member of that subset, and out of equilibrium with every system that is not. Now you can assign a unique identifier to each subset and say each system in the subset possesses that identifier. Then two systems are in equilibrium if their identifiers are the same, and out of equilibrium if they are not.

And that's about it. It says nothing about the nature of the identifiers; they do not have to have any structure. Ultimately, using the first and second laws, the identifiers that are chosen are the temperatures, but all of the characteristics that we associate with temperature (other than its sameness for systems in equilibrium) are developed later. Is system A hotter than system B? The zeroth law has nothing to say about that. In other words, there is no order relationship on the identifiers supplied by the zeroth law. No concept of continuity either, no concept of "closeness", no metric. All of that follows from developments of the first and second laws.
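This partition picture can be illustrated with a toy program (entirely my own construction, not anything from the thread): take a list of observed "in equilibrium" pairs, close the relation under transitivity as the zeroth law licenses, and hand out one arbitrary identifier per resulting class. Nothing about the identifiers is ordered or continuous:

```python
def equilibrium_classes(systems, equilibrium_pairs):
    """Partition systems into disjoint subsets: equilibrium is reflexive
    and symmetric by observation, and transitive by the zeroth law."""
    parent = {s: s for s in systems}

    def find(s):                    # follow links to the class representative
        while parent[s] != s:
            s = parent[s]
        return s

    for a, b in equilibrium_pairs:  # merge the classes of each observed pair
        parent[find(a)] = find(b)

    return {s: find(s) for s in systems}

labels = equilibrium_classes(["A", "B", "C", "D"], [("A", "B"), ("B", "C")])
print(labels["A"] == labels["C"])  # True: A ~ B and B ~ C force A ~ C
print(labels["A"] == labels["D"])  # False: D sits in its own subset
```

The representative returned by `find` plays the role of the "unique identifier": sameness of identifier is all it encodes, with no notion of hotter, closer, or in between.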

You cannot define temperature by using the zeroth, first and second laws, and then use that definition to prove the zeroth. You cannot "prove" the zeroth law using any concepts developed by the first and second, and that includes temperature and entropy.

Ok, there are arguments that say that this hierarchy of axioms is not the best, that you don't really need the zeroth law, that it's just a convenience which will fall out of the first and second laws when they are very carefully expressed. Maybe so; I don't know the details at this point.

Last edited: Mar 1, 2012
10. Mar 1, 2012

Ken G

That's a good point: if we create the usual structure, we regard entropy as a more advanced concept than anything that appears in the zeroth law. The zeroth law only refers to heat exchange, and asserts that the absence of spontaneous heat exchange is a transitive relationship. By mentioning temperature, Feynman got us thinking in terms of the second law, but if one sticks to the structure you have in mind, we should note, as you say, that the concept of temperature is made possible by the zeroth law. So we should be asking, what makes the concept of temperature possible?

Personally, I think all of thermodynamics starts with entropy, so the zeroth law should be about entropy, so should be something like "what happens spontaneously is what increases entropy," which we normally think of as the second law.

The next crucial concept is heat, so the first law should be something like the transitive relationship between systems that won't spontaneously exchange heat, which is what we now think of as the zeroth law. This law would stem from the fact that if sending heat from system 1 to 3 will increase entropy, then sending heat from 1 to 2 and from 2 to 3 should also increase entropy, so should not be disallowed even if system 3 is only connected to system 1 via system 2 (which is pretty much the proof that JDStupi referred to using T, but T is not needed for that proof).

Then we need to bring in the other types of energy, in the second law (which was the first). We never need the previous second law, because it stems from the zeroth law, and our definition of T need not be viewed as a law, just a handy way to define the key parameter that describes these equilibrium classes. So I am a proponent of inserting the second law ahead of the zeroth law.
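The heat-routing step can be written out without ever introducing T (my own transcription of the argument sketched above, using the entropy derivatives from JDStupi's post). Moving heat $\delta Q$ from system $i$ to system $j$ changes the total entropy by the difference of the entropy derivatives, and routing through an intermediate system telescopes:

```latex
% Direct transfer 1 -> 3:
\Delta S_{1\to 3}
  = \delta Q\left(\frac{\partial S_3}{\partial U_3}
                - \frac{\partial S_1}{\partial U_1}\right)

% Routed transfer 1 -> 2 -> 3:
\Delta S_{1\to 2} + \Delta S_{2\to 3}
  = \delta Q\left(\frac{\partial S_2}{\partial U_2} - \frac{\partial S_1}{\partial U_1}\right)
  + \delta Q\left(\frac{\partial S_3}{\partial U_3} - \frac{\partial S_2}{\partial U_2}\right)
  = \Delta S_{1\to 3}
```

So the routed transfer increases entropy exactly when the direct one does, and "no spontaneous exchange allowed" (equilibrium) comes out transitive without T ever being named.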

11. Mar 1, 2012

Rap

I like to keep a clear distinction between classical thermodynamics and statistical mechanics. Classical thermodynamics is just an expression of relationships between various measured thermodynamic parameters, with no reference to an atomic theory of matter or anything, and no attempt, beyond the laws, to "explain" anything. Then I view statistical mechanics as an explanation of classical thermodynamics (and more). I view the laws of thermodynamics as at least an attempt to build an axiomatic theory of classical thermodynamics, and if that is the case, then entropy is the last thing to be defined. But it's true, the concept of entropy lies at the core of the statistical mechanical understanding, and everything then flows from that. If you begin by assuming the atomic theory of matter, and all the non-statistical physics that it entails, then the axiomatic development would probably be more or less the reverse of the classical thermodynamic development.

12. Mar 1, 2012

Ken G

Right, but the role of entropy can still be quite fundamental, as fundamental as the concept of "heat." Note also that entropy was used, in effect (via the Boltzmann factor), to calculate things like the low-frequency behavior of thermal radiation fields, on general grounds that did not reference any atoms present in the "blackbody." Indeed, given quantum mechanics, we see that classical physics is not actually sufficient to understand how atoms behave anyway. So there is a kind of "middle ground" between old-time thermodynamics and modern atomic theory, which appears when you recognize the importance of the concept of entropy, even before you understand the nature of the systems that exhibit entropy (do we understand that now?).

If you start with "let there be entropy", then tack on "let T be how much heat you need to add per entropy increase", you can talk about the T of reservoirs without caring what is in the reservoir, and you are off to the races. There is no direct need for any statistical mechanics in the meaning of entropy-- statistical mechanics is basically "how to count microstates", but entropy is just a state variable (different meaning of "state"-- thermodynamic state, not microstate) relating to the "size of the target", regardless of how it is determined that this is what the size of the target is. You only need statistical mechanics to calculate entropy, but you need classical mechanics to calculate energy, and we wouldn't say the first law of thermodynamics requires classical mechanics. It's always nice to connect thermodynamics to microphysics, but it's generally not required to do so, if you want to think of thermodynamics as its own axiomatic structure.

Last edited: Mar 1, 2012
13. Mar 1, 2012

Rap

You don't need statmech to calculate entropy, only to explain it. Entropy is measurable within the confines of classical thermo. Other than that, I agree.

I think one of the problems with entropy is that, unlike pressure, temperature, volume, etc., it's not something that you can immediately sense, and so it's harder to understand intuitively.

14. Mar 1, 2012

Ken G

Your words "calculate" and "measure" are having a little fight in that sentence, are they not?
Yes, it is a concept more so than a physical attribute, which is also why it is more closely connected to the choices we make as physicists, and to what we know and what we want to know and so forth, than is typical of other physical variables. But this may be a feature, not a bug, when we recognize that modern physics is getting closer and closer to a place where the physicist can no longer be considered separately from the physics!

15. Mar 1, 2012

Rap

That's why this reminds me of our Schroedinger's cat discussion. What's going on inside that demon/catbox?

16. Mar 1, 2012

Ken G

I don't know, but I think we'll end up realizing there is a mirror in there!

17. Mar 2, 2012

lugita15

Does anyone know the answer to my Feynman question?

18. Mar 2, 2012

Rap

He is talking about the zeroth law, and more. The zeroth law says basically that if two systems are in thermal equilibrium with a third system, then they are in thermal equilibrium with each other. No mention of temperature. If you add that the third system is a calibrated thermometer, then that's what Feynman is saying. So you are looking for the "proof" of the zeroth law (using statistical mechanics), and the statistical mechanics explanation of the meaning of "temperature" and "thermometer".

I think probably "Fundamentals of Statistical and Thermal Physics" by Reif and/or "Statistical Physics" by Landau and Lifshitz will have what you need.

Last edited: Mar 2, 2012
19. Mar 3, 2012

lugita15

Let me quote Feynman at greater length, because I think there's a chance he may not be talking about the zeroth law.

"Incidentally, when we say that the mean kinetic energy of the particle is 3/2kT, we claim to have derived this from Newton's laws.... and it is most interesting that we can apparently get so much from so little ... How do we get so much out? The answer is that we have been perpetually making a certain important assumption, which is that if a given system is in thermal equilibrium at some temperature, it will also be in thermal equilibrium with anything else at the same temperature. For instance, if we wanted to see how a particle would move if it was really colliding with water, we could imagine that there was a gas present, composed of another kind of particle, little fine pellets that (we suppose) do not interact with water, but only hit the particle with "hard" collisons. Suppose that the particle has a prong sticking out of it; all our pellets have to do is hit the prong. We know all about this imaginary gas of pellets at temperature T - it is an ideal gas. Water is complicated, but an ideal gas is simple. Now, our particle has to be in equilibrium with the gas of pellets. Therefore, the mean motion of the particle must be what we get for gaseous collisions, because if it were not moving at the right speed relative to the water but, say, was moving faster, that would mean that the pellets would pick up energy from it and get hotter than the water. But we had started them at the same temperature, and we assume that if a thing is once in equilibrium it stays in equilibrium - parts of it do not get hotter and other parts colder, spontaneously. This proposition is true and can be proved from the laws of mechanics, but the proof is very complicated and can be established only by using advanced mechanics. It is much easier to prove in quantum mechanics than it is in classical mechanics. 
It was first proved by Boltzmann, but now we simply take it to be true, and then we can argue that our particle has to have 3/2kT of energy if it is hit with artificial pellets, so it also must have 3/2kT when it is being hit with water at the same temperature and we take away the pellets; so it is 3/2kT. It is a strange line of argument, but perfectly valid." (italics in original)

So I see a few possibilities. The title of the section is "the equipartition of energy", so it could refer to the equipartition theorem, which was independently proven by Maxwell and Boltzmann. Or as I've been supposing it could be the zeroth law, because "it will also be in thermal equilibrium with anything else" kind of sounds like the language you'd use to talk about the fact that thermal equilibrium is an equivalence relation. But then it could be Boltzmann's H-theorem, which was the derivation of the second law of thermodynamics from Newton's laws; the reason the last one is a possibility is because "if a thing is once in equilibrium it stays in equilibrium - parts of it do not get hotter and other parts colder, spontaneously" sounds like some formulations of the second law.

So which is it? If possible, I want to try and look at Boltzmann's original and see what matches Feynman's description.

20. Mar 5, 2012

Rap

Ok, that's the second law, entropy increases, and now I see why he said it was easier in QM than in classical mechanics. The entropy is proportional to the logarithm of the number of microstates that could give you the macrostate. In QM, there are discrete energy levels for a finite volume, so the number of microstates is countable, and that makes it easy. In classical mechanics, the number of microstates is infinite, so the entropy is infinite, and that causes problems.

The idea is simple: if you count up the number of microstates for a system in an equilibrium macrostate (uniform temperature), it's hugely bigger than the number of microstates for a non-equilibrium macrostate (e.g. hotter in one place than another). Then you have to show, or assume, that each microstate is equally probable. Then you have to show that the system is continually changing microstates. Then you can say that a system in a non-equilibrium macrostate will almost certainly wander into an equilibrium macrostate and stay there; or, in other words, the entropy will increase to its equilibrium value and almost never change after that.
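This counting is easy to demonstrate for a pair of small Einstein solids (my own toy numbers, not from the thread): each solid with N oscillators holding q quanta has multiplicity C(q + N - 1, N - 1), and the near-equal split of the total energy dwarfs a lopsided one:

```python
from math import comb

def multiplicity(N, q):
    """Number of microstates of an Einstein solid: q quanta among N oscillators."""
    return comb(q + N - 1, N - 1)

def joint(N, q_total, q1):
    """Microstate count when solid 1 holds q1 quanta and solid 2 the rest."""
    return multiplicity(N, q1) * multiplicity(N, q_total - q1)

N, q = 50, 100
equal = joint(N, q, 50)    # the "uniform temperature" macrostate
skewed = joint(N, q, 95)   # almost all the energy piled into one solid
print(equal / skewed)      # a huge ratio, even for this tiny system
```

Even at 50 oscillators a side, the equal split outnumbers the skewed one by many orders of magnitude; for macroscopic particle numbers the ratio is so overwhelming that "wandering into equilibrium and staying there" is a near-certainty, which is the content of the argument above.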