Conflicting definitions of temperature?

  • #31
@Andy:
That's an interesting proposal. At first glance I'm not exactly sure what you are trying to imply, but I'll read your link and the Joule-Kelvin material carefully and think about it. I'll answer soon :smile:
twofish-quant said:
The statmech book says that chemical potentials are a direct consequence of the Boltzmann distribution if you apply it correctly.
They aren't. In any case, it doesn't matter.
Now that is very non-scientific: to say something isn't true when you apparently don't know the theory. The Fermi and Bose distributions for the particle energies, with their chemical potential, are a direct consequence of applying the pure Boltzmann distribution to the total energy of the system. Read up on the derivation.

twofish-quant said:
I stick the thermometer into something, I read out the number. That's temperature. If I can come up with some nice theory about how that thermometer behaves that's nice.
[...]
Do experiments. Come up with a theory. See if the theory matches observations. Reject the theory if it doesn't. The problem with coming up with definitions of things that aren't based on observation is that you are lost if your theory is wrong.
[...]
I bake a pizza, I set the oven to 350 F. No assumptions or even knowledge of statistical mechanics.
The point is that you are lucky that there are many scientists around who whispered to you some laws about entropy and heat flow and all the other results from statistical mechanics. You yourself can derive these results only more or less indirectly from a few experiments. But for every new bit of theoretical claim (e.g. about magnetisation or other quantities) you will have to do a lot of experiments again to check it, whereas for scientists with more in-depth knowledge it will be a 5-minute university exercise in statmech to check that claim.
And the whole point of the discussion here is that temperature can be successfully generalized to all sorts of systems, like a deck of cards as mentioned earlier. Your definition of thermometer applies only to a very restricted range, and with it you cannot prove the laws of thermodynamics (you will find many exceptions to the usual laws, and only more detailed knowledge can identify the cause of non-applicability). Science has moved on within the last 100 years, and that's why we look at the more powerful ideas about temperature.
According to your statements you might also say "I take a wheel and it rolls. I don't need to know anything about velocity or angular velocity". You see my point?
 
  • #32
Gerenuk said:
I was asking for a temperature that includes statmech and ideal gases and also extends to as much as possible. So basically the most general temperature possible to define - i.e. the best one can do.

The most "general" definition I can think of is that temperature is a parameter the describes the width of a distribution that can somehow be related to the energy of the system.


I was also asking which other parameters exist that claim to be called temperature. I also asked for criteria for a parameter to be called temperature. So basically I was asking why conflicting definitions have the right to be called temperature. They might be... but what physical equation or concept is the reason?

I don't think equations have anything to do with it. E.g. people who work with single electronics often measure just about everything in kelvin: bath temperature, gap energies, photon energies etc. This is mostly for historical reasons (the reason being that the first devices that were made were all superconducting, meaning most parameters could be related to Tc); nine times out of ten you might as well use, for example, eV. When I analyze data I usually measure temperature in GHz.
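For what it's worth, these unit equivalences are simple enough to script; a sketch (CODATA constants, using E = k_B T = h f):

```python
KB = 1.380649e-23      # Boltzmann constant, J/K
H = 6.62607015e-34     # Planck constant, J*s
EV = 1.602176634e-19   # joules per electron-volt

def kelvin_to_ghz(t_k):
    """Frequency whose photon energy h*f equals kB*T, in GHz."""
    return KB * t_k / H / 1e9

def kelvin_to_ev(t_k):
    """Energy kB*T expressed in electron-volts."""
    return KB * t_k / EV

# 1 K is about 20.8 GHz or about 86 micro-eV, which is why bath
# temperatures, gap energies and photon energies can share a unit.
print(kelvin_to_ghz(1.0), kelvin_to_ev(1.0))
```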


Try to forget all the equipment you have and all the tables you are given! Imagine you are the first scientist to construct a thermometer. How would you calibrate it?


It depends on the system. The nuclear orientation thermometer I mentioned above is a primary thermometer (you don't need to calibrate it, you can get T by measuring the anisotropy of the gamma radiation) for phonons but doesn't tell you much about the electronic temperature. If it was an electronic device I would probably use noise thermometry and measure e.g. the switching current from a Josephson junction. For my dilution fridge I can get a fair idea of the temperature by measuring pressure; this is the most "conventional" measurement.
In order to get the temperature of a mode of a mechanical resonator one can simply measure the Q value of the resonance.


I assume you make use of common statmech (i.e. the Boltzmann distribution and temperature as I stated) and hope that your physical model of the microscopic process you used for the statmech equation is correct (just as people do for gases). Only this way can you relate a macroscopic quantity to something which is called temperature and derives from the Boltzmann distribution.

Not at all. That procedure won't work very well for a single vibrational mode. Moreover, whereas switching-current measurements are "statistical", each event is unique and corresponds to the escape of the phase from the whole junction, so it is not really "microscopic" in the usual sense. The escape rate only depends on the amount of "noise" the junction sees, and this will only be the same as the phonon temperature if the e-p scattering times are short (and at low temperature the escape is due to tunneling, meaning the escape temperature becomes independent of the bath temperature).
But that's a completely different source of trouble. Of course if the thermometer doesn't interact the right way with the electronic system, then these two energy systems won't equilibrate and will have different temperatures.
But again, you need a different kind of thermometer depending on which system you want to interact with. There are lots of useful thermometers that do not interact well with phonons or gases.

All these methods rely on assumptions about the statmech equations of the thermometer. I mean, again, at some point someone must have calibrated the macroscopic parameter against something that obeys the laws for temperature.

This was true maybe a hundred years ago. But the fact that we have no "general" definition of temperature is now a well-known issue (although it is rarely a real problem; it is usually clear from the context what is meant by T).
I even remember learning about this during my first course in low temperature physics as an undergraduate. I also remember being quite surprised about this; like you, I had assumed that temperature was something well defined.
 
  • #33
I tried to work through the Carnot definition of absolute temperature, but the common sources of information I found are very sloppy and full of hidden assumptions that I find hard to unclutter.

Is there a more mathematically rigorous proof for that approach to temperature? I probably have to read that before I can say much more.

Every sentence in physics only halves the amount of missing information, so that - similar to popular physics literature - one never arrives at 100% understanding. An equation, by contrast, unambiguously states what's going on.

Andy Resnick said:
The Joule-Thompson experiment had nothing to do with 'kinetic energies' or even 'ideal gases'. In fact, I claim that those results are currently outside any physical model you choose- that is, they cannot be predicted or constructed based on *any* current theory or model system, because the data represents a postulate upon which the theory is based (like c_0 in special relativity)
How did this experiment define temperature?
Couldn't it be explained by the statmech definition together with the van der Waals equation for the gas?!

Andy Resnick said:
I'm not sure what this link is implying. From what I understood: first, it mentions the air-thermometer definition of temperature, which is based on the assumption V=Tf(p). Second, it calls for a temperature which for all reversible engines should obey W/Q_\text{in}=f(T_2-T_1). Right?

Andy Resnick said:
So, if we (analytically) analyze the Joule-Thompson experiment using statistical mechanics, we are tacitly building in the assumption that experimentally, there were equilibrium conditions (or that we can somehow apply the concept of equilibrium to those experimental conditions). If you think that is incorrect, please say so.
I agree. We assume that equilibration is a much faster process than all others, so that effectively the system is at equilibrium w.r.t. spontaneous energy transfer.

Andy Resnick said:
Question: How can an "absolute" temperature scale be constructed from measurements of specific heat?
[...]
So their data was, on one hand, a simple measurement of material properties: C_p(T) for water, for example. On the other, it was a material-independent measurement of the energy content of a degree interval at various temperatures, thus defining a degree in terms of a change in energy.
So what would be the defining equation for temperature exactly?
I have a system which can be brought into contact with water. Also I can modify that system by adding or subtracting a controlled and measurable amount of heat. What now?
What exactly am I supposed to do to attribute a temperature?

I just read some more. So am I right that "all I need to do" is to find a reversible heat engine, and the ratio of heat in to heat out will give the ratio of temperatures?
 
  • #34
Gerenuk said:
Now that is very non-scientific, to say something isn't true when you apparently don't know the theory.

Science is grounded in observation and not theory.

The Fermi and Bose distributions for the particle energies, with their chemical potential, are a direct consequence of applying the pure Boltzmann distribution to the total energy of the system. Read up on the derivation.

Can you point me to a reference for this? I don't think this is the situation. I don't think that you can derive the Fermi and Bose distributions without using the grand canonical ensemble, at which point you aren't in Boltzmann-distribution world.

And the whole point of the discussion here is that temperature can be successfully generalized to all sorts of systems, like a deck of cards as mentioned earlier.

Yes you can generalize the concept of temperature and entropy and energy. But we were talking about the definition of temperature. Definition of temperature is what a thermometer measures. If we have a theory of temperature that can be generalized then GREAT! But it's not the definition.

Your definition of thermometer applies only to a very restricted range, and with it you cannot prove the laws of thermodynamics (you will find many exceptions to the usual laws, and only more detailed knowledge can identify the cause of non-applicability).

You can't prove the laws of thermodynamics at all. They are empirical statements about the universe. You can develop theories that are consistent with the observed laws, but that's not proof.

Science has moved on within the last 100 years, and that's why we look at the more powerful ideas about temperature.

Science is ultimately based on observation. We have a lot more sophisticated tools to explain observations, but you still have to do experiments.

According to your statements you might also say "I take a wheel and it rolls. I don't need to know anything about velocity or angular velocity". You see my point?

I'm saying the opposite. Just because I have this theory about velocity or angular velocity doesn't mean that it means anything unless I see how the wheel rolls.
 
  • #35
twofish-quant said:
Science is grounded in observation and not theory.
I believe you think wholly like an engineer and have not tried thinking about my statements as a physicist. That's good for using existing stuff, but not good for the in-depth understanding needed to finally discover new connections.
I'll make a few comments on your last post and will read your answer, but this is not getting anywhere, so let's leave it like that.

twofish-quant said:
Can you point me to a reference for this? I don't think this is the situation. I don't think that you can derive the Fermi and Bose distributions without using the grand canonical ensemble, at which point you aren't in Boltzmann-distribution world.
How can you think that if you have never tried or seen a proof of impossibility?
I was actually referring to the exponential law in general, which includes the grand canonical ensemble. Sorry if that was not correct.
In any case one can still derive the chemical potential from the Boltzmann distribution. It was in some lecture notes by J.J. Binney where he used contour integration and an approximation for large numbers. If I find these again and you really need them I can post them.

twofish-quant said:
Yes you can generalize the concept of temperature and entropy and energy. But we were talking about the definition of temperature. Definition of temperature is what a thermometer measures. If we have a theory of temperature that can be generalized then GREAT! But it's not the definition.
That might not be your definition, or that of engineers. But I bet physicists who deal with more general systems than normal gases have the advanced definition that includes all of yours.

twofish-quant said:
You can't prove the laws of thermodynamics at all. They are empirical statements about the universe.
From statistical mechanics and probability theory you can prove the law of increase of entropy. Entropy is the logarithm of the number of microstates. The number of microstates peaks sharply for a certain configuration, just as the multinomial distribution does.
All these high-level laws (like the ideal gas law and so on) have derivations from microscopic principles. Everything should be provable from microscopic laws to be a "nice" theory. Non-nice theories will find their counterexample to the rule sooner or later.
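To see how sharp that maximum is, here is a small numerical sketch (my own toy example with N two-state particles, not the Binney derivation): the number of microstates C(N, k) with k particles in the first state concentrates overwhelmingly around the even split as N grows.

```python
import math

def fraction_near_half(n, window=0.01):
    """Fraction of all 2^n equally likely microstates of n two-state
    particles whose occupation k lies within `window` of the even
    split k = n/2.  Computed in log space via lgamma so large n work."""
    log2n = n * math.log(2.0)
    lo = int(n * (0.5 - window))
    hi = int(n * (0.5 + window))
    total = 0.0
    for k in range(lo, hi + 1):
        log_binom = (math.lgamma(n + 1)
                     - math.lgamma(k + 1)
                     - math.lgamma(n - k + 1))
        total += math.exp(log_binom - log2n)
    return total

# For small n the peak is broad; for large n essentially every
# microstate sits within 1% of the even split, which is why the
# maximum-entropy configuration is the one you observe.
print(fraction_near_half(100))        # roughly 0.24
print(fraction_near_half(1_000_000))  # indistinguishable from 1.0
```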

twofish-quant said:
Just because I have this theory about velocity or angular velocity doesn't mean that it means anything unless I see how the wheel rolls.
In your view you'd build a wheel and define that it rolls. You would say "Why do I need an extended concept of rolling like angular velocity? My car works already."
The answer to that is that someone might have more sophisticated uses for a wheel, just as some people have more sophisticated uses for temperature.
 
  • #36
Gerenuk said:
I believe you think wholly like an engineer and have not tried thinking about my statements as a physicist. That's good for using existing stuff, but not good for the in-depth understanding needed to finally discover new connections.

I have a Ph.D. in theoretical astrophysics, I did my dissertation on radiation hydrodynamics, and my bachelor's was in physics at MIT. I don't like bringing up credentials, but you are the one that brought it up.

Having been one, yes, I do know how physicists think.

That might not be your definition, or that of engineers. But I bet physicists who deal with more general systems than normal gases have the advanced definition that includes all of yours.

No. The definition of temperature is what a thermometer measures. Now we can create other thermometers. We can create models for how temperatures work, but those aren't *definitions*. The definition of temperature is what a thermometer measures. If you want to know what a thermometer is, I can hand one to you.

From statistical mechanics and probability theory you can prove the law of increase of entropy.

No you can't. You can model entropy. You can show that given some assumptions and definitions that entropy increases, but that's not proof.

All these high-level laws (like ideal gas law and so on) have derivations from microscopic principles.

Which you then compare to observation, and if they don't match, you toss the theory and start over again.
 
  • #37
Gerenuk said:
Every sentence in physics only halves the amount of missing information, so that - similar to popular physics literature - one never arrives at 100% understanding. An equation, by contrast, unambiguously states what's going on.

This isn't true. In physics you want to avoid equations when possible. You can't, but you should try.

How did this experiment define temperature?

Here is a thermometer. Here is what it measures.
 
  • #38
twofish-quant said:
I have a Ph.D. is in theoretical astrophysics, and I did my dissertation on radiation hydrodynamics, and my bachelors was in physics at MIT. I don't like bringing up credentials, but you are the one that brought it up.
Instead of mentioning credentials, why don't you prove it by showing some in-depth knowledge? I believe that you are hard-working. But have you ever sat down and thought about whether there are more connections in - say, statmech - than you have been told at university? Everyone should start organising their knowledge, because many things they teach you are actually special cases.

For the entropy derivation I don't have a good link. But that's basically just the statement that the multinomial distribution has a very sharp maximum. Have you tried to think about that?
I recall seeing this idea illustrated roughly in
http://www.princeton.edu/WebMedia/lectures/
"Fashion, Faith and Fantasy in the New Physics of the Universe, Lecture 2: FAITH"
(or maybe one of the other 2 parts)
So look out for this type of explanation if you some day want to see where the increase of entropy derives from.

For the derivation of chemical potential from Boltzmann see
http://www-thphys.physics.ox.ac.uk/user/JamesBinney/statphys2.pdf
(I hope it's the right notes. My download is stuck so I cannot check if that's the file I meant at the moment)

So why do you say things aren't there or cannot be done just because you don't know them?
 
  • #39
twofish-quant said:
Every sentence in physics only halves the amount of missing information, so that - similar to popular physics literature - one never arrives at 100% understanding. An equation, by contrast, unambiguously states what's going on.
This isn't true. In physics you want to avoid equations when possible. You can't, but you should try.
I'm not surprised to read that. Bluntly speaking, I believe exactly this type of physicist is only quoting and repeating what they have learned, and they will never make a new discovery themselves.
But it's possible that I'm wrong on that.
Who knows...

As an example, I once saw experimentalists reading lots of books about superconductivity (with few equations). It was good enough for bragging small talk, but when they talked to theorists, the theorists noticed they were just talking b#!$ß!t.
Had these people only once tried to understand the simple BCS wave function, they wouldn't make these mistakes. I'd say popular physics reading is interesting but harmful to understanding.
For example, an electron isn't part-time wave and part-time particle. I guess you know very well that it is a consistent mathematical structure.
 
  • #40
Gerenuk said:
Instead of mentioning credentials, why don't you prove it by showing some in-depth knowledge?

You're the one that told me that I wasn't thinking like a physicist.

For the entropy derivation I don't have a good link. But that's basically just the statement that the multinomial distribution has a very sharp maximum. Have you tried to think about that?

Which proves nothing about the laws of thermodynamics. The laws of thermodynamics are observations. You can come up with a model that is consistent with observations, but that doesn't *prove* anything.

So why do you say things aren't there or cannot be done just because you don't know them?

Because I'm wrong about things. Thanks for the derivation.
 
  • #41
I looked at the Carnot temperature definition in detail. Here is what I conclude:

With the assumptions that
  1. all systems can be ordered according to some real parameter called temperature, and it is impossible for energy to spontaneously (i.e. without external work) flow from a lower temperature to a higher temperature
  2. there exist reversible heat engines that move heat by consuming or producing work
  3. the heat engines operate cyclically, i.e. their internal energy is restored after one cycle
one can show that
  1. up to a factor one can determine the temperature T for all system configurations that can be reached with reversible engine changes or isothermal processes; however, any function like T^*=\ln T is just as good, and that one has no absolute 0
  2. for reversible changes the change in entropy is given by \mathrm{d}S=\mathrm{d}Q/T
  3. total entropy is constant for reversible changes and for changes with no heat transfer
  4. provided work can be transformed into heat and engines that deal with negative temperatures exist, one can show a contradiction with assumption (1)

Negative temperature would be ok, if one skips assumption (1) for them.
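As a sanity check of conclusions (2) and (3), one can put numbers into the reversible (Carnot) relation Q_\text{in}/T_\text{hot} = Q_\text{out}/T_\text{cold}; this is just an illustration with arbitrary figures, not part of the derivation above:

```python
def carnot_heat_out(q_in, t_hot, t_cold):
    """Heat a reversible engine rejects to the cold reservoir,
    from Q_in / T_hot = Q_out / T_cold."""
    return q_in * t_cold / t_hot

def carnot_work(q_in, t_hot, t_cold):
    """Work extracted per cycle (first law, cyclic engine)."""
    return q_in - carnot_heat_out(q_in, t_hot, t_cold)

q_in, t_hot, t_cold = 1000.0, 500.0, 300.0
q_out = carnot_heat_out(q_in, t_hot, t_cold)
ds_hot = -q_in / t_hot    # entropy change of the hot reservoir
ds_cold = q_out / t_cold  # entropy change of the cold reservoir

# The work comes out as 400 J, and the total entropy change is zero,
# as conclusion (3) demands for a reversible cycle.
print(carnot_work(q_in, t_hot, t_cold), ds_hot + ds_cold)
```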
 
  • #42
Negative temperatures are something that you can fit into a thermodynamic framework if you think of them as things hotter than infinity rather than colder than absolute zero. One thing that works when dealing with temperatures from a theory viewpoint is to work with beta, i.e. 1/T, rather than T. At that point beta for absolute zero becomes infinite, infinite temperature becomes beta = zero, and then negative temperatures become hotter than infinity.

Once you have that, you can create a negative-temperature system by taking a system with two energy states and pumping it so that the higher energy state is more populated than the lower energy state.
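A sketch in code (my own toy numbers): for a two-level system with spacing E, the Boltzmann ratio n_upper/n_lower = exp(-E·beta) can be inverted to read off beta, and a pumped (inverted) population gives beta < 0, i.e. a negative T.

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def beta_from_populations(n_upper, n_lower, energy_gap):
    """Invert n_upper/n_lower = exp(-energy_gap * beta) for
    beta = 1/(kB*T), given the two level occupations."""
    return -math.log(n_upper / n_lower) / energy_gap

GAP = 1e-22  # level spacing in joules (illustrative value)

# Ordinary thermal occupation: lower level favoured, beta > 0, T > 0.
b_normal = beta_from_populations(30.0, 70.0, GAP)
# Pumped (inverted) occupation: upper level favoured, beta < 0, so the
# corresponding T = 1/(kB*beta) is negative -- "hotter than infinity"
# on the beta axis, which passes smoothly through beta = 0.
b_inverted = beta_from_populations(70.0, 30.0, GAP)

print(b_normal > 0, b_inverted < 0, 1.0 / (KB * b_inverted) < 0)
```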
 
  • #43
Gerenuk said:
As an example, I once saw experimentalists reading lots of books about superconductivity (with few equations). It was good enough for bragging small talk, but when they talked to theorists, the theorists noticed they were just talking b#!$ß!t. Had these people only once tried to understand the simple BCS wave function, they wouldn't make these mistakes.

What usually happens at this point is that the experimentalists go back and say "hell yes, we understand the BCS wave function, it's just that your theory doesn't match our data." Also, it's possible for a theory to be mathematically elegant and almost totally useless. General relativity, for example. It's a beautiful, elegant theory, except that in all but the simplest situations (FRW metric) it's largely useless when you actually try to match it to experimental data, so the first thing you have to do is calculate approximations so that you can actually get out numbers that you can test with experiment.

The reason you want to avoid math and use very simple arguments if you can is that these tend to be more robust and give you more insight as to how a physical system behaves.
 
  • #44
Andy Resnick said:
This is a false argument, because it requires an individual to "interpret" a derived result "in a proper way". Proper according to whom? Nature cares not a whit how you interpret your model, or what model you invent.

Of course not, but humans have developed a framework for understanding nature, and whether or not the results of our model are physically meaningful (to us) may indeed depend on how we interpret what the model is telling us. In certain contexts a negative temperature is not a meaningful thing - this is not such a context. What negative temperature tells us is that if I put a system with a negative absolute temperature in thermal contact with a system with positive absolute temperature heat will flow from the negative temperature system to the positive temperature system. This is a physical prediction. You may still be resisting based on whether or not I can produce for you a system with a negative absolute temperature. A page on [URL='https://www.physicsforums.com/insights/author/john-baez/']John Baez's website[/url] explains:

"Can this system ever by realized in the real world, or is it just a fantastic invention of sinister theoretical condensed matter physicists? Atoms always have other degrees of freedom in addition to spin, usually making the total energy of the system unbounded upward due to the translational degrees of freedom that the atom has. Thus, only certain degrees of freedom of a particle can have negative temperature. It makes sense to define the "spin-temperature" of a collection of atoms, so long as one condition is met: the coupling between the atomic spins and the other degrees of freedom is sufficiently weak, and the coupling between atomic spins sufficiently strong, that the timescale for energy to flow from the spins into other degrees of freedom is very large compared to the timescale for thermalization of the spins among themselves. Then it makes sense to talk about the temperature of the spins separately from the temperature of the atoms as a whole. This condition can easily be met for the case of nuclear spins in a strong external magnetic field.

Nuclear and electron spin systems can be promoted to negative temperatures by suitable radio frequency techniques. Various experiments in the calorimetry of negative temperatures, as well as applications of negative temperature systems as RF amplifiers, etc., can be found in the articles listed below, and the references therein."

I'll save you the effort of digging through the references to find such an actual system:

N. F. Ramsey, "Thermodynamics and statistical mechanics at negative absolute temperature," Phys. Rev. 103, 20 (1956). Describes the theory behind spin systems in which negative temperatures may be realized.

A. S. Oja and O. V. Lounasmaa, "Nuclear magnetic ordering in simple metals at positive and negative nanokelvin temperatures", Rev. Mod. Phys. 69, 1-136 (1997). A lengthy review article which discusses negative temperatures, including experimental realizations.

E. M. Purcell and R. V. Pound, "A Nuclear Spin System at Negative Temperature", Phys. Rev. 81, 279-280 (1951). The first realization, according to the above review article, of negative temperature in a nuclear spin system.

Any further arguing that negative temperature isn't a physical thing should now be a matter of semantics.

No! It's of direct relevance, if your model implies 'temperature' applies only to a system *in equilibrium*. Which it does using statistical mechanics. Our living bodies are not in "energetic equilibrium" (or any other kind of equilibrium) with the environment.

Really? So as I sit here in a room that is kept at room temperature (with small fluctuations about this temperature), you suggest that the flux of energy out of my body is not, on average, equal to the flux of energy coming into my body? So if I sit here long enough, either all of the energy in my body will radiate away, or I will absorb energy until I combust? I doubt that's the case. That seems to me like a kind of energetic equilibrium, or at least a steady state if we want to split semantic hairs, and as I mentioned the concept of temperature has been extended to steady states using the Onsager reciprocal relations. So, in this way, I can define a body temperature, and I never needed to know directly which processes in my body were not in some sort of equilibrium with some other part of my body. These processes certainly lead to energy being radiated from my body, but they do so in such a way that the energy radiated is balanced by the energy absorbed.
That's fine, but again, I am interested in systems that are far from equilibrium- living systems. Thus, I require (or I need to develop) a model that works better than the models I currently have. Frankly, I'm not smart enough to develop such a model. So I am reduced to hoping someone else publishes an idea that I can glom on to.

I whole-heartedly agree that systems out of equilibrium are fascinating and ought to be understood (and I would like to try and understand them!); my argument is merely that the body is in a steady state (for sufficiently long periods of time) that a sensible temperature can be defined for it.
 
  • #45
Mute said:
<snip>Really? So as I sit here in a room that is kept at room temperature (with small fluctuations about this temperature), you suggest that the flux of energy out of my body is not, on average, equal to the flux of energy coming into my body? So if I sit here long enough, either all of the energy in my body will radiate away, or I will absorb energy until I combust? I doubt that's the case.
On the contrary, it *is* exactly the case. Sit in a dark room long enough and you will die- or have you forgotten the function of breathing, eating and drinking? I wouldn't say the energy radiated away or you combusted, you simply starved (or asphyxiated). Or are dietary calories not considered energy? Is the use of oxygen outside of physics?
Mute said:
<snip>

I whole-heartedly agree that systems out of equilibrium are fascinating and ought to be understood (and I would like to try and understand them!); my argument is merely that the body is in a steady state (for sufficiently long periods of time) that a sensible temperature can be defined for it.

You may think the problem is solved; I disagree.
 
  • #46
Mute said:
<snip>You may still be resisting based on whether or not I can produce for you a system with a negative absolute temperature. A page on [URL='https://www.physicsforums.com/insights/author/john-baez/']John Baez's website[/url] explains:

<snip>

That's not what I'm contesting. You apparently didn't notice the little disclaimer at the top of the page:

"Under certain conditions, a closed system can be described by a negative temperature."

Note those three words: 'a closed system'. A closed system admits equilibrium and hence negative temperatures are logical predictions of statistical mechanics.

However, all that line does is re-formulate my objection "negative temperatures are unphysical" to equivalently stating "closed systems are unphysical". You may choose to respond that a closed system is a good approximation of a real, physical, system (or even that a real, physical system can be prepared arbitrarily close to a closed system), but this does not invalidate the essence of my objection.
 
  • #47
twofish-quant said:
Negative temperatures are something that you can fit into a thermodynamic framework if you think of them as things hotter than infinity rather than colder than absolute zero. One thing that works when dealing with temperatures from a theory viewpoint is to work with beta, i.e. 1/T, rather than T. At that point beta for absolute zero becomes infinite, infinite temperature becomes beta = zero, and then negative temperatures become hotter than infinity.

Once you have that, you can create a negative-temperature system by taking a system with two energy states and pumping it so that the higher energy state is more populated than the lower energy state.

That's not really true, either- can you heat something up from (positive) infinity to negative temperatures without passing through zero? If so, by what path along the real number line do you propose this? Never mind about how to heat something up from 300K to infinity...
 
  • #48
twofish-quant said:
What usually happens at this point is that the experimentalists go back and say "hell yes, we understand the BCS wave function, it's just that your theory doesn't match our data."

The problem is rather that they hear one word and oversimplify it into their own common knowledge instead of getting the abstract idea. You tell them there is an energy "gap" and they will draw you an actual gap in the Fermi surface diagram on paper.
You ask them what d-wave symmetry is and they will draw you a cloverleaf, and that drawing is about all they can say about it, and yet they believe they know it all.
To illustrate general relativity, a popular physics article might draw a plane with a ball rolling on top. Then it might say a dent in the plane would be the analogy to a planet attracting things in space. This analogy is actually wrong and misses the concept of "straight lines" in curved space.
I even fell victim to one of these stupid analogies. In school they draw EM waves as waves on a rope. It took me a while before I noticed that EM waves do not shake in space (i.e. they have directions but no displacement in space).
Unfortunately physics isn't as simple as a non-physicist's toy picture of the world. So one had better start understanding the equations.
 
  • #49
Gerenuk said:
I tried to work through the Carnot definition of absolute temperature, but the common sources of information I found are very sloppy and full of hidden assumptions that I find hard to unclutter.

<snip>

It's good to perform some scholarship every once in a while- you learn all kinds of new information, once you clear away the layers of *other people's* interpretations. And yes, it is often difficult. Carnot had impeccable physical intuition, but his mathematical presentation was... sloppy, to be nice.

Gerenuk said:
I'm not sure what this link is implying. From what I understood: First it mentions the air-thermometer of temperature which is based on the assumption V=Tf(p). Second it calls for a temperature which for all reversible engines should obey W/Q_\text{in}=f(T_2-T_1). Right?

That article is the one that Kelvin wrote to define an absolute temperature scale; if we are to discuss temperature, it seems logical to at least be familiar with how it came to be defined.

Recall, Carnot's function \mu depends only on temperature. Kelvin thus first defines an absolute temperature t = \int \mu (\theta) d\theta, which is now independent of the scale of \mu. It should be noted that t can indeed vary from (-\infty, +\infty) and has an arbitrary zero.

However... this contradicts what is also known, namely that the latent heat at constant volume is not zero, and that the latent heat (at constant volume) and the variation of pressure with temperature either have the same sign or both vanish. These are important considerations because both make up (via a ratio) the definition of \mu. So we define a *new* absolute temperature T = exp(t/g), where g is a constant. T, as opposed to t, ranges from (0, \infty ) and has an 'arbitrary value' of 1.

'g', you may guess, is somehow related to the Boltzmann factor 'k'.

So, to recap, based on measurements of \mu, we constructed a universal function T that is material-independent and scale-independent. The construction of the function T does not depend on any notion of irreversibility, ideal gas, statmech distribution, or constitutive relation - although the value of 'g' chosen *does*.
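One way to see how the two scales relate (a sketch using only the definitions above, with no new assumptions):

```latex
t = \int \mu(\theta)\,\mathrm{d}\theta, \qquad T = e^{t/g}
\quad\Longrightarrow\quad
t_2 - t_1 = g \ln\frac{T_2}{T_1} .
```

Equal *differences* on Kelvin's first scale thus correspond to equal *ratios* on the second, which is why t runs over (-\infty, +\infty) while T is confined to (0, \infty).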
 
  • #50
Andy Resnick said:
On the contrary, it *is* exactly the case. Sit in a dark room long enough and you will die- or have you forgotten the function of breathing, eating and drinking? I wouldn't say the energy radiated away or you combusted, you simply starved (or asphyxiated). Or are dietary calories not considered energy? Is the use of oxygen outside of physics?

The idea that I have apparently failed to convey to you is that there are periods of time over which the net amount of energy radiated from a body is equal to the net amount of heat absorbed, and during such a steady state I may define a temperature as per the Onsager reciprocal relations. Do you still disagree?


You may think the problem is solved; I disagree.

Which problem? The problem of developing a non-equilibrium statistical mechanics is certainly nowhere near solved, nor did I ever claim it was. I said the problem of defining "body temperature" was solved, as it is (effectively) a steady state problem and may be described by the Onsager reciprocal relations.

That's not what I'm contesting. You apparently didn't notice the little disclaimer at the top of the page:

"Under certain conditions, a closed system can be described by a negative temperature."

Note those three words: 'a closed system'. A closed system admits equilibrium and hence negative temperatures are logical predictions of statistical mechanics.

However, all that line does is re-formulate my objection "negative temperatures are unphysical" to equivalently stating "closed systems are unphysical". You may choose to respond that a closed system is a good approximation of a real, physical, system (or even that a real, physical system can be prepared arbitrarily close to a closed system), but this does not invalidate the essence of my objection.

Did you look up the references I gave you? The ones in which nuclear spin systems were experimentally demonstrated to behave as having a negative temperature? How is that "unphysical"? A statistical mechanical model of the system predicts the spin degrees of freedom to have a negative temperature. I put this spin system in contact with a spin system at positive temperature and heat flows from the negative-temperature system to the positive-temperature system. This has been done experimentally in nuclear systems, where the spin degrees of freedom do not interact with the non-spin degrees of freedom over the relevant time scales, and so are effectively a separate subsystem from the rest of the degrees of freedom and may have negative temperatures. So, what is your objection to this information? Is it that this is simply "an approximation", or something else?

You can argue all you like that your objections are not invalidated because the real world, strictly speaking, does not conform perfectly to the models, but your objections are then, practically speaking, pointless. Strictly speaking, there is no such thing as a phase transition in the real world, but that does not stop water from turning into ice or steam. All of our theoretical understanding of nature is through idealized models intended to capture the essence of the phenomena we are studying. If you object to the model on the basis of it never fully describing a real system, then what exactly is your criterion for a prediction of the model to be "physical"?
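The spin example above can be reproduced with a few lines of counting. This is a minimal sketch (my own illustration, assuming N independent two-level spins with unit level spacing and k_B = 1; the function names are mine, not from the thread): the microcanonical entropy S(E) = ln C(N, n) rises and then falls with the number n of excited spins, so 1/T = dS/dE turns negative once the population is inverted.

```python
import math

def entropy(N, n):
    """Microcanonical entropy (k_B = 1) of N two-level spins with n excited:
    S = ln C(N, n), computed via log-gamma to avoid overflow."""
    return math.lgamma(N + 1) - math.lgamma(n + 1) - math.lgamma(N - n + 1)

N = 1000  # number of spins; level spacing eps = 1, so E = n

def beta_at(n):
    """Finite-difference estimate of 1/T = dS/dE at energy E = n."""
    return entropy(N, n + 1) - entropy(N, n)

assert beta_at(100) > 0             # below half filling: ordinary positive T
assert beta_at(900) < 0             # inverted population: negative T
assert abs(beta_at(N // 2)) < 1e-2  # near half filling: beta -> 0, T -> infinity
```

The sign change in dS/dE is the whole content of "negative temperature" here; no claim beyond the idealized closed-spin-subsystem model is intended.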
 
  • #51
Andy Resnick said:
Kelvin thus first defines an absolute temperature t = \int \mu (\theta) d\theta, which is now independent of the scale of \mu. It should be noted that t can indeed vary from (-\infty, +\infty) and has an arbitrary zero.
I didn't see it from the articles. Thanks for telling. Let's see...

What is \theta? How exactly does one measure \mu?
The expression \frac{V}{C_P}(T\alpha-1) contains temperature-related properties, so they cannot be used before you know what temperature is?!
Also you cannot know in advance that a process is isenthalpic?! For the Carnot definition one takes all reversible processes to be isentropic, but here there is no way to pick out only the right processes?

Also I don't see an argument why this definition should be material independent.
 
  • #52
Mute said:
The idea that I have apparently failed to convey to you is that there are periods of time over which the net amount of energy radiated from a body is equal to the net amount of heat absorbed, and during such a steady state I may define a temperature as per the Onsager reciprocal relations. Do you still disagree?

Of course not- the Onsager relations, like all of statistical mechanics, are valid for some phenomena. My point, which you are either ignoring or I am not being sufficiently clear in explaining, is that the statistical model of phenomena is *incomplete*. Using a statistical model of temperature (or any other physical quantity) is also *incomplete*. Furthermore, rather than simply giving up, we should be trying to develop a more general ('better') model that includes statistical mechanics as a limiting case.


Mute said:
Which problem? The problem of developing a non-equilibrium statistical mechanics is certainly nowhere near solved, nor did I ever claim it was. I said the problem of defining "body temperature" was solved, as it is (effectively) a steady state problem and may be described by the Onsager reciprocal relations.

Again, how can you consider it solved when you are applying a result to conditions outside the region of validity? Onsager's relations are linear phenomenological relations. And please don't confuse my highlighting the assumptions inherent in statistical mechanics with questioning the validity of those assumptions.
 
  • #53
Gerenuk said:
I didn't see it from the articles. Thanks for telling. Let's see...

What is \theta? How exactly does one measure \mu?
The expression \frac{V}{C_P}(T\alpha-1) contains temperature-related properties, so they cannot be used before you know what temperature is?!
Also you cannot know in advance that a process is isenthalpic?! For the Carnot definition one takes all reversible processes to be isentropic, but here there is no way to pick out only the right processes?

Also I don't see an argument why this definition should be material independent.

This thread has been good- it's forced me to really dig down into some concepts I consider fundamental.

Ok- first, all those different symbols signifying 'temperature', which is fitting given the thread title. \theta is generally used to refer to the 'ideal gas temperature', and is measured by an ideal-gas thermometer. 't' (or \tau when we need to use 't' for time) is Kelvin's first absolute temperature, and 'T' is Kelvin's second absolute temperature.

Next: the function \mu. Honestly, I do not have a clear understanding of what it is- the best derivation I have is from Truesdell's "The Tragicomical History of Thermodynamics 1822-1854". It's derived based on the heat generated through a (Carnot) cycle, and is thus material independent. The Carnot-Clapeyron theorem shows that

\mu \Lambda_{V} = \frac{\partial p}{\partial \theta}, where \Lambda_{V} is the latent heat at constant volume- from this, one can generate the expression you presented. However, the original form of \mu is important because it is entirely *experimental*- measuring it allows a check on any theory regarding specific heats of real materials, the first law of thermodynamics, etc.

In that context, there was quite a bit of experimental work by Clausius, Joule, Rankine, and Thomson to measure \mu for air, steam, etc. There are a lot of experimental results to sift through. Experimental measurements can be made isenthalpic by simple insulation. One common criticism is that the measurements require an equation of state. However, by defining the various absolute temperatures the way they are, using different equations of state simply changes the relationship of \theta and T.
 
  • #54
Andy Resnick said:
Ok- first, all those different symbols signifying 'temperature', which is fitting given the thread title. \theta is generally used to refer to the 'ideal gas temperature', and is measured by an ideal-gas themometer.
That's not a good starting point, as it assumes that you are able to find ideal gases. Also, the temperature then cannot be applied to a deck of cards.

Andy Resnick said:
't' (or \tau when we need to use 't' for time) is Kelvin's first aboslute temeprate, and 'T' is Kelvin's second absolute temperature.
I have no idea what Kelvin did, but I played around with solely \mathrm{d}E=T\mathrm{d}S-p\mathrm{d}V and finally got the interesting equation
T(V,S)=T(V_0,S)\exp\left(-\int_{V_0,\text{const }S}^V \left(\frac{\partial p}{\partial E}\right)_V\mathrm{d}V\right)
(and a similar for pressure) which can be used to determine the temperature for reversible processes.
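For anyone who wants to check this from the same starting point, here is a short derivation sketch (my own steps, using only \mathrm{d}E=T\mathrm{d}S-p\mathrm{d}V): the Maxwell relation for that differential, combined with (\partial E/\partial S)_V = T, gives

```latex
\left(\frac{\partial T}{\partial V}\right)_S
  = -\left(\frac{\partial p}{\partial S}\right)_V
  = -\left(\frac{\partial p}{\partial E}\right)_V
     \left(\frac{\partial E}{\partial S}\right)_V
  = -T\left(\frac{\partial p}{\partial E}\right)_V
\;\Longrightarrow\;
\left(\frac{\partial \ln T}{\partial V}\right)_S
  = -\left(\frac{\partial p}{\partial E}\right)_V .
```

Integrating the right-hand relation at constant S from V_0 to V reproduces the exponential expression above.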

Andy Resnick said:
\mu \Lambda_{V} = \frac{\partial p}{\partial \theta}, where \Lambda_{V} is the latent heat at a specific volume
I'm not sure how to derive that \Lambda_V from general assumptions.

Andy Resnick said:
One common criticism is that the measurements require an equation of state. However, by defining the various absolute temperatures the way they are, using different equations of state simply changes the relationship of \theta and T.
Oh, I see. With an equation of state I'd understand better, and that would also be a big objection of mine. I'll play around with the equations a bit more to see if different equations of state just map the temperature to another function.
 
  • #55
Gerenuk said:
That's not a good starting point, as it assumes that you are able to find ideal gases. Also, the temperature then cannot be applied to a deck of cards.

Exactly! That said, the practical reality is that an air thermometer acts very much like an ideal-gas thermometer over some range of temperatures- recall that the original goal of thermometry was to establish a method by which different thermometers could be compared. Not different air thermometers, but (say) an air thermometer and a mercury thermometer.



Gerenuk said:
I'm not sure how to derive that \Lambda_V from general assumptions.

The 'fundamental' starting point of thermodynamics is that an object that absorbs a certain amount of heat Q experiences both a change in temperature and volume:

dQ/dt = \Lambda_{V} (V, \theta ) dV/dt + C_{V}(V, \theta ) d\theta /dt.

where C_V is the specific heat at constant volume. Note that the latent heat, as written here, is not derived from assumptions regarding an equation of state. It's simply based on the observation that expanding or compressing a material involves the flow of heat, even at constant temperature. Note also that the above equation is a *dynamic* equation, in that time is explicit; much of 'thermodynamics' is really 'thermostatics' because the time dependence is suppressed.
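As a consistency check of the heat equation above: for an ideal gas (an assumption of this sketch, not part of the general formulation), dE = C_V d\theta together with dQ = dE + p dV forces \Lambda_V = p, and integrating \Lambda_V dV along an isotherm should reproduce the textbook result nR\theta \ln(V_2/V_1). A minimal numerical sketch (all names mine):

```python
import math

# Assumption: ideal gas, p = n*R*theta/V, so Lambda_V = p and dE = C_V d(theta).
# Heat absorbed along an isothermal expansion: dQ = Lambda_V dV + C_V d(theta),
# and d(theta) = 0 on an isotherm, so only the latent-heat term contributes.
n, R, theta = 1.0, 8.314, 300.0
V1, V2, steps = 1.0, 2.0, 100000

Q = 0.0
dV = (V2 - V1) / steps
for i in range(steps):
    V = V1 + (i + 0.5) * dV        # midpoint rule
    p = n * R * theta / V          # equation of state supplies Lambda_V = p
    Q += p * dV

exact = n * R * theta * math.log(V2 / V1)
assert abs(Q - exact) / exact < 1e-6
```

The agreement is just the statement that, once an equation of state is supplied, the latent-heat term in the dynamic equation carries all the heat of an isothermal process.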
 
