Conflicting definitions of temperature?

AI Thread Summary
Temperature is often misunderstood as a direct measure of energy density, but it is more accurately defined in relation to the average kinetic energy of particles in a system. At absolute zero, while classical motion ceases, quantum mechanics reveals that zero-point motion persists, indicating that systems can still possess potential energy. The definition of temperature varies significantly across different contexts, especially at low temperatures, where traditional thermodynamic definitions may not apply. There is no universally accepted definition of temperature that encompasses all physical systems, leading to ambiguities in its application. Ultimately, the concept of temperature requires careful consideration of the specific conditions and systems being analyzed.
nomadreid
I thought that temperature is a measure of energy density, which means that the vacuum energy has a minuscule temperature above absolute zero. However, I read at http://www.Newton.dep.anl.gov/Newton/askasci/1993/physics/PHY59.HTM that "At absolute zero, all motion does not cease,..." which would seem to contradict the idea of absolute zero as a state of zero energy density which is attainable with a probability approaching zero. So, is the definition of "temperature proportional to energy density" flawed?
 
Yes, temperature is related to the average kinetic energy of the system. Any system, classically, that is at absolute zero will have no kinetic energy, and thus no movement. However, it can still have a potential energy. Heck, even the vacuum field at 0 K has an energy density that diverges with frequency.

I'm not sure about quantum mechanics though. I would still feel that at 0 K there is no movement. However, even in vacuum there are still quantum field fluctuations. For example, charged particles can couple to the field fluctuations, and this has real effects in quantum electrodynamics. But I am not sure if we can say that this will cause true movement of a system brought to 0 K. That would seem to require energy being taken out of the vacuum field to do work, which as far as I know is not known to be possible, at least as a constant dynamic. The Casimir force, for example, can draw objects closer, but it will eventually hit a static point. Well, somebody with a far greater understanding of statistical physics could correct me here.
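For concreteness, the divergence mentioned above can be written down directly (a standard textbook sketch, assuming the usual electromagnetic mode density \omega^2/\pi^2 c^3 per unit volume): each field mode is a harmonic oscillator that keeps a zero-point energy \hbar\omega/2 in its ground state, so summing over modes up to a frequency cutoff \omega_{max} gives

u_{vac} = \int_0^{\omega_{max}} \frac{\hbar\omega}{2}\,\frac{\omega^2}{\pi^2 c^3}\, d\omega = \frac{\hbar\,\omega_{max}^4}{8\pi^2 c^3},

which grows without bound as the cutoff is raised - this is the divergence with frequency referred to above.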
 
nomadreid said:
I thought that temperature is a measure of energy density, which means that the vacuum energy has a minuscule temperature above absolute zero. However, I read at http://www.Newton.dep.anl.gov/Newton/askasci/1993/physics/PHY59.HTM that "At absolute zero, all motion does not cease,..." which would seem to contradict the idea of absolute zero as a state of zero energy density which is attainable with a probability approaching zero. So, is the definition of "temperature proportional to energy density" flawed?

Temperature is not a measure of the total energy density; temperature in thermodynamics is analogous to 'mass' in mechanics. Just as we say "mass is the amount of material", we can say "temperature is how hot an object is". Trying to say much more than that generally leads to either highly restrictive uses of the quantity (such as a mechanical basis for temperature), or curious nonphysical temperatures (such as occur in two-state systems during population inversion).

In order to define temperature sensibly, one needs a more general definition than is supplied by ideal-gas definitions (e.g., the temperature is a measure of how fast the atoms are moving). Defining the temperature of a body, for example, requires the body to be in equilibrium.
 
First of all: yes, there is still zero-point motion at 0 K. This is because in QM the temperature is usually "defined" as a parameter of a bath of harmonic oscillators; and even when you set "T" in these equations to zero, things move.
There is a section on this in Gardiner's book on open quantum systems (I don't remember the title).

And as has already been stated: there is no good all-encompassing definition of "temperature". The concept is used in many situations where the "classical" meaning of the word does not apply.
At very low temperatures the word is VERY ambiguous, and you basically have to look at the exact circumstances of a given experiment to understand what the "T" in the equations actually refers to.
 
Temperature is in general not defined by the average energy. If you look at the basis of statistical mechanics, then you find that temperature is defined so that the Boltzmann distribution gives you the correct mean energy.
E=\frac{\sum_c E_c e^{-E_c/k_B T}}{\sum_c e^{-E_c/k_B T}}
where E is the energy you measure in the system and the sum is over all possible configurations c.

Only for the special case where g(E)\propto E^a is the temperature incidentally proportional to the energy.
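To make that definition operational, here is a minimal sketch in Python (the energy levels and the measured mean energy are hypothetical, chosen only for illustration): it computes the Boltzmann-weighted mean energy from the formula above and numerically inverts it to recover T.

import numpy as np
from scipy.optimize import brentq

k_B = 1.0  # work in units where k_B = 1

def mean_energy(T, levels):
    # E = sum_c E_c exp(-E_c/k_B T) / sum_c exp(-E_c/k_B T)
    w = np.exp(-np.asarray(levels) / (k_B * T))
    return np.dot(levels, w) / w.sum()

# Hypothetical configuration energies of a small system.
levels = [0.0, 1.0, 1.0, 2.5]

# Given a measured mean energy, solve mean_energy(T) = E_meas for T.
E_meas = 0.8
T = brentq(lambda T: mean_energy(T, levels) - E_meas, 1e-3, 1e3)
print(f"T = {T:.4f}, check <E> = {mean_energy(T, levels):.4f}")

Since the mean energy increases monotonically with T, the root is unique whenever E_meas lies between the ground-state energy and the infinite-temperature average.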

Andy Resnick said:
we can say "temperature is how hot an object is".
Hmm, that's a kinda content-less statement :wink:
But there is a general definition for general systems. I can't remember how it goes. Do you know?

Andy Resnick said:
or curious nonphysical temperatures (such as occur in two-state systems during population inversion).
That would be the case for my definition of temperature, but I don't see a problem with that. It's only non-physical if you believe temperature should be the average energy and therefore positive :rolleyes: That's why that definition is general.
Actually for systems not in equilibrium there wouldn't be a temperature defined. But surely someone has generalized my definition of temperature so that it encompasses all distributions and converges to the normal definition for the Boltzmann distribution.
 
Gerenuk said:
<snip>

Hmm, that's a kinda content-less statement :wink:

I'm not sure why you say that. You didn't seem to mind "mass is a measure of how much stuff there is". In both cases, a physical property is defined in terms of a mathematical object: a scalar quantity that also allows for ordering (T1 > T2, for example). The statement also allows for numerous other quantitative treatments: changes in temperature, for example. It really is the most fundamental statement possible.

Gerenuk said:
That would be the case for my definition of temperature, but I don't see a problem with that. It's only non-physical if you believe temperature should be the average energy and therefore positive :rolleyes: That's why that definition is general.
Actually for systems not in equilibrium there wouldn't be a temperature defined. But surely someone has generalized my definition of temperature so that it encompasses all distributions and converges to the normal definition for the Boltzmann distribution.

There's nothing inherently wrong with non-physical mathematical solutions; I am simply saying that the *physical* basis of physics must be primary to the *mathematical language* of physics. Otherwise, based on what criteria do we exclude solutions as non-physical?

To reiterate, AFAIK, there is no rational generalized definition of temperature that holds for all physical systems. One may define "effective" temperatures, but these are not things we can measure with a thermometer.

Here's a practical example: for all practical purposes, we exist at constant temperature and pressure. Yet we exist in a state far from equilibrium- equilibrium for us means we are dead and decomposed. How can this be reconciled, other than simply stating "well, I can take my temperature with a thermometer, so the temperature exists"? That's not a rational definition of temperature.

The same problem exists for simpler systems- sandpiles, a hard-sphere gas of bowling balls in zero-g conditions, etc. etc. Assigning a single, unique temperature to a hard-sphere gas is usually done in terms of the volume fractions (in order to correlate to phase transitions), but that does not correlate with the temperature of the (for example) bowling balls.
 
Andy Resnick said:
I'm not sure why you say that. You didn't seem to mind "mass is a measure of how much stuff there is".
Oh, that last statement is also useless in a way. If I imagine I want to measure temperature, what would I do, assuming all I know is how to measure "hotness"? It's just a shift in definition - just as useful as saying "because god wanted it so".

I heard of definitions similar to "let's define two reference systems with determined temperature", but I cannot recall exactly how they work.

Andy Resnick said:
There's nothing inherently wrong with non-physical mathematical solutions; I am simply saying that the *physical* basis of physics must be primary to the *mathematical language* of physics.
I'm not sure what you mean. Can you please explain what you mean by "non-physical"? But you may not refer to violated laws that follow only for the special case when temperature is proportional to energy; in that case the preconditions are of course not satisfied.

Andy Resnick said:
To reiterate, AFAIK, there is no rational generalized definition of temperature that holds for all physical systems. One may define "effective" temperatures, but these are not things we can measure with a thermometer.
I only know the temperature defined by basic statistical mechanics. To me it seems the only reason why it's not applicable to all systems is because someone invented a contradictory parameter and also called it "temperature".

Andy Resnick said:
Here's a practical example: for all practical purposes, we exist at constant temperature and pressure. Yet we exist in a state far from equilibrium- equilibrium for us means we are dead and decomposed.
Actually that's a very good point. One can think about how much "huge number statistics" one needs and how homogeneous a medium has to be to define temperature. I think for this one can go back to the derivation of entropy and examine what happens for a small system.

Andy Resnick said:
How can this be reconciled, other than simply stating "well, I can take my temperature with a thermometer, so the temperature exists"?
One just says that, as an approximation, the body is in a constant temperature state. There are deviations in the details. They are small for thermodynamical purposes, but essential for us.

Andy Resnick said:
The same problem exists for simpler systems- sandpiles, a hard-sphere gas of bowling balls in zero-g conditions, etc. etc. Assigning a single, unique temperature to a hard-sphere gas is usually done in terms of the volume fractions (in order to correlate to phase transitions), but that does not correlate with the temperature of the (for example) bowling balls.
I heard of these examples, but I don't know the details. Why do they call it temperature in the first place? What are the conditions to justify calling a parameter temperature?
 
There's a lot here... I'll do my best:

Gerenuk said:
Oh, that last statement is also useless in a way. If I imagine I want to measure temperature, what would I do, assuming all I know is how to measure "hotness"? It's just a shift in definition - just as useful as saying "because god wanted it so".

No, it's not the same thing as saying 'god says so'. I'm talking about the foundations of physical theory- in order to have a theory, one must first formally define objects and concepts. 'hotness', like 'quantity', is a primitive concept- and 'mass' has no meaning without 'quantity'. It may seem trivial and silly, but saying 'there are measurable properties of things that are positive real numbers' is an important concept, because positive real numbers are not physical objects, and there is no reason to assume that positive real numbers correlate with anything real.

Gerenuk said:
I heard of definitions similar to "let's define two reference systems with determined temperature", but I cannot recall exactly how they work.

You may be referring to 'the zeroth law', which is a way of defining temperature. The zeroth law is a definition of equilibrium, that's all.

Gerenuk said:
I'm not sure what you mean. Can you please explain what you mean by "non-physical"? But you may not refer to violated laws that follow only for the special case when temperature is proportional to energy; in that case the preconditions are of course not satisfied.

Ok- why do you agree that negative temperatures are non-physical? We can define negative energies, why not temperatures?

Gerenuk said:
I only know the temperature defined by basic statistical mechanics. To me it seems the only reason why it's not applicable to all systems is because someone invented a contradictory parameter and also called it "temperature".

Statistical mechanics is not the foundation of all of physics.

Gerenuk said:
Actually that's a very good point. One can think about how much "huge number statistics" one needs and how homogeneous a medium has to be to define temperature. I think for this one can go back to the derivation of entropy and examine what happens for a small system.

I think you missed my point. I am not in any way close to equilibrium, and neither are you. Yet we can both use a thermometer to measure our temperature. If temperature can only be defined for a body in equilibrium, how is it that we have a temperature of 98.6 F?

Gerenuk said:
One just says that, as an approximation, the body is in a constant temperature state. There are deviations in the details. They are small for thermodynamical purposes, but essential for us.

Our deviation from equilibrium is not small! We are, by one measure (the concentration of ATP relative to ADP), orders of magnitude away from equilibrium.

Gerenuk said:
I heard of these examples, but I don't know the details. Why do they call it temperature in the first place? What are the conditions to justify calling a parameter temperature?

Now *that's* a good question! I don't have an answer, other than if one writes dE/dS (or something like that), you get a parameter that acts like T.
 
Gerenuk said:
What are the conditions to justify calling a parameter temperature?

There are no generally accepted criteria. I actually know some people who work in temperature metrology, and not even they know. They basically stay away from situations where there is any ambiguity. Which, btw, is why the latest international temperature scale (ITS-90) is only defined down to 650 mK. There have been attempts to extend it to lower temperatures but they haven't been successful (you can buy sensors for lower temperatures that can be traced to NIST, but that is not an "official" calibration).
 
  • #10
Andy Resnick said:
There's a lot here... I'll do my best:
... in order to have a theory, one must first formally define objects and concepts. 'hotness', like 'quantity'
OK, so you have a block of wood. How would you measure temperature then? Don't forget you are not given a magic device that measures "hotness". Saying temperature is hotness is just a shift of definition. What is hotness then?

Andy Resnick said:
You may be referring the 'the zeroth law', which is a way of defining temperature. The zeroth law is a definition of equilibrium, that's all.
No, there are guys around who define temperature with reference systems, and for very general systems. A system could be anything, like a chess board with a cup of water on it. But I don't remember their very mathematically precise way. It probably has to do with ergodic theory or so.

Andy Resnick said:
Ok- why do you agree that negative temperatures are non-physical? We can define negative energies, why not temperatures?
I didn't write I don't agree with negative temperatures. I was asking you why you say that a temperature definition (probably mine) can be non-physical. Negative temperatures are in fact physical.

Andy Resnick said:
I think you missed my point. I am not in any way close to equilibrium, and neither are you. Yet we can both use a thermometer to measure our temperature. If temperature can only be defined for a body in equilibrium, how is it that we have a temperature of 98.6 F?
Strictly speaking we are not in equilibrium. But the material in the thermometer is, and so it can show you a temperature value. As the thermometer only interacts with the kinetic motion of our molecules, which themselves are roughly in equilibrium, it is in equilibrium with the motion of our molecules only.

Andy Resnick said:
Our deviation from equilibrium is not small! We are, by one measure (the concentration of ATP relative to ADP), orders of magnitude away from equilibrium.
For this you have a different temperature from mine. As an approximation, and to define the only reasonable temperature, I use the kinetic motion of molecules only. Everything else doesn't permit the definition of temperature anyway.

Andy Resnick said:
Now *that's* a good question! I don't have an answer, other than if one writes dE/dS (or something like that), you get a parameter that acts like T.
I thought about that, but the problem is that entropy S is even less defined than temperature.
 
  • #11
Gerenuk said:
OK, so you have a block of wood. How would you measure temperature then? Don't forget you are not given a magic device that measures "hotness". Saying temperature is hotness is just a shift of definition. What is hotness then?

Hopefully, you are starting to see that temperature cannot be measured unless you have a thermometer- which is defined as a device to measure some *physical property* of the system, and that satisfies certain properties, analogous to having a ruler or a clock- being able to compare different measurements, for example. Enunciating those properties is 'thermometry', and the foundations of thermometry are not completely understood as of now.


Gerenuk said:
No, there are guys around who define temperature with reference systems, and for very general systems. A system could be anything, like a chess board with a cup of water on it. But I don't remember their very mathematically precise way. It probably has to do with ergodic theory or so.

You keep saying this, but have not supplied a reference (I would like to read the article). In any case, ergodic theory does not cover glassy states, so I don't see how it can be a truly general definition.


Gerenuk said:
I didn't write I don't agree with negative temperatures. I was asking you why you say that a temperature definition (probably mine) can be non-physical. Negative temperatures are in fact physical.

Really? Kelvin would disagree with you.

Gerenuk said:
Strictly speaking we are not in equilibrium. But the material in the thermometer is, and so it can show you a temperature value. As the thermometer only interacts with the kinetic motion of our molecules, which themselves are roughly in equilibrium, it is in equilibrium with the motion of our molecules only.


For this you have a different temperature from mine. As an approximation, and to define the only reasonable temperature, I use the kinetic motion of molecules only. Everything else doesn't permit the definition of temperature anyway.


I thought about that, but the problem is that entropy S is even less defined than temperature.

Again, defining the temperature in terms of 'average kinetic energy' is too restrictive. It may be useful for elementary considerations, but it cannot constitute a foundation of thermometry.
 
  • #12
nomadreid said:
So, is the definition of "temperature proportional to energy density" flawed?

It's not flawed. It's just classical. It provides a good estimate for large systems. But it breaks down at extremely cold temperatures due to the uncertainty principle.

Temperature is a measure of kinetic energy, and energy is the "dual" quantity of time. If you freeze a particle to 0 K and measure the energy of a system exactly, then by the uncertainty principle, you have no idea when that measurement was valid.
 
  • #13
Gerenuk said:
the problem is that entropy S is even less defined than temperature.

In what way is entropy not well defined in statistical mechanics?

\Omega = number of microstates of a system which comprise a given macrostate (specified by total energy U, volume V, number of molecules N, for e.g. a gas).

Entropy S = k \log \Omega (the famous equation which is engraved on Boltzmann's gravestone).

Then define temperature via

\frac{1}{T} = {\left( \frac {\partial S}{\partial U} \right)}_{V,N}
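As a sketch of how that definition is used in practice (a hypothetical system of N two-level spins, chosen only for illustration): \Omega(U) is a binomial coefficient, S = k \log \Omega, and a finite difference gives 1/T = \partial S/\partial U. Note that T comes out negative once more than half the spins are excited - the population-inversion case mentioned earlier in the thread.

import numpy as np
from scipy.special import gammaln

k_B, N, eps = 1.0, 1000, 1.0   # N two-level systems, level spacing eps

def S(n):
    # S = k ln Omega, with Omega = C(N, n) microstates at energy U = n*eps
    return k_B * (gammaln(N + 1) - gammaln(n + 1) - gammaln(N - n + 1))

for n in (100, 400, 600, 900):
    inv_T = (S(n + 1) - S(n - 1)) / (2 * eps)   # central difference dS/dU
    print(f"n_excited = {n:4d}  ->  T = {1.0 / inv_T:9.2f}")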
 
  • #14
The problem is that it is very hard to see how one would use these equations when dealing with e.g. the cooling of a single mode of a resonator.
 
  • #15
Right- another good example is the electromagnetic field. It's possible to define a temperature for a single configuration of the field- black body radiation. Any deviation from that, such as passing the light through a filter that removes only a narrow region of frequencies, results in non-thermal light that cannot be assigned a temperature.
 
  • #16
Andy Resnick said:
Hopefully, you are starting to see that temperature cannot be measured unless you have a thermometer
Hmm, that's again a shift in definition only, and not getting to the point. You still haven't specified how you want to measure temperature. Instead you rely on other people providing you a device called a thermometer.
To illustrate what I mean by giving a specific measurement procedure, here is how I would measure the temperature defined by my statmech equation above:
I observe the system and measure its total energy at different times. From this energy data I plot a histogram of the energy distribution and fit it to an exponential law. The exponent of the exponential law gives me the temperature. If it doesn't fit an exponential law, then the system is not in equilibrium and doesn't have a temperature.
This method is general enough to include the statmech and thermodynamics concepts of temperature.
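As an illustration of that procedure (with synthetic data standing in for the repeated energy measurements): histogram the samples and fit a straight line to the log-counts; the slope is -1/k_B T.

import numpy as np

rng = np.random.default_rng(0)
k_B, T_true = 1.0, 2.0

# Stand-in for repeated measurements of the system's total energy.
E = rng.exponential(scale=k_B * T_true, size=100_000)

counts, edges = np.histogram(E, bins=50, range=(0.0, 10.0))
centers = 0.5 * (edges[:-1] + edges[1:])
ok = counts > 0                      # avoid log(0) in empty bins

slope, _ = np.polyfit(centers[ok], np.log(counts[ok]), 1)
print(f"fitted T = {-1.0 / (k_B * slope):.3f} (true {T_true})")

If the log-counts refuse to lie on a straight line, the fit itself is the diagnosis: by this definition the system has no temperature.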

Andy Resnick said:
You keep saying this, but have not supplied a reference (I would like to read the article). In any case, ergodic theory does not cover glassy states, so I don't see how it can be a truly general definition.
I keep saying that I do not recall how they did it. It was a lecture where I quickly noticed that it was too mathematical and abstract for me. But some part of the talk was based on
http://arxiv.org/abs/math-ph/0003028
That's sort of their method, and temperature was defined similarly. Maybe one can find a paper by searching for these authors.

Andy Resnick said:
I was asking you why you say that a temperature definition (probably mine) can be non-physical. Negative temperatures are in fact physical.
Really? Kelvin would disagree with you.
It doesn't matter if Kelvin disagrees. He probably used temperature for the thermodynamics of ideal gases only.
But anyway, please finally post your own opinion. Do you find negative temperatures unphysical, and if so, why?

Andy Resnick said:
Again, defining the temperature in terms of 'average kinetic energy' is too restrictive.
That's not what I wrote. My general definition is the statmech one. For the special case of the interaction between an ideal gas and a human body, only the kinetics of the molecules plays a role.

jtbell said:
In what way is entropy not well defined in statistical mechanics?
I wasn't clear enough. I meant: if you want to measure the temperature of a piece of wood, referring to theoretical equations about entropy only makes the task harder. Just really try to imagine which step-by-step instructions you would follow to measure temperature. What would they be? What is a microstate? How do you count them?
But remember that some equations you might know apply only to an ideal gas and not to a block of wood.
 
  • #17
Temperature is what a thermometer measures. Trying to define temperature is like trying to define length. For that matter how do you define "cup" or "spinach." Ultimately you end up pointing to things that are cups, things that aren't cups, and as long as we agree on what is a cup then we are good.

Also trying to fit things to a Boltzmann distribution won't work. For most systems you end up with quantum exchange effects and chemical potentials. Also systems that are not in equilibrium have well defined temperatures. Also if you define temperature in terms of energy distributions, then you have the not insignificant problem of trying to define "energy".

The other thing is this: suppose I give you a system that doesn't follow Boltzmann's equations, but gives you a well defined temperature when I stick a thermometer in it. Then I just toss Boltzmann's equations because they are wrong.
 
  • #18
Gerenuk said:
<snip>
But anyway, please finally post your own opinion. <snip>

This is a science discussion; my opinion is irrelevant.

Thanks for the reference.
 
  • #19
twofish-quant said:
Temperature is what a thermometer measures.
OK, so what is a thermometer then? You have to start going down to lower concepts like measuring energy or time at some point.

I reiterate my question: Which (hypothetical) procedure would you perform to measure temperature? You are sitting in a lab, but no-one has left a thermometer, so you have to build one. A simple gas thermometer won't be general enough though - at least my Boltzmann procedure can deal with more general cases.

twofish-quant said:
Trying to define temperature is like trying to define length.
Length is defined by the speed of light and a certain duration of a physical process. These are preconstructed by nature. Thermometers do not come from nature.
In fact eventually all explanations and devices probably should end up using the observables length and time only.

twofish-quant said:
Also trying to fit things to a Boltzmann distribution won't work. For most systems you end up with quantum exchange effects and chemical potentials.
The statmech book says that chemical potentials are a direct consequence of the Boltzmann distribution if you apply it correctly.

twofish-quant said:
Also systems that are not in equilibrium have well defined temperatures.
I have not studied that topic yet. I suppose there exists a temperature that coincides with my definition for equilibrium cases, but generalizes to non-equilibrium also. I hope I learn that at some point.

twofish-quant said:
The other thing is this: suppose I give you a system that doesn't follow Boltzmann's equations, but gives you a well defined temperature when I stick a thermometer in it. Then I just toss Boltzmann's equations because they are wrong.
They are not wrong. Your thermometer follows the Boltzmann equations, and keep in mind that the temperature reading refers to the temperature of your thermometer, not directly to the body you are probing!
Due to interactions with a non-equilibrium system, the thermometer will acquire a certain equilibrium state for itself. In fact, knowing the physical laws and applying the Boltzmann equation to the liquid in the thermometer will predict the correct temperature.

OK, so what would you do? You did indeed make fair points about where the Boltzmann definition might fail, but what is a better suggestion? You have to suggest something better that explains at least as much as the "Boltzmann temperature".

The "Boltzmann temperature" explains all of thermodynamics and all of undergrad statmech.
 
  • #20
Gerenuk said:
I reiterate my question: Which (hypothetical) procedure would you perform to measure temperature? You are sitting in a lab, but no-one has left a thermometer, so you have to build one. A simple gas thermometer won't be general enough though - at least my Boltzmann procedure can deal with more general cases.

But again, there IS no "general" definition of temperature.
Most of the fixed points on the international temperature scale (ITS-90) are based on triple points, although the lowest points use the melting curve of He-3. Hence, these points are "classical".

However, no one - including the people who manage the ITS (I know some of them) - claims that this is more than a practical scale. The reason why it hasn't been extended to lower temperatures is because the concept of temperature is so ill defined.

I use a nuclear orientation (NO) thermometer in my lab to measure temperatures between 15 mK and 200 mK. Some of the equipment (including the Co-60 source) I am using is actually "leftovers" from a project that aimed to extend the ITS to lower temperatures using NO thermometers. However, they never succeeded, mainly because the temperature that is measured by NO (essentially the phonon temperature of the Co) is not necessarily the temperature relevant in experiments (usually the electronic temperature, which can be hundreds of mK higher if the e-p scattering times are long or the system is noisy).
Hence, extending the ITS using this method wouldn't actually be of much use.

The fact that there are several relevant "temperatures" when working below 1 K is something a lot of people do not appreciate; it is definitely something I've had to point out many times when writing referee reports. It is also a very common error in published papers.
 
  • #21
f95toli said:
But again, there IS no "general" definition of temperature.
I was asking for a temperature that includes statmech and ideal gases and also extends as far as possible. So basically the most general temperature possible to define - i.e. the best one can do.
I was also asking which other parameters exist that claim to be called temperature. Also I asked for criteria for a parameter to be called temperature. So basically I was asking why conflicting definitions have the right to be called temperature. They might be... but what physical equation or concept is the reason?
Of course I could call the height of water in a glass "temperature". It would even have some of its properties. But why would I do that?

f95toli said:
Most of the fixed points on the international temperature scale (ITS-90) are based on triple points, although the lowest points use the melting curve of He-3. [...]
The reason why it hasn't been extended to lower temperatures is because the concept of temperature is so ill defined.
Try to forget all equipment you have and all the tables you are given! Imagine you are the first scientist to construct a thermometer. How would you gauge it?
I assume one makes use of common statmech (i.e. the Boltzmann distribution and temperature as I stated) and hopes that the physical model of the microscopic process used for the statmech equation is correct (just as people do for gases). Only this way can you relate a macroscopic quantity to something which is called temperature and derives from the Boltzmann distribution.

f95toli said:
is not necessarily the temperature relevant in experiments (usually the electronic temperature, which can be hundreds of mK higher if the e-p scattering times are long or the system is noisy).
But that's a completely different source of trouble. Of course if the thermometer doesn't interact in the right way with the electronic system, then these two energy systems won't equilibrate and will have different temperatures.

f95toli said:
The fact that there are several relevant "temperatures" when working below 1 K is something a lot of people do not appreciate; it is definitely something I've had to point out many times when writing referee reports.
All these methods rely on assumptions about the statmech equations of the thermometer. I mean again at some point someone must have gauged the macroscopic parameter to something that obeys the laws for temperature.
These can only be guessed within an approximation, so essentially all these temperatures might be wrong. Luckily a low-density gas behaves quite like an ideal gas, for which it is easy to measure the temperature.

I think temperature is a cardinal scale, so you cannot arbitrarily scale/transform it, without disturbing some equations?
 
  • #22
Gerenuk said:
I was also asking which other parameters exist that claim to be called temperature. Also I asked for criteria for a parameter to be called temperature. So basically I was asking why conflicting definitions have the right to be called temperature. They might be... but what physical equation or concept is the reason?

Temperature is abstractly defined as the property of two connected systems that is constant when there is no net energy flow between them. From the statistical mechanical viewpoint: if I have two systems connected but prevented from exchanging energy, and I then release the constraint that they can't exchange energy, then the two systems will eventually evolve to configurations where the net energy in both of them doesn't change in time, which defines our "equilibrium". The two systems need not have the same energy, but it turns out that

\left(\frac{\partial S_1}{\partial E_1}\right) = \left(\frac{\partial S_2}{\partial E_2}\right)

Whatever this is, it's equal between the two systems. We choose to call this thing 1/T, where we call T "Temperature":

\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)

So, in any system where there is a property that we can measure that somehow behaves like energy (it doesn't actually have to be an energy - it just has to behave somehow like one mathematically), the net flow of which between two systems doesn't change in time defines an equilibrium condition analogous to thermal equilibrium in, e.g., ideal gases. We may then define a temperature-like variable \theta by

\frac{1}{\theta} = \left(\frac{\partial S}{\partial \mathcal E}\right),
where \mathcal E is the energy-like variable. S is the entropy, of course, defined by

S = k\ln \Omega.
Here, k is NOT necessarily Boltzmann's constant. In thermodynamics the value of Boltzmann's constant isn't really fundamental - it just fixes the units we measure temperature or energy in. Aside from this constant, the number of microstates of the system is, in principle, a well defined number that doesn't really care about whether or not the system we're studying is a physical thermodynamic system. For example, I can calculate the entropy of a deck of cards: the number of possible sequences of a standard deck of cards is \Omega = 52!, so S = \log_2 (52!), where I chose k such that entropy is measured in bits. It's not a thermodynamic entropy, it's an "information" entropy. One could argue that statistical mechanics is simply information theory applied to systems which exchange energy, particles, etc.
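That card-deck number is quick to check, using lgamma to sidestep the enormous factorial:

import math

S_bits = math.lgamma(53) / math.log(2)  # log2(52!) = ln(52!) / ln(2)
print(f"S = {S_bits:.1f} bits")         # about 225.6 bits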

So, this is in general our theoretical definition of temperature. In systems where we can define a temperature-like variable we expect it to play a role analogous to temperature in a physical system. So, while \theta might not be a physical temperature, it might still be the variable we tune to induce a "phase transition" in our system of interest.

Gerenuk said:
I was asking for a temperature that includes statmech and ideal gases and also extends as far as possible. So basically the most general temperature possible to define - i.e. the best one can do.

Rather than find the definition of temperature that covers the most cases, it is probably best to pick several definitions which overlap in certain regimes. The statistical mechanical definition is very nice, but it is perhaps not always practical. So, for practical purposes what we can do is find a system in which we can define some other definition of temperature, "A", that coincides with our stat-mech definition in some regime, and if we then want to measure temperature in a regime where our stat-mech definition is cumbersome but our other definition still works, we can specify yet another definition of temperature, "B", that coincides with our definition A in this region where our stat mech definition is impractical.

For example, in the classical limit, the stat mech definition coincides with the definition of temperature defined by relating "thermal energy" to the average kinetic energy of the ideal gas. If we then consider a block of wood, for which we might never hope to calculate the temperature from the stat mech definition, we can measure it with our "average kinetic energy definition". We might then be able to find another quantity which matches the "average kinetic energy definition" in a regime where "average kinetic energy definition" is no longer equal to the stat mech definition, and take that as another definition of temperature, which might still work in a regime where the "average kinetic energy definition" fails. (This is conceptually like analytic continuation of complex functions, I suppose.)

Gerenuk said:
I think temperature is a cardinal scale, so you cannot arbitrarily scale/transform it, without disturbing some equations?

You can certainly scale it. All that really amounts to is changing the units I chose to measure temperature in. I can set k_B = 1 if I so desire; all that results in is me measuring temperature in joules instead of Kelvin. There's no fundamental difference.

Andy Resnick said:
Really? Kelvin would disagree with you.

Negative temperature is certainly a physical thing, if you interpret it properly (and, depending on how you are defining "temperature" in that statement!). Via wikipedia, "a system with a truly negative temperature is not colder than absolute zero; in fact, temperatures colder than absolute zero are impossible. Rather, a system with a negative temperature is hotter than any system with a positive temperature (in the sense that if a negative-temperature system and a positive-temperature system come in contact, heat will flow from the negative- to the positive-temperature system)."

Negative temperatures as defined in the "average kinetic energy" sense are impossible. As you yourself say, it is too restrictive. From the stat mech definition, however, negative temperatures are allowed, as you seem to be aware. So, it's certainly a physical thing from the stat mech viewpoint. (the wikipedia article: http://en.wikipedia.org/wiki/Negative_temperature)
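A toy calculation makes the sign flip explicit (assuming Boltzmann occupation of two levels separated by \Delta E, purely for illustration): inverting n_e/n_g = e^{-\Delta E/k_B T} for T gives a negative value exactly when the upper level is the more populated one.

import math

def boltzmann_T(n_ground, n_excited, delta_E, k_B=1.0):
    # invert n_e/n_g = exp(-delta_E / (k_B T)) for T
    return -delta_E / (k_B * math.log(n_excited / n_ground))

print(boltzmann_T(70, 30, 1.0))  # normal population    ->  T > 0
print(boltzmann_T(30, 70, 1.0))  # population inversion ->  T < 0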

Andy Resnick said:
I think you missed my point. I am not in any way close to equilibrium, and neither are you. Yet we can both use a thermometer to measure our temperature. If temperature can only be defined for a body in equilibrium, how is it that we have a temperature of 98.6 F?

Not in any way close to equilibrium with respect to what? Equilibrium is not a property of a system on its own, it's a property of a system with respect to another system. In the case relevant to defining a body temperature, that system is the environment, and we are most certainly in an energetic equilibrium with that: the amount of energy we are radiating away must be equal to the amount of energy we're absorbing from the environment - that is, the net flux of energy between our bodies and the environment is constant. (Of course, there are times when we are not in such an equilibrium, but there are periods of time where things are in a steady state and this applies). You might argue that temperature is still ill-defined because energy isn't constant, but we're in a steady state, and there is a generalization to steady state processes. In this case, "temperature" is defined by

\frac{1}{T} = \left(\frac{\partial s}{\partial u}\right),
where u is the energy density and s is the entropy density. See the wikipedia article on the Onsager reciprocal relations for more info: http://en.wikipedia.org/wiki/Onsager_reciprocal_relations
 
  • #23
Mute said:
<snip>

Negative temperature is certainly a physical thing, if you interpret it properly (and, depending on how you are defining "temperature" in that statement!).
<snip>
From the stat mech definition, however, negative temperatures are allowed, as you seem to be aware. So, it's certainly a physical thing from the stat mech viewpoint.
<snip>
I would claim you are confusing the object being studied with the model. Statistical mechanics is a model of some physical phenomena, it is not the phenomena itself.

Lots of equations used in physics have solutions that do not correspond to physical reality- we take the principal square root of the kinetic energy when calculating the velocity, because negative speeds are nonphysical.

Mute said:
Not in any way close to equilibrium with respect to what? <snip>
With respect to the chemical reactions that serve to keep us alive rather than dead. The ratio [ATP][Pi]/[ADP] is 10^8 times higher than equilibrium conditions (ATP = adenosine triphosphate, ADP = adenosine diphosphate, Pi = phosphate), and this excess free energy is how we derive useful work from hydrolysis of ATP.

Mute said:
\frac{1}{T} = \left(\frac{\partial s}{\partial u}\right),
where u is the energy density and s is the entropy density. See the wikipedia article on the Onsager reciprocal relations for more info: http://en.wikipedia.org/wiki/Onsager_reciprocal_relations

I used to think the Onsager relations were interesting. I read a nice rebuttal by Truesdell, and now I see they are a simple linearization of irreversible thermodynamics, and so do not apply to systems of interest (to me). 10^8 is much larger than 1.
 
  • #24
Andy Resnick said:
I would claim you are confusing the object being studied with the model. Statistical mechanics is a model of some physical phenomena, it is not the phenomena itself.

The point of the models is to be able to make physical predictions (or at least gain a qualitative understanding of the phenomenon). A negative temperature is a physical prediction if you interpret it properly. Whether or not any systems with negative temperatures are known is another matter - for one thing it relates back to the discussion on how to measure temperature: we certainly couldn't measure a negative absolute temperature with an ideal gas thermometer, but that's a problem with our thermometer, not necessarily our notion of temperature. A negative temperature has a well defined interpretation in the model; solutions in problems that give negative velocities typically do not.

Andy Resnick said:
With respect to the chemical reactions that serve to keep us alive rather than dead. The ratio [ATP][Pi]/[ADP] is 10^8 times higher than equilibrium conditions (ATP = adenosine triphosphate, ADP = adenosine diphosphate, Pi = phosphate), and this excess free energy is how we derive useful work from hydrolysis of ATP.

Okay, but that's irrelevant to defining a body temperature. We define the body temperature with respect to an energetic equilibrium between our bodies (as a whole system) and the environment, and such an equilibrium exists. We don't need to worry about the ratio [ATP][Pi]/[ADP] in defining such a temperature.

Andy Resnick said:
I used to think the Onsager relations were interesting. I read a nice rebuttal by Truesdell, and now I see they are a simple linearization of irreversible thermodynamics, and so do not apply to systems of interest (to me). 10^8 is much larger than 1.

Sure, they apply to steady state systems where a notion of local equilibrium is definable. As far as defining a body temperature goes, that should be sufficient. They certainly are not sufficient for systems far from equilibrium.
 
  • #25
Mute said:
Temperature is abstractly defined as the property of two connected systems that is constant when there is no net energy flow between them.
So my thought was: Would the transformed quantity T^*=T^2 also be a valid temperature or are there more constraints for something to be called temperature?

Mute said:
\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)
What bothered me a long time ago, when I didn't know about statmech: if you are only able to measure macroscopic observables, then how can you ever deduce entropy without knowing about temperature first? That also means you cannot use this definition to define temperature?

Mute said:
Rather than find the definition of temperature that covers the most cases, it is probably best to pick several definitions which overlap in certain regimes. The statistical mechanical definition is very nice, but it is perhaps not always practical.
[...]
If we then consider a block of wood, for which we might never hope to calculate the temperature from the stat mech definition, we can measure it with our "average kinetic energy definition".
[...]
in a regime where "average kinetic energy definition" is no longer equal to the stat mech definition
I believe if you are willing to assume approximations for the system, then the statmech temperature immediately includes all "thermal average energy" definitions. The thermal average definition follows from g(E)\propto E^\alpha, which you probably could approximate for the relevant energy system in a block of wood.

So basically the statmech temperature will never contradict the thermodynamical average one. And if you want to use the latter, just claim that the correct precondition for statmech is fulfilled.
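For the record, the calculation behind the g(E)\propto E^\alpha claim is a pair of standard Gamma-function integrals:

\langle E\rangle = \frac{\int_0^\infty E\, E^{\alpha} e^{-E/k_B T}\, dE}{\int_0^\infty E^{\alpha} e^{-E/k_B T}\, dE} = k_B T\,\frac{\Gamma(\alpha+2)}{\Gamma(\alpha+1)} = (\alpha+1)\, k_B T,

so the mean energy is proportional to T exactly when the density of states is a power law.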
 
  • #26
Mute said:
The point of the models is to be able to make physical predictions (or at least gain a qualitative understanding of the phenomenon). A negative temperature is a physical prediction if you interpret it properly. Whether or not any systems with negative temperatures are known is another matter - for one thing it relates back to the discussion on how to measure temperature: we certainly couldn't measure a negative absolute temperature with an ideal gas thermometer, but that's a problem with our thermometer, not necessarily our notion of temperature. A negative temperature has a well defined interpretation in the model; solutions in problems that give negative velocities typically do not.

This is a false argument, because it requires an individual to "interpret" a derived result "in a proper way". Proper according to whom? Nature cares not a whit how you interpret your model, or what model you invent. I recommend reading about what absolute temperature is, and how an absolute temperature scale came to be: the Joule-Thomson experiment. 0 K is not an arbitrary thing any more than c_0 is. Or would you claim measurements of v > c_0 are physical results?

Mute said:
Okay, but that's irrelevant to defining a body temperature. We define the body temperature with respect to an energetic equilibrium between our bodies (as a whole system) and the environment, and such an equilibrium exists. We don't need to worry about the ratio [ATP][Pi]/[ADP] in defining such a temperature.

No! It's of direct relevance, if your model implies 'temperature' applies only to a system *in equilibrium*- which it does, using statistical mechanics. Our living bodies are not in "energetic equilibrium" (or any other kind of equilibrium) with the environment. I am simply pointing out an obvious paradox that (as of now) does not have a resolution. Far from simply throwing up our hands and saying "well, that's the best we can do", or saying "well, our model requires us to ignore this or that", we should be trying to resolve the paradox by developing a better model.

Mute said:
Sure, they apply to steady state systems where a notion of local equilibrium is definable. As far as defining a body temperature goes, that should be sufficient. They certainly are not sufficient for systems far from equilibrium.

That's fine, but again, I am interested in systems that are far from equilibrium- living systems. Thus, I require (or I need to develop) a model that works better than the models I currently have. Frankly, I'm not smart enough to develop such a model. So I am reduced to hoping someone else publishes an idea that I can glom on to.
 
  • #27
Andy Resnick said:
I recommend reading about what absolute temperature is, and how an absolute temperature scale came to be: the Joule-Thomson experiment. 0 K is not an arbitrary thing any more than c_0 is.
All you are writing assumes that temperature is defined as the average kinetic energy of ideal gas molecules. Taking temperature as the statmech definition allows for a more general application, with negative temperature for example.

Andy Resnick said:
That's fine, but again, I am interested in systems that are far from equilibrium- living systems. Thus, I require (or I need to develop) a model that works better than the models I currently have.
I wouldn't say "better". I'd say "completely different". Because statmech (at least equilibrium statmech as I know it) is a very trivial statement: given that there are outcomes A or B or B, we should sensibly assume the outcome will be B. So statmech doesn't say anything about the system. It only identifies the most likely outcome, assuming the system is a completely random mess.
 
  • #28
Gerenuk said:
All you are writing assumes that temperature is defined as the average kinetic energy of ideal gas molecules. Taking temperature as the statmech definition allows for a more general application, with negative temperature for example.


I wouldn't say "better". I'd say "completely different". Because statmech (at least equilibrium statmech as I know it) is a very trivial statement: given that there are outcomes A or B or B, we should sensibly assume the outcome will be B. So statmech doesn't say anything about the system. It only identifies the most likely outcome, assuming the system is a completely random mess.

I don't have time to write out a complete response, but here's a start:

The Joule-Thomson experiment had nothing to do with 'kinetic energies' or even 'ideal gases'. In fact, I claim that those results are currently outside any physical model you choose- that is, they cannot be predicted or constructed based on *any* current theory or model system, because the data represent a postulate upon which the theory is based (like c_0 in special relativity)

My evidence: First, the Joule-Thomson experiment measured the specific heat at constant pressure for various gases.

http://zapatopi.net/kelvin/papers/on_an_absolute_thermometric_scale.html

That is, what was measured (with a thermometer and manometer) was \left(\frac{\partial T}{\partial P}\right)_{h} (the Joule-Thomson coefficient). I'm not an expert in statmech, but my understanding is that the partition function can only be evaluated (and from that, the free energies and temperature, and from those, the specific heat) under equilibrium conditions. In fact, any statmech result only strictly applies to equilibrium: we may invoke 'local equilibrium', or transform a steady-state problem into an equilibrium problem (Onsager relations), but statmech can only be analytically written in terms of equilibrium conditions. So, if we (analytically) analyze the Joule-Thomson experiment using statistical mechanics, we are tacitly building in the assumption that experimentally, there were equilibrium conditions (or that we can somehow apply the concept of equilibrium to those experimental conditions). If you think that is incorrect, please say so.
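As an aside on the quantity itself, here is a minimal numerical sketch of the Joule-Thomson coefficient in the van der Waals approximation (\mu_{JT} \approx (2a/RT - b)/C_p; the constants below are rough literature values for nitrogen, used only for illustration):

R = 8.314                # J/(mol K)
a, b = 0.137, 3.87e-5    # van der Waals constants for N2: J m^3/mol^2, m^3/mol
C_p = 29.1               # J/(mol K), roughly 7R/2 for N2

def mu_JT(T):
    # approximate (dT/dP) at constant enthalpy for a van der Waals gas
    return (2.0 * a / (R * T) - b) / C_p

T_inv = 2.0 * a / (R * b)                         # inversion temperature, ~850 K here
print(f"inversion temperature ~ {T_inv:.0f} K")
print(f"mu_JT(300 K) = {mu_JT(300.0):.2e} K/Pa")  # positive: gas cools on expansion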

Now for this business about 'what is a thermometer': everyone knows that any physical apparatus has limited measurement capability- one cannot use a single ruler to measure both the diameter of an atom and the distance to the moon, for example. So it is with thermometers: one cannot use a single thermometer to measure both the temperature of liquid helium and the temperature of the sun. Kelvin's (by that time he was Lord Kelvin) goal was to establish a *universal scale* by which different thermometers could be compared, the same way we can compare different rulers (wavelength of light and a meter stick, for example) and thus establish the field of 'thermometry'.

Question: How can an "absolute" temperature scale be constructed from measurements of specific heat?

One way of looking at the Joule-Thomson experiment is that they were measuring the "energy content" of a single degree change in a material at various temperatures. What temperatures could they use, since they were trying to calibrate the thermometers? Simple- the ice point, the boiling point, and the triple point of water. Those are experimentally accessible, well-defined thermodynamic states (note: the statmech picture of these states is decidedly *not* simple, since there is a phase change).

So their data was, on one hand, a simple measurement of material properties: C_p(T) for water, for example. On the other, it was a material-independent measurement of the energy content of a degree interval at various temperatures, thus defining a degree in terms of a change in energy.

Proof of that last statement will have to wait, as I'm out of time for now. Are you with me so far? I'm willing to continue if you are interested...
 
  • #29
Gerenuk said:
OK, so what is a thermometer then?

It's this device that I hold with a bit of mercury inside a glass vial.

Gerenuk said:
You have to start going down to lower concepts like measuring energy or time at some point.

No I don't. I stick this device in an oven and read out the number. That's temperature.

Gerenuk said:
The statmech book says that chemical potentials are a direct consequence of the Boltzmann distribution if you apply it correctly.

They aren't. In any case, it doesn't matter. I stick the thermometer into something, I read out the number. That's temperature. If I can come up with some nice theory about how that thermometer behaves that's nice.

Gerenuk said:
Your thermometer follows the Boltzmann equations, and keep in mind that the temperature reading refers to the temperature of your thermometer, not directly to the body you are probing!

I don't know beforehand that the thermometer does follow the Boltzmann equations, and I can think of lots of situations where the number that I read off the thermometer gives me wildly non-Boltzmann distributions. Fermion gases, for example.

Gerenuk said:
OK, so what would you do?

Do experiments. Come up with a theory. See if the theory matches observations. Reject the theory if it doesn't. The problem with coming up with definitions of things that aren't based on observation is that you are lost if your theory is wrong.

Gerenuk said:
You have to suggest something better that explains at least as much as the "Boltzmann temperature".

I'm suggesting that you build this device and read off numbers. The fact that I may have no idea what causes those numbers to behave in the way that they do is sort of irrelevant. Theory comes later.

The "Boltzmann temperature" explains all of thermodynamics and all of undergrad statmech.

The problem is that it *doesn't*. If you measure energy distributions of boson or fermion gases, you will get nothing near Boltzmann distributions. But that doesn't matter. You can stick an instrument into a fermion gas and get numbers out.
 
  • #30
Gerenuk said:
Try to forget all equipment you have and all the tables you are given! Imagine you are the first scientist to construct a thermometer. How would you gauge it?
I assume one makes use of common statmech (i.e. the Boltzmann distribution and temperature as I stated) and hopes that the physical model of the microscopic process used for the statmech equation is correct (just as people do for gases). Only this way can you relate a macroscopic quantity to something which is called temperature and derives from the Boltzmann distribution.

Except that if you were the first scientist to construct a thermometer, you'd probably have no way of knowing anything at all about microscopic processes. All you know is that when you stick this device in this cup of water, you get this result. From your observations, you have no evidence that atoms exist. People in 1500 BC knew what temperature was. No clue about Boltzmann relations or even atoms.

I think you are putting the theoretical cart before the observational horse.

Gerenuk said:
All these methods rely on assumptions about the statmech equations of the thermometer.

No they don't. I know lots of people who know how to measure temperature and have no clue about statistical mechanics. I bake a pizza, I set the oven to 350 F. No assumptions or even knowledge of statistical mechanics.

Gerenuk said:
I mean again at some point someone must have gauged the macroscopic parameter to something that obeys the laws for temperature.

If you insist. I put the thermometer in melting ice water and mark off a "0", and then put it in boiling water and mark off a "100". Happy?
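That two-fixed-point procedure is nothing but a linear calibration; as a sketch (the raw readings below are made up - think of them as a mercury column length in mm):

def calibrate(r_ice, r_boil):
    # Map a raw reading to degrees: 0 at the ice point, 100 at the boiling point.
    return lambda r: 100.0 * (r - r_ice) / (r_boil - r_ice)

to_deg = calibrate(r_ice=12.0, r_boil=187.0)
print(to_deg(99.5))   # -> 50.0 on this scale

Whether this scale agrees with a different kind of thermometer anywhere between the two fixed points is, of course, exactly the thermometry question being argued in this thread.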
 
  • #31
@Andy:
That's an interesting proposal. At first glance I'm not exactly sure what you are trying to imply, but I'll try to read your link carefully, read up on Joule-Kelvin, and think about it. I'll answer soon :smile:
twofish-quant said:
The statmech book says that chemical potentials are a direct consequence of the Boltzmann distribution if you apply it correctly.
They aren't. In any case, it doesn't matter.
Now that is very non-scientific, to say something isn't true when you apparently don't know the theory. The Fermi and Bose distributions for the particle energies, with their chemical potential, are a direct consequence of applying the pure Boltzmann distribution to the total energy of the system. Read up on the derivation.
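For reference, the derivation being alluded to is the standard grand-canonical one: weight each occupation n of a single-particle level \epsilon by the Gibbs factor e^{-n(\epsilon-\mu)/k_B T} and average,

\langle n\rangle = \frac{\sum_n n\, e^{-n(\epsilon-\mu)/k_B T}}{\sum_n e^{-n(\epsilon-\mu)/k_B T}} = \frac{1}{e^{(\epsilon-\mu)/k_B T} \pm 1},

with n \in \{0,1\} for fermions (upper sign) and n \in \{0,1,2,\dots\} for bosons (lower sign, summing the geometric series).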

twofish-quant said:
I stick the thermometer into something, I read out the number. That's temperature. If I can come up with some nice theory about how that thermometer behaves that's nice.
[...]
Do experiments. Come up with a theory. See if the theory matches observations. Reject theory if it doesn't. This is the problem with coming up with definitions of things that aren't based on observation is that you are lost if your theory is wrong.
[...]
I bake a pizza, I set the oven to 350 F. No assumptions or even knowledge of statistical mechanics.
The point is that you are lucky that there are many scientists around who whispered to you some laws about entropy and heat flow and all the other results of statistical mechanics. You yourself can derive these results only more or less indirectly from a few experiments. But for every new theoretical claim (e.g. about magnetisation or other quantities) you will have to do a lot of experiments again to check it, whereas for scientists with more in-depth knowledge it will be a 5-minute university exercise in statmech.
And the whole point of the discussion here is that temperature can be successfully generalized to all sorts of systems, like the deck of cards mentioned earlier. Your definition of a thermometer applies only to a very restricted range, and with it you cannot prove the laws of thermodynamics (you will find many exceptions to the usual laws, and only more detailed knowledge can identify the cause of the non-applicability). Science has moved on within the last 100 years, and that's why we look at the more powerful ideas about temperature.
According to your statements you might also say "I take a wheel and it rolls. I don't need to know anything about velocity or angular velocity." You see my point?
 
Last edited:
  • #32
Gerenuk said:
I was asking for a temperature that includes statmech and ideal gases and also extends to as much as possible. So basically the most general temperature possible to define - i.e. the best one can do.

The most "general" definition I can think of is that temperature is a parameter the describes the width of a distribution that can somehow be related to the energy of the system.


I was also asking which other parameters exist that claim to be called temperature. I also asked for criteria a parameter must meet to be called temperature. So basically I was asking why conflicting definitions have the right to be called temperature. They might well be... but what physical equation or concept is the reason?

I don't think equations have anything to do with it. E.g. people who work with single electronics often measure just about everything in kelvin: bath temperature, gap energies, photon energies, etc. This is mostly for historical reasons (the first devices that were made were all superconducting, meaning most parameters could be related to Tc); nine times out of ten you might as well use, for example, eV. When I analyze data I usually measure temperature in GHz.
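For readers puzzled by "temperature in GHz": that is just the equivalence k_B T = h f, using constants everyone agrees on. A small conversion sketch:

[code]
K_B = 1.380649e-23    # Boltzmann constant, J/K (exact in SI since 2019)
H = 6.62607015e-34    # Planck constant, J*s (exact in SI since 2019)

def kelvin_to_ghz(T):
    # frequency equivalent of a temperature via k_B * T = h * f
    return K_B * T / H / 1e9

print(kelvin_to_ghz(1.0))   # ~20.8 GHz per kelvin
print(kelvin_to_ghz(0.02))  # a 20 mK stage is ~0.42 GHz
[/code]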


Try to forget all equipment you have and all the tables you are given! Imagine you are the first scientist to construct a thermometer. How would you gauge it?


It depends on the system. The nuclear orientation thermometer I mentioned above is a primary thermometer for phonons (you don't need to calibrate it; you can get T by measuring the anisotropy of the gamma radiation), but it doesn't tell you much about the electronic temperature. If it were an electronic device I would probably use noise thermometry, or measure e.g. the switching current of a Josephson junction. For my dilution fridge I can get a fair idea of the temperature by measuring pressure; this is the most "conventional" measurement.
To get the temperature of a mode of a mechanical resonator, one can simply measure the Q value of the resonance.


I assume I'd make use of common statmech (i.e. the Boltzmann distribution and temperature as I stated) and hope that the physical model of the microscopic process used in the statmech equations is correct (just as people do for gases). Only in this way can you relate a macroscopic quantity to something which is called temperature and derives from the Boltzmann distribution.

Not at all. That procedure won't work very well for a single vibrational mode. Moreover, whereas switching-current measurements are "statistical", each event is unique and corresponds to the escape of the phase of the whole junction, so it is not really "microscopic" in the usual sense. The escape rate depends only on the amount of "noise" the junction sees, and this will equal the phonon temperature only if the e-p scattering times are short (and at low temperature the escape is due to tunneling, meaning the escape temperature becomes independent of the bath temperature).
But that's a completely different source of trouble. Of course, if the thermometer doesn't interact the right way with the electronic system, then the two energy systems won't equilibrate and will have different temperatures.
But again, you need a different kind of thermometer depending on which system you want to interact with. There are lots of useful thermometers that do not interact well with phonons or gases.

All these methods rely on assumptions about the statmech equations of the thermometer. I mean, again, at some point someone must have gauged the macroscopic parameter against something that obeys the laws for temperature.

This was true maybe a hundred years ago. But the fact that we have no "general" definition of temperature is now a well-known issue (although it is rarely a real problem; it is usually clear from context what is meant by T).
I even remember learning about this during my first course in low-temperature physics as an undergraduate. I also remember being quite surprised by it; like you, I had assumed that temperature was something well defined.
 
  • #33
I tried to work through the Carnot definition of absolute temperature, but the common sources of information I found are very sloppy and full of hidden assumptions that I find hard to unclutter.

Is there a more mathematically rigorous proof for that approach to temperature? I should probably read that before I can say much more.

Every sentence in physics only halves the amount of missing information, so that - similar to popular physics literature - one never arrives at 100% understanding. An equation, by contrast, states unambiguously what's going on.

Andy Resnick said:
The Joule-Thompson experiment had nothing to do with 'kinetic energies' or even 'ideal gases'. In fact, I claim that those results are currently outside any physical model you choose - that is, they cannot be predicted or constructed based on *any* current theory or model system, because the data represents a postulate upon which the theory is based (like c_0 in special relativity)
How did this experiment define temperature?
Couldn't it be explained by the statmech definition together with the van der Waals equation for the gas?!

Andy Resnick said:
I'm not sure what this link is implying. From what I understood: first, it mentions the air-thermometer definition of temperature, which is based on the assumption V=Tf(p). Second, it calls for a temperature which for all reversible engines should obey W/Q_\text{in}=f(T_2-T_1). Right?

Andy Resnick said:
So, if we (analytically) analyze the Joule-Thompson experiment using statistical mechanics, we are tacitly building in the assumption that experimentally, there were equilibrium conditions (or that we can somehow apply the concept of equilibrium to those experimental conditions). If you think that is incorrect, please say so.
I agree. We assume that equilibration is a much faster process than all others, so that effectively the system is at equilibrium with respect to spontaneous energy transfer.

Andy Resnick said:
Question: How can an "absolute" temperature scale be constructed from measurements of specific heat?
[...]
So their data was, on one hand, a simple measurement of material properties: C_p(T) for water, for example. On the other, it was a material-independent measurement of the energy content of a degree interval at various temperatures, thus defining a degree in terms of a change in energy.
So what exactly would be the defining equation for temperature?
I have a system which can be brought into contact with water. I can also modify that system by adding or subtracting a controlled and measurable amount of heat. What now?
What exactly am I supposed to do to attribute a temperature?

I just read some more. So am I right that "all I need to do" is find a reversible heat engine, and the ratio of heat in to heat out will give me the ratio of temperatures?
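That is my reading too. Once one reference temperature is assigned by convention (the modern choice is 273.16 K for the triple point of water), the heat ratio of a reversible engine pins down every other temperature. A hedged sketch; the measured heats are invented numbers:

[code]
T_REF = 273.16  # conventional reference: triple point of water, in kelvin

def temperature_from_heats(q_unknown, q_ref, t_ref=T_REF):
    # For a reversible engine run between the two reservoirs:
    # Q_unknown / Q_ref = T_unknown / T_ref
    return t_ref * (q_unknown / q_ref)

# invented measurement: 150 J absorbed at the unknown reservoir per
# 110 J rejected at the triple-point reservoir
print(temperature_from_heats(150.0, 110.0))  # ~372.5 K
[/code]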
 
Last edited:
  • #34
Gerenuk said:
Now that is very unscientific: saying something isn't true when you apparently don't know the theory.

Science is grounded in observation and not theory.

The Fermi and Bose distributions for the particle energies, with their chemical potential, are a direct consequence of applying the pure Boltzmann distribution to the total energy of the system. Read up on the derivation.

Can you point me to a reference for this? I don't think that's the situation. I don't think you can derive Fermi and Bose distributions without using the grand canonical ensemble, at which point you aren't in Boltzmann-distribution world.
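For reference, the usual textbook route is indeed the grand canonical ensemble (a sketch for onlookers, not a verdict on the disagreement): for a single orbital of energy \varepsilon one writes the grand partition function and reads off the mean occupation,

\Xi = \sum_{n} e^{-\beta(\varepsilon-\mu)n} \quad\Rightarrow\quad \bar{n} = \frac{1}{e^{\beta(\varepsilon-\mu)} \pm 1},

where n \in \{0,1\} gives Fermi-Dirac (the + sign) and n = 0,1,2,\dots gives Bose-Einstein (the - sign). Whether that counts as "applying the Boltzmann distribution to the total energy" seems to be exactly what is in dispute here.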

And the whole point of the discussion here is that temperature can be successfully generalized to all sorts of systems, like the deck of cards mentioned earlier.

Yes, you can generalize the concepts of temperature and entropy and energy. But we were talking about the definition of temperature. The definition of temperature is what a thermometer measures. If we have a theory of temperature that can be generalized, then GREAT! But it's not the definition.

Your definition of a thermometer applies only to a very restricted range, and with it you cannot prove the laws of thermodynamics (you will find many exceptions to the usual laws, and only more detailed knowledge can identify the cause of the non-applicability).

You can't prove the laws of thermodynamics at all. They are empirical statements about the universe. You can develop theories that are consistent with the observed laws, but that's not proof.

Science has moved on within the last 100 years, and that's why we look at the more powerful ideas about temperature.

Science is ultimately based on observation. We have a lot more sophisticated tools to explain observations, but you still have to do experiments.

According to your statements you might also say "I take a wheel and it rolls. I don't need to know anything about velocity or angular velocity". You see my point?

I'm saying the opposite. Just because I have this theory about velocity or angular velocity doesn't mean that it means anything unless I see how the wheel rolls.
 
  • #35
twofish-quant said:
Science is grounded in observation and not theory.
I believe you think wholly like an engineer and have not tried considering my statements as a physicist. That's good for using existing tools, but not good for the in-depth understanding needed to eventually discover new connections.
I'll make a few comments on your last post and will read your answer, but this is not getting anywhere, so let's leave it at that.

twofish-quant said:
Can you point me to a reference for this. I don't think this is the situation. I don't think that you can derive femi and bose distributions without using the grand canonical ensemble at which point you aren't in Boltzman distribution world.
How can you think that, if you have never tried it or seen a proof of impossibility?
I was actually referring to the exponential law in general, which includes the grand canonical ensemble. Sorry if that was not correct.
In any case, one can still derive the chemical potential from the Boltzmann distribution. It was in some lecture notes by J.J. Binney, where he used contour integration and an approximation for large numbers. If I find them again and you really need them, I can post them.

twofish-quant said:
Yes you can generalize the concept of temperature and entropy and energy. But we were talking about the definition of temperature. Definition of temperature is what a thermometer measures. If we have a theory of temperature that can be generalized then GREAT! But it's not the definition.
That might not be your definition, or an engineer's. But I bet physicists who deal with more general systems than ordinary gases use the advanced definition that includes all of yours.

twofish-quant said:
You can't prove the laws of themodynamics at all. They are empirical statements about the universe.
From statistical mechanics and probability theory you can prove the law of increase of entropy. Entropy is the logarithm of the number of microstates. The number of microstates is sharply peaked around a certain configuration, just as the multinomial distribution is.
All these high-level laws (like the ideal gas law and so on) have derivations from microscopic principles. Everything should be derived from microscopic laws for a theory to be "nice". Non-nice theories will find a counterexample to their rules sooner or later.
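The "sharp maximum" claim is easy to check numerically. A sketch (mine): count the microstates of N two-state spins with n up, i.e. the binomial coefficient, and watch the relative width of the peak shrink like 1/\sqrt{N}:

[code]
from math import lgamma, exp, sqrt

def log_multiplicity(N, n):
    # ln C(N, n): microstates with n of N spins up
    return lgamma(N + 1) - lgamma(n + 1) - lgamma(N - n + 1)

for N in (100, 10_000, 1_000_000):
    half = N // 2
    d = int(sqrt(N) / 2)  # one standard deviation away from the peak
    ratio = exp(log_multiplicity(N, half + d) - log_multiplicity(N, half))
    print(N, round(ratio, 3), f"relative width ~ {1 / sqrt(N):.0e}")
[/code]

The multiplicity a fixed number of standard deviations off the peak stays at the same fraction (~0.61 here), while the peak's width relative to N shrinks as 1/\sqrt{N}; for N ~ 10^23 essentially all microstates sit under that peak, which is the quantitative content of the claim (and also why the statement is probabilistic rather than deductive).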

twofish-quant said:
Just because I have this theory about velocity or angular velocity doesn't mean that it means anything unless I see how the wheel rolls.
In your view you'd build a wheel and define that it rolls. You would say, "Why do I need an extended concept of rolling, like angular velocity? My car already works."
The answer is that someone might have more sophisticated uses for a wheel - just as some people have more sophisticated uses for temperature.
 
  • #36
Gerenuk said:
I believe you think wholly like an engineer and have not tried considering my statements as a physicist. That's good for using existing tools, but not good for the in-depth understanding needed to eventually discover new connections.

I have a Ph.D. in theoretical astrophysics, I did my dissertation on radiation hydrodynamics, and my bachelor's was in physics at MIT. I don't like bringing up credentials, but you are the one who brought it up.

Having been one, yes, I do know how physicists think.

That might not be your definition, or an engineer's. But I bet physicists who deal with more general systems than ordinary gases use the advanced definition that includes all of yours.

No. The definition of temperature is what a thermometer measures. Now, we can create other thermometers. We can create models for how temperature works, but those aren't *definitions*. The definition of temperature is what a thermometer measures. If you want to know what a thermometer is, I can hand one to you.

From statistical mechanics and probability theory you can prove the law of increase of entropy.

No, you can't. You can model entropy. You can show that, given some assumptions and definitions, entropy increases, but that's not proof.

All these high-level laws (like the ideal gas law and so on) have derivations from microscopic principles.

Which you then compare to observation, and if they don't match, you toss the theory and start over again.
 
  • #37
Gerenuk said:
Every sentence in physics only halves the amount of missing information, so that - similar to popular physics literature - one never arrives at 100% understanding. An equation, by contrast, states unambiguously what's going on.

This isn't true. In physics you want to avoid equations when possible. You can't, but you should try.

How did this experiment define temperature?

Here is a thermometer. Here is what it measures.
 
  • #38
twofish-quant said:
I have a Ph.D. is in theoretical astrophysics, and I did my dissertation on radiation hydrodynamics, and my bachelors was in physics at MIT. I don't like bringing up credentials, but you are the one that brought it up.
Instead of mentioning credentials, why don't you prove it by showing some in-depth knowledge? I believe that you are hard-working. But have you ever sat down and thought about whether there are more connections in - say - statmech than you were told at university? Everyone should start organising their knowledge, because many of the things they teach you are actually special cases.

For the entropy derivation I don't have a good link. But it's basically just the statement that the multinomial distribution has a very sharp maximum. Have you tried to think about that?
I recall seeing this idea illustrated roughly in
http://www.princeton.edu/WebMedia/lectures/
"Fashion, Faith and Fantasy in the New Physics of the Universe, Lecture 2: FAITH"
(or maybe one of the other two parts)
So look out for this type of explanation if you someday want to see where the increase of entropy derives from.

For the derivation of the chemical potential from the Boltzmann distribution, see
http://www-thphys.physics.ox.ac.uk/user/JamesBinney/statphys2.pdf
(I hope these are the right notes; my download is stuck at the moment, so I cannot check that it's the file I meant.)

So why do you say things aren't there or cannot be done just because you don't know them?
 
  • #39
twofish-quant said:
Every sentence in physics only halves the amount of missing information, so that - similar to popular physics literature - one never arrives at 100% understanding. An equation, by contrast, states unambiguously what's going on.
This isn't true. In physics you want to avoid equations when possible. You can't, but you should try.
I'm not surprised to read that. Bluntly speaking, I believe exactly this type of physicist only quotes and repeats what they have learned, and will never make a new discovery themselves.
But it's possible that I'm wrong about that.
Who knows...

As an example, I once saw experimentalists reading lots of books about superconductivity (with few equations). It was good enough for bragging small talk, but when they talked to theorists, the theorists noticed they were just talking b#!$ß!t.
Had these people even once tried to understand the simple BCS wave function, they wouldn't have made these mistakes. I strongly suspect popular physics reading is interesting but harmful to understanding.
For example, an electron isn't a part-time wave and a part-time particle. I guess you know very well that it is one consistent mathematical structure.
 
  • #40
Gerenuk said:
Instead of mentioning credentials, why don't you prove it by showing some in-depth knowledge?

You're the one that told me that I wasn't thinking like a physicist.

For the entropy derivation I don't have a good link. But it's basically just the statement that the multinomial distribution has a very sharp maximum. Have you tried to think about that?

Which proves nothing about the laws of thermodynamics. The laws of thermodynamics are observations. You can come up with a model that is consistent with observations, but that doesn't *prove* anything.

So why do you say things aren't there or cannot be done just because you don't know them?

Because I'm wrong about things. Thanks for the derivation.
 
  • #41
I looked at the Carnot temperature definition in detail. Here is what I conclude:

With the assumptions that
  1. all systems can be ordered according to some real parameter called temperature, and it is impossible for energy to flow spontaneously (i.e. without external work) from a lower temperature to a higher temperature
  2. there exist reversible heat engines that move heat while consuming or producing work
  3. the heat engines operate cyclically, i.e. their internal energy is restored after one cycle
one can show that
  1. up to a factor, one can determine the temperature T of all system configurations that can be reached by reversible engine changes or isothermal processes; however, any function like T^*=\ln T is just as good, and that one has no absolute zero (see the sketch below)
  2. for reversible changes the change in entropy is given by \mathrm{d}S=\mathrm{d}Q/T
  3. total entropy is constant for reversible changes and for changes with no heat transfer
  4. provided work can be transformed into heat, the existence of engines that deal with negative temperatures would contradict assumption (1)

Negative temperature would be OK if one drops assumption (1) for such systems.
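For anyone reconstructing conclusion (1), the standard step (sketched here in the notation above, as I understand it) is that reversible engines compose: an engine between reservoirs 1 and 2 stacked on one between 2 and 3 is a reversible engine between 1 and 3, so the universal heat ratio must factorize,

\frac{Q_1}{Q_2}\cdot\frac{Q_2}{Q_3} = \frac{Q_1}{Q_3} \quad\Rightarrow\quad \frac{Q_1}{Q_2} = \frac{\phi(T_1)}{\phi(T_2)} \ \text{for some function } \phi.

Choosing the label T \equiv \phi fixes the scale up to one multiplicative constant; any monotone relabeling such as T^* = \ln T preserves the ordering in assumption (1), which is why the zero of the absolute scale is a convention of this particular choice.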
 
Last edited:
  • #42
Negative temperatures are something you can fit into a thermodynamic framework if you think of them as hotter than infinity rather than colder than absolute zero. One thing that helps when dealing with temperatures from a theory viewpoint is to work with beta, i.e. 1/T, rather than T. Then beta for absolute zero becomes infinite, infinite temperature becomes beta = 0, and negative temperatures become hotter than infinity.

Once you have that, you can create a negative-temperature system by taking a system with two energy states and pumping it so that the higher energy state is more populated than the lower one.
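A quick numerical sketch of that ordering (all values invented): for a two-level system with splitting \varepsilon, the populations obey p_\text{upper}/p_\text{lower} = e^{-\beta\varepsilon}, so negative beta is exactly population inversion:

[code]
import numpy as np

def populations(beta, eps=1.0):
    # two-level occupation probabilities at inverse temperature beta = 1/kT
    w = np.array([1.0, np.exp(-beta * eps)])  # Boltzmann weights of the levels
    return w / w.sum()

for beta in (2.0, 0.5, 0.0, -0.5, -2.0):  # decreasing beta = getting hotter
    p_lower, p_upper = populations(beta)
    print(f"beta={beta:+.1f}  p_lower={p_lower:.3f}  p_upper={p_upper:.3f}")
[/code]

beta = 0 (infinite T) gives 50/50; beta < 0 puts more population in the upper level than even infinite temperature does, so heat flows out of such a system into any positive-temperature one. That is the precise sense in which it is "hotter than infinity".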
 
  • #43
Gerenuk said:
As an example, I once saw experimentalists reading lots of books about superconductivity (with few equations). It was good enough for bragging small talk, but when they talked to theorists, the theorists noticed they were just talking b#!$ß!t. Had these people even once tried to understand the simple BCS wave function, they wouldn't have made these mistakes.

What usually happens at this point is that the experimentalists go back and say "hell yes, we understand the BCS wave function; it's just that your theory doesn't match our data." Also, it's possible for a theory to be mathematically elegant and almost totally useless. General relativity, for example: it's a beautiful, elegant theory, except that in all but the simplest situations (the FRW metric), it's largely useless when you try to match it to experimental data, so the first thing you have to do is calculate approximations so that you can actually get out numbers you can test against experiment.

The reason you want to avoid math and use very simple arguments when you can is that these tend to be more robust and give you more insight into how a physical system behaves.
 
  • #44
Andy Resnick said:
This is a false argument, because it requires an individual to "interpret" a derived result "in a proper way". Proper according to whom? Nature cares not a whit how you interpret your model, or what model you invent.

Of course not, but humans have developed a framework for understanding nature, and whether or not the results of our model are physically meaningful (to us) may indeed depend on how we interpret what the model is telling us. In certain contexts a negative temperature is not a meaningful thing; this is not such a context. What negative temperature tells us is that if I put a system with a negative absolute temperature in thermal contact with a system with a positive absolute temperature, heat will flow from the negative-temperature system to the positive-temperature system. This is a physical prediction. You may still be resisting based on whether or not I can produce for you a system with a negative absolute temperature. A page on John Baez's website (https://www.physicsforums.com/insights/author/john-baez/) explains:

"Can this system ever by realized in the real world, or is it just a fantastic invention of sinister theoretical condensed matter physicists? Atoms always have other degrees of freedom in addition to spin, usually making the total energy of the system unbounded upward due to the translational degrees of freedom that the atom has. Thus, only certain degrees of freedom of a particle can have negative temperature. It makes sense to define the "spin-temperature" of a collection of atoms, so long as one condition is met: the coupling between the atomic spins and the other degrees of freedom is sufficiently weak, and the coupling between atomic spins sufficiently strong, that the timescale for energy to flow from the spins into other degrees of freedom is very large compared to the timescale for thermalization of the spins among themselves. Then it makes sense to talk about the temperature of the spins separately from the temperature of the atoms as a whole. This condition can easily be met for the case of nuclear spins in a strong external magnetic field.

Nuclear and electron spin systems can be promoted to negative temperatures by suitable radio frequency techniques. Various experiments in the calorimetry of negative temperatures, as well as applications of negative temperature systems as RF amplifiers, etc., can be found in the articles listed below, and the references therein."

I'll save you the effort of digging through the references to find such an actual system:

N. F. Ramsey, "Thermodynamics and statistical mechanics at negative absolute temperature," Phys. Rev. 103, 20 (1956). Describes the theory behind spin systems in which negative temperatures may be realized.

A. S. Oja and O. V. Lounasmaa, "Nuclear magnetic ordering in simple metals at positive and negative nanokelvin temperatures," Rev. Mod. Phys. 69, 1-136 (1997). A lengthy review article which discusses negative temperatures, including experimental realizations.

E. M. Purcell and R. V. Pound, "A Nuclear Spin System at Negative Temperature," Phys. Rev. 81, 279-280 (1951). The first realization, according to the above review article, of negative temperature in a nuclear spin system.

Any further arguing that negative temperature isn't a physical thing should now be a matter of semantics.

No! It's of direct relevance if your model implies that "temperature" applies only to a system *in equilibrium* - which it does, using statistical mechanics. Our living bodies are not in "energetic equilibrium" (or any other kind of equilibrium) with the environment.

Really? So as I sit here in a room that is kept at room temperature (with small fluctuations about this temperature), you suggest that the flux of energy out of my body is not, on average, equal to the flux of energy coming into my body? So if I sit here long enough, either all of the energy in my body will radiate away, or I will absorb energy until I combust? I doubt that's the case. That seems to me like a kind of energetic equilibrium, or at least a steady state if we want to split semantic hairs, and as I mentioned, the concept of temperature has been extended to steady states using the Onsager reciprocal relations. So, in this way, I can define a body temperature, and I never needed to know directly which processes in my body were not in some sort of equilibrium with some other part of my body. These processes certainly lead to energy being radiated from my body, but they do so in such a way that the energy radiated is balanced by the energy absorbed.
That's fine, but again, I am interested in systems that are far from equilibrium- living systems. Thus, I require (or I need to develop) a model that works better than the models I currently have. Frankly, I'm not smart enough to develop such a model. So I am reduced to hoping someone else publishes an idea that I can glom on to.

I whole-heartedly agree that systems out of equilibrium are fascinating and ought to be understood (and I would like to try and understand them!); my argument is merely that the body is in a steady state (for sufficiently long periods of time) that a sensible temperature can be defined for it.
 
  • #45
Mute said:
<snip>Really? So as I sit here in a room that is kept at room temperature (with small fluctuations about this temperature), you suggest that the flux of energy out of my body is not, on average, equal to the flux of energy coming into my body? So if I sit here long enough, either all of the energy in my body will radiate away, or I will absorb energy until I combust? I doubt that's the case.
On the contrary, it *is* exactly the case. Sit in a dark room long enough and you will die - or have you forgotten the function of breathing, eating and drinking? I wouldn't say the energy radiated away or that you combusted; you simply starved (or asphyxiated). Or are dietary calories not considered energy? Is the use of oxygen outside of physics?
Mute said:
<snip>

I whole-heartedly agree that systems out of equilibrium are fascinating and ought to be understood (and I would like to try and understand them!); my argument is merely that the body is in a steady state (for sufficiently long periods of time) that a sensible temperature can be defined for it.

You may think the problem is solved; I disagree.
 
Last edited:
  • #46
Mute said:
<snip>You may still be resisting based on whether or not I can produce for you a system with a negative absolute temperature. A page on [URL='https://www.physicsforums.com/insights/author/john-baez/']John Baez's website[/url] explains:

<snip>

That's not what I'm contesting. You apparently didn't notice the little disclaimer at the top of the page:

"Under certain conditions, a closed system can be described by a negative temperature."

Note those three words: 'a closed system'. A closed system admits equilibrium and hence negative temperatures are logical predictions of statistical mechanics.

However, all that line does is re-formulate my objection "negative temperatures are unphysical" to equivalently stating "closed systems are unphysical". You may choose to respond that a closed system is a good approximation of a real, physical, system (or even that a real, physical system can be prepared arbitrarily close to a closed system), but this does not invalidate the essence of my objection.
 
  • #47
twofish-quant said:
Negative temperatures are something you can fit into a thermodynamic framework if you think of them as hotter than infinity rather than colder than absolute zero. One thing that helps when dealing with temperatures from a theory viewpoint is to work with beta, i.e. 1/T, rather than T. Then beta for absolute zero becomes infinite, infinite temperature becomes beta = 0, and negative temperatures become hotter than infinity.

Once you have that, you can create a negative-temperature system by taking a system with two energy states and pumping it so that the higher energy state is more populated than the lower one.

That's not really true either: can you heat something up from (positive) infinity to negative temperatures without passing through zero? If so, along what path on the real number line do you propose this happens? Never mind how to heat something up from 300 K to infinity...
 
  • #48
twofish-quant said:
What usually happens at this point is that the experimentalists go back and say "hell yes, we understand the BCS wave function; it's just that your theory doesn't match our data."

The problem is rather that they hear one word and oversimplify it to fit their own common knowledge instead of getting the abstract idea. You tell them there is an energy "gap" and they will draw you an actual gap in the Fermi-surface diagram on paper.
You ask them what d-wave symmetry is and they will draw you a cloverleaf, and that drawing is about all they can say about it - yet they believe they know it all.
To illustrate general relativity, a popular physics article might draw a plane with a ball rolling on top. Then it might say that a dent in the plane is the analogy to a planet attracting things in space. This analogy is actually wrong, and it misses the concept of "straight lines" in curved space.
I even fell victim to one of these stupid analogies. In school they draw EM waves as if they were waves on a rope. It took me a while to notice that EM waves do not shake in space (i.e. the fields have directions, but nothing is displaced in space).
Unfortunately physics isn't as simple as a non-physicist's toy picture of the world. So one had better start understanding the equations.
 
  • #49
Gerenuk said:
I tried to work through the Carnot definition of absolute temperature, but the common sources of information I found are very sloppy and full of hidden assumptions that I find hard to unclutter.

<snip>

It's good to perform some scholarship every once in a while - you learn all kinds of new information once you clear away the layers of *other people's* interpretations. And yes, it is often difficult. Carnot had impeccable physical intuition, but his mathematical presentation was... sloppy, to be nice.

Gerenuk said:
I'm not sure what this link is implying. From what I understood: first, it mentions the air-thermometer definition of temperature, which is based on the assumption V=Tf(p). Second, it calls for a temperature which for all reversible engines should obey W/Q_\text{in}=f(T_2-T_1). Right?

That article is the one that Kelvin wrote to define an absolute temperature scale; if we are to discuss temperature, it seems logical to at least be familiar with how it came to be defined.

Recall that Carnot's function \mu depends only on temperature. Kelvin thus first defines an absolute temperature t = \int \mu(\theta)\,\mathrm{d}\theta, which is independent of the scale of \mu. It should be noted that t can indeed vary over (-\infty, +\infty) and has an arbitrary zero.

However, this contradicts what is also known, namely that the latent heat at constant volume is not zero, and that the latent heat (at constant volume) and the variation of pressure with temperature either have the same sign or both vanish. These are important considerations because both enter (via a ratio) the definition of \mu. So we define a *new* absolute temperature T = e^{t/g}, where g is a constant. T, as opposed to t, ranges over (0, \infty); the arbitrary zero of t becomes an arbitrary value of 1 for T.

The constant g, you may guess, is related to the Boltzmann constant k.

So, to recap: based on measurements of \mu, we constructed a universal function T that is material-independent and scale-independent. The construction of T does not depend on any notion of irreversibility, ideal gas, statmech distribution, or constitutive relation - although the value of g chosen *does*.
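Collecting the two definitions from this recap in one place (my transcription of the construction described above):

t = \int \mu(\theta)\,\mathrm{d}\theta \quad \big(t \in (-\infty,+\infty),\ \text{arbitrary zero}\big), \qquad T = e^{t/g} \quad \big(T \in (0,\infty)\big).

Note that a shift t \to t + c becomes a rescaling T \to e^{c/g}\,T, so the additive arbitrariness of t turns into the multiplicative (unit) arbitrariness of T, and T = 0 becomes an unattainable limit rather than an arbitrary marker.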
 
Last edited:
  • #50
Andy Resnick said:
On the contrary, it *is* exactly the case. Sit in a dark room long enough and you will die - or have you forgotten the function of breathing, eating and drinking? I wouldn't say the energy radiated away or that you combusted; you simply starved (or asphyxiated). Or are dietary calories not considered energy? Is the use of oxygen outside of physics?

The idea that I have apparently failed to convey to you is that there are periods of time over which the net amount of energy radiated by the body is equal to the net amount of energy absorbed, and during such a steady state I may define a temperature as per the Onsager reciprocal relations. Do you still disagree?


You may think the problem is solved; I disagree.

Which problem? The problem of developing a non-equilibrium statistical mechanics is certainly nowhere near solved, nor did I ever claim it was. I said the problem of defining "body temperature" was solved, as it is (effectively) a steady state problem and may be described by the Onsager reciprocal relations.

That's not what I'm contesting. You apparently didn't notice the little disclaimer at the top of the page:

"Under certain conditions, a closed system can be described by a negative temperature."

Note those three words: 'a closed system'. A closed system admits equilibrium and hence negative temperatures are logical predictions of statistical mechanics.

However, all that line does is re-formulate my objection "negative temperatures are unphysical" to equivalently stating "closed systems are unphysical". You may choose to respond that a closed system is a good approximation of a real, physical, system (or even that a real, physical system can be prepared arbitrarily close to a closed system), but this does not invalidate the essence of my objection.

Did you look up the references I gave you? The ones in which nuclear spin systems were experimentally demonstrated to behave as having a negative temperature? How is that "unphysical"? A statistical-mechanical model of the system predicts the spin degrees of freedom to have a negative temperature. I put this spin system in contact with a spin system at positive temperature, and heat flows from the negative-temperature system to the positive-temperature system. This procedure has been carried out experimentally in nuclear systems where the spin degrees of freedom do not interact with the non-spin degrees of freedom over the relevant time scales, and so form an effectively separate subsystem that may have a negative temperature. So, what is your objection to this information? Is it simply that this is "an approximation", or something else?

You can argue all you like that your objections are not invalidated because the real world, strictly speaking, does not conform perfectly to the models, but your objections are then, practically speaking, pointless. Strictly speaking, there is no such thing as a phase transition in the real world, but that does not stop water from turning into ice or steam. All of our theoretical understanding of nature is through idealized models intended to capture the essence of the phenomena we are studying. If you object to the model on the basis of it never fully describing a real system, then what exactly is your criterion for a prediction of the model to be "physical"?
 