Gerenuk said:
I was also asking which other parameters exist that claim to be called temperature. I also asked for criteria a parameter must satisfy to be called a temperature. So basically I was asking why conflicting definitions have the right to be called temperature. They might be... but what physical equation or concept is the reason?
Temperature is abstractly defined as the property of two connected systems that is equal when there is no net energy flow between them. From the statistical mechanical viewpoint: if I have two systems that are connected but prevented from exchanging energy, and I then release the constraint that they can't exchange energy, the two systems will eventually evolve to configurations where the net energy in each of them doesn't change in time, which defines our "equilibrium". The two systems need not have the same energy, but it turns out that
\left(\frac{\partial S_1}{\partial E_1}\right) = \left(\frac{\partial S_2}{\partial E_2}\right)
Whatever this is, it's equal between the two systems. We choose to call this thing 1/T, where we call T "Temperature":
\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)
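(As a quick sanity check - this is my own aside, using the standard Sackur-Tetrode expression for the entropy of a monatomic ideal gas, which depends on the energy only through a term \tfrac{3}{2}Nk\ln E - this abstract definition reproduces the familiar one:
\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{V,N} = \frac{3Nk}{2E} \quad\Longrightarrow\quad E = \tfrac{3}{2}NkT,
which is just the usual equipartition / "average kinetic energy" result.)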
So, in any system where there is a measurable property that somehow behaves like energy (it doesn't actually have to be an energy - it just has to behave like one mathematically), a state in which the net flow of that property between two systems doesn't change in time defines an equilibrium condition analogous to thermal equilibrium in, e.g., an ideal gas. We may then define a temperature-like variable \theta by
\frac{1}{\theta} = \left(\frac{\partial S}{\partial \mathcal E}\right),
where \mathcal E is the energy-like variable. S is the entropy, of course, defined by
S = k\ln \Omega.
Here, k is NOT necessarily Boltzmann's constant. In thermodynamics the value of Boltzmann's constant isn't really fundamental - it just fixes the units we measure temperature or energy in. Aside from this constant, the number of microstates of the system is, in principle, a well-defined number that doesn't really care about whether or not the system we're studying is a physical thermodynamic system. For example, I can calculate the entropy of a deck of cards: the number of possible orderings of a standard deck is \Omega = 52!, so S = \log_2 (52!), where I chose k such that entropy is measured in bits. It's not a thermodynamic entropy, it's an "information" entropy. One could argue that statistical mechanics is simply information theory applied to systems which exchange energy, particles, etc.
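To make the deck-of-cards example concrete, here is a minimal Python sketch (the script and variable names are mine, just for illustration) that computes this information entropy in bits, and what the same microstate count would give if you insisted on using Boltzmann's constant for k:
```python
import math

# Number of microstates of a shuffled standard deck: all orderings of 52 cards.
omega = math.factorial(52)

# Information entropy, S = k ln(Omega) with k chosen so S is in bits (k = 1/ln 2).
S_bits = math.log2(omega)

# The same formula with k = Boltzmann's constant just changes the units (J/K).
k_B = 1.380649e-23  # J/K, exact by definition since the 2019 SI revision
S_thermo = k_B * math.log(omega)

print(f"Omega = 52! has {len(str(omega))} digits")
print(f"S = log2(52!) = {S_bits:.1f} bits")
print(f"S = k_B ln(52!) = {S_thermo:.3e} J/K")
```
The point of the second number is only that the choice of k is a unit convention; the counting of microstates is the same either way.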
So, this is in general our theoretical definition of temperature. In systems where we can define a temperature-like variable we expect it to play a role analogous to temperature in a physical system. So, while \theta might not be a physical temperature, it might still be the variable we tune to induce a "phase transition" in our system of interest.
I was asking for a temperature that includes statmech and ideal gases and also extends to as much as possible. So basically the most general temperature possible to define - i.e. the best one can do.
Rather than find the single definition of temperature that covers the most cases, it is probably best to pick several definitions which overlap in certain regimes. The statistical mechanical definition is very nice, but it is perhaps not always practical. So, for practical purposes, what we can do is find a system in which we can define some other notion of temperature, "A", that coincides with our stat-mech definition in some regime; if we then want to measure temperature in a regime where our stat-mech definition is cumbersome but definition A still works, we can specify yet another definition of temperature, "B", that coincides with definition A in the region where the stat-mech definition is impractical.

For example, in the classical limit, the stat-mech definition coincides with the definition of temperature obtained by relating "thermal energy" to the average kinetic energy of an ideal gas. If we then consider a block of wood, for which we might never hope to calculate the temperature from the stat-mech definition, we can measure it with our "average kinetic energy" definition. We might then be able to find another quantity which matches the "average kinetic energy" definition in a regime where that definition is no longer equal to the stat-mech definition, and take that as yet another definition of temperature, which might still work in a regime where the "average kinetic energy" definition fails. (This is conceptually like analytic continuation of complex functions, I suppose.)
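As a rough illustration of the "average kinetic energy" definition in the classical regime (the setup and numbers below are my own, not from the discussion): draw velocities from a Maxwell-Boltzmann distribution at a known temperature, then recover that temperature from \langle \tfrac{1}{2}mv^2 \rangle = \tfrac{3}{2}k_B T.
```python
import numpy as np

k_B = 1.380649e-23      # J/K
m = 6.63e-26            # kg, roughly the mass of an argon atom
T_true = 300.0          # K, temperature used to generate the sample

rng = np.random.default_rng(0)

# Each Cartesian velocity component of a classical ideal gas is Gaussian
# with variance k_B*T/m (Maxwell-Boltzmann).
sigma = np.sqrt(k_B * T_true / m)
v = rng.normal(0.0, sigma, size=(100_000, 3))   # 100k particles, 3 components each

# "Average kinetic energy" definition: <(1/2) m v^2> = (3/2) k_B T
mean_ke = 0.5 * m * np.sum(v**2, axis=1).mean()
T_kinetic = 2.0 * mean_ke / (3.0 * k_B)

print(f"recovered T = {T_kinetic:.1f} K (sample generated at {T_true} K)")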
I think temperature is a cardinal scale, so you cannot arbitrarily scale/transform it, without disturbing some equations?
You can certainly scale it. All that really amounts to is changing the units I choose to measure temperature in. I can set k_B = 1 if I so desire; all that results in is me measuring temperature in joules instead of kelvin. There's no fundamental difference.
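Concretely (a trivial sketch, with round numbers of my choosing): setting k_B = 1 just means quoting temperatures as energies.
```python
k_B = 1.380649e-23   # J/K
e = 1.602176634e-19  # J per eV (exact)

T = 300.0            # K, roughly room temperature
E_joules = k_B * T   # the same temperature expressed as an energy
E_meV = 1000 * E_joules / e

print(f"T = {T} K  <->  k_B*T = {E_joules:.3e} J  ~  {E_meV:.1f} meV")
```
Room temperature comes out at roughly 26 meV, about 1/40 eV - same physics, different units.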
Andy Resnick said:
Really? Kelvin would disagree with you.
Negative temperature is certainly a physical thing, if you interpret it properly (and, depending on how you are defining "temperature" in that statement!). Via wikipedia, "a system with a truly negative temperature is not colder than absolute zero; in fact, temperatures colder than absolute zero are impossible. Rather, a system with a negative temperature is hotter than any system with a positive temperature (in the sense that if a negative-temperature system and a positive-temperature system come in contact, heat will flow from the negative- to the positive-temperature system)."
Negative temperatures as defined in the "average kinetic energy" sense are impossible. As you yourself say, that definition is too restrictive. From the stat mech definition, however, negative temperatures are allowed, as you seem to be aware. So, it's certainly a physical thing from the stat mech viewpoint. (The wikipedia article: http://en.wikipedia.org/wiki/Negative_temperature)
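To see where the negative sign comes from in the stat-mech definition, here is a small Python sketch for a toy system of N two-level spins (my own toy model, not something from the thread): above half filling, adding energy reduces the number of microstates, so \partial S/\partial E - and hence 1/T - becomes negative.
```python
import numpy as np
from math import lgamma   # log-factorial via the gamma function, avoids overflowing 1000!

N = 1000      # number of two-level spins
eps = 1.0     # excited-state energy (arbitrary units); the ground state has energy 0
k = 1.0       # units chosen so the entropy constant k = 1

# With n spins excited, the energy is E = n*eps and the number of microstates
# is the binomial coefficient Omega = C(N, n), so S = k * ln C(N, n).
n = np.arange(0, N + 1)
E = n * eps
S = k * np.array([lgamma(N + 1) - lgamma(ni + 1) - lgamma(N - ni + 1) for ni in n])

# Stat-mech temperature: 1/T = dS/dE, estimated here by a finite difference.
beta = np.gradient(S, E)

# Below half filling T is positive; above half filling it is negative, and the
# negative-T states are the "hotter" ones (they give up energy to any
# positive-T system they touch).
for frac in (0.25, 0.45, 0.55, 0.75):
    i = int(frac * N)
    print(f"n/N = {frac:.2f}:  1/T = {beta[i]:+.4f},  T = {1.0 / beta[i]:+.2f}")
```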
I think you missed my point. I am not in any way close to equilibrium, and neither are you. Yet we can both use a thermometer to measure our temperature. If temperature can only be defined for a body in equilibrium, how is it that we have a temperature of 98.6 F?
Not in any way close to equilibrium with respect to what? Equilibrium is not a property of a system on its own; it's a property of a system with respect to another system. In the case relevant to defining a body temperature, that other system is the environment, and we are most certainly in an energetic equilibrium with it: the amount of energy we are radiating away must be equal to the amount of energy we're absorbing from the environment - that is, the net flux of energy between our bodies and the environment doesn't change in time. (Of course, there are times when we are not in such an equilibrium, but there are periods of time where things are in a steady state and this applies.) You might argue that temperature is still ill-defined because the energy isn't constant, but we're in a steady state, and there is a generalization to steady-state processes. In this case, "temperature" is defined by
\frac{1}{T} = \left(\frac{\partial s}{\partial u}\right),
where u is the energy density and s is the entropy density. See the wikipedia article on the Onsager reciprocal relations for more info:
http://en.wikipedia.org/wiki/Onsager_reciprocal_relations