Haggard Rovelli thermodynamics paper says what time is

In summary, the paper discusses the zeroth law of thermodynamics, which states that temperature is uniform at equilibrium, and a generalized version of that law which remains valid in the general relativistic context. At this point, however, the result lies beyond the reach of experimental or observational confirmation.
  • #1
marcus
"Quantum jump" is a somewhat misleading term because the transition is not instantaneous. We can assume that at any given temperature there is a characteristic length of time it takes for the system to make the transition to a distinct quantum state. A local observer's proper time is essentially counting time in terms of that characteristic interval.

As I would paraphrase the following passage: thermal time is essentially a universal version of time that reconciles and unifies all those different clocks running at different temperatures. That is what equation (4) on page 1 is saying:
τ = (kT/ħ) t
Here is the context.

==quote 1302.0724, page 1==
The core idea is to focus on histories rather than states. This is in line with the general idea that states at fixed time are not a convenient handle on general relativistic mechanics, where the notion of process, or history, turns out to be more useful [12]. Equilibrium in a stationary spacetime, namely the Tolman law, is our short-term objective, but our long-term aim is understanding equilibrium in a fully generally covariant context, where thermal energy can flow also to gravity [13–15], therefore we look for a general principle that retains its meaning also in the absence of a background spacetime.
We show in this paper that one can assign an information content to a history, and two systems are in equilibrium when their interacting histories have the same information content. In this case the net information flow vanishes, and this is a necessary condition for equilibrium. This generalized principle reduces to standard thermodynamics in the non-relativistic setting, but yields the correct relativistic generalization.
This result is based on a key observation: at temperature T, a system transits
τ = (kT/ħ) t
states in a (proper) time t, in a sense that is made precise below. The quantity τ was introduced in [13, 14] with different motivations, and called thermal time. Here we find the physical interpretation of this quantity: it is time measured in number of elementary “time steps”, where a step is the characteristic time taken to move to a distinguishable quantum state. Remarkably, this time step is universal at a given temperature. Our main result is that two systems are in equilibrium if during their interaction they cover the same number of time steps...
==endquote==
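As a rough numerical illustration of the scale involved (my own sketch, not from the paper, using SI values of ħ and k): at room temperature the elementary step t0 = ħ/kT is of order 10^-14 seconds, so an enormous number of distinguishable states is transited every second.
[code]
# Rough numerical sketch (mine, not from the paper): the elementary time step
# t0 = hbar/(k*T) and the thermal time tau = (k*T/hbar)*t accumulated in
# one second of proper time, at a few illustrative temperatures.
HBAR = 1.054571817e-34   # J*s
K_B  = 1.380649e-23      # J/K

def time_step(T):
    """Characteristic time to reach a distinguishable state at temperature T (kelvin)."""
    return HBAR / (K_B * T)

def thermal_time(T, t):
    """Number of elementary steps (thermal time tau) in proper time t seconds."""
    return (K_B * T / HBAR) * t

for T in (2.7, 300.0, 6000.0):   # CMB, room temperature, solar surface
    print(f"T = {T:7.1f} K   t0 = {time_step(T):.2e} s   steps per second = {thermal_time(T, 1.0):.2e}")
[/code]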

http://arxiv.org/abs/1302.0724
Death and resurrection of the zeroth principle of thermodynamics
Hal M. Haggard, Carlo Rovelli
(Submitted on 4 Feb 2013)
The zeroth principle of thermodynamics in the form "temperature is uniform at equilibrium" is notoriously violated in relativistic gravity. Temperature uniformity is often derived from the maximization of the total number of microstates of two interacting systems under energy exchanges. Here we discuss a generalized version of this derivation, based on informational notions, which remains valid in the general context. The result is based on the observation that the time taken by any system to move to a distinguishable (nearly orthogonal) quantum state is a universal quantity that depends solely on the temperature. At equilibrium the net information flow between two systems must vanish, and this happens when two systems transit the same number of distinguishable states in the course of their interaction.
5 pages, 2 figures

Notice that in the second paragraph of page 1 a distinction is made between two kinds of temperature: relativistic temperature, which labels the equivalence class of all systems that are in equilibrium with each other (a kind of "zeroth law" temperature), and the temperature as measured by an ordinary thermometer.
This is a place where the paper may need further elucidation.
 
  • #2
Rovelli's journey, now along with Haggard, continues to fascinate me (thanks Marcus). They seem to persuasively suggest: (1) that for many-component systems (not, say, single atoms?) where temperature can be defined, time 'flows' in quantised steps of duration; (2) that the hotter the system, the smaller the temporal steps---time ticks faster the hotter, as it were; and (3), perhaps most importantly, that one of our spacetime dimensions is quantised, which makes one wonder about the others.

Sadly, it seems that all this is way beyond the safety of experimental or observational confirmation. But one never knows!
 
  • #3
marcus said:
Notice that in the second paragraph of page 1 a distinction is made between two kinds of temperature: relativistic temperature, which labels the equivalence class of all systems that are in equilibrium with each other (a kind of "zeroth law" temperature), and the temperature as measured by an ordinary thermometer.
This is a place where the paper may need further elucidation.

Isn't this saying that there is an equation for temperature (derived as an equivalence class of states in equilibrium) used in non-general-relativistic contexts, which relates energy and entropy? This is what's measured by an "ordinary thermometer". However, once you take general relativity into account, this same relation does NOT form an equivalence class (because of subtleties in defining energy), so that formula no longer satisfies the criterion we used to derive it (and thus should be replaced). This paper tries to figure out what the appropriate equivalence class is for equilibrium in a general relativistic context, which is what he calls "relativistic temperature".

Because one definition is more naive than the other, they don't always agree, so not everything we use as a thermometer has the correct reading for temperature. Though it seems reasonable that they would sometimes agree.
 
  • #4
DimReg said:
Isn't this saying that there is an equation for temperature (derived as an equivalence class of states in equilibrium) used in non-general-relativistic contexts, which relates energy and entropy? This is what's measured by an "ordinary thermometer". However, once you take general relativity into account, this same relation does NOT form an equivalence class (because of subtleties in defining energy), so that formula no longer satisfies the criterion we used to derive it (and thus should be replaced). This paper tries to figure out what the appropriate equivalence class is for equilibrium in a general relativistic context, which is what he calls "relativistic temperature".

Because one definition is more naive than the other, they don't always agree, so not everything we use as a thermometer has the correct reading for temperature. Though it seems reasonable that they would sometimes agree.

Thanks! I think that explains what I was wondering about. I'll dig into it a bit further and see if other comments and questions occur to me.
 
  • #5
Haggard & Rovelli said:
Let us now consider a system in thermal equilibrium with a thermal bath at temperature T. Its mean energy is going to be kT and the variance of the energy is also going to be kT. Thus we have ΔE ~ kT. At a given temperature T, consider the time step

t0 = ħ/kT (11)

According to the previous discussion, this is the average time the system takes to move from a state to the next (distinguishable) state. This average time step is therefore universal: it depends only on the temperature, and not on the properties of the system.

Shouldn't they be saying something about the size of time steps near absolute zero, and about zero-point energies? See also their definition of temperature, eqn. 14.
 
  • #6
This seems like a "hot" new development!
 
  • #7
Haggard and Rovelli use histories instead of states to obtain this result. Hartle recently wrote a paper about history-based quantum theories which gives a good context for this paper.

The Quantum Mechanical Arrows of Time
James B. Hartle

http://arxiv.org/abs/1301.2844
 
  • #8
DimReg said:
Isn't this saying that there is an equation for temperature (derived as an equivalence class of states in equilibrium) used in non-general-relativistic contexts, which relates energy and entropy? This is what's measured by an "ordinary thermometer". However, once you take general relativity into account, this same relation does NOT form an equivalence class (because of subtleties in defining energy), so that formula no longer satisfies the criterion we used to derive it (and thus should be replaced). This paper tries to figure out what the appropriate equivalence class is for equilibrium in a general relativistic context, which is what he calls "relativistic temperature"...
I like this way of summing it up. Also they connect their idea of temperature (as equilibrium classifier) with the passage of time in non-uniform gravity. They seem to get an intriguing match-up.
Paulibus said:
Shouldn't they be saying something about the size of time steps near absolute zero, and about zero-point energies?

I think they are saying something; it is just implicit in equation (11) and the accompanying passage you quoted. As the temperature approaches zero, everything slows down---the time step t0 goes to infinity. Here is equation (11) again, and the accompanying text:
==quote==
t0 = ħ/kT

According to the previous discussion, this is the average time the system takes to move from a state to the next (distinguishable) state. This average time step is therefore universal: it depends only on the temperature, and not on the properties of the system.
==endquote==
 
  • #9
Implication is not the same as saying something! The duration of time, as quantified by counting steps that are transitions from one quantum state to another, in a thermodynamic system, is perhaps worth a little more 'unpacking' and discussion.

This thread was opened with the statement:
Marcus said:
"Quantum jump" is a somewhat misleading term because the transition is not instantaneous. We can assume that at any given temperature there is a characteristic length of time it takes for the system to make the transition to a distinct quantum state. A local observer's proper time is essentially counting time in terms of that characteristic interval.
... which, I take it, is the step talked about in this paper for transitions.

Consider also the case of a non-thermal system, say a single atom. Here transitions involve the emission/absorption of a photon, and a 'step' or 'quantum jump' is, for any observer of the process, just her/his proper time for a single photon oscillation. (For an observer traveling with the photon, as it were, the measured proper time for the step is infinite, as in the case of 'thermal time' at absolute zero.) Remember that for a single atom the emergent anthro'centric concept of temperature is meaningless. In this case the observed frequency of emitted/absorbed light then quantifies step-length.

It is just we slow-pokes who say that time passes and insist that we can distinguish colours of light.
 
  • #10
Paulibus, I can't think of any way to add to or improve on your comment. It points to several interesting directions of discussion. I'll think more about this idea of temperature, equilibrium and time and hope that by evening something will have occurred to me to contribute.
 
  • #11
DimReg explained the H-R idea in terms of "relativistic temperature" which is the same between processes in equilibrium---but I don't see this concept of relativistic temperature defined or used in the paper. However DimReg's explanation was very helpful and points us in the right direction. I want to reproduce it with slightly different words.

What they actually said is at the very bottom of page 4, the second to last paragraph of the Conclusions.

When things are at equilibrium what is the same is not the temperature, but the product of temperature and proper time. This should be the criterion, not temperature alone.

This is actually a very simple idea and one can understand it with an example:
Suppose upstairs and downstairs are at equilibrium. Then we know that proper time clocks run a little slower downstairs. Time goes slower deeper in a gravity well. But also, by the Tolman effect we know that the temperature downstairs is higher.

So chemical processes run faster because of higher temperature. The proper time SECOND is longer downstairs, but more stuff happens in a second because of higher temperature.

So there is an equilibrium of INFORMATION FLOW up and down. If the number of possible states up and down are equal, and downstairs is keeping the upstairs posted on all the changes it is experiencing, and vice versa, then there is a balance in the number of bits of information. This is said very imprecisely and it would need side conditions and additional clarification to make it meaningful, but that is the idea, I think. Information equilibrium.

The bottom paragraph of the lefthand column on page 4, with its two clocks example may be clearer than what I just said.
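To spell the trade-off out in formulas (my paraphrase of the Tolman cancellation, not a quotation from the paper): in a static spacetime the locally measured temperature obeys the Tolman law T(x)√g00(x) = const, while proper time is dτ = √g00 dt. The factors of √g00 cancel in the product, so the thermal time is the same upstairs and downstairs even though T and dτ separately are not:
[tex]
T(x)\sqrt{g_{00}(x)} = \mathrm{const}, \qquad d\tau(x) = \sqrt{g_{00}(x)}\,dt
\;\;\Longrightarrow\;\;
\frac{k\,T(x)\,d\tau(x)}{\hbar} = \frac{k\,\mathrm{const}}{\hbar}\,dt
\quad\text{(independent of height)}.
[/tex]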
 
  • #12
Marcus; your latest post confuses me a bit. Literally!

Your clever version of the popular TV soap, Upstairs, Downstairs is nicely scripted.

But: "Then we know that proper time clocks run a little slower downstairs. Time goes slower deeper in a gravity well ". Not as measured by an observer located where the clock is situated, though. Physics is covariant. So as much stuff ruled by covariant physics happens in a downstairs proper second as in an upstairs proper second, or so it seems to me. When you added: " The proper time SECOND is longer downstairs, but more stuff happens in a second because of higher temperature" I think your first phrase may be wrong, unless you meant to involve another version of time, possibly thermal time? Or does the stipulation about number of possible states up and down being equal constitute the US cavalry to the rescue?

But time is so tricky, and I'm a bit slow. Could you clarify who is doing the observing in this scenario, and with what kind of time, please?
 
  • #13
marcus said:
DimReg explained the H-R idea in terms of "relativistic temperature" which is the same between processes in equilibrium---but I don't see this concept of relativistic temperature defined or used in the paper. However DimReg's explanation was very helpful and points us in the right direction. I want to reproduce it with slightly different words.

What they actually said is at the very bottom of page 4, the second to last paragraph of the Conclusions.

When things are at equilibrium what is the same is not the temperature, but the product of temperature and proper time. This should be the criterion, not temperature alone.

This is actually a very simple idea and one can understand it with an example:
Suppose upstairs and downstairs are at equilibrium. Then we know that proper time clocks run a little slower downstairs. Time goes slower deeper in a gravity well. But also, by the Tolman effect we know that the temperature downstairs is higher.

So chemical processes run faster because of higher temperature. The proper time SECOND is longer downstairs, but more stuff happens in a second because of higher temperature.

So there is an equilibrium of INFORMATION FLOW up and down. If the number of possible states up and down are equal, and downstairs is keeping the upstairs posted on all the changes it is experiencing, and vice versa, then there is a balance in the number of bits of information. This is said very imprecisely and it would need side conditions and additional clarification to make it meaningful, but that is the idea, I think. Information equilibrium.

The bottom paragraph of the lefthand column on page 4, with its two clocks example may be clearer than what I just said.

This is a more explicit version of what I meant (you state what the new derived equivalence class is).
 
  • #14
Paulibus said:
... Could you clarify who is doing the observing in this scenario, and with what kind of time, please?
Here are some of my thoughts on it. I think I'm saying about the same thing as DimReg, but he might want to correct me on some of this. It will not sound at first as if I am answering your question but I'll get around to it.

I think basic to thermodynamics is the idea of controlled contact and from this we get a primitive "pre-quantitative" idea of time as simply the interval when two systems are in contact, when they are interacting. They can be measuring time differently with different numbers but there is an unambiguous interval when the door or window is open between them.

Controlled contact is basic to thermo because you want to be able to consider a system in isolation, when it is not in contact with environment. And you want to be able to put two systems together and talk about the flow of heat or of information between them.

I think H&R are addressing a basic straightforward problem which is how do you talk about equilibrium in the general covariant setting. In the non relativistic setting you would associate equilibrium with being at the same temperature (during the interval of contact there is zero net heat flow, doesn't that mean same temperature?)

But that doesn't work in curved spacetime. You put two systems together and the temperatures will never equalize if one sits lower in the gravity well than the other! Downstairs always stays warmer! So what is it that does not flow when you briefly open the hatch between them?

The downstairs clock runs slower so it measures the contact interval as having fewer seconds, but downstairs is warmer so more happens in a second.

The upstairs clock runs faster so it measures the contact interval as having more seconds, but the people up there are more chilly and reserved so less happens in a second.

If there is equilibrium, I think H&R are saying, the "quantity of activity" during the contact interval is the same. Because of that trade-off. So if they are communicating thru the hatch there will be no net flow of information.

This is a curious and tentative thought. Your asking questions helps me a lot. I think I see some questions that need to be asked. However I will wait and see if you have some in response to what I just said.

(What is the same is the local temperature multiplied by the local proper time measure of the contact interval.
Actually this has a name: it is the thermal time measure of the contact interval, which they call tau.)
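Here is a rough numeric version of the same trade-off (my own sketch, with made-up numbers, using the weak-field approximation √g00 ≈ 1 + gh/c²; none of this is from the paper): over a 100 m height difference the fractional shift is about 10^-14, the downstairs temperature is higher by that fraction, the downstairs proper time is shorter by the same fraction, and the thermal times agree to first order.
[code]
# Weak-field sketch of the upstairs/downstairs trade-off (my own illustration,
# made-up numbers).  Assumptions: uniform gravity g over height dh, Tolman law
# T*sqrt(g00) = const with sqrt(g00) ~ 1 + g*h/c^2, shared coordinate interval.
G_ACC = 9.81            # m/s^2
C     = 2.99792458e8    # m/s
HBAR  = 1.054571817e-34 # J*s
K_B   = 1.380649e-23    # J/K

dh      = 100.0   # height difference in metres (made up)
T_up    = 300.0   # temperature upstairs in kelvin (made up)
t_coord = 1.0     # contact interval in coordinate time, seconds

eps      = G_ACC * dh / C**2           # fractional shift, about 1e-14
T_down   = T_up * (1.0 + eps)          # Tolman: downstairs is slightly hotter
tau_up   = t_coord                     # proper time elapsed upstairs (reference)
tau_down = t_coord * (1.0 - eps)       # clocks run slower downstairs

steps_up   = K_B * T_up   * tau_up   / HBAR
steps_down = K_B * T_down * tau_down / HBAR
print(f"fractional shift   = {eps:.3e}")
print(f"steps upstairs     = {steps_up:.10e}")
print(f"steps downstairs   = {steps_down:.10e}")   # equal to first order in eps
[/code]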

http://arxiv.org/abs/1302.0724
Death and resurrection of the zeroth principle of thermodynamics
Hal M. Haggard, Carlo Rovelli
(Submitted on 4 Feb 2013)
 
  • #15
I understand your upstairs/downstairs model better now. Thanks, Marcus, for the clarification. Perhaps it helps to think of your 'amount of activity', upstairs or downstairs, as say 1/(the frequency of maximum-intensity local black-body radiation). Which is a nice equilibrium thermal number itself, quite conWieniently proportional to temperature.

Of course we do actually live in a kind of upstairs/downstairs universe. We're upstairs and the remote universe is downstairs, exchanging with us red-shifted information, through a hatch, as it were. Perhaps expansion is just gravity's way of trying (but failing?) to maintain covariant thermal equilibrium (or a Steady-State, as some might call it) over a big, previously hot spacetime.
 
  • #16
I read parts of the paper, and I find equation 11 weakly motivated. Without adequate motivation the whole theory quickly falls apart. I wonder if anyone has something to say about it.
 
  • #17
Prathyush said:
I read parts of the paper, and I find equation 11 weakly motivated. Without adequate motivation the whole theory quickly falls apart. I wonder if anyone has something to say about it.
Hi Prathyush, I'm glad to hear from you! Equation (11) is simply equation (7) with ΔE identified as kT.
So I think you must find equation (7) weakly motivated.

Could you discuss a little why you find it so?

I can't claim expertise but I thought the derivation of (7) looked rather straightforward---a Taylor expansion and an inspection of the second derivative term.
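For context, this is roughly the Taylor-expansion argument I have in mind (my own paraphrase; check it against the paper's actual equation (7)): expanding the survival amplitude of a state whose energy spread is ΔE gives
[tex]
\left|\langle \psi(0)|\psi(t)\rangle\right|^{2} \simeq 1 - \frac{(\Delta E)^{2}\,t^{2}}{\hbar^{2}} + \dots
\qquad\Longrightarrow\qquad
t_{\rm distinguishable} \sim \frac{\hbar}{\Delta E},
[/tex]
so the time to reach a (nearly orthogonal) distinguishable state is set by the energy spread.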

Did you look at footnote 2 on page 3? That is where the argument is generalized to systems with many degrees of freedom. I did not go through every step but only glanced at it.

I would be interested to know in a little more detail what you find dubious about equation (7), or about its consequent equation (11). Could you spell it out a little more for me?
 
  • #18
Equation 7 essentially follows from Schrödinger's equation (please correct me on this if I am wrong); though I haven't verified the algebra, it seems to be correct.

In equation 11 he introduces temperature and uses ΔE = kT. This is what I find strange.

What is the logic behind this? The original ΔE has to do with energy eigenstates of the Hamiltonian under consideration.

He asserts that the variance is kT. Firstly, this need not be true for all systems; the general expression will depend on the specific Hamiltonian under consideration.
Secondly, assuming that the variance is kT, how can you associate this variance with the ΔE in equation 7, which deals with energy eigenstates of the Hamiltonian?

It is possible that I did not understand the author's intentions in the derivation of equation 11. But I think it needs further explanation.
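For what it's worth, here is a small check of where ΔE ≈ kT does and does not hold (my own sketch, using a single quantum harmonic oscillator in the canonical ensemble; not an example taken from the paper). The exact variance is (ΔE)² = (ħω)² e^x/(e^x − 1)² with x = ħω/kT, which approaches (kT)² only when kT ≫ ħω, in line with the objection above.
[code]
# Sketch (mine, not from the paper): energy variance of one quantum harmonic
# oscillator in the canonical ensemble, compared with (kT)^2.
# Exact result: Var(E) = (hbar*w)^2 * exp(x) / (exp(x) - 1)^2,  x = hbar*w/(k*T).
import math

def delta_E_over_kT(x):
    """Return DeltaE / (kT) for a harmonic oscillator, where x = hbar*omega/(k*T)."""
    var_over_kT2 = x**2 * math.exp(x) / (math.exp(x) - 1.0)**2
    return math.sqrt(var_over_kT2)

for x in (0.01, 0.1, 1.0, 10.0):
    print(f"hbar*omega/kT = {x:5.2f}   DeltaE/kT = {delta_E_over_kT(x):.4f}")
# Close to 1 when kT >> hbar*omega, strongly suppressed when kT << hbar*omega:
# the identification DeltaE ~ kT looks like a high-temperature estimate rather
# than an exact statement for every Hamiltonian.
[/code]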
 
  • #19
Thanks! This gives me something to focus on and think about. Maybe the identification of ΔE with kT is shaky. It seemed solid to me, but I will have another look. (Tomorrow when I wake up, it's bedtime here :biggrin:)

For readers new to the thread, here's the paper being discussed:

http://arxiv.org/abs/1302.0724
Death and resurrection of the zeroth principle of thermodynamics
Hal M. Haggard, Carlo Rovelli
(Submitted on 4 Feb 2013)
The zeroth principle of thermodynamics in the form "temperature is uniform at equilibrium" is notoriously violated in relativistic gravity. Temperature uniformity is often derived from the maximization of the total number of microstates of two interacting systems under energy exchanges. Here we discuss a generalized version of this derivation, based on informational notions, which remains valid in the general context. The result is based on the observation that the time taken by any system to move to a distinguishable (nearly orthogonal) quantum state is a universal quantity that depends solely on the temperature. At equilibrium the net information flow between two systems must vanish, and this happens when two systems transit the same number of distinguishable states in the course of their interaction.
5 pages, 2 figures
 
  • #20
Prathyush said:
I read parts of the paper, and I find equation 11 weakly motivated. Without adequate motivation the whole theory quickly falls apart. I wonder if anyone has something to say about it.

Take ΔEΔt ≈ ħ. It's just the Heisenberg energy-time uncertainty principle. Compare it to equation 7, and note that in the paragraph above equation 11, ΔE is derived to be ≈ kT.

Edit: it looks like Marcus already responded before I got here. oops
 
  • #21
marcus said:
For readers new to the thread, here's the paper being discussed:

http://arxiv.org/abs/1302.0724
Death and resurrection of the zeroth principle of thermodynamics
Hal M. Haggard, Carlo Rovelli
(Submitted on 4 Feb 2013)

Interesting. It seems likely that the universal time scale ħ/kT associated with a temperature T has some significance.
 
  • #22
DX: Yes, for me it is interesting. See my post #9, where I wrote
Paulibus said:
...Consider also the case of a non-thermal system, say a single atom. Here transitions involve the emission/absorption of a photon, and a 'step' or 'quantum jump' is, for any observer of the process, just her/his proper time for a single photon oscillation.
Just substitute for kT the change in energy for the step or quantum jump, and the relation (step energy change) = h × (frequency of the emitted photon) gives the result I've emphasized in the above quote. I thought this was interesting.

The 'universal time scale' you mention is a scale where time is counted in steps of (photon frequency)^-1, at least for single atoms.
 
  • #23
marcus said:
Thanks! This gives me something to focus on and think about. Maybe the identification of ΔE with kT is shaky. It seemed solid to me, but I will have another look.

It might be interesting to look at a degenerate Fermi gas (i.e., when [itex]kT \ll E_F[/itex], the Fermi energy). Then the average internal energy is

[tex]E \equiv \frac{U}{N} \sim \frac{3 E_F}{5} \left[ 1 + \frac{5}{12} \left( \frac{\pi k T}{E_F}\right)^2 \right].[/tex]

This contains the leading order correction in an expansion in [itex]kT/E_F[/itex] (see, for example, eq (8.30) of http://www.physics.udel.edu/~glyde/PHYS813/Lectures/chapter_8.pdf). We can compute the variance in the energy using ([itex]\beta = 1/(kT)[/itex])

[tex](\Delta U)^2 = - \frac{\partial U}{\partial \beta},[/tex]

so that

[tex]\Delta E \sim \pi \sqrt{ \frac{(kT)^3}{2E_F}}.[/tex]

This is very different from [itex]\sim kT[/itex], because the leading term in the energy was independent of the temperature. There is obviously some issue with the proposed "universal time step" when you apply it to the simplest fermionic system.
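A quick numerical check of the algebra above (my own sketch; it just re-uses the Sommerfeld-expansion formula quoted there together with (ΔU)² = −∂U/∂β = kT² ∂U/∂T, in arbitrary units with an illustrative Fermi energy):
[code]
# Numerical check of the degenerate-Fermi-gas estimate above (my own sketch).
# Uses E(T) = (3/5) E_F [1 + (5/12)(pi k T / E_F)^2] and (Delta E)^2 = k T^2 dE/dT,
# compared with the closed form pi*sqrt((kT)^3 / (2 E_F)).  Units: k = 1.
import math

K_B = 1.0
E_F = 100.0   # illustrative Fermi energy, arbitrary energy units

def mean_energy(T):
    return 0.6 * E_F * (1.0 + (5.0 / 12.0) * (math.pi * K_B * T / E_F) ** 2)

def delta_E_numeric(T, h=1e-6):
    dE_dT = (mean_energy(T + h) - mean_energy(T - h)) / (2.0 * h)
    return math.sqrt(K_B * T**2 * dE_dT)

def delta_E_closed(T):
    return math.pi * math.sqrt((K_B * T) ** 3 / (2.0 * E_F))

for T in (0.5, 1.0, 2.0):   # kT << E_F, i.e. the degenerate regime
    print(f"kT = {T}: numeric = {delta_E_numeric(T):.4f}, "
          f"closed form = {delta_E_closed(T):.4f}, compare kT = {K_B * T}")
# Both agree, and both are much smaller than kT in this regime.
[/code]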
 
  • #24
Interesting, a system where ΔE ~ T^1.5 instead of the more typical ΔE ~ T^1. As a reminder for anyone reading the thread, here's the paper being discussed:

http://arxiv.org/abs/1302.0724
Death and resurrection of the zeroth principle of thermodynamics
Hal M. Haggard, Carlo Rovelli
(Submitted on 4 Feb 2013)
The zeroth principle of thermodynamics in the form "temperature is uniform at equilibrium" is notoriously violated in relativistic gravity. Temperature uniformity is often derived from the maximization of the total number of microstates of two interacting systems under energy exchanges. Here we discuss a generalized version of this derivation, based on informational notions, which remains valid in the general context. The result is based on the observation that the time taken by any system to move to a distinguishable (nearly orthogonal) quantum state is a universal quantity that depends solely on the temperature. At equilibrium the net information flow between two systems must vanish, and this happens when two systems transit the same number of distinguishable states in the course of their interaction.
5 pages, 2 figures
=======================

One thing to note about this topic is that the overall aim is to develop generally covariant thermodynamics (invariant, among other things, under change of coordinates), and in that setting a "state" at a particular time may be the wrong handle for defining equilibrium. One may need to define equilibrium between processes or histories rather than between states.

Defining a state at a particular time appears to break general covariance, at least at first sight. There may be some way to get around this. But in any case one of the first things one needs to do is generalize the idea of equilibrium to a general covariant setup, where you put two processes in contact. Equilibrium corresponds to no net flow (of something: heat, information...) between the two.

I've been absorbed with other matters for the past few days, but this paper is intriguing and I want to get back to it. So maybe we can gradually get refocused on it.
 
  • #25
All the paper does, in a certain sense, is motivate and propose a general covariant idea of equilibrium. The non-relativistic examples and discussion leading up to section IV are heuristic.
==quote==
IV. EQUILIBRIUM BETWEEN HISTORIES
Let us come to the notion of equilibrium. Consider two systems, System 1 and System 2, that are in interaction during a certain interval. This interaction can be quite general but should allow the exchange of energy between the two systems. During the interaction interval the first system transits N1 states, and the second N2, in the sense illustrated above. Since an interaction channel is open, each system has access to the information about the states the other has transited through the physical exchanges of the interaction.
The notion of information used here is purely physical, with no relation to semantics, meaning, significance, consciousness, records, storage, or mental, cognitive, idealistic or subjectivistic ideas. Information is simply a measure of a number of states, as is defined in the classic text by Shannon [17].
System 2 has access to an amount of information I1 = logN1 about System 1, and System 1 has access to an amount of information I2 = log N2 about System 2. Let us define the net flow of information in the course of the interaction as δI = I2 − I1. Equilibrium is by definition invariant under time reversal, and therefore any flow must vanish. It is therefore interesting to postulate that also the information flow δI vanishes at equilibrium. Let us do so, and study the consequences of this assumption. That is, we consider the possibility of taking the vanishing of the information flow
δI = 0 (15)
as a general condition for equilibrium, generalizing the maximization of the number of microstates of the non-relativistic formalism.3
==endquote==

You can see that the paper is still in a heuristic mode because in thinking about information we fall back on the idea of state. I expect that a mathematically rigorous treatment of the same subject might employ Tomita time. What is being set out here is an intuitive basis---how to think about equilibrium in general covariant context. But I could be wrong and the idea of state could be rigorously defined at this point.

==quote from Conclusions==
VI. CONCLUSIONS
We have suggested a generalized statistical principle for equilibrium in statistical mechanics. We expect that this will be of use going towards a genuine foundation for general covariant statistical mechanics.
The principle is formulated in terms of histories rather than states and expressed in terms of information. It reads: Two histories are in equilibrium if the net information flow between them vanishes, namely if they transit the same number of states during the interaction period.
This is equivalent to saying that the thermal time τ elapsed for the two systems is the same...
==endquote==

That, I think, is the key statement of the paper. However you think about it, whatever your intuitive grasp, a DEFINITION of gen. cov. equilibrium is being proposed.
Two processes or histories are in equilibrium if during an interval of contact the thermal time elapsed in each is the same.
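To make the proposed definition concrete, here is a minimal sketch of my own (assuming the identification N ≈ kT·Δτ/ħ for the number of states a history transits, as discussed earlier in the thread): two histories are in equilibrium during a contact interval when their information contents I = log N agree, i.e. when the products T·Δτ agree.
[code]
# Minimal sketch (mine) of the proposed equilibrium-between-histories criterion,
# assuming N = k*T*tau/hbar states transited during the contact interval.
import math

HBAR = 1.054571817e-34   # J*s
K_B  = 1.380649e-23      # J/K

def states_transited(T, proper_time):
    return K_B * T * proper_time / HBAR

def info_flow(T1, tau1, T2, tau2):
    """Net information flow delta_I = I2 - I1 = log N2 - log N1 (in nats)."""
    return math.log(states_transited(T2, tau2)) - math.log(states_transited(T1, tau1))

def in_equilibrium(T1, tau1, T2, tau2, tol=1e-12):
    """Equilibrium iff the net information flow vanishes, i.e. T1*tau1 = T2*tau2."""
    return abs(info_flow(T1, tau1, T2, tau2)) < tol

# Downstairs slightly hotter but with proportionally less proper time elapsed:
print(in_equilibrium(300.0, 1.0, 300.0 * (1 + 1e-14), 1.0 / (1 + 1e-14)))  # True
# Same proper times but genuinely different temperatures:
print(in_equilibrium(300.0, 1.0, 310.0, 1.0))                              # False
[/code]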
 
  • #26
Marcus, any chance you could explain this to me?
The zeroth principle of thermodynamics in the form "temperature is uniform at equilibrium" is notoriously violated in relativistic gravity.
 
  • #27
Drakkith said:
Marcus, any chance you could explain this to me?
Well there was a guy at Caltech, named Richard Tolman, who wrote a book (published 1934) about General Relativity.
http://en.wikipedia.org/wiki/Richard_C._Tolman
He found that in curved spacetime a column of material at equilibrium would be at different temperatures at different heights. It was a very slight effect: temperature was naturally higher when you were lower down in a gravitational field.

If you ignore GR, and the Tolman Effect, then temperature is a good indicator of equilibrium. Two systems are in equilibrium if they are the same temperature. ("Zeroth Law") Put them in contact and there is no net flow of heat between.

But if you take account of GR, and the Tolman Effect, then that is not true. Upstairs and downstairs can be in contact and have come into equilibrium, yet downstairs is at a tiny bit higher temperature. So ever since the 1930s it has been known that the Zeroth Law notoriously fails if you allow for GR.

EDIT: I didn't know the name of the book, so looked it up:
Relativity, Thermodynamics, and Cosmology. Oxford: Clarendon Press. 1934.
 
  • #28
Ah, that does seem like a tiny problem. Thanks!
 
  • #29
I'm looking for a really simple way to consider covariant thermal equilibrium, and have got to wondering whether the information exchange by two observers in black-body cavities, differently situated in a spacetime pervaded by gravity, couldn't be quantified by simply counting the black-body photons each observer receives from the other, through small windows.

Perhaps equilibrium could be judged to prevail when each observer finds the locally measured flux of black-body photons coming from the other to be the same? Such measured flux depends on measured space dimensions and on measured time intervals which, for Tomita or thermal time, seem to me to be a count of time-steps of size (reciprocal of measured photon frequency).

Since both perceived space dimensions and perceived time step-lengths vary over gravity-pervaded spacetime, could this provide a covariant procedure?
 
  • #30
That sounds like a way to prove the Tolman effect! Have an upstairs and a downstairs cavity, and a small hole connecting the two. Thermal radiation from upstairs would gain energy (be blueshifted) by falling into the downstairs cavity. The observer downstairs would think that he was getting the same inflow as he was losing as an outflow.

The two observers would think they were in equilibrium, although they would actually be at slightly different temperatures.

I've never bothered to look up how Richard Tolman proved that effect. I'm lazy I guess and tend to just wait for the next paper rather than looking ahead--I expect other people to do the work :biggrin:

But actually what you are talking about does sound like the ingredients for a math proof of the Tolman effect.

BTW one way people have of talking about the Tolman effect is to say "Energy weighs." I'm not sure if that is a good way to think about it, or if it helps much, but I've seen the phrase used. Maybe there's some intuition in it.
Getting late here, so I'd better be off to bed.
 
  • #31
I've been reviewing the Haggard Rovelli "Zeroth Law" paper, and now see it as a truly basic one.

I think it provides the conceptual framework for how general covariant statistical mechanics will be done.
Notice that because the idea of the "state of a system at a given time" is not a covariant notion, we shift our focus from instantaneous state to protracted process.

"The core idea is to focus on histories rather than states.

Two systems placed in contact are described as two histories joined for a given interaction period.

In this conceptual framework, the paper shows how natural ideas of time, temperature, and equilibrium arise in a generally covariant way.

As an example, the authors give an elementary derivation of Wien's displacement law. (Section 5, page 4).
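A back-of-envelope way to see why something like Wien's law should come out (my own heuristic, not necessarily the paper's actual argument): identify the characteristic period of the peak thermal radiation with the elementary time step ħ/kT, and the peak frequency comes out proportional to temperature, which is Wien's displacement law up to an order-one factor:
[tex]
\frac{1}{\nu_{\rm max}} \sim t_0 = \frac{\hbar}{kT}
\quad\Longrightarrow\quad
\nu_{\rm max} \sim \frac{kT}{\hbar},
\qquad\text{to be compared with the exact }\; h\,\nu_{\rm max} \approx 2.82\,kT .
[/tex]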

Thermal time turns out to be connected to the Heisenberg uncertainty principle, which thereby acquires new concrete meaning. See page 3, right before equation (14):
"In a sense it is 'time counted in natural elementary steps', which exist because the Heisenberg principle implies an effective granularity of the phase space."

http://arxiv.org/abs/1302.0724
Death and resurrection of the zeroth principle of thermodynamics
Hal M. Haggard, Carlo Rovelli
(Submitted on 4 Feb 2013)
The zeroth principle of thermodynamics in the form "temperature is uniform at equilibrium" is notoriously violated in relativistic gravity. Temperature uniformity is often derived from the maximization of the total number of microstates of two interacting systems under energy exchanges. Here we discuss a generalized version of this derivation, based on informational notions, which remains valid in the general context. The result is based on the observation that the time taken by any system to move to a distinguishable (nearly orthogonal) quantum state is a universal quantity that depends solely on the temperature. At equilibrium the net information flow between two systems must vanish, and this happens when two systems transit the same number of distinguishable states in the course of their interaction.
5 pages, 2 figures

As it happened this paper did quite well on our first quarter MIP poll :biggrin: (over a quarter of us voted for it).
 
  • #32
There has been some followup to this paper, and some related work has appeared. I'll try to bring the references up to date.
http://arxiv.org/abs/1306.5206
The boundary is mixed
Eugenio Bianchi, Hal M. Haggard, Carlo Rovelli
(Submitted on 21 Jun 2013)
We show that Oeckl's boundary formalism incorporates quantum statistical mechanics naturally, and we formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, and surmise that local gravitational processes are indivisibly statistical with no possible quantal versus probabilistic distinction.
8 pages, 2 figures

As far as I can see this clinches the choice of formalism. The problem of quantum gravity is that of finding a general covariant QFT describing the behavior of geometry. And we know that GR has a deep relation to statistical mechanics. Ultimately that means quantum statistical mechanics, or QSM.
So the goal involves finding a single general covariant formalism for both covariant QFT and QSM.

In a general covariant setting there is no preferred time, so time-flow will probably need to arise Tomita-style from the quantum descriptor of the process enclosed in the boundary---that is by an element (or mix of elements) of the boundary Hilbertspace.
 
  • #33
There is (what I believe is) a very interesting related development by Laurent Freidel. Together with students/collaborators such as Ziprick and Yokokura. Freidel uses the term "screen" for the boundary of a spacetime region containing a process. He also calls it a "time-like world-tube".

Freidel makes the telling distinction between a truncation (e.g. a finite dimensional Fock space) and an approximation (the sort of thing one might expect to have a continuum limit.)

At the same time he is proposing a new kind of truncation for geometry: a continuous cell-decomposition into spiral-edge cells with flat interior. See the first talk of http://pirsa.org/13070057, by Ziprick. This seems a substantial improvement over previous cellular decompositions used in QG, and generalizes the Regge action. The edges of the spatial cell do not HAVE to be helical, they can be straight, but they are allowed to corkscrew or roll a little if they need to.

Freidel's talk is the first one of http://pirsa.org/13070042. You might, as I did, find some of the concepts unfamiliar and difficult to grasp, but nevertheless could find it worth watching (perhaps more than once). He insists on concentrating the physics in the boundary as much as possible (surface tension, entropy production, internal energy, relaxation to equilibrium... everything is happening in the boundary, or as he says the "screen"). BTW the boundary can have several topological components and the usual ideas of inside/outside can be reversed. The observer can be surrounded by the process, looking out from his own world-tube.

One reason the video talks, and the slides PDF, are valuable is that in many cases they are more pictorial. E.g. Ziprick shows a sample picture of a spiral-edge cell.
 
  • #34
We are seeing a paradigm take shape, I think. Made of separately familiar ideas in a possibly new configuration.
A process has a boundary (Oeckl gives the axiomatics).
A boundary is an interface for information flow---one could say a "channel". Freidel says "screen".
Two adjoining processes are in equilibrium if the net information flow is zero during their interface contact.

This is kind of interesting. During their contact the two processes could be experiencing different rates of TIME and different subjective TEMPERATURES, but if they are in equilibrium the effects somehow balance out. They each see the other going through the same number of changes, the same number of phasespace cells.

The quantum descriptor of a process lives in a Hilbertspace defined on the BOUNDARY of the process. I will refrain from calling the descriptor a "state" because that has the usual connotation of a "state at a given instant of time". There is no time: no objective time external to the process which can be referenced independently of the process descriptor.

The boundary Hilbertspace vectors describe accessible initial-during-final information about the process.
If it is a deep-rooted unalterable habit to call certain elements of a Hilbertspace by the name of "states", then by all means do so, but I am calling them "descriptors" of the process interfaced by the boundary mainly just to teach myself to think differently, namely in process or history terms.

One can ask the amplitude of a given description on the process boundary. It is a general covariant version of "transition amplitude", and the theory should give this.

One can ask about the time-flow subjective to the process, as described by a given element or mix of elements in the boundary Hilbertspace.

Tomita told us how to get an idea of "time" from such a descriptor, that is a flow on the observable algebra, or a one-parameter group of automorphisms.

That's kind of interesting. Still lots of gaps and questions in the paradigm. I understand only a tiny percentage of it. In Oeckl's talk he said that if you want to include FERMIONIC information in the boundary Hilbertspace then you have to generalize the Hilbertspace to have a negative definite as well as a positive definite piece. A "Krein" space is the direct sum of an ordinary (pos) Hilbert and a kind of inverted (neg) Hilbert. Strange, if true. If it is true, then can one carry through with the Tomita construction? I'm totally in the dark about this. Which is why it's interesting. Apparently there was a Mr. Krein who lived in the Ukraine, someone who will be famous if Oeckl has his way. Google it if you like. :biggrin:

So there is a kind of reading list (or "watching list") to lay out

http://arxiv.org/abs/1302.0724
Death and resurrection of the zeroth principle of thermodynamics

http://arxiv.org/abs/1306.5206
The boundary is mixed

Chirco's lead talk on http://pirsa.org/13070085

Freidel's lead talk on http://pirsa.org/13070042

Oeckl's lead talk on http://pirsa.org/13070084

Ziprick's lead talk on http://pirsa.org/13070057.

It surprised me to find that all four talks I wanted to cite were the first on their respective session recordings. That's lucky because it makes it easier to start watching. You don't have to wait for buffering before you drag to the desired segment. You just click on the link, select flash, and it starts.

Goffredo Chirco is a postdoc at Marseille who is interested in general covariant QSM (quantum stat. mech.). In a way I am repeating, in this post, the viewpoint presented in his talk. The adoption of Oeckl boundary formalism aims at getting both QFT and QSM in the same general covariant setup. Freidel's current work on "screens" seems to me like parallel evolution (which has turned up some very interesting new things).

Repeating some comment: It may be shallow of me but I like Freidel's distinction between a truncation (e.g. a finite dimensional Fock space) and an approximation (the sort of thing one might expect to have a continuum limit.) One can ask about the conditional amplitude of something on the condition that there are N particles. One does not have to take the direct sum of all the Fock spaces for every N.
Also even though it's risky to adopt something this novel, Freidel's radically new truncation for geometry appeals to me. It is a continuous cell-decomposition into spiral-edge cells, each with flat interior. The edges of the spatial cell do not HAVE to be helical, they can be straight, but they are allowed to corkscrew or roll a little if they need to.
Loops 13 talks are an important resource to keep handy. http://pirsa.org/C13029
Here are abstracts of parallel session talks: http://www.perimeterinstitute.ca/sites/perimeter-www.pi.local/files/conferences/attachments/Parallel%20Session%20Abstracts_7.pdf
Here are links to the parallel session talks:
https://www.physicsforums.com/showthread.php?p=4461021#post4461021
 
  • #35
Since we've turned a page I'll bring forward the papers we are discussing in this thread. Please keep in mind that the topic is the idea of time that arises in these particular papers.
If someone has a different idea perhaps connected with some other research, they are welcome to start their own thread about it in the appropriate forum.

But in this thread let's please stay focused on what is presented here:

http://arxiv.org/abs/1302.0724
Death and resurrection of the zeroth principle of thermodynamics
Hal M. Haggard, Carlo Rovelli
(Submitted on 4 Feb 2013)
The zeroth principle of thermodynamics in the form "temperature is uniform at equilibrium" is notoriously violated in relativistic gravity. Temperature uniformity is often derived from the maximization of the total number of microstates of two interacting systems under energy exchanges. Here we discuss a generalized version of this derivation, based on informational notions, which remains valid in the general context. The result is based on the observation that the time taken by any system to move to a distinguishable (nearly orthogonal) quantum state is a universal quantity that depends solely on the temperature. At equilibrium the net information flow between two systems must vanish, and this happens when two systems transit the same number of distinguishable states in the course of their interaction.
5 pages, 2 figures

Thermal time turns out to be connected to the Heisenberg uncertainty principle, which thereby acquires new concrete meaning. See page 3, right before equation (14):
"In a sense it is 'time counted in natural elementary steps', which exist because the Heisenberg principle implies an effective granularity of the phase space."

http://arxiv.org/abs/1306.5206
The boundary is mixed
Eugenio Bianchi, Hal M. Haggard, Carlo Rovelli
(Submitted on 21 Jun 2013)
We show that Oeckl's boundary formalism incorporates quantum statistical mechanics naturally, and we formulate general-covariant quantum statistical mechanics in this language. We illustrate the formalism by showing how it accounts for the Unruh effect. We observe that the distinction between pure and mixed states weakens in the general covariant context, and surmise that local gravitational processes are indivisibly statistical with no possible quantal versus probabilistic distinction.
8 pages, 2 figures

To recall the essentials of what I said earlier, this seems to clinch the choice of formalism. The problem of quantum gravity is that of finding a general covariant QFT describing the behavior of geometry. And we know that GR has a deep relation to statistical mechanics. Ultimately that means quantum statistical mechanics, or QSM.
So the goal involves finding a single general covariant formalism for both covariant QFT and QSM.

In a general covariant setting there is no preferred time, so time-flow will probably need to arise Tomita-style from the quantum descriptor of the process enclosed in the boundary---that is by an element (or mix of elements) of the boundary Hilbertspace.
 
