Haggard Rovelli thermodynamics paper says what time is

In summary, the paper discusses the zeroth law of thermodynamics, which states that temperature is uniform at equilibrium, and a generalized version of that law which remains valid in the general covariant context. At this point, however, the generalization is beyond the reach of experimental or observational confirmation.
  • #36
The statement in the Bianchi Haggard Rovelli abstract highlighted above
"...Oeckl's boundary formalism incorporates QSM naturally, and we formulate general-covariant QSM in this language."
makes it urgent to ask questions about Oeckl's formulation of quantum theory. He has recently come out with a radically different alternative version which requires fewer axioms. I gather it really is proposed as an optional alternative, not as a replacement. The earlier axioms are included in an appendix.

The new alternative version is apt to strike people as conceptually unfamiliar---it uses positive real numbers (a generalized notion of probability) in place of complex amplitudes (!) but promises to be able to recover conventional quantum mechanical results. Lucien Hardy is credited with having inspired this seemingly risky gambit. On the other hand this alternative Oeckl formulation is IMHO aesthetically appealing. It certainly is not the version being used by the Loop gravity authors but I don't want to ignore it.

http://arxiv.org/abs/1212.5571
A positive formalism for quantum theory in the general boundary formulation
Robert Oeckl (CCM-UNAM)
(Submitted on 21 Dec 2012)
We introduce a new "positive formalism" for encoding quantum theories in the general boundary formulation, somewhat analogous to the mixed state formalism of the standard formulation. This makes the probability interpretation more natural and elegant, eliminates operationally irrelevant structure and opens the general boundary formulation to quantum information theory.
28 pages

A recent exposition of the more familiar older form of Oeckl's formulation of quantum theory is here:
http://arxiv.org/abs/1201.1877
Schrödinger-Feynman quantization and composition of observables in general boundary quantum field theory
Robert Oeckl (UNAM)
(Submitted on 9 Jan 2012)
We show that the Feynman path integral together with the Schrödinger representation gives rise to a rigorous and functorial quantization scheme for linear and affine field theories. Since our target framework is the general boundary formulation, the class of field theories that can be quantized in this way includes theories without a metric spacetime background. We also show that this quantization scheme is equivalent to a holomorphic quantization scheme proposed earlier and based on geometric quantization. We proceed to include observables into the scheme, quantized also through the path integral. We show that the quantized observables satisfy the canonical commutation relations, a feature shared with other quantization schemes also discussed. However, in contrast to other schemes the presented quantization also satisfies a correspondence between the composition of classical observables through their product and the composition of their quantized counterparts through spacetime gluing. In the special case of quantum field theory in Minkowski space this reproduces the operationally correct composition of observables encoded in the time-ordered product. We show that the quantization scheme also generalizes other features of quantum field theory such as the generating function of the S-matrix.
47 pages

One slight inconsistency of terminology: in the more recent paper an infinitesimally thin region is called a "slice". What is now called a slice region was called an "empty region" in the earlier paper. This change is pointed out by the author. In any case confusion is unlikely to result. Overall the style is conveniently thorough and clear.
 
Last edited:
  • #37
The last section ("thermality of gravitational states") of the Bianchi Haggard Rovelli paper begins with three fairly dense paragraphs that require study, in my case at least.
==quote June 2013 BHR paper section 6==
So far, gravity has played no direct role in our considerations. The construction above, however, is motivated by general relativity, because the boundary formalism is not needed as long as we deal with a quantum field theory on a fixed geometry, but becomes crucial in quantum gravity, where it allows us to circumvent the difficulties raised by diffeomorphism invariance in the quantum context.

In quantum gravity we can study probability amplitudes for local processes by associating boundary states to a finite portion of spacetime, and including the quantum dynamics of spacetime itself in the process. Therefore the boundary state includes the information about the geometry of the region itself.

The general structure of statistical mechanics of relativistic quantum geometry has been explored in [15], where equilibrium states are characterized as those whose Tomita flow is a Killing vector of the mean geometry. Up until now it hasn’t been possible to identify the statistical states in the general boundary formalism and so this strategy hasn’t been available in this more covariant context. With a boundary notion of statistical states this becomes possible. It becomes possible, in particular, to check if given boundary data allow for a mean geometry that interpolates them.
==endquote==
Reference [15] is to C. Rovelli, “General relativistic statistical mechanics,” arXiv:1209.0065. It seems my work is cut out for me, if I want to understand what's taking shape here. The crucial connection between the two papers makes use of the concept of a mean geometry.
 
  • #38
http://arxiv.org/abs/1209.0065 was basically a *classical* paper, and moreover did not introduce the boundary formalism. In retrospect one can see how it set things up in preparation for the BHR paper discussed in this thread.

Here are some points listed in the conclusion of the September 2012 paper
==quote "General relativistic statistical mechanics" 1209.0065 ==
We have extended the machinery of statistical thermodynamics to the general covariant context. The new concepts with respect to conventional statistical mechanics are:

1. The statistical state is defined on the space of the solution of the field equation.

2. Each statistical state defines a preferred time flow, called thermal time.

3. A statistical state whose thermal time flow has a geometrical interpretation, in the sense that it can be reinterpreted as evolution with respect to a local internal time, defines a generalized Gibbs state, with properties similar to the conventional equilibrium states.

4. For such states, it is possible to define the relative global temperature between two states.

5. A mean geometry is a stationary classical geometry with a timelike killing field and a time foliation, such that the value of a suitable family of observables reproduces the statistical expectation values of these observables in the statistical ensemble.

6. If a mean geometry exists, a local temperature is defined. Local temperature is the ratio between thermal time and proper time on the mean geometry:
T(x) = (ℏ/k) dτ/ds
It yields immediately the Tolman law.

This construction reduces to conventional thermodynamics for conventional Hamiltonian systems rewritten in a parametrized language.

Examples, extension of the formalism to the boundary formalism [44–46], which is the natural language for quantum field theory in the generally covariant context, and applications to horizon thermodynamics, and in particular to the local framework defined in [47] and the derivation of black hole entropy in loop quantum gravity in [42], will be considered elsewhere.
==endquote==
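Point 6 can be made concrete with a toy calculation. The Tolman law says that for a static spacetime in thermal equilibrium the product T(x)·√(-g_tt(x)) is constant, so local temperature climbs with gravitational depth. Here is a minimal Python sketch of my own (not from the paper), using the Schwarzschild exterior in geometric units with illustrative numbers:

```python
import math

# Tolman law for a static spacetime in thermal equilibrium:
# T(x) * sqrt(-g_tt(x)) is constant, so the local temperature is
# higher deeper in the gravitational well. Illustrated here for the
# Schwarzschild exterior, -g_tt = 1 - r_s/r (geometric units).

def tolman_temperature(T_inf, r, r_s):
    """Local temperature at radius r, given temperature T_inf at infinity
    and Schwarzschild (horizon) radius r_s."""
    if r <= r_s:
        raise ValueError("r must lie outside the horizon")
    return T_inf / math.sqrt(1.0 - r_s / r)

T_inf, r_s = 1.0, 2.0
for r in (3.0, 10.0, 1000.0):
    T = tolman_temperature(T_inf, r, r_s)
    # the Tolman invariant is the same at every radius:
    print(f"r = {r:7.1f}   T = {T:.4f}   T*sqrt(-g_tt) = {T * math.sqrt(1 - r_s / r):.4f}")
```

Far from the mass T approaches T_inf; near the horizon T diverges. That is the classical face of the Tolman effect that comes up again later in this thread.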

So this present development was planned for and announced last September, almost a year ago. It seems one should have been able to anticipate it.
 
  • #39
Marcus said:
We are seeing a paradigm take shape, I think. Made of separately familiar
ideas in a possibly new configuration. A process has a boundary (Oeckl gives the axiomatics).
A boundary is an interface for information flow---one could say a "channel". Freidel says
"screen". Two adjoining processes are in equilibrium if the net information flow is zero during
their interface contact.

Perhaps it's possible to start to unpack this kind of paradigm in simple terms -- I'll try!

Think of yourself as a process -- a talking, walking, writing animal living in a changing and
evolving universe, filled with stuff that is always and everywhere ruled by physical laws. You
and lots of this stuff can be represented mathematically as localised collections of things with
bulk interiors and boundaries.

All this is subject to change that we try to describe by mathematically representing how
change happens across the represented dimensions of bulk and/or boundaries. Lots of
information is thereby locally generated and exchanged between various processes, perhaps
excepting fundamental entities like electrons.

Statistics is an appropriate tool for compressing information about collections of stuff. A
familiar example is the descriptor 'temperature' (T). This is an emergent, quantifiable
descriptor iff the focus is on collections of stuff that are busy interchanging energy among
their constituents. Examples: 1. T can be quantified in the case of quantum items, say atoms,
by the exponential way energy is distributed among energy levels or, 2. in the case of items
that exchange energy via photons, by the peak frequency of the ‘black-body’ radiation that
facilitates energy exchange. Such compression of information enables powerful laws to be
formulated, like the laws of thermodynamics, which apply to collections of interacting stuff.
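To make example 1 concrete: with k_B = 1, Boltzmann populations satisfy ln p_n = -E_n/T + const, so T is recoverable from a straight-line fit of log-population against energy. A toy sketch of my own (synthetic numbers, not from any of the papers):

```python
import math

# Recovering T from Boltzmann-distributed level populations:
# p_n ~ exp(-E_n / T) (units with k_B = 1), so a straight-line fit
# of ln(p_n) against E_n has slope -1/T.

def temperature_from_populations(energies, populations):
    """Least-squares slope of ln(p) vs E; returns T = -1/slope."""
    logs = [math.log(p) for p in populations]
    n = len(energies)
    mean_E = sum(energies) / n
    mean_L = sum(logs) / n
    cov = sum((E - mean_E) * (L - mean_L) for E, L in zip(energies, logs))
    var = sum((E - mean_E) ** 2 for E in energies)
    return -var / cov   # T = -1 / (cov/var)

T_true = 2.5
E = [0.0, 1.0, 2.0, 3.0, 4.0]
p = [math.exp(-En / T_true) for En in E]
print(temperature_from_populations(E, p))  # recovers 2.5
```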

The way descriptions are now developing seems to me (see your post #38 above) to be that
the descriptor T that quantifies our familiar words ‘hot’ and ‘cold’ is to be joined by a
statistical, emergent descriptor t that for an individually represented process quantifies time,
via the concept of Tomita time.

post 38 said:
==quote "General relativistic statistical mechanics" 1209.0065 ==
We have extended the machinery of statistical thermodynamics to the general covariant
context. The new concepts with respect to conventional statistical mechanics are:

1. The statistical state is defined on the space of the solution of the field equation.

2. Each statistical state defines a preferred time flow, called thermal time.

3. A statistical state whose thermal time flow has a geometrical interpretation, in the sense
that it can be reinterpreted as evolution with respect to a local internal time, defines a
generalized Gibbs state, with properties similar to the conventional equilibrium states.

4. For such states, it is possible to define the relative global temperature between two states.

5. A mean geometry is a stationary classical geometry with a timelike killing field and a time
foliation, such that the value of a suitable family of observables reproduces the statistical
expectation values of these observables in the statistical ensemble.

6. If a mean geometry exists, a local temperature is defined. Local temperature is the ratio
between thermal time and proper time on the mean geometry:
T(x) = (ℏ/k) dτ/ds
It yields immediately the Tolman law.

This construction reduces to conventional thermodynamics for conventional Hamiltonian
systems rewritten in a parametrized language.

Examples, extension of the formalism to the boundary formalism [44–46], which is the
natural language for quantum field theory in the generally covariant context, and applications
to horizon thermodynamics, and in particular to the local framework defined in [47] and the
derivation of black hole entropy in loop quantum gravity in [42], will be considered elsewhere.
==endquote==
On a lighter note, I’m reminded of:

The Siphonaptera said:
Great fleas have little fleas upon their backs to bite 'em,
And little fleas have lesser fleas, and so ad infinitum.
And the great fleas themselves, in turn, have greater fleas to go on,
While these again have greater still, and greater still, and so on.
 
  • #40
Hi Paulibus, great hearing from you. Your regular-language account seems right on target for the most part. But it tells me I may have blundered by trying to substitute the word "descriptor" for what the experts are calling "state". Still undecided, and trying alternatives out.

The state is an overall PROFILE of a process which has/will happen(ed) in a compact 4D region.
OOPS I have to go, forgot about a time-critical errand my wife needs done this morning.
Back later. Anyway for the moment I will bring forward the post you were referencing.
I may have to resort to saying "state" even though to many ears it sounds like "state at a given instant of time". Here it means a state that describes what we know or expect to find out about the process for its entire duration. This state lives in a Hilbert space of possible states, each of which gives a profile of the process for its entire duration. Specifying a state could involve making/planning a number of measurements and imposing side restrictions.
post 34 said:
We are seeing a paradigm take shape, I think. Made of separately familiar ideas in a possibly new configuration.
A process has a boundary (Oeckl gives the axiomatics).
A boundary is an interface for information flow---one could say a "channel". Freidel says "screen".
Two adjoining processes are in equilibrium if the net information flow is zero during their interface contact.

This is kind of interesting. During their contact the two processes could be experiencing different rates of TIME and different subjective TEMPERATURES but if they are in equilibrium the effects somehow balance out. They each see the other going through the same number of changes, the same number of phasespace cells.
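A toy model of that balance (my own illustration, not from any of the papers): two Einstein solids sharing energy quanta. The most probable split is exactly the one where passing a quantum across the interface changes the two entropies by equal and opposite amounts, i.e. where the net information flow vanishes:

```python
from math import comb, log

# Toy version of "equilibrium = zero net information flow": two Einstein
# solids (N oscillators, q shared quanta), entropy S = ln(multiplicity),
# multiplicity(N, q) = C(q + N - 1, q).

def entropy(N, q):
    return log(comb(q + N - 1, q))

N1, N2, q_total = 60, 40, 100
# most probable split of the quanta between the two solids:
q1_star = max(range(q_total + 1),
              key=lambda q1: entropy(N1, q1) + entropy(N2, q_total - q1))

# At the peak, the entropy solid 1 gains on receiving one more quantum
# matches what solid 2 would gain -- a discrete stand-in for
# dS1/dE = dS2/dE, i.e. equal temperatures, zero net information flow.
gain1 = entropy(N1, q1_star + 1) - entropy(N1, q1_star)
gain2 = entropy(N2, q_total - q1_star + 1) - entropy(N2, q_total - q1_star)
print(q1_star, round(gain1, 3), round(gain2, 3))
```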

The quantum [STRIKE]descriptor[/STRIKE] profile? of a process lives in a Hilbertspace defined on the BOUNDARY of the process. I will refrain from calling the [STRIKE]descriptor[/STRIKE] profile a "state" because that has the usual connotation of a "state at a given instant of time". There is no time: no objective time external to the process which can be referenced independently of the process [STRIKE]descriptor[/STRIKE] profile.

The boundary Hilbertspace vectors describe accessible initial-during-final information about the process.
If it is a deep-rooted unalterable habit to call certain elements of a Hilbertspace by the name of "states" then you should, but I am calling them "[STRIKE]descriptors[/STRIKE]profiles" of the process interfaced by the boundary mainly just to teach myself to think differently, namely in process or history terms.

One can ask the amplitude of a given description on the process boundary. It is a general covariant version of "transition amplitude", and the theory should give this.

One can ask about the time-flow subjective to the process, as described by a given element or mix of elements in the boundary Hilbertspace.

Tomita told us how to get an idea of "time" from such a [STRIKE]descriptor[/STRIKE] [oh hell I might as well give up and call it a state], that is a flow on the observable algebra, or a one-parameter group of automorphisms.

That's kind of interesting. Still lots of gaps and questions in the paradigm. I understand only a tiny percentage of it. In Oeckl's talk he said that if you want to include FERMIONIC information in the boundary Hilbertspace then you have to generalize the Hilbertspace to have a negative definite as well as a positive definite piece. A "Krein" space is the direct sum of an ordinary (pos) Hilbert and a kind of inverted (neg) Hilbert. Strange, if true. If it is true, then can one carry through with the Tomita construction? I'm totally in the dark about this. Which is why it's interesting. Apparently there was a Mr. Krein who lived in the Ukraine, someone who will be famous if Oeckl has his way. Google it if you like. :biggrin:
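For what the Tomita construction gives in the simplest (finite-dimensional, bosonic) case: a faithful density matrix ρ defines the flow A ↦ ρ^{it} A ρ^{-it}, and for a Gibbs state ρ ∝ e^{-βH} this is just Heisenberg evolution in rescaled time. A small numpy sketch of my own (a 2-level toy check; the Krein/fermionic question above is beyond it):

```python
import numpy as np

# Finite-dimensional Tomita (modular) flow of a Gibbs state:
# rho = exp(-beta*H)/Z, flow alpha_t(A) = rho^{it} A rho^{-it}.
# For this state the flow coincides with Heisenberg evolution in
# rescaled time -- the sense in which a statistical state "defines
# a preferred time flow".

beta = 0.7
H = np.diag([0.0, 1.3])                       # toy 2-level Hamiltonian
rho = np.diag(np.exp(-beta * np.diag(H)))
rho /= np.trace(rho)                          # Gibbs state

def modular_flow(A, t):
    """alpha_t(A) = rho^{it} A rho^{-it}; rho is diagonal here, and
    rho^{it} is unitary, so its inverse is the conjugate transpose."""
    U = np.diag(np.diag(rho).astype(complex) ** (1j * t))
    return U @ A @ U.conj().T

def heisenberg(A, s):
    """e^{iHs} A e^{-iHs} for the diagonal H above."""
    U = np.diag(np.exp(1j * np.diag(H) * s))
    return U @ A @ U.conj().T

A = np.array([[0, 1], [1, 0]], dtype=complex)  # an observable
t = 0.42
# modular flow at t equals Heisenberg evolution at s = -beta*t:
print(np.allclose(modular_flow(A, t), heisenberg(A, -beta * t)))  # True
```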

So there is a kind of reading list (or "watching list") to lay out

http://arxiv.org/abs/1302.0724
Death and resurrection of the zeroth principle of thermodynamics

http://arxiv.org/abs/1306.5206
The boundary is mixed

Chirco's lead talk on http://pirsa.org/13070085

Freidel's lead talk on http://pirsa.org/13070042

Oeckl's lead talk on http://pirsa.org/13070084

Ziprick's lead talk on http://pirsa.org/13070057.

It surprised me to find that all four talks I wanted to cite were the first on their respective session recordings. That's lucky because it makes it easier to start watching. You don't have to wait for buffering before you drag to the desired segment. You just click on the link, select flash, and it starts.

Goffredo Chirco is a postdoc at Marseille who is interested in general covariant QSM (quantum stat. mech.). In a way I am repeating, in this post, the viewpoint presented in his talk. The adoption of Oeckl boundary formalism aims at getting both QFT and QSM in the same general covariant setup. Freidel's current work on "screens" seems to me like parallel evolution (which has turned up some very interesting new things).

To repeat an earlier comment: it may be shallow of me but I like Freidel's distinction between a truncation (e.g. a finite dimensional Fock space) and an approximation (the sort of thing one might expect to have a continuum limit). One can ask about the conditional amplitude of something on the condition that there are N particles. One does not have to take the direct sum of all the Fock spaces for every N.
Also even though it's risky to adopt something this novel, Freidel's radically new truncation for geometry appeals to me. It is a continuous cell-decomposition into spiral-edge cells, each with a flat interior. The edges of the spatial cell do not HAVE to be helical; they can be straight, but they are allowed to corkscrew or roll a little if they need to.
Loops 13 talks are an important resource to keep handy: http://pirsa.org/C13029
Here are abstracts of parallel session talks: http://www.perimeterinstitute.ca/sites/perimeter-www.pi.local/files/conferences/attachments/Parallel%20Session%20Abstracts_7.pdf
Here are links to the parallel session talks:
https://www.physicsforums.com/showthread.php?p=4461021#post4461021
 
  • #41
This thread is now imperfectly titled, because since its beginning I've noticed several other people active in this area: Bianchi, Oeckl, Freidel, Ziprick, Chirco...

To summarize, I think I see a new approach taking shape to general covariant QFT and QSM (quantum statistical mechanics, with implications for thermodynamics as well)---an approach that is different from the familiar Dirac "Hamiltonian constraint" scheme where the Hamiltonian has to be identically equal to zero.

I could be wrong--a significant new approach to replace "Dirac canonical" might NOT be taking shape. Or a new approach (based on the boundary formalism) might be taking shape but destined to FAIL because of some as-yet-unrecognized irremediable FLAW. It's a suspenseful time. We have to wait and see.

But right now this revolution-like development seems to make sense and to be gaining momentum.
More people seem to be getting involved as time goes on, and it's fun to watch.
 
  • #42
BTW it's worth noting the two radically different uses of the word "lattice". check this out:
http://en.wikipedia.org/wiki/Lattice_(order)

A partially ordered set in which any two elements have a "join" and a "meet". The subspaces of a Hilbert space form a lattice. The "meet" of two subspaces is their intersection (which is itself a subspace). The "join" of two subspaces is the unique smallest subspace containing their union. In this case it is the span of the two.

In a Hilbertspace there is a natural idea of the projection operator onto a given subspace. Corresponding to the lattice of subspaces there is a lattice of projection operators.
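The meet and join just described can be computed directly from the projectors. A numpy sketch of my own, with two planes in R^3; the meet uses the De Morgan-style identity meet(U, V) = complement of the join of the complements:

```python
import numpy as np

# The lattice of subspaces, represented by orthogonal projectors:
# join = projector onto the span of the union, meet = projector onto
# the intersection, via meet(U, V) = (U_perp join V_perp)_perp.

def projector(vectors):
    """Orthogonal projector onto the span of the given vectors."""
    A = np.atleast_2d(np.array(vectors, dtype=float)).T   # vectors as columns
    U, s, _ = np.linalg.svd(A, full_matrices=False)
    U = U[:, : int(np.sum(s > 1e-10))]                    # orthonormal basis of the range
    return U @ U.T

def join(P, Q):
    """Smallest subspace containing both ranges: span of all the columns."""
    return projector(np.hstack([P, Q]).T)

def meet(P, Q, dim):
    """Intersection of the ranges: complement of the join of complements."""
    I = np.eye(dim)
    return I - join(I - P, I - Q)

P = projector([[1, 0, 0], [0, 1, 0]])   # xy-plane in R^3
Q = projector([[1, 0, 0], [0, 0, 1]])   # xz-plane in R^3
print(np.allclose(meet(P, Q, 3), projector([[1, 0, 0]])))  # True: the x-axis
print(np.allclose(join(P, Q), np.eye(3)))                  # True: all of R^3
```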

I mention this because in some areas of physics the word "lattice" refers to something completely different---a kind of regular GRID or cellular skeleton. Confusion might result.

In Oeckl's positive boundary formalism (I'll abbreviate it PBF) the lattice of projection operators is almost, you could say, more important than the Hilbertspace itself. Or it is the raison d'être of the Hilbertspace---the reason for having it in the first place.
======
Another thing is that Oeckl PBF is based on "topological manifolds" which sounds very complicated and technical but is in fact the most boring nondescript structureless thing you could imagine basing a physics theory on. A topo manifold is a shapeless set: no metric, no differential structure. Besides its topology all it really has is a certain dimensionality, in the sense of being locally mappable (homeomorphic) to R^d. Let's specialize to d=4. All we ask is that every point in the set have an open neighborhood that maps to an open neighborhood of R^4--continuously in both directions.
 
  • #43
What I'm trying to stress is the lightness of the math framework--how undemanding the assumptions are in Oeckl PBF. All you basically require is that the process you are studying occur in a compact bounded 4d topological manifold. He calls that a region. And the boundary is assumed to be a 3d topological manifold which he calls a hypersurface.

There's a Hilbertspace associated with the boundary which means there is a lattice of projection operators that I think of as corresponding to STATEMENTS you can make about the boundary, and consequently make regarding what was/is/will be in progress inside the region. Including as regards how that process interacts with and therefore includes the geometry in the region.

What I'm thinking (and it would be great if you could offer a contrasting viewpoint and persuade me otherwise) is that this setup is probably MINIMAL. These are the weakest, most structureless assumptions that it is possible to make and still be able to make statements about some process going on somewhere--some quantum field theory or statistical mechanics process happening somewhere.

Also on standby here is the idea of "truncation". Because we can never make more than a finite number of measurements, detections, or predictions--or impose more than a finite number of side conditions--it's probably advisable to have some convenient way to truncate the information content. Some "N", maybe an arbitrary restriction on the dimension of the Hilbertspace. I'm not sure about this at the moment and can't make a definite comment. I don't see it in any of the Oeckl papers I've looked at.
 
  • #44
The next installment:
http://arxiv.org/abs/1309.0777
Coupling and thermal equilibrium in general-covariant systems
Goffredo Chirco, Hal M. Haggard, Carlo Rovelli
(Submitted on 3 Sep 2013)
A fully general-covariant formulation of statistical mechanics is still lacking. We take a step toward this theory by studying the meaning of statistical equilibrium for coupled, parametrized systems. We discuss how to couple parametrized systems. We express the thermalization hypothesis in a general-covariant context. This takes the form of vanishing of information flux. An interesting relation emerges between thermal equilibrium and gauge.
8 pages, 3 figures
 
  • #45
The Planck Stars paper, with its new understanding of what's going on in a black hole, and especially the paper by Chirco Haggard Riello Rovelli ("CHRR") that re-interprets Jacobson's 1995 result put the QG and Thermodynamics research discussed in this thread in a new light.

There seems to be a high-stakes game in progress: a coherent line of investigation involving a number of people. Not only Haggard and Rovelli as mentioned in the title of the thread, but a longer list including:
Bianchi, Chirco, Haggard, Riello, Rovelli, Vidotto, and possibly others I'm forgetting to mention.

The investigation of general covariant thermodynamics, thermal time, equilibrium, temperature, information flux, entropy (especially in general covariant setting) which we saw getting started or restarted back in 2012 has been paying off.

Resolving, for example, the "black hole information paradox" and seemingly obviating the "firewall" puzzle that until recently occupied the attention of so many reputable senior researchers.

So I'll update this thread with a few links and excerpts of more recent work.

The CHRR thread: https://www.physicsforums.com/showthread.php?t=734298
and a couple abstracts:

http://arxiv.org/abs/1401.5262
Spacetime thermodynamics without hidden degrees of freedom
Goffredo Chirco, Hal M. Haggard, Aldo Riello, Carlo Rovelli
(Submitted on 21 Jan 2014)
A celebrated result by Jacobson is the derivation of Einstein's equations from Unruh's temperature, the Bekenstein-Hawking entropy and the Clausius relation. This has been repeatedly taken as evidence for an interpretation of Einstein's equations as equations of state for unknown degrees of freedom underlying the metric. We show that a different interpretation of Jacobson result is possible, which does not imply the existence of additional degrees of freedom, and follows only from the quantum properties of gravity. We introduce the notion of quantum gravitational Hadamard states, which give rise to the full local thermodynamics of gravity.
12 pages, 1 figure

http://arxiv.org/abs/1401.6562
Planck stars
Carlo Rovelli, Francesca Vidotto
(Submitted on 25 Jan 2014)
A star that collapses gravitationally can reach a further stage of its life, where quantum-gravitational pressure counteracts weight. The duration of this stage is very short in the star proper time, yielding a bounce, but extremely long seen from the outside, because of the huge gravitational time dilation. Since the onset of quantum-gravitational effects is governed by energy density --not by size-- the star can be much larger than Planckian in this phase. The object emerging at the end of the Hawking evaporation of a black hole can then be larger than Planckian by a factor (m/mP)n, where m is the mass fallen into the hole, mP is the Planck mass, and n is positive. The existence of these objects alleviates the black-hole information paradox. More interestingly, these objects could have astrophysical and cosmological interest: they produce a detectable signal, of quantum gravitational origin, around the 10−14cm wavelength.
5 pages, 3 figures.
 
  • #46
Another paper has appeared which relates to and possibly extends those in this thread. This is the February 2015 "Compact phase space, cosmological constant" paper of Rovelli and Vidotto.
I will get the link.
It turns out that implanting a small cosmological curvature constant Λ in the simplices of quantum gravity leads to a compact phase space. This gives a minimal separation of distinguishable states, so tends to confirm the kind of practical discreteness that the HR and CHR papers are talking about (as a springboard to the first rigorous general covariant thermodynamics)
Here is the link.
http://arxiv.org/abs/1502.00278
Compact phase space, cosmological constant, discrete time
Carlo Rovelli, Francesca Vidotto
(Submitted on 1 Feb 2015)
We study the quantization of geometry in the presence of a cosmological constant, using a discretization with constant-curvature simplices. Phase space turns out to be compact and the Hilbert space finite dimensional for each link. Not only the intrinsic, but also the extrinsic geometry turns out to be discrete, pointing to discreteness of time, in addition to space. We work in 2+1 dimensions, but these results may be relevant also for the physical 3+1 case.
6 pages
 
  • #47
There is an intuitive reasoning which says that thermodynamics is the way to rigorous confirmation of LQG.
The line of reasoning goes as follows.
We suspect from the CHRR paper and Jacobson 1995 that the Einstein GR equation is the thermodynamic equation of state of the LQG degrees of freedom. They are the molecules of geometry which in bulk constitute the large-scale fluid of geometry described by the Einstein equation of state.
So the way to rigorously confirm our suspicion is to understand thermodynamics better.
But thermodynamics and statistical mechanics have never been given a general covariant treatment, so as to be compatible with GR!
So a general covariant thermo and stat mech is kind of a first order of business.

Also one wants both theoretical confirmation and observational. And here thermodynamics also plays a role, because our most promising observations of phenomena involving quantum geometric effects are those of the ANCIENT LIGHT background, which carries enormously magnified shapes of the early universe in it. In other words, what does your theory say about the very start of expansion (confusingly called the "big bang singularity")? And what about the considerable variety of unexplained explosions that dot the sky? Could some of those ("gamma-ray bursts" etc.) be black holes blowing up due to quantum effects?
That calls for a better understanding of thermodynamics too, because one has to understand the TIME DILATION at extreme gravitational depth where changes happen very slowly. If gravitational collapse results in a quantum rebound explosion it would be extremely time delayed by the depth of collapse. But what about the Tolman effect of the extreme temperature? And what happens to the information? So to get observational confirmation for the theory again involves understanding general covariant thermodynamics. It is the second order of business as well as the first.
 