Prospects of the canonical formalism in loop quantum gravity [Alexandrov, Roche]

In summary, Alexandrov and Roche present a critical overview of loop and spin foam approaches to quantization of four-dimensional general relativity. They discuss issues such as quantization ambiguities, secondary second class constraints, and the lack of full diffeomorphism invariance in canonical LQG. They also explore alternative quantization approaches that aim to overcome these issues while preserving the full Lorentz gauge symmetry. However, these approaches still face challenges in terms of maintaining covariance and addressing second class constraints. Overall, the authors highlight the need for further research and development in order to fully understand and address the shortcomings of these theories.
  • #1
tom.stoer
Science Advisor
I would like to continue discussing canonical LQG based on chapter 2 from

http://arxiv.org/abs/1009.4475
Critical Overview of Loops and Foams
Authors: Sergei Alexandrov, Philippe Roche
(Submitted on 22 Sep 2010)
Abstract: This is a review of the present status of loop and spin foam approaches to quantization of four-dimensional general relativity. It aims at raising various issues which seem to challenge some of the methods and the results often taken for granted in these domains. A particular emphasis is given to the issue of diffeomorphism and local Lorentz symmetries at the quantum level and to the discussion of new spin foam models. We also describe modifications of these two approaches which may overcome their problems and speculate on other promising research directions.

As a summary, Alexandrov and Roche present (and discuss in detail) indications that canonical LQG in its present form suffers from quantization ambiguities (ordering ambiguities, unitarily inequivalent quantizations, the Immirzi parameter), insufficient treatment of secondary second class constraints leading to anomalies, missing off-shell closure of the constraint algebra, a missing consistent definition of the Hamiltonian constraint, and possibly a lack of full 4-dim diffeomorphism invariance. They question some of the major achievements like black hole entropy, the discrete area spectrum, and the uniquely defined kinematical Hilbert space (spin networks).

In chapter 3 Alexandrov and Roche discuss spin foam models which may suffer from related issues showing up in different form but being traced back to a common origin (secondary second class constraints, missing Dirac’s quantization scheme, …). I would like to discuss this chapter 3 in a separate thread. For the rest of this post I will try to present the most important sections from chapter 2, i.e. canonical LQG.

Although we have raised the points already known by the experts in the field, they are rarely spelled out explicitly. At the same time, their understanding is crucial for the viability of these theories. ...

Thus, we will pay particular attention to the imposition of constraints in both loop and SF quantizations. Since LQG is supposed to be a canonical quantization of general relativity, in principle it should be straightforward to verify the constraint algebra at the quantum level. However, in practice, due to peculiarities of the loop quantization, this cannot be achieved at the present state of knowledge. Therefore, we use indirect results to make conclusions about this issue. ...

Although LQG can perfectly incorporate the full local Lorentz symmetry, we find some evidence that LQG might have problems maintaining space-time diffeomorphism symmetry at the quantum level. Thus, we argue that it is an anomalous quantization of general relativity which is not physically acceptable. ...

Since the action (2.1) possesses several gauge symmetries, 4 diffeomorphism symmetries and 6 local Lorentz invariances in the tangent space, there will be 10 corresponding first class constraints in the canonical formulation. Unfortunately, as will become clear in section 2.2, there are additional constraints of second class which cannot be solved explicitly in a Lorentz covariant way. To avoid these complications, one usually follows an alternative strategy. It involves the following three steps:
... the boost part of the local Lorentz gauge symmetry is fixed from the very beginning by
choosing the so called time gauge, which imposes a certain condition on the tetrad field
...the three first class constraints generating the boosts are solved explicitly w.r.t. space components of the spin-connection; ...
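(For orientation: the action referred to as (2.1) is the generalized Hilbert-Palatini action with the Holst term; schematically, and up to overall normalization conventions, it reads

\[
S[e,\omega] \;=\; \frac{1}{2\kappa}\int \epsilon_{IJKL}\, e^I\wedge e^J\wedge F^{KL}(\omega)\;+\;\frac{1}{\kappa\gamma}\int e_I\wedge e_J\wedge F^{IJ}(\omega),
\]

where e^I is the tetrad, F^{IJ}(ω) the curvature of the Lorentz connection, and γ the Immirzi parameter.)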


An important observation is that the spectrum (2.14) is proportional to the Immirzi parameter. This proportionality arises due to the difference between ... having canonical commutation relations with the connection. It signifies that this parameter, which did not play any role in classical physics, becomes a new fundamental physical constant in quantum theory. ...

A usual explanation is that it is similar to the theta-angle in QCD [35]. However, in contrast to the situation in QCD, the formalism of LQG does not even exist for the most natural value corresponding to the usual Hilbert–Palatini action. Moreover, the Immirzi parameter enters the spectra of geometric operators in LQG as an overall scale, which is a quite strange effect. Even stranger is that the canonical transformation (2.6) turns out to be implemented non-unitarily, so that the area operator is sensitive to the choice of canonical variables. To our knowledge, there is no example of such a phenomenon in quantum mechanics. ...

Below we will argue that the dependence on the Immirzi parameter is due to a quantum anomaly in the diffeomorphism symmetry, which in turn is related to a particular choice of the connection used to define quantum holonomy operators (2.7). ...
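(For reference: the spectrum (2.14) referred to above is of the standard LQG form; for a surface punctured by spin-network edges carrying spins j_i it reads

\[
A \;=\; 8\pi\gamma\,\ell_P^2 \sum_i \sqrt{j_i(j_i+1)},
\]

so the Immirzi parameter γ sets the overall scale of the spectrum, exactly as the quoted passage says.)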

Alexandrov and Roche do not address all weak points related to Thiemann's construction of the Hamiltonian but refer to
[14] H. Nicolai, K. Peeters, and M. Zamaklar, “Loop quantum gravity: An outside view,” Class. Quant. Grav. 22 (2005) R193, arXiv:hep-th/0501114.
[15] T. Thiemann, “Loop quantum gravity: An inside view,” Lect. Notes Phys. 721 (2007) 185–263, arXiv:hep-th/0608210.

In the following, Alexandrov and Roche present alternative quantization approaches developed by Alexandrov et al.; this does not really solve these issues but it clarifies the main problems, i.e. quantization ambiguities, inequivalent quantization schemes, and secondary second class constraints missed by using the Poisson structure instead of the Dirac brackets.

The reduction of the gauge group originates from the first two steps in the procedure leading to the AB canonical formulation on page 6. Therefore, it is natural to construct a canonical formulation and to quantize it avoiding any partial gauge fixing and keeping all constraints generating Lorentz transformations in the game. The third step in that list (solution of the second class constraints) can still be done and the corresponding canonical formulation can be found in [51]. However, this necessarily breaks the Lorentz covariance. On the other hand, it is natural to keep it since the covariance usually facilitates analysis both at classical and quantum level. Thus, we are interested in a canonical formulation of general relativity with the Immirzi parameter, which preserves the full Lorentz gauge symmetry and treats it in a covariant way. ...

The presence of the second class constraints is the main complication of the covariant canonical formulation. They change the symplectic structure on the phase space which must be determined by the corresponding Dirac bracket. ...
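(For readers who haven't seen it: given second class constraints Φ_α with invertible C_{αβ} = {Φ_α, Φ_β}, the Dirac bracket is

\[
\{f,g\}_D \;=\; \{f,g\} \;-\; \{f,\Phi_\alpha\}\,(C^{-1})^{\alpha\beta}\,\{\Phi_\beta,g\},
\]

so quantizing with the plain Poisson bracket instead of {,}_D means quantizing the wrong symplectic structure.)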

They identify a two-parameter family of inequivalent (!) connections; their main results are
...First of all, the connection lives now in the Lorentz Lie algebra so that its holonomy operators belong to a non-compact group. This is a striking distinction from LQG where the compactness of the structure group SU(2) is crucial for the discreteness of geometric operators and the validity of the whole construction.
...The symplectic structure is no longer provided by the canonical commutation relations of the type (2.3) but is given by the Dirac brackets.
...In addition to the first class constraints generating gauge symmetries, the phase space to be quantized carries second class constraints. Although they are already taken into account in the symplectic structure by means of the Dirac brackets, they lead to a degeneracy in the Hilbert space constructed ignoring their presence ...


(Remarkably some of these issues identified by Alexandrov and Roche in the generalized canonical approach will show up in the "new SF" models which have become popular over the last years)

Thus, we see that the naive generalization of the SU(2) spin networks to their Lorentz analogues is not the correct way to proceed. A more elaborated structure is required. The origin of this novelty can be traced back to the presence of the second class constraints which modified the symplectic structure and invoked a connection different from the usual spin connection. ...

The spectrum (2.40) depends explicitly on the parameters a, b entering the definition of the connection. This implies that the quantizations based on different connections of the two-parameter family are all inequivalent. ...

Finally, we notice that the projected spin networks are obtained by quantizing the phase space of the covariant canonical formulation ignoring the second class constraints. Therefore they form what can be called an enlarged Hilbert space and, as we mentioned above, this space contains many states which are physically indistinguishable. To remove this degeneracy one has to somehow implement the second class constraints at the level of the Hilbert space. ...

(The authors claim that this is what is missed in all SF models)

Alexandrov and Roche present two special choices for the connection (on physical grounds) in order to demonstrate how one could proceed:
1) LQG in a covariant form from which the standard time-gauge LQG framework can be recovered
2) CLQG which leads to physically different results!

Although the commutativity of the connection is a nice property, there is another possibility of imposing an additional condition to resolve the quantization ambiguity, which has a clear physical origin. Notice that the Lorentz transformations and spatial diffeomorphisms, which appear in the list of conditions (2.34), do not exhaust all gauge transformations. What is missing is the requirement of correct transformations under time diffeomorphisms generated by the full Hamiltonian. Only the quantity transforming as the spin-connection under all local symmetries of the theory can be considered as a true spacetime connection. ...

In particular, it involves a Casimir of the Lorentz group and hence this spectrum is continuous. But the most striking and wonderful result is that the spectrum does not depend on the Immirzi parameter! Moreover, one can show that this parameter drops out completely from the symplectic structure ...

Thus, it [the Immirzi parameter] remains unphysical as it was in the classical theory, at least at this kinematical level. ...

From our point of view, this is a very important fact which indicates that LQG may have troubles with the diffeomorphism invariance at the quantum level. ...

But as we saw in the previous subsection, there is an alternative choice of connection, suitable for the loop quantization, which respects all gauge symmetries. Besides, the latter approach, which we called CLQG, leads to results which seem to us much more natural. For example, since it predicts an area spectrum independent of the Immirzi parameter, there is nothing special to be explained and there is no need to introduce an additional fundamental constant. Moreover, the spectrum appears to be continuous, which is very natural given the non-compactness of the Lorentz group and results from 2+1 dimensions (see below). Although these last results should be taken with great care as they are purely kinematical and obtained ignoring the connection non-commutativity, in our opinion the comparison of the two possibilities to resolve the quantization ambiguity points in favor of the second choice. ...

The authors have serious doubts that the discrete area spectrum should be taken as a physical result:

In fact, there are two other more general issues which show that the LQG area spectrum is far from being engraved into marble. First, the area operator is a quantization of the classical area function and, as any quantization, is supplied with ordering ambiguities ...

Second, the computation of the area spectrum has been done only at the kinematical level. The problem is that the area operator is not a Dirac observable. It is only gauge invariant, whereas it is not invariant under spatial diffeomorphisms and does not commute with the Hamiltonian constraint. This fact raises questions and suspicions about the physical relevance of its spectrum and in particular about the meaning of its discreteness, even among experts in the field [77, 78]. ...

In the following Alexandrov and Roche show that there are two different interpretations of observables in quantum gravity, namely the Dirac scheme and the relational interpretation

The difference between the two interpretations and the importance of this issue has been clarified in [77, 81]. Namely, the authors of [77] proposed several examples of low dimensional quantum mechanical constrained systems where the spectrum of the physical observable associated to a partial observable is drastically changed. This is in contradiction with the expectation of LQG that the spectrum should not change. Then in [81] it was argued that one should not stick to the Dirac quantization scheme but to the relational scheme. Accepting this viewpoint allows one to keep the kinematical spectra unchanged. Thus, the choice of interpretation for physical observables directly affects predictions of quantum theory and clearly deserves precise scrutiny. ...

Whereas the relational viewpoint seems to be viable, the work [77] shows that if we adhere only to the first interpretation, which is the most commonly accepted one, then it is of utmost importance to study the spectrum of complete observables. Unfortunately, up to now there are no results on the computation of the spectrum of any complete Dirac observable in full LQG. ...

In our opinion, all these findings and the above mentioned issues clearly make the discreteness found in LQG untrustworthy and suggest that the CLQG spectrum (2.52) is a reasonable alternative. ...

I don't think that this means that LQG is wrong and CLQG is right. But it definitely means that even canonical LQG is far from being unique!

However, since the spectrum (2.52) is independent of the Immirzi parameter, the challenge now is to find such a counting which gives the exact coefficient 1/4, and not just the proportionality to the horizon area. In fact, the last point is the weakest place of the LQG derivation compared to all other derivations existing in the literature. ...

Besides, there are two other points which make the LQG derivation suspicious. First, it is not generalizable to any other dimension. If one draws direct analogy with the 4-dimensional case, one finds a picture which is meaningless in 3 dimensions and does not allow one to formulate any suitable boundary condition in higher dimensions. ...

(This aspect is discussed in a series of papers by Thiemann et al., but I have to admit that I did not check for results relevant in this context)

Last but not least Alexandrov and Roche focus on diffeomorphism invariance, non-separability of LQG Hilbert space and non-uniqueness of the Hamiltonian constraint.

In fact, there still remain some continuous moduli depending on the relative angles of edges meeting at a vertex of sufficiently high valence [96]. Due to this the Hilbert space H_{GDiff} is not separable, and if one does not want the physics of quantum gravity to be affected by these moduli, one is led to modify this picture. To remove this moduli dependence, one can extend Diff(M) to a subgroup of homeomorphisms of M consisting of homeomorphisms which are smooth except at a finite number of points [97] (the so called “generalized diffeomorphisms”). If these points coincide with the vertices of the spin networks, the supposed invariance under this huge group will identify spin networks with different moduli and solve the problem. However, this procedure has several drawbacks. First, the generalized diffeomorphisms are not symmetries of classical general relativity. Moreover, they transform covariantly the volume operator of Rovelli–Smolin but not the one of Ashtekar–Lewandowski, which is favored by the triad test [39]. This analysis indicates that these generalized diffeomorphisms should not be implemented as symmetries at the quantum level and, as a result, we remain with the unsolved problem of continuous moduli. ...

In the Dirac formalism the constraints H_i only generate diffeomorphisms which are connected to the identity. Therefore, there is a priori no need for defining H_{GDiff} to be invariant under large diffeomorphisms. On the other hand, in LQG these transformations, forming the mapping class group, are supposed to act trivially. This is justified in [2] (section I.3.3.2) to be the most practical option given that the mapping class group is huge and not very well understood. ...

Thus, the simplest option taken by LQG might be an oversimplification missing important features of the right quantization. ...

That means that LQG may simply miss the full diffeomorphism symmetry; especially the structures related to large diffeomorphisms may be lost. ... Moreover, a huge arbitrariness is hidden in the step suggesting to replace the classical Poisson brackets, as for example (2.22), by quantum commutators. In general, this is true only up to corrections in ℏ, and on general grounds one could expect that the Hamiltonian constructed by Thiemann may be modified by such corrections. This is a bit of a disappointing situation for a would-be fundamental quantum gravity theory. In principle, all this arbitrariness should be fixed by the requirement that the quantum constraints reproduce the closed Dirac constraint algebra. However, the commutators of quantum constraint operators are not under control ...

Here's the summary of chapter 2

Trying to incorporate the full Lorentz gauge symmetry into the standard LQG framework based on the SU(2) group, we discovered that LQG is only one possible quantization of a two-parameter family of inequivalent quantizations. All these quantizations differ by the choice of connection to be used in the definition of holonomy operators — the basic building blocks of the loop approach. LQG is indeed distinguished by the fact that the corresponding connection is commutative. Nevertheless, a more physically/geometrically motivated requirement selects another connection, which gives rise to the quantization called CLQG. Although the latter quantization has not been properly formulated yet, it predicts an area spectrum which is continuous and independent of the Immirzi parameter, whereas LQG gives a discrete spectrum dependent on the Immirzi parameter. We argued that these facts lead one to suspect that LQG might be an anomalous quantization of general relativity: in our opinion they indicate that it does not respect the 4d diffeomorphism algebra at the quantum level. If this conclusion turns out indeed to be true, LQG cannot be physically accepted. At the same time, CLQG is potentially free from these problems. But due to serious complications, it is far from being accomplished and therefore the status of the results obtained so far, such as the area spectrum, is not clear. We also pointed out that some of the main LQG results are incompatible either with other approaches to the same problem or with attempts to generalize them to other dimensions. We consider these facts as supporting the above conclusion that LQG is not, in its present state, a proper quantization of general relativity.

So much for today!
 
  • #3
Well these issues among many others were thoroughly discussed in our recent meeting, Alexandrov having given a nice review: http://ph-dep-th.web.cern.ch/ph-dep-th/content2/THInstitutes/2011/QG11/talks/Alexandrov_CERN11.pdf

Though at the end there seemed to be more confusion about the status of LQG (and other approaches such as asymptotic safety) than before. One point that bothered many was the issue of the strange quantization, i.e., having no Fock space, no anomalies, no harmonic oscillator. It seems that gravity theorists and particle physicists really talk and think quite differently...
 
Last edited by a moderator:
  • #4
suprised said:
Well these issues among many others were thoroughly discussed in our recent meeting, Alexandrov having given a nice review: http://ph-dep-th.web.cern.ch/ph-dep-th/content2/THInstitutes/2011/QG11/talks/Alexandrov_CERN11.pdf

Though at the end there seemed to be more confusion about the status of LQG (and other approaches such as asymptotic safety) than before. One point that bothered many was the issue of the strange quantization, i.e., having no Fock space, no anomalies, no harmonic oscillator. It seems that gravity theorists and particle physicists really talk and think quite differently...
Well, I remembered that we discussed Alexandrov's paper last year here in the BF; if I remember correctly it was your hint. Then you referred to the CERN workshops a few weeks ago, I found Alexandrov's slides and started to read his paper again (b.t.w.: he hasn't published anything since, strange).

marcus is continuously telling us that LQG (SF) is consistent and unique - forgetting / ignoring / ... LQG (canonical approach). I thought in order to make progress we should discuss papers and formulas instead of claims and guesses.

Instead of summarizing Thiemann's huge body of work (he is just 10 kilometers away from me and I really should meet him from time to time; I know his predecessor in Erlangen - my advisor about 20 years ago), I wanted to start with a small but rather exhaustive list of issues. Instead of referring to Nicolai's "outside view", which could be criticized as being outdated, I started with Alexandrov's paper, which seems to be still relevant (at least for the canonical part). It is great that he is working in LQG but - at the same time - constantly checking the conceptual weak points.

Remember my rather long thread regarding string theory and being disappointed? LQG must be careful not to make the same mistake: claiming to have solved the black hole and big bang puzzles and advertising one unique theory of QG, whereas one can see that currently there is either none - or many :-)

Regarding the strange quantization: afaik it's the only theory with diff. inv. that has to be quantized, and that makes a big difference. I have a particle physics background (non-perturbative QCD, canonical quantization) and I have to say that it does not look so strange! It is strange when looking at it from the 'high energy perspective'. But being educated in non-perturbative QCD and looking at strings this seems to be strange as well - so that shouldn't bother you ;-)

There is something like Fock space (or let's say Hilbert space) in LQG, and hopefully it will really turn out to be separable. But b/c all separable Hilbert spaces are isomorphic you could then translate LQG to a standard harmonic oscillator basis - it will only blow up the operators :-)

Seriously - the anomalies might very well be there, but if you hide them off-shell and w/o being able to make 'off-shell calculations' you will not find them; the harmonic oscillator is no serious issue; think about a particle moving on a circle S¹; there is no harmonic oscillator there either, simply b/c the operator x does not exist; it has to be replaced by something like U(x) = exp(2 pi i x/L), from which you can construct nice interactions but NOT x². This is something that always comes to my mind: LQG with SU(2) and Immirzi parameter seems to be similar to this QM system - which admits an uncountable set of inequivalent canonical quantizations!
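(A minimal sketch of that S¹ example, from standard quantum mechanics: the quantizations are labeled by an angle θ, and no unitary map connects different θ-sectors. One represents

\[
\hat U = e^{2\pi i \hat x/L},\qquad \hat p\,\psi_n = \frac{2\pi\hbar}{L}\,(n+\theta)\,\psi_n,\qquad n\in\mathbb{Z},\ \ \theta\in[0,1),
\]

and the algebra generated by U and p admits one inequivalent representation for each θ, in close analogy to how the Immirzi parameter labels inequivalent LQG quantizations.)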

I don't think that it is a problem to have a new formalism (you are familiar with strings, you shouldn't be concerned); but once you have this new formalism - and there are very good reasons for it - you must take it seriously!

Let's see what others have to say ...
 
Last edited by a moderator:
  • #5
Wonderful summary, btw. I await eagerly the promised thread on chapter 3. My feeling is that a canonical quantisation is, in some senses, too hard; i.e. too many possibilities for completing the constraint algebra, operator ordering, smearing, etc. But if the claim is that these problems don't go away in spin-foams then that's quite interesting...
 
  • #6
tom.stoer said:
marcus is continuously telling us that LQG (SF) is consistent and unique - forgetting / ignoring / ... LQG (canonical approach). I thought in order to make progress we should discuss papers and formulas instead of claims and guesses.

In my opinion, the hardest push to reconcile the new spin foams and canonical LQG comes from Lewandowski and colleagues' work. http://arxiv.org/abs/0909.0939

The other line, but indirect, is Dittrich and colleagues' arguments about the link between Asymptotic Safety and the Hamiltonian constraint: http://arxiv.org/abs/0907.4323, http://arxiv.org/abs/0912.1817.

But what is the analogue of Asymptotic Safety in spin foams? The closest thing to a "UV fixed point", at least in spirit, seems to be Rovelli and Smerlak's http://arxiv.org/abs/1010.5437.

A recent Lewandowski paper that again deals, indirectly, with the spin foam/canonical link is http://arxiv.org/abs/1107.5185, which at the end hints cryptically that Rovelli and Smerlak's proposal is germane to the issue.
 
Last edited by a moderator:
  • #7
It's unfair to say that I tell you LQG is unique or that I forget the work done on the canonical approach. I often mention various parts of the LQG program. There is one definite formulation which has become the predominant one, in the past two years, and is followed in the majority of research papers. This is as far as I can tell, I haven't made a careful count :biggrin:.

It is a definite formulation because it is clearly laid out in about a page and at the end of the page one can say "That is the theory." One can calculate and, I think, make some testable predictions. Is there another version of LQG which has reached such a definitive formulation? Does the author say concisely what it is---period?

There are various approaches currently pursued in the general LQG program, mostly a bit vague. I don't know what formulation Alexandrov is talking about, perhaps an old one since he refers to some circa 2005 papers. People tend to choose what they want LQG to be. I am taking a pragmatic majoritarian view---most of the new research papers use the 1102.3660 formulation. Alexandrov's paper is over a year old and backwards-looking. Unfortunately it is not obvious that his paper is relevant.
 
  • #8
marcus said:
It is a definite formulation because it is clearly laid out in about a page and at the end of the page one can say "That is the theory."
Alexandrov proves that there is a two-parameter family of inequivalent theories; one of them is equivalent to the standard LQG, another one (CLQG) differs in the kinematical sector (the area spectrum is not a Dirac invariant, therefore this need not be a serious problem). Then Alexandrov shows that "standard" LQG cannot be formulated in d ≠ 4 dimensions, and he shows that especially in d=3 it is not even defined (the quantization is special to d=4 b/c one uses the (complexified) isomorphism so(3,1) ~ su(2) ⊕ su(2)). Therefore it's reasonable to believe in a family of theories! In addition he shows that "standard" LQG fails to be in line with the Dirac constraint quantization procedure and that it quantizes the wrong symplectic structure. He shows that the implementation of the constraints is unreasonable and that this may result in anomalies in the diffeomorphism group at the quantum level. He demonstrates that standard LQG misses certain structures of the diffeomorphism group, especially the mapping class group.

All this could be a problem to canonical quantization only, but Alexandrov shows that the new SF models (as of spring 2010) suffer from the same problems b/c again the treatment of the secondary second class constraints is similar to the canonical approach.

The problem is that neither is the derivation of the SFs in line with well-known QM formalisms, nor do the results prove that GR is recovered in the IR. So I agree that there is one theory, but it is still unclear whether it's consistent and unique. It could very well be that there is no theory at all, or that there is an uncountable family of theories.

(this will be another thread which I'll start in about a week)

marcus said:
One can calculate and, I think, make some testable predictions.
Which ones?

marcus said:
People tend to choose what they want LQG to be.
That's certainly not true for Alexandrov and Thiemann.

marcus said:
Alexandrov's paper is over a year old and backwards-looking. Unfortunately it is not obvious that his paper is relevant.
Half of the paper is about the new models. And if you read his (and my) arguments carefully you will recognize that his criticism is deeper than just 'use a new vertex'.

A final word: marcus, I think it would be fair to discuss Alexandrov's arguments in detail instead of constantly repeating that they may be irrelevant. Where are the new developments overcoming or circumventing the problems (in the canonical approach) Alexandrov is mentioning?
 
Last edited:
  • #9
tom.stoer said:
Alexandrov proves that there is a two-parameter family of inequivalent theories; one of them is equivalent to the standard LQG, another one (CLQG) differs in the kinematical sector...

This is where I came in, on Alexandrov, in 2003. He was proposing CLQG ("covariant LQG") as an alternative. It didn't catch on. He is still bringing it up, whenever he makes a critique. Go figure :wink:
 
  • #10
tom.stoer said:
"standard" LQG fails to be in-line with the Dirac constraint quantization procedure and that it quantizes the wrong symplectic structure...

The underlying classical symplectic structure is one of the main things that the FGZ paper (Freidel, Geiller, Ziprick) is about. They scrutinize it very carefully and prove an equivalence. I respect Freidel as a mathematician, especially after reading this recent work. I don't know what the CERN organizers were thinking when they got that lineup of QG people and did not bring either Freidel or Rovelli to present the actual state of LQG and what is going on.
 
  • #11
marcus said:
This is where I came in, on Alexandrov, in 2003. He was proposing CLQG ("covariant LQG") as an alternative. It didn't catch on. He is still bringing it up, whenever he makes a critique. Go figure :wink:

Yeah, some people are so out of touch with the latest in spin foams, even claiming things like "... restoring manifest Lorentz covariance in canonical quantum gravity. The tools which make this link possible are the “projected” spin networks introduced by Livine [9] and developed by Alexandrov and Livine [10–12]."
 
  • #12
atyy said:
Yeah, some people are so out of touch with the latest in spin foams, even claiming things like "... restoring manifest Lorentz covariance in canonical quantum gravity. The tools which make this link possible are the “projected” spin networks introduced by Livine [9] and developed by Alexandrov and Livine [10–12]."

That sounds like Rovelli giving credit to Livine's projected spin networks idea. I remember seeing what I think was the original Livine paper with that. As I recall it was around the same time as his PhD thesis appeared, which I read some of even though in French. I subscribe to the view expressed in your quote: The work (which is not the same thing as CLQG) is well worth the praise given it.

Pointless irony.

==============================
EDIT: I found the article where R. said that, I knew it sounded familiar. Livine came up with the projected SN idea in 2002 apparently. It is different from CLQG, a fixed idea of Alexandrov's, with which it became temporarily entangled for a few years. Here is your quote from 1012.1739 page 1, with context:
The tools which make this link possible are the “projected” spin networks introduced by Livine [9] and developed by Alexandrov and Livine [10–12]. In particular, Alexandrov has extensively developed a manifestly Lorentz-covariant spin networks formalism [12–14]. Here we focus on aspects and results of this framework that are of direct value for LQG, disentangling them from Alexandrov’s attempts to find alternative models. In [15], Dupuis and Livine study a map f that sends a SU(2) spin networks into (a certain class of) projected spin networks...​

Reply to next post: No, not akin at all. Quite separate ideas involved for limited time in the same research.
"Same breath" does not signify kinship in this case.
 
Last edited:
  • #13
marcus said:
That sounds like Rovelli giving credit to Livine's projected spin networks idea. I remember seeing what I think was the original Livine paper with that. As I recall it was around the same time as his PhD thesis appeared, which I read some of even though in French. I subscribe to the view expressed in your quote: The work (which is not the same thing as CLQG) is well worth the praise given it.

Pointless irony.

True, not the same, but akin enough that Rovelli cites it in the same breath.
 
  • #14
tom.stoer said:
Perhaps I should add that, as expected, Thiemann has something to say about these issues, but that it's work in progress

http://arxiv.org/abs/1010.5444
http://arxiv.org/abs/1105.3704

atyy said:
In my opinion, the hardest push to reconcile the new spin foams and canonical LQG comes from Lewandowski and colleagues' work. http://arxiv.org/abs/0909.0939

Hmmm, Alesci, Thiemann & Zipfel have been following up Lewandowski and colleagues' efforts to link spin foams and canonical LQG, with rather good results: http://arxiv.org/abs/1109.1290

At the end, they indicate a future direction "The general case γ≠1 and the Lorentzian signature model: in this case there are indications (work in progress) that seem to suggest the use of projected spin-networks [38]." [38] is Alexandrov and Livine's CLQG paper. Probably like Rovelli, they will use only the projected spin-networks component of that work, rather than the full CLQG proposal.

If that works out, then the Rovelli and Smerlak proposal becomes paramount. I think of it as the "UV fixed point" of spin foams. Apparently there is some difference. In AS, the fixed point is required for consistency, and the critical manifold must be low dimensional for "uniqueness". But http://arxiv.org/abs/1107.2310 says "the continuum limit does not require the system to go to a critical point, as is the case for normal systems. The continuum limit is simply given by taking the number of discretization points to infinity, without tuning any parameter."
 
Last edited by a moderator:
  • #15
atyy, marcus, it's not about promoting CLQG, forget about it for a moment!

Alexandrov demonstrates that there is a two-parameter family of connections; quantizing these connections using the Dirac structure (not the Poisson structure omitting the second class constraints, which is wrong) leads to inequivalent formulations. Alexandrov shows that there is a classical canonical transformation which cannot be mapped to a unitary operator.

That means that there is a two-parameter (ℝ²) family of different 'LQG' quantizations!

Standard LQG is one of them, CLQG is another one. I do not want to follow Alexandrov who favours CLQG, that's not my point.

My point is that there is no unique quantum theory!

Are you familiar with what I mean? Do you know what it means to have unitarily inequivalent quantizations?

Then Alexandrov claims that quantizing the Poisson instead of the Dirac structure is wrong as it fails at the secondary second class constraints, but that standard LQG is doing exactly that.

Are you familiar with Dirac's constraint quantization program?

Then Alexandrov claims that the implementation of the constraints seems to be inconsistent.

1) fix the time gauge and restrict SO(3,1) to SU(2) classically
2) fix the Gauss constraint by introducing loops
3) resolve the diffeomorphism constraint - except for the following issues

a) there remain some continuous moduli which are fixed by "generalized diffeomorphisms"
b) using the wrong structure on phase space (Poisson instead of Dirac, see above)

Generalized diffeomorphisms are not a symmetry of GR.

I do not say that CLQG is right; the only thing I do is to take Alexandrov's weak points (which are the weak points of LQG) seriously.
 
  • #16
Let me stress one central problem when quantizing a theory with 'unphysical' symmetries like gauge inv. or diff. inv.: you have to make sure that the symmetry survives quantization!

The problem is that in canonical quantization using a physical (i.e. reduced) Hilbert space the symmetry is no longer there; it has been fixed, it's gone. In the physical sector it's the identity operator! That is not the case for global, physical symmetries like rotational invariance.

So after having fixed the symmetry you are no longer able to check explicitly if it's there! You no longer have an operator which generates the symmetry; therefore you cannot check whether this operator is anomalous (many people are not familiar with this concept b/c in QFT one often does NOT use a physical state space but explicitly keeps the unphysical d.o.f., e.g. Faddeev-Popov or BRST, where there is still an enlarged space in which the symmetry acts non-trivially and where you can check whether it is satisfied; Slavnov-Taylor identities).
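(To make the BRST remark concrete, a standard sketch: on the enlarged state space the gauge symmetry survives as a nilpotent charge,

\[
\hat Q^2 = 0,\qquad \hat Q\,|\mathrm{phys}\rangle = 0,\qquad \mathcal{H}_{\rm phys} = \ker\hat Q \,/\, \operatorname{im}\hat Q,
\]

and a gauge anomaly shows up as a failure of nilpotency, \hat Q^2 \neq 0, at the quantum level; this is exactly the kind of test you can only run because the enlarged space is still around.)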

So after having fixed the symmetry several scenarios are possible: you have a physical Hilbert space w/o this symmetry; for example GR (LQG) could reduce to the Hilbert space of the harmonic oscillator which has no diff.-inv. - but of course this is wrong! But how do we know? We can't! B/c we cannot check for the symmetry itself we must check for anomalies; but where? What is the operator that we can check? We don't know!

Finding a gauge anomaly in a physical Hilbert space where the classical symmetry has been fixed in an anomalous (= wrong) way is non-trivial b/c you don't know where to look!

That's the reason why one has to take care about the quantization procedure itself and why it is not sufficient to look at the results. You don't know what the correct Hilbert space and the dynamical operators will be, so you can't decide by looking at the results if something went wrong. Quantization is by no means unique, so there is an uncountable family of different quantum theories with the same semiclassical limit. You can't decide based on this limit whether the quantum theory itself is correct.

If you find that LQG reduces to the harmonic oscillator you can be sure that something went wrong. But if you find that it reduces to a specific spin network SN with a specific intertwiner I instead of SN' and I' you cannot be sure!

That's the reason why Thiemann et al. are stressing the construction = quantization itself and why looking at a given theory (like Rovelli does) is not sufficient (that's the point where Rovelli is at least partially wrong).
 
  • #17
Is the anomaly discussed in http://arxiv.org/abs/gr-qc/9705017 different from the one you are talking about?

Also, I believe it is known that the new spin foams are not diff-invariant (or triangulation independent) unless Rovelli and Smerlak's proposal works.

BTW, in my post #14 I wasn't talking about CLQG at all. You don't think KKL and Thiemann, Alesci & Zipfel are relevant?
 
Last edited:
  • #18
Hmmm, Alexandrov states that the new models are not consistent quantum theories. That's very, very serious! Is he right?
 
  • #19
atyy said:
Hmmm, Alexandrov states that the new models are not consistent quantum theories. That's very, very serious! Is he right?
I don't know yet.

I am not sure if and how anomalies will show up in the SF approach. As I explained in my last post it could very well be that there is no systematic way to construct the anomalous operator in the canonical formalism b/c the operator that becomes anomalous (e.g. the generator for gauge or diff. transformations) becomes unphysical, i.e. no longer acts in the physical Hilbert space.

Are you familiar with Fujikawa's path-integral derivation of the chiral anomaly?

We know that a mass term for fermions explicitly breaks the chiral symmetry. When writing down a PI (w/o mass term!) one introduces a measure that unfortunately has the same chiral structure as the mass term. That means that after integrating out the fermions and exponentiating the determinant the effective action acquires an anomalous term generated by the measure. The reason one can see the anomaly is that the symmetry acts non-trivially on the d.o.f.
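(Schematically, for one Dirac fermion in an external gauge field and up to sign conventions: under a chiral rotation ψ → e^{iα(x)γ₅}ψ the measure picks up a Jacobian,

\[
\mathcal{D}\bar\psi\,\mathcal{D}\psi \;\to\; \mathcal{D}\bar\psi\,\mathcal{D}\psi\;\exp\!\Big(-i\!\int\! d^4x\,\alpha(x)\,\frac{e^2}{16\pi^2}\,\epsilon^{\mu\nu\rho\sigma}F_{\mu\nu}F_{\rho\sigma}\Big),
\]

which is equivalent to the anomalous divergence ∂_μ j₅^μ = (e²/16π²) ε^{μνρσ}F_{μν}F_{ρσ}.)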

I haven't seen any such attempt for SFs, neither to prove presence nor absence of an anomaly. I guess one does not know how to do such a calculation in these formalisms. In addition, b/c SFs start with a partially fixed symmetry, some sectors cannot be explored using this approach. That's why I think that SFs are a good calculational tool, but are bad for conceptual questions.

The vertex is something like the expectation value of the Hamiltonian. So guessing a vertex is like guessing a Hamiltonian. Have you ever tried to guess a Hamiltonian with the correct symmetries in QFT? Good luck! Have you ever tried to reconstruct a Hamiltonian from its expectation values?

The canonical formalism is vicious; it does not forgive anything. Using PIs you often can do something which is partially correct (perturbation theory in QCD neglecting large gauge transformations and Gribov ambiguities). In the canonical approach you simply cannot proceed; you are stuck. That's why I think that the canonical approach is a must. Failing with some construction in the canonical approach often means that there is a fundamental issue. Failing in the PI approach can often be cured by 'sweeping the issues under the carpet'.
 
  • #20
tom.stoer said:
I don't know yet.

I am not sure if and how anomalies will show up in the SF approach. As I explained in my last post it could very well be that there is no systematic way to construct the anomalous operator in the canonical formalism b/c the operator that becomes anomalous (e.g. the generator for gauge or diff. transformations) becomes unphysical, i.e. no longer acts in the physical Hilbert space.

Are you familiar with Fujikawa's path-integral derivation of the chiral anomaly?

We know that a mass term for fermions explicitly breaks the chiral symmetry. When writing down a PI (w/o mass term!) one introduces a measure that unfortunately has the same chiral structure as the mass term. That means that after integrating out the fermions and exponentiating the determinant the effective action acquires an anomalous term generated by the measure. The reason one can see the anomaly is that the symmetry acts non-trivially on the d.o.f.

I haven't seen any such attempt for SFs, neither to prove presence nor absence of an anomaly. I guess one does not know how to do such a calculation in these formalisms. In addition, b/c SFs start with a partially fixed symmetry, some sectors cannot be explored using this approach. That's why I think that SFs are a good calculational tool, but are bad for conceptual questions.

The vertex is something like the expectation value of the Hamiltonian. So guessing a vertex is like guessing a Hamiltonian. Have you ever tried to guess a Hamiltonian with the correct symmetries in QFT? Good luck! Have you ever tried to reconstruct a Hamiltonian from its expectation values?

The canonical formalism is vicious; it does not forgive anything. Using PIs you often can do something which is partially correct (perturbation theory in QCD neglecting large gauge transformations and Gribov ambiguities). In the canonical approach you simply cannot proceed; you are stuck. That's why I think that the canonical approach is a must. Failing with some construction in the canonical approach often means that there is a fundamental issue. Failing in the PI approach can often be cured by 'sweeping the issues under the carpet'.

I'm not familiar with the details. Roughly, the concept seems to be that the partition function is not invariant even though the integrand is, because the measure isn't invariant.

But the lack of symmetry (that is present classically) doesn't necessarily imply inconsistency, does it?
 
  • #21
atyy said:
But the lack of symmetry (that is present classically) doesn't necessarily imply inconsistency, does it?
An anomaly is not a problem for global symmetries like the chiral symmetry, which has well-known physically relevant consequences (eta-prime mass). But a (local) gauge anomaly (an anomaly of a symmetry reducing unphysical d.o.f.) usually leads to a wrong or inconsistent theory. There are well-known examples where local anomalies lead to non-renormalizability, where the quantum theory has wrong degrees of freedom, etc.
 
  • #22
tom.stoer said:
An anomaly is not a problem for global symmetries like the chiral symmetry, which has well-known physically relevant consequences (eta-prime mass). But a (local) gauge anomaly (an anomaly of a symmetry reducing unphysical d.o.f.) usually leads to a wrong or inconsistent theory. There are well-known examples where local anomalies lead to non-renormalizability, where the quantum theory has wrong degrees of freedom, etc.

A gauge anomaly means perturbative non-renormalizability for a field theory on flat spacetime. But isn't the spirit of LQG non-perturbative - so why does it matter for consistency?

Anyway, I agree from the point of view of Asymptotic Safety that the gauge symmetry should be preserved. I think the authors of http://arxiv.org/abs/1010.5437 argue that if the fixed point exists, then one can have a discretization in which the symmetry is preserved.
 
Last edited by a moderator:
  • #23
atyy said:
A gauge anomaly means perturbative non-renormalizability for a field theory on flat spacetime.
A gauge anomaly is by no means restricted to perturbation theory; it is essentially non-perturbative; and its consequences are by no means restricted to non-renormalizability.

Think about a non-perturbative off-shell formalism. Then think about a classical symmetry generated by functions on phase space. An anomaly is nothing else but the fact that the algebra of quantum operators corresponding to the phase space functions does not carry the same symmetry structure as the phase space functions.
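(In symbols, a standard schematic way to write this: if the classical generators close as a first class algebra, the anomaly is an extra operator term in the corresponding commutators,

\[
\{G_a, G_b\} = f_{ab}{}^{c}\,G_c \qquad\longrightarrow\qquad [\hat G_a, \hat G_b] = i\hbar\, f_{ab}{}^{c}\,\hat G_c + \hbar^2\,\hat A_{ab},
\]

so a non-vanishing \hat A_{ab} is the anomaly; for diffeomorphisms the f's are actually structure functions, which makes the quantum check even harder.)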
 
  • #24
tom.stoer said:
A gauge anomaly is by no means restricted to perturbation theory; it is essentially non-perturbative; and its consequences are by no means restricted to non-renormalizability.

Think about a non-perturbative off-shell formalism. Then think about a classical symmetry generated by functions on phase space. An anomaly is nothing else but the fact that the algebra of quantum operators corresponding to the phase space functions does not carry the same symmetry structure as the phase space functions.

Yes, I don't mean that one can't have gauge anomalies non-perturbatively. What I mean is that a gauge anomaly implies inconsistency because it implies perturbative non-renormalizability.

What I'm asking is whether a gauge anomaly also implies inconsistency if one is not using perturbative renormalizability as a consistency criterion (as LQG tries not to)?
 
  • #25
In LQG the anomaly would be in the diffeomorphism sector. That means that quantizing GR according to the LQG approach could break diffeomorphism invariance, which could generate 'unphysical d.o.f.'. But b/c there is no operator generating diffeomorphisms after having completed the LQG quantization, there is no 'indicator' for such an anomaly. So you can't check what went wrong (you no longer have an off-shell formalism).

Think about quantizing a classical theory with SO(3) rotational invariance, i.e. a particle living on a sphere S². Now imagine that after having completed the quantization you have something like a harmonic oscillator on the plane R². That means that something went terribly wrong, the theory is different from the one you started with. But of course the new theory is 'unique'.
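(To make this example concrete, a sketch from elementary quantum mechanics: the free particle on S² has spectrum and degeneracies

\[
E_l = \frac{\hbar^2\,l(l+1)}{2mr^2},\ \ \dim = 2l+1, \qquad\text{vs.}\qquad E_n = \hbar\omega\,(n+1),\ \ \dim = n+1
\]

for the 2d harmonic oscillator; the mismatched degeneracy pattern is exactly the kind of fingerprint telling you the SO(3) symmetry was lost in the quantization.)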

Think about quantizing SU(3) Yang-Mills and renormalizing it using Pauli-Villars regularization, i.e. introducing a large gluon mass M. This breaks gauge invariance. The longitudinal gauge bosons do not decouple (in contrast to the abelian case) and therefore you create new, unphysical degrees of freedom. One effect is that these new degrees of freedom result in non-renormalizability, but this is only one effect. Perhaps you can show that this theory has a non-Gaussian fixed point, which means that the theory IS non-perturbatively renormalizable. But it's a new theory, not QCD!

Something like that could happen in LQG.
 
  • #26
tom.stoer said:
In LQG the anomaly would be in the diffeomorphism sector. That means that quantizing GR according to the LQG approach could break diffeomorphism invariance, which could generate 'unphysical d.o.f.'. But b/c there is no operator generating diffeomorphisms after having completed the LQG quantization, there is no 'indicator' for such an anomaly. So you can't check what went wrong (you no longer have an off-shell formalism).

Think about quantizing a classical theory with SO(3) rotational invariance, i.e. a particle living on a sphere S². Now imagine that after having completed the quantization you have something like a harmonic oscillator on the plane R². That means that something went terribly wrong, the theory is different from the one you started with. But of course the new theory is 'unique'.

Think about quantizing SU(3) Yang-Mills and renormalizing it using Pauli-Villars regularization, i.e. introducing a large gluon mass M. This breaks gauge invariance. The longitudinal gauge bosons do not decouple (in contrast to the abelian case) and therefore you create new, unphysical degrees of freedom. One effect is that these new degrees of freedom result in non-renormalizability, but this is only one effect. Perhaps you can show that this theory has a non-Gaussian fixed point, which means that the theory IS non-perturbatively renormalizable. But it's a new theory, not QCD!

Something like that could happen in LQG.

OK, but at least it wouldn't be inconsistent. My own view is that it's not so important for LQG to recover the right classical limit. It's more important that it produce a consistent quantum theory. Then maybe it could contain gravity plus other stuff, like AdS/CFT in which "non-gravitational" theories are gravitational theories.

So when Alexandrov says it's inconsistent, I guess he's assuming that LQG is quantizing gravity alone. But I don't think he rules out that LQG is accidentally doing something different, but still consistent?
 
  • #27
An anomaly does not always mean that the quantum theory is inconsistent; it simply means that the quantum theory has a different symmetry structure than the classical theory, usually the symmetry is 'broken by the anomaly'.

There are even cases where this is welcome: the well-known chiral or U_A(1) anomaly explains why the eta-prime meson is not an (asymptotic) Goldstone boson in addition to the well-known N_F² - 1 mesons of SU(N_F), and it therefore explains why the eta-prime is so much heavier than the eta.

Constructing the anomaly using perturbation theory one observes that it is impossible to cure both the U(1) and the U_A(1) anomaly in a triangle diagram; b/c U(1) is related to conservation of electric charge and b/c it is required for the Ward identities, one chooses to keep the U(1) symmetry and to break the U_A(1) symmetry via the anomaly. Doing it that way one sees that the anomaly arises due to (e.g.) dimensional regularization and it could very well be that it is an artifact of this procedure.

Now nature (the eta-prime) tells us that the anomaly is welcome - but w/o such a hint we would be rather worried about our inability to conserve both classical symmetries. I think it is only due to the Atiyah-Singer index theorem that we are able to understand why there is such an anomaly and that it has a deep mathematical meaning.

What does that mean for GR/LQG: we know that the theory has an unphysical structure, namely coordinate systems, which are washed away by factoring out diffeomorphisms. There is no hint for any breaking or violation of diffeomorphism invariance in nature, neither classically, nor in quantum gravity (b/c we are not able to study this regime experimentally).

What Alexandrov says is that the way LQG is constructed may result in an anomaly, and that this anomaly is NOT due to some physical or mathematical principle (like the index theorem) but due to a dirty trick (like dim.-reg.). It could very well be that there are other tricks to arrive at (slightly) different spin networks with (slightly) different dynamics, which shows that there is at least some ambiguity in the construction.

Let's look at the diffeomorphisms and what bothers some people working in the field:
1) constructing the spin network based on embedded graphs one uses something like generalized diffeomorphisms which are not smooth at the vertices and which are therefore not present in GR!
2) looking at the diffeomorphisms generated by H (classically) one sees that instead of having a closed algebra one has an algebra which closes only on the subspace of spacelike diffeomorphisms; the problem is that one is not able to construct a unique H (quantum mechanically) and that one is not able to find a unitary representation of this strange algebra, which has structure functions instead of structure constants (see the brackets written out below)
3) b/c H is not known, the kernel H|phys> = 0 is not known either, and therefore one is not able to understand if and how the algebra of constraints closes in the diffeomorphism sector
4) there is no off-shell formalism which backs up the on-shell reduction.
5) LQG does not deal with the full diffeomorphism group but only with the sector which is connected to the identity; large diffeomorphisms / the mapping class group are not understood in this context.
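(For reference, the classical algebra meant in point 2 is the standard hypersurface deformation algebra of the diffeomorphism constraint D and the Hamiltonian constraint H, smeared with shift N⃗ and lapse N:

\[
\begin{aligned}
\{D(\vec N), D(\vec M)\} &= D([\vec N, \vec M]),\\
\{D(\vec N), H(M)\} &= H(\vec N[M]),\\
\{H(N), H(M)\} &= D\big(q^{ab}(N\partial_b M - M\partial_b N)\big),
\end{aligned}
\]

where the inverse spatial metric q^{ab} appears as a structure function in the last bracket, so this is not a Lie algebra in the ordinary sense.)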

So there are enough indications that diffeomorphisms in LQG are not properly understood.
 

1. What is loop quantum gravity?

Loop quantum gravity is a theory of quantum gravity that attempts to reconcile general relativity and quantum mechanics. It proposes that spacetime is made up of discrete, quantized units, rather than being continuous. It also suggests that gravity arises from the interaction between these units, known as "loops".

2. What is the canonical formalism in loop quantum gravity?

The canonical formalism is a mathematical framework used in loop quantum gravity to describe the dynamics of the quantum system. It involves using a set of canonical variables, such as the position and momentum of the loops, to write down the Hamiltonian, which describes how these variables evolve over time.

3. What are the prospects of the canonical formalism in loop quantum gravity?

The prospects of the canonical formalism in loop quantum gravity are still being explored and debated by scientists. Some believe that it has the potential to provide a complete and consistent theory of quantum gravity, while others argue that it may have limitations and further development is needed.

4. What are the key contributions of Alexandrov and Roche's work on the canonical formalism in loop quantum gravity?

Alexandrov and Roche have made significant contributions to the understanding of the canonical formalism in loop quantum gravity. They have proposed a new approach called the "covariant loop quantum gravity", which takes into account the covariance of general relativity. They have also provided new insights into the role of time in loop quantum gravity and the quantization of the gravitational field.

5. How does the canonical formalism in loop quantum gravity differ from other theories of quantum gravity?

The canonical formalism in loop quantum gravity differs from other theories of quantum gravity, such as string theory and causal dynamical triangulations, in its approach to quantizing spacetime. It also differs in its use of canonical variables and its focus on the dynamics of the quantum system. These differences make it a unique and promising approach to understanding the nature of space and time on a quantum level.
