EPRL quantization found to be completely inconsistent.

  • #1
MTd2
http://arxiv.org/abs/1004.2260

The new vertices and canonical quantization

Sergei Alexandrov
(Submitted on 13 Apr 2010)
We present two results on the recently proposed new spin foam models. First, we show how a (slightly modified) restriction on representations in the EPRL model leads to the appearance of the Ashtekar-Barbero connection, thus bringing this model even closer to LQG. As our second result, we however demonstrate that the quantization leading to the new models is completely inconsistent since it relies on the symplectic structure of the unconstrained BF theory.
 
  • #2
MTd2 said:
http://arxiv.org/abs/1004.2260

The new vertices and canonical quantization

Sergei Alexandrov
(Submitted on 13 Apr 2010)
We present two results on the recently proposed new spin foam models. First, we show how a (slightly modified) restriction on representations in the EPRL model leads to the appearance of the Ashtekar-Barbero connection, thus bringing this model even closer to LQG. As our second result, we however demonstrate that the quantization leading to the new models is completely inconsistent since it relies on the symplectic structure of the unconstrained BF theory.

page 12

"thus the new spin foam models (i.e., EPRL) have no chance to describe quantum gravity"


"the quest for the right model is to be continued"

Ouch. Guess it's back to the drawing board?

"the relation between LQG and SF states is not so direct"

Perhaps it's futile to try to connect LQG to SF; better to develop an SF model with the correct semiclassical limit without regard to LQG kinematical results.
 
  • #3
Rovelli has already answered Alexandrov implicitly in his recent paper; see page 1.

I doubt you will see any "return to the drawing board" prompted by the Alexandrov result at this point :biggrin:

(essentially because the new formulation by Bianchi is too interesting.)
 
  • #4
"... with the only exception of the FK model without the Immirzi parameter by reasons to be explained below."
 
  • #5
marcus,

unfortunately this is a different thread, but you are mentioning Rovelli's remarks on page 1 of his latest paper.

Which remarks? I cannot see anything addressing Alexandrov's issues.

Alexandrov is - as far as I can see - not cooperating with the majority of the LQG guys but working somewhat isolated. Is this true? He has published over the years a couple of papers, always addressing weaknesses of the mainstream approach. Now he is claiming again that he has identified a serious loophole in the standard SF quantization procedure, namely the wrong treatment of second class constraints.

I was working on constraint quantization myself, and my conclusion was always that second class constraints are a hint to identify other variables in terms of which the constraints become first class (so do not use the variables which generate second class constraints plus Dirac brackets; instead use different variables with standard Poisson brackets and canonical commutation relations). Is this something the LQG/SF community has already done but we (and Alexandrov) are not aware of? Or did they really miss something?

If Alexandrov is right this is not simply a "different quantization" or a "non-standard approach" or anything like that. If he is right the SF quantization is simply wrong!

Can it really be that nobody has recognized that the simplicity constraints - together with other constraints - are second class and must be treated differently? Really nobody? I can't believe it.

What is strange is that nobody seems to care about his findings. There are no answers from Rovelli, Bianchi, Lewandowski, Thiemann, Ashtekar, ...
 
  • #6
The second class constraints are treated differently in the EPRL model (via expectation values or Master Constraint type considerations); his complaint, IIRC, is that the symplectic structure is wrong.

Anyhow, spin foam "quantisation" is not a quantisation in any standard meaning of the word anyway.

Furthermore, when is a quantisation wrong? Or even inconsistent? What's the criterion? I'm really asking. I'm not an expert on quantisation techniques, but it seems to me that each technique comes with its own "natural" choices. The mathematical rigours of Stone-von Neumann don't apply, nor do the niceties of geometric quantisation; in fact we know there are infinitely many inequivalent quantisations of even the nicest of theories, so what makes this flat out wrong in your opinion?

And BTW Alexandrov speaks at conferences frequently and people do listen and reply. The problem is, he doesn't have a way to do it "better", so in the absence of alternatives we take what we can get.

Either way, to me the way to motivate spin foams is not from a quantisation procedure at all but from TQFTs, and these often don't have a (known) path integral formulation anyway.
 
  • #7
Regarding the quantization a la Dirac:

Suppose you have an action S and a derived Hamiltonian H which may contain unphysical (gauge) degrees of freedom plus "wrong coordinates". What happens?

First you have a set of constraints

C(1)_n ~ 0 (weakly zero).

One example is the (missing) conjugate momentum for A_0 in electromagnetism.
You have to require that this constraint is a conserved quantity (that the system stays on the constraint surface in phase space). That means you have to calculate

[H, C(1)_n]

and require ~ 0 again.

If this commutator is exactly zero everything is fine. If it is a linear combination of constraints, that means

[H, C(1)_n] = a_nk C(1)_k

it's fine, too. But sometimes you get something new: a new constraint for which ~ 0 is a non-trivial condition. An example is the Gauss law, which is generated by the above-mentioned constraint.

That means that you have generated a new constraint

[H, C(1)_n] = C(2)_n

You call this a secondary constraint.

You can continue until (hopefully) the procedure terminates with either = 0 or ~ 0 automatically. In electromagnetism the Gauss law does not generate a new constraint, so you have essentially two, namely

"canonical conjugate momentum for A_0" ~ 0
"Gauss law" ~ 0

These two constraints are "pure gauge"; they essentially kill two polarizations of the photon which gives 4-2 = 2 physical ones.
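
To make the algorithm concrete, here is a minimal sympy sketch of the constraint chain for a toy model of my own choosing (not electromagnetism): one degree of freedom x coupled to a Lagrange multiplier y via L = xdot^2/2 - y*x. The primary constraint p_y ~ 0 generates further constraints, and the loop repeats the conservation check until nothing new appears; the "linear combination of existing constraints" test is simplified to exact membership, which suffices here.

```python
import sympy as sp

# Toy model: L = xdot**2/2 - y*x, with y a Lagrange multiplier.
# Phase space (x, p_x, y, p_y); primary constraint: p_y ~ 0.
x, px, y, py = sp.symbols('x p_x y p_y')
H = px**2 / 2 + y * x  # canonical Hamiltonian

def pb(f, g):
    """Canonical Poisson bracket on the pairs (x, p_x) and (y, p_y)."""
    return (sp.diff(f, x) * sp.diff(g, px) - sp.diff(f, px) * sp.diff(g, x)
            + sp.diff(f, y) * sp.diff(g, py) - sp.diff(f, py) * sp.diff(g, y))

# Dirac's algorithm: demand conservation of each constraint in turn;
# every nonvanishing bracket with H that is not already a known
# constraint is a new (secondary, tertiary, ...) constraint.
chain = [py]
while True:
    new = sp.simplify(pb(chain[-1], H))
    if new == 0 or new in chain or -new in chain:
        break
    chain.append(new)

print(chain)  # [p_y, -x, -p_x, y] -- the chain terminates after four steps
```

Unlike in electromagnetism, this chain does not stop after one step: it keeps going until y itself is constrained, and the resulting pairs (x, p_x) and (y, p_y) turn out to be second class, the case discussed further below.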

Up to now we talked about so called first class constraints. That means they form a (weakly) closed algebra:

[C_m, C_n] = a_mnk C_k ~ 0

The Gauss law in non-abelian gauge theories is a good example. It is nothing else but a "local" su(N) algebra.

The basic rule is that for first-class constraints
- the algebra of constraints closes weakly
- they are implemented as constraints on the physical states C|phys> = 0
- these constraints are "gauge" and eliminate unphysical degrees of freedom
- the canonical commutation relations remain unchanged

Now we have to talk about second-class constraints. Here the problem is that you may have something like

[C_m, C_n] = a_mnk D_k

D is something new. It is not due to the symplectic structure and the time evolution, but mostly due to "wrong coordinates" on phase space. In many cases you must not implement them at all, not even on physical states, because D may be a c-number!

What has happened?

You have chosen the wrong variables to be quantized, or you have used the wrong commutation relations. The procedure a la Dirac is either to use different variables (which you may identify if you are lucky :-)) or to use so-called Dirac brackets, which is explained in Alexandrov's paper for his toy model. This allows you to use the same variables as before, but you have to change the commutation relations in such a way that they generate a (weakly) closed algebra again. There is a unique way to construct the Dirac brackets from the Poisson brackets, which was described by Dirac.
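
For reference, Dirac's construction is {f, g}_D = {f, g} - {f, C_m} (M^-1)_mn {C_n, g} with M_mn = {C_m, C_n}. Here is a minimal sympy sketch on a hypothetical two-particle phase space where the pair (q2, p2) is frozen by the second class constraints q2 ~ 0 and p2 ~ 0 (note {q2, p2} = 1 is a c-number, so these could never be imposed as C|phys> = 0):

```python
import sympy as sp

q1, p1, q2, p2 = sp.symbols('q1 p1 q2 p2')
qs, ps = [q1, q2], [p1, p2]

def pb(f, g):
    """Canonical Poisson bracket {f, g}."""
    return sum(sp.diff(f, q) * sp.diff(g, p) - sp.diff(f, p) * sp.diff(g, q)
               for q, p in zip(qs, ps))

# Second class pair: their mutual bracket is a c-number, so the
# constraint matrix M is invertible.
C = [q2, p2]
M = sp.Matrix(2, 2, lambda m, n: pb(C[m], C[n]))  # [[0, 1], [-1, 0]]
Minv = M.inv()

def dirac(f, g):
    """Dirac bracket: {f, g} - {f, C_m} (M^-1)_mn {C_n, g}."""
    corr = sum(pb(f, C[m]) * Minv[m, n] * pb(C[n], g)
               for m in range(2) for n in range(2))
    return sp.simplify(pb(f, g) - corr)

print(dirac(q1, p1))  # 1: the physical pair keeps its canonical bracket
print(dirac(q2, p2))  # 0: the second class pair is consistently eliminated
```

After passing to the Dirac bracket the constraints can be set to zero strongly, and only the physical pair (q1, p1) survives with canonical commutation relations.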

Lessons learned?

For first-class constraints there is some freedom in how to implement them: "quantize first, solve later" a la LQG (the old approach), or "solve first, quantize later" a la old-fashioned QED with an explicit solution of the Gauss constraint, or "work with unphysical degrees of freedom and let them be killed by ghosts" a la Faddeev-Popov / BRST in perturbative QCD.

For second-class constraints my feeling is that they essentially mean that something went wrong! You can cure that by using Dirac brackets, but I prefer to identify the correct degrees of freedom and use the Poisson brackets instead.

Now in LQG the problem is that all constraints are treated differently:
- Gauss law is solved as constraint on physical states
- Diff. inv. is implemented later
- for H refer to Nicolai's paper; I doubt that the whole story is well defined

Now it seems that SFs have a similar problem, namely that you get new constraints and (again) treat them differently. You introduce BF theory + simplicity constraints on top of the "old-fashioned" LQG constraints.

My conclusion is that - if Alexandrov is right - SFs do not fix the quantization problems we know from old-fashioned LQG, but avoid talking about them, postponing them ...

I agree with Rovelli that there is no quantization procedure known which is "axiomatic"; it is not possible to derive a quantum theory uniquely from a classical theory (because the quantum theory is "more" than the classical theory). And I agree with you that quantizing a TQFT is again something different because you reduce the degrees of freedom to a finite set. But "new LQG" is not a TQFT because it is not BF but BF + constraints. It is interesting that these additional constraints on top of BF "constrain" the theory in a way that "kills the topological structure" and results in GR with more degrees of freedom than you would naively expect from BF / TQFT. In a sense you do not constrain the degrees of freedom but the constraints :-)

But: once you choose to work with a constraint algebra and you end up with second class constraints, there is no way out; you have to respect what the maths gives you. There are no excuses. My problem is that I can hardly follow the new BF approach, and it may very well be that I overlook some basic facts that are rather clear to the insiders. What bothers me is that - as you explain - Alexandrov is an insider and nevertheless identifies such a basic issue.
 
  • #8
It is my understanding that the "new LQG" is not in any way derived from classical considerations. It is simply a quantum theory which is mathematically well-defined, and has dynamics which approximate GR in the semi-classical limit. It is obviously free from UV divergences because it is not a field theory (only countable degrees of freedom). I'd think that if they can couple matter to it and show the standard model in the weak gravitation limit, then we'd pretty much be done (in the sense of we have *a* theory, now let's go test with experiments).

Caveat: no phase transitions lurking.
 
  • #9
You are right; it's not strictly a derivation, but at least it takes the derivation and the results from "old LQG" into account.

Nevertheless: if Alexandrov is right and there is a system with second class constraints, then not taking them into account correctly is simply wrong; this does in no way depend on any other related approach. 2+2=5 is wrong; you do not have to develop a new formalism (or compare with an old formalism) to show that it's 4.

But perhaps Alexandrov is not right and all these classifications regarding first- and second-class do not apply in LQG. Then there have to be some arguments showing what is really different.

btw.: the same discussion applies to the Hamiltonian constraint in "old LQG" which generates an algebra that only closes on-shell; and it applies to SUGRA which relies on on-shell closure of local SUSY.

It is not clear to me if Rovelli's intention to start from some quantum theory w/o a strict derivation is viable. I agree that this approach is sufficient in order to understand the whole class of SF models, but in order to select one (and only one) one should insist on mathematical rigour. So I would say that these questions will come back sooner or later.

(nobody cared about Gribov ambiguities in QCD as long as they were only interested in DIS; but as soon as they wanted to understand confinement they had to solve these issues)
 
  • #10
Actually, thinking about this a bit more, I suspect that's why Rovelli does not give a Hamiltonian formulation, and in fact lists that as one of the priority topics for future work. I think you are completely correct that something is wrong about the treatment of the Hamiltonian constraint, but it might be that the answer is that the problem is with Hamiltonian mechanics in a covariant setting, rather than that the present theory is wrong. After all, there's no fundamental reason to prefer a canonical treatment, apart from that it's worked pretty well so far.

So in summary: we have a quantum theory, which is well-defined in the UV, with dynamics --- we can calculate transition amplitudes --- and which approximates GR in the (a?) semi-classical limit. Clearly, there is a lot of work to be done on the question of what exactly is this theory and exactly what its low energy (whatever that might mean) dynamics are --- the paper points out some good directions that people have looked at in terms of perturbative expansions. I am of the opinion that the failure to understand it as a standard field theory is a good thing, not bad; after all, if it was possible to do it as a field theory, we'd probably have found it a lot quicker. I hope that one day we'll have a better understanding of various quantum theories such that it becomes obvious why any field theoretic approach to an almost-topological-but-really-not-quite theory was doomed.
 
  • #11
genneth said:
... but it might be that the answer is that the problem is with Hamiltonian mechanics in a covariant setting ...
I don't think so; the canonical approach is more involved if one wants to prove covariance (regardless which theory you are talking about), but it works - iff it is applied correctly.

genneth said:
... we have a quantum theory, which is UV defined, with dynamics --- we can calculate transition amplitudes --- and which approximates GR in the (a?) semi-classical limit.
I agree; that's remarkable.

genneth said:
... the failure to understand it as a standard field theory
It is already far away from any other "standard field theory".

One of the biggest problems is that we cannot test it. The only accessible regime is the semiclassical one, where different approaches seem to converge. It is like developing QCD with access restricted to the energy range of atomic spectra.
 
  • #12
tom.stoer said:
I don't think so; the canonical approach is more involved if one wants to prove covariance (regardless which theory you are talking about), but it works - iff it is applied correctly.

...

One of the biggest problems is that we cannot test it. The only accessible regime is the semiclassical one, where different approaches seem to converge. It is like developing QCD with access restricted to the energy range of atomic spectra.

What I mean is that true diffeomorphism invariance is going to do serious violence to time variables. It is not obvious that a pure canonical approach is compatible --- which is why Rovelli uses a more direct definition of transition amplitudes, without defining a Hamiltonian, i.e. a generator of time translation.

As far as accessibility goes, I agree completely. On the other hand, we have only *one* theory currently that works. Of course, now that we do, it probably isn't hard to come up with irrelevant perturbations which make no difference at the semi-classical limit. On the other hand, it seems like that would be special pleading.

Frankly, I'm excited. If they (since I'm a complete outsider on this) can couple matter in such a way as to get the standard model out at low energies, I'd probably die a happy man. :wink:
 
  • #13
genneth said:
What I mean is that true diffeomorphism invariance is going to do serious violence to time variables. It is not obvious that a pure canonical approach is compatible ...
Using the canonical approach one has to assume a spacetime foliation like M4 = R × M3; R represents the real line. You can then use the canonical approach, provided diff.-inv. is implemented w/o anomaly. There is no need to identify R with physical time. It's nothing else but a direction perpendicular to something ...

genneth said:
Frankly, I'm excited. If they (since I'm a complete outsider on this) can couple matter in such a way as to get the standard model out at low energies, I'd probably die a happy man. :wink:
As far as I can see there is no restriction in coupling matter. So if they can put the standard model in, the result will be the standard model coupled to quantum gravity. I do not see how restrictions regarding the matter content / gauge symmetry can emerge from LQG/SF.

What I would like to see is something like "Sundance Bilson-Thompson ribbons" or "SF preons". But I am afraid this is wishful thinking.
 
  • #14
tom.stoer said:
Using the canonical approach one has to assume a spacetime foliation like M4 = R × M3; R represents the real line. You can then use the canonical approach, provided diff.-inv. is implemented w/o anomaly. There is no need to identify R with physical time. It's nothing else but a direction perpendicular to something ...


As far as I can see there is no restriction in coupling matter. So if they can put the standard model in, the result will be the standard model coupled to quantum gravity. I do not see how restrictions regarding the matter content / gauge symmetry can emerge from LQG/SF.

What I would like to see is something like "Sundance Bilson-Thompson ribbons" or "SF preons". But I am afraid this is wishful thinking.

But that's precisely the point. It seems that there are real obstacles to completing that quantization program rigorously. On the other hand, we clearly have a well-defined theory, but which might not fit into the canonical framework.

And in so far as matter goes, they can't *just* put a field theory on top --- no spacetime on which to do QFT. It might transpire that there's a trivial way (in the trivial bundle sense) of just slapping gauge degrees of freedom on the network a la lattice QCD, but fermions might still be a bit of a problem. There's then the problem that interactions might cause some phase transition.
 
  • #15
genneth said:
But that's precisely the point. It seems that there are real obstacles to completing that quantization program rigorously. On the other hand, we clearly have a well-defined theory, but which might not fit into the canonical framework.
I would say that it can be done via the canonical formalism; the only question is how. I see no fundamental reason why the canonical formalism should not work.
In addition I am convinced that it must be done via the canonical formalism, simply because (strictly speaking) the path integral is derived from the Hamiltonian, not the other way round. So either it works in LQG as well, and then one should do it in order to secure results from SF; or it does not work, and then it's very important to understand why. This last step is missing. Currently it looks like "they are not smart enough to do it" - and this cannot be the end of the story.

Look at QCD: they told us for decades that QCD cannot be quantized in the canonical formalism and one has to use the PI. They managed to quantize QCD via PI - but only because they neglected non-perturbative issues, large gauge transformations, Gribov ambiguities etc. In the late eighties and early nineties I was active in a group focusing on non-perturbative quantization of QCD in the canonical framework, and it worked! But - and this is essential - one had to solve exactly the issues that had been neglected in the PI formalism.

genneth said:
And in so far as matter goes, they can't *just* put a field theory on top --- no spacetime on which to do QFT. It might transpire that there's a trivial (in the trivial bundle sense) of just slapping gauge degrees of freedom on the network ala lattice QCD, but fermions might still be a bit of a problem. There's then the problem that interactions might cause some phase transition.
Now I understand what you mean. You are addressing the problem that quantizing a field theory in the usual sense requires a spacetime manifold in order to define the field operators, propagators etc. If this is missing, and if one expects back-reaction from the fields on the dynamical spacetime manifold (or whatever its replacement will be), then the standard field theory approach is in trouble. In addition I do not believe that SF+fields is similar to lattice (gauge) theory. They are both discrete, but on the lattice there is a well-defined continuum limit whereas in SF there is not even a lattice spacing. A link has a length defined by the spectrum of an SU(2) operator and there is no continuum limit. I agree that fermions may cause additional problems.

Again the canonical approach (now abandoned) seemed to be a framework to derive LQG + additional fields in one context. The canonical approach required some "discretization" = identification of "chunks of space" represented by diff.-inv. quantities (cylinder functions => spin networks). If you abandon this approach of "deriving" LQG you are no longer able to derive the matter coupling in parallel. You have to put matter on top of the SF w/o any hint where you came from. So again this is a strong hint that the canonical formalism may provide additional insights.

But I should come back to my main point: if there are second-class constraints, then they have to be taken into account properly, regardless which formalism you are using. I hope there will be some response to Alexandrov's latest paper on arxiv quite soon.
 
  • #16
tom.stoer said:
If you abandon this approach of "deriving" LQG you are no longer able to derive the matter coupling in parallel. You have to put matter on top of the SF w/o any hint where you came from.

That's why I can't see any potential in SF for true (non-ad hoc) unification of gravity with matter. That's another good reason to do string theory, which requires extra matter to be present for consistency.
 
  • #17
suprised said:
That's why I can't see any potential in SF for true (non-ad hoc) unification of gravity with matter.
I agree with this conclusion. LQG/SF as of today will definitely not provide any hint towards a true unification. "Sundance Bilson-Thompson ribbons" and "SF preons" are wishful thinking, I am afraid.

suprised said:
That's another good reason to do string theory, which requires extra matter to be present for consistency.
I think you are not surprised that here I don't agree :-)
 
  • #18
tom.stoer said:
I agree with this conclusion. LQG/SF as of today will definitely not provide any hint towards a true unification. "Sundance Bilson-Thompson ribbons" and "SF preons" are wishful thinking, I am afraid.


I think you are not surprised that here I don't agree :-)

So what do you think is most promising -- noncommutative geometry? :)
 
  • #19
ensabah6 said:
So what do you think is most promising -- noncommutative geometry? :)

believe it or not: LQG! learn how to quantize gravity consistently and then check how it can be applied to other theories heading for unification.
 
  • #20
tom.stoer said:
believe it or not: LQG! learn how to quantize gravity consistently and then check how it can be applied to other theories heading for unification.

I've wondered whether it's time to "dump" historical LQG, based on Ashtekar's variables, and create a Hamiltonian constraint that gives GR in the semiclassical limit, along with a physical inner product and off-shell closure, with spin networks as its kinematics, which may or may not match the LQG area and volume operators. What kind of Hamiltonian constraint would give rise to GR in the low energy regime? Then work from there. I call this a top-down approach: start with a Hamiltonian that reproduces GR and work towards its Planck scale properties, as opposed to Ashtekar's bottom-up approach: start with GR, rewrite the variables, quantize, and then recover GR.

What do you mean by unification? A larger symmetry group a la SU(5) or SO(10) GUT at the 10^15 GeV scale that is then spontaneously broken to the SM gauge group? If so, why not just go with string theory?
 
  • #21
The problem is that in most cases it’s hardly possible to write down a Hamiltonian which has the correct symmetries; you need the Lagrangian in order to identify them. But then you are in the same situation as at the starting point of the LQG research program …

By unification I mean that there is some approach which derives different interactions from one unique framework. It doesn’t matter if it’s “strings” or “GUT” or anything else.
 
  • #22
Aha. I think broadly we're actually in agreement. I completely agree that a canonical formulation is desirable, or else a very good reason why not. I'm obviously not suggesting that the current state of affairs is the end of the road --- far from it. I'm excited because it's truly a beginning. However, on a technical point, I disagree with the statement that the canonical formalism is somehow more "fundamental" than sum-over-histories; I would equally disagree with putting it the other way round. I fundamentally believe there is no *right* way to do time evolution in quantum mechanics. Just because something can be derived from something else, and has worked so far, does not give it logical priority. The current SF formulation is a proper theory because it is unambiguous about what transition amplitudes are, regardless of how they are calculated.

Because I'm a condensed matter theorist, I have prejudices about (quantum) field theory, and great sadness that people do not often look beyond. The fact that symmetries can be present in lower energy effective theories should be an invitation to look for UV-complete theories which don't live on a manifold. So the SF formulation is attractive to me in that way.

I agree of course that matter+SF is not the same as lattice gauge theory. However, there are ways to create gauge theories as the effective theory on a lattice (semi-familiar to condensed matter theorists, in the form of string-net condensates, see work by Levin and Wen), so that's what I mean when I say that I think gauge theories are probably quite tractable. Proper fermions usually come for free out of this, but in GR there's probably torsion to worry about. And finally, chiral fermions are a complete mess.
 
  • #23
tom.stoer said:
The problem is that in most cases it’s hardly possible to write down a Hamiltonian which has the correct symmetries; you need the Lagrangian in order to identify them. But then you are in the same situation as at the starting point of the LQG research program …

By unification I mean that there is some approach which derives different interactions from one unique framework. It doesn’t matter if it’s “strings” or “GUT” or anything else.

What would a Lagrangian look like in order that you could write down a Hamiltonian with correct symmetries?

Maybe there's no "unique" framework, so any such attempt to construct one is ultimately hopeless; i.e. some elementary particles are twists in geometry, others strings, still others geodons, mini-stable black holes, string-net particles, etc.
 
  • #24
genneth said:
Aha. I think broadly we're actually in agreement. I completely agree that a canonical formulation is desirable, or else a very good reason why not. I'm obviously not suggesting that the current state of affairs is the end of the road --- far from it. I'm excited because it's truly a beginning. However, on a technical point, I disagree with the statement that the canonical formalism is somehow more "fundamental" than sum-over-histories; I would equally disagree with putting it the other way round. I fundamentally believe there is no *right* way to do time evolution in quantum mechanics. Just because something can be derived from something else, and has worked so far, does not give it logical priority. The current SF formulation is a proper theory because it is unambiguous about what transition amplitudes are, regardless of how they are calculated.
Yes, we broadly agree, and therefore I will stop insisting on the canonical formulation as the more fundamental one ...

genneth said:
Proper fermions usually come for free out of this, but in GR there's probably torsion to worry about. And finally, chiral fermions are a complete mess.
Torsion is not something to worry about. I think everybody agrees that LQG with matter will not have GR but Einstein-Cartan theory as its semiclassical limit. I know that LQG with fermions has been studied, but afaik never with chiral fermions ...
 
  • #25
tom.stoer said:
But: once you choose to work with a constraint algebra and you end up with second class constraints, there is no way out; you have to respect what the maths gives you. There are no excuses. My problem is that I can hardly follow the new BF approach, and it may very well be that I overlook some basic facts that are rather clear to the insiders. What bothers me is that - as you explain - Alexandrov is an insider and nevertheless identifies such a basic issue.

You missed my point. I know all this. First of all look at the EPRL papers. The second class constraints are NOT implemented as C|phys> = 0 but as <phys|C|phys> = 0. This is another of the standard techniques to deal with them. Or rather in the papers from a few years back several options are explored. They turn out to do essentially the above.
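
The distinction, written out (my restatement of the two options in LaTeX):

```latex
% Strong, Dirac-style imposition (appropriate for first class constraints):
\hat{C}\,\lvert \mathrm{phys} \rangle = 0
% Weak, EPRL-style imposition (expectation values, used for the
% second class simplicity constraints):
\langle \mathrm{phys} \rvert\, \hat{C} \,\lvert \mathrm{phys} \rangle = 0
```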

It can be argued that in the BC model they were imposed too strongly. This misimplementation of the constraints was actually a starting point for Rovelli to investigate EPRL.

This is not Alexandrov's objection, which is related to the symplectic structure involved. Otherwise he couldn't argue that FK without \gamma is unaffected; as far as implementing the constraints goes, it's doing something extremely similar to EPRL.

Now again, while we can take hints from standard quantisation techniques, spin foam models are NOT one of them. They do NOT follow the Dirac approach or the Faddeev-Popov approach or any other approach. At most they are a sort of lattice path-integral quantisation. So the Dirac algorithm is not really the appropriate tool at all (which is why the BC model never bothered with it). As far as I understand, you might be worried that you obtain the wrong measure in the path integral, but then I again ask: wrong by what standard? It's certainly not wrong by the 2+2=5 standard, that is, inconsistent. You haven't given any consistency requirements, after all. Thus your call for mathematical rigour is empty.*

To me the "quantisation" methods used to arrive at the EPRL model are unconvincing anyway. It is however an extremely natural model from the point of view of representation theory and TQFT in the state sum approach. That's why it's interesting for me to work on. The BF+constraints thing is motivational. If it doesn't make a good motivation for you, ignore it.

Now if you are married to Dirac-style quantisation and the canonical formalism, by all means knock yourself out. That's old school LQG. Thomas Thiemann is working on this; Kristina Giesel, Hanno Sahlmann and a good group of others, too. The problem there is how to interpret the theory. (And believe me, I worked on this; it's hard. Deeply, fundamentally hard. Basically due to the introduction of the 3+1 split.) But if you can solve that, they have a mathematically wonderfully rigorous framework to sell you.

But that's NOT what spin foams are, even if there is some overlap with LQG techniques. Take the most successful spin foam/state sum model to date: the Turaev-Viro model. Its relation to a path integral is extremely obscure. In fact there are many state sum TQFTs for which neither a Lagrangian nor a Hamiltonian formulation is known. Furthermore these theories naturally live on manifolds that don't have \sigma \times R topology.

As an aside: from the spin foam perspective this all doesn't matter. The belief that Dirac quantisation is somehow the only true way is suspect to me anyway. I apply the mathematician's criterion: is the resulting theory interesting and free of contradictions? Dirac quantisation will not solve your conceptual problems for you; in fact it will make them harder. And the most crucial conceptual question of all is the one about the phase transition. Until we know how to ask and answer this question, we don't know what classical theory we are looking at.




*As a matter of fact, even in the cases where "rigorous" quantisation is well defined it is never unique (cf. Haag's theorem, for example, and many other results on quantisation ambiguities).
 
  • #26
I will try to invite the author of this paper to post here, hmm.
 
  • #27
It seems he also has something to say about LQG itself:

http://ipht.cea.fr/Phocea/file.php?file=Seminaires/991285/991285.pdf

Critical overview of Loops and Foams

Sergei Alexandrov

LPTA, Université de Montpellier

I'll give a critical review (from a personal viewpoint) of the present status and recent progress of loop and spin foam approaches to the quantization of four-dimensional general relativity. I'll argue that none of these approaches has provided up to now a model which can be considered as a candidate for a theory of quantum gravity. In particular, loop quantum gravity is likely to be an anomalous theory, whereas the present spin foam models suffer from incomplete (or even wrong) implementation of the constraints of general relativity.
 
  • #28
Eugenio Bianchi, Daniele Regoli, Carlo Rovelli uploaded a paper today that tackled the issues raised by Alexandrov:

http://arxiv.org/abs/1005.0764

Face amplitude of spinfoam quantum gravity

Eugenio Bianchi, Daniele Regoli, Carlo Rovelli
(Submitted on 5 May 2010)
The structure of the boundary Hilbert-space and the condition that amplitudes behave appropriately under compositions determine the face amplitude of a spinfoam theory. In quantum gravity the face amplitude turns out to be simpler than originally thought.

******

"Alexandrov has stressed the fact that the implementation of second class constraints into a Feynman path integral in general requires a modification of the measure, and here the face amplitude plays precisely the role of such a measure, since the weight is of the form A_v * e^(i Action). Do we have an independent way of fixing the face amplitude?

Here we argue that the face amplitude is uniquely determined for any spinfoam sum of the form (1) by three inputs: (a) the choice of the boundary Hilbert space, (b) the requirement that the composition law holds when gluing two-complexes; and (c) a particular “locality” requirement, or, more precisely, a requirement on the local composition of group elements."
 
