Can Quantum Cosmology Be Saved with the Addition of Matter?

  • Thread starter Finbar
In summary, Martin Bojowald, in his review talk at Loops 13 entitled "Loop Quantum Cosmology: A Eulogy", has declared that loop quantum cosmology has failed due to its inability to solve the anomaly problem inherited from loop quantum gravity. This problem has been known for years and has not been effectively addressed, leading to the conclusion that there is nothing of real value left in the subfield. The anomaly problem refers to the inconsistencies in the canonical LQG constraint algebra, which is not a Lie algebra and which has neither been solved nor effectively addressed. The quantization process also affects the constraint algebra in an uncontrolled way, leading to further issues such as unphysical gauge degrees of freedom. The Immirzi parameter introduces an additional, still poorly understood quantization ambiguity.
  • #1
Finbar
In his review talk at Loops 13, entitled "Loop Quantum Cosmology: A Eulogy", Martin Bojowald has declared that loop quantum cosmology has failed.

http://www.perimeterinstitute.ca/videos/quantum-cosmology-1
 
  • #2
tom.stoer, check out 24:00, "Rovellitis"! There's a discussion of the anomaly problem.
 
  • #3
Can't play the video :-(
 
  • #4
But I can read the PDF, and LQG is seriously ill.
 
  • #5
Who's the girl with the microphone?
 
  • #6
tom.stoer said:
Can't play the video :-(

I also thought that I couldn't play the video, but, a minute or two after clicking on it, the video appeared.
 
  • #7
The video is Flash, and my device at home is an iPad ;-(

Anyway, the slides explain a lot; the anomaly issue has been known for years but mostly swept under the carpet.
 
  • #8
...nothing unexpected
 
  • #9
The impression I got from skimming the video was that LQC, contrary to initial hopes, offers no advantage for escaping the problems of LQG. The reasoning seemed to be: we now consider LQC just part of LQG, so LQC inherits all of LQG's problems. That is, the anomaly problem wasn't actually avoided, nor has it been solved, so there is nothing of real value left in the subfield.

Maybe I understood it wrong, but I figured I would try to give a summary for others who don't feel like watching it / can't watch it.
 
  • #10
What does the "anomaly problem" mean?
 
  • #11
bcrowell said:
What does the "anomaly problem" mean?

I don't remember exactly; it's been around for a while, though. Roughly speaking, canonical LQG attempts to solve QG using two things: new variables, and a non-standard approach to imposing constraints. The approach to imposing constraints has inconsistencies, specifically with the Hamiltonian constraint, and these inconsistencies are collectively referred to as "anomalies".

Over the years there has only been partial progress towards fixing the problem; in other words, it's still there. I think the standard response in the community is to hide in spin foam models, which typically avoid the Hamiltonian and therefore don't have / hide the anomaly problem.
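To put the problem in one formula (a generic sketch, not taken from any particular LQG paper): classically the constraints close under Poisson brackets,

$$\{C_a, C_b\} = f_{ab}{}^{c}(q,p)\, C_c ,$$

so the dynamics preserves the constraint surface. After quantization one generically finds

$$[\hat C_a, \hat C_b] = i\hbar\, \hat f_{ab}{}^{c}\, \hat C_c + \hbar^2\, \hat A_{ab} ,$$

and any anomaly term with $\hat A_{ab} \neq 0$ means the conditions $\hat C_a |\psi\rangle = 0$ cannot all be imposed consistently: acting with a commutator on a would-be physical state no longer gives zero.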

Edit: There was a paper I read that explained the inconsistency at a fairly elementary level (using a simple qm problem to illustrate it). Unfortunately I can't find it, maybe someone else knows what I'm talking about and can link it for you.

2nd Edit: Tom Stoer linked the paper I was talking about below, and also explained it in much better detail. The paper is http://arxiv.org/pdf/1009.4475v1.pdf. If you read the section "a simple example", you will at the very least see the problem illustrated.
 
  • #12
The anomaly problem says that loop quantization destroys the classical symmetries of the Holst action, and hence that LQG is inconsistent and not viable as a quantum theory of gravity.

There are several related issues.

The constraint algebra consists of three constraints: the Gauß constraint G, the diffeomorphism constraint D, and the Hamiltonian constraint H (the relic of timelike diffeomorphisms after the spacelike foliation). The constraint algebra is non-Lie: the bracket of H[f] and H[g] with two test functions f and g involves structure functions instead of structure constants; a canonical treatment or solution is not known!
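For reference, the classical algebra is the hypersurface deformation algebra (a standard result; sign and smearing conventions vary):

$$\{D[\vec N], D[\vec M]\} = D[\mathcal{L}_{\vec N}\vec M],\qquad \{D[\vec N], H[f]\} = H[\mathcal{L}_{\vec N} f],\qquad \{H[f], H[g]\} = D\big[q^{ab}(f\,\partial_b g - g\,\partial_b f)\big].$$

The inverse spatial metric $q^{ab}$ in the last bracket is a phase-space function, which is exactly why one gets structure functions instead of structure constants.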

The constraints are not solved on an equal footing but step-wise. H cannot be solved at all (up to now); formal solutions are known but unphysical. When discretizing and using the Weyl algebra quantization, the infinitesimal generator D is no longer defined; only finite diffeomorphisms can be defined. That is a first hint that the quantization affects the constraint algebra in an uncontrolled way.

A quantum version of H is still unknown (not unique), so its constraint algebra is not known (it is ambiguous), either! Thiemann's trick plus quantization / discretization results in an H that seems to be unphysical (it does not create volume). Some steps in the quantization seem to be ad hoc. That is a second hint that the quantization affects the constraint algebra in an uncontrolled way.
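For context, the trick mentioned above trades the non-polynomial density factor in H for a Poisson bracket with the volume, which has a well-defined operator analogue; schematically, up to numerical constants and conventions:

$$H_E \sim \int d^3x\, \frac{E^a_i E^b_j}{\sqrt{\det E}}\,\epsilon^{ij}{}_k F^k_{ab} \;\sim\; \int d^3x\,\epsilon^{abc}\, F^k_{ab}\, \{A^k_c, V\},\qquad V = \int d^3x\,\sqrt{|\det E|}.$$

On a graph the bracket becomes a commutator with the volume operator, but the resulting operator inherits the regularization ambiguities mentioned above.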

The derivation of the LQG constraint algebra involves one gauge fixing, the time gauge, before quantization. That is well known from QED and QCD (temporal or Weyl gauge). All other gauge fixings are done after quantization. Attempts to check consistency w/o time gauge using full constraint quantization a la Dirac generate 2nd class constraints; a consistent treatment / solution in LQG is still unknown (2nd class constraints modify the commutator relations). There are hints that unphysical gauge d.o.f. become dynamical gauge artifacts when quantizing incorrectly! (Alexandrov)

The way the constraints are implemented is far from natural. The time gauge is applied by setting the corresponding d.o.f. (the field) to zero. Other constraints are solved by applying them to states. But as infinitesimal diffeomorphisms fail to be defined after quantization, this seems to be problematic (I do not know the current status, but the way Thiemann implemented the constraints results in a rather strange, ultra-local topology). Sometimes constraints are implemented a la Gupta-Bleuler, which is known to be problematic in non-abelian gauge theories. Nothing is explicitly wrong, but many constructions are not unique, are ad hoc, or seem strange.

There are hints that the operator algebra does not close after quantization; it only closes modulo constraints. The way these additional terms vanish depends in some sense on the above-mentioned tricks. So this is a hint that quantization destroys the off-shell closure of the algebra and that the quantized theory contains gauge artifacts / unphysical gauge d.o.f.!

The algebra depends implicitly on the Immirzi parameter β, which is still not fully understood and which introduces a quantization ambiguity. The natural choice β=i is not viable b/c it introduces reality conditions which are not fully understood in the canonical framework. For real β the Hamiltonian is awfully complicated. When looking at it from the spin foam perspective, the simplicity constraints are somehow related to the reality conditions. But the simplicity constraints become second class and affect the path integral measure, which has not been worked out completely. All this indicates that we should not expect too many insights from the spin-foam approach (as far as I can see, the path integral is never able to solve fundamental problems).
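For reference, β enters via the Ashtekar-Barbero connection and shows up directly in the kinematical spectra (standard LQG formulas):

$$A^i_a = \Gamma^i_a + \beta\, K^i_a,\qquad \text{area eigenvalues} = 8\pi\beta\,\ell_P^2 \sum_n \sqrt{j_n(j_n+1)},$$

so different values of β give physically different spectra, even though β drops out of the classical equations of motion; that is why it is called a quantization ambiguity.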

There have been some papers (by Perez?) studying the relation of discretization and quantization, and even discretization in the classical theory. The relation of spin foams and the canonical approach indicates that both approaches are somehow "singular limits" where curvature etc. is located on edges / vertices. That could mean that "discretization + quantization" is one major problem. It is clear that discretization affects the constraint algebra in several ways, especially in the path integral approach, so this is another hint that inconsistencies are introduced in an uncontrolled way.

Unfortunately there is not one paper summarizing these issues. The spin foam community has, to a large extent, decided to work on physical applications instead of consistently defining their theory. The first paper I am aware of is Nicolai's "outside view" from 2005 and Thiemann's reply in his "inside view" from 2006. My feeling was always that Thiemann did not fully address all issues. Then there are Alexandrov's papers from 2010-2012 or so, the best summary I am aware of, which discusses many of these issues in detail, unfortunately w/o finding a solution.

http://arxiv.org/abs/hep-th/0501114
http://arxiv.org/abs/hep-th/0608210

http://arxiv.org/abs/1009.4475

https://www.physicsforums.com/showthread.php?t=545596
https://www.physicsforums.com/showthread.php?t=544728

You should have a look at the two threads where we discussed Alexandrov's in-depth analysis of the anomaly problem in detail (but forget his own proposal, which did not succeed).

One remark: it is often said that Alexandrov's papers are outdated b/c they do not take the EPRL/FK models into account. As far as I can see this is wrong, b/c these two approaches neither address nor solve any of the above-mentioned issues! The anomaly problem is still there.

Another remark: all consistency checks regarding graviton propagators, vertices, the semi-classical limit etc. are irrelevant b/c they do not address the regime where the anomaly would kill the theory.

I do not know whether the spinor/twistor approach (Wieland's papers) sheds new light on these issues. To me it seems to be nothing else but the introduction of new variables, which is in many ways equivalent to the standard approach (and therefore has the same problems). But perhaps I am overlooking something.

I do not know whether Thiemann's dust approach will help; as far as I can see there is no progress regarding these anomaly related issues from the Erlangen group.
 
  • #13
http://arxiv.org/pdf/1101.3294v4.pdf

"Calculating these constants for the EPRL/FK vertex amplitude appears
to be a difficult problem, but the solution must exist"

just optimism...

 
  • #14
I don't know if you've seen the following talks, but there are some talks about Group Field Theory.
 
  • #15
GFT doesn't help.

And EPRL/FK doesn't help either. It has intrinsic problems regarding the simplicity constraints (2nd class), and it has the cosine problem.

My experience from non-abelian gauge theory is that path integrals do not solve but hide conceptual problems. And I am convinced this is true in LQG as well.
 
  • #16
By the time of http://arxiv.org/abs/1208.1463 , Bojowald knew of the signature change which is causing the trouble. The signature change means that if one imposes a gauge that assumes, e.g., Lorentzian signature at the start, but the mathematics shows Euclidean signature, then that's like a gauge anomaly, which means the procedure is not consistent. But in the same paper, Bojowald writes that the theory makes testable and falsifiable predictions. How can an inconsistent theory make a prediction?
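For the curious: in the effective-LQC literature the signature change shows up as a deformation of the H-H bracket (schematically; conventions vary, see e.g. the papers by Bojowald and by Cailleteau et al.):

$$\{H[f], H[g]\} = \Omega\, D\big[q^{ab}(f\,\partial_b g - g\,\partial_b f)\big],\qquad \Omega = 1 - \frac{2\rho}{\rho_c},$$

so near the bounce, where the density ρ exceeds ρ_c/2, Ω becomes negative and the algebra looks Euclidean rather than Lorentzian.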
 
  • #17
He tries to solve these problems in these papers:

http://arxiv.org/abs/1212.4773

Deformed General Relativity

Martin Bojowald, George M. Paily
(Submitted on 19 Dec 2012)
Deformed special relativity is embedded in deformed general relativity using the methods of canonical relativity and loop quantum gravity. Phase-space dependent deformations of symmetry algebras then appear, which in some regimes can be rewritten as non-linear Poincare algebras with momentum-dependent deformations of commutators between boosts and time translations. In contrast to deformed special relativity, the deformations are derived for generators with an unambiguous physical role, following from the relationship between canonical constraints of gravity with stress-energy components. The original deformation does not appear in momentum space and does not give rise to non-locality issues or problems with macroscopic objects. Contact with deformed special relativity may help to test loop quantum gravity or restrict its quantization ambiguities.

http://arxiv.org/abs/1302.5695

Quantum matter in quantum space-time

Martin Bojowald, Golam Mortuza Hossain, Mikhail Kagan, Casey Tomlin
(Submitted on 22 Feb 2013 (v1), last revised 7 Mar 2013 (this version, v2))
Quantum matter in quantum space-time is discussed using general properties of energy-conservation laws. As a rather radical conclusion, it is found that standard methods of differential geometry and quantum field theory on curved space-time are inapplicable in canonical quantum gravity, even at the level of effective equations.
 
  • #18
atyy said:
By the time of http://arxiv.org/abs/1208.1463 , Bojowald knew of the signature change which is causing the trouble. The signature change means that if one imposes a gauge that assumes, e.g., Lorentzian signature at the start, but the mathematics shows Euclidean signature, then that's like a gauge anomaly, which means the procedure is not consistent. But in the same paper, Bojowald writes that the theory makes testable and falsifiable predictions. How can an inconsistent theory make a prediction?

Well, if you know some classical logic, then you know that from an inconsistent theory you can prove anything.

:-)
 
  • #19
I am not sure whether a quantum anomaly means inconsistency in the sense of 1=2. Usually anomalies mean that a mathematically consistent theory no longer makes sense physically.
 
  • #20
tom.stoer said:
I am not sure whether a quantum anomaly means inconsistency in the sense of 1=2. Usually anomalies mean that a mathematically consistent theory no longer makes sense physically.

I am not sure I understand this "no longer makes sense physically".

I mean, for me, making sense physically means, for example, that a solution of a PDE doesn't diverge at infinity.

But when we check physical theories, the only test of validity (I mean, of being physically sensible) is the experiments done to verify or falsify the theory, as long as the maths is consistent (which we assume as well; we can't prove it, we can only disprove its consistency).
 
  • #21
Let's make an example: "loss of unitarity" means that the result of a calculation is, e.g., "120% probability for the sum of all decay channels". If the math predicts the 120% uniquely, i.e. not 120%, 3457%, ..., depending on whatever, then the theory is consistent mathematically: one can prove that a certain quantity has the value 120%. But the theory is not viable physically, b/c the 120% can no longer be interpreted as a probability for physical processes.
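For reference, the "100%" requirement is just unitarity of the S-matrix (a textbook fact):

$$S^\dagger S = \mathbb{1} \quad\Rightarrow\quad \sum_f \big|\langle f|S|i\rangle\big|^2 = 1,$$

so a theory can be perfectly well-defined as mathematics and still violate this sum rule, which is what "not viable physically" means here.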

I do not know exactly what happens in LQC, but I doubt that one has derived a mathematical contradiction like 1=2. I guess the theory has defects such that physical predictions and interpretations are impossible. If you start in a certain time gauge and observe a signature change, this means that there is no time direction any more. I do not see whether this is something like 1=2, i.e. whether there is a mathematical inconsistency, or whether the theory is only no longer applicable physically.

But to be honest, this distinction is of minor importance.

What bothers me more is that the time gauge is an essential ingredient for full LQG as well. Afaik other approaches avoiding the time gauge and applying the full Dirac program to all d.o.f. have never succeeded. Alexandrov found rather complicated expressions for the Dirac brackets, such that a standard Lie-algebra-like approach is impossible (even in time gauge it's difficult, due to the structure functions generated by the Hamiltonian constraint). From these expressions one is not able to construct a consistent operator algebra, b/c there is no way to define a reasonable operator ordering. Therefore gauge anomalies cannot be avoided in this approach.

So it might be that full LQG suffers from the same problems as LQC in time-gauge, but in addition is insolvable w/o time gauge. That would kill the theory.

But there is some hope: not many people paid attention to these conceptual problems over the last years; issues have been ignored or disputed (partially inadequately); spin foam people are trying to solve for the semiclassical limit, propagators and n-point functions; many people focussed on black hole entropy, state counting etc.; there was a lot of work in the spinor/twistor area with some new tools and insights. So not everything is lost.

One problem could be that the successes of the theory have been overrated and that the command "back to the drawing board" comes inopportunely.

From my perspective the situation is as follows:
- kinematics is well understood, some proofs (LOST) do exist, the arena is clear
- physical applications are on their way, tools are available
- the main open issues have been identified, the conceptual problems are known

Unfortunately, regarding consistency (= off-shell closure of the constraint algebra, absence of anomalies) and well-defined dynamics (= a Hamiltonian with regularization / finiteness), there is no progress over the last decade. In addition, the (related) quantization ambiguity called the "Immirzi parameter" is not fully understood. Last but not least, a deSitter deformation / quantum group construction / cosmological constant is not developed to the same degree as the standard LQG framework (this is a severe problem b/c in all LQG approaches it enters as an algebraic quantum deformation, whereas in other approaches like asymptotic safety it is a running coupling constant subject to renormalization group flow).

To continue with my perspective, I see two major issues:
- consistent, anomaly-free quantization
- role of the cosmological constant
 
  • #22
...dead ......:rolleyes:


 
  • #23
audioloop said:
...dead ......:rolleyes:.

I don't think one can be certain about these things.

 
  • #24
atyy said:
I don't think one can be certain about these things.

I'm certain I've read something about uncertainty somewhere... hmm, where, where?

Aww, never mind, my computer is getting low on ink, anyway... carry on.

OCR
 
  • #25
Rovellitis? Is that an insider joke? What does he mean by that?
 
  • #26
Morbus Rovelli? (Latin: "Rovelli's disease")
 
  • #27
atyy said:
I don't think one can be certain about these things.

...lol... ...et mortui adhuc ambulamus ("and we, the dead, still walk")

 
  • #29
John86 said:
article that makes interesting statements

http://johnwbarrett.wordpress.com/tag/category-theory-and-physics/

It remains circling around the loop and string views, reflecting reiterated approaches of conversion, i.e. quantization or relativization.

Interesting, but the problem calls for new, radical, or totally unexpected ideas.


 
  • #30
I don't follow LQC at all, but I thought of it while reading "Strings in compact cosmological spaces", which develops a formalism to describe transition amplitudes between cosmological slices. String theory in closed backgrounds develops divergences that aren't cured by the usual stringy magic, but in this case they are canceled by adding to the formalism background minisuperspace variables which describe the backgrounds in which the strings might be moving. (Just to be specific, divergences which appear at the "one-loop" level are canceled by minisuperspace variables at the "tree" level.)

Here string problems were solved by taking into account the quantum geometry of minisuperspace. I wonder if some of the problems of LQG/LQC can be solved by introducing matter?
 

1. What is quantum cosmology?

Quantum cosmology is a branch of theoretical physics that combines principles of quantum mechanics and general relativity to study the origin and evolution of the universe.

2. How does matter affect quantum cosmology?

Matter plays a crucial role in quantum cosmology as it is responsible for the formation of structures in the universe, such as galaxies and stars. It also affects the dynamics of the universe through its gravitational interactions.

3. Can quantum cosmology be saved with the addition of matter?

There is ongoing research and debate on whether the addition of matter can help resolve some of the challenges and limitations of quantum cosmology. Some theories suggest that matter can provide a more complete understanding of the universe, while others argue that it may complicate the already complex mathematical models.

4. What are the potential implications of adding matter to quantum cosmology?

The addition of matter to quantum cosmology could potentially provide a more comprehensive understanding of the universe, including its origin, structure, and evolution. It may also help reconcile the discrepancies between quantum mechanics and general relativity, two fundamental theories that govern the behavior of matter and the universe.

5. What are some current research efforts in this area?

Scientists are currently exploring various approaches to incorporate matter into quantum cosmology, such as loop quantum cosmology, string theory, and brane cosmology. These theories are being tested through mathematical models and observational data, such as from the cosmic microwave background radiation, to validate their predictions and implications.
