Thiemann's New Theory: Solving Problem of Time in GR & Cosmology

  • Thread starter: selfAdjoint (Staff Emeritus, Gold Member, Dearly Missed)
Marcus points us to this new paper:

http://arxiv.org/abs/astro-ph/0607380
Solving the Problem of Time in General Relativity and Cosmology with Phantoms and k-Essence
Thomas Thiemann

He introduces a negative energy scalar field, which he calls a phantom because it is unobservable, in order to solve the problem of time in GR. He issues this challenge to common wisdom:

All textbooks on classical GR incorrectly describe the Friedmann equations as physical evolution equations rather than what they really are, namely gauge transformation equations. The true evolution equations acquire possibly observable modifications to the gauge transformation equations whose magnitude depends on the physical clock that one uses to deparametrise the gauge transformation equations.

Any comments?
 
selfAdjoint said:
Any comments?

I printed it out and am having a look. In the "links thread" I didn't give the abstract summary of the paper, but if we are going to discuss it, I may as well copy that in:

"We show that if the Lagrangean for a scalar field coupled to General Relativity only contains derivatives, then it is possible to completely deparametrise the theory. This means that
1. Physical observables, i.e. functions which Poisson commute with the spatial diffeomorphism and Hamiltonian constraints of General Relativity, can be easily constructed.
2. The physical time evolution of those observables is generated by a natural physical Hamiltonian which is (constrained to be) positive.

The mechanism by which this works is due to Brown and Kuchar. In order that the physical Hamiltonian is close to the Hamiltonian of the standard model and the one used in cosmology, the required Lagrangean must be that of a Dirac-Born-Infeld type. Such matter has been independently introduced previously by cosmologists in the context of k-essence due to Armendariz-Picon, Mukhanov and Steinhardt in order to solve the cosmological coincidence (dark energy) problem. We arrive at it by totally unrelated physical considerations originating from quantum gravity. Our manifestly gauge invariant approach leads to important modifications of the interpretation and the analytical appearance of the standard FRW equations of classical cosmology in the late universe. In particular, our concrete model implies that the universe should recollapse at late times on purely classical grounds."
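(For orientation, the Brown-Kuchar mechanism the abstract invokes can be sketched schematically. The notation below is generic canonical-gravity notation, not copied from the paper: with ##C## the Hamiltonian constraint and ##C_a## the spatial diffeomorphism constraints of GR, a scalar whose Lagrangean contains only derivatives lets one solve the total Hamiltonian constraint for the scalar momentum ##\pi##:
$$\pi + h(x) = 0, \qquad h(x) = \sqrt{C^2 - q^{ab}\,C_a C_b},$$
so that the physical time evolution of gauge invariant observables is generated by the manifestly positive functional
$$H_{\rm phys} = \int_\sigma d^3x \; h(x),$$
which Poisson-commutes with all the constraints.)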
 
the first thing I hope someone will correct or confirm for me is this: it looks to me as if, in the context of purely classical cosmology, where for generations people have been using the FRW model (the Friedmann equation),
there is a little problem, simply because the TIME PARAMETER t that one finds in the solution of the Friedmann model is only formally a time parameter. It is not something that could be measured by a clock.

To me it has always looked like a very good time parameter and it is what cosmologists conventionally use to clock the evolution and age of the universe.

but when one scrutinizes it, the classical time parameter in the Friedmann model has only a kind of "honorary" or fiducial existence---it is adopted more for the sake of convenience---as a parameter to keep track of developments---than as something forced on us by nature.
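for reference, the standard FRW setup I have in mind (textbook formulas, nothing exotic):
$$ds^2 = -dt^2 + a(t)^2\left[\frac{dr^2}{1-kr^2} + r^2\, d\Omega^2\right], \qquad \left(\frac{\dot a}{a}\right)^2 = \frac{8\pi G}{3}\,\rho - \frac{k}{a^2}$$
the ##t## here is the proper time of comoving observers---which is why it feels so physical---but in the canonical formalism it enters as a coordinate label, not as something measured by a Dirac observable.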

acknowledging this, if in fact it is true, goes against habit and is damnably awkward. perhaps someone will explain that it is not true, which would be a relief.

in any case this appears to be an awkwardness in the classical picture that carries over into Quantum Gravity---or perhaps it is only really problematical in canonical QG. there is a hamiltonian constraint with which kosher observables must commute, and no time observable commutes with it, so no time observable can be kosher.
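schematically the obstruction is just this (standard canonical notation, nothing from the paper):
$$\{O, H\} \approx 0 \ \text{for a kosher (Dirac) observable}\ O, \qquad \{T, H\} = 1 \ \text{for a clock variable}\ T$$
a clock worth the name must be conjugate to the Hamiltonian constraint, so it cannot Poisson-commute with it, and therefore cannot itself be kosher.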

Thiemann grapples with this by actually concocting an ERSATZ CLOCK in the form of a peculiar scalar field----quite an elusive one, I gather.
 
Particularly interesting to me are his comments about dark energy and dark matter (since this has immediate practical consequence in cosmology):

Thiemann said:
Notice by the equation of state the phantom behaves like dust at small scales, ##\rho_{\rm phantom} \to -\epsilon_0/a^3##, and as a negative cosmological constant, ##\rho_{\rm phantom} \to -\alpha##, at large scales. This can be easily compensated by additional positive energy k-essence matter or simply by ordinary (dark?) matter plus an additional cosmological constant term ##\Lambda - \alpha > 0##. In a sense, if we want to explain the observational fact that the FRW equations describe the universe while their mathematical derivation violates the principles of gauge theory, then something like a phantom is needed for deparametrisation and in turn it requires something like k-essence for reasons of total positive matter energy budget. From this point of view, both a phantom and k-essence are a prediction of the mathematical formalism (gauge theory) together with observation (FRW cosmology).

It sounds to me like he's saying that he has not solved the dark matter or dark energy problems, but rather given us another potential reason to need to solve them. He has introduced another dark component (the phantom) and said that, in order for this "phantom" to exist and for GR to work, we must have two other "dark" things (dark matter and dark energy). Interestingly, he makes this sound like a strength of the theory. I suppose it is better than adding a phantom that doesn't require dark matter and dark energy.

Also, it's not clear to me that he has given a need for dark matter, just a need for enough matter to balance the early-time contribution of the phantom. That the matter is dark would seem to remain a mystery in this theory.
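Just to make that balancing act concrete, here is a toy version built only from the quoted equation of state (all parameter values, and the names eps, alpha, eps_m, lam, are invented for illustration, not taken from the paper):

```python
# Toy energy budget: a phantom with rho_ph(a) = -eps/a**3 - alpha behaves
# like negative dust at small scale factor a and like a negative cosmological
# constant at large a, matching the quoted equation of state. It is
# compensated by ordinary matter eps_m/a**3 and a positive constant lam.
# All numbers are illustrative, not values from Thiemann's paper.

def rho_phantom(a, eps=1.0, alpha=0.5):
    """Phantom density: dust-like negative term plus negative constant."""
    return -eps / a**3 - alpha

def rho_total(a, eps=1.0, alpha=0.5, eps_m=2.0, lam=0.7):
    """Phantom plus compensating matter (eps_m > eps) and constant (lam > alpha)."""
    return rho_phantom(a, eps, alpha) + eps_m / a**3 + lam

# The total stays positive at every scale factor precisely because
# eps_m > eps handles the small-a (dust) regime and lam > alpha the large-a one:
for a in (0.01, 0.1, 1.0, 10.0, 100.0):
    assert rho_total(a) > 0.0
```

The point of the toy: the compensating matter only needs to outweigh the phantom's dust-like term; nothing here forces that matter to be dark.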

There might be one positive thing, however. I didn't read the paper in much detail, so perhaps it was discussed, but it seems like this could address the fine-tuning problem. If the phantom scalar field has a negative energy density, then the vacuum could have a positive energy density much larger than the observed net energy density of the cosmological constant, with the phantom absorbing the difference. In other words, a Planck-scale cutoff for QFTs would then be possible.
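For what it's worth, the back-of-envelope numbers behind that remark (standard order-of-magnitude estimates, nothing from the paper):

```python
# Rough fine-tuning arithmetic: naive Planck-cutoff vacuum energy density
# versus the observed dark-energy density. All numbers are standard
# order-of-magnitude estimates, not values from Thiemann's paper.
E_planck = 1.96e9            # Planck energy, joules
l_planck = 1.62e-35          # Planck length, metres
rho_vacuum_qft = E_planck / l_planck**3   # naive QFT cutoff estimate, J/m^3
rho_lambda_obs = 6e-10                    # observed dark-energy density, J/m^3

mismatch = rho_vacuum_qft / rho_lambda_obs    # the famous ~120-orders-of-magnitude gap

# The suggestion above: a negative phantom density could absorb nearly all
# of rho_vacuum_qft, leaving only the small observed remainder.
rho_phantom_needed = rho_lambda_obs - rho_vacuum_qft    # huge and negative
```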
 
Thiemann said:
In particular, our concrete model implies that the universe should recollapse at late times on purely classical grounds.
IMHO this claim is unsupported. In the paper he concedes that the big-rip solution is a possible one and he discards it because it is "clearly undesirable" due to the infinite scale factor. I cannot follow this.
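That said, the classical mechanism behind a recollapse is easy to illustrate with a toy model (parameters invented, not from the paper): if the effective density picks up a negative-constant tail, ##\rho(a) = \rho_m/a^3 - \alpha##, then ##H^2 \propto \rho## forces the expansion to stop where ##\rho## reaches zero.

```python
# Toy recollapse: effective density rho(a) = rho_m/a**3 - alpha
# (matter plus a negative cosmological-constant tail). Since the Friedmann
# equation gives H^2 proportional to rho, expansion must halt at the scale
# factor where rho vanishes -- a classical turnaround, hence recollapse.
# rho_m and alpha are illustrative numbers, not values from the paper.

def rho_eff(a, rho_m=8.0, alpha=1.0):
    """Effective energy density as a function of scale factor a."""
    return rho_m / a**3 - alpha

def turnaround(rho_m=8.0, alpha=1.0):
    """Scale factor where rho_eff (and hence H^2) drops to zero."""
    return (rho_m / alpha) ** (1.0 / 3.0)

a_max = turnaround()                  # 2.0 for these toy numbers
assert abs(rho_eff(a_max)) < 1e-12    # expansion stops here; recollapse follows
```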

By the way, the paper is very difficult. It would be great if someone could explain the meaning of this fourth "main message":

the usual interpretation of the cosmological framework, although fundamentally wrong because gauge transformations of gauge dependent objects are interpreted as actual physical evolution equations of observables, remains valid when analysed in the correct way, that is, by computing the physical evolution of gauge invariant observables
 