Introduction To Loop Quantum Gravity

  • #61
somebody might wish to ask how Loll-style "Triangulations" gravity implements background independence and diffeo invariance (or reasonable substitute, since it doesn't have any diffeos)
 
  • #62
Introduction to "Triangulations" quantum gravity

the triangulations QG approach of Loll and coworkers looks like the most interesting, and perhaps most promising, development being pursued by the people participating in this year's "Loops 05" conference.

It is one of the broadly defined "Loop-and-allied" approaches that Loop people pursue----not narrowly defined core LQG. there are a bunch of approaches that deal with similar stuff but differ in details.

this thread can serve a useful purpose as an INTRODUCTION to more than just one of the Loop-and-allied approaches. Probably the most timely to consider at the moment is CDT-style Triangulations.

As a point of departure here is how the abstract of a recent landmark CDT paper starts off:

"We provide detailed evidence for the claim that nonperturbative quantum gravity, defined through state sums of causal triangulated geometries, possesses a large-scale limit in which the dimension of spacetime is four and the dynamics of the volume of the universe behaves semiclassically. This is a first step in reconstructing the universe from a dynamical principle at the Planck scale,..."

this is from
http://arxiv.org/abs/hep-th/0505154
Reconstructing the Universe
J. Ambjorn (NBI Copenhagen and U. Utrecht), J. Jurkiewicz (U. Krakow), R. Loll (U. Utrecht)
52 pages, 20 figures

here is a short reading list
https://www.physicsforums.com/showpost.php?p=585294&postcount=59

Now what I want to do is describe the CDT "Triangulations" method as simply as I can.
 
  • #63
First there is a kind of bird's-eye view, illustrated by a talk that Renate Loll gave in 2002 called
"Quantum gravity IS counting geometries"

It is a possible approach----sometimes called "state sum" or "path integral".
Its roots go back to the Feynman path integral for a particle, where you add up all the possible (approximately piecewise straight) paths the particle might take to get from here to there----with complex weights, to make a kind of weighted average. It is a way to get probability amplitudes and calculate things about the quantum path the particle takes. The typical quantum path turns out to be very ROUGH----nowhere even differentiable----but intuitively it blurs or fuzzes out to resemble the smooth, continuously differentiable classical path you would expect from freshman calculus.

In the "state sum" approach to a particle's path, the calculation in effect counts lots of different (piecewise straight) paths, adding them all together with a system of weights that embodies the microscopic dynamics, to get answers. It has been a very successful method. the CDT authors found out how to apply it to spacetimes.
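to make the path-sum idea concrete, here is a tiny numerical toy (my own illustration, not the authors' code): a 1D free particle with the Euclideanized action S = sum dx^2/(2 dt), where uniformly random piecewise-linear paths get real weights exp(-S). The weighted average over paths is the "state sum" idea in miniature:

```python
import numpy as np

rng = np.random.default_rng(0)
N_STEPS, N_PATHS, DT = 16, 100_000, 1.0

# many random piecewise-linear paths starting at x = 0 (steps drawn uniformly)
dx = rng.uniform(-2.0, 2.0, size=(N_PATHS, N_STEPS))

# Euclidean action of a free particle (m = hbar = 1): S = sum dx^2 / (2 dt)
S = np.sum(dx**2, axis=1) / (2.0 * DT)
w = np.exp(-S)                      # each path handicapped by its "badness"

# state-sum (weighted) average of the squared endpoint position
x_end = np.sum(dx, axis=1)
x2 = np.sum(w * x_end**2) / np.sum(w)
print(f"weighted <x_end^2> = {x2:.2f}")   # grows roughly linearly with elapsed time
```

the weighted spread of the endpoint comes out diffusive, which is the "rough quantum path" behavior described above. all the step sizes and path counts here are arbitrary choices for illustration.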

A spacetime is like a path, from space being this way to space having evolved to be that way, or more grandly from the beginning of a universe to its end. In QUANTUM gravity, that is in quantum spacetime dynamics, one is not certain exactly which path it took. One only has amplitudes of various ways of evolving from this shape to that shape. It is very much analogous to the particle path. You can even think of the universe wandering around in the space of all geometries and its evolution an actual path, but I can see no compelling reason to think so abstractly as that about it.

to put it simply, the CDT authors found a way to approximate (by piecewise flat geometries, made of flat Minkowski building blocks) all the possible spacetimes that get you from here to there, or from the beginning to the end. And they found a way to compute experimental answers from the STATE SUM of all these geometries.

After that it almost seems obvious and really straightforward. They can generate random spacetimes, random histories of the universe, as 4D worlds living in the computer memory, and they can HAVE LITTLE IMAGINARY MEN RUN AROUND IN THEM TO EXPLORE THEM, by taking random walks----a so-called diffusion process----which is a way of finding out about the geometry, like what dimension it really is in there.

then after studying each random example they can add everything up with the usual weights. (there are actually two sets of weights connected by the Wick trick: one set is simply real numbers, like ordinary probabilities, and the other set is complex amplitude-type numbers, but this doesn't matter to the overall picture.) So they add everything up in a weighted average and get the state sum report (from the little imaginary men) on how it is in there.

Now having done this, Loll and co-workers are catching results like the fish are running. They are just pulling them in hand over hand: throw in the line and hook one every time.
This is a big change from the Nineties, when many people worked the state sum triangulations approach but didn't catch anything edible----everything they got came out the wrong dimension.

so this is part of an overview.

what I have to EXPLAIN is how they set up one of these layered triangulated geometries----and how they then shuffle the cards so as to get a series of random geometries. this is the nuts-and-bolts part.

a 4-simplex is the 4D analog of a triangle, and they build these approximate piecewise flat geometries out of two TYPES of 4-simplices, the
"level"-kind and the "tilt"-kind

they call them the (4,1) kind and the (3,2) kind. the labels record how the vertices are distributed between two causal layers

I have to balance giving an overview with giving some introductory nuts-and-bolts.
 
  • #64
Here is some more overview. It elucidates the "state sum" idea of adding up all possible geometries, and the essential business of reducing the calculation to COUNTING.


In this case what we have is something from an American Physical Society publication

http://focus.aps.org/story/v14/st13

The American Physical Society sponsors the major peer-reviewed journal series Physical Review and Physical Review Letters. And they pick out articles for journalistic highlighting in the accompanying publication Physical Review Focus.

This is from Adrian Cho's Focus article on a paper by Loll and co-workers.

<<The researchers added up all the possible spacetimes to see if something like a large-scale four-dimensional spacetime would emerge from the sum. That was not guaranteed, even though the tiny bits of spacetime were four-dimensional. On larger scales the spacetime could curve in ways that would effectively change its dimension, just as a two-dimensional sheet of paper can be wadded into a three-dimensional ball or rolled into a nearly one-dimensional tube. This time the researchers found that they could achieve something that appeared to have one time dimension and three space dimensions--like the universe we know and love.

"It's exceedingly important" work, says Lee Smolin of the Perimeter Institute for Theoretical Physics in Waterloo, Canada. "Now at least we know one way to do this." Des Johnston of Heriot-Watt University in Edinburgh, Scotland, agrees the work is "very exciting" and says it underlines the importance of causality. "The other neat thing about this work is that you're essentially reducing general relativity to a counting problem," Johnston says. "It's a very minimalist approach to looking at gravity.">>
 
  • #65

the best source on the basics is http://arxiv.org/abs/hep-th/0105267

we have to COUNT THE CAUSAL GEOMETRIES of spacetime. it sounds terribly hard, but it isn't, and they managed to program it. it's the basic job we can't get around

causal means LAYERED, each model of spacetime gets laid down in sheets or slices, like a book with pages or a tree-trunk with rings, an event in one layer can only be caused by something from a deeper layer----or think of it like a many-storied building.

so we have to BUILD ALL POSSIBLE LAYERED SPACETIME GEOMETRIES in such a way that we can COUNT THEM or anyway explore to find what are the most numerous kind or the most likely kind, or somehow average them.

Maybe in the end we won't be able to count them exactly but we will have statistics and averages and random samples about them just as if we could actually count them. We will take the census of these layered spacetime geometries.

The technique will be to learn how to build layered geometries using "triangular" building blocks cut out of the txyz space of special relativity

these blocks will all be the same size----their spatial edges will be a fixed length 'a' that we will successively make smaller and smaller----and they will be of two kinds, the LEVEL kind and the TILT kind. The level kind is like a pyramid: it has a 3D spatial tetrahedron as its base, on one floor of the building, and its 5th vertex on the floor above or, in the upside-down version, the floor below.

the authors write the level kind as either (4,1) or (1,4), because it has 4 vertices (the 4 vertices of the tetrahedron) on this floor and 1 vertex on the floor above, or vice versa one vertex on this floor and 4 on the floor above

intuitively one layer is all of 3D space, and the spacetime history of the universe is being built 3D layer by 3D layer, so it is like a book except the pages are 3D.

a LEVEL kind of building block has 4 timelike edges going from each of the four corners of its spatial tetrahedron base up to the vertex on the floor above, or else going down to the solitary vertex on the floor below. the other kind of building block is like the LEVEL kind but tilted over, so that one of those timelike edges becomes a ridge lying entirely in the floor above, and instead of sitting on a full tetrahedron base it now sits on only a triangle side of it.

the authors write the TILT kind of building block as either (3,2) or (2,3)
because it has 3 vertices on one floor, which make its spatial triangle base, and 2 vertices on the floor above or below, which make the ridge I mentioned----like the ridge of a roof or the keel of a boat, depending on whether it points up or down.

the TILT kind has 4 spacelike edges (three for the triangle and one for the ridge) and 6 timelike edges, whereas the LEVEL kind had 6 spacelike edges (what you need to make a tetrahedron) and 4 timelike.
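this vertex-and-edge bookkeeping can be checked in a few lines. For a (p,q) block, spacelike edges join vertices on the same floor and timelike edges cross between floors (a small illustrative check, not any of the authors' code):

```python
from math import comb

def edge_counts(p: int, q: int) -> tuple[int, int]:
    """Edges of a (p,q) building block: p vertices on one floor, q on the next.
    Spacelike edges join vertices on the same floor; timelike edges cross floors."""
    spacelike = comb(p, 2) + comb(q, 2)
    timelike = p * q
    return spacelike, timelike

print("LEVEL (4,1):", edge_counts(4, 1))   # (6, 4): 6 spacelike, 4 timelike
print("TILT  (3,2):", edge_counts(3, 2))   # (4, 6): 4 spacelike, 6 timelike
```

either way the total is 10 edges, as it must be for any 4-simplex (5 vertices, every pair joined).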

the quickest way to understand this business is to follow through the analogous 3D case which is spelled out in
http://arxiv.org/abs/hep-th/0105267

there, the building blocks are tetrahedrons---spatial layers are intuitively 2D, like the pages of a book---everything is easy to imagine, and they have a lot of drawings

but I am trying to discuss modeling 4D spacetime geometry without first going thru the 3D case.
 
  • #66
after that it is not too hard to say, in general terms, how the method works

you want to get the EFFECT of building a layered geometry with, say, a half-million identical LEVEL kind building blocks
and however many you need (which will also be about a half-million) TILT kind blocks to fill in.

because when you try to build layers with the LEVEL kind, it always turns out that you get gaps which are just right to fit the other kind into; so to build up layer by layer you need approximately the same number of the other kind.

NOW YOU DON'T ACTUALLY BUILD EVERY POSSIBLE LAYERED SPACETIME GEOMETRY with these million virtually identical blocks

it is like taking an opinion poll where you don't talk to everybody, you take a random sample.

you want the EFFECT of having built all of them, and studied each one, and counted and made statistics about how they all are. you don't want to actually do it. you want the effect as if you did it.

this is where "shuffling the deck of cards" comes in. the CDT authors call it "thermalizing" the geometry. you set up a very simple plain geometry to start with, in computer memory, and then you do RANDOMIZING PASSES thru it, until it gradually becomes totally unrecognizable.


like, have a look at Figures 4,5 and 6 in "Reconstructing the Universe"
http://arxiv.org/abs/hep-th/0505154
they are all three quite different-looking but they all come from starting with a simple initial geometry and doing randomizing passes.

the authors call each pass "making a sweep", and each sweep involves doing a million or so "Monte Carlo moves" which are individual shuffles that change some of the building blocks around.

they use a lot of computer time. thermalizing (thoroughly randomizing) a geometry can take a week on a workstation. then you study it and measure things

when you have a random geometry you can run random walks in it, or a diffusion process, and you can measure distances and volumes and see how they relate...
 
  • #67
here is an example.
say you have built a layered spacetime geometry in your computer and you pick just one spatial layer and you want to explore that by a random walk.

well a spacelike slice is just made of the tetrahedrons which were the bases of the LEVEL buildingblocks!
So you have a set of things in your computer which are little 4-face pyramids (equilateral triangle bottom, so 3 side faces and a bottom face, except there is no way to distinguish the bottom from the other faces). And this bunch of tetrahedrons is fitted together in some way so that every face of one is up against the face of some other.

So you can pick a random block to start in, and then TOSS A FOURSIDED COIN to select which face to go out of

and when you go out one face you are now in a new tetrahedron and you can toss the foursided coin to choose which face to pass thru, and again and again.

it will be a clue to the actual dimensionality of the spatial slice to see if you get completely lost by doing this random walk, or if you now and then get back home to where you started. the authors determine the probabilities EMPIRICALLY by actually running the random walks in the computer, and this tells them about the dimensionality of the spatial slices
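here is a runnable version of the four-sided-coin walk, on the simplest closed triangulation I can think of: the boundary of a single 4-simplex, which triangulates the 3-sphere with five tetrahedra, each glued face-to-face to the other four (my choice of toy geometry for illustration, not one of the authors' thermalized spacetimes). We track the empirical probability of being back home at each step:

```python
import numpy as np

rng = np.random.default_rng(1)

# Boundary of one 4-simplex: five tetrahedra, each adjacent to the other four,
# so the face-adjacency (dual) graph is the complete graph K5.
nbr = np.array([[j for j in range(5) if j != i] for i in range(5)])

N_WALKS, T_MAX = 100_000, 8
pos = np.zeros(N_WALKS, dtype=int)   # every walker starts in tetrahedron 0
returns = [1.0]                      # p(0) = 1: everyone is "home" at t = 0
for t in range(T_MAX):
    # "toss a four-sided coin": leave through one of the 4 faces at random
    pick = rng.integers(0, 4, size=N_WALKS)
    pos = nbr[pos, pick]
    returns.append(float(np.mean(pos == 0)))

print("empirical return probability p(t):", [round(p, 3) for p in returns])
```

on this tiny geometry the return probability quickly settles near 1/5 (each walker equally likely to be in any of the five tetrahedra); on a big thermalized geometry, the rate at which p(t) falls off with t is exactly the kind of dimension probe described above.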

the nice thing is the answers gotten this way are weird and quite Alice-in-Wonderland. at microscopic level the continuum (as pictured by CDT) is a non-classical, unexpected world which Lewis Carroll would have loved.
 
  • #68
the Einstein Hilbert action

in the path integral or state sum approach you make a WEIGHTED sum using a badness or handicap function S(path) which is large for really kooky unphysical paths

this is how you introduce the classical dynamics into the quantum picture.

Feynman has an essay about the "least action principle" in his Lectures; it is one of the core things in the Feynman physics textbook. It is terribly important, and he suddenly takes a serious tone of voice when he gets to it.

the way you do classical physics is you consider all the possible paths and you DIFFERENTIATE S(path) and set the derivative to zero so as to pick the one and only path that MINIMIZES S(path)-------you pick the unique path that minimizes the "action", which is a word for the badness or silliness of paths.

we have inherited our "action" functions from great old classical guys like Lagrange and Einstein and Hamilton, directly or indirectly, whom we revere, and they have the feature that minimizing them gets you the expected classical equations of motion

the way you do quantum physics is when you consider all possible paths you don't try to pick the unique winner, you ADD THEM ALL TOGETHER, but you don't do this in a completely indiscriminate way! You handicap each one by putting a little real or complex number "weight" on it. this is incidentally how they used to handicap racehorses, with a little weight, but you can do it at a different level with the betting odds too.

one weight you might consider putting on is exp( - S(path))
that is:
"eee to the minus badness"

e-badness

so if the badness is large it makes the weight exponentially small, and then the path tagged with that weight will not count for very much in the sum or weighted average of all paths

YOU DO NOT JUST PICK ONE HORSE THAT IS YOUR FAVORITE, you add together all the horses, but you weight each one so your favorites count for more and the bad ones count for less-------you get a "composite" horse.

another kind of weight you might consider putting on is exp(iS(path))

"eee to the eye badness"

ei badness

As you may know from elementary complex numbers "eee to the eye theta" is complex numbers going around and around the unit radius circle.
and this is very clever because if you go around the circle, around zero, very fast it will average out to ZERO ITSELF by simple vector addition.

taking a step N and S and E and W adds up to going nowhere

so if you are averaging things with rapidly increasing badness and tagging them with "eee to the eye badness" numbers then these things with lots of badness will CANCEL EACH OTHER OUT in the sum and not have much influence on the sum

this is the two kinds of handicaps, the real number weights and the complex number weights from around the unit radius circle.
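here is a little numerical toy (illustrative only; the badness values and cutoffs are arbitrary choices of mine) showing both handicapping schemes at work on the same rapidly increasing badness values. The real weights make the high-badness terms tiny; the complex weights make them cancel each other out:

```python
import numpy as np

S = np.linspace(0.0, 40.0, 2001)      # "badness" values, rapidly increasing

real_w = np.exp(-S)                    # "eee to the minus badness": real weights
complex_w = np.exp(1j * S)             # "eee to the eye badness": unit-circle phases

# look at the bad (large-action) part of the sum
tail = S > 10.0
n_tail = int(np.sum(tail))
print("tail terms:", n_tail)
print("|sum of real weights on tail|   :", abs(np.sum(real_w[tail])))
print("|sum of complex weights on tail|:", abs(np.sum(complex_w[tail])))
```

the real-weight tail sum is exponentially tiny, and the complex-weight tail sum is a small fraction of what the same number of terms would give if they all pointed the same way----suppression by smallness in one case, by cancellation in the other.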

YOU CAN GET FROM ONE SET OF WEIGHTS TO THE OTHER SET OF WEIGHTS by the simple expedient of changing the eye into a minus sign, or the minus into an eye. this is called the WICK ROTATION, in honor of Gian-Carlo Wick, born in Torino in 1909.
 
  • #69
the Einstein Hilbert action and the Wick rotation

the Einstein action measures, roughly speaking, how much some spacetime is off from being a well-behaved classical solution of the classical Einstein equation of General Relativity. So if you minimize the Einstein action you get the classical equation back.
so it measures how much the "path" is screwing up and getting distracted from its studies and cutting classes and taking dope and all the things it isn't supposed to be doing---how "busy" it is with messing up---that "busy-ness" is the action. believe me, you want to cut down on it.

a spacetime is just a path from the beginning of the world to the end, a path in "geometry space" if you can picture the space of all geometries.

the quantum idea is the universe doesn't just follow one path, it is a fuzzy mixture, well that is a rather distracting idea so let's not get into that.

the extremely beautiful thing is that with simplex geometries, with geometries built of triangles, YOU CAN IMPLEMENT THE ACTION FUNCTION JUST BY COUNTING DIFFERENT KINDS OF TRIANGLES

so even a computer, merely able to count up things in its memory, can do it
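schematically it looks like this (a sketch only: the coefficients below are placeholders, not the real couplings, and the exact combination of counts is spelled out in hep-th/0505154----the point is just that the action depends on nothing but a few global tallies):

```python
# Schematic sketch: the Euclideanized CDT action reduces to a linear function
# of global counts. Placeholder couplings, NOT the values from hep-th/0505154.
def cdt_action(n0, n41, n32, kappa0=2.2, kappa4=0.9, delta=0.5):
    """n0 = number of vertices, n41 = (4,1)-type simplices, n32 = (3,2)-type."""
    return (-(kappa0 + 6 * delta) * n0
            + kappa4 * (n41 + n32)
            + delta * (2 * n41 + n32))

# the computer never integrates curvature; it just tallies three integers
print(cdt_action(n0=1000, n41=4000, n32=4200))
```

so "implementing the dynamics" really does come down to counting vertices and the two kinds of 4-simplices.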

so we get back to our story where Loll and friends are running a computer model of spacetime, and the model is doing "sweeps" consisting of a million or so "Monte Carlo moves" which are localized elementary rearrangements of the simplex building blocks

each time they roll the dice and pick a Monte Carlo move at random, they calculate some "badness" or "action" numbers to see whether to ACCEPT OR REJECT the proposed move!

this is how the localized microscopic "DYNAMICAL PRINCIPLE" that Loll talks about enters into the picture

it is this action principle operating at a microscopic Planckian or even maybe sub-Planckian level that the overall spacetime grows from. However it looks, whether it has 4 dimensions or 3 dimensions or some fraction etc, whatever its geometry, it grows out of many many local applications of the action principle at micro-scale.
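the accept/reject step can be sketched in a few lines (a minimal illustration with a made-up scalar action, standing in for the counting-based action; the real algorithm also balances move-type probabilities and rejects moves that break the manifold property):

```python
import math
import random

random.seed(42)

def metropolis_accept(delta_S: float) -> bool:
    """Metropolis rule with real (Wick-rotated) weights e^{-S}: always take a
    move that lowers the action; take a raising move with probability e^{-dS}."""
    return delta_S <= 0 or random.random() < math.exp(-delta_S)

# toy "sweep": propose many local moves against a fictitious action value
S, accepted, N_MOVES = 10.0, 0, 100_000
for _ in range(N_MOVES):
    dS = random.gauss(0.0, 1.0)   # stand-in for the action change of one move
    if metropolis_accept(dS):
        S += dS
        accepted += 1
print(f"acceptance rate: {accepted / N_MOVES:.2f}")
```

notice that badness-increasing moves do get accepted quite often----the dynamics only tilts the odds, it doesn't forbid them.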

I will try to find a quote.
 
  • #70
Yeah, this is going to seem very dry and overdetailed but it shows how the quantum spacetime dynamics, the path integral action principle, was implemented:

<<3. Numerical implementation

We have investigated the infinite-volume limit of the ensemble of causal triangulated four-dimensional geometries with the help of Monte Carlo simulations at finite four-volumes N4 = N(4,1) + N(3,2) of up to 362,000 four-simplices. A simplicial geometry is stored in the computer as a set of lists, where the lists consist of dynamic sequences of labels for simplices of dimension n from zero to four, together with their position and orientation with respect to the time direction. Additional list data include information about nearest neighbours, i.e. how the triangulation “hangs together”, and other discrete data (for example, how many four-simplices meet at a given edge) which help improve the acceptance rate of Monte Carlo moves. The simulation is set up to generate a random walk in the ensemble of causal geometries of a fixed time extension t. The local updating algorithm consists of a set of moves that change the geometry of the simplicial manifold locally, without altering its topological properties. These can be understood as a Lorentzian variant of (a simplified version of) the so-called Alexander moves [21, 22, 23], in the sense that they are compatible with the discrete time slicing of our causal geometries. For example, the subdivision of a four-simplex into five four-simplices by placing a new vertex at its centre is not allowed, because vertices can only be located at integer times tau . Details of the local moves can be found in [8]. As usual, each suggested local change of triangulation is accepted or rejected according to certain probabilities depending on the change in the action and the local geometry. (Note that a move will always be rejected if the resulting triangulation violates the simplicial manifold property.) The moves are called in random order, with probabilities chosen in such a way as to ensure that the numbers of actually performed moves of each type are approximately equal. 
We attained a rather high average acceptance rate of about 12.5%, which was made possible by keeping ...>>

By the way, Alexander wrote his book in 1930. that is how far these "Monte Carlo moves" go back. they are just modified Alexander moves. Pachner is also cited. So this is how they "shuffle the deck".

Now I can say what part the Wick rotation plays. The Wick rotation changes the complex weights into real weights which can be dealt with as PROBABILITIES in this process of choosing the next random move, in "shuffling the deck" or randomizing spacetime geometry by Monte Carlo moves.

the probabilities enter each time you do a local rearrangement of some building blocks: you check whether that local microscopic rearrangement would be favored or disfavored by the Einstein equation, and you do that by comparing badness. and it is still random----there is always a chance that you can do a move that increases the badness, that happens a lot in fact----but the probabilities are weighted against it (the House of general relativity wins over the long run). well, maybe that is too impressionistic an impression.

I promised in Quantum Graffiti thread to say something about Wick rotation and Einstein Hilbert action
 
  • #71
marcus said:
YOU CAN GET FROM ONE SET OF WEIGHTS TO THE OTHER SET OF WEIGHTS by the simple expedient of changing the eye into a minus sign, or the minus into an eye. this is called the WICK ROTATION, in honor of Gian-Carlo Wick, born in Torino in 1909.
Is there a general theorem available for that purpose? I believe to remember that the mathematics behind that is usually quite non-obvious.
 
  • #72
It's a procedure in complex variables, called analytic continuation.
 
  • #73
selfAdjoint said:
It's a procedure in complex variables, called analytic continuation.
So Wick rotation = analytic continuation?
 
  • #74
Cinquero said:
So Wick rotation = analytic continuation?

You have the integrals defined on the real axis, corresponding to Minkowski space, but they don't converge there, because they have factors like e^{ut}, which is unbounded as t goes to infinity. They are however analytic in the half plane above the real axis, and by continuation therefore on the imaginary axis, which corresponds to i\tau = t, or euclidean four space. Then the integrals converge because the factors now read e^{ui\tau}, which is bounded for all tau. Then after you evaluate the integrals (they mostly reduce to a gaussian quadrature) you can rotate back.
 
  • #75
selfAdjoint said:
You have the integrals defined on the real axis, corresponding to Minkowski space, but they don't converge there, because they have factors like e^{ut}, which is unbounded as t goes to infinity. They are however analytic in the half plane above the real axis, and by continuation therefore on the imaginary axis, which corresponds to i\tau = t, or euclidean four space. Then the integrals converge because the factors now read e^{ui\tau}, which is bounded for all tau. Then after you evaluate the integrals (they mostly reduce to a gaussian quadrature) you can rotate back.

You've got that exactly reversed. It's the imaginary exponentials that fail to converge and are converted by Wick rotation. Another way of saying the same thing is that a Wick rotation takes a QFT (which has an i\hbar in the exponential) to statistical mechanics (which has a -k/T in the exponential). Your note is essentially saying that +k/T diverges, and this is true, but the solution in a Wick rotation is to rotate in the opposite direction. That way you end up with exponentials that converge. This all reminds me of the method of "steepest descent"[sp] that is used in Schroedinger's equation.
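A quick one-dimensional numerical check of the continuation point being discussed (my illustration, with arbitrary grid and damping choices): the oscillatory integral of e^{ix^2} over the real line is only conditionally convergent, but with a small damping eps it converges, and as eps -> 0 it approaches the analytically continued (rotated) Gaussian value sqrt(pi) e^{i pi/4}:

```python
import numpy as np

x = np.linspace(-100.0, 100.0, 2_000_001)
dx = x[1] - x[0]

def damped_integral(eps):
    # trapezoid rule for the damped oscillatory integrand e^{(i - eps) x^2}
    f = np.exp((1j - eps) * x**2)
    return dx * (f.sum() - 0.5 * (f[0] + f[-1]))

for eps in (0.1, 0.01, 0.001):
    print(eps, damped_integral(eps))

# the analytically continued (Wick-rotated) value the limit approaches:
print("sqrt(pi) * e^{i pi/4} =", np.sqrt(np.pi) * np.exp(1j * np.pi / 4))
```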

But that's not why I was reading the thread.

Stephen Hawking's latest paper uses "Euclidean Quantum Gravity":
http://arxiv.org/PS_cache/hep-th/pdf/0507/0507171.pdf

Does EQG have anything to do with LQG? My field is elementary particles, not gravitation. Sorry for the laziness. Hawking references a book I don't have immediate access to.

Carl
 
  • #76
CarlB said:
...
Stephen Hawking's latest paper uses "Euclidean Quantum Gravity":
http://arxiv.org/PS_cache/hep-th/pdf/0507/0507171.pdf

Does EQG have anything to do with LQG? My field is elementary particles, not gravitation. Sorry for the laziness. Hawking references a book I don't have immediate access to.

Carl

"Euclidean QG" developed by Hawking and friends in the 1980s was a path integral AFAIK
and so it is closely akin to the Renate Loll Lorentzian path integral by the CDT method ("causal dynamical triangulations") that we hear a lot about these days

Hawking never got the Euclidean path integral to work, but he uses it to think with. It sounds a bit eccentric for him to call it the "only sane way to do nonperturbative QG"
the Lorentzian path integral people (Loll et al) have an equally nonperturbative approach that they are getting results with, including confirming a conjecture or two of Hawking's. No way is Loll's approach not sane. It is at least as sane as the Euclidean version.

I need to get you some online links. there is a 1998 survey of QG methods by Rovelli which describes Hawking's Euclidean path integral. More recent online material does not discuss Hawking's method very much because it is long obsolete except for him and one or two proteges. But I will get the link to the 1998 survey

Yes, here:
http://arxiv.org/abs/gr-qc/9803024
Strings, loops and others: a critical survey of the present approaches to quantum gravity
Carlo Rovelli
Plenary lecture on quantum gravity at the GR15 conference, Pune, India

"I review the present theoretical attempts to understand the quantum properties of spacetime. In particular, I illustrate the main achievements and the main difficulties in: string theory, loop quantum gravity, discrete quantum gravity (Regge calculus, dynamical triangulations and simplicial models), Euclidean quantum gravity, perturbative quantum gravity, quantum field theory on curved spacetime, noncommutative geometry, null surfaces, topological quantum field theories and spin foam models. I also briefly review several recent advances in understanding black hole entropy and attempt a critical discussion of our present understanding of quantum spacetime."
 
  • #77
http://arxiv.org/abs/gr-qc/9803024
Strings, loops and others: a critical survey of the present approaches to quantum gravity
Carlo Rovelli

Section B. "Old hopes (becoming) approximate theories"

---quote Rovelli---
B. Old hopes -> approximate theories

1. Euclidean quantum gravity

Euclidean quantum gravity is the approach based on a formal sum over Euclidean geometries [[my comment: HERE ROVELLI GIVES THE PATH INTEGRAL, BUT I CAN'T COPY IT EASILY, it is labelled equation (6)]] As far as I understand, Hawking and his close collaborators do not anymore view this approach as an attempt to directly define a fundamental theory. The integral is badly ill defined, and does not lead to any known viable perturbation expansion. However, the main ideas of this approach are still alive in several ways. First, Hawking’s picture of quantum gravity as a sum over spacetimes continues to provide a powerful intuitive reference point for most of the research related to quantum gravity. Indeed, many approaches can be seen as attempts to replace the ill defined and non-renormalizable formal integral (6) with a well defined expression. The dynamical triangulation approach (Section IVA) and the spin foam approach (Section VC2) are examples of attempts to realize Hawking’s intuition. Influence of Euclidean quantum gravity can also be found in the Atiyah axioms for TQFT (Section VC1). Second, this approach can be used as an approximate method for describing certain regimes of nonperturbative quantum spacetime physics, even if the fundamental dynamics is given by a more complete theory. In this spirit, Hawking and collaborators have continued the investigation of phenomena such as, for instance, pair creation of black holes in a background de Sitter spacetime. Hawking and Bousso, for example, have recently studied the evaporation and “anti-evaporation” of Schwarzschild-de Sitter black holes [61]...
---end quote---

Equation (6) here looks very much like Loll's dynamical triangulations path integral, but they start with exp(iS) where S is the Regge form of the Einstein action.
Loll et al do a Wick rotation to get a euclidean version which gets used in the computer calculations.
This equation (6) is still very much like what Loll CDT starts with, but instead of a metric [g] there is a TRIANGULATION T. so they are summing over all triangulations of a particular kind. Otherwise it looks formally the same.

However there is a practical difference in that Loll et al can actually calculate. they do the sum (using the Monte Carlo method) and get results.
some of these results have borne out Hawking conjectures, so they cite him a lot.

but as for his particular type of (Euclidean) path integral, I don't think any significant effort is being made to use it.

to compare hawking EQG with current CDT path integral, have a look at the first 2 or 3 pages of these two papers
http://arxiv.org/abs/hep-th/0105267
http://arxiv.org/abs/hep-th/0505154

you will see how close the CDT path integral is to Hawking's euclidean one.
 
  • #78
in the past couple of pages of this thread we have been responding to questions from CarlB and cinquero and it may be time to regroup. I decided earlier that unless there is some reason not to do so we ought to make this thread serve as an introduction NOT ONLY to narrowly defined canonical LQG but to the main approaches to NONPERTURBATIVE QUANTUM GRAVITY.

That includes canonical LQG but also spin foams, and other path integral approaches like CDT. selfAdjoint, at one point, proposed the term "Background Independent Quantum Gravity" for the general field. Renate Loll seems to favor "Nonperturbative QG". The organizers of the Loop 05 conference use the collective modifier
"background independent/nonperturbative"
And Lee Smolin has started to say "relational".

But I think "nonperturbative" is going to win out as the mainest of mainstream terms. As sideline observers we can't reform language, just have to go with the prevailing talk.

I think one of the ambient ideas in the Loop 05 conference is that if you can forge a concept "NQG" and impress on people's minds the idea that there is research in "nonperturbative quantum gravity" then maybe a few more universities will establish professorships in NQG or faculty positions of some kind in NQG. It will be perceived as a lack not to have some research in nonperturbative QG being conducted in the physics department.

It also means recommending each other's graduate students. if it is a field then there is more solidarity than if it is just a bunch of splinter group research lines.

Hermann Nicolai definitely would like some professorships in German universities that are an echo or counterpart of his lines of research at AEI; he has talked about that in a Die Zeit interview. And AEI is hosting Loop '05.

so it is time to assemble into a research field with an identifying label which is not String, and to get it recognized that a physics department has an embarrassing GAP if it doesn't have some research under way in Nonper Quavity.
 
  • #79
Let's recap the introduction to the triangulations approach---Loll CDT.
Here is a reading list from earlier in this thread
https://www.physicsforums.com/showthread.php?p=585294#post585294

Here's a short popularization by Loll, at her website, written for general audience
http://www.phys.uu.nl/~loll/Web/research/research.html

This PF thread has more stuff like that
https://www.physicsforums.com/showthread.php?t=77639&page=1&pp=15

========================
To give an idea of where the field stands, I am simply going to quote, in its entirety, the first paragraph of each of Loll's three most recent papers, dated May, June, and July 2005. The first paragraph of a research paper often gives a bit of an overview or some perspective on the field. This is a fast-moving field and we have no more recent survey available, so this is one way to keep up with where things are at the moment.


http://arxiv.org/abs/hep-th/0505154
Reconstructing the Universe
http://arxiv.org/abs/gr-qc/0506035
Counting a black hole
http://arxiv.org/abs/hep-th/0507012
Taming the cosmological constant...topology change


Very encouraging progress has been made recently in constructing spacetime dynamically from a nonperturbative gravitational path integral, by studying the continuum limit of causal dynamical triangulations [1, 2, 3, 4]. The quantum geometries generated in this way exhibit semiclassical properties at sufficiently large scales: they are four-dimensional [5, 6] and the large-scale dynamics of their spatial volume is described by an effective cosmological minisuperspace action [7]. Their short-distance behaviour is highly nonclassical, including a smooth dynamical reduction of the spectral dimension from four to two [8] and evidence of fractality [6].

Despite recent progress [1, 2], little is known about the ultimate configuration space of quantum gravity on which its nonperturbative dynamics takes place. This makes it difficult to decide which (auxiliary) configuration space to choose as starting point for a quantization. In the context of a path integral quantization of gravity, the relevant question is which class of geometries one should be integrating over in the first place. Setting aside the formidable difficulties in “doing the integral”, there is a subtle balance between including too many geometries – such that the integral will simply fail to exist (nonperturbatively) in any meaningful way, even after renormalization – and including too few geometries, with the danger of not capturing a physically relevant part of the configuration space.

Nonperturbative quantum gravity can be defined as the quest for uncovering the true dynamical degrees of freedom of spacetime geometry at the very shortest scales. Because of the enormous quantum fluctuations predicted by the uncertainty relations, geometry near the Planck scale will be extremely rugged and nonclassical. Although different approaches to quantizing gravity do not agree on the precise nature of these fundamental excitations, or on how they can be determined, most of the popular formulations agree that they are neither the smooth metrics... (or equivalent classical field variables) of general relativity nor straightforward quantum analogues thereof. In such scenarios, one expects the metric to re-emerge as an appropriate description of spacetime geometry only at larger scales.

I'll try to interpret some of this as time permits, but hopefully it is already fairly clear and doesn't need much explication.
 
  • #80
I had better keep a list of links to the prediction polls that folks at PF have made, so that when the time comes we can easily find the threads with the predictions.

Background independence talks at Strings 06
https://www.physicsforums.com/showthread.php?t=85207
(when the programme of talks is posted, check to see who was right)

August-September hits on Smolin latest
https://www.physicsforums.com/showthread.php?t=83578
(in late September 2005, start checking
http://citebase.eprints.org/cgi-bin/citations?id=oai:arXiv.org:hep-th/0507235
to see if they are counting and registering downloads of "The case for background independence")

String Forecast Poll
https://www.physicsforums.com/showthread.php?t=81739
(around March 2006, check SLAC/Stanford for the 2005 HEP Topcites. This year, around March 2005, they brought out the 2004 Topcites as usual, but they have not yet done the full job with Michael Peskin's review, which is worrisome. The list to check is whatever is analogous to this
http://www.slac.stanford.edu/library/topcites/2004/annual.shtml )

Will Loll etc. achieve sum over topologies in 4D?
https://www.physicsforums.com/showthread.php?t=81626
(this prediction poll has no definite declared cut-off date, which was an oversight. we will have to use reasonableness and see whether, in a reasonable time, Loll et al manage to extend the results on topology change to higher dimensions)
 
  • #81
A major chronological bibliography for LQG
Over a thousand papers (arranged by date) often with arxiv numbers making online access easy
Over forty books and PhD dissertations.
Plus miscellaneous other useful sources of information.

http://www.arxiv.org/abs/gr-qc/0509039
Bibliography of Publications related to Classical Self-dual variables and Loop Quantum Gravity

Alejandro Corichi, Alberto Hauser
45 pages
"This bibliography attempts to give a comprehensive overview of all the literature related to what is known as the Ashtekar-Sen connection and the Rovelli-Smolin loop variables, from which the program currently known as Loop Quantum Gravity emerged..."

Corichi gives some guidance as to his own judgement of what are good introductions, primers, surveys, mathematical treatments.

======================================
Dan Christensen's SpinFoam website at U Western Ontario
is another resource for people wanting to get acquainted with LQG and related QG

http://jdc.math.uwo.ca/spin-foams/

he has links to things sorted by topic, level, and different users' needs and purposes, and he has links to some Greg Egan Java applets. Seeing how he organizes things gives you a practical overview of QG from his perspective.

Dan says he has room for some more grad students and postdocs in his QG/computation program. It looks like anybody who might want to study QG (or massively parallel computation applied to QG) should probably check this out.

----------------------------
EDIT TO REPLY TO CINQUERO
Hi Cinquero, since I can still edit this I will reply this way and save making a new post. Please go to Dan Christensen's site. He has many links in an organized, convenient form. If there is anything that you need a further PDF link for, tell me what it is and I will try to find it. I am not certain I understand your request for links to PDF---was it for links to things found at Dan's UWO page, or for something else?
 
  • #82
Thx!

But could someone please add hyperlinks to the PDF output? :-)))
 
  • #83
Hi Cinquero, I responded to your post #82 by editing the previous post. Hope you saw the note.
At the moment I just need a place to stash the links to the audio of a two-part Bojowald talk, given last Friday and concluded today at Penn State. He is talking about the LQG model of the black hole.

the audio of the first part is here
http://www.phys.psu.edu/events/index.html?event_id=1255;event_type_ids=0;span=2005-08-20.2005-12-25
Loop Quantum Cosmology of the Kantowski-Sachs Model
Gravity Theory Seminar by Martin Bojowald from Albert Einstein Institute (Germany)
Friday at 11:00 AM in 318 Osmond (9/16/2005)


and the second part (which was today) is here
http://www.phys.psu.edu/events/index.html?event_id=1256&event_type_ids=0&span=
Spherically Symmetric Quantum Geometry
Gravity Theory Seminar by Martin Bojowald from Albert Einstein Institute
Friday at 11:00 AM in 318 Osmond (9/23/2005)

at the same page there was also this audio
http://www.phys.psu.edu/events/index.html?event_id=1268;event_type_ids=0;span=
Generalizing Quantum Mechanics for Quantum Gravity
IGPG Seminar by James Hartle from University of California, Santa Barbara
Monday at 3:00 PM in 318 Osmond (9/19/2005)

and this audio as well
http://www.phys.psu.edu/events/index.html?event_id=1260;event_type_ids=0;span=2005-08-20.2005-12-25
Quantum Nature of the Big-Bang: Numerical Issues
Gravity Theory Seminar by Thomas Pawlowski & Parampreet Singh
Friday at 11:00 AM in 318 Osmond (9/9/2005)

Ashtekar has announced that he has a paper, written with Thomas Pawlowski & Parampreet Singh, to appear about this topic: LQG picture of the big bang.
Several of these seminar talks relate to the Ashtekar Bojowald collaboration about LQG of big bang and black hole, see for example their recent paper
http://www.arxiv.org/abs/gr-qc/0509075
Quantum geometry and the Schwarzschild singularity
 
  • #84
Actually, my request for hyperlinks was in regard to:

"Bibliography of Publications related to Classical Self-dual variables and Loop Quantum Gravity"

:)
 
  • #85
Cinquero said:
Actually, my request for hyperlinks was in regard to:

"Bibliography of Publications related to Classical Self-dual variables and Loop Quantum Gravity"

:)

Ah, I see what you mean. Corichi writes the URLs out for online sources, so one could paste them in and get to them, but in the PDF version these URLs do not automatically function as hyperlinks, as they might if he had provided an HTML version. I understand you may be joking, but it wouldn't be a bad idea for Corichi to make an up-to-date selective HTML bibliography of online quantum gravity sources.

If you want to encourage him to do this you could email him. Be sure to mention PF. He--or else a good friend of his--has often visited us, I believe, and supplied helpful information.
 
  • #86
This is an update of post #56, which was about a book edited by Abhay Ashtekar scheduled to be published this year by World Scientific. Here is the publisher's webpage:

http://www.worldscibooks.com/physics/5876.html

A Hundred Years of Relativity.

Several chapters of this book are already online as preprints:

Martin Bojowald
http://arxiv.org/abs/gr-qc/0505057
Elements of Loop Quantum Cosmology

Larry Ford
http://arxiv.org/abs/gr-qc/0504096

Rodolfo Gambini and Jorge Pullin
http://arxiv.org/abs/gr-qc/0505023
Discrete space-time

Hermann Nicolai
http://www.arxiv.org/abs/gr-qc/0506031
Gravitational Billiards, Dualities and Hidden Symmetries

Thanu Padmanabhan
http://arxiv.org/abs/gr-qc/0503107
Understanding Our Universe: Current Status and Open Issues

Alan Rendall
http://arxiv.org/abs/gr-qc/0503112

Clifford Will
http://arxiv.org/abs/gr-qc/0504086
Was Einstein Right? Testing Relativity at the Centenary
========
other stuff:
By Ashtekar there are several useful surveys, such as
http://arxiv.org/abs/gr-qc/0410054
Gravity and the Quantum
http://arxiv.org/abs/gr-qc/0404018
Background Independent Quantum Gravity: A Status Report
 
  • #87
Agons, moments of truth.
For some reason I keep thinking back to the times Smolin spoke up at the Toronto string panel discussion, and how each time, immediately afterwards, he was put down by you-know-who.

https://www.physicsforums.com/showthread.php?t=84585

http://www.fields.utoronto.ca/programs/scientific/04-05/string-theory/strings2005/panel.html

and I remember Atiyah at Santa Barbara, interrupted and almost derailed as he tried to get across his "old man's crazy thoughts".

https://www.physicsforums.com/showthread.php?t=96806

http://online.kitp.ucsb.edu/online/strings05/atiyah/

and as if to compensate there is Gerard 't Hooft's response after listening to some strange and quite possibly wrong ideas from Atiyah: "That sounds like physics!"
 
  • #88
There may be hints of slow shift in research interest from string to non-string approaches to quantum gravity. The latter include Causal Dynamical Triangulations (CDT), spinfoams, Loop Quantum Gravity (LQG) and others on the Loops '05 conference programme.
this shift in research activity, if it exists, is hard to verify and measure statistically. here is one indicator---by itself not conclusive but something to watch along with the rest.
Last month at selfAdjoint's suggestion I included "heterotic, superstring" in the list of keywords and did a search using the Harvard ADS abstract service engine.
https://www.physicsforums.com/showthread.php?p=789185#post789185
I'll be glad to try. The main thing is just to have a fixed set of keywords you can apply year by year, to get the trend in papers with those keywords.
In accordance with your suggestion, I checked how many papers were published each year (October thru September) with the words "brane" or "M-theory" or "AdS/CFT" or "superstring" or "heterotic" in the abstract.
Code:
2001   1202
2002   1097
2003    970
2004    959
For continuity I tried the same check today. This is now the papers November thru October, year by year, with any of the same keywords in the abstract.
Code:
2001   1220
2002   1083
2003    972
2004    938

Part of this could certainly have nothing to do with a concurrent increase in QG research output in the non-string lines of investigation. It is very iffy and difficult to link the two trends! But at a level of anecdotal evidence one does encounter cases of people who have switched over.

Since the effort in non-string QG is still small compared with string, this shift (if it is occurring) could be viewed simply as diversification. One could take it NOT AS A SIGN THAT ONE THING IS RIGHT AND ANOTHER WRONG but that for whatever reason people are branching out in more directions, and trying non-string ones.

I will try to get some figures on non-perturbative QG research output trends.
 
  • #89
non-string QG research seems to have increased during the same period.
For a rough indication of this I use the keyword search engine at arxiv.org to find the number of preprints submitted each year with certain terms in the abstract. It catches some papers it shouldn't (that just happen to have the right keywords) and it misses some. Here are the results:

Code:
2001    98
2002   121
2003   140
2004   184
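To put the two tables side by side, here is a quick arithmetic check (my own sketch; the dictionaries just transcribe the counts quoted in this post and in post #88 above):

```python
# Paper counts from the keyword searches quoted above: string-related
# keywords (Nov-Oct years, post #88) vs. the non-string QG search (post #89).
string_counts = {2001: 1220, 2002: 1083, 2003: 972, 2004: 938}
qg_counts = {2001: 98, 2002: 121, 2003: 140, 2004: 184}

def pct_change(counts):
    """Percent change from the earliest to the latest year in the table."""
    first, last = counts[min(counts)], counts[max(counts)]
    return 100.0 * (last - first) / first

string_trend = pct_change(string_counts)  # roughly -23%
qg_trend = pct_change(qg_counts)          # roughly +88%
```

So over 2001-2004 the string-keyword output fell by roughly a quarter while the non-string QG output nearly doubled, though as noted the absolute numbers remain very different in scale.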

In case anyone is interested here are links to these arxiv.org searches, and to some others just to have them handy.

Year 2001:
http://arXiv.org/find/grp_physics/1...m+AND+OR+triply+doubly+special/0/1/0/2001/0/1
Year 2002:
http://arXiv.org/find/grp_physics/1...m+AND+OR+triply+doubly+special/0/1/0/2002/0/1
Year 2003:
http://arXiv.org/find/grp_physics/1...m+AND+OR+triply+doubly+special/0/1/0/2003/0/1
Year 2004:
http://arXiv.org/find/grp_physics/1...m+AND+OR+triply+doubly+special/0/1/0/2004/0/1
Last twelve months:
http://arXiv.org/find/grp_physics/1...m+AND+OR+triply+doubly+special/0/1/0/past/0/1
Year to date, 2005:
http://arXiv.org/find/grp_physics/1...m+AND+OR+triply+doubly+special/0/1/0/2005/0/1

BTW here I have been looking only at the 2001-2004 period. Can we say anything about current trends? Well it may take a while for the 2005 numbers to stabilize and it may be too soon to say anything much. But I think nonperturbative QG research is experiencing LOLL-SHOCK and is in a temporary lull where people are considering re-directing their efforts more in line with Causal Dynamical Triangulations (because of recent seemingly important results).

John Baez has been very frank about this. Last month (October) he presented an overview of Spinfoam in which he pointed out the CDT results and asked whether Spinfoam could be modified (by introducing an analogous causality structure?) to be more like CDT and get similar results---and whether it might then surpass CDT because of an inherent advantage in some other department.
this was the talk he gave at Loops '05 and posted at his website.

There is a lot of new stuff to digest right now. Besides CDT there is Thiemann's single-constraint ("master constraint") program, another approach which is not standard LQG and which looks attractive to some LQG people, and there is Freidel's result that highlights DSR as a possibly necessary feature of QG.

For whatever reason there has been almost no QG appearing on arxiv.org since the 10-14 October conference. Loll-shock is my best guess, but there could be other reasons.
 
  • #90
Here is a great talk by Sundance Bilson-Thompson given 16 November 2005 at Perimeter Institute

http://streamer.perimeterinstitute....rType=WM64Lite&mode=Default&shouldResize=true

it is split screen, slides and video, he occasionally goes to the blackboard to explain stuff and the camera gets that too.
he is talking about his preon model (a simple basis for a sketchy approximation of the Standard Model), which he and Lee Smolin are currently trying to connect with the spin networks of LQG.

Please let me know if this link does not work for you. It worked for me when I tried it.
 
