Two World-theories (neither one especially stringy)

  • Thread starter: marcus
  • #91
Hi
guest stopped in...will continue later, thanks. Great stuff...R
 
  • #92
marcus said:
Something to keep in mind. the CDT spacetime is not made of simplexes but is the CONTINUUM LIMIT of approximating mosaic spacetimes with smaller and smaller building blocks.

the quantum mechanics goes along with this process of finer and finer approximation. so at each stage in the process of going for the limit you have an ensemble of many many mosaic geometries

so there is not just one continuum which is the limit of one sequence of mosaics (mosaic = "piecewise flat", quite kinky manifold, packed solid with the appropriate dimension simplex)
there is a quantum jillion of continuums each being the limit of a quantum jillion of sequences of mosaics.

or there is a blur of spacetime continuums with a blur of different geometries and that blur is approximated finer and finer by a sequence of simplex manifold blurs

BUT DAMMIT THAT IS TOO INVOLVED TO SAY. So let us just focus on one of the approximating mosaics. Actually that is how they do it with their computer model. they generate a mosaic and study it and measure things, and then they randomly evolve it into another and study that, one at a time, and in this way they get statistics about the set of possible spacetime geometries. One at a time. One definite concrete thing at a time. Forget about the blur.

This all sounds like a numerical method for the calculation of some calculus. Do they have a differential or integral equation for this process that they are doing with a numerical algorithm? Do they show that there is something pathological with the calculus to justify the numerical approach with computers? Thanks.
 
  • #93
Mike2 said:
This all sounds like a numerical method for the calculation of some calculus. Do they have a differential or integral equation for this process that they are doing with a numerical algorithm?

yes they have a rather nice set of equations, look at the article
http://arxiv.org/abs/hep-th/0105267
equation (2) gives the action integral
the discrete version (following Regge) is (38) on page 13
from thence, in the next section, a transfer matrix and a Hamiltonian

You may not realize this but the Einstein equation of classical gen rel is normally solved NUMERICALLY IN A COMPUTER these days because one cannot solve it analytically. that is how they do differential equations these days, for a lot of physical systems. It is not pathological, it is normal and customary AFAIK.

the Einstein eqns are only solvable analytically in a few highly simplified cases. so be glad they have a model that they CAN implement numerically---in the real world that is already something of a triumph
:smile:

Do they show that there is something pathological with the calculus to justify the numerical approach with computers? Thanks

as I say, they don't have to justify using numerical methods, and it is not pathological----it's the customary thing to do if you are lucky
 
  • #94
marcus said:
yes they have a rather nice set of equations, look at the article
http://arxiv.org/abs/hep-th/0105267
equation (2) gives the action integral
the discrete version (following Regge) is (38) on page 13
from thence, in the next section, a transfer matrix and a hamiltonian
So from (2) it would seem that they are integrating over the various possible metrics on a given dimension. It would seem that the dimension is given a priori as 4D. I don't get, then, what this talk is about 2D at small scales.

Edit:
Just a moment, do all possible metrics include those that give distance only in 1, 2, and 3 dimensions? If so, is this the way to integrate over various dimensions as well?

marcus said:
as I say, they don't have to justify using numerical methods, and it is not pathological----its the customary thing to do if you are lucky
Isn't it more desirable to find an analytic expression? Or are they just taking it from experience that these path integrals are generally not analytically solvable and require numerical methods? And why the Monte Carlo method? Is this to avoid even the possibility that the other methods of numerical integration can be pathological? Thanks.
 
  • #95
Mike2 said:
...Isn't it more desirable to find an analytic expression? Or are they just taking it from experience that these path integrals are generally not analytically solvable and require numerical methods? And why the Monte Carlo method? Is this to avoid even the possibility that the other methods of numerical integration can be pathological? Thanks.

It doesn't seem efficient for me to just be repeating what they say much more clearly and at greater length in their paper, Mike. I hope you will read more of the paper.

There is also this recent paper, hep-th/0505154, which IIRC has some discussion of what they are actually integrating over, and why numerical, and why Monte Carlo. You should hear their reasons from them rather than from me trying to speak for them. thanks for having a look-see at the papers
cheers
 
  • #96
Here is a relevant quote from right near the beginning of
http://arxiv.org/abs/hep-th/0505154
the most recent CDT paper.

It addresses some of the issues raised in Mike's post, such as why numerical, why Monte Carlo. To save the reader trouble I will copy this excerpt in, from the bottom of page 2, the introduction section.

----quote from AJL----
In the method of Causal Dynamical Triangulations one tries to construct a theory of quantum gravity as a suitable continuum limit of a superposition of spacetime geometries [6, 7, 8]. In close analogy with Feynman’s famous path integral for the nonrelativistic particle, one works with an intermediate regularization in which the geometries are piecewise flat (footnote 2.) The primary object of interest in this approach is the propagator between two boundary configurations (in the form of an initial and final spatial geometry), which contains the complete dynamical information about the quantum theory.

Because of the calculational complexity of the full, nonperturbative sum over geometries (the “path integral”), an analytical evaluation is at this stage out of reach. Nevertheless, powerful computational tools, developed in Euclidean quantum gravity [9, 10, 11, 12, 13, 14, 15] and other theories of random geometry (see [16] for a review), can be brought to bear on the problem.

This paper describes in detail how Monte Carlo simulations have been used to extract information about the quantum theory, and in particular, the geometry of the quantum ground state (footnote 3) dynamically generated by superposing causal triangulations.

It follows the announcement of several key results in this approach to quantum gravity, first, a “quantum derivation” of the fact that spacetime is macroscopically four-dimensional [17], second, a demonstration that the large-scale dynamics of the spatial volume of the universe (the so-called “scale factor”) observed in causal dynamical triangulations can be described by an effective action closely related to standard quantum cosmology [18], and third, the discovery that in the limit of short distances, spacetime becomes effectively two-dimensional, indicating the presence of a dynamically generated ultraviolet cutoff [19]...



FOOTNOTES:
2. These are the analogues of the piecewise straight paths of Feynman’s approach. However, note that the geometric configurations of the quantum-gravitational path integral are not imbedded into a higher-dimensional space, and therefore their geometric properties such as piecewise flatness are intrinsic, unlike in the case of the particle paths.

3. Here and in the following, by “ground state” we will always mean the state selected by Monte Carlo simulations, performed under the constraint that the volume of spacetime is (approximately) kept fixed, a constraint we have to impose for simulation-technical reasons.

--------end quote from AJL----
 
  • #97
I want to highlight something from the above quote:

by “ground state” we will always mean the state selected by Monte Carlo simulations

the ground state of the geometry of the universe (is not made of simplexes but is the limit of finer and finer approximations made of simplexes and) IS A WAVE FUNCTION OVER THE SPACE OF ALL GEOMETRIES that is kind of like a probability distribution covering a great bunch of possible geometries

and we find out about the ground state wavefunction, find out things like what kind of geometries make up the bulk of it and their dimensionality etc. We STUDY the ground state by doing Monty simulations.

that's an interesting way of approaching it, I think. it's clear what they mean and operationally defined. If anyone is scandalized by this way of defining the quantum ground state, it would be lovely if they would tell us about it. contribute to the conversation etc.

Love this stuff. Renate Loll is keen as a knifeblade. they do not mess around. often seem to take original approaches.

Oh PLEASE CORRECT ME IF I AM WRONG. I think that before 1998 NO MATHEMATICIANS EVER STUDIED A PL MANIFOLD THAT WAS CAUSALLY LAYERED like this. The PL manifold, or simplicial manifold, is an important object that has been studied for many decades---I don't know how long, but I encountered it already in grad school a long time ago. But what Ambjorn and Loll invented was to MAKE THE SIMPLEXES OUT OF CHUNKS OF MINKOWSKI SPACE and to construct a NECCO WAFER foliation of spacelike sheets with a timelike FILLING in between. So you have a PL manifold which is 4D but it has 3D sheets of tetrahedrons.

and in between two 3D sheets of tets there is this yummy white frosting filling made of 4-simplexes which CONNECT the tets in one sheet with the tets in the next sheet up

and of course another layer of filling that connects the tets in that sheet with those in the one below it.

HISTORY: for a few years after 1998, Ambjorn and Loll tried calling that a "Lorentzian" triangulation. So it was going to be, like, a "Lorentzian" quantum gravity using Lorentzian PL manifolds. But the nomenclature didn't work out----the initials would have had to be LQG, for one thing, causing endless confusion with the other LQG.
So then in around 2003 or 2004 they started saying "causal" instead of "Lorentzian"

So here we have a geometric structure which AFAIK has not been studied in mathematics. A CAUSAL PL manifold.

TERMINOLOGY: The traditional "PL" means "piecewise linear" and it could be misleading. the thing is not made of LINES but rather building blox, but simplexes are in a very general abstract sense linear. So a simplicial manifold assembled out of simplex pieces has (for decades) been called "piecewise linear" or PL and the MAPPINGS BETWEEN such things are also piecewise linear (which is very important to how mathematicians think, they like to think about the mappings, or the morphisms of a category)

A NEW CATEGORY: we now have a new category with new mappings. the CAUSAL PL category. it will be studied. the Ambjorn Loll papers are the ground floor.


 
  • #98
the CPL category. CPL mappings

OK so we have a new category where the objects are CPL manifolds and the morphisms are CPL mappings (causal piecewise linear)

there are only two basic papers so far
http://arxiv.org/abs/hep-th/0105267
http://arxiv.org/abs/hep-th/0505154

IMO it is a good time to get into the field.
AFAIK the CPL category has not been studied
and it will be studied.
(it is the basis of a new approach to quantum gravity which has really come alive in the past two years)
math grad students in Differential Geometry should know about CPL manifolds and consider proving some of the first easy theorems; picking the "low-hanging fruit" is not a bad idea in math---where really new things are uncommon.

Or maybe I should not say "differential geometry" anymore, I should be saying "combinatorial geometry" or "simplicial geometry". I don't know---language fashions change---fields change terminology as they develop, sometimes.

a CPL mapping has to be a PL mapping (takes simplexes to simplexes, piecewise linear) and it has to respect the causal ordering as well.

I wonder if these things are going to be interesting. maybe and maybe not, can't tell much ahead of time. wouldn't have guessed the physics results of the past two years would be so exciting.

Matter fields have to be laid onto these CPL manifolds. I wonder how that will be done and what kind of new mathematical structure will appear when that is done?

these manifolds do NOT have coordinate patches----they are not locally diffeomorphic to R^n because they don't have a differentiable structure. you could put one on, maybe, but it wouldn't fit well, like a bad suit of clothes.

these CPL manifolds already have curvature. but the curvature is defined combinatorially by COUNTING the number of simplexes clustered around a "bone" (this is a very curious idea, the face of a face is a "bone" or "hinge")

I suspect Rafael Sorkin of promoting the word "bone" (or possibly even coining it). BONES ARE WHAT YOU ADD UP THE DIHEDRAL ANGLES AROUND so that you can know the deficit or surplus angle. I would have to go back to hard copy, a 1975 article by Sorkin in Phys. Rev. D, to find out and that seems too much bother. His 1975 article is "Time-evolution in Regge Calculus". I suspect, but do not know, that the delightful word "bone" occurs in this article.
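To make the counting idea concrete: in Regge calculus the curvature concentrated at a bone is the deficit angle, 2*pi minus the sum of the dihedral angles of the simplexes hinging on it. Here is a minimal sketch in the 2D case, where the "bone" is just a vertex and the blocks are equilateral triangles (the function name is mine, for illustration, not from the papers):

```python
import math

def deficit_angle(hinge_angles):
    """Regge-style curvature at a bone/hinge: 2*pi minus the sum of
    the dihedral angles of the simplexes meeting there."""
    return 2 * math.pi - sum(hinge_angles)

# In 2D the simplexes are triangles; equilateral triangles each
# contribute an angle of pi/3 at a shared vertex.
flat   = deficit_angle([math.pi / 3] * 6)  # six around: deficit 0, flat
cone   = deficit_angle([math.pi / 3] * 5)  # five around: positive deficit (cone point)
saddle = deficit_angle([math.pi / 3] * 7)  # seven around: negative deficit (saddle)
```

So the geometry is "piecewise flat" everywhere except at the bones, and the count of blocks clustered around a bone is what carries the curvature.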

what one wants to be able to do in the CPL category is take LIMITS, going with finer and finer triangulations. I picture it as somewhat like taking "projective limits". the idea of a limit of finer and finer 4d causal triangulations would be defined. it would be some kind of manifold, maybe a new kind of manifold.

when you go to the limit, the bones disappear, but you still have the curvature. how?
 
  • #99
marcus said:
I want to highlight something from the above quote:

by “ground state” we will always mean the state selected by Monte Carlo simulations

the ground state of the geometry of the universe (is not made of simplexes but is the limit of finer and finer approximations made of simplexes and) IS A WAVE FUNCTION OVER THE SPACE OF ALL GEOMETRIES that is kind of like a probability distribution covering a great bunch of possible geomtries
Do they have a "canonical" field equation, like position and canonical momentum operators, on the "wave function"?

It seems to me that they are supposing without explanation a Lorentzian metric. Can this metric form be derived? Is it necessary to do any calculations at all?

How does one pronounce "simplicial"?

Thanks.
 
  • #100
Mike2 said:
How does one pronounce "simplicial"?

Hi Mike2

Pronounce "simplicial" with the stress on the second syllable "pli" with a short 'i' as in sim-PLi-shawl

Of course, I'm a minority accent English speaker!

Cheers
Kea
 
  • #101
Kea said:
Hi Mike2

Pronounce "simplicial" with the stress on the second syllable "pli" with a short 'i' as in sim-PLi-shawl

Of course, I'm a minority accent English speaker!

Cheers
Kea

I expect we all would like your accent very much if we could hear it, let us adopt sim-PLi-shawl
as per Kea
(except that I am apt to say shul instead of the more elegant shawl)

all vowels have a tendency to become the schwa
and be pronounced "uh"
 
  • #102
If I may chime in a comment on statistics, the usual reason for using Monte Carlo methods is to give an unbiased representation of all possible [or at least reasonable] initial states of the system under study. The 'outlier' states are the ones you worry about - the ones where the model collapses and leads to unpredictable outcomes. It is vitally important to find the boundary conditions, where the model works and where it does not. This is not necessarily a continuum, where the model always works when x>y, x<z. There may, instead, be discrete intervals where it does not work. You need to run the full range of values to detect this when you do not have a fully calculable analytical model. Interestingly enough, this kind of problem often arises in real-world applications - like manufacturing - where you have complex interactions between multiple process variables.
 
  • #103
Chronos said:
If I may chime in a comment on statistics, the usual reason for using monte carlo methods is to give an unbiased representation of all possible [or at least reasonable] initial states...

Chronos, thanks for chiming in. I expect it has different associations. For some people, "Monte Carlo method" is a way of evaluating an integral over a multidimensional space, or more generally a way of evaluating the integral of some function which is defined over a very LARGE set, so that it would be expensive in computer time to do ordinary numerical integration.

what one does is to consider the integral as an average, or (in probabilistic terms) an EXPECTATION VALUE. And then one knows that one can estimate the expectation value empirically by sampling. So one picks some RANDOM points in the large set, and evaluates the function at each point in that random sample, and averages up the function values----and that "monte carlo sum" is a stab at the true value of the integral.

(I may be just repeating something you said already in different words. Can't be sure. but I want to stress the application of M.C. to evaluating integrals over large sets where other methods are inapplicable or too costly)

Naturally the more random points one can include in one's sample the better the value of the integral one is going to get.
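As a toy illustration of that sampling idea (nothing CDT-specific here, just the generic Monte Carlo estimate of an integral as an expectation value; the function name is mine):

```python
import random

def monte_carlo_integral(f, a, b, n, rng):
    """Estimate the integral of f over [a, b] as (b - a) times the
    average of f at n uniformly random sample points."""
    total = sum(f(rng.uniform(a, b)) for _ in range(n))
    return (b - a) * total / n

rng = random.Random(0)
# Estimate of the integral of x^2 on [0, 1], whose exact value is 1/3;
# the statistical error shrinks like 1/sqrt(n) as the sample grows.
est = monte_carlo_integral(lambda x: x * x, 0.0, 1.0, 100_000, rng)
```

The same recipe works unchanged when the "points" live in a huge configuration space instead of an interval, which is exactly why it scales to path integrals.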

The Ambjorn et al (AJL) approach to quantum gravity is a PATH INTEGRAL approach, where the "path" is AN ENTIRE SPACETIME.

It is like a Feynman path integral except Feynman talks about the path of an actual particle as it goes from A to B, and AJL talk about the PATH THE UNIVERSE TAKES IN THE SPACE OF ALL GEOMETRIES AS IT GOES FROM BIGBANG TO BIGCRUNCH or from beginning to end whatever you want to call them. And for AJL a "path" is a possible spacetime or a possible evolution of the geometry. Well that is not such a big deal after all. It is just a Feynmanian path integral, but in some new territory.

And they want to study various properties like dimension. So they want to find expectation values, essentially, but the set of all paths is a BIG SET. So it is not practical to do the whole integral (over the range of all spacetimes, all evolutions from A to B or beginning to end). So what they are doing with their Monte Carlo is this:

they found a clever way to pick random spacetimes that are paths of geometry from beginning to end. So they pick many many, a large random sample, and they evaluate the function they want to study.

they evaluate the function they want to study for each of a random sample of spacetimes and they AVERAGE UP and that is using Monty method to evaluate the "path integral"
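In spirit (though certainly not in their actual code, which makes Monte Carlo moves on causal triangulations) the one-configuration-at-a-time procedure looks like this toy Metropolis sketch, where the "configurations" are 1D lattice paths weighted by a discretized action and an observable is averaged over the sampled ensemble---all names here are mine, for illustration only:

```python
import math
import random

def metropolis_average(n_sweeps, n_sites, rng):
    """One concrete configuration at a time: propose a local change,
    accept it with the Metropolis probability exp(-delta_action),
    measure an observable, and average over the sampled ensemble."""
    path = [0.0] * n_sites            # a definite starting configuration
    measurements = []
    for _ in range(n_sweeps):
        i = rng.randrange(1, n_sites - 1)       # interior site; the two
        new = path[i] + rng.uniform(-1.0, 1.0)  # endpoints stay fixed, like
                                                # the boundary configurations
        # only the two links touching site i change, so the action
        # difference is local and cheap to compute
        old_s = (path[i] - path[i - 1]) ** 2 + (path[i + 1] - path[i]) ** 2
        new_s = (new - path[i - 1]) ** 2 + (path[i + 1] - new) ** 2
        if new_s <= old_s or rng.random() < math.exp(old_s - new_s):
            path[i] = new
        # measure the observable on the current concrete configuration
        measurements.append(sum(x * x for x in path) / n_sites)
    # discard early sweeps ("thermalization") and average the rest
    tail = measurements[len(measurements) // 4:]
    return sum(tail) / len(tail)

rng = random.Random(42)
mean_width = metropolis_average(20_000, 16, rng)
```

The key point the sketch shares with their method: you never hold the whole "blur" in memory, you only ever have one definite configuration, and the ensemble average emerges from the sequence of samples.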

for now, the functions they are evaluating at sample points are very basic functions like "overall spacetime hausdorff dimension" or "spatial slice dimension" or "smallscale diffusion dimension, in the spacetime" , or in the spatial slice, or in a "thick slice". they have a lot of ways to measure the dimension and they are studying the general business of dimensionality.

but the functions could be more sophisticated like "number of black holes" or "density of dark energy" or "abundance of lithium" (maybe? I really can't guess, I only know that this is probably only the beginning)
With Monty path integral method it should be possible to evaluate many kinds of interesting functions (defined on the ensemble of spacetimes).

this is early days and they are studying dimensionality, but they can study a lot of other aspects of the world this way and i expect this to be done. they say they are now working on putting matter into the picture.

they are going to need more computer time.

the present spacetimes are ridiculously small (on the order of a million blox) and short-lived.
Have you had a look at a computer-generated picture of a typical one of their spacetimes? if so, you know what I mean
 
  • #104
marcus said:
Chronos, thanks for chiming in. I expect it has different associations. For some people, "Monte Carlo method" is a way of evaluating an integral over a multidimensional space, or more generally a way of evaluating the integral of some function which is defined over a very LARGE set, so that it would be expensive in computer time to do ordinary numerical integration.
I had to do a numerical calculation in a graduate course I took years ago to see the difference between the Monte Carlo method and some other traditional algorithms of numerical integration. What I learned was that most of the other numerical integration schemes rely on predictable algorithms that make some integrals impossible to evaluate. They blow up to infinity. Or there is always a great difference when you increase the resolution; they don't converge. It seems the algorithm used in traditional methods to divide the interval of integration into subdivisions itself actually contributes to the pathological nature of that numerical method. But the Monte Carlo method introduces a measure of randomness in the algorithm to help avoid any pathologies introduced by more predictable algorithms. Monte Carlo still equally divides the interval of integration, but picks at random where in each interval to evaluate the integrand.
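The scheme described above---equal subintervals, with a random evaluation point inside each---is what is usually called stratified sampling. A minimal sketch of it (the function name is mine):

```python
import random

def stratified_monte_carlo(f, a, b, n, rng):
    """Split [a, b] into n equal subintervals and evaluate f at one
    uniformly random point inside each; this keeps Monte Carlo's
    randomness while typically reducing the variance."""
    h = (b - a) / n
    total = sum(f(a + (i + rng.random()) * h) for i in range(n))
    return h * total

rng = random.Random(1)
# stratified estimate of the integral of x^2 on [0, 1] (exact value 1/3)
est = stratified_monte_carlo(lambda x: x * x, 0.0, 1.0, 10_000, rng)
```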

I suspect that it is now commonplace to evaluate integrals in physics using Monte Carlo just to avoid even the possibility of other methods being pathological. Maybe someone else could confirm or deny this suspicion of mine.
 
  • #105
Mike2 said:
I suspect that it is now common place to evaluate integrals in physics using Monte Carlo just to avoid even the possibility of other methods being pathological. Maybe someone else could confirm or deny this suspicion of mine.

Just occurred to me what Monty Carlo means:
it is Carlo Rovelli in his birthday suit.
 
  • #106
I was just now replying to a post of selfAdjoint in the "1-to-10" thread and the thought occurred to me that a lot of people may not have realized that getting a new model of the continuum may turn out to be THE FASTEST WAY TO A TOE.

Causal Dynamical Triangulations has a limited goal of merely arriving at a quantum model of spacetime that reproduces Gen Rel at large scale.
(but is based on quantum spacetime dynamics at microscopic scale)

Once people have a new continuum, and start working on it, and building theories of matter on that instead of Minkowski space, then it can be argued that the new features of the continuum are likely to inspire and enable new matter physics.
 
  • #107
The CDT quantum continuum is still very new and the preliminary results on it are still coming in, so it involves guesswork to look ahead.

Suppose people start to reconstruct the Standard Model on the CDT spacetime foundations instead of on static flat txyz Minkowski, or a smooth-metric manifold.

CDT continuum has extremely non-classical geometry at small scale, but classical geometry at large scale. Or so it appears (it is so early it is hard to be sure)

what new ideas about matter are going to be inspired on the way to putting matter fields into the "triangulations" picture, and what new mathematics enabled?

we should keep clearly in mind that the CDT is not a LATTICE approach where you take some tame classical geometry (mostly flat) and cover it with a grid.
Triangulations has the blocks assembled every which way, in arrangements you could not embed into a conventional flat txyz space, and results in extremely wild non-classical geometries.

so what happens when people start painting matter into the triangulations?
I suspect it would be foolish to try to put preconceptions on the outcome---they would just blind us.

And BTW let's keep in mind the general concept "nonperturbative quantum gravity" theory for something that gives you a new quantum picture of spacetime with Gen Rel in the large scale.
CDT is one possible "nonperturbative quantum gravity".
It happens to be one where they have reached the point of computer
simulations of the new model spacetime, and where they are getting interesting results.

But if there were a broad effort focused on getting this kind of thing, there could be other "nonperturbative quantum gravity" approaches also running computer models of spacetime and getting interesting results about dimensionality and bigbang cosmology and so forth.

spinfoams and loopcosmology do some of that, though CDT seems to have moved ahead of the pack, at least for now.

it just happens that this thread topic is CDT (small and fast-developing compared with LQG)
 
  • #108
Looks like in this thread I never gave the abstract for the new CDT paper
"Reconstructing the Universe". It is good to examine the abstract of a Loll paper because she carefully articulates what she is doing and it can convey some perspective

http://arxiv.org/abs/hep-th/0505154
Reconstructing the Universe
J. Ambjorn (NBI Copenhagen and U. Utrecht), J. Jurkiewicz (U. Krakow), R. Loll (U. Utrecht)
52 pages, 20 postscript figures

"We provide detailed evidence for the claim that nonperturbative quantum gravity, defined through state sums of causal triangulated geometries, possesses a large-scale limit in which the dimension of spacetime is four and the dynamics of the volume of the universe behaves semiclassically. This is a first step in reconstructing the universe from a dynamical principle at the Planck scale, and at the same time provides a nontrivial consistency check of the method of causal dynamical triangulations..."

To me this suggests that NONPERTURBATIVE QUANTUM GRAVITY regardless whether you do it with triangles has these features.
There may be various ways to do this, and arrive at quantum models of the continuum. And they may CONVERGE on a picture with

1. 4D and Gen Rel in largescale limit

2. Semiclassical (Hawking-Hartle) wavefunction of the scale factor or size of universe

3. some picture of quantum spacetime dynamics at very small scale, which cumulatively and collectively generates expected largescale behavior

The abstract does not start off by saying "CDT", it comes to that only later.
So it is putting forward a rather bold claim
It says the authors have evidence that IT DOESN'T MATTER WHETHER YOU USE OUR EXACT METHOD OR NOT, we have found out something about spacetime
and if you do some OTHER method of nonpert. QG and get a quantum spacetime dynamics that reproduces Gen Rel at large scale then quite possibly you will get similar results because THAT IS HOW SPACETIME IS.

this is a bold claim and they don't present it as a certainty, but something that they offer "detailed evidence" for.

And in fact the paper in question is full of detailed evidence.

so it is not saying "our CDT is the unique only approach, you all have to change to our method", it is saying that however you do the approximation, whether or not with triangles and a path integral, or whatever, if you can open a quantum spacetime dynamics window on the small scale that reproduces Gen Rel spacetime at the large, then you will see similar things!

So please try a slew of other approaches! We will see you at the finish line.

It's confident, and seems at the same time to have a clear modest reasonableness.

Well, I started this CDT thread well before "Reconstructing" appeared, and it appeared right around post #67 of this thread. I quoted some from the first paragraph, in post #67. But I never quoted from the abstract yet in this thread, so it was high time to listen to it.
 
  • #109
Let's remove references to any specific method (in the above quote) and see what Loll's overall program might be:

http://arxiv.org/abs/hep-th/0505154
Reconstructing the Universe

"We provide detailed evidence for the claim that nonperturbative quantum gravity... possesses a large-scale limit in which the dimension of spacetime is four and the dynamics of the volume of the universe behaves semiclassically. This is a first step in reconstructing the universe from a dynamical principle at the Planck scale,..."

So the program can be called "quantum spacetime dynamics" in that you start with a dynamical principle at very small scale, and all of spacetime is supposed to GROW from the action of that principle at very small scale.

You don't make global assumptions----like it is a smooth manifold of some fixed chosen dimension with some number of coordinates-----you only specify how something works at the subsubatomic level. the whole shebang is supposed to HATCH from that seed.

and what hatches has to look and act right on the large scale

that is her program,

and IT SHOULD NOT DEPEND ON THE PARTICULAR METHOD
anything that deserves to be called "nonperturb. quantum gravity" should POSSESS A LARGESCALE LIMIT---should be able to put in place some microscopic dynamical principle and have a familiar 4D spacetime grow from it. It should be able to because they did this in one example of a "nonperturb. quantum gravity" and surely they do not have a patent on spacetime!

So now I think the bar has been raised. It should be possible to model the spacetime continuum with many different methods of nonpert. QG and get this kind of result. Because it is the same spacetime---different ways of looking at it should converge.

This is perhaps rather radical to say and might be wrong, but it says the map of QG is now changed and the game is redefined with a new price of admission. The candidate methods can show who they are by reproducing 4D and the semiclassical cosmology result that was mentioned.
Or something like that. I am still not certain how exactly things have changed but I do believe we have a new game.
 
  • #110
http://arxiv.org/abs/hep-th/0505154
Reconstructing the Universe

"We provide detailed evidence for the claim that nonperturbative quantum gravity... possesses a large-scale limit in which the dimension of spacetime is four and the dynamics of the volume of the universe behaves semiclassically. This is a first step in reconstructing the universe from a dynamical principle at the Planck scale,..."

So the program can be called "quantum spacetime dynamics" in that you start with a dynamical principle at very small scale, and all of spacetime is supposed to GROW from the action of that principle at very small scale.
...

just understanding the terms in which the major players see what they are doing can be a project in itself

what does the overall goal of nonperturbative quantum gravity mean to the people who are driving toward it along these various approaches like CDT?

Fotini Markopoulou gave a short definition in a recent paper she did with Mohammad Ansari

<<The failure of perturbative approaches to quantum gravity has motivated theorists to study non-perturbative quantization of gravity. These seek a consistent quantum dynamics on the set of all Lorentzian spacetime geometries. One such approach which has led to very interesting results is the causal dynamical triangulation (CDT) approach[1, 2]. In the interest of understanding why this approach leads to non-trivial results, in this paper we study...>>

this is from the introduction of
http://arxiv.org/abs/hep-th/0505165
A statistical formalism of Causal Dynamical Triangulations
Mohammad H. Ansari, Fotini Markopoulou
20 pages, 19 pictures, 1 graph
Abstract:"We rewrite the 1+1 Causal Dynamical Triangulations model as a spin system and thus provide a new method of solution of the model."
 
