Chronos said:
If I may chime in a comment on statistics, the usual reason for using monte carlo methods is to give an unbiased representation of all possible [or at least reasonable] initial states...
Chronos, thanks for chiming in. I expect it has different associations. For some people, "Monte Carlo method" is a way of evaluating an integral over a multidimensional space, or more generally a way of evaluating the integral of some function which is defined over a very LARGE set, so that it would be expensive in computer time to do ordinary numerical integration.
What one does is to consider the integral as an average, or (in probabilistic terms) an EXPECTATION VALUE. And then one knows that one can estimate the expectation value empirically by sampling. So one picks some RANDOM points in the large set, evaluates the function at each point in that random sample, and averages up the function values. That "monte carlo sum" is a stab at the true value of the integral.
(I may be just repeating something you said already in different words. Can't be sure, but I want to stress the application of M.C. to evaluating integrals over large sets where other methods are inapplicable or too costly.)
Naturally, the more random points one includes in the sample, the better one's estimate of the integral will be.
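To make the recipe concrete, here is a toy sketch of my own (nothing to do with AJL's actual code): estimate an integral over the unit square by averaging the integrand at uniform random points. The function name and test integrand are just illustrative choices.

```python
import random

def mc_integral(f, dim, n_samples, seed=0):
    """Estimate the integral of f over the unit hypercube [0,1]^dim
    by averaging f at uniformly random sample points. The hypercube
    has volume 1, so the sample average IS the integral estimate."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_samples):
        point = [rng.random() for _ in range(dim)]
        total += f(point)
    return total / n_samples

# Toy integrand f(x, y) = x^2 + y^2; the exact integral over [0,1]^2 is 2/3.
est = mc_integral(lambda p: p[0]**2 + p[1]**2, dim=2, n_samples=100_000)
print(est)  # typically lands close to 0.6667
```

The error shrinks like 1/sqrt(n_samples), which is exactly why more random points give a better value, and why this beats grid-based quadrature when the space has many dimensions.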
The Ambjorn et al. (AJL) approach to quantum gravity is a PATH INTEGRAL approach, where the "path" is AN ENTIRE SPACETIME.
It is like a Feynman path integral, except Feynman talks about the path of an actual particle as it goes from A to B, and AJL talk about the PATH THE UNIVERSE TAKES IN THE SPACE OF ALL GEOMETRIES AS IT GOES FROM BIG BANG TO BIG CRUNCH, or from beginning to end, whatever you want to call them. So for AJL a "path" is a possible spacetime, a possible evolution of the geometry. Well, that is not such a big deal after all. It is just a Feynmanian path integral, but in some new territory.
And they want to study various properties like dimension. So they want to find expectation values, essentially, but the set of all paths is a BIG SET. So it is not practical to do the whole integral (over the range of all spacetimes, all evolutions from A to B or beginning to end). So what they are doing with their Monte Carlo is this:
They found a clever way to pick random spacetimes, that is, paths of geometry from beginning to end. So they pick many, many of them, a large random sample, evaluate the function they want to study for each spacetime in the sample, and AVERAGE UP. That is using the Monte Carlo method to evaluate the "path integral".
For now, the functions they are evaluating at sample points are very basic functions like "overall spacetime Hausdorff dimension" or "spatial slice dimension" or "small-scale diffusion dimension", measured in the full spacetime, in a spatial slice, or in a "thick slice". They have a lot of ways to measure dimension, and they are studying the general business of dimensionality.
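To give a feel for what a "diffusion dimension" measurement involves, here is a toy sketch of my own (again not AJL's code, and applied to an ordinary flat lattice rather than one of their triangulated spacetimes): let a random walk diffuse on the geometry and read the dimension off the return probability, which falls like P(t) ~ t^(-d/2). On a plain 2D lattice the recipe should recover d close to 2.

```python
import math

def spectral_dimension(L=101, t1=20, t2=40):
    """Estimate the diffusion ("spectral") dimension of a 2D periodic
    lattice from the random-walk return probability P(t) ~ t^(-d/2),
    by evolving the exact walk distribution (no sampling noise)."""
    # p[x][y] = probability the walker is at site (x, y); start at origin.
    p = [[0.0] * L for _ in range(L)]
    p[0][0] = 1.0
    ret = {}
    for t in range(1, t2 + 1):
        q = [[0.0] * L for _ in range(L)]
        for x in range(L):
            for y in range(L):
                w = p[x][y] / 4.0  # hop to each of the 4 neighbors with prob 1/4
                q[(x + 1) % L][y] += w
                q[(x - 1) % L][y] += w
                q[x][(y + 1) % L] += w
                q[x][(y - 1) % L] += w
        p = q
        ret[t] = p[0][0]  # return probability P(t)
    # P(t) ~ t^(-d/2)  =>  d = -2 * (ln P(t2) - ln P(t1)) / (ln t2 - ln t1).
    # t1, t2 must be even: on a bipartite lattice P(t) = 0 for odd t.
    return -2.0 * (math.log(ret[t2]) - math.log(ret[t1])) \
                / (math.log(t2) - math.log(t1))

print(spectral_dimension())  # close to 2 for the 2D lattice
```

What makes this interesting in the quantum-gravity setting is that AJL run this kind of diffusion probe on each sampled spacetime and Monte Carlo average the result, and the answer need not match the naive dimension at all scales.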
but the functions could be more sophisticated like "number of black holes" or "density of dark energy" or "abundance of lithium" (maybe? I really can't guess, I only know that this is probably only the beginning)
With the Monte Carlo path integral method it should be possible to evaluate many kinds of interesting functions (defined on the ensemble of spacetimes).
This is early days and they are studying dimensionality, but they can study a lot of other aspects of the world this way, and I expect this to be done. They say they are now working on putting matter into the picture.
They are going to need more computer time.
The present spacetimes are ridiculously small (on the order of a million building blocks) and short-lived.
Have you had a look at a computer-generated picture of a typical one of their spacetimes? If so, you know what I mean.