oldman said:
... Instead it seems to show that on a larger scale the quantum chaos (reality) will behave like our smooth (near-de Sitter) spacetime with 4 dimensions and a cosmological constant (another reality), albeit without matter (yet)...
That sounds right, oldman. You seem to be doing all right. I wouldn't say I am a lot better off, though I may have a faster internet connection, which makes the link to the SciAm article usable.
I've read the 2005 paper "Universe from Scratch". It is fine, just a little old. I would also suggest looking at these three if you haven't already:
http://arxiv.org/abs/0711.0273 (The Emergence of Spacetime---short paper presenting the argument for the approach)
http://arxiv.org/abs/hep-th/0505154 (Reconstructing the Universe---long, informative paper with many figures and charts)
http://arxiv.org/abs/0712.2485 (Planckian Birth of the Quantum de Sitter Universe---short paper presenting a recent result)
Universe from Scratch may be perfect and meet your requirements, but just to know the alternatives I would suggest glancing at some of these others; you may find they make a useful supplement.
======================
you asked about the meaning of the path integral in this context. here is my take on it. In their computer work, they typically run the path integral from a zero (or minimal) spatial state at the beginning back to zero at the end. That is because the computer is finite and only contains a finite number of simplexes. So the little simulated universe has to have a finite life. So what they get is a universe that pops into existence, swells up, shrinks down, and pops out of existence.
that is an oversimplification. for technical reasons, which they explain but I don't fully understand, they use periodic time with the period much longer than the lifespan of the little universe-----so it is as if they had infinite time, and somewhere along there the thing popped into and out of existence. Also for technical reasons the zero spatial state is the minimal number of simplexes that you can glue together so that all the faces are covered. As I recall it takes a dozen or fewer. You want the minimum number of tets required to make something that is topologically a three-sphere S^3.
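Just to make that concrete, here is a quick sanity check (a standard fact of combinatorial topology, not code from the papers): the boundary of a 4-simplex triangulates S^3 with only 5 tetrahedra, consistent with "a dozen or fewer", and its Euler characteristic comes out to 0, as it must for a three-sphere.

```python
from itertools import combinations

# The smallest triangulation of S^3 is the boundary of a 4-simplex on
# 5 vertices: every 4-subset of the vertices is a tetrahedron, every
# 3-subset a triangle, every 2-subset an edge.
vertices = list(range(5))
tetrahedra = list(combinations(vertices, 4))   # 5 tetrahedra, faces all glued
triangles = list(combinations(vertices, 3))    # 10 triangles
edges = list(combinations(vertices, 2))        # 10 edges

# Euler characteristic V - E + F - T must be 0 for S^3
chi = len(vertices) - len(edges) + len(triangles) - len(tetrahedra)
print(len(tetrahedra), chi)  # → 5 0
```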
the evolution equation allows a minimal spatial state either to just sullenly persist as minimal, or to abruptly take off and grow
they wouldn't have to always run the path integral from minimal state Initial to minimal state Final. They could presumably run it from initial space geometry A to final space geometry B, and have A and B be extended, interesting shapes. But as I understand it they always run essentially from zero to zero, or rather from minimal to minimal.
======================
now what is in the computer at any given moment is a history of how spatial geometry could evolve from initial to final
and it is typically a kinky unsmooth history----a 4D story without much symmetry to it.
and this is in effect randomly chosen.
in a given computer run they may go through a million such randomly chosen 4D histories----paths in geometry-space so to speak, paths thru the plethora of possible spatial geometries which lead from Initial to Final.
they get this sample of a million possible paths---a million 4D histories---by a process of randomly modifying one of them to get the next, and modifying that to get the next.
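A toy sketch of that modify-one-to-get-the-next process, in the spirit of a Metropolis Monte Carlo (this is my own illustration using a 1D particle path rather than a 4D geometry, and the action, step sizes, and weight exp(-S) are assumptions for the toy, not the CDT code):

```python
import math
import random

random.seed(1)

N = 20    # number of time steps in a "history"
dt = 0.1  # time step

def action(x):
    # discretized Euclidean kinetic action of a zigzag path x[0..N]
    return sum((x[i + 1] - x[i]) ** 2 / (2 * dt) for i in range(N))

def metropolis_step(x):
    # randomly modify one interior point of the current history,
    # accepting or rejecting by the Metropolis rule with weight exp(-S)
    i = random.randrange(1, N)
    proposal = x[:]
    proposal[i] += random.uniform(-0.5, 0.5)
    dS = action(proposal) - action(x)
    if dS < 0 or random.random() < math.exp(-dS):
        return proposal  # accept the modified history
    return x             # keep the old one

# start from the trivial history pinned at Initial = Final = 0,
# then generate a long chain of randomly modified histories
x = [0.0] * (N + 1)
for _ in range(10000):
    x = metropolis_step(x)

print(action(x))  # a typical history: kinky, not the smooth classical path
```

Each step only tweaks the history locally, but a long chain of such tweaks walks through a representative sample of the whole realm of paths.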
=======================
so how should we think about this? well, think about Feynman talking about the path that a PARTICLE takes to get from point A to point B. for him the classical trajectory doesn't exist. all that exists is the realm of all possible paths, which are mostly unsmooth, nowhere differentiable, and rather kinky-looking----and each one has an amplitude----and nature takes the weighted average of all the paths, and that is how she gets the particle from A to B
well Ambjorn and Loll could say that in the same way SPACETIME does not exist. it is something we imagine like the smooth classical path of a particle. what exists is this realm of possible spatial geometries----and all the possible kinky wacko paths thru this realm, that begin with geometry A and end with geometry B-----and each of these unsmooth 4D histories has an amplitude
and the spacetime which we think we observe is really nature averaging all these fluctuating 4D histories up in a weighted sum.
and so the sum over histories smooths out and looks classical to us. it is the average path
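Here is a little illustration of that smoothing-by-averaging (my own toy, assuming the Euclidean free-particle weight, for which sampling kinky paths is easy via a Brownian bridge): every individual path is jagged, but their average hugs the straight classical trajectory from A to B.

```python
import random

random.seed(0)

# endpoints and discretization for the toy (made-up parameters)
A, B, N, samples = 0.0, 1.0, 50, 4000
mean = [0.0] * (N + 1)

for _ in range(samples):
    # random walk from A, then pin the far end to B (a Brownian bridge,
    # which is exactly the free-particle Euclidean path measure)
    walk = [A]
    for _ in range(N):
        walk.append(walk[-1] + random.gauss(0, 0.1))
    drift = walk[-1] - B
    path = [walk[i] - drift * i / N for i in range(N + 1)]
    for i in range(N + 1):
        mean[i] += path[i] / samples

# the classical path is just the straight line from A to B;
# the averaged path should be close to it everywhere
classical = [A + (B - A) * i / N for i in range(N + 1)]
print(max(abs(mean[i] - classical[i]) for i in range(N + 1)))
```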
I think it may have been Hawking who popularized the phrase "sum over histories". It is a synonym for the spacetime Feynman path integral. But Hawking's Euclidean quantum gravity didn't work: people tried to regularize it using simplexes and for 10 years it kept going wrong. Then in 1998 Ambjorn and Loll got the idea of how to fix it.
==================
Another thing is, when you do a Feynman path integral for a particle going from A to B, the particle goes along line segments---it is a polygonal path---all zigzaggy. And then you let the size of a segment go to zero. That is not because Feynman claimed nature's paths were zigzag polygonal. It is a regularization, which means that to make the problem finite you restrict down to some representatives.
So by analogy, Ambjorn and Loll are not saying that nature is playing with simplexes and tetrahedra! That is just a regularization.
The set of all possible paths is too big. We take a skeleton version of just representative paths (4D histories that pass only thru simplicial geometries). Averaging with a measure on all paths would be too much. We average using a representative sample---a regularization.
And in principle we could let the size of the building blocks go to zero.
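A toy check of that last point (my own example, not from the papers): discretize a smooth path into polygonal segments, and the discretized action converges to the continuum value as the segment size shrinks.

```python
import math

# Discretized kinetic action of a polygonal approximation to the smooth
# path x(t) = sin(t) on [0, pi].  The continuum value is
#   S = (1/2) ∫ x'(t)^2 dt = (1/2) ∫ cos(t)^2 dt = pi/4.
def discrete_action(n):
    dt = math.pi / n
    xs = [math.sin(i * dt) for i in range(n + 1)]
    return sum((xs[i + 1] - xs[i]) ** 2 / (2 * dt) for i in range(n))

# refine the polygonal path and watch the discrete action converge
for n in (4, 16, 64, 256):
    print(n, discrete_action(n))

print(math.pi / 4)  # the continuum limit
```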