Two World-theories (neither one especially stringy)

  • Thread starter marcus
  • #101
marcus
Kea said:
Hi Mike2

Pronounce "simplicial" with the stress on the second syllable "pli" with a short 'i' as in sim-PLi-shawl

Of course, I'm a minority accent English speaker!

Cheers
Kea
I expect we all would like your accent very much if we could hear it. Let us adopt sim-PLi-shawl,
as per Kea
(except that I am apt to say shul instead of the more elegant shawl)

all vowels have a tendency to become the schwa
and be pronounced "uh"
 
  • #102
Chronos
If I may chime in with a comment on statistics: the usual reason for using Monte Carlo methods is to give an unbiased representation of all possible [or at least reasonable] initial states of the system under study. The 'outlier' states are the ones you worry about - the ones where the model collapses and leads to unpredictable outcomes. It is vitally important to find the boundary conditions: where the model works and where it does not. This is not necessarily a continuum, where the model always works when x>y, x<z. There may, instead, be discrete intervals where it does not work. You need to run the full range of values to detect this when you do not have a fully calculable analytical model. Interestingly enough, this kind of problem often arises in real-world applications - like manufacturing - where you have complex interactions between multiple process variables.
 
  • #103
marcus
Chronos said:
If I may chime in with a comment on statistics: the usual reason for using Monte Carlo methods is to give an unbiased representation of all possible [or at least reasonable] initial states...
Chronos, thanks for chiming in. I expect the term has different associations for different people. For some, "Monte Carlo method" means a way of evaluating an integral over a multidimensional space, or more generally a way of evaluating the integral of some function which is defined over a very LARGE set, so that it would be expensive in computer time to do ordinary numerical integration.

what one does is to consider the integral as an average, or (in probabilistic terms) an EXPECTATION VALUE. And then one knows that one can estimate the expectation value empirically by sampling. So one picks some RANDOM points in the large set, evaluates the function at each point in that random sample, and averages up the function values----and that "Monte Carlo sum" is a stab at the true value of the integral.

(I may be just repeating something you said already in different words. Can't be sure, but I want to stress the application of M.C. to evaluating integrals over large sets where other methods are inapplicable or too costly.)

Naturally the more random points one can include in one's sample the better the value of the integral one is going to get.
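
To make that concrete, here is a minimal sketch in Python of treating an integral as an expectation value and estimating it by random sampling. The integrand and interval are illustrative choices of mine, nothing from the AJL papers:

```python
# Minimal Monte Carlo integration: treat the integral of f over [lo, hi]
# as (hi - lo) times the expectation value of f under uniform sampling,
# and estimate that expectation by averaging over random sample points.
import random
import math

def mc_integrate(f, lo, hi, n_samples=100_000):
    total = sum(f(random.uniform(lo, hi)) for _ in range(n_samples))
    return (hi - lo) * total / n_samples  # interval width times average value

# Illustrative check: the integral of sin(x) over [0, pi] is exactly 2.
print(mc_integrate(math.sin, 0.0, math.pi))  # ~2.0, closer as n_samples grows
```

The estimate's error shrinks like 1/sqrt(n_samples) regardless of the dimension of the domain, which is exactly why the method pays off on very large sets where grid-based quadrature is hopeless.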

The Ambjorn et al. (AJL) approach to quantum gravity is a PATH INTEGRAL approach, where the "path" is AN ENTIRE SPACETIME.

It is like a Feynman path integral, except that Feynman talks about the path of an actual particle as it goes from A to B, while AJL talk about the PATH THE UNIVERSE TAKES IN THE SPACE OF ALL GEOMETRIES AS IT GOES FROM BIG BANG TO BIG CRUNCH, or from beginning to end, whatever you want to call them. And for AJL a "path" is a possible spacetime, a possible evolution of the geometry. Well, that is not such a big deal after all. It is just a Feynmanian path integral, but in some new territory.
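
Written out schematically (the notation here is mine, following standard conventions; the Euclidean weight e^(-S) is what the Wick rotation used for the simulations produces), the formal integral over geometries becomes a sum over causal triangulations T:

```latex
% Formal gravitational path integral over geometries g, and its CDT
% regularization as a sum over causal triangulations T, where C(T) is
% the order of the symmetry group of T and S_Regge is the discretized
% gravitational action.  Schematic notation, not copied from the paper.
Z \;=\; \int \mathcal{D}[g]\; e^{\,i S_{\mathrm{EH}}[g]}
\;\;\longrightarrow\;\;
Z_{\mathrm{CDT}} \;=\; \sum_{T} \frac{1}{C(T)}\; e^{-S_{\mathrm{Regge}}(T)}
```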

And they want to study various properties like dimension. So they want to find expectation values, essentially, but the set of all paths is a BIG SET. So it is not practical to do the whole integral (over the range of all spacetimes, all evolutions from A to B or beginning to end). So what they are doing with their Monte Carlo is this:

they found a clever way to pick random spacetimes that are paths of geometry from beginning to end. So they pick many, many of them, a large random sample.

they evaluate the function they want to study for each spacetime in the random sample and they AVERAGE UP, and that is using the Monty method to evaluate the "path integral"
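
In practice one cannot pick triangulated spacetimes uniformly at random; simulations like AJL's use Markov chain Monte Carlo, proposing local changes to the configuration and accepting or rejecting them. Here is a toy Metropolis sketch of that kind of ensemble averaging; the one-variable "configuration" and quadratic "action" are stand-ins I made up, nothing like an actual triangulation:

```python
# Toy Metropolis sampler: estimate the ensemble average of an observable
# over a configuration space weighted by exp(-action).  The integer
# "configuration" and quadratic "action" are made-up stand-ins for the
# triangulated geometries and Regge action an actual CDT code uses.
import random
import math

def action(x):
    return 0.5 * x * x  # made-up quadratic action

def metropolis_average(observable, n_steps=200_000):
    x, total = 0, 0.0
    for _ in range(n_steps):
        proposal = x + random.choice((-1, 1))  # local update move
        # Metropolis rule: accept with probability min(1, exp(-delta S))
        if random.random() < math.exp(action(x) - action(proposal)):
            x = proposal
        total += observable(x)
    return total / n_steps

print(metropolis_average(lambda x: x * x))  # roughly 1.0 for this toy action
```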

for now, the functions they are evaluating at sample points are very basic, like "overall spacetime Hausdorff dimension" or "spatial slice dimension" or "small-scale diffusion dimension" (in the spacetime, or in a spatial slice, or in a "thick slice"). They have a lot of ways to measure dimension, and they are studying the general business of dimensionality.
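
The diffusion dimension mentioned above is what the papers call the spectral dimension: run a random walk and watch how the probability of returning to the starting point falls off, P(t) ~ t^(-d_s/2). A minimal illustration on an ordinary flat 2D lattice, where the answer should come out near 2 (the lattice and walk lengths are my choices, not AJL's):

```python
# Estimate the diffusion (spectral) dimension of a flat 2D lattice from
# the return probability of a random walk: P(t) ~ t^(-d_s / 2), so
# d_s = -2 * (d log P / d log t).  An actual CDT code runs the same kind
# of walk on the simplicial geometry instead of a flat grid.
import random
import math

MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def return_probability(t, n_walks=200_000):
    """Fraction of t-step walks from the origin that end at the origin."""
    returns = 0
    for _ in range(n_walks):
        x = y = 0
        for _ in range(t):
            dx, dy = random.choice(MOVES)
            x, y = x + dx, y + dy
        returns += (x == 0 and y == 0)
    return returns / n_walks

# Use even step counts: on this lattice the walk can only return on even steps.
t1, t2 = 20, 40
p1, p2 = return_probability(t1), return_probability(t2)
d_s = -2.0 * (math.log(p2) - math.log(p1)) / (math.log(t2) - math.log(t1))
print(d_s)  # should come out near 2 for a flat 2D lattice
```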

but the functions could be more sophisticated, like "number of black holes" or "density of dark energy" or "abundance of lithium" (maybe? I really can't guess, I only know that this is probably only the beginning).
With the Monty path integral method it should be possible to evaluate many kinds of interesting functions (defined on the ensemble of spacetimes).

these are early days and they are studying dimensionality, but they can study a lot of other aspects of the world this way, and I expect this to be done. They say they are now working on putting matter into the picture.

they are going to need more computer time.

the present spacetimes are ridiculously small (on the order of a million blocks) and short-lived.
Have you had a look at a computer-generated picture of a typical one of their spacetimes? If so, you know what I mean.
 
  • #104
Mike2
marcus said:
Chronos, thanks for chiming in. I expect the term has different associations for different people. For some, "Monte Carlo method" means a way of evaluating an integral over a multidimensional space, or more generally a way of evaluating the integral of some function which is defined over a very LARGE set, so that it would be expensive in computer time to do ordinary numerical integration.
I had to do a numerical calculation in a graduate course I took years ago to see the difference between the Monte Carlo method and some other traditional algorithms of numerical integration. What I learned was that most of the other numerical integration schemes rely on predictable algorithms that make some integrals impossible to evaluate. They blow up to infinity. Or there is always a great difference when you increase the resolution; they don't converge. It seems the algorithm used in traditional methods to divide the interval of integration into subdivisions itself actually contributes to the pathological nature of that numerical method. But the Monte Carlo method introduces a measure of randomness in the algorithm to help avoid any pathologies introduced by more predictable algorithms. Monte Carlo still equally divides the interval of integration, but picks at random where in each interval to evaluate the integrand.

I suspect that it is now commonplace to evaluate integrals in physics using Monte Carlo just to avoid even the possibility of other methods being pathological. Maybe someone else could confirm or deny this suspicion of mine.
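
What Mike2 describes in his first paragraph is stratified sampling: divide the interval into equal subintervals and evaluate the integrand at one random point inside each, which keeps the samples spread out and reduces variance compared with plain uniform sampling. A small sketch of the idea (my own illustrative code, not from his course):

```python
# Stratified sampling as described above: divide [lo, hi] into equal
# subintervals and evaluate the integrand at one random point inside
# each, instead of at the fixed nodes a midpoint or trapezoid rule uses.
import random
import math

def stratified_mc(f, lo, hi, n_strata=1000):
    width = (hi - lo) / n_strata
    total = 0.0
    for i in range(n_strata):
        left = lo + i * width
        total += f(random.uniform(left, left + width))  # random point in stratum i
    return width * total

# Same illustrative check as before: integral of sin(x) over [0, pi] is 2.
print(stratified_mc(math.sin, 0.0, math.pi))  # ~2.0, lower variance than plain MC
```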
 
  • #105
marcus
Mike2 said:
I suspect that it is now commonplace to evaluate integrals in physics using Monte Carlo just to avoid even the possibility of other methods being pathological. Maybe someone else could confirm or deny this suspicion of mine.
Just occurred to me what Monty Carlo means:
it is Carlo Rovelli in his birthday suit.
 
  • #106
marcus
I was just now replying to a post of selfAdjoint in the "1-to-10" thread and the thought occurred to me that a lot of people may not have realized that getting a new model of the continuum may turn out to be THE FASTEST WAY TO A TOE.

Causal Dynamical Triangulations has a limited goal of merely arriving at a quantum model of spacetime that reproduces Gen Rel at large scale.
(but is based on quantum spacetime dynamics at microscopic scale)

Once people have a new continuum, and start working on it, and building theories of matter on that instead of Minkowski space, then it can be argued that the new features of the continuum are likely to inspire and enable new matter physics.
 
  • #107
marcus
The CDT quantum continuum is still very new and the preliminary results on it are still coming in, so it involves guesswork to look ahead.

Suppose people start to reconstruct the Standard Model on the CDT spacetime foundations instead of on static flat txyz Minkowski space, or a smooth-metric manifold.

The CDT continuum has extremely non-classical geometry at small scale, but classical geometry at large scale. Or so it appears (it is so early that it is hard to be sure).

what new ideas about matter are going to be inspired on the way to putting matter fields into the "triangulations" picture, and what new mathematics will be enabled?

we should keep clearly in mind that CDT is not a LATTICE approach, where you take some tame classical geometry (mostly flat) and cover it with a grid.
Triangulations has the blocks assembled every which way, in arrangements you could not embed into a conventional flat txyz space, and this results in extremely wild non-classical geometries.

so what happens when people start painting matter into the triangulations?
I suspect it would be foolish to try to put preconceptions on the outcome---they would just blind us.

And BTW let's keep in mind the general concept of a "nonperturbative quantum gravity" theory: something that gives you a new quantum picture of spacetime, with Gen Rel at the large scale.
CDT is one possible "nonperturbative quantum gravity".
It happens to be one where they have reached the point of computer
simulations of the new model spacetime, and where they are getting interesting results.

But if there were a broad effort focused on getting this kind of thing, there could be other "nonperturbative quantum gravity" approaches also running computer models of spacetime and getting interesting results about dimensionality and big bang cosmology and so forth.

Spin foams and loop cosmology do some of that, though CDT seems to have moved ahead of the pack, at least for now.

it just happens that this thread topic is CDT (small and fast-developing compared with LQG)
 
  • #108
marcus
Looks like in this thread I never gave the abstract for the new CDT paper
"Reconstructing the Universe". It is good to examine the abstract of a Loll paper because she carefully articulates what she is doing, and it can convey some perspective.

http://arxiv.org/abs/hep-th/0505154
Reconstructing the Universe
J. Ambjorn (NBI Copenhagen and U. Utrecht), J. Jurkiewicz (U. Krakow), R. Loll (U. Utrecht)
52 pages, 20 postscript figures

"We provide detailed evidence for the claim that nonperturbative quantum gravity, defined through state sums of causal triangulated geometries, possesses a large-scale limit in which the dimension of spacetime is four and the dynamics of the volume of the universe behaves semiclassically. This is a first step in reconstructing the universe from a dynamical principle at the Planck scale, and at the same time provides a nontrivial consistency check of the method of causal dynamical triangulations..."

To me this suggests that NONPERTURBATIVE QUANTUM GRAVITY, regardless of whether you do it with triangles, has these features.
There may be various ways to do this, and arrive at quantum models of the continuum. And they may CONVERGE on a picture with

1. 4D and Gen Rel in the large-scale limit

2. Semiclassical (Hartle-Hawking) wavefunction of the scale factor, or size of the universe

3. some picture of quantum spacetime dynamics at very small scale, which cumulatively and collectively generates the expected large-scale behavior

The abstract does not start off by saying "CDT"; it comes to that only later.
So it is putting forward a rather bold claim.
It says the authors have evidence that IT DOESN'T MATTER WHETHER YOU USE OUR EXACT METHOD OR NOT, we have found out something about spacetime,
and if you do some OTHER method of nonpert. QG and get a quantum spacetime dynamics that reproduces Gen Rel at large scale, then quite possibly you will get similar results, because THAT IS HOW SPACETIME IS.

this is a bold claim, and they don't present it as a certainty, but as something that they offer "detailed evidence" for.

And in fact the paper in question is full of detailed evidence.

so it is not saying "our CDT is the one and only approach, you all have to change to our method"; it is saying that however you do the approximation, whether or not with triangles and path integral, or whatever, if you can open a quantum spacetime dynamics window on the small scale that reproduces Gen Rel spacetime at the large, then you will see similar things!

So please try a slew of other approaches! We will see you at the finish line.

It's confident, and at the same time seems to have a clear, modest reasonableness.

Well, I started this CDT thread well before "Reconstructing" appeared; it appeared right around post #67 of this thread. I quoted some of the first paragraph in post #67. But I had never yet quoted from the abstract in this thread, so it was high time to listen to it.
 
  • #109
marcus
Let's remove references to any specific method (in the above quote) and see what Loll's overall program might be:

http://arxiv.org/abs/hep-th/0505154
Reconstructing the Universe

"We provide detailed evidence for the claim that nonperturbative quantum gravity... possesses a large-scale limit in which the dimension of spacetime is four and the dynamics of the volume of the universe behaves semiclassically. This is a first step in reconstructing the universe from a dynamical principle at the Planck scale,..."

So the program can be called "quantum spacetime dynamics" in that you start with a dynamical principle at very small scale, and all of spacetime is supposed to GROW from the action of that principle.

You don't make global assumptions----like it being a smooth manifold of some fixed chosen dimension with some number of coordinates-----you only specify how something works at the subsubatomic level. The whole shebang is supposed to HATCH from that seed.

and what hatches has to look and act right on the large scale

that is her program,

and IT SHOULD NOT DEPEND ON THE PARTICULAR METHOD:
anything that deserves to be called "nonperturb. quantum gravity" should POSSESS A LARGE-SCALE LIMIT---should be able to put in place some microscopic dynamical principle and have a familiar 4D spacetime grow from it. It should be able to, because they did this in one example of a "nonperturb. quantum gravity", and surely they do not have a patent on spacetime!

So now I think the bar has been raised. It should be possible to model the spacetime continuum with many different methods of nonpert. QG and get this kind of result. Because spacetime is what it is---different ways of looking at it should converge.

This is perhaps rather radical to say, and might be wrong, but it says the map of QG is now changed and the game is redefined, with a new price of admission. The candidate methods can show who they are by reproducing 4D and the semiclassical cosmology result that was mentioned.
Or something like that. I am still not certain how exactly things have changed, but I do believe we have a new game.
 
  • #110
marcus
http://arxiv.org/abs/hep-th/0505154
Reconstructing the Universe

"We provide detailed evidence for the claim that nonperturbative quantum gravity... possesses a large-scale limit in which the dimension of spacetime is four and the dynamics of the volume of the universe behaves semiclassically. This is a first step in reconstructing the universe from a dynamical principle at the Planck scale,..."

So the program can be called "quantum spacetime dynamics" in that you start with a dynamical principle at very small scale, and all of spacetime is supposed to GROW from the action of that principle.
...
just understanding the terms in which the major players see what they are doing can be a project in itself

what does the overall goal of nonperturbative quantum gravity mean to the people who are driving toward it along these various approaches like CDT?

Fotini Markopoulou gave a short definition in a recent paper she did with Mohammad Ansari

<<The failure of perturbative approaches to quantum gravity has motivated theorists to study non-perturbative quantization of gravity. These seek a consistent quantum dynamics on the set of all Lorentzian spacetime geometries. One such approach which has led to very interesting results is the causal dynamical triangulation (CDT) approach[1, 2]. In the interest of understanding why this approach leads to non-trivial results, in this paper we study...>>

this is from the introduction of
http://arxiv.org/abs/hep-th/0505165
A statistical formalism of Causal Dynamical Triangulations
Mohammad H. Ansari, Fotini Markopoulou
20 pages, 19 pictures, 1 graph
Abstract:"We rewrite the 1+1 Causal Dynamical Triangulations model as a spin system and thus provide a new method of solution of the model."
 
