Likelihood of M-theory: 1-10 Scale

  • Thread starter: Mwyn
  • Tags: Scale
  • #31
kneemo said:
Hi Marcus

Let us return to the path integral in eq. (1) of hep-th/0105267. The path integral is rewritten as a discrete sum over inequivalent triangulations T. A basic question is: how does one acquire just one of the many inequivalent triangulations T? And given a specific triangulation T_0, what action is performed to acquire a new triangulation T_1?

Hello Mike,
I believe you indicated you were working towards your Masters at Cal State LA? Do I have that right? If so, you can either take an exam or write a thesis---you don't have to do both---but please correct me if I'm wrong. How are things going?

It sounds like you have read at least the first page of hep-th/0105267. This is wonderful! I am delighted and urge you seriously to read more.

The answer to your question is in section 7, page 20. There are some interesting details on pages 23-25.

The presentation is very clear and concise.
Frankly, I could not hope to do better, so I suggest you read their pages 20 and 23-25 rather than my trying to paraphrase.
You will be pleased to see that they describe rather concretely and explicitly what action is performed by the computer program---or, to put it in your words, they answer the question
"given a specific triangulation T_0, what action is performed to acquire a new triangulation T_1".
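The moves between triangulations are local updates accepted or rejected Metropolis-style in the Monte Carlo simulations. Purely as a loose illustration (this is not AJL's algorithm), here is a toy Metropolis chain in which the whole "triangulation" is caricatured by a single simplex count n4 and the action by a linear volume term; the function names and the action are my own invention:

```python
import math
import random

def metropolis_step(n4, kappa4, rng):
    """One Metropolis update on a toy 'triangulation' state.

    In CDT the state would be a full causal triangulation and the moves
    the local ones of hep-th/0105267, section 7; here (purely for
    illustration) the state is caricatured by a single number n4, the
    count of 4-simplices, and the action by a volume term S = kappa4*n4.
    """
    proposal = n4 + rng.choice([-1, +1])      # a move adds or removes a simplex
    if proposal < 1:                          # never delete the last simplex
        return n4
    delta_s = kappa4 * (proposal - n4)        # change in the (Euclidean) action
    if rng.random() < min(1.0, math.exp(-delta_s)):
        return proposal                       # accept the new triangulation
    return n4                                 # reject: keep the old one

rng = random.Random(0)
n4, history = 10, []
for _ in range(20000):
    n4 = metropolis_step(n4, kappa4=0.5, rng=rng)
    history.append(n4)
mean_n4 = sum(history) / len(history)         # equilibrium average volume
```

The real moves preserve the simplicial-manifold and causality conditions, which is where all the hard work in section 7 lies.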
 
  • #32
Hi Marcus, hi kneemo

Well, I had guests also. Just back. Up to page 4 now:

They put causality in by hand? Why these types of simplices? Why the requirement that the resultant spacetime be a simplicial manifold? Is that because they are only considering the classical limit? If so, I'm OK with that point. By the way, 'classical limit' in this context should mean (IMHO) standard quantum logic plus classical manifold spacetime. But this does not appear to be what AJL mean. They appear to be discussing what they believe to be an approximation of a full analytical approach to QG.

Weights from the Einstein action ... seems reasonable but, once again, I don't see their justification for this use of naive quantum principles.

Hope you can clarify some of these points for us, Marcus.
Kea
 
  • #33
By the way...

Marcus,

Obviously I have not stressed the following enough in my ravings about category-theoretic logic.

1. Small scale = high 'particle' number = omega-categorical, implies dual 2D structure (although admittedly the details are still being worked out)

2. Large scale = minimal interaction = 2:2 qubit tetracat logic (which we also expect to mean 4D)

No manifolds put in by hand. No fixed dimension.

All the best
Kea :smile:
 
  • #34
Marcus, will you allow me to skip the numerics for now?

On page 37 they state "all the geometric properties of the spatial slices
measured so far can be modeled by a particular kind of branched polymers..."

By 'branched polymer' they mean what kneemo and I would call a 'rooted tree'. These beasts appear en masse in NCG. Recall that on PF we have discussed Connes, Marcolli, Kreimer, and the new rigour behind the standard model---and its connection to NCG. A nice random reference:

H. Figueroa, J.M. Gracia-Bondia
On the antipode of Kreimer's Hopf algebra
http://arxiv.org/PS_cache/hep-th/pdf/9912/9912170.pdf
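For readers who want to play with these beasts, here is a minimal sketch (my own toy construction, not taken from the paper above) of growing a random rooted tree, the combinatorial object behind the 'branched polymer' phase:

```python
import random

def random_branched_polymer(n_nodes, rng):
    """Grow a rooted tree ('branched polymer') one node at a time.

    Each new node attaches to a uniformly chosen earlier node -- a
    standard toy ensemble, not the specific Hopf-algebraic trees of
    the Connes-Kreimer construction.
    """
    parent = {0: None}                   # node 0 is the root
    for v in range(1, n_nodes):
        parent[v] = rng.randrange(v)     # attach to some earlier node
    return parent

def depth(parent, v):
    """Graph distance from node v back to the root."""
    d = 0
    while parent[v] is not None:
        v = parent[v]
        d += 1
    return d

rng = random.Random(1)
tree = random_branched_polymer(1000, rng)
n_edges = sum(1 for v in tree if tree[v] is not None)   # a tree has nodes - 1 edges
```

The generic branched-polymer phase seen in Euclidean dynamical triangulations has intrinsic Hausdorff dimension 2, which is why AJL cite it when characterizing the spatial slices.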

Still reading...
 
  • #35
The geometry of dynamical triangulations pg. 12 said:
Dynamical triangulations are a variant of Regge calculus in the sense that in this formulation the summation over the length of the links is replaced by a direct summation over abstract triangulations where the length of the links is fixed to a given value a. In this way the elementary simplices of the triangulation provide a Diff-invariant cut-off and each triangulation is a representative of a whole equivalence class of metrics

We see that dynamical triangulations have a fixed link length 'a'. Now, whether this assumption is valid will depend on the method by which we generate an elementary simplex.

Using NCG, we can attempt to generate an elementary simplex as a quiver (or pseudograph). Quivers arise in categorical approaches to D-branes and deconstruction (hep-th/0110146, hep-th/0502105), and have been discussed by Aaron Bergman and Urs Schreiber in sci.physics.strings:
Aaron Bergman said:
In article
<Pine.LNX.4.31.0503091443440.19481-...man.harvard.edu>,
Urs Schreiber <Urs.Schreiber@uni-essen.de> wrote:

> Lubos' blog entry on deconstruction made me have a closer look at this
> stuff, which I should have had long before.
>
> If I understand correctly a quiver can equivalently be defined as a
> functor from some graph category to Vect. Given some graph, it associates
> finite vector spaces to vertices and linear operators between these to
> directed edges.

That's not the definition of a quiver. A quiver is just a directed
graph. A quiver representation is a vector space at each node and a map
for each arrow. The graph should define a category and you could look at
the functors from this category to k-Vect.

Equivalently, you can form the quiver algebra. Let the arrows be denoted a_j and, to each node j, have an idempotent

e^2_j=e_j

You have two functions source and target which take these to nodes. For
the idempotents, both the source and the target maps take the idempotent
to its respective node. The source and target map for the arrows is
obvious. Then, take the free algebra on this set subject to the relation
that a product ab is nonzero iff t(b)=s(a).
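Bergman's multiplication rule is easy to mechanize. Here is a small Python sketch of the basis paths of a quiver (path) algebra---my own illustrative encoding, with `Path` and `idempotent` invented names---showing e_j^2 = e_j and the rule that ab vanishes unless t(b) = s(a):

```python
class Path:
    """A basis path in the path (quiver) algebra of a directed graph."""
    def __init__(self, source, target, arrows=()):
        self.source, self.target, self.arrows = source, target, tuple(arrows)

    def __mul__(self, other):
        # Bergman's rule: the product ab is nonzero iff t(b) = s(a),
        # i.e. paths compose like functions.  None stands for zero.
        if other.target != self.source:
            return None
        return Path(other.source, self.target, other.arrows + self.arrows)

    def __eq__(self, other):
        return (isinstance(other, Path) and self.source == other.source
                and self.target == other.target and self.arrows == other.arrows)

def idempotent(node):
    """The trivial path e_j at node j; it satisfies e_j * e_j = e_j."""
    return Path(node, node)

# A two-node quiver with a single arrow a : 0 -> 1.
a = Path(0, 1, arrows=("a",))
e0, e1 = idempotent(0), idempotent(1)
```

A quiver representation would then assign a vector space to each node and a linear map to each arrow, exactly as Bergman says.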

Aaron Bergman's quiver algebra description can be realized in projective space. This means our quiver simplex can eventually be represented on a fuzzy sphere (hep-th/0503039), which is an NCG construction.

There is more to say, but alas, I must sleep. :zzz:
 
  • #36
Mwyn said:
ok, on a scale from one to ten, exactly how likely is it for M-theory to be true?

There is no zero choice? Because that is what its predictive powers currently are. Maybe one day that will change, but by then you can be assured it will be a different animal than it is today, and will deserve a different name.
 
  • #37
kneemo said:
We see that dynamical triangulations have a fixed link length 'a'. Now, whether this assumption is valid will depend on the method by which we generate an elementary simplex.

Using NCG, we can attempt to generate an elementary simplex as a quiver (or pseudograph)...

The CDT approach to quantizing gravity has no fixed link length 'a'. One particular triangulation will have a length 'a', which is the size of a spacelike tetrahedron. Then one lets 'a' go to zero.

The spacetime of CDT is defined by taking the limit (as 'a' goes to zero) of finer and finer triangulated spaces using smaller and smaller simplexes.

The spacetime of CDT is not made of simplexes; the simplexes used in the approximations are, I guess I would say, a mathematical convenience

(as, in Freshman Calculus, "step functions" might be used in defining the integral, but ultimately the integral is not made of little skinny rectangles---the step functions are a convenience used along the way)
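The step-function analogy can be made concrete: each approximant is piecewise constant, yet the limit they define is not "made of" rectangles. A quick check on ∫₀¹ x² dx = 1/3 (my own toy, using the midpoint rule):

```python
def riemann_sum(f, a, b, n):
    """Approximate the integral of f over [a, b] by n step functions
    (midpoint rule): the approximants are piecewise constant, yet the
    limit they define is not 'made of' rectangles."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) * h for i in range(n))

# The integral of x^2 over [0, 1] is exactly 1/3; watch the steps converge.
approx = [riemann_sum(lambda x: x * x, 0.0, 1.0, n) for n in (10, 100, 1000)]
```

As n grows the error shrinks, just as the CDT approximations improve as 'a' goes toward zero.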

In some CDT papers, other simple geometrical objects are used besides simplexes.

The simplex is a very old mathematical object; it does not need noncommutative geometry to define or validate it.

Thanks for trying to show some fundamental overlap between NCG and CDT!
I still have hope that Kea will come up with an essential connection between the two----which would make NCG, in my view, considerably more promising as a possible way to describe gravity!

You too, Mike. Keep trying if you want. It was your notion that the two were connected (or so I interpreted something you said) that I originally asked you, and later Kea when she appeared to concur, to substantiate.
 
  • #38
From page 14 of Reconstructing:

"We will currently concentrate on the purely geometric observables, leaving the coupling to test particles and matter fields to a later investigation..."

Marcus, I'm afraid you are going to have to do some very smooth talking to convince the likes of kneemo and me that there is any such thing as gravity without matter.

Kea :smile:
 
  • #39
kneemo,

I think our job here is to really convince Marcus that we're right, because if we can do that, if Marcus agrees with us, a whole lot more people will make an effort to understand NCG...and that's what counts.
 
  • #40
Regardless of whether Marcus agrees, I sure would like to know what it all means, minus the geek speak and number crunching :bugeye:

Is plain fool's English for plain-English-speaking fools like me too much to ask, without making too much of an effort? o:)
 
  • #41
marcus said:
The CDT approach to quantizing gravity has no fixed link length 'a'. One particular triangulation will have a length 'a', which is the size of a spacelike tetrahedron. Then one lets 'a' go to zero.

Hi Marcus

By using NCG, one need not let the link length 'a' go to zero. Read pgs. 2-3 of J. Madore's gr-qc/9906059 for a simple example of how lattices become fuzzy in NCG.
 
  • #42
spicerack said:
Regardless of whether Marcus agrees, I sure would like to know what it all means, minus the geek speak and number crunching :bugeye:

Is plain fool's English for plain-English-speaking fools like me too much to ask, without making too much of an effort? o:)

Hi all

Kneemo, this has been my quest, too. However, I have tried to learn to speak geek and to crunch numbers, because that is the language spoken here.

One problem I have encountered trying to translate geekspeak is that geeks are now trying to investigate spacetime relationships that are fundamental but not obvious to daily experience. Our language (English, anyway) was developed to deal with daily experience. As a result, we have many enforced thought habits which do not serve us well when dealing with quantum spacetime.

Mathematics is descriptive of, but not limited to, our daily experience. So it is actually easier to talk about these things using math rather than English. But math is indeed another language, and the alphabet in that language is huge, the vocabulary immense. Even Chinese looks like wooden building blocks compared to the advanced architecture of math.

Don't give up. Keep trying to read the physics and the math. I keep reading even when the words become gibberish. Somehow things percolate in the subconscious, and even though you did not understand a word of it yesterday, today it seems to make a little sense, and tomorrow it may even appear reasonable.

Be well,

nc
 
  • #43
kneemo said:
Hi Marcus

By using NCG, one need not let the link length 'a' go to zero. Read pgs. 2-3 of J. Madore's gr-qc/9906059 for a simple example of how lattices become fuzzy in NCG.

In some versions of NCG (as far as I know, at least where applied to gravity), one is PREVENTED from making length parameters smaller than a certain amount by a minimal length barrier.

One of the interesting things about CDT, and something that makes it different from several other approaches, is that it HAS NO MINIMAL LENGTH.

At least until now, no minimal length has been found in CDT. Here is a recent statement to that effect from hep-th/0505113, page 2:

"in quantum cosmology. We have recently begun an analysis of the microscopic properties of these quantum spacetimes. As in previous work, their geometry can be probed in a rather direct manner through Monte Carlo simulations and measurements. At small scales, it exhibits neither fundamental discreteness nor indication of a minimal length scale."

This may point to a theoretical divide between CDT and NCG! For instance, as you can see from the first 5 pages of the Madore article you cited, the versions of NCG he discusses have minimal lengths.

Here is a sample from page 5 of the article you cited:

"... Such models necessarily have a minimal length associated to them and quantum field theory on them is necessarily finite [90, 92, 94, 24]. In general this minimal length is usually considered to be in some ..."
 
  • #44
marcus said:
One of the interesting things about CDT, and something that makes it different from several other approaches, is that it HAS NO MINIMAL LENGTH.

That is, the simulations don't find any minimal length.

But the common 'a' of all links, which then goes to zero (though only TOWARD zero in the simulations!), makes it all look more and more like what the lattice physicists do. Since the triangulation is only built to subsequently go away in the continuum limit, how is this fundamental?
 
  • #45
Nightcleaner and Spicerack, this discussion of "geekspeak" has me chuckling.

I can imagine that to Spicerack's ears it sounds pretty esoteric and technical to be saying that two pictures of spacetime are incompatible because one theory gives rise to a minimal length or a notion of fundamental spacetime discreteness (which I am not sure is quite the same thing, although related),

and the other theory does NOT give birth to a minimal length----a barrier below which length is meaningless----or to a discreteness idea.

We are not in some primitive discussion like "UGH, DIS IS GOOD! UGH DIS IS BAD!" We are trying, I hope, to sort out various models of spacetime and see whether and how they connect to each other.

So at this moment I am looking at two called CDT and NCG (which to me looks like a large family or tribe of theories really, not a single unique one like CDT).

And I am looking at CDT and the NCG tribe----both are interesting and show some promise----and trying to distinguish some significant details that can let me see objectively what possible overlap there is.

So of course it is going to sound technical.

If you are mainly interested in having your imagination INSPIRED by stimulating talk about different theories, or if you are looking for something to BELIEVE in, then almost certainly this kind of technical examination of details would not interest you one bit!

However it is the details about CDT that have made it suddenly change the map of QG.
CDT does not give rise to a minimal length, does not exhibit fundamental discreteness, and it appears to be MORE BACKGROUND INDEPENDENT than Loop Gravity. CDT is not built on a pre-established differentiable manifold continuum with a pre-established dimensionality and coordinate functions.
It changes the map because it makes radical departures. It is based on ROUGHER AND LESS PREDETERMINED objects or foundations.

This is not to make a value judgement like "UGH, DIS GOOD!" Indeed maybe it is bad. Who cares? What matters is not what you think is good or bad, or what you want to believe in or not believe in, or what makes appealing mental images in one's head. What matters right now is that suddenly something new is on the table.

Another thing with CDT is that you can run computer simulations and generate universes "experimentally", study them, and find out things (about the dimension, say, or the effects of the dark energy Lambda term, or whatever). You can find out things that you didn't anticipate! The CDT authors have been experiencing this. It was something of a surprise to them when, last year, they got a spacetime with large-scale 4D dimensionality for the first time. Must have been great to see that coming out of the computer the first time.

Anyway, it is somewhat unusual that CDT has ample numerical opportunities; a lot of theories are so abstract that you cannot calculate with them. They are not very "hands on". CDT is very hands on and constructive: the computer builds spacetimes for you and you get to study them.

The objective sign of the "change in the map" that I am seeing is the change in the programme topics between the May 2004 Loop conference (in Marseille) and the October 2005 Loop conference (in Potsdam).

I sympathize with Spicerack's puzzlement, but I am not sure "geekspeak" is the real problem. The real problem may be that there is no compelling reason for her to be learning about CDT, because it may not offer the imagistic stimulation or the verbal excitement of something like Brian Greene-style string theory. It is kind of Plain Jane Spacetime, modeled with the most unpretentious possible tools, with the least prior assumptions, with little by way of grand shocking discoveries like "eleven dimensions with the extra dimensions rolled up" and "fundamental discreteness and minimal length" and "colliding brane-worlds" and such.
 
  • #46
selfAdjoint said:
Since the triangulation is only built to subsequently go away in the continuum limit, how is this fundamental?

I don't know that the particular type of triangulation is fundamental. Did I say the triangulation was fundamental? As I pointed out several times, Renate Loll has used other shapes besides simplexes in some papers. Simplexes are simple, though, so there is probably no reason not to use the well-established theory of simplicial manifolds.

I remember in grad school in the late 1960s we got lectured about piecewise linear (PL) manifolds. There was a guy who believed in studying PL manifolds rather than differentiable manifolds. At the time I did not see why, but maybe I see now. I did not guess that actual real-world spacetime might be better approximable using a quantum theory of PL manifolds instead of differentiable ones. CDT is based on PL geometry instead of differential geometry.

"Fundamental" is something of a slippery term. I want to communicate what I think is fundamentally different about the CDT approach. The image is how a Feynman path is the limit of PIECEWISE STRAIGHT segments. And a CDT spacetime is the limit of piecewise flat, or PL, or piecewise Minkowski, chunks.

Maybe Feynman would have been wrong if he had tried to approximate his path by smooth, infinitely differentiable paths. Maybe we are wrong now if we try to approximate our spacetime with smooth differentiable manifolds. Maybe we should be approximating with PL manifolds, like they do in CDT.

But that is just a mental image. Let me try to list some ways CDT is DIFFERENT.

It is not based on a differentiable manifold (LQG and some others are)

It is not based on something using coordinates----curvature in CDT is found combinatorially, by counting

It does not automatically reflect a prior choice of dimension. The dimension emerges or arises from the model at run-time---it is dynamic and variable. Again, the dimension is something you find combinatorially, essentially by counting. (This feature is absent in some other quantum theories of gravity. One might hope that whatever is the final QG theory will explain why the universe looks 4D at large scale, and this CDT feature is a step in that direction.)

CDT has a Hamiltonian, a transfer matrix (see e.g. the "Dynamically..." paper), and one can calculate with it. The CDT path integral is a rather close analog of the Feynman path integral for a nonrelativistic particle using piecewise straight paths. The simplexes are the analogs of the straight pieces. By contrast, there are other QG theories with which you cannot calculate much.

CDT is fundamentally different from some other simplicial QGs because of the causal layering. (The authors explain how this leads to a well-defined Wick rotation, which they say is essential to their computer simulations.)
This layering actually has several important consequences, AJL say.

Well, I can't give a complete list, only a tentative and partial one. Maybe you will add to or refine this.
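The "curvature by counting" point can be illustrated in two dimensions, where all the curvature of a piecewise-flat surface sits at the vertices as deficit angles (a standard Regge-calculus fact; the code itself is my own sketch):

```python
import math

def deficit_angle(n_triangles):
    """Curvature concentrated at a vertex of an equilateral 2d
    triangulation: 2*pi minus the angles of the triangles meeting
    there.  Six triangles is flat, fewer is positively curved, more
    is negatively curved -- curvature found purely by counting.
    """
    return 2 * math.pi - n_triangles * (math.pi / 3)

# Gauss-Bonnet check on the boundary of a tetrahedron: four vertices,
# three triangles meeting at each, and the total deficit is 2*pi*chi.
total_deficit = sum(deficit_angle(3) for _ in range(4))
euler_characteristic = total_deficit / (2 * math.pi)   # the sphere has chi = 2
```

No coordinates appear anywhere: the only input is how many simplexes meet at a hinge, which is the spirit of the CDT bullet above.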
 
  • #47
marcus said:
I don't know that the particular type of triangulation is fundamental. Did I say the triangulation was fundamental? As I pointed out several times, Renate Loll has used other shapes besides simplexes in some papers. Simplexes are simple, though, so there is probably no reason not to use the well-established theory of simplicial manifolds.

I wasn't talking about the detailed technology of the triangulation, but about the whole project of doing a triangulation, doing nonperturbative physics on it (if only via simulations), and then letting the scale go to zero to recover the continuum. That's the QCD lattice strategy, and it seems to be Ambjorn et al's strategy too.

I remember in grad school in the late 1960s we got lectured about piecewise linear ( PL) manifolds. there was a guy who believed in studying PL manifolds rather than differentiable manifolds. At the time I did not see why, but maybe I see now. I did not guess that actual realworld spacetime might be better approximable using a quantum theory of PL manifolds instead of differentiable ones. CDT is based on PL geometry instead of Differential Geometry.

Somebody mentioned the finite element method in engineering. That's a valid reference too. To me PL manifolds seem a kludge---neither honest polyhedra nor honest manifolds. Do we have any important topological results from them that couldn't be obtained a step up or a step down the generality ladder?

"Fundamental" is something of a slippery term. I want to communicate what I think is fundamentally different about the CDT approach. The image is how a Feynman path is the limit of PIECEWISE STRAIGHT segments. And a CDT spacetime is the limit of piecewise flat, or PL, or piecewise Minkowski, chunks.

I am sure you know Feynman's pretty little piecewise-limiting picture is problematic in the Minkowski context. Does the phrase Wick rotation ring a bell? How about paracompact?

Maybe Feynman would have been wrong if he had tried to approximate his path by smooth, infinitely differentiable paths. Maybe we are wrong now if we try to approximate our spacetime with smooth differentiable manifolds. Maybe we should be approximating with PL manifolds, like they do in CDT.

Maybe so. Certainly it's a valid research program. But you seem to be defending it the way Lubos used to defend string theory: as the One True Way. Neither LQG nor string theory, to name just two, is truly down for the count, and Kea's higher categories may come from behind to conquer all, or something entirely different may happen. Let us keep our options open.

But that is just a mental image. Let me try to list some ways CDT is DIFFERENT.

It is not based on a differentiable manifold (LQG and some others are)

It is not based on something using coordinates----curvature in CDT is found combinatorially, by counting

It does not automatically reflect a prior choice of dimension. The dimension emerges or arises from the model at run-time---it is dynamic and variable. Again, the dimension is something you find combinatorially, essentially by counting. (This feature is absent in some other quantum theories of gravity. One might hope that whatever is the final QG theory will explain why the universe looks 4D at large scale, and this CDT feature is a step in that direction.)

The dimension aspect was certainly the strongest aspect of it last year. It remains to be seen whether the running dimension of this year strengthens their case or weakens it.

CDT has a Hamiltonian, a transfer matrix (see e.g. the "Dynamically..." paper), and one can calculate with it. The CDT path integral is a rather close analog of the Feynman path integral for a nonrelativistic particle using piecewise straight paths. The simplexes are the analogs of the straight pieces. By contrast, there are other QG theories with which you cannot calculate much.

Correct me if I'm wrong, but the Hamiltonian only subsists at the a > 0 level; it does not carry through in the limit. Or have they somehow discovered how to generate a nonconstant Hamiltonian in GR?

CDT is fundamentally different from some other simplicial QGs because of the causal layering. (The authors explain how this leads to a well-defined Wick rotation, which they say is essential to their computer simulations.)
This layering actually has several important consequences, AJL say.

Some have expressed a suspicion that they built pseudo-Riemannian in with their "causal" specification. Lubos used to say their path integrations were unsound because they refused to include acausal paths, which must be done (he said) if you want to generate valid physics.

Well, I can't give a complete list, only a tentative and partial one. Maybe you will add to or refine this.

You have been a splendid defender of CDT. And I am not really a critic of it. But it disturbs me to see you so... evangelical... about it.
 
  • #48
marcus said:
I remember in the 1960s or 1970s in grad school we got lectured about PL manifolds. There was a guy who believed in studying PL manifolds rather than differentiable manifolds. At the time I did not see why, but maybe I see now. CDT is based on PL geometry instead of differential geometry.

Indeed there is power in the use of PL manifolds. Even more basic, however, is a zero-dimensional manifold. Zero-dimensional manifolds are naturally produced in noncommutative geometry, from the spectra of C^*-algebras. For a commutative, unital C^*-algebra \mathcal{A}, the Gel'fand-Naimark theorem ensures that we recover a compact topological space X=\textrm{spec}(\mathcal{A}), such that C(X)=\mathcal{A}. What Alain Connes did was extend the essentials of the Gel'fand-Naimark construction and apply it to noncommutative C^*-algebras.

In Matrix theory, higher-dimensional branes are built using the spectrum of hermitian matrix scalar fields \Phi^{\mu}. Their spectrum alone only yields a zero-dimensional manifold. What is important are the functions over the space, which are encoded as entries of the scalar fields \Phi^{\mu}. The hermitian scalar fields \Phi^{\mu} are elements of \mathfrak{h}_N(\mathbb{C})\subset M_N(\mathbb{C}). As M_N(\mathbb{C}) is a noncommutative C^*-algebra, a spectral triple is built, and noncommutative geometry ensues.

On further analysis, we see that only the hermitian scalar fields \Phi^{\mu}\in \mathfrak{h}_N(\mathbb{C}) are used for the spectral procedure. The spectrum of \mathfrak{h}_N(\mathbb{C}) is thus of more importance than the full C^*-algebra M_N(\mathbb{C}). \mathfrak{h}_N(\mathbb{C}) is a simple, formally real Jordan *-algebra, and is thus commutative but nonassociative under the Jordan product. Hence the spectral geometry is not a noncommutative geometry but rather a nonassociative geometry. I've been using the term 'NCG' to include these nonassociative geometries as well, as the spectral procedure is based on that of NCG.

The nonassociative geometry of \mathfrak{h}_N(\mathbb{C}) includes the projective space \mathbb{CP}^{N-1}, whose points are primitive idempotents of \mathfrak{h}_N(\mathbb{C}). The lines of the space are rank two projections of \mathfrak{h}_N(\mathbb{C}). By the Jordan GNS construction, \mathfrak{h}_N(\mathbb{C}) becomes an algebra of observables over \mathbb{CP}^{N-1}. The noncommutative algebra over \mathbb{CP}^{N-1} is the C^*-algebra M_N(\mathbb{C}). The gauge symmetry of this quantum mechanics arises from the isometry group of \mathbb{CP}^{N-1} which is \textrm{Isom}(\mathbb{CP}^{N-1})= U(N), with Lie algebra \mathfrak{isom}(\mathbb{CP}^{N-1})=\mathfrak{su}(N). This is how one properly formulates the N-dimensional complex extension of J. Madore's fuzzy sphere.

Now consider the N=3 case, which yields the projective space \mathbb{CP}^{2}, with \mathfrak{h}_3(\mathbb{C}) as an algebra of observables. The Jordan GNS eigenvalue problem provides three real eigenvalues over \mathbb{CP}^{2}, corresponding to three primitive idempotent eigenmatrices. This provides us with a three-point lattice approximation of \mathbb{CP}^{2}. We acquire a projective simplex by recalling the projective geometry axiom:

For any two distinct points p, q, there is a unique line pq on which they both lie.

This provides three unique rank two projections connecting our primitive idempotent eigenmatrices in \mathbb{CP}^{2}. The gauge symmetry of this projective simplex is U(3), arising from the isometry group of \mathbb{CP}^{2}.

The moral of the story is: a simplex is not just a simplex when points and lines are matrices. When we allow more general simplex structures, we see we can incorporate gauge symmetry. Now imagine the power of a full projective triangulation of this type with a richer isometry gauge group. :!)
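The 'three-point lattice' above can be touched numerically: diagonalizing a hermitian element of h_3(C) yields three rank-one spectral projectors, i.e. primitive idempotents. A sketch with numpy (the sample matrix H is arbitrary, chosen by me only so that it is hermitian):

```python
import numpy as np

# An arbitrary hermitian element of h_3(C), the observable algebra.
H = np.array([[2.0,  1.0j, 0.0],
              [-1.0j, 3.0, 1.0],
              [0.0,  1.0,  1.0]], dtype=complex)

eigvals, eigvecs = np.linalg.eigh(H)      # three real eigenvalues

# Rank-one spectral projectors: primitive idempotents, the three
# 'points' of the lattice approximation described above.
projectors = [np.outer(eigvecs[:, k], eigvecs[:, k].conj()) for k in range(3)]
identity_check = sum(projectors)           # the projectors resolve the identity
```

Each projector P satisfies P^2 = P and P* = P with trace 1, and the three of them sum to the identity, which is the finite-dimensional shadow of the spectral decomposition kneemo invokes.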
 
  • #49
selfAdjoint said:
You have been a splendid defender of CDT. And I am not really a critic of it. But it disturbs me to see you so...evangelical.. about it.

As for splendid, thanks! I am really just responding partly as a mathematician, but perhaps more so as a journalist. CDT is the hot story in quantum gravity at this time. The math is relatively fresh (more background independent), and although the means are quite limited, there seems to be opportunity for both computational experiment and new kinds of results.

If you have been reading my posts about this in various threads you can see that I am clearly not betting on any final outcome. It could turn out that LQG is right and NOT CDT, and it could turn out that NEITHER. Guesses about the final outcome are not so interesting to me as the story of CDT current developments.

An amusing side of it is that CDT has been achieving a series of firsts in the past couple of years (they point them out explicitly in their 4 recent papers so I probably don't have to list them for you if you have been keeping up) and yet----there are only 3 core workers!

String has on the order of 1000 active researchers and has been rather in the doldrums for the past couple of years. Not much to cheer about. Well, maybe it is mathematically overweight, or taking a pause to catch its breath, or something.

And Loop has on the order of 100 researchers and has made some notable progress in the past few years, most notably, I guess, in the cosmology department: getting back through the big bang, finding a generic mechanism for inflation, and now beginning to understand the black hole.

And Loop output is growing sharply. Last time I looked it was posting around 170 papers per year on arxiv---a very rough measure, but I remember when the rate was more like 60 per year!

So from the journalist's-eye view, Loop is showing outward signs of success and robust health. But the hot story, for me, is what these THREE researchers have been achieving in a field where the basic output rate on arxiv is only around 4 papers per year!

The irony of this tickles me. The last shall be first and all that. So if you please you can consider my instincts not evangelical but news-houndish.
 
  • #50
selfAdjoint said:
...Correct me if I'm wrong, but the Hamiltonian only subsists at the a > 0 level, it does not carry through in the limit...
...

Exactly, this is my reading too. Remember, the field is very, very new; they only got 4D last year. But at least for now the limit is only a ghostly presence, defined as a limit of concrete things. Maybe it never will be any more than that (I am speculating here).

ANY calculating that you want to do, you can ONLY do in the approximations. All the features of the limiting spacetime are only accessible and calculable (as accurately as one pleases in principle, but practically limited by computer size and power) in the approximating triangulated spacetimes.
 
  • #51
kneemo, thanks for the thoughts about noncommutative geometry!
I am having a bit of difficulty reading some of the LaTeX right now; I hope it clears up.
 
  • #52
marcus said:
It is not based on something using coordinates----curvature in CDT is found combinatorially, by counting.

Hi Everybody

One can't get a good night's sleep around here without missing a hot discussion!

Marcus, we all agree that any decent theory of QG can't use spacetime coordinates as fundamental entities. I was hoping you might address some of my questions from yesterday, but they seem to have been forgotten.

Another question: the cosmological constant appears to play an important role in the AJL simulations; what if we had good reason, observationally, to think it was zero?

Cheers
Kea :smile:
 
  • #53
selfAdjoint said:
...neither honest polyhedra nor honest manifolds...

Neither of the two you mention seems to me exactly right for spacetime. "Honest manifolds" means differential manifolds. IMO they are overdetermined, not background independent enough:
you have a single dimension (the number of smooth coordinate functions in a chart) which holds good all over the manifold and at every scale, even the smallest.

if you try to relax that with additional superstructure you get something mathematically top-heavy

on the other hand the usual polyhedron idea is a SIMPLICIAL COMPLEX, and that can be a hodgepodge of simplices of different dimensions joined by toothpicks. It can be totally crazy and ugly, not at all like spacetime ought to be. So going to simplicial complexes is relaxing too much.

the PL manifold (aka simplicial manifold) is intuitively (IMHO) relaxing the restrictions just enough. It is a simplicial complex satisfying an additional condition (roughly, that the link of every vertex is a sphere) which gives it a degree of uniformity.

they did not seem too interesting to me in the late 1960s when I was exposed to them, but I did not have clairvoyant foresight either. Now they seem just the ticket. We should probably have a tutorial thread on simplicial manifolds. Ambjorn has some online lecture notes aimed at the grad-student level.
 
  • #54
Kea said:
... the cosmological constant appears to play an important role in the AJL simulations; what if we had good reason, observationally, to think it was zero?

that is an excellent question. I believe that CDT is falsifiable on several counts.

I think this is one. If one could show that Lambda was exactly zero then I THEENK that would shoot down CDT.

In other words CDT predicts, and bets its life on, a positive cosmological constant. At least in its present, rather adolescent, form. This is only my inexpert opinion.

I happen to find theories interesting which risk prediction and bet the ranch on various things, the more the better because it gives experimentalists more to do.

I kind of think that finding evidence of spatial discreteness or a minimal length would ALSO shoot down CDT. Well, there is enough here for several conversations. I have another chorus concert tonight and must leave soon
 
  • #55
Nice to see you here too, selfAdjoint.

Marcus, if you will allow me, I can give you a rough idea why the 'classical spacetime' limit produces causality from a more fundamental concept of observable:

F.W. Lawvere pointed out some time ago (1973) that the non-negative reals (plus infinity) form a nice symmetric monoidal category, and that a metric space may be thought of as a category enriched in it. \mathbb{R}^{+} is used here in the same way that the category \mathbf{2}, with one non-identity arrow, is used to construct posets. In other words, the two objects of \mathbf{2} represent the two values, true and false, of classical logic. Standard quantum logic, as we all know, relies on a principle of superposition and the replacement of a 2-element set by a number field. That is, we must introduce negative quantities, which forces the possibility of pseudo-Riemannian metrics.
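These generalized metrics can be made concrete: Lawvere's axioms keep only d(x, x) = 0 and the triangle inequality, dropping symmetry and finiteness. A toy check (the point names and costs here are made up for illustration):

```python
def is_lawvere_metric(points, d):
    """Check Lawvere's two axioms for a generalized metric:
    d(x, x) = 0 and d(x, z) <= d(x, y) + d(y, z).
    Symmetry and finiteness are NOT required; an infinite
    distance is perfectly legal."""
    if any(d(x, x) != 0 for x in points):
        return False
    return all(d(x, z) <= d(x, y) + d(y, z)
               for x in points for y in points for z in points)

# An asymmetric "effort" (uphill vs downhill, say) between three
# hypothetical points a, b, c:
cost = {("a", "b"): 1.0, ("b", "a"): 3.0,
        ("a", "c"): 2.0, ("c", "a"): 5.0,
        ("b", "c"): 1.0, ("c", "b"): 2.0}
effort = lambda x, y: 0.0 if x == y else cost[(x, y)]
```

The asymmetric effort above is a perfectly good Lawvere metric even though it fails the classical symmetry axiom.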

More on zero \Lambda later.

Cheers
Kea
 
Last edited:
  • #56
Kea said:
...That is, we must introduce negative quantities, which forces the possibility of pseudo-Riemannian metrics.
...

it is difficult to apply this to what I am interested in, Kea, because in CDT there is no pseudo-Riemannian metric in sight, and I don't know of anyone, certainly not the CDT authors, who wants there to be.

there is no differentiable manifold in sight for such a metric to be defined on. so what use? maybe you have some non-standard construction in mind.

so something that "forces the possibility" of such a metric does not appear relevant to CDT, even if it was, as you say, discovered in 1973.

I want to park my old sig. get back to it later.
CDT http://arxiv.org/hep-th/0105267 , http://arxiv.org/hep-th/0505154
GP http://arxiv.org/gr-qc/0505052
Loops05 http://loops05.aei.mpg.de/index_files/Programme.html
CNS http://arxiv.org/gr-qc/9404011 , http://arxiv.org/gr-qc/0205119

concert went well, lot of fun. unfortunately it is now summer break
 
Last edited by a moderator:
  • #57
Marcus said:
it is difficult to apply this to what I am interested in Kea, because in CDT there is no pseudo-Riemannian metric in sight and I don't know of anyone, certainly not the CDT authors, who wants there to be.

there is no differentiable manifold in sight for such a metric to be defined on. so what use? maybe you have some non-standard construction in mind...

At the triangulation level they don't have pseudo-Riemannian, but that is what their "causality" does; it leads to pseudo-Riemannian in the continuum limit. No?

Kea, perhaps you should start a new thread about these ideas? They are well worth looking at, but not so much under the rubric of CDT.
 
  • #58
selfAdjoint said:
At the triangulation level they don't have pseudo-Riemannian, but that is what their "causality" does; it leads to pseudo-Riemannian in the continuum limit. No?
...

Obviously it does lead to pseudo-Riemannian if you have a differential manifold to put the metric on; that is what pseudo-Riemannian is all about!

but we do not know that the continuum limit is a differentiable manifold

I thought I made that clear. the continuum limit of quantum theories of simplicial geometry may be a new type of continuum

it may not be just some old differential manifold like we have been playing physics with since 1850. In fact this is what the CDT authors' work INDICATES, because they get things happening with the dimension, in the continuum limit, which do not happen with a diff. manif.

in other words the CDT technique is a doorway to a new model of continuum which gives us some more basic freedom in modeling spacetime

and a pseudo-Riemannian metric is a specialized gizmo that works on vintage-1850 continua and not on the new kind (that is how it is defined), so it is irrelevant

however it should certainly be fun to study and learn about the new kind of continuum, and there is a lot of new mathematics for PhD grad students to do here
:smile:
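One concrete handle on "things happening with the dimension" is the spectral dimension, estimated from how fast a random walker's return probability P(t) falls off: d_s = -2 dln P / dln t. A toy sketch on a fixed lattice, where the coordinates are taken to diffuse independently (an illustration of the diagnostic, not the CDT simulation itself):

```python
import math

def return_prob_1d(t):
    """Probability that a simple random walk on Z returns to its
    starting point after t steps (zero for odd t)."""
    if t % 2:
        return 0.0
    return math.comb(t, t // 2) / 2 ** t

def spectral_dimension(d, t=400):
    """Estimate d_s = -2 dlnP/dlnt from the two times t and 2t,
    for a walk whose d coordinates diffuse independently, so that
    P_d(t) = P_1(t) ** d."""
    p1 = return_prob_1d(t) ** d
    p2 = return_prob_1d(2 * t) ** d
    return -2 * (math.log(p2) - math.log(p1)) / math.log(2)
```

On a fixed d-dimensional lattice the estimate sits at d; the CDT result that the measured dimension runs with scale is exactly the kind of behavior a smooth manifold cannot show.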
 
  • #59
marcus said:
...but we do not know that the continuum limit is a differentiable manifold

Marcus,

In our approach we don't assume differentiable manifolds either. I was just trying to make the point that by putting causality in by hand you cannot possibly be doing something as fundamental as is required, IMHO. Actually, Lawvere is discussing generalised metric spaces. Forget the manifolds. In CDT they talk about lengths. What kind of a mathematical beast is that?

selfAdjoint, at some point I'll update the "Third Road" with these causality issues.

Kea :smile:
 
Last edited:
  • #60
selfAdjoint said:
I wasn't talking about the detailed technology of the triangulation, but about the whole project of doing a triangulation, doing nonperturbative physics on it (if only via simulations), and then letting the scale go to zero to recover the continuum. That's the QCD lattice strategy, and it seems to be Ambjorn et al's strategy too.

Excellent point, selfAdjoint. The analogy to lattice QCD is accurate, and it is well known that these lattice techniques are problematic. As an alternative to the lattice, Snyder proposed a sphere with noncommuting position operators. The supersymmetric extension then becomes de Sitter superspace (hep-th/0311002). Mathematically, Snyder's sphere (and its generalizations) amounts to higher-dimensional versions of Madore's fuzzy sphere, so it is inherently NCG.
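The fuzzy sphere can be made quite concrete: take the coordinates to be rescaled spin-s angular momentum matrices, x_i = \lambda J_i, so that they close on su(2) and x.x is exactly the identity. A minimal numerical sketch (the function names are mine; only the standard spin matrices are assumed):

```python
import numpy as np

def spin_matrices(s):
    """Spin-s angular momentum matrices Jx, Jy, Jz (hbar = 1)."""
    dim = int(round(2 * s + 1))
    m = s - np.arange(dim)                    # m = s, s-1, ..., -s
    jz = np.diag(m).astype(complex)
    jp = np.zeros((dim, dim), dtype=complex)  # raising operator J+
    for k in range(1, dim):
        # <m+1|J+|m> = sqrt(s(s+1) - m(m+1))
        jp[k - 1, k] = np.sqrt(s * (s + 1) - m[k] * (m[k] + 1))
    jm = jp.conj().T
    return (jp + jm) / 2, (jp - jm) / 2j, jz

def fuzzy_sphere(s):
    """Noncommuting coordinates x_i = lam * J_i on a unit fuzzy
    sphere, with lam chosen so that x.x equals the identity."""
    lam = 1.0 / np.sqrt(s * (s + 1))
    return [lam * j for j in spin_matrices(s)], lam
```

As s grows, \lambda shrinks and the coordinates commute, recovering the ordinary sphere; at any finite s there are only finitely many "cells", which is the NCG alternative to a lattice cutoff.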
 
Last edited:
