Why is spacetime four-dimensional?

  • Thread starter: tom.stoer
  • Tags: Spacetime
  • #61
tom.stoer said:
So instead of having a dim-fixed starting point + dim=4 selecting dynamics it's the other way round: one as a dim-free setup + non-dyn. selection principle + dim-agnostic dynamics.

Couldn't 4D be selected because first principles require an infinite number of homeomorphic but non-diffeomorphic structures? So I was looking for where such structures might be used in a physical context and thought about how Feynman paths might be homeomorphic but not diffeomorphic to each other, and you'd need an infinite number of them. Although you'd probably have to do a path integral over 4D spaces (paths) that are homeomorphic but not diffeomorphic to each other. So if one could justify the use of Feynman-type path integrals, then 4D might become logically necessary, right?
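For context, a rough summary of the standard facts this question leans on (a sketch only; the counts are the well-known results for R^n up to diffeomorphism, not something stated in the post):

Code:
# Illustrative summary of standard results: number of smooth structures on R^n
# up to diffeomorphism; float("inf") stands in for "uncountably many".
smooth_structures_on_Rn = {1: 1, 2: 1, 3: 1, 4: float("inf"), 5: 1, 6: 1, 7: 1}
special_dims = [n for n, count in smooth_structures_on_Rn.items() if count > 1]
print(special_dims)  # [4] -- only n = 4 admits homeomorphic but non-diffeomorphic R^n's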
 
  • #62
tom.stoer said:
So the question is: what are physical identical entities
We agree on the question.

This is also a different but deeper perspective on the old question of what the important observables are; I mean, do we quantize observer invariants, or do we form new invariants from quantized variants?

Because "quantization" is not just a mechanical procedure, although one sometimes gets that impression. It is just "taking the inference perspective seriously". The choice reflects how seriously we take the inferential status of physics. The way QFT "implements this" mathematically can IMHO be understood as necessarily a special case.

Namely: Who is counting? An inside observer, or an external observer? That's the first question.

I'd suggest that current QFT makes sense in this perspective if the counter is an external observer. And here "external" is meant relatively speaking, not external to the universe of course, just external to the interaction domain, which is the case in particle experiments. The external observer is the lab frame. In this sense the current understanding is purely descriptive; it is not really the basis for decision making.

But this is not the general case; therefore the exact mathematical abstraction of QFT breaks down for a "general inside counter", and an inside counter is not merely doing descriptive science, it bets its life on its counting, since the action of this inside observer depends on predicting the unknown environment.

Imagining inside counters also, in a deep sense, touches upon RG, since it is like scaling the counting context: you count either naked events or events from the much more complex screened/antiscreened original system. Again, current RG describes this scaling descriptively, relative to a bigger context. I.e. from assumptions of some naked action and an environment with screening/antiscreening effects this is predictable, and it can be described and tested against experiment. But this theory scaling is not a proper inside view in RG.

So the same idealisation exists there. RG and counting are integral parts, and both will need reconstruction in the kind of counting scheme you seek (and I see it too, so I think we share the quest here).

So I think it's not possible to resolve this by taking the same PI formalism for granted and ONLY focusing on various spacetime topologies and diffeomorphisms... I agree that needs to be done, but I feel quite confident in my hunch that clarifying this in the sense you suggest is probably possible, though it will require a deepening of many things, including the foundations of QM and RG.

But if we can agree on a common question here, that's still quite nice. If I understand suprised right, he seems to more or less share the same quest, except the question may be formulated differently from within ST?

More later...

/Fredrik
 
  • #63
Since I sometimes think in terms of evolution, I should maybe clarify how it differs from "dynamical" evolution.

tom.stoer said:
So instead of having a dim-fixed starting point + dim=4 selecting dynamics it's the other way round: one as a dim-free setup + non-dyn. selection principle + dim-agnostic dynamics.

If I understand you right, by "dynamical selection principle" you mean a deterministic law (although it can of course still be probabilistic, just like QM) that rules the dynamics of the system, and this then selects the 4D structure.

Then I fully agree that such a "dynamical selection" does in fact not explain anything; it's just a recoding of the same problem, where "why 4D?" transforms into "why this particular dynamical law (that "happens" to select 4D)?"

I do however think of a mechanism of evolution that does select 4D, but not one ruled by deterministic evolution laws; rather a Darwinian evolution.

Of course the details of this must be clarified; I see this as work in progress. But this can explain things like why we do NOT count all "past possibilities" in the action integral and only count the FUTURE possibilities: for a real bounded observer, I think part of the history must necessarily be forgotten.

So evolution of law can still be seen as a random walk, and here the number of possibilities and the favouring of 4D may still have a place like you suggest. But this I see not as a "dynamical evolution" but rather as a selective and adaptive evolution.

I figure you will think that this is starting to get foggier and foggier, but I think there are some exploits here that to my knowledge have never been explored.

Namely, to reconstruct the counting in depth, consider "artificially" probable evolutionary steps, and come up with arguments for why nature looks like it does that are more like rational inferences than logical necessities.

I really do not have much time at all myself, although I try to make progress with the tiny bit of time I have. I do enjoy, and hope to see, some of the professionals who are working in a promising direction making progress here.

/Fredrik
 
  • #64
tom.stoer said:
So any other (discrete) structure that could do the same job would be welcome.

All I can say at the moment is that I have some fairly specific ideas here, but they are very immature. But I think this is the right way.

My exploit is to start my reconstruction at the low complexity end of the RG, and consider how the evolving interactions develop relations (the seed of spacetime) and how the set of possibilities increases as complexity does. The point is that in the low complexity limit you can pretty much manually look at the possibilities. I think this would correspond to a level beyond the continuum, beyond "strings" or other continuum measures. Something like the causet level... but for some reason causet papers tend to take a different turn than the one I want to see. Still, the basic abstraction of ordered sets (corresponding to events) and histories or chains of events (corresponding to observers) is plausible to me.

The continuum structures you think about should emerge in some large complexity limit, and I am not crazy enough to think that a physical theory needs to model every information bit in the universe... rather, at some point we will connect to ordinary continuum models, but enriched with the new, strong guidance we apparently need.

/Fredrik
 
  • #65
suprised said:
That's the hitch. Nothing forbids e.g. d=10, that is, no compactification. Or simple torus compactification with maximal susy to any d up to 9.

In all those sugra compactifications like Freund-Rubin one always assumes some background, or some class of background,

Just to be sure, have you read the paper of F-R and do you remember that it assumes some background, or are you guessing? My recollection was that it was a dynamical argument, from a lagrangian and an action.

Also, I remember there were papers such as "10 into 4 doesn't go", showing that the F-R arguments were very particular to 11 = 7+4.

I think that in this kind of thread we are dangerously near the mechanisms of consensus science: someone guesses some content, it coincides with another guess, and nobody checks. I can try to xerox some papers for interested people, but if you guys don't have access even to commonplace journals that are available on any university campus, I am not sure it is worthwhile.
 
  • #66
arivero said:
Just to be sure, have you read the paper of F-R and do you remember that it assumes some background, or are you guessing? My recollection was that it was a dynamical argument, from a lagrangian and an action.

The FR paper is available at KEK http://ccdb4fs.kek.jp/cgi-bin/img_index?198010222

There's no dynamical argument at all. The whole point of FR solutions is that they are maximally supersymmetric, however that means that they are at the same energy as the uncompactified theory. So there is no dynamical argument selecting FR without additional physics that we do not as yet know about.

Also, I remember there were papers such as "10 into 4 doesn't go", showing that the F-R arguments were very particular to 11 = 7+4.

Again, FR solutions, in their original sense, were maximally supersymmetric solutions. There are many more options available if you only want to preserve one supersymmetry in 4d. That these were not known in 1980 does not mean that we should ignore them.
 
  • #67
fzero said:
The FR paper is available at KEK http://ccdb4fs.kek.jp/cgi-bin/img_index?198010222

There's no dynamical argument at all. The whole point of FR solutions is that they are maximally supersymmetric, however that means that they are at the same energy as the uncompactified theory. So there is no dynamical argument selecting FR without additional physics that we do not as yet know about.

Thanks, my recollection was different! My reading was that maximal supersymmetry limits the choice to the 3-index antisymmetric tensor, and that the Einstein-Hilbert equations then imply that any separation, if it exists, must be 4+7.

EDIT: In fact, my re-reading of the paper doesn't contradict my previous recollection. First they prove that the existence of an s-indexed antisymmetric tensor implies that compactifications must be of the form (s+1), (D-s-1). They use the Einstein-Hilbert equations, not susy, to prove this. Then D=11 sugra with maximal susy has an s=3 tensor, and they get the announced result. But the compactification argument does not use susy at all, it seems to me.
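Schematically, the structure of the argument is the following (a sketch only, with proportionality constants suppressed; f is a constant flux parameter, and this is the generic textbook form of the ansatz rather than a quotation from the paper):

\[
F_{\mu_1\cdots\mu_{s+1}} = f\,\epsilon_{\mu_1\cdots\mu_{s+1}} \ \text{on an } (s+1)\text{-dimensional factor}, \qquad F = 0 \ \text{otherwise},
\]
\[
R_{\mu\nu} \propto -f^{2}\, g_{\mu\nu} \ \text{on } M_{s+1}, \qquad R_{mn} \propto +f^{2}\, g_{mn} \ \text{on } M_{D-s-1},
\]
so the Einstein equations force a product of two Einstein spaces with curvatures of opposite sign, and
\[
D = (s+1) + (D-s-1), \qquad 11 = 4 + 7 \ \text{for the } s=3 \text{ tensor of } D=11 \text{ supergravity},
\]
i.e. an AdS_4 factor times a compact 7-manifold such as the round S^7.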
 
  • #68
arivero said:
Thanks, my recollection was different! My reading was that maximal supersymmetry limits the choice to the 3-index antisymmetric tensor, and that the Einstein-Hilbert equations then imply that any separation, if it exists, must be 4+7.

EDIT: In fact, my re-reading of the paper doesn't contradict my previous recollection. First they prove that the existence of an s-indexed antisymmetric tensor implies that compactifications must be of the form (s+1), (D-s-1).

They make the assumption that the (s+1)-form must be proportional to the volume form of the compact manifold. It is a worthwhile class of solutions to study, but it is by far not the only class. In fact, one reason not to do so is that the VEV of the kinetic term for the form becomes the negative cosmological constant of the AdS part of the solution. While there are models like Bousso-Polchinski, where the fluxes partially cancel the naive CC (which is roughly 120 orders of magnitude too large), they are all incredibly fine-tuned. Other examples of moduli stabilization rely on much more modest amounts of flux.
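As a rough illustration of the fine-tuning point, here is a toy Bousso-Polchinski-style scan (the charges and numbers below are made up for illustration, not taken from any paper):

Code:
import itertools

# Toy model: Lambda_eff = Lambda_bare + (1/2) * sum_i (n_i * q_i)^2, with a large
# negative bare term, integer fluxes n_i and incommensurate charges q_i
# (all values here are illustrative only).
Lambda_bare = -100.0
charges = [1.17, 2.39, 3.01, 0.73]   # hypothetical flux quanta

best = min(
    (abs(Lambda_bare + 0.5 * sum((n * q) ** 2 for n, q in zip(fluxes, charges))), fluxes)
    for fluxes in itertools.product(range(-12, 13), repeat=len(charges))
)
print(best)  # only very specific integer flux choices bring Lambda_eff close to zero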

They use the Einstein-Hilbert equations, not susy, to prove this. Then D=11 sugra with maximal susy has an s=3 tensor, and they get the announced result. But the compactification argument does not use susy at all, it seems to me.

True, there are various internal manifolds that one can consider. The round spheres are maximally supersymmetric. This, together with hints at gauge groups from deformed spheres, was what made these models interesting.

Incidentally, it is important to check the stability of these solutions in the absence of supersymmetry. I don't remember any relevant references, but I think most non-SUSY solutions would be unstable to decay to flat space.
 
  • #69
fzero said:
They make the assumption that the (s+1)-form must be proportional to the volume form of the compact manifold.

Ah, so proportionality of the s-form plus application of the Einstein-Hilbert action implies the (s+1) split, and then susy implies s=3. And it uses an action principle (Einstein-Hilbert).

Of course it is not the right solution. If it were, we would not be here discussing how to find solutions. :cool:

I think that the question of stability was also studied in the eighties, for spheres and deformed spheres, with both good and bad results depending on parameters. In any case, as the problem of fermions shows, spheres are not the complete solution either, just interesting models that seem to be close to the real thing. Probably the deformed 7-spheres and the spaces with standard model isometries are connected through the fact that CP2 is a branched covering of the 4-sphere, a very singular situation.

The point of 11D SUGRA = 7+4 being near the real thing is that it was a serious justification to study M-theory. In fact it is a better justification than studying it "because it is cool" or "because I am going to get more citations". Blame the split between hep-ph and hep-th.
 
  • #70
jal said:
Fra always says ... “from a given observers "inside view"”

Fra, take what you say to the level of the universe of what a QUARKION would say.
:cool:

I await to hear what else you think the QUARKIONS WOULD SAY about their universe.

Jal, you're right that asking what a "quark would see" does fit into my intrinsic inference quest :)

Though it's too early for me to speculate on this. The main reason is that before quarks enter the picture, I expect the formation of continuum-like structure comes first. Now, even if someone would argue that it's 4D rather than 2D, 2D is nevertheless a continuum.

So to attach my envisioned construction to the standard big bang timeline, the starting point is roughly the Planck epoch. As early as this is where the "discrete picture" applies. When we get to quark formation, we first need to understand how the complexions separated out from gravity and how the continuum approximation is formed.

/Fredrik
 
  • #71
jal said:
5. In the beginning, it appeared that our degrees of freedom were limited to 2 and that we were organized so that we could only move from a cubic to a hex. pattern.

Roughly, the simplest way I imagine how 2D "spacetime" emerges from evolving discrete complexions is like this.

Consider an observer that has a finite information capacity (memory) and can distinguish only ONE boolean event. Consider a counter that simply encodes/stores the historical counts, indexed by 0 and 1.

At each instant all there is, is a counter state.

In the high complexity limit, when the counter structure becomes sufficiently complex, the state space of the counter converges to fill [0,1]. So it is almost a real number (but the further construction can only be understood if one acknowledges that the limit is never reached).

The state of this counter is constantly challenged by new events, and when the counter is saturated a decision problem appears: an existing count needs to be erased from memory in order to make room for fresh data. What is the optimal information update here? I conjecture that data is erased randomly!

(This means the erased data is randomly distributed with respect to the emitter, but not necessarily with respect to the receiver; compare here to black body radiation and the information content of Hawking radiation.)

As the complexity of the observer increases (getting closer to the continuum), more possibilities for re-encoding the microstructure appear! For example, one can consider histories of counter states, effectively a history of real numbers. This is the first dimension.

This can then be repeated. But clearly the stability of these higher-dimensional records depends on the complexity. At low complexity, the idea is that these are unlikely to appear, for statistical reasons. They are not forbidden at all, they just don't happen since they are unstable.
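As a toy sketch only of the bounded counter with random erasure described above (my own illustration under these assumptions, not Fra's actual construction):

Code:
import random

class BoundedCounter:
    """Toy observer with finite memory: it records boolean events and, once
    saturated, erases a randomly chosen old record to make room (the random
    erasure conjecture above)."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.records = []    # stored boolean events (the "counts")
        self.history = []    # history of counter states (the "first dimension")

    def state(self):
        # fraction of 1s among the stored records; as capacity grows, this state
        # space becomes dense in [0, 1] without ever reaching the continuum
        return sum(self.records) / len(self.records) if self.records else 0.5

    def observe(self, event):
        if len(self.records) >= self.capacity:
            self.records.pop(random.randrange(len(self.records)))  # random erasure
        self.records.append(bool(event))
        self.history.append(self.state())  # recoding: keep a history of states

# usage: feed a stream of biased random events to a low-complexity observer
obs = BoundedCounter(capacity=8)
for _ in range(50):
    obs.observe(random.random() < 0.7)
print(round(obs.state(), 3), [round(x, 3) for x in obs.history[-5:]])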

But in parallel to this simple, cobordism-type generation of dimensions, there are OTHER, maybe more interesting developments, such as more complex recodings... cobordism is extremely SIMPLE. A more complex thing is the formation of non-commutative structures, such as a Fourier-like transform of the first "string" of real numbers. This would encode the state of change, and thus increase the predictivity and stability of the entire measure complex.

So dimensional creation and the creation of non-commutative structures are really both just different types of recoding of the data. The selection of WHICH of these recodings is most stable is the challenge.

IF you start from the low complexity end, one can use combinatorics and look at things explicitly.

Also, the cobordism type of development (histories of states by recursion) and the development of parallel non-commutative structures are in equilibrium, since both processes are constrained by the same complexity. Inflating higher dimensions is extremely complexity-demanding, but so is creating parallel non-commuting structures... and at the same time this entire structure complex is constantly challenged by its environment... and if you picture an idea where ALL these possibilities are randomly tried, what emerges in evolution is the optimally fit decomposition into external dimensionality and internal non-commuting structures. There is some equilibrium condition we seek here. This is how I see it.
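As a loose, toy illustration only of what a "Fourier-like recoding" of such a history could look like (my own sketch; whether this captures the intended non-commutativity is an assumption):

Code:
import numpy as np

# Keep two encodings of the same history of counter states: the raw record
# (position-like) and its discrete Fourier transform (change-like). The two
# do not commute in the sense that truncating and then transforming differs
# from transforming and then truncating.
history = np.cumsum(np.random.rand(64) - 0.5)   # stand-in for a counter-state history
spectrum = np.fft.rfft(history)                 # encodes rates of change

def truncate(x, k):
    # keep only the first k components, zero out the rest
    return np.concatenate([x[:k], np.zeros(len(x) - k)])

a = np.fft.rfft(truncate(history, 8))   # truncate the history, then transform
b = truncate(spectrum, 8)               # transform, then truncate the spectrum
print(np.allclose(a, b))                # False: the order of the recodings matters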

I'm working on this, and all along the guiding principle is: no ad hoc actions, all actions are rational random actions. The point is that what is just entropic dissipation in a simple microstructure will generate highly nontrivial actions when you combine it with higher dimensions (i.e. more than one :) and non-commuting structures.

/Fredrik
 
  • #72
jal said:
1. the universe is confined to 10^-15m

Since I think you expected some informal associations to spawn imagination, it's tempting to also sketch the following picture of confinement and the origin of quark mass.

The most obvious reason why you never see something in isolation is that it's just one face of something bigger, right? There is always a flip side, and they support each other.

Compare some ideas from ST, where quarks are associated with the ends of the string, and combine that with the idea above that the string index is the [0,1] interval. Then confinement seems to be related to the fact that it doesn't make sense to consider the upper limit of the state space unless there is a lower limit.

I mean the only way to separate the limits is to split the index (i.e. SPLIT the STATE SPACE of the counter into TWO), which then corresponds to creating a new pair of "ends". This is easier to understand if one realizes that the string index is really just an index defined by the states of a counter. And if the history of this counter for some reason weakens the support of the index in the middle states, then that effectively creates two new ends, and even the slightest fluctuation or random deletion of data (mentioned previously) risks breaking the link. Either way, an isolated upper limit makes no sense without its lower limit.

I think the fact that quarks are not seen in isolation may make understanding their mass values easier. The origin of the mass of the quarks might then never happen one by one, but only in the bound quark systems. The bound system is created directly as a measure complex, and the quarks are just inseparable logical components of it.

The only way to really split them is by creating more of them.

I hope no one is too offended by this baloney, but it is just another "mental image" that may help make sense of the "counting picture" this thread is about. After all, it's a subtle thing to ask for the physical basis of counting. All these visions are circling in my head, but enormous effort is indeed needed to develop this into a full-blown theory. Still, acquiring some intuition and abstraction models is, I think, good; that doesn't mean there is any reason to mix these visions up with the full model. It is perhaps, though, what it would take to UNDERSTAND such a model once it's on the table. At least that's how I see it.

/Fredrik
 
  • #73
“Since I think you expected some informal associations to spawn imagination”

I’m an amateur compared to you.
:blushing:

“So to attach my envisioned construction to the standard big bang timeline, the starting point is roughly the Planck epoch. As early as this is where the "discrete picture" applies. When we get to quark formation, we first need to understand how the complexions separated out from gravity and how the continuum approximation is formed.”

... where the "discrete picture" applies

My understanding is that quarks are considered discrete.
If you make the assumption that discreteness originates at the Planck epoch, then you are obliged to consider densest packing (hex. or cubic), with the size of a dimension being reduced. (Not a new concept; string theory uses it.)

CERN is on the verge of giving us some hints on the discreteness of quarks and maybe on discreteness in the perfect liquid.

Should discreteness be demonstrated in the perfect liquid, then my avatar would be a good visualization, and lattice, LQG, and string calculations should lead to a mathematical description of what could be happening and what could have happened in the beginning.

jal
 
  • #75
jal said:
I’m an amateur compared to you.

I'm definitely not a professional either; if I were, I should have made far more progress than I have since I resumed this. The difference between trying to make progress in small time slots on weekends and nights, and being paid to spend all day doing it, is gigantic. (Although of course most professionals don't get all day either, as they often need to do part-time teaching etc.)

To look on the bright side of life, freedom of affiliation is also a strength, as it's easier to be faithful to your original ideas. Time is the only issue.

jal said:
My understanding is that quarks are considered discrete.
If you make the assumption that discreteness originates at the Planck epoch, then you are obliged to consider densest packing (hex. or cubic), with the size of a dimension being reduced. (Not a new concept; string theory uses it.)

You seem to always come back to this picture of "perfect symmetry" etc. I think you think in a different way. You seem to see the big bang from an external view, i.e. a perfect symmetry that is subsequently broken? Something like that? This is an external picture.

I argue that an internal observer would not SEE this perfect symmetry. The internal observer is just undecidable about almost everything. An internal observer can not infer a perfect symmetry - only an external observer can. This, I think, is the difference between considering conditions close to the big bang in a laboratory, where we DO have an external observer, and SCALING the theory back to those proto-observers that did exist back then.

Of course, both perspectives are valid! I just think that the latter perspective has the simplest view (easiest to understand); this is the exploit I picture.

The quark masses, for example: the external inferences we have today are experimental. But a good "checkpoint" would be to see whether relations between the masses (and mass I associate with complexity) can be postdicted. A wrong postdiction would kill the reconstruction.

From the inferential perspective, anything with mass is not elementary. This is why ALL mass needs to be explained. Just explaining 95% of all mass as confined energy still leaves us with 5%.

/Fredrik
 
  • #76
With the assumption of more than 3 spatial dimensions, the definition of a closed system must be expanded to include those other dimensions.
Would this imply redefining the role of the neutrino?
Does it take energy to open up a path to another dimension? Could neutrinos be that energy requirement?
What kinds of energy could come into our 3 space dimensions?
(dark energy? gravity? tachyons? virtual particles or quantum tunneling?)
---
http://en.wikipedia.org/wiki/Neutrino
Neutrino

Wolfgang Pauli theorized that an undetected particle was carrying away the observed difference between the energy, momentum, and angular momentum of the initial and final particles.
---
http://en.wikipedia.org/wiki/Conservation_of_energy
The law of conservation of energy is an empirical law of physics. It states that the total amount of energy in an isolated system remains constant over time (is said to be conserved over time). A consequence of this law is that energy can neither be created nor destroyed: it can only be transformed from one state to another. The only thing that can happen to energy in a closed system is that it can change form: for instance chemical energy can become kinetic energy.
 
  • #77
Are you referring to the universe as a "closed system"?

FWIW, it's not how I see it. And more importantly, I don't think it's how an inside observer can possibly see it: I do not see how an inside observer can make the inference that the environment in which it lives is closed. What does that even mean? I simply can't imagine the inference. What I can imagine is an expectation or illusion that it's closed. But the stability of such illusions remains undecidable. And if I understand you right, you seek to use this as a hard constraint. That logic is not sound to me.

To think of the universe from the outside as something that is closed, expands etc. is to me somewhat of a fallacy, due to applying to the whole universe the science we know how to apply to subsystems, where there always IS an effective external view. From the inside view, this external view is, as I see it, totally wiped out.

/Fredrik
 
  • #78
Are you referring to the universe as a "closed system"?
No. It is limited by the event horizon.
However, the universe of the proton, which is what is being considered, is closed/confined (10^-15 m).
 
  • #79
OK, but I'm still not sure what you mean by closed. Even if one cannot isolate quarks without creating other quarks around them, the entire complex (say a proton or neutron) might be scalable. The origin and organisation of information in the proton, and how the proton responds to external perturbation, is exactly what I think requires explanation. I cannot imagine using this as a starting point; then one has already missed some interesting steps.

jal said:
With the assumption of more than 3 spatial dimensions, the definition of a closed system must be expanded to include those other dimensions.

In the way I mentally picture the discrete complexion picture above, there is no god-given dimensionality at all. And different dimensionalities can exist without changing the complexity, just by different ordering and grouping of the discreteness.

I do not have a _visual_ picture of this at all; my own picture is just an abstraction in terms of an information processing/creating/storing observer that does a random walk in a black swamp. The only map he has is in his internal structure, acquired from the past. During equilibrium his internal map will not need revision and we have a holographic connection. But many systems aren't in equilibrium; it's just a special case.

jal said:
Would this imply redefining the role of the neutrino?
Does it take energy to open up a path to another dimension? Could neutrinos be that energy requirement?
What kinds of energy could come into our 3 space dimensions?
(dark energy? gravity? tachyons? virtual particles or quantum tunneling?)

These specific questions I can't yet connect to. It's too early for me, but I think at some point there will be a handle on this.

Personally I picture some sort of unified quantum, from which the various quanta of the other interactions branch off as more complex observers emerge (starting from some basic Planck view, or below that, what do I know).

So in this perspective, a proton is indeed already a very complex observer.

For a simple observer, we're talking about perhaps a single massless bit or something fuzzy like that. So there would be a hierarchy starting from an almost trivial "observer", and then, as you let the complexity scale run, stable observer-complexes emerge along the way and serve as more complex building blocks for further, bigger structures. Somewhere in this hierarchy all the elementary particles must come up, or that's the idea.

And WITH THEM, implicit in their relations, also the selection of 4D spacetime.

/Fredrik
 
