ccdantas said:
I just would like to point to some references here that are -- what I would call -- borderline to that system's view of spacetime.
Thanks for the refs. I'm familiar with Sorkin and Requardt. And I think they neatly illustrate the basic issue I am trying to highlight here. They are not what I would call a "systems" approach. Let me try to explain.
There would seem to be two broad views that can be taken of some kind of general pregeometry/self-organisation route from QM to classical spacetime.
1) Sorkin and others (yourself?) take the position that the quantum realm is an "everythingness" of events. Stuff is pre-geometrically going off in every direction. And then all this activity becomes self-selecting to some flat classical average. Spacetime emerges from the foam in self-organising fashion, avoiding deadlocks, etc.
Or as Requardt puts it: "we rather view primordial space-time as a large dynamic array of interacting elementary degrees of freedom which have the propensity to generate, as an emergent phenomenon, a macroscopic smooth space-time on a coarser level of resolution."
So we are imagining that at a certain scale, the sub-Planck, there is a pregeometry that condenses through some form of self-organisation into more cogent spacetime.
2) The alternative view, one which I would say is the true systems approach, one that would fit with the dissipative structure and hierarchy theory models of systems, shares much of this basic thinking. But the critical difference is that the "selection rules" - whatever it is that acts on the QM foam to shape it up - lie within the system itself. The foam is not intrinsically self-organising, but is organised by constraints imposed by the system that emerges from it.
The big difference this makes is that it introduces scale to the story. The QM realm no longer has to contain both the potentials and the self-organisation. Instead these two aspects of reality are divorced. The QM realm becomes just the potential, and the system is then an informational structure, a dissipative system, which is "milking" this potential second law-wise to create itself, to expand and develop/cool.
I think this second dissipative structure approach to pregeometry is in the back of many people's minds. And I took this to be what you were thinking when you talk about:
"The acting on shared resources by concurrent processes in turn leads to what we call "causality relations" and "time flow"."
The shared resources would be the larger scale, the global state of the system, in which the selection rules are embedded. In Fra's terms perhaps, certainly in a pan-semiotic/Peircean view, it would be the generalised observer. Then this larger scale acts downwards on the local or smallest scale. From a foam of QM potential, events are selected. There is a decoherence that dissipates the foam's excess of degrees of freedom and fixes a part of the classical universe safely in place.
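Just to pin down what I take the "shared resources" line to mean, here is a toy sketch of my own in Python (an analogy only, not anyone's formal model): two concurrent processes whose only interaction is a shared resource. Neither process carries a global clock, yet the serialised access to the resource induces a definite ordering on their events, which is the sense in which "causality relations" and "time flow" can be emergent bookkeeping rather than properties the local events carry themselves.

```python
# Toy illustration (my own, hypothetical): two concurrent processes whose
# only interaction is a shared resource. The serialisation of accesses to
# that resource is what induces a causal order ("happens before") between
# otherwise independent local events.
import threading

shared_log = []            # the shared resource: a global record of events
lock = threading.Lock()    # access to it must be serialised

def process(name, steps):
    for i in range(steps):
        # Local activity: as yet no ordering relative to the other process.
        local_event = f"{name}:step{i}"
        # Touching the shared resource forces a serial order; the position
        # in shared_log is the emergent "time stamp" of the event.
        with lock:
            shared_log.append(local_event)

t1 = threading.Thread(target=process, args=("A", 5))
t2 = threading.Thread(target=process, args=("B", 5))
t1.start(); t2.start()
t1.join(); t2.join()

# The interleaving varies from run to run, but every run yields one definite
# total order on the shared resource: causality as emergent bookkeeping of
# concurrent access, not something either process owns locally.
print(shared_log)
```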
3) So there is a choice between a nakedly self-organising pregeometry (which would make the small scale of physical description the most fundamental) and a systems-style or hierarchical organisation (which would now say that both the global scale and the local scale are fundamental - equal even if different).
To me, the systems view sorts out all sorts of problems.
For example, QM gravity becomes a false concern. People are feeling the urge to collapse relativity to QM, to render all large-scale description in terms of the smallest possible scale. If small is fundamental, then clearly this is what we must achieve.
But if we have a systems model, then it is natural to expect to have two "fundamental" modes of description. In hierarchy theory, we would expect a global scale that constrains, a local scale that constructs - so two different "causalities" that then are in interaction.
If you tried to collapse the local and the global into one and the same thing, you would end up dissolving the whole system. You would be left with neither the small scale nor the large, just a vague potential again. A pregeometry without any selection rules to organise it would dissipate all its degrees of freedom.
So you can see how the placement of the selection rules (the epistemic cut, the quantum collapse, the decoherence) is a foundational issue.
There is a big difference whether you are presuming selection is part of the QM realm itself (so classical/relativistic levels of the system are "simply" emergent). Or whether the selection rules are what emerge and cannot be found in the QM description for a good reason (now we are talking about hierarchical or "complex emergence").
As a quick illustration of the dissipative structure logic, think of a tornado or dust devil or a whorl in a stream.
There is some entropy gradient, some flow of air or water. A lot of pregeometry in the local chaotic motions of molecules. Then a selection rule develops: a vortex that grows and expands by entraining local motions into a globally more efficient dissipative structure.
So the local particles construct the whole. They add up to make the shape of the system. But it is the shape of the system that imposes general constraints. It is providing the "least mean path" for entropy flow. It is the global scale of description, the selection rule, that is causing the whole to self-organise into crisp being.
Zoom down to local scale and all you see is random particles - QM foam. Step back to global scale and you see continuous, coherent curvature - the "closed universe" of the whorl.
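As a footnote to the whorl picture: the "local constructs the global, the global constrains the local" loop has a standard toy analogue in the Kuramoto mean-field model. A minimal sketch (again just an analogy of my choosing, certainly not a model of spacetime): each oscillator phase is local "foam", the mean field they jointly construct is the global shape, and that mean field then acts back downwards to entrain every local motion.

```python
# Kuramoto mean-field model: a standard toy for the "local constructs the
# global, the global constrains the local" loop (illustrative analogy only,
# not a model of spacetime).
import numpy as np

rng = np.random.default_rng(0)
N = 500                                  # local degrees of freedom
theta = rng.uniform(0, 2 * np.pi, N)    # initial phases: pure "foam"
omega = rng.normal(0.0, 0.5, N)         # intrinsic frequencies
K, dt = 2.0, 0.05                       # coupling strength, time step

for step in range(2000):
    # Local motions jointly construct a global order parameter r*exp(i*psi)...
    z = np.mean(np.exp(1j * theta))
    r, psi = np.abs(z), np.angle(z)
    # ...and that global quantity acts back down on every local phase,
    # entraining it toward the collective mean field.
    theta += dt * (omega + K * r * np.sin(psi - theta))

print(f"coherence r = {np.abs(np.mean(np.exp(1j * theta))):.3f}")
# r near 0: incoherent foam. r near 1: the emergent global "shape" has
# entrained the local motions.
```

Run it with K small and the coherence stays near zero, just foam; past a critical coupling the global order parameter bootstraps itself into existence and takes over as the constraint on every local motion. That is the whorl in miniature.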