Vagueness as a model of initial conditions

Summary
The discussion centers on exploring "vagueness" as a potential framework for understanding the universe's initial conditions, contrasting it with traditional models that emphasize clear beginnings or states. Vagueness represents a pre-symmetrical state of chaotic potential, suggesting that the universe could emerge from a realm of indistinct possibilities rather than defined entities. This idea revives ancient philosophical concepts while challenging Bertrand Russell's assertion that vagueness is merely semantic and not a feature of reality. Current thinkers like Paul Davies and Lee Smolin are considering more complex metaphysical accounts, integrating selection principles that could govern how potential becomes actual. Ultimately, vagueness offers a fifth perspective on initial conditions, positing that everything exists as potential rather than as definite entities.
  • #31
This is all about modelling - about what works as metaphysical description - rather than starting with a claim about what is true. Both standard reductionism and a systems approach to causality are models. You might have reason to use both, but in different domains. A machine-like logic, for instance, is very good for building machines. A systems logic may be what you need for understanding fundamental nature.

Then there is the question of whether ontological vagueness (a general metaphysical idea) = quantum indeterminacy (a possible specific example - though perhaps a canonical one).

Currently I would say that QM models indeterminacy as probabilities - that is, crisp variety, countable microstates. So not vague states, but crisp alternatives like spin up and spin down that are "in superposition" or "entangled".
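To make the "crisp alternatives" point concrete, here is a minimal sketch of a qubit in superposition. The amplitudes chosen are hypothetical (any normalised pair would do); the Born rule itself is standard. The point is that the formalism hands us a countable set of crisp outcomes with crisp probabilities, not an indistinct state:

```python
import math

# A qubit state as two complex amplitudes over the crisp basis |up>, |down>.
# These particular amplitudes are hypothetical - any normalised pair works.
amp_up = complex(1 / math.sqrt(3), 0)
amp_down = complex(math.sqrt(2 / 3), 0)

# Born rule: each crisp alternative gets a probability equal to the
# squared magnitude of its amplitude.
p_up = abs(amp_up) ** 2
p_down = abs(amp_down) ** 2

print(round(p_up, 3), round(p_down, 3))   # -> 0.333 0.667
print(math.isclose(p_up + p_down, 1.0))   # -> True: the crisp options exhaust the state
```

Note that nothing here is vague: the basis, the alternatives, and the probabilities are all perfectly definite. The only "openness" is which crisp outcome obtains - which is exactly why collapse looks mysterious on this modelling.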

But it is exactly this that makes a mystery of the collapse mechanism.

I'd look at it differently using the systems view. I would see the observing context as the crisp macrostate - the constraining boundary conditions. It is the classical context of an experiment that sets up the wavefunction and imposes constraints on a region of quantum vagueness. Crisp outcomes are determined via top-down causality - only spin up or spin down can be observed, given the experimental set-up - and quantum vagueness decoheres accordingly, the local potential to be various things developing crisply into some localised event.

Non-locality fits in here because the decohering context is the whole of the context involved. It is also a final-cause argument, since the global scale involves both space and time, allowing a retro-causal (top-down from the largest scale) or transactional approach.

Vagueness itself I define as infinite symmetry. So collapse of a wave-function would be a symmetry breaking.

Symmetry breaking can take two general forms. We can end up with a symmetry breaking within scale (as in the left/right mirror-image breaking of charge, where both halves are still the same size - an unstable break, since the two halves want to get back together).

Or we can have a symmetry breaking across scale - a local~global asymmetry where one half ends up shrunken very small while the other becomes the very large. We would call this figure and ground, event and context. Or other things: atom and void, QM and GR, the discrete and the continuous...

Symmetry breakings across scale - asymmetrical outcomes - are stable because the two poles of being have moved as far away from each other as possible, and also - this being a systems story - they are mutually defining. This is why certain metaphysical dichotomies have proved so robust. Whatever is not changing is, by asymmetric definition, static. Whatever is not material is void. Whatever is not signal is noise.

The point here is that we find these kinds of dichotomies at the heart of QM - the various local~global descriptions such as particle~wave, position~momentum, and energy~time.
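For reference, the pairings above are exactly the conjugate pairs of the standard uncertainty relations. The position~momentum version is a theorem of the operator formalism; the energy~time version is heuristic (time is not an operator in standard QM), so I write it with a weaker inequality:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad
\Delta E \,\Delta t \;\gtrsim\; \frac{\hbar}{2}
```

In each pair, one member is the local, event-like quantity and the other the global, context-like one - which is the local~global reading being argued for here.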

So in QM already, there is something that looks just like the infinite symmetry, the unbroken potential, of vagueness. It is just being modelled in mechanical terms as a set of crisp entangled probabilities (which we reason, post-measurement, as having had to exist "in there", in the wavefunction).

And in QM we have something that looks just like the dichotomous symmetry breaking mechanism that is about asymmetry of scale, a breaking which finds the form local~global.

Uncertainty arises when the scale of observation is so constrained that there is no room for stable dichotomisation (into a classically certain particle or event within a classically certain world or context). You instead get the Planck-grain effect: radically unconstrained outcomes.
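The Planck grain mentioned here has a definite scale. As a quick sketch (using rounded CODATA-style constants, so the results are approximate), the Planck length and time mark where the event~context split arguably runs out of room:

```python
import math

# Rounded physical constants in SI units.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.67430e-11          # Newtonian gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s

# Planck length and Planck time: the conventional scale below which
# a stable classical particle-within-context description is unavailable.
l_planck = math.sqrt(hbar * G / c**3)
t_planck = math.sqrt(hbar * G / c**5)

print(f"Planck length: {l_planck:.3e} m")   # ~1.616e-35 m
print(f"Planck time:   {t_planck:.3e} s")   # ~5.391e-44 s
```

Below these scales there is, on the argument above, no larger crisp context left to do the constraining - which is the sense in which outcomes become radically unconstrained.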
 
  • #32
I definitely agree with a vagueness view of QM. It does seem to get rid of any collapse problems. You can similarly avoid those issues by taking an instrumentalist view of science, or an epistemological view of the vagueness. That leaves all sorts of problems about what the ontology is, though.

Whether the vagueness is epistemological, ontological, or you collapse the two or consider it some other way... I think it's necessary. There are just too many inconsistencies with collapse and with a lot of other QM interpretations. In that way, I think QM is a good case study for vagueness.
 
