Vagueness as a model of initial conditions

SUMMARY

This discussion centers on the concept of "vagueness" as a model for understanding the Universe's initial conditions, proposing it as a fifth alternative to established metaphysical positions. Vagueness, described as a state of potentiality devoid of order, contrasts sharply with the crisp definitions of nothingness, somethingness, circular logic, and everythingness. The dialogue references influential thinkers such as Paul Davies, Lee Smolin, and Andrei Linde, who explore the implications of these models on cosmological origins. Ultimately, the conversation suggests that vagueness could provide a framework for a selection principle that governs the transition from potentiality to actuality in the cosmos.

PREREQUISITES
  • Understanding of metaphysical concepts related to cosmology
  • Familiarity with the works of Paul Davies, Lee Smolin, and Andrei Linde
  • Knowledge of the philosophical implications of vagueness in logic
  • Awareness of the historical context of vagueness in ancient Greek and Buddhist thought
NEXT STEPS
  • Research "ontic vagueness" and its implications in modern physics
  • Explore Paul Davies' "The Goldilocks Enigma" for insights on initial conditions
  • Study Lee Smolin's evolutionary cosmology and its selection principles
  • Investigate the philosophical underpinnings of "law without law" as proposed by John Archibald Wheeler
USEFUL FOR

Cosmologists, philosophers of science, and anyone interested in the foundational questions of the Universe's origins and the role of vagueness in theoretical physics.

  • #31
This is all about modelling - about what works as a metaphysical description rather than starting with a claim about what is true. Both standard reductionism and a systems approach to causality are models. You might have reason to use both, but in different domains. For instance, a machine-like logic is very good for building machines. A systems logic may simply be what you need for understanding fundamental nature.

Then to the question: does ontological vagueness (a general metaphysical idea) equal quantum indeterminacy (a possible specific example - though perhaps a canonical one)?

Currently I would say that QM models indeterminacy as probabilities - that is, crisp variety, countable microstates. So not vague states, but crisp alternatives like spin up and spin down that are "in superposition" or "entangled".
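To make "crisp alternatives in superposition" concrete, here is the standard textbook form (ordinary QM notation, not something specific to this thread): a spin measurement is written as a weighted sum of two perfectly sharp, countable outcomes, with the indeterminacy carried by the amplitudes rather than by the states themselves.

$$|\psi\rangle = \alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$$

The outcomes up and down are crisp; only their weights ##|\alpha|^2## and ##|\beta|^2## are probabilistic.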

But it is exactly this that makes a mystery of the collapse mechanism.

I'd look at it differently using the systems view. I would see the observing context as the crisp macrostate - the constraining boundary conditions. It is the classical context of an experiment that sets up the wavefunction and imposes constraints on a region of quantum vagueness. Crisp outcomes are determined via top-down causality - only spin up or spin down can be observed due to the experimental set-up - and quantum vagueness decoheres accordingly, the local potential to be various things developing crisply into some localised event.
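In standard decoherence terms (a textbook illustration of the same point, not a quote from the post), the apparatus fixes a basis, and coupling to that classical context suppresses the interference terms of the density matrix, leaving only the crisp alternatives the set-up permits:

$$\rho = |\psi\rangle\langle\psi| = \begin{pmatrix} |\alpha|^2 & \alpha\beta^{*} \\ \alpha^{*}\beta & |\beta|^2 \end{pmatrix} \;\longrightarrow\; \begin{pmatrix} |\alpha|^2 & 0 \\ 0 & |\beta|^2 \end{pmatrix}$$

written in the spin-up/spin-down basis that the experimental context selects.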

Non-locality fits in here because the decohering context is the whole of the context involved. And it is also a final causes argument as the global scale involves both space and time, allowing a retro-causal (top-down from the largest scale) or transactional approach.

Vagueness itself I define as infinite symmetry. So collapse of a wave-function would be a symmetry breaking.

Symmetry breaking can take two general forms. We can either end up with a symmetry breaking within scale (as in the left/right mirror-image breaking of charge, where both halves are still the same size and the break is unstable as a result - the two halves want to get back together).

Or we can have a symmetry breaking across scale - a local~global asymmetry where one half ends up shrunken very small and the other becomes the very large. We would call this figure and ground, event and context. Or other things, like atom and void, QM and GR, the discrete and the continuous...

Symmetry breakings across scale - asymmetrical outcomes - are stable because the two poles of being have moved as far away as possible from each other, and also - being a systems story - they are mutually defining. This is why certain metaphysical dichotomies have proved so robust. Whatever is not changing is by asymmetric definition static. Whatever is not material is void. Whatever is not signal is noise.

The point here is that we find these kinds of dichotomies at the heart of QM - the various local~global descriptions such as particle~wave, position~momentum, and energy~time.
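Those conjugate pairs are the ones linked by the uncertainty relations (standard results, quoted only to anchor the dichotomies listed above; the energy~time version is the usual heuristic form):

$$\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad \Delta E\,\Delta t \;\gtrsim\; \frac{\hbar}{2}$$

Sharpening one pole of each pair necessarily blurs the other.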

So in QM already, there is something that looks just like the infinite symmetry, the unbroken potential, of vagueness. It is just being modeled in mechanical terms as a set of crisp entangled probabilities (that we reason post-measurement as having had to exist "in there", in the wavefunction).

And in QM we have something that looks just like the dichotomous symmetry breaking mechanism that is about asymmetry of scale, a breaking which finds the form local~global.

Uncertainty arises when the scale of observation is so constrained that there is no room for stable dichotomisation (into a classically certain particle or event within a classically certain world or context). You instead get the Planck grain effect: radically unconstrained outcomes.
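For orientation, the Planck grain being referred to is the usual combination of constants (standard definitions, not an addition to the argument):

$$\ell_P = \sqrt{\frac{\hbar G}{c^{3}}} \approx 1.6\times10^{-35}\ \text{m}, \qquad t_P = \sqrt{\frac{\hbar G}{c^{5}}} \approx 5.4\times10^{-44}\ \text{s}$$

On this reading, it marks the scale below which a stable event~context (local~global) dichotomy can no longer form.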
 
  • #32
I definitely agree with a vagueness view of QM. This does seem to get rid of any collapse problems. You can similarly avoid those issues by just taking an instrumentalist view of science, or an epistemological view of the vagueness. That leaves all sorts of problems about what the ontology is, though.

Whether the vagueness is epistemological or ontological, whether you collapse the two or consider it some other way... I think it's necessary. There are just too many inconsistencies with collapse and many other QM interpretations. In that way, I think QM is a good case study for vagueness.
 
