
Uncertainty Principle as impossibility of simultaneity

  1. May 22, 2010 #1
    I've been following Peter Lynds' work, which is a periodic rediscovery of the philosophical
    difficulties surrounding the concepts of discontinuity, threshold, and edge, and their application to Zeno's paradox. Lynds' solution, which has emerged a number of times, holds that the paradox is ill-posed because the concept of a discrete point in time or space cannot be real: any discrete point in space or instant in time would itself be composed of an infinite number of further points and instants. Hence discrete points and instants can have no physical reality, and if they did, then continuous, dynamic phenomena could not flow from beginning to end.
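
    As a minimal illustration of the divisibility at issue (this is the standard textbook response to Zeno's dichotomy, not Lynds' own argument): an interval can be subdivided endlessly and still have finite total extent, since

    $$\sum_{n=1}^{\infty} \frac{1}{2^n} = \frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = 1.$$

    As I read it, Lynds' objection targets the physical reality of the limit points themselves, not the convergence of the sum.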

    This raises the notion that if no time interval can exist without being divisible into smaller intervals, then at the most fundamental level no two events can be said to be absolutely
    simultaneous, since they could never exactly occupy the same time interval. In this sense
    simultaneity (not in the relativistic sense) does not exist.

    If so, then "simultaneous" measurement of phenomena at the most fundamental level
    is impossible, and measurements will always be sequential.

    Conjecture: the Heisenberg inequalities stem from the effect of the observer attempting to enforce simultaneity upon two sequential events, i.e. measurements of position
    and momentum (or energy and time), and encountering real physical limits at the Planck scale.
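
    For reference, the inequalities in question are the standard ones,

    $$\Delta x\,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad \Delta E\,\Delta t \;\gtrsim\; \frac{\hbar}{2},$$

    where the energy-time relation has a different formal status, since time is a parameter rather than an operator in standard QM.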

    I'm not sure why these conjugate pairs are singled out for uncertainty relations and
    others are not, so this will require more thought.
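
    (One standard piece of the answer, for reference: in textbook QM an uncertainty relation attaches to any pair of observables whose operators fail to commute, via the Robertson inequality

    $$\Delta A\,\Delta B \;\ge\; \tfrac{1}{2}\,\bigl|\langle[\hat A,\hat B]\rangle\bigr|, \qquad [\hat x,\hat p] = i\hbar,$$

    so position and momentum are singled out because their commutator is a nonzero constant.)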

    But essentially the conjecture takes its departure from a fundamental metaphysical issue raised over 2000 years ago: whether reality is indivisibly contiguous at some fundamental
    level or atomic in its divisibility.

    We've had great success with the latter, until QM revealed strangeness like nonlocality
    and other clues that the divisibility of reality has some fundamental limits.
  3. May 22, 2010 #2



    Welcome to PhysicsForums, hollowsolid!

    Maybe you are correct about simultaneity. Would that be so surprising? Exactly what class of phenomena would be affected were this true? I don't see how you could determine this is either true or false based on existing theory.
  4. May 22, 2010 #3
    Thank you for your reply, Dr. Chinese.

    I think the impossibility of simultaneity would be surprising to most physicists, and yet it explains the statistical nature of QM itself: ultimately we run into real limits in our ability to impose discreteness on nature, and it yields only a statistical description.

    Einstein never accepted this (his famous remark that God does not play dice), but if we view reality as fundamentally contiguous, then all phenomena examined at this level must become statistical rather than discrete. This follows naturally, but at the macro level our ability to impose greater and greater levels of discreteness is assumed to work all the way down, to an infinitely deep level. QM shows this is not so.

    For hundreds of years we have been successful in discovering the level to which discreteness exists; however, QM has shown that when "pushed", existence rebounds phenomenologically, e.g. in the Bell inequalities. Whenever we attempt to wrest apart a phenomenon at this level, it rebounds in ways that violate other laws, yet this must be so, otherwise existence itself would cease to exist!
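
    For concreteness, the Bell-type "rebound" I have in mind is the CHSH form: any local hidden-variable model must satisfy

    $$|S| \;=\; \bigl|E(a,b) - E(a,b') + E(a',b) + E(a',b')\bigr| \;\le\; 2,$$

    while QM predicts, and experiments observe, values up to $2\sqrt{2}$.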

    I think your comment about falsifiability is excellent. In fact, ALL experiments testing QM are tests of the phenomenological divisibility of nature, as are ALL physical experiments per se, since in the act of measurement they attempt to define exactly where a phenomenon exists and where it does not, e.g. tests of locality, wave-particle duality, etc.

    In this sense, measurement and observation are ultimately attempts to impose boundary
    conditions on reality, and at a fundamental level this attempt fails because the contiguous nature of reality will not allow it: it returns not only statistical information but can, as in the Bell inequalities, sacrifice lower-order laws in order to do so.

    In terms of testability here is my second conjecture:

    At the QM level, tests of conservation (mass, energy, spin, etc.) should yield a **conservation order**, if suitable experiments could be devised to test them, i.e. reality would in a sense prefer to conserve one property before another.

    If such a conservation order exists, then this would provide a very useful tool for such things as quantum computation.
  5. May 23, 2010 #4
    Divergence and convergence vs. simultaneity: when setting the PBS in motion, is the split at that moment exactly at the same 'time'? If it's not, then does it split before it's supposed to, or after? There are many more implications there. Then, when reconvergence happens, do we further divide down the moment?

    I can see the argument for divisibility for two separate events that we would try to correlate with each other. However, if it is applied to the splitting of a singular event, then you keep the door open to which happened first if you lose simultaneity, and, most importantly, how did it know to split? FTL? Then you're faced with it being pre-determined.
  6. May 23, 2010 #5
    Here's the problem. When we think of three-dimensional space, as such, what we are really thinking about is the principle of differentiation of surfaces. What this means, then, is that impenetrable two-dimensional boundaries within a three-dimensional void is the natural picture that the human mind uses in order to develop its physical systems of thought.

    But in the 19th century, the idea of the three-dimensional surface started to catch on, by way of the investigations of Gauss, Lobachevski, Bolyai, and most significantly, Riemann. Then, in 1870, William Kingdon Clifford published a communication called 'On the Space Theory of Matter' to the Cambridge Philosophical Society that describes physical reality as consisting only of "ripples" of the three-dimensional continuum.

    The reason that the human mind is hesitant to accept ideas such as this is simply that it necessarily experiences nature in terms of a duality between spatial voids and planar boundaries. But the way in which this kind of experience is understood to be [theoretically] possible is a different matter entirely. And perhaps the best way to explain this possibility is to conceive of a fourth dimension into which spatial "infinitesimals" may extend, in the manner of waves that travel along a taut string or drum membrane.

    But of course, the notion of a "true infinitesimal" of three-dimensional space is impossible to comprehend, so we find ourselves having to make do with spaces that are "small enough" for the task at hand. It is very tempting to refer to such entities in the language of geometric locality (i.e. mathematical points), but it is a simple fact that objects of different dimensionalities can in no way be quantitatively related (in the sense of a plane just being a "very small" slice of space). That is, there is a qualitative difference between objects of various dimensionalities.

    Thus, we can see here a kind of Planck-like schema taking shape--not because of any "strange" experimental results--but rather because human beings have an innate need to calculate their environments in as fine-grained a manner as possible, so that no events of any significance go by unnoticed.

    Within each spatial infinitesimal, we must assert that all possible further subdivisions are "ontologically integral." That is, whatever is happening at one corner of the box (or whatever form you want to imagine) is simultaneously happening at the other corner. This idea is not merely a kind of "faster than light" signaling. Rather, we need to use concepts different from everyday notions of locality in order to fully express what is going on.

    This is where the idea of the wave comes in. We can understand a wave simply as a continuous spatial form, whose several parts are integrally coordinated in a systematic unity. That is, a specific amplitude at one part of the wave necessarily means that another part will simultaneously consist of another specific amplitude. This, of course, is not due to any kind of "weirdness," but it is rather a necessary fact of the mathematical description of waves.
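
    To make this "systematic unity" concrete with a minimal example: for a standing wave

    $$y(x,t) = A\sin(kx)\cos(\omega t),$$

    knowing the displacement at one point $x_0$ (where $\sin(kx_0) \neq 0$) fixes the displacement at every other point at the same instant. The parts are coordinated by the form of the wave itself, not by signals passing between them.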

    So, the very moment that wave functionality became an essential aspect of physical theory (via the Schrödinger wavefunction), the inevitable result was that those who eventually "interpreted" this aspect in terms of classical, Newtonian trajectories of massive bodies were setting the rest of us up for never-ending arguments as to "what it all really means."

    But there is precisely no necessary reason to understand this kind of interpretation of Schrödingerian wave functionality as anything other than a desire to keep the mathematical framework of theoretical physics squarely within the tradition of trivial analytical calculability. So in the end, all that arguments like these come down to is this: Should fundamental physics allow itself to evolve into a discipline that requires a higher level of mathematical talent, or should it remain accessible to those with more modest mathematical skills?
  7. May 23, 2010 #6
    There is much here that I agree with.

    However, I believe that the problems I'm referring to are much broader than the delineation of edge surfaces or the limits of measurement and representation within different formulations of space.

    I'm unsure whether all of existence can be completely described by the wave equation, since there are "continuities", or continuous phenomena, that are not well represented by it and which produce ontological catastrophes when we attempt to
    force discontinuity upon them.

    For example, we could conceivably try to build experiments that impose boundary conditions upon phenomena such as units of charge or spin, in attempts to force reality to conserve
    one or the other but not both. The Schrödinger cat thought experiment is another attempt to drive a boundary condition between macroscopic and quantum phenomena, with a view to identifying their boundary. But in reality such boundaries do not exist, and we are perturbed to discover that a statistical outcome is as good as it gets.

    All observation, and hence all science, aims in the act of measurement to define the discrete conditions under which phenomena can be said to exist. But these phenomena are more
    than spatial, as evidenced by tests of nonlocality, and include properties that have little
    spatial componentry, such as charge, charm, flavour, etc.

    If we attempt to drive boundary conditions into properties such as these, as we have done
    with quantum eraser experiments, then we may discover that when the fundamental continuity of reality is tested, not ALL properties can be simultaneously conserved, or at least, all cannot behave in the way we know them to behave when they are *not* subjected to fundamental tests of their discreteness.

    If so, it would be interesting to determine whether there is a conservation order in which reality "bends" to preserve itself, and whether this order of conservation can be used in the way that quantum phenomena are used in quantum encryption, for example.