# Is the uncertainty principle an ontological statement?


## Main Question or Discussion Point

From the following definition, it seems that the uncertainty principle is an epistemological statement.

"Heisenberg's uncertainty principle, is any of a variety of mathematical inequalities asserting a fundamental limit to the precision with which certain pairs of physical properties of a particle, known as complementary variables, such as position x and momentum p, can be known."

That is, we cannot know the position and momentum of a particle to infinite precision. But nothing is said about the ontology of the particle. So a particle could have a precise position $x=0$ and a precise momentum $p=0$ as long as the value of both variables cannot be known to us simultaneously.

But the following exercise seems to suggest that the uncertainty principle is an ontological statement, that is, a particle cannot have a precise position and a precise momentum simultaneously.

So is the uncertainty principle an epistemological statement or an ontological one? • Suwailem

Ssnow
Gold Member
I think it is not an ontological statement; it describes a limit in the measurement process ...

> I think it is not an ontological statement; it describes a limit in the measurement process ...
What's the limit in the measurement process?

The time limit should mean the maximum time the pencil can balance, which is due to the inequality $\Delta x\Delta p\geq\hbar$.

If the HUP is not an ontological statement, then the pencil could be both exactly vertical (its center of mass directly above its tip) and exactly at rest. Then no maximum balancing time would exist.
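As a rough back-of-envelope sketch of where such a time limit comes from: model the pencil as a uniform rod pivoting about its tip, linearize about the unstable equilibrium, and take the minimum initial angle and angular velocity allowed by the uncertainty relation. All the numbers (mass, length) are assumed values for illustration, not from the exercise, and the result is only an order-of-magnitude estimate.

```python
import math

hbar = 1.054571817e-34  # J s
g = 9.81                # m/s^2
m = 0.01                # kg   (assumed pencil mass)
L = 0.15                # m    (assumed pencil length)

I = m * L**2 / 3                    # moment of inertia of a rod about its tip
omega = math.sqrt(3 * g / (2 * L))  # inverse growth time of the instability

# Linearized about the vertical, theta(t) grows roughly like
# (theta0 + thetadot0/omega) * exp(omega * t) / 2.
# The initial angle and angular momentum cannot both vanish:
#   Delta_theta * (I * Delta_thetadot) >= hbar / 2.
# The growing term is minimized when theta0 = thetadot0 / omega,
# which gives theta0 = sqrt(hbar / (2 * I * omega)).
theta0 = math.sqrt(hbar / (2 * I * omega))

# Tipping time: how long theta takes to grow from theta0 to order 1 radian.
t_tip = math.log(1 / theta0) / omega
print(f"estimated balancing time: {t_tip:.1f} s")
```

With these assumed parameters the estimate comes out at a few seconds, in line with the $\approx 4\,$s quoted later in the thread; the logarithm makes the answer insensitive to the exact mass and length, which is why different people's estimates still land in the same ballpark.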

Nugatory
Mentor
The uncertainty principle follows from the impossibility of preparing a quantum system in a simultaneous eigenstate of two non-commuting observables. You can attach words like "epistemological" or "ontological" to it if you want, but the math stays the same.

If one believes that QM is merely about our ability or inability to measure the position and momentum of a particle, which may nevertheless exist ontologically, then the questions of why an atom is stable, why double-slit-type experiments with electrons produce an interference pattern, and why violations of the Bell inequality are confirmed remain completely unanswered. So it seems more reasonable to me to assume that the QM formalism, including the Heisenberg uncertainty principle, is somehow related to the ontological side, i.e. to reality. Still, there may be different opinions on this topic and different interpretations, and additional arguments may be brought to support opposite points of view.

But those who say that QM is mostly epistemological need to explain the stability of the hydrogen atom and all the rest.

Ssnow
Gold Member
What I mean is that the Heisenberg principle doesn't speak about the ontological nature of the position or the momentum of a particle. The problem is the ''simultaneous measurement'' of both; in this sense it expresses a limit of the measurement process.

It can hardly be a limit of the measurement process, because the double-slit experiment clearly shows that a particle may not have a definite position (otherwise it would pass through one of the slits in a classical way). So the dispersion of the measurements cannot be attributed to limitations of the measurement process alone, but to the fact that for most states of a particle there simply is no definite coordinate.

Now, the Heisenberg principle taken in its historical context, or learned from popular books and even from some QM textbooks, may look as if it concerned only the measurement process (the epistemological side). But we need to take it in the general context of QM, not in the context in which it was historically introduced.

> The problem is the ''simultaneous measurement'' of both; in this sense it expresses a limit of the measurement process.
I presume "a limitation of the measurement process" is closer to what you mean?

> The problem is the ''simultaneous measurement'' of both; in this sense it expresses a limit of the measurement process.
You actually don't need to measure them simultaneously. You may take an ensemble of particles prepared in (nearly) the same state (don't ask me how physicists manage that), then measure the position for some of them and the momentum for others. The same Heisenberg uncertainty inequality will still hold. Many books tell us that when we measure the position we change the momentum of the particle in an uncontrolled way, or vice versa. But this is not the whole picture: even if you measure an ensemble of identically prepared particles, one variable per particle, the same Heisenberg inequality still holds. So the uncertainty is not due to some classical-like ''disturbance'' that the measurement introduces.

> The uncertainty principle follows from the impossibility of preparing a quantum system in a simultaneous eigenstate of two non-commuting observables.
The answer given for the maximum time for a pencil to be balanced on its tip is $\approx\,$4s. Does this mean that every time a pencil is balanced it must fall after 4s at the latest or does it mean that in an ensemble of identically prepared pencils, some pencils fall before 4s and some pencils fall after 4s but the average time of fall is at most 4s?

If δx and δp are the precisions of position and momentum obtained in an individual measurement and $\sigma_x, \sigma_p$ their standard deviations in an ensemble of individual measurements on similarly prepared systems, then "There are, in principle, no restrictions on the precisions of individual measurements δx and δp, but the standard deviations will always satisfy $\sigma_x \sigma_p\geq\hbar/2$."

Then it seems that the latter is the correct interpretation.

However, this would mean that we now have a way of preparing a system in a simultaneous eigenstate of two non-commuting observables: balance a large number of pencils on their tips. The longer a pencil remains balanced, the smaller the product $\sigma_x \sigma_p$ for the set of pencils still standing. So wait long enough, and we will have prepared a set of pencils with $\sigma_x \sigma_p<\hbar/2$.

I always have a problem with this example of balancing a pencil on its tip. For one thing, different people get wildly different estimates of the tipping time (ZapperZ has a blog post about this), so it is hard to believe this is the explanation of why we can't balance a pencil; I also find the classical explanation, that the pencil is in an unstable equilibrium, wholly sufficient. But also, as you noted, the example seems to carelessly mix an ontological and an epistemological claim.

The resolution is probably that before decoherence the pencil is in a superposition of different angular displacements, which feel the gravitational torque differently. Only the term describing a perfectly balanced pencil, a single point in the continuum of angular displacements, does not move. That state has zero probability of being selected (of becoming the "ontological" state of the pencil) by decoherence, so the pencil tips. On this view the tipping time has to do with the decoherence time rather than with the uncertainty principle, and it goes back to the classical fact that the equilibrium is unstable.
