# Uncertainty principle problem.

This is problem 25 in chapter 4 of Modern Physics, 2nd edition, by Serway, Moses, and Moyer. I have changed the wording but, I hope, not the meaning.

## Homework Statement

A person drops an object of mass m from a height H. Show that the miss distance must be at least
$$\Delta x = (\frac{\hbar}{m})^{1/2}(\frac{H}{2g})^{1/4}$$

## Homework Equations

$$\Delta p_x\Delta x \ge \frac{\hbar}{2}$$

## The Attempt at a Solution

I'm at a total loss. Where is the source of uncertainty in momentum here? There doesn't seem to be any. If there were, it would have to be uncertainty in the momentum in the x direction, but which direction is that? Vertical, perpendicular to the floor? Or some unspecified but arbitrary direction parallel to the floor? There seems to be a clue in the fact that the uncertainty principle is linear in $\hbar$, while the answer in the problem is proportional to the square root of $\hbar$. One more thing I don't understand: the uncertainty principle guarantees uncertainty in measurements, not inaccuracy. What guarantees a minimum miss distance?

TSny
Homework Helper
Gold Member
If you released the object at a precisely known point, then there would be a huge uncertainty in the horizontal component of momentum due to the uncertainty principle. This would lead to a large uncertainty in where the object will land horizontally after falling for a time t.

You can reduce the uncertainty in where it will hit by allowing for some uncertainty in the horizontal location of the release point, which reduces the uncertainty in horizontal speed at the time of release. But of course you don't want too much uncertainty in x at the release point.

So, you want to optimize the uncertainty in x at the release point such that you get minimum overall uncertainty in where it will land.
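For future readers, here is a sketch of the setup this hint describes, in my own notation (I use $\Delta x_0$ for the spread at the release point; the time-of-flight step is my addition, not part of the original post). The fall time from height H is
$$t = \sqrt{\frac{2H}{g}}$$
and the spread at the floor is roughly the release-point spread plus the drift accumulated from the momentum uncertainty, with $\Delta p_x$ taken at the uncertainty-principle bound:
$$\Delta x_{\text{land}} \approx \Delta x_0 + \frac{\Delta p_x}{m}t \ge \Delta x_0 + \frac{\hbar t}{2m\,\Delta x_0}$$
Minimizing the right-hand side over $\Delta x_0$ reproduces the form quoted in the problem statement.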

Thank you TSny, I think you are exactly correct. I will work on this to see if I can come up with the formula in the book.

TSny
Homework Helper
Gold Member
> One more thing I don't understand is that the uncertainty principle guarantees uncertainty in measurements, not inaccuracy. What guarantees a minimum miss distance?
I agree with you. There's always the chance that the object will land squarely on the target. So it seems as though the problem is poorly worded when it says "Show that the miss distance must be at least ..."

vela
Staff Emeritus
Homework Helper
Uncertainty is inherent in the system. It really has nothing to do with measurement at all.

The idea here is that there's always going to be some uncertainty Δx in the object's horizontal position, so that sets a lower limit on how well you can figure out where the object will land. In addition to that, there is some uncertainty Δpx in its horizontal momentum, so as the object falls, it's going to drift horizontally as well, which makes the uncertainty in where it lands even bigger.

Now, if you try to minimize the uncertainty in the object's initial x position, you'll increase the uncertainty in its horizontal momentum, so even though you know where it started out really well, you won't know where it will land very precisely. On the other hand, you can minimize how much it drifts by minimizing Δpx, but then you won't know where the object started from. Somewhere in the middle, you get the ideal tradeoff between Δx and Δpx that will minimize the uncertainty in its horizontal coordinate when it lands.
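This tradeoff can be checked numerically. The sketch below is my own, with assumed sample values for m and H (the problem leaves them symbolic); it takes Δpx at the bound ħ/(2Δx₀) and scans for the release-point spread that minimizes the total spread at the floor.

```python
import math

# Assumed sample values (not from the problem statement): a 1 g object
# dropped from 1 m. The optimum's form doesn't depend on which values
# we pick; these just make the check concrete.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m = 1.0e-3               # mass, kg
H = 1.0                  # drop height, m
g = 9.81                 # gravitational acceleration, m/s^2

t = math.sqrt(2 * H / g)  # free-fall time from height H

def landing_spread(dx0):
    """Release-point spread plus drift, with Delta p_x at the bound hbar/(2 dx0)."""
    return dx0 + hbar * t / (2 * m * dx0)

# Analytic minimizer of landing_spread, and the textbook expression.
dx0_opt = math.sqrt(hbar * t / (2 * m))
dx_book = math.sqrt(hbar / m) * (H / (2 * g)) ** 0.25

# Crude grid search around the analytic optimum as a sanity check.
grid = [dx0_opt * (0.5 + 0.01 * i) for i in range(101)]
dx0_best = min(grid, key=landing_spread)
```

The grid search lands on the same Δx₀ as the calculus, and that value agrees with the book's (ħ/m)^(1/2) (H/2g)^(1/4) to machine precision, which is a reassuring cross-check of the algebra.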

EDIT: I write too slowly. ;-)

Thanks to you both. I have worked out the problem. In the spirit of homework help, I will not spell out the details. However, for future students who chance upon this thread, take heart, the answers given really do suffice to solve the problem.