You have some quantum particle, and at time t_0 you know its position and momentum reasonably well. Next, you measure the particle's momentum with 100% precision, losing all information about its position.

But after a time interval t, the farthest anything can travel is c*t. So, given your initial knowledge of the particle's position, it must now be somewhere within a sphere of radius c*t (extended slightly to account for your initial [small] uncertainty in the particle's position before the measurement).

If you take smaller and smaller t (the time elapsed since the measurement), you can shrink this sphere until it is the same size as it was originally, i.e. when your uncertainty in position was small. But now, in addition, you know its momentum with 100% precision. This clearly violates the uncertainty principle.
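To make the arithmetic in the argument concrete, here is a small sketch (illustrative only, not a resolution of the puzzle). The initial position uncertainty `dx0` of one angstrom and the sample values of t and dp are assumptions chosen for illustration; it compares the light-cone radius c*t with `dx0` as t shrinks, and shows how the Heisenberg lower bound on position uncertainty blows up as the momentum uncertainty goes to zero:

```python
# Illustrative arithmetic for the thought experiment (assumed numbers):
# compare the light-cone radius c*t with a hypothetical initial
# position uncertainty dx0 for shrinking time intervals t.

C = 299_792_458.0        # speed of light, m/s
HBAR = 1.054571817e-34   # reduced Planck constant, J*s

dx0 = 1e-10  # assumed initial position uncertainty: ~1 angstrom

for t in (1e-15, 1e-18, 1e-21):
    radius = C * t  # farthest the particle could have traveled since t_0
    comparison = "larger" if radius > dx0 else "smaller"
    print(f"t = {t:.0e} s -> c*t = {radius:.2e} m ({comparison} than dx0)")

# The uncertainty principle says dx >= hbar / (2 * dp); as the momentum
# uncertainty dp -> 0 (a "100% precise" measurement), this bound diverges:
for dp in (1e-24, 1e-27, 1e-30):
    print(f"dp = {dp:.0e} kg*m/s -> dx >= {HBAR / (2 * dp):.2e} m")
```

For t around 10^-21 s the sphere c*t is already smaller than an angstrom, which is exactly the tension the question is pointing at: the bound dx >= hbar/(2*dp) says the position spread should be unbounded when dp = 0.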

Where is the flaw in this line of reasoning?

# Uncertainty vs. speed of light

**Physics Forums | Science Articles, Homework Help, Discussion**