Chemists sometimes use lasers to cool a particle to close to absolute zero. I would interpret this as the momentum becoming very uncertain, but the lasers carry away the energy before the particle can move. So the particle is cold, but it is heating the system as a whole, and once the lasers are turned off the particle could shoot off at high speed.

Instead of absolute zero, say we cool the particle to 3 K or thereabouts, so that the position and the momentum have about the "same" uncertainty: we have a reasonably good idea of the mean location of the particle, and the mean momentum corresponds to 3 K or whatever.

Now look at the location and momentum over time. If we are doing the experiment correctly, then the mean position over many particles and the mean momentum over many particles will remain the same, but for any individual particle the position and momentum will diverge. It's a trivial result: the AVERAGE deviation (H/N − 1/2) of a series of coin flips converges, but the ACTUAL deviation (H − N/2) for any one series diverges. So if the particle has an irreducibly random character, then the momentum diverges; if the particle has a deterministic character, then the momentum doesn't diverge.

Calculating this divergence, I get that the magnitude of the momentum increases logarithmically. So it increases without bound, but the divergence is so slow that it is negligible on any practical time scale: after a million years the divergence is only about 50 times that after a microsecond, so it isn't worth bothering about in any practical sense. Is this right?
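The coin-flip claim is easy to check numerically. Here is a minimal Python sketch (the function name and parameters are mine): over many series of N fair flips, the mean of the average deviation |H/N − 1/2| shrinks as N grows, while the mean of the actual deviation |H − N/2| grows (the standard random-walk result is growth like √N).

```python
import random

random.seed(12345)  # fixed seed for reproducibility

def coin_flip_deviations(n_flips, n_trials=1000):
    """Return (mean |H/N - 1/2|, mean |H - N/2|) over n_trials
    independent series of n_flips fair coin flips."""
    avg_dev = 0.0
    abs_dev = 0.0
    for _ in range(n_trials):
        # The popcount of n_flips random bits is Binomial(n_flips, 1/2),
        # i.e. the number of heads H in a series of fair flips.
        heads = bin(random.getrandbits(n_flips)).count("1")
        avg_dev += abs(heads / n_flips - 0.5)
        abs_dev += abs(heads - n_flips / 2)
    return avg_dev / n_trials, abs_dev / n_trials

for n in (100, 10_000, 1_000_000):
    avg, act = coin_flip_deviations(n)
    print(f"N={n:>9}: mean |H/N - 1/2| = {avg:.5f}, mean |H - N/2| = {act:.1f}")
```

The printed actual deviation grows by roughly a factor of 10 for every factor of 100 in N, while the average deviation shrinks by the same factor, which is the convergence/divergence split described above.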