peguerosdc
- TL;DR Summary
- Why can we approximate the uncertainty (std. dev.) as 1) the difference between two measurements, or 2) the value of a single measurement?
Hi!
I am checking Zettili's explanation of the uncertainty principle, and I am confused about what the "uncertainty" really means. My confusion arises from the following statements:
When introducing the uncertainty principle for position and momentum, it states that if the x-component of the momentum of a particle is measured with an uncertainty ##\Delta p##, then its x-position cannot, at the same time, be measured more accurately than ##\Delta x = \hbar / (2\Delta p)##:
$$
\Delta x \Delta p \geq \hbar / 2
$$
Similarly for the energy and time:
$$
\Delta E \Delta t \geq \hbar / 2
$$
But the two given examples don't seem to fit that definition.
The energy example says that if we make two measurements of the energy of a system separated by a time interval ##\Delta t##, the measured energies will differ by an amount ##\Delta E## which can in no way be smaller than ##\hbar / \Delta t##.
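For scale, here is a quick numeric illustration of that bound (the numbers are my own, not from the book), taking an illustrative time interval of ##\Delta t = 10^{-15}\,\mathrm{s}##:

```python
# Illustrative lower bound on Delta E from Delta E >= hbar / Delta t,
# using an assumed time interval of 1 femtosecond (not from the book).
hbar = 1.054571817e-34  # reduced Planck constant, J*s
eV = 1.602176634e-19    # 1 electron-volt in joules

delta_t = 1e-15           # assumed time between the two energy measurements, s
delta_E = hbar / delta_t  # minimum energy difference, J

print(f"Delta E >= {delta_E:.3e} J = {delta_E / eV:.3f} eV")
```

So on femtosecond timescales the bound is of order an electron-volt, which is why it matters for atomic systems but is invisible for macroscopic ones.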
Now, this doesn't make sense to me given that the more formal statement of the uncertainty principle is in terms of the standard deviation ##\sigma##:
$$
\sigma_A \sigma_B \geq \frac{|\langle[A,B]\rangle|}{2}
$$
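(For reference, with ##A = \hat{x}## and ##B = \hat{p}##, the canonical commutator ##[\hat{x}, \hat{p}] = i\hbar## recovers the position-momentum form:
$$
\sigma_x \sigma_p \geq \frac{|\langle[\hat{x},\hat{p}]\rangle|}{2} = \frac{|\langle i\hbar\rangle|}{2} = \frac{\hbar}{2}
$$
so the ##\Delta##'s in the earlier statement should apparently be read as these standard deviations.)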
How is the difference between two measurements equivalent to the standard deviation?
Then, the next example calculates the uncertainty of the position of a 50 kg person moving at 2 m/s:
$$
\Delta x \geq \frac{\hbar}{2\Delta p} \approx \frac{\hbar}{2mv} = \frac{\hbar}{2 \times 50\,\mathrm{kg} \times 2\,\mathrm{m\,s^{-1}}}
$$
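Evaluating that bound numerically (a quick sketch; ##\hbar## is the CODATA value, while ##m## and ##v## are the example's numbers):

```python
# Numerical check of Delta x >= hbar / (2 m v) for the 50 kg person at 2 m/s.
hbar = 1.054571817e-34  # reduced Planck constant, J*s (CODATA value)

m = 50.0         # mass, kg
v = 2.0          # speed, m/s
delta_p = m * v  # here the "uncertainty" Delta p is taken to be p = m*v itself

delta_x = hbar / (2 * delta_p)  # minimum position uncertainty, m
print(f"Delta x >= {delta_x:.3e} m")  # vastly smaller than any measurable length
```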
This doesn't feel consistent with either the definition in terms of the standard deviation or the first example.
- In this case we only have one measurement for the momentum, so when comparing with the previous example: why is the "uncertainty" of ##p## approximated by the value of ##p## itself (instead of taking the difference between two measurements)?
- When comparing with the definition of the uncertainty principle: why are we now approximating the standard deviation of ##p## by the value of ##p##?
Thanks!