tommy01
Hi everyone ...
In many particle-physics textbooks I encounter what seems to me a misuse of the Heisenberg uncertainty principle.
For definiteness, take
\Delta p \Delta x \geq \hbar/2
For example, they state that the size of an atom is of the order of a few angstroms, so \Delta x for an electron is \approx 10^{-10} m, and then they conclude that the momentum itself is \approx \hbar/\Delta x.
But what justifies this? All the inequality gives you is \Delta p, i.e. the momentum lies roughly in the range p \pm \Delta p. How can one conclude the mean value of an observable from its uncertainty?
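For concreteness, here is a quick numeric sketch of the textbook estimate in question: setting \Delta x to an atomic size and reading off a momentum scale p \approx \hbar/\Delta x, then converting that to a kinetic energy. The numbers (1 angstrom, electron mass) are just the illustrative values from the question, and the factor of 2 in the inequality is dropped, as the textbooks do for an order-of-magnitude estimate.

```python
# Order-of-magnitude estimate: p ~ hbar / Delta x for an electron confined
# to an atomic-sized region, and the corresponding kinetic energy.
hbar = 1.0545718e-34   # reduced Planck constant, J*s
m_e = 9.1093837e-31    # electron mass, kg
eV = 1.602176634e-19   # joules per electronvolt

dx = 1e-10             # assumed position uncertainty ~ 1 angstrom, m
p = hbar / dx          # momentum scale read off from the uncertainty relation
E_eV = p**2 / (2 * m_e) / eV  # non-relativistic kinetic energy in eV

print(f"p ~ {p:.2e} kg m/s, E ~ {E_eV:.1f} eV")
```

The resulting energy comes out at a few eV, which matches atomic binding energies, so the estimate "works" numerically; the question above is about what justifies promoting the spread \Delta p to the typical momentum itself.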