Heisenberg Uncertainty Principle

In summary, the conversation discusses the Heisenberg uncertainty principle and how it contradicts classical mechanics, along with the concept of wave/particle duality and the Heisenberg microscope thought experiment. The conversation then explores using a low-energy photon to determine the electron's momentum very precisely and the large uncertainty this would create in its position. The question arises of what measurements would actually be used to determine the momentum, and it is pointed out that the Heisenberg uncertainty principle is a statistical relationship rather than a statement about individual measurements. The conversation concludes with the suggestion to read Feynman's discussion for further understanding.
  • #1
texta
Hi,

I have a problem with the uncertainty principle. The way I understand it, Heisenberg used ideas from classical mechanics and the concept of wave/particle duality to show a contradiction in classical mechanics, i.e. that it is impossible to know with exact precision both the momentum and position of a particle.

I've read about the Heisenberg microscope thought experiment which leads to:
[tex]\Delta x \, \Delta p \geq \frac{h}{4\pi}[/tex]
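As I understand it, the rough argument behind that bound goes like this (only a qualitative sketch): light of wavelength [tex]\lambda[/tex] collected by a lens of half-angle [tex]\epsilon[/tex] can only resolve the electron's position to about
[tex]\Delta x \approx \frac{\lambda}{2\sin\epsilon}[/tex]
while the scattered photon can enter the lens anywhere within [tex]\pm\epsilon[/tex], so the recoil kick given to the electron is uncertain by about
[tex]\Delta p_x \approx \frac{2h\sin\epsilon}{\lambda}[/tex]
and the product of the two is of order h no matter what wavelength is used.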

Let's assume that instead of a gamma ray we use an extremely low energy photon to determine very precisely the momentum of the electron under the microscope. This means that the position of the electron would be very uncertain.

But, we know the time the photon was emitted. We also know the time that we observe the electron in the microscope. So the total distance covered by our photon could not be more than c * (observation time - photon emission time).

Example:
An electron with mass K is moving at velocity V, with a fractional uncertainty of M in the velocity.

[tex]\Delta p = KVM[/tex]
[tex]\Delta x = \frac{h}{4\pi KVM}[/tex]

If the photon's energy was low enough, the value of M would be small enough such that [tex]\Delta x[/tex] is greater than c * (observation time - photon emission time). But if the photon hit the electron, it means that the following must be true:
0 < electron x position < c * (observation time - photon emission time).
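Just to make that comparison concrete, here is a rough numerical sketch (the numbers are hypothetical and only chosen to illustrate the comparison, not taken from any real setup):

[code]
# Rough numerical sketch of the comparison above (hypothetical numbers).
import math

h = 6.626e-34          # Planck constant, J*s
c = 3.0e8              # speed of light, m/s

K = 9.11e-31           # electron mass, kg
V = 1.0e3              # electron velocity, m/s
M = 1.0e-17            # assumed fractional uncertainty in the velocity

delta_p = K * V * M                      # momentum uncertainty
delta_x = h / (4 * math.pi * delta_p)    # position uncertainty implied by the inequality

dt = 1.0               # observation time minus photon emission time, s
light_bound = c * dt   # the farthest the photon could have travelled in that time

print(f"delta_x     = {delta_x:.3e} m")
print(f"light bound = {light_bound:.3e} m")
print("delta_x exceeds the light-travel bound:", delta_x > light_bound)
[/code]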

Can somebody please explain where I have gone wrong, this is really bugging me!
 
  • #2
How, precisely, are you going to use the photon to determine the electron's momentum? In other words, what measurements are you going to make on the photon that will allow you to deduce the electron's momentum from the results? After all, a microscope is usually used to measure the *position* of something, not the momentum; seeing photons coming through the eyepiece of the microscope tells you the position of the object that reflected them (within the microscope's optical resolution), but doesn't tell you anything about the momentum of that object. So you must be making some different measurements on the photon (other than the direction it's coming from) to determine the electron's momentum. What measurements?
 
  • #3
PeterDonis said:
After all, a microscope is usually used to measure the *position* of something, not the momentum; seeing photons coming through the eyepiece of the microscope tells you the position of the object that reflected them (within the microscope's optical resolution), but doesn't tell you anything about the momentum of that object. So you must be making some different measurements on the photon (other than the direction it's coming from) to determine the electron's momentum. What measurements?

That's a good point; I'm really not sure what measurement to use. Maybe you could tell me how it is usually done for momentum? But wouldn't it also work in reverse? Using an extremely high energy photon, the velocity of the electron would get very close to c when the photon collides with the electron.

Example:
An electron is at position L, with a fractional uncertainty of M in the position.

[tex]\Delta x = LM[/tex]
[tex]\Delta p = \frac{h}{4\pi LM}[/tex]

But if M was small enough, [tex]\Delta p[/tex] would become very large. Since the rest mass of an electron can be assumed to be constant, that means that the velocity of the electron must be increasing. So it would be possible, using that inequality, to say that there is an uncertainty in the velocity > c, when it will always be < c.
 
  • #4
The Heisenberg uncertainty principle is a statistical relationship. It says nothing about the results of an individual measurement. Instead, it is talking about the statistical spread, the standard deviation, in the measured data that you would get should you do an ensemble of measurements.

That is, suppose I have 1000 jars with an identical experimental setup inside each of them. Let's say I have a photon emitter and a wall of tiny detectors on the other side. I measure the energy/momentum of the photon from the spectral emission and plot its endpoint by which detector gets set off. I make these measurements on these 1000 independent but identical jars. Then I look at the results. What I find is that the detector and spectral emission will have some variance in the results. The product of the statistical deviations in these results must satisfy the Heisenberg uncertainty principle.
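Here is a minimal numerical sketch of what I mean by a statistical relationship. It assumes the prepared state in every jar is a minimum-uncertainty Gaussian wave packet, so the position outcomes and the momentum outcomes are each Gaussian distributed with spreads whose product is ħ/2 (the same bound as h/4π):

[code]
# Minimal sketch: ensemble statistics for an assumed minimum-uncertainty Gaussian state.
# Each "jar" measures either position or momentum, never both on the same system.
import numpy as np

hbar = 1.054571817e-34           # J*s
sigma_x = 1.0e-10                # assumed position spread of the prepared state, m
sigma_p = hbar / (2 * sigma_x)   # corresponding momentum spread for that state

rng = np.random.default_rng(0)
n = 1000                                   # 1000 jars measuring x, 1000 measuring p
x_results = rng.normal(0.0, sigma_x, n)    # simulated position outcomes
p_results = rng.normal(0.0, sigma_p, n)    # simulated momentum outcomes

product = x_results.std() * p_results.std()
print(f"std(x) * std(p) = {product:.3e}")
print(f"hbar / 2        = {hbar / 2:.3e}")   # the product hovers around this bound
[/code]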

Where this comes about can be very subtle. Sometimes you can derive a thought experiment where the uncertainty comes about due to the system itself (like forcing a particle to pass through a hole: the size of the hole lends some variance to where the particle can be when it passes through, and if the particle strikes the wall of the hole it will experience a change in momentum). Feynman gives an example of this in his path integrals textbook in chapter 1-2. This is similar to the kind of thought experiment you are attempting. But I think you need to define what the actual uncertainty is here. What is the value of M? For example, if we are probing the electron with a photon, then the uncertainty could be due to the fact that the photon and electron must exchange some amount of momentum. Perhaps the photon changes direction, and from momentum conservation this must mean that the electron's momentum is similarly affected. If the photon's momentum is changed, then the position measurement should also be affected. I think there is some cause and effect here that ties them together that you are ignoring.
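To make the hole example slightly more quantitative (only an order-of-magnitude sketch): if the hole has width [tex]b[/tex], then the particle's transverse position as it passes through is known only to [tex]\Delta y \approx b[/tex]. But a wave passing through an opening of width b diffracts into angles of order [tex]\sin\theta \approx \lambda/b[/tex], so the transverse momentum picks up a spread of roughly
[tex]\Delta p_y \approx p\sin\theta \approx \frac{p\lambda}{b} = \frac{h}{b}[/tex]
using [tex]p = h/\lambda[/tex]. The product [tex]\Delta y \, \Delta p_y[/tex] is again of order h, no matter how the hole size b is chosen.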
 
  • #5
To elaborate on PeterDonis' point, if you know the momentum of your photon very precisely, you will also know its energy, and that sets a limit on how accurately you can know when the photon was emitted.

Also, Born2bwire's advice of reading Feynman's discussion is excellent.
 
  • #6
Thanks heaps for your reply.

Born2bwire said:
The Heisenberg uncertainty principle is a statistical relationship. It says nothing about the results of an individual measurement. Instead, it is talking about the statistical spread, the standard deviation, in the measured data that you would get should you do an ensemble of measurements.

That is, suppose I have 1000 jars with an identical experimental setup inside each of them. Let's say I have a photon emitter and a wall of tiny detectors on the other side. I measure the energy/momentum of the photon from the spectral emission and plot its endpoint by which detector gets set off. I make these measurements on these 1000 independent but identical jars. Then I look at the results. What I find is that the detector and spectral emission will have some variance in the results. The product of the statistical deviations in these results must satisfy the Heisenberg uncertainty principle.

But the principle argues that these uncertainties are inherent in nature, not in the precision of the experiment. The way I understood it was that the principle would allow the uncertainty of either position or momentum to become arbitrarily small, but the effect would be that the other variable would become arbitrarily large. I guess my question really asks: How can the other variable become arbitrarily large when it is limited by c?

Imagine that the momentum is measured so precisely that the principle implies an uncertainty in the position of 10 light years. But if the time it took to measure the momentum was half a year (very large jars), how can the position at the time of measurement possibly be uncertain to 10 light years? You already know that the position of the particle at the time of measurement must have been within 0.5 light years of your detector, otherwise you would have never been able to measure the momentum in the first place!

I think how I derive it from a thought experiment is irrelevant and will only take the discussion off track. The experiment is Heisenberg's microscope. Examples are all over the net.
 
  • #7
The precision of the experiment is inherently intertwined with the physics of the experiment. That is what you need to keep in mind and why I asked you to specify the source of uncertainty in your experiment. This is wonderfully demonstrated in Feynman's example. In Feynman's example, the instruments of measurement are infinitely precise; however, the interactions of the particle with the system introduce the uncertainty. In the same way, you need to figure out how that is done in your thought experiment because while you may be able to measure at the detector to an infinite precision, the uncertainty will already be introduced by the interaction of the systems. Like I suggested before, if you are using a photon as your means of measurement, then the photon will interact with the electron. Perhaps they will impart an arbitrary amount of momentum to each other, thus altering positions and measurements despite any kind of assumptions you make on your detectors.

This is also why it is a statistical property. You can measure to infinite precision with all your detectors; the uncertainty principle makes no comment on that. However, when you start to compare your results across an ensemble, you will find the variance in your results, because these variances arise independently of your detectors. You are assuming that you can reduce the variance of one observable to an arbitrary precision. I am saying that such an assumption is incorrect; you should try to find the dependence of the variance in the observables of your specific system. If you are saying that you have a system that MUST restrict the variance of one observable to a range of values, then that means that the other observable must be correspondingly restricted.
 
  • #8
texta said:
But the principle argues that these uncertainties are inherent in nature, not in the precision of the experiment. The way I understood it was that the principle would allow the uncertainty of either position or momentum to become arbitrarily small, but the effect would be that the other variable would become arbitrarily large. I guess my question really asks: How can the other variable become arbitrarily large when it is limited by c?
But you are using a macroscopic coordinate system in which we can establish a more or less universal c.
But let's assume that in the quantum world we don't have a common coordinate system for a very precise measurement of momentum and a very precise measurement of position. Say we have two incompatible coordinate systems. In one coordinate system we have very good agreement with Euclidean geometry at large scales (momentum measurement), but in the other coordinate system we have Euclidean geometry at small scales (position measurement).
That way it can turn out that the large-scale c is somewhat uncertain in the small-scale coordinate system.

texta said:
Imagine that the momentum is measured so precisely that the principle implies an uncertainty in the position of 10 light years. But if the time it took to measure the momentum was half a year (very large jars), how can the position at the time of measurement possibly be uncertain to 10 light years? You already know that the position of the particle at the time of measurement must have been within 0.5 light years of your detector, otherwise you would have never been able to measure the momentum in the first place!
But can you prove that if we measure momentum for half a year we will reach the precision needed to say that the uncertainty in position is 10 light years? Without this calculation your example lacks some credibility.
 
  • #9
Born2bwire said:
The precision of the experiment is inherently intertwined with the physics of the experiment. That is what you need to keep in mind and why I asked you to specify the source of uncertainty in your experiment.

OK, I agree that the precision is inherently intertwined with the physics of the experiment. But Heisenberg's formula is being assumed to be an accurate model of reality. If it's an accurate model, then it means that the two variables can be arbitrarily large or small, as long as the product of both is equal to or greater than the expected constant. It follows then that an experiment can be set up in which one or the other variable is arbitrarily large.

Born2bwire said:
This is wonderfully demonstrated in Feynman's example. In Feynman's example, the instruments of measurement are infinitely precise; however, the interactions of the particle with the system introduce the uncertainty.

I'll have to look at Feynman's example. I haven't seen this yet so I can't comment. I'm guessing it will take some time to understand it so give me a few days.

Born2bwire said:
In the same way, you need to figure out how that is done in your thought experiment because while you may be able to measure at the detector to an infinite precision, the uncertainty will already be introduced by the interaction of the systems.

The maths is telling me something about reality: it's saying that I can have arbitrarily large values. I will focus my efforts on making a precise example, but keep in mind that my example will use the formula as a model of what's possible.

Born2bwire said:
This is also why it is a statistical property. You can measure to infinite precision with all your detectors; the uncertainty principle makes no comment on that. However, when you start to compare your results across an ensemble, you will find the variance in your results, because these variances arise independently of your detectors.

My scenario asks what happens when there is an upper bound on the uncertainty of both variables. Regardless of doing 1000 experiments, the upper bound will still remain in all of them. My premise that led to an upper bound on both variables is either right or wrong; doing the same experiment 1000 times doesn't show that it is wrong.

Born2bwire said:
You are assuming that you can reduce the variance of one observable to an arbitrary precision. I am saying that such an assumption is incorrect; you should try to find the dependence of the variance in the observables of your specific system.

Where did the formula impose limits on the variables? I didn't make the assumption, Heisenberg did in his formula. I'm just using the formula to make a claim about reality which, if the formula is correct, should be fine. You're essentially telling me in this statement that the formula for uncertainty isn't an accurate model of reality.
 
  • #10
zonde said:
But you are using a macroscopic coordinate system in which we can establish a more or less universal c.
But let's assume that in the quantum world we don't have a common coordinate system for a very precise measurement of momentum and a very precise measurement of position.

I think we need to be very careful here. We must make sure we don't use anything derived from the formula to justify the formula. For example, we can't assume Copenhagen-type statements derived from the HUP are true and then use them to justify the HUP. That would be circular.

Could the uncertainty principle have even been derived if c was not assumed to be a universal constant in the first place?

zonde said:
But can you prove that if we measure momentum for half a year we will reach the precision needed to say that the uncertainty in position is 10 light years? Without this calculation your example lacks some credibility.

No, because those values were plucked out of nowhere to clarify what my problem was. That may have been an error on my part, but I think in context they were clearly used just to explain my confusion. They were never intended as an exact example.

I will make a formal description of my problem with exact examples.
 
  • #11
texta said:
Could the uncertainty principle have even been derived if c was not assumed to be a universal constant in the first place?

The HUP was originally derived in the context of non-relativistic quantum mechanics. One way to do it is to use a result from Fourier analysis which applies to any kind of wave, as sketched qualitatively in this post.
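As a quick numerical illustration of that Fourier result (just a sketch, using an assumed Gaussian wave packet on a grid): the spread of a wave packet in x and the spread of its Fourier transform in k obey [tex]\Delta x \, \Delta k \geq 1/2[/tex], which becomes the HUP once you identify [tex]p = \hbar k[/tex].

[code]
# Sketch: the Fourier "bandwidth theorem" behind the HUP, for an assumed Gaussian packet.
import numpy as np

N, L = 4096, 200.0
x = np.linspace(-L/2, L/2, N, endpoint=False)
dx = x[1] - x[0]

sigma = 3.0                            # assumed packet width
psi = np.exp(-x**2 / (4 * sigma**2))   # Gaussian wave packet (unnormalized)

# Probability densities in x and in k (via FFT), normalized numerically
prob_x = np.abs(psi)**2
prob_x /= prob_x.sum() * dx

phi = np.fft.fftshift(np.fft.fft(psi))
k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=dx))
dk = k[1] - k[0]
prob_k = np.abs(phi)**2
prob_k /= prob_k.sum() * dk

def spread(v, p, dv):
    mean = np.sum(v * p) * dv
    return np.sqrt(np.sum((v - mean)**2 * p) * dv)

print(spread(x, prob_x, dx) * spread(k, prob_k, dk))  # ~0.5, the minimum possible
[/code]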
 
  • #12
Again, you can have arbitrarily large variances; nothing is stopping you from formulating a measurement process so horrible as to satisfy this. But that does not mean you can make arbitrarily small variances. The variances are physically related to each other. You need to find out what this relationship is, and then you can find what limits you are able to impose. You have been saying that you can get some variance of such and such a size that makes the required variance in the other observable unphysical. But you have not verified that you could make the first observable's variance arbitrarily small. You need to go through the physics of the experiment and see how the variances can be introduced and how they are related. Heisenberg's uncertainty principle already did this for some black-box system. If you want to see how it is enforced in your actual example system, then you will need to work out the physics of the variances.
 
  • #13
texta said:
I guess my question really asks: How can the other variable become arbitrarily large when it is limited by c?

Neither position nor momentum is limited by c; both can assume arbitrarily large values. *Velocity* is limited by c, but *momentum* is not. (To show this, of course, you need to include the effects of relativity--but you need to do that anyway if you're considering measurements over time and distance scales of years and light-years.)
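For concreteness, the relativistic momentum of a particle of mass m is
[tex]p = \gamma m v = \frac{mv}{\sqrt{1 - v^2/c^2}}[/tex]
so as v approaches c the momentum grows without bound, even though v itself never reaches c. A huge [tex]\Delta p[/tex] therefore does not imply an uncertainty in velocity greater than c.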
 
  • #14
texta said:
Could the uncertainty principle have even been derived if c was not assumed to be a universal constant in the first place?
The explanation in jtbell's linked post looks very nice, but I will try to add some of my own thoughts.
When you derive the HUP you base it on some relation between two observed quantities; say you can express x using p, or something like that. In the relativistic case the relation will not be linear, because relativistic mass (or the relativistic Doppler shift) will play a role.
So it seems to me that the deviations in the relativistic case will be asymptotically bounded, and that's all.

The most correct way to test this would be to think of some realistic experiment and see how the measurements are taken and interpreted (a kind of empirical approach).
 
  • #15
texta said:
I will make a formal description of my problem with exact examples.

I assume you are familiar with entanglement and EPR. If you take 2 entangled particles which have the same momentum, same starting position, same polarization/spin, etc. then you can perform different tests on them. In this case, the results will still conform to the HUP. I mention this because it allows you to see that learning something about Alice still won't allow you to deduce more about Bob than the HUP allows. This example demonstrates clearly that the problem does not occur due to a disturbance caused by the measurement apparatus itself.

I think it is easier to see with spin measurements. Suppose you know Alice is polarized at 0 degrees from measuring her spin. If you measure Bob at 45 degrees, and then measure Bob at 0 degrees, the result will be completely uncertain. In other words, Bob at 0 degrees is no longer predicted by Alice at 0 degrees. And yet you knew - before measuring Bob at 45 degrees - exactly what Bob at 0 degrees would have been. And you knew both values (Bob at 0 degrees, Bob at 45 degrees) with unlimited precision and apparently violating the HUP as these are non-commuting observables.

So the point is that you are assuming that there exist simultaneous distinct values for a particle's non-commuting observables; and that assumption is not warranted. You can measure these values (as this example allows) and yet they are simply meaningless. Obviously, any attempt to "prove" Bob had such values simultaneously always fails - and you will find the same regardless of how you set up the experiment. You can measure non-commuting values all you want, but they still won't represent SIMULTANEOUS values (as subsequent repeated measurements will clearly show).
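Here is a tiny numerical sketch of the polarization version of this, assuming ideal analyzers and Born-rule (Malus'-law) probabilities:

[code]
# Sketch: measuring Bob at 45 degrees destroys the predictability of his 0-degree result.
# Assumes ideal polarizers; pass probabilities follow the Born rule (Malus' law).
import numpy as np

def pol(theta_deg):
    """Linear polarization state at angle theta (degrees) from the 0-degree axis."""
    t = np.radians(theta_deg)
    return np.array([np.cos(t), np.sin(t)])

def prob_pass(state, analyzer_deg):
    """Probability that the state passes an analyzer oriented at analyzer_deg."""
    return float(np.dot(pol(analyzer_deg), state) ** 2)

bob = pol(0.0)                        # Alice measured 0 degrees, so Bob's 0-degree result is known
print(prob_pass(bob, 0.0))            # 1.0 -- fully predictable

bob_after_45 = pol(45.0)              # Bob passes a 45-degree analyzer; his state is now 45 degrees
print(prob_pass(bob_after_45, 0.0))   # 0.5 -- his 0-degree result is now completely uncertain
[/code]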
 
  • #16
Hi DrChinese,

I appreciate your feedback though I find your example makes the issue even more confusing for me.

I'm working on a clear thought experiment. I'll post it when I'm done. Then at least we'll have a basis of reference. It will include how the momentum and position are affected by the observation, and how the observations could be expected to be made.

DrChinese said:
So the point is that you are assuming that there exist simultaneous distinct values for a particle's non-commuting observables; and that assumption is not warranted.

Here is what I am going to do:
Set up a thought experiment which makes pre-HUP assumptions (not post-HUP).
Derive HUP from the thought experiment using similar logic to Heisenberg.
Then attempt to show a logical contradiction in HUP.

It seems that people keep trying to justify HUP in terms of modern QM. But most of modern QM is based on HUP. So that's not the right way to go about it.

I can see that a lot of experimental evidence agrees with HUP. But like all scientific theories, that just means "it's right so far". A logical contradiction in a theory means that something is wrong in the theory, no matter how well it agrees with experiments.

Since I perceive a logical contradiction in HUP how can I move on in my studies until it is resolved in my mind?
 
  • #17
texta said:
It seems that people keep trying to justify HUP in terms of modern QM. But most of modern QM is based on HUP. So that's not the right way to go about it.
What do you mean by "QM is based on HUP"? The HUP is derived from how states are described in QM; it is not assumed.
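To be concrete, the general (Robertson) form that is derived from the state description is
[tex]\sigma_A \, \sigma_B \geq \frac{1}{2}\left|\langle[\hat A,\hat B]\rangle\right|[/tex]
and with [tex][\hat x,\hat p] = i\hbar[/tex] this gives [tex]\sigma_x \sigma_p \geq \hbar/2[/tex], the same bound as the [tex]h/4\pi[/tex] quoted earlier in the thread. Nothing is assumed beyond the Hilbert-space description of states and the commutator.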
 
  • #18
Sounds a bit like the chicken and the egg story. If you accept that the HUP is nothing more than the statement that two operators do not commute, then you can ask the question: what is more "fundamental"? The existence of non-commuting operators, or the description of states as elements of a Hilbert space?

As far as I know, canonical quantization comes down to both the introduction of commutation relations on some pair of conjugate variables and the statement that states are now elements of some Hilbert space.

(Please note that you are not required to choose position and momentum as the non-commuting conjugate pair. In second quantization, for example, this role is played by the field operators, or equivalently, the ladder operators. The momentum and position are then no longer operators, but nothing more than labels!)
 
  • #19
From the historical viewpoint: unless quantum mechanics (based on the Schrodinger equation) is denied, we must accept the HUP, even if the HUP seems strange to us. Unfortunately, that is the fact.

I think the turning point was when Hylleraas calculated the helium ground-state energy by the variational method, based on the Schrodinger equation, in 1928-1930 (the error was only 0.01 eV; the experimental value is -79.005 eV).

On the other hand, the Bohr model could not explain helium. (To state this correctly: the Bohr model could not explain helium without that computational method.)

To get more information, search on Google for the words "bohr helium ground state energy".
 
  • #20
texta said:
I can see that a lot of experimental evidence agrees with HUP. But like all scientific theories, that just means "it's right so far". A logical contradiction in a theory means that something is wrong in the theory, no matter how well it agrees with experiments.

Since I perceive a logical contradiction in HUP how can I move on in my studies until it is resolved in my mind?

OK, so you know that experiments support the HUP. That should be a strong clue right there.

Next, you perceive a contradiction in the HUP relative to theory. So did EPR* in their famous 1935 paper. They showed clearly a contradiction in the theory. Or so they thought. Actually, they made an unwarranted assumption which led to the apparent contradiction. That is what I referred to in my earlier post. That assumption is realism, i.e. the simultaneous reality of non-commuting observables. Turns out that is not only unwarranted, it also essentially conflicts with the HUP. Realism is not a requirement of QM.

QM incorporates the HUP as being fundamental, so I fail to see how any reasonable example is going to conflict with QM theory, given experiments to date. You will need to come up with a prediction that is different than theory. They tried that in EPR, and experiments (such as Aspect) supported QM over EPR.

*Einstein was the E in EPR, I assume you know that. So keep in mind that you are in good company with your questions. I do believe you would benefit from a thorough understanding of EPR, as it is all about the HUP and logic. But you will also want to study how entanglement allows the HUP to be probed in ways that EPR could never imagine.
 
  • #21
texta said:
But the principle argues that these uncertainties are inherent in nature, not in the precision of the experiment.

Actually there is the mathematics and there is the interpretation of the mathematics.

These are different.

The mathematics only requires that the form of the interaction, whatever that may be, does not allow one to perform simultaneous measurements of non-commuting properties to an accuracy greater than that given by the Heisenberg relationship.

However, some people claim it is a Principle of Nature, i.e. Heisenberg Uncertainty Principle of Nature.

They believe mathematics is physics, i.e. if it can be done mathematically and gives the correct numerical result it must represent the real nature of Nature. So beware.
 
  • #22
Hi texta,
I think, as someone mentioned earlier in the thread, the uncertainty principle is usually applied statistically to ensembles. However, I have seen some texts which strongly imply it can be used for single-particle measurements, the justification normally being that it can explain the phenomena of the two-slit experiment when only one particle is passing through; I have also seen texts where this is disputed.
Regardless, I think it can be considered in the following way. Firstly, in the extreme case where either position or momentum is taken to be measured to infinite accuracy, the inequality cannot be applied, as deriving a value for the other variance would involve dividing by zero, which is not allowed mathematically.
Secondly, consider the case where one of the variables is measured not infinitely accurately, but vanishingly close to it. In this case we would have to consider what we mean by that, as I would think that we cannot measure a position displacement more accurately than the Planck scale, and similarly for a momentum displacement; to measure it incredibly accurately we would need a device that could display an incredibly large number of digits (as near to infinite as possible) after the decimal place, which we would then have to read, which would take a very long time if it were possible at all.
Finally, even if we discount the previous two points, standard quantum mechanics is only really applicable to particles traveling at considerably less than c, as it is a non-relativistic approximation. To really examine the extreme cases you are considering, you would need to use quantum field theory, as this is compatible with special relativity.
So, summing up: as the uncertainty relation can be derived from Schrodinger's wave equation, I would regard it as a low-energy, non-relativistic approximation that nonetheless can be used in the majority of applications, but that does not necessarily apply exactly under the extreme conditions you are using in your thought experiments. I hope this helps; please let me know if I can clarify further.
 
  • #23
Also, it is important to point out that what the uncertainty relations mean can be coloured by whichever ontological perspective you are viewing them from. For example, in some interpretations of standard quantum mechanics the wave equation represents the actual "potential" measurement probabilities we can expect to obtain, depending on which experiment we carry out, and it is meaningless to ascribe definite properties to non-commuting variables in between measurements; in other interpretations it just represents all the potential INFORMATION (or lack of it) that it is possible for us to know; and in other interpretations it represents the relational potential observables between the quantum object and the observer (measurement device). There are now many competing interpretations; I suggest you read about:
Cramer's transactional interpretation
Copenhagen Interpretations (there are a few)
Many Worlds
Consistent Histories
Relational Quantum Mechanics
Quantum Information
 

What is the Heisenberg Uncertainty Principle?

The Heisenberg Uncertainty Principle, also known as Heisenberg's Uncertainty Principle or the Indeterminacy Principle, is a fundamental principle of quantum mechanics which states that it is impossible to simultaneously know the exact position and momentum of a particle.

Who discovered the Heisenberg Uncertainty Principle?

The Heisenberg Uncertainty Principle was first proposed by German physicist Werner Heisenberg in 1927. Heisenberg's groundbreaking work on quantum mechanics revolutionized the field and earned him a Nobel Prize in Physics in 1932.

What is the significance of the Heisenberg Uncertainty Principle?

The Heisenberg Uncertainty Principle has significant implications in the field of quantum mechanics. It fundamentally challenges the classical notion of determinism and shows that there will always be a level of uncertainty in measuring the properties of particles at the quantum level. The principle has also shaped important ideas in physics, such as the Copenhagen interpretation of quantum mechanics.

How does the Heisenberg Uncertainty Principle work?

The Heisenberg Uncertainty Principle is based on the concept of wave-particle duality, which states that particles can exhibit both wave-like and particle-like behavior. The principle states that the more accurately we measure the position of a particle, the less accurately we can measure its momentum, and vice versa. This is often illustrated by the disturbance a measurement causes, but more fundamentally it reflects the wave nature of quantum states: a state sharply localized in position necessarily contains a broad spread of momenta, and vice versa.

What are the practical applications of the Heisenberg Uncertainty Principle?

The Heisenberg Uncertainty Principle has practical consequences in fields such as quantum cryptography, quantum computing, and precision measurement. In quantum cryptography, the principle underpins the secure transfer of information, since any attempt to intercept or measure the information causes a detectable disturbance. In precision measurement and imaging, it sets fundamental limits on how accurately complementary quantities can be determined at the same time.
