Uncertainty Principle in Physics: Eng. Phys II

In summary: the uncertainty principle applies to objects of any size, but for a macroscopic object like a BB its effect is far too small to detect and classical sources of variation dominate; if the momentum were pinned down exactly, though, the uncertainty of where the particle is would be correspondingly larger, as the uncertainty principle requires.
  • #1
woodysooner
In my Eng. Phys II class today my prof was telling us about a BB experiment: if you shoot BBs through a very small slit, say 20 in a row, and then look at where they embed themselves in a piece of cardboard past the slit, the BBs, even if all shot from the exact same position, would not land at dead center but would be spread out in a Gaussian distribution. No clue on that one, but he said it was because of the uncertainty principle. I asked him about the limits of uncertainty, because to my knowledge I thought it only held for things around the wavelength of light and smaller, but he said it applies to everything. Isn't it just a statement that when you deal with ultra-small particles you can never know both the velocity and the position, and once you measure one it alters the other?
 
  • #2
The Gaussian would be... umm... VERY peaky. In fact I doubt even the best experimenter could detect it, given the other inevitable (classical) uncertainties in the experiment: variation in the size of the BBs, finite properties of the cardboard, things like that. But in theory he's right.
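For a sense of scale, here is a rough back-of-the-envelope sketch of just how peaky (the BB mass, speed, slit width, and distance below are my own assumed ballpark numbers, not measured values):

```python
# Rough scale of the quantum (diffraction) spread for a BB fired through a slit.
# Every number below is an assumed ballpark figure, just to get orders of magnitude.
h = 6.626e-34            # Planck constant, J*s
mass = 3.5e-4            # BB mass, ~0.35 g (assumption)
speed = 100.0            # BB speed, ~100 m/s (assumption)
slit = 1e-3              # slit width, 1 mm (assumption)
distance = 1.0           # slit-to-cardboard distance, 1 m (assumption)

wavelength = h / (mass * speed)        # de Broglie wavelength of the BB
spread_angle = wavelength / slit       # rough single-slit diffraction angle (radians)
spread = spread_angle * distance       # spread on the cardboard

print(f"de Broglie wavelength: {wavelength:.2e} m")   # ~1.9e-32 m
print(f"spread on cardboard  : {spread:.2e} m")       # ~1.9e-29 m -- hopelessly unmeasurable
```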
 
  • #3
I'd say your prof is right and wrong. It does apply to everything, but when you apply it to things that are larger than particles, the effect is so small it will not be noticeable.
 
  • #4
confused

So does the uncertainty principle hold for all objects no matter what size, but you just can't detect it except for things smaller than the wavelength of light? So he's right that it applies, but wrong in saying that the BBs go everywhere, because BBs are too large for the uncertainty to be noticeable?

Cheers Woody
 
  • #5
Yes. If the uncertainty in position were even only 10% of the size of the BB (barely noticeable), then the uncertainty in momentum would be absolutely minuscule (like 10^(-a lot)). BBs tend to fly around with uncertainties in momentum and position that are both extremely small compared to their classical properties, since the product is on the order of 10^-34 J·s while the classical scales are on the order of 10^-4 m and 10^-4 kg.
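For concreteness, a back-of-the-envelope version of that estimate (the BB size and mass below are assumed ballpark figures, not measured ones):

```python
# Minimum quantum momentum uncertainty for a BB whose position is known to
# within 10% of its size.  The BB size and mass are assumed ballpark figures.
hbar = 1.054e-34          # reduced Planck constant, J*s
bb_diameter = 4.5e-3      # ~4.5 mm (assumption)
bb_mass = 3.5e-4          # ~0.35 g (assumption)

dx = 0.10 * bb_diameter           # position uncertainty
dp_min = hbar / (2 * dx)          # Heisenberg bound: dx * dp >= hbar / 2
dv_min = dp_min / bb_mass         # corresponding velocity uncertainty

print(f"dx     = {dx:.1e} m")            # 4.5e-04 m
print(f"dp_min = {dp_min:.1e} kg*m/s")   # ~1.2e-31 kg*m/s
print(f"dv_min = {dv_min:.1e} m/s")      # ~3.3e-28 m/s -- utterly undetectable
```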
 
  • #6
Turin, I'm sorry for not fully understanding that. Do you mean that, as large as a BB is, you would not notice any uncertainty effects? I.e. the BBs would all go as planned, straight where I aim the BB gun to send them?

Cheers
Woody
 
  • #7
Turin is explaining that the inherent quantum uncertainty of a BB is minuscule--way below detection. But there are plenty of other sources of non-quantum "uncertainty" and "random" variation. These "classical" sources of "noise" can be reduced to the limit of our ability to build the "perfect" BB gun--but it will cost you. (Just compare a crappy BB gun to a sharpshooter's precision competition rifle.) Quantum uncertainty, however, cannot be reduced.
 
  • #8
So you are saying don't mistake a bad make and model with poor precision for quantum uncertainty. I like that. My prof still says that the BBs go everywhere and that it's because of the uncertainty principle. Is he crazy, or is he right somewhere?
 
  • #9
The uncertainty principle is the hard limit on the predictability of quantities in any system. Classical sources of variance [BB size/mass, air currents, ejection force] would dominate in this example. The prof is technically correct, but you would not be able to derive Planck's constant with this approach.
 
  • #10
So what can be said when he states that quantum uncertainties can be seen on large objects?

I am not trying to be mean to my prof; whatever I say will be humble. But when he and I have these conversations I learn tons, and it's the only way I can get him to talk about this stuff.

So anything for my arsenal will help.
 
  • #11
I believe the largest object ever seen in a pure quantum state was a supercooled droplet about 2 mm across. Scanning tunneling microscopes can see individual atoms, but their methods are too crude (I think) to detect the uncertainty directly.

Uncertainty is built into all quantum theories. It is the cornerstone, as it were, of quantum electrodynamics, via the "equal-time commutation relations". And QED makes predictions that match experiment to six decimal places. So that's an indirect confirmation of the uncertainty principle.
 
  • #12
Thanx self adjoint, you don't by chance have a link or anything to the 2 mm supercooled stuff, do you?
 
  • #13
I have another question about the uncertainty principle:

suppose the spatial wavefunction of a particle extends outside the lab (there is a chance that the particle is found outside the lab).
Experiments are carried out in the lab, measuring the momentum of the particle.

Is the minimum experimental uncertainty larger than the quantum uncertainty of the momentum?
 
  • #14
I would imagine so. The primary source of uncertainty would probably be the device used to cool the particle and make its de Broglie wavelength extend outside the lab. Furthermore, once the particle extended beyond the cooling chamber, there would be entangling effects and decoherence.
 
  • #15
I think you misunderstood my question.

"Is the minimum experimental uncertainty larger than the quantum uncertainty of the momentum?"

I was trying to ask whether the dimension of the lab (or cooling chamber) in which the measurement is performed itself causes an uncertainty that cannot be reduced.

Assume the cooling device causes no uncertainty in the results (either assume the particle is already cool, or that the cooling device is perfect). Let the environment of the lab (inside and outside) be vacuum, so there are no entangling effects and no decoherence.
 
  • #16
If I understand you correctly, then the uncertainties are the same; not just the same value, but the same thing. The fundamental quantum uncertainty is a feature of the experimental setup, and it cannot really be separated. The uncertainty due to the lab is the quantum uncertainty.
 
  • #17
Is the Uncertainty Principle meaningless?

In theory a single photon can measure either the position or the (angular) momentum of an electron in an orbit precisely, but not both, right?

If we send in two photons at the exact same time along the exact same vector and in phase, then they would both hit the electron at the same time, right? Can we then measure one photon to determine the position of the electron, and at the same time, measure the other photon to determine the (angular) momentum of that same electron?

So, the next question is: Are there any laser techniques or other techniques that can bunch two photons together?
 
  • #18
what_are_electrons said:
In theory a single photon can measure either the position or the (angular) momentum of an electron in an orbit precisely, ...
I'm not so sure about that. A single photon is either entirely absorbed by the electron or it passes the electron by completely. The best you can intentionally do with a single photon is hope that the electron absorbs it and subsequently emits a photon that you can detect. Then, when you detect the emitted photon, you can determine which direction it came from. Perhaps the distance between the emission point and the detector can be inferred from this process, but, off the top of my head, I don't see how. The procedure to determine a momentum state using a single photon is a bit more indirect. Basically, if the sent photon resonates with the electron, then there is an energy difference. That, together with the direction, can then lead to a determination of some aspects of the momentum. But, again, off the top of my head, I don't see how the momentum could be completely specified.




what_are_electrons said:
If we send in two photons at the exact same time along the exact same vector and in phase, then they would both hit the electron at the same time, right?
The process of generating such a well defined photon-pair aside, you still have no control over whether either photon will interact with the electron.




what_are_electrons said:
Are there any laser techniques or other techniques that can bunch two photons together?
Hmm. I don't know of any.
 
  • #19
Continuing to explore the 2nd question:

Assume measurements (on many identically prepared one-particle systems) are done inside the lab. The experimental position uncertainty you get will not be larger than the extent of the lab. WHY? Because the data you get are always within that range, so the standard deviation is always less than or equal to the range.

The implication is:
If the particle is close to a momentum eigenstate, the uncertainty principle is violated, since the position uncertainty never blows up.


???
 
  • #20
kakarukeys said:
Continuing to explore the 2nd question:

Assume measurements (on many identically prepared one-particle systems) are done inside the lab. The experimental position uncertainty you get will not be larger than the extent of the lab. WHY? Because the data you get are always within that range, so the standard deviation is always less than or equal to the range.

The implication is:
If the particle is close to a momentum eigenstate, the uncertainty principle is violated, since the position uncertainty never blows up.


???

Eh??? <notice the same number of question marks>

The "physical" range of your measurement has nothing to do with the uncertainty of the measurement.

Let's say the system has a plane-wave description such as exp(ikx), which is certainly a momentum eigenstate, as you require. Let's also say that the lab in which you are measuring this has a width of L, and your setup that emits this plane-wave particle (not an oxymoron) is at x=0. So the physical range over which you can detect the position of this particle is from x=0 to x=L.

You start emitting the particles one at a time and try to measure their positions. If you do this enough times, the statistics of the position measurements will look like a flat line extending from 0 to L. However, is THIS the uncertainty of your measurement? It isn't! If you try to fit a Gaussian or a Lorentzian to this set of data points, it'll blow up, or at least give you some arbitrary answer depending on your fitting routine. The profile of the spread (a flat line, or a square wave with abrupt boundaries at 0 and L) will give you an uncertainty in position that is extremely large, even though you can only physically measure a finite range of positions.
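A quick numerical illustration of this point (my own sketch, using NumPy, with arbitrary detector widths): the sample standard deviation of a flat distribution on [0, L] is L/sqrt(12), so it simply tracks the size of the detector and keeps growing as the range is extended.

```python
import numpy as np

# Position "hits" for a plane-wave-like state, as seen by a detector of width L,
# are uniformly distributed over [0, L].  The sample standard deviation is then
# L/sqrt(12) and keeps growing as the detector is made larger: it reflects the
# size of the apparatus, not some intrinsic finite spread of the state.
rng = np.random.default_rng(0)

for L in (1.0, 10.0, 100.0):
    x = rng.uniform(0.0, L, size=100_000)     # simulated position hits in [0, L]
    print(f"L = {L:6.1f}   sample std = {x.std():8.3f}   L/sqrt(12) = {L/np.sqrt(12):8.3f}")
```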

There are no violations of the uncertainty principle here...

Zz.
 
  • #21
Kakarukeys,
Ignorance of position does not necessarily constitute improbability of position.
 
  • #22
I don't get what you mean, perhaps it's my ignorance.

1stly, Why would you want to fit a Gaussian to a flat line?

2ndly,
In reality we can't prepare an exact momentum eigenstate, but it is still good to discuss it.

the statistics of the position measurements will look like a flat line extending from 0 to L

The profile of the spread (a flat line, or a square wave with abrupt boundaries at 0 and L) will give you an uncertainty in position that is extremely large, even though you can only physically measure a finite range of positions.

Let's see whether your statements are true.
Let L = 1, so the range is [0, 1],
and the probability distribution you get is P(x) = 1.
Normalization: the integral of P(x) from 0 to 1 is 1.

The mean is obviously <x> = 1/2.
The mean of squares is <x^2> = integral of x^2 from 0 to 1 = 1/3.

uncertainty = sqrt(<x^2> - <x>^2)
= sqrt(1/12) ≈ 0.29,

roughly 29% of the range.

3rdly, even if you can show that the uncertainty is extremely large,
that does not save the uncertainty principle,
because your uncertainty of momentum is zero.

Ignorance of position does not necessarily constitute improbability of position.

Perhaps we should suggest a way to work around this ignorance?
 
  • #23
kakarukeys said:
I don't get what you mean, perhaps it's my ignorance.

1stly, Why would you want to fit a Gaussian to a flat line?

Because when you want to know the distribution of your data around a mean or median, you want to know the STANDARD DEVIATION. Such things imply that there is an "expected" result, and that there is a spread around that result. A Gaussian or a Lorentzian are the two typical statistical distributions that fit such a description.

2ndly,
In reality we can't prepare an exact momentum eigenstate, but it is still good to discuss it.

Plane-wave states are as good as any. To a first approximation, the conduction electrons in the Drude-Sommerfeld (free-electron) model of a metal are described as plane waves. So it isn't THAT difficult to accept.

Let's see whether your statements are true.
Let L = 1, so the range is [0, 1],
and the probability distribution you get is P(x) = 1.
Normalization: the integral of P(x) from 0 to 1 is 1.

The mean is obviously <x> = 1/2.
The mean of squares is <x^2> = integral of x^2 from 0 to 1 = 1/3.

uncertainty = sqrt(<x^2> - <x>^2)
= sqrt(1/12) ≈ 0.29,

roughly 29% of the range.

3rdly, even if you can show that the uncertainty is extremely large,
that does not save the uncertainty principle,
because your uncertainty of momentum is zero.

Again, refer to the things I said at the beginning of this. I think you need to go over a little bit of basic statistics. Just think about it - if you have a flat distribution, what is the EXPECTED result if you were to make the NEXT measurement? It is no coincidence that <x> is called the EXPECTATION value of x. Just because you can calculate an "average" doesn't mean that it has any meaning. Look at your data and see what it means, rather than just plugging and chugging through the mathematics.

Secondly, how do you know the probability is 1 within your experimental range? There's nothing that says there weren't others outside that range that you did not detect. That's why LOOKING at the distribution profile is crucial, and that is why we fit the data to something, so that we can infer the distribution of the data outside the range.

Thirdly, when the uncertainty in position is "extremely large", it means that this is a very large number. It is CONSISTENT with the fact that the uncertainty in momentum is very small! You cannot measure a data distribution that is either zero or infinite, because you run into the resolution of the measuring device and other physical limitations. So if you try to measure the momentum of the emitted particles, you WILL see a small distribution no matter how accurately you measure it.

Zz.
 
  • #24
Hi everybody,
I am happy to have joined the forum.
I wonder if any of our friends has read d'Espagnat's Veiled Reality. I think it can be of great help in this respect. I shall be glad to know your ideas and am ready to share mine with you.
 
  • #25
what_are_electrons said:
In theory a single photon can measure either the position or the (angular) momentum of an electron in an orbit precisely, but not both, right?

If we send in two photons at the exact same time along the exact same vector and in phase, then they would both hit the electron at the same time, right? Can we then measure one photon to determine the position of the electron, and at the same time, measure the other photon to determine the (angular) momentum of that same electron?

So, the next question is: Are there any laser techniques or other techniques that can bunch two photons together?

Firstly, it's momentum and position that are conjugate variables; the orthogonal components of angular momentum are conjugate variables with each other.

There's nothing in quantum mechanics that says we can't measure a variable with arbitrary accuracy. So let's say we perform the experiment with a photon once and determine the electron's position, then perform it again with the exact same set-up, using another identical photon, and determine its momentum. We can then congratulate ourselves on defeating the uncertainty principle, and perhaps wonder: if it is so easy to defeat, why has it been a backbone of physics for so many years?

The answer is that we never defeated it. The uncertainty principle was still in operation in our experiments, and if we had repeated them this would have become obvious: as we perform the experiment repeatedly we notice that, even though the set-up is exactly the same and we measure with arbitrary accuracy, we do not obtain exactly the same results. In fact, when we examine the spread of our momentum and position results, we find that it obeys the HUP.
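Here is a toy numerical sketch of that last point (my own illustration, not from the original post), assuming the repeated experiments are performed on identically prepared copies of a minimum-uncertainty Gaussian wavepacket whose width is an arbitrary choice; each individual measurement is taken as perfectly precise, yet the spreads of the results still satisfy the HUP:

```python
import numpy as np

# Toy model: repeated precise single measurements on identically prepared copies
# of a minimum-uncertainty Gaussian wavepacket.  For such a state
# sigma_x * sigma_p = hbar / 2, so the *spread* of the results obeys the HUP
# even though each individual measurement is arbitrarily accurate.
hbar = 1.054e-34
sigma_x = 1e-10                       # chosen position spread (arbitrary, ~1 angstrom)
sigma_p = hbar / (2.0 * sigma_x)      # minimum momentum spread for that state

rng = np.random.default_rng(1)
n = 200_000
x_results = rng.normal(0.0, sigma_x, n)   # position results, one per prepared copy
p_results = rng.normal(0.0, sigma_p, n)   # momentum results, on other copies

print(f"sigma_x * sigma_p (measured) = {x_results.std() * p_results.std():.3e}")
print(f"hbar / 2                     = {hbar / 2:.3e}")
```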
 
  • #26
Because when you want to know the distribution of your data around a mean or median, you want to know the STANDARD DEVIATION. Such things imply that there is an "expected" result, and that there is a spread around that result. A Gaussian or a Lorentzian are the two typical statistical distributions that fit such a description.

Nope, I want to know the uncertainty. There are actually many definitions of uncertainty. One of them is the reciprocal of the average of the probability density:

1 / (Integral of |wavefunction|^4 dx).

This definition does not suggest "dispersion of values from the mean".

But here, since we are discussing the uncertainty principle, we have no choice: the definition of uncertainty is the STANDARD DEVIATION.

The use of the STANDARD DEVIATION does not mean the distributions of x and p must be single-peaked. The Uncertainty Principle is robust: whatever the distributions of x and p are, it should always hold.

Look at your data and see what it means
Yes, I look at the data of the flat distribution and I think that fitting a Gaussian or Lorentzian is inappropriate. It is best to leave it as it is.

Secondly, how do you know the probability is 1 within your expt. range? There's nothing that says that there are others you did not detect outside that range. That's why LOOKING at the distribution profile is crucial, and that is why we fit the data to something so that we can infer about the distribution of the data outside the range.

The P(x) = 1 is just an example of a flat distribution with uncertainty less than the range. There is no way to infer the distribution of the data outside the range by looking at the data inside the range! There are countless possibilities. The values of a function outside an interval are independent of the values inside the interval!

I know many statisticians do that: they define the standard deviation of a small sample drawn from a huge distribution to be the square root of

Sum (x - mean)^2 / (n - 1)

instead of

Sum (x - mean)^2 / n.

This makes some sense if the distribution is a continuous function: the value at a point is close to the value at another point in its neighbourhood, so by collecting values randomly from across the whole real line to form the small sample, the values outside the small sample MIGHT DEPEND ON the values inside the small sample.
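As an aside, a small numerical check of the n-1 versus n convention mentioned above (my own sketch with NumPy; the population parameters are arbitrary choices):

```python
import numpy as np

# Numerical check of the n-1 (Bessel-corrected) sample variance mentioned above.
# Averaged over many small samples from a known population, dividing by n-1 is
# unbiased for the population variance, while dividing by n underestimates it.
rng = np.random.default_rng(2)
true_var = 4.0                 # population variance (arbitrary choice)
n, trials = 5, 100_000

samples = rng.normal(0.0, np.sqrt(true_var), size=(trials, n))
print(f"true variance    : {true_var:.3f}")
print(f"mean var (n - 1) : {samples.var(axis=1, ddof=1).mean():.3f}")   # ~4.0
print(f"mean var (n)     : {samples.var(axis=1, ddof=0).mean():.3f}")   # ~4.0*(n-1)/n = 3.2
```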

BUT here, your small sample comes from one local interval, not from randomly selected small intervals across the real line.

You can't guess what is outside!
 
  • #27
kakarukeys said:
Nope, I want to know the uncertainty. There are actually many definitions of uncertainty. One of them is the reciprocal of the average of the probability density:

1 / (Integral of |wavefunction|^4 dx).

This definition does not suggest "dispersion of values from the mean".

But here, since we are discussing the uncertainty principle, we have no choice: the definition of uncertainty is the STANDARD DEVIATION.

The use of the STANDARD DEVIATION does not mean the distributions of x and p must be single-peaked. The Uncertainty Principle is robust: whatever the distributions of x and p are, it should always hold.

The issue is how you would determine the uncertainty relation in THIS MEASUREMENT! The only way you can determine this experimentally is by doing such a measurement and figuring out the distribution of your measurements around an "expected" value. If you look at your measurement, there is NO expected value! All values are equally probable! That's what a flat distribution curve is telling you! If you try to fit a Gaussian or a Lorentzian to such a distribution, you end up with either a standard deviation or a full-width-at-half-maximum that is an obscenely large number.

The flat distribution curve is telling you that all values of position are valid. It is also telling you that if you make the range of your measurement larger (greater than from 0 to L), then you will get the same type of distribution, unless you believe that it is OK to have an abrupt ending somewhere regardless of the flat trend of the curve. If *I* had measured this set of data, and I saw such a flat distribution, and I knew that the data range is between 0 and L simply because that's the physical range that I can measure, I would be an extremely awful and short-sighted experimentalist to conclude that that is the accurate physical description of what Nature is trying to tell me. All I need to do is extend the range of my measurement, and I will see that I keep getting the same distribution, with no "end" in sight, i.e. the data will not start sloping or decaying down, indicating a trend that it might actually drop to zero. That is the reason we plot our data: to look at trends and patterns!

The P(x) = 1 is just an example of a flat distribution with uncertainty less than the range. There is no way to infer the distribution of the data outside the range by looking at the data inside the range! There are countless possibilities. The values of a function outside an interval are independent of the values inside the interval!

I know many statisticians do that: they define the standard deviation of a small sample drawn from a huge distribution to be the square root of

Sum (x - mean)^2 / (n - 1)

instead of

Sum (x - mean)^2 / n.

This makes some sense if the distribution is a continuous function: the value at a point is close to the value at another point in its neighbourhood, so by collecting values randomly from across the whole real line to form the small sample, the values outside the small sample MIGHT DEPEND ON the values inside the small sample.

BUT here, your small sample comes from one local interval, not from randomly selected small intervals across the real line.

You can't guess what is outside!

Yes you can. It is the reason why there are theoretical descriptions of phenomena: so that we can predict what will happen outside of the range that has been tested! When we send things into space, we are "guessing" how our terrestrial laws will work "outside". If I see a data set that looks like a truncated hump, the BEST that I can do to analyze it is to see what the distribution looks like.

But what is worse than not going "outside" is to be fooled into the prejudice that our physical limitations are equal to Nature's limitations. Your self-imposed, artificial boundary isn't the boundary of the phenomenon that you are measuring. If you start expanding that boundary and still get the same distribution, you immediately know that it is wrong to think your physical boundary is all there is to the phenomenon. This is the worst mistake any experimentalist can make (and I should know, since I am an experimentalist and I am always vigilant about this).

At any rate, you also certainly cannot deduce from your limited range that the uncertainty in position is equivalent to the range of your measurement. If you don't think extrapolating beyond the range of measurement is valid, then doing what you are doing is equally ridiculous, especially in light of the fact that you are claiming that the uncertainty principle CAN be violated.

Considering that this is my 3rd attempt at explaining to you why your conclusion is incorrect, and that I seem either not to have explained it clearly or not to be getting through to you, I do not know of any other way of doing this. Maybe someone else can do better. If you believe that you have found a way to violate the uncertainty principle, I suggest you submit your finding to a peer-reviewed journal to have it published, since this would be a fundamental, physics-shattering discovery.

Zz.
 
  • #28
After thinking about it on my own, I gradually understand why you want to fit a Gaussian to a local flat distribution: that is because

you think that the local flat distribution could be the peak of a very, very large Gaussian distribution. Since the range is so small, it looks flat.

That reduces to the same issue: whether one can predict the values outside when we know what is inside.

Yes you can. It is the reason why there are theoretical descriptions of phenomena: so that we can predict what will happen outside of the range that has been tested! When we send things into space, we are "guessing" how our terrestrial laws will work "outside".

No, you can't use that analogy. It says: knowing (completely) a system in one place, can we know how the same system behaves in another place?

Our problem here is: knowing part of a system, can we know the other part of the system?

Rigorously speaking, you can't. The position eigenfunctions, which are the Dirac deltas, are orthogonal; they and their amplitudes do not depend on each other.

If you are given a bunch of disconnected fragments of a curve, and you are told that the curve is a continuous function, then you might join the fragments together and conclude that the result is close to the real curve.

If you are given only one fragment in a local interval, there is nothing you can do.

If there is no way to do the experiment outside the range, accepting the value based on the limited data IS THE ONLY OPTION, ALTHOUGH IT LOOKS RIDICULOUS. This is the BEST VALUE YOU CAN FIND. The next time you do the experiment, you may find better instruments, or prepare a more compact version of the state so that the whole state is inside the lab.

I don't believe this constitutes a violation of the uncertainty principle. That's why I put 3 question marks. There are two ways out of it that I can think of:

(1) when you measure the momentum in the restricted range, you will get extra uncertainty (because of the effect of the range) to counterbalance the reduced uncertainty in the position;

(2) the Uncertainty Principle simply does not apply to experimental uncertainties, because all experimental devices have their limitations.
 
  • #29
kakarukeys said:
I don't believe this constitutes a violation of the uncertainty principle. That's why I put 3 question marks. There are two ways out of it that I can think of:

(1) when you measure the momentum in the restricted range, you will get extra uncertainty (because of the effect of the range) to counterbalance the reduced uncertainty in the position;

(2) the Uncertainty Principle simply does not apply to experimental uncertainties, because all experimental devices have their limitations.

This will be my last response in this thread, because obviously it is getting nowhere and my effort seems to be a waste of time.

A superconductor is a finite object with a "restricted range". Yet the wavefunction of the superfluid is accurately described as a sum of coherent plane waves with no dispersion - a very definite momentum eigenstate. Because of this, the positions of the charge carriers are undefined - they are spread all over the superconductor. And this is true whether the superconductor is 1 mm, 1 cm, 1 m, or even 1 mile long! The eigenstates do not change with the size of the superconductor as long as it is larger than the coherence length (below which you get no conventional superconductivity). Thus, the uncertainty in the momentum is independent of the physical size of the object, meaning it is independent of how large a range you have over which to measure the position/location of the charge carriers within the superconductor.

I told you, way back at the beginning, about the plane-wave eigenstate. You neglected to demonstrate how this actually changes due to the physical limitations of your experiment. Remember that 0 and L are NOT boundary conditions; they are merely limits on the range you can measure. So they do not modify the plane wave into some non-plane-wave form, which might no longer have a clear momentum eigenfunction.

The End.

Zz.
 

Related to Uncertainty Principle in Physics: Eng. Phys II

What is the Uncertainty Principle in Physics?

The Uncertainty Principle, also known as the Heisenberg Uncertainty Principle, is a fundamental concept in quantum mechanics that states that it is impossible to know the exact position and momentum of a particle simultaneously. In other words, the more precisely we know the position of a particle, the less precisely we can know its momentum, and vice versa.
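Stated quantitatively, the standard form of the relation for position and momentum is

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2},
\qquad \hbar = \frac{h}{2\pi} \approx 1.05\times10^{-34}\ \mathrm{J\,s},
```

where Δx and Δp are the standard deviations of position and momentum results obtained from many identically prepared systems.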

Who discovered the Uncertainty Principle?

The Uncertainty Principle was first proposed by German physicist Werner Heisenberg in 1927. Heisenberg's groundbreaking work in quantum mechanics led to the development of this fundamental principle, which has since been confirmed through numerous experiments.

What is the significance of the Uncertainty Principle in Physics?

The Uncertainty Principle is significant because it challenges our classical understanding of the physical world and demonstrates the limitations of our ability to measure and predict the behavior of particles on a microscopic level. It also plays a crucial role in the development of technologies such as electron microscopes and nuclear magnetic resonance imaging.

How does the Uncertainty Principle affect everyday life?

While the Uncertainty Principle may seem like an abstract concept, its effects are real, though confined to very small scales. It is why we cannot pin down the exact position and momentum of an electron in an atom at the same time, and it sets fundamental limits on how precisely instruments such as electron microscopes can ever resolve. The unpredictability of everyday systems such as the weather, by contrast, comes from classical chaos and measurement noise, not from quantum uncertainty.

Is the Uncertainty Principle a proven concept?

Yes, the Uncertainty Principle has been confirmed by numerous experiments and is considered a fundamental principle of quantum mechanics. Observations of the behavior of particles at the atomic and subatomic level have consistently supported its predictions.
