
- Thread starter Godwin Kessy

- #1


- #2


To flesh it out:

We no longer think of an electron as either a 'wave' or a 'particle' in quantum physics.

(Just as we don't talk about a mule being either a 'horse' or a 'donkey': the question doesn't make sense.)

You might want to have a look at 'The Principles of Quantum Mechanics' by P.A.M. Dirac.

- #3

Science Advisor


Electrons are particles. So are photons. The wavelike properties that these particles exhibit are a result of the particle's wavefunction -- the fact that these particles obey probabilistic mechanics manifests itself in the wave-like properties that we observe in macroscopic experiments. There's really nothing funny going on.

- #4


> Electrons are particles. So are photons. The wavelike properties that these particles exhibit are a result of the particle's wavefunction -- the fact that these particles obey probabilistic mechanics manifests itself in the wave-like properties that we observe in macroscopic experiments. There's really nothing funny going on.

Things that behave as waves or particles depending on whether they are being observed; things that seem to exist in some ethereal manner, popping in and out of our existence, for which we cannot identify both speed and position; apparent transfer of information faster than light over any distance. "Nothing funny going on" has to be the greatest understatement of all time.

- #5

Science Advisor

Gold Member


> Things that behave as waves or particles depending on whether they are being observed; things that seem to exist in some ethereal manner, popping in and out of our existence, for which we cannot identify both speed and position; apparent transfer of information faster than light over any distance. "Nothing funny going on" has to be the greatest understatement of all time.

There aren't any instances of faster-than-light transfer of information. And there is nothing that prevents us from measuring the speed and position of an object either. A lot of the confusion arises from attempts to describe quantum behavior in terms of classical phenomena. The actual quantum theory treats the behavior of objects in a consistent manner.

- #6


- #7


- #8


You may want to start by reading the FAQ thread in the General Physics forum.

Zz.

- #9


Imagine an electron as something like a standing wave (a delocalization) in an orbital; this picture comes from the uncertainty principle.

You can verify this by finding the solutions of the Schrodinger wave equation for an electron.
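A concrete way to "find the solutions" for the simplest possible case is the electron in a one-dimensional infinite square well, whose energy eigenstates are literally standing waves. A minimal sketch (the 1 nm box width is an illustrative assumption, not a value from the thread; real atomic orbitals come from the 3-D Coulomb potential instead):

```python
import numpy as np

# Standing-wave solutions of the time-independent Schrodinger equation for
# an electron in a 1-D infinite square well of width L.

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electron-volt

def energy_level(n, L):
    """E_n = n^2 pi^2 hbar^2 / (2 m L^2): discrete, not continuous."""
    return (n * np.pi * HBAR / L) ** 2 / (2 * M_E)

def wavefunction(n, L, x):
    """psi_n(x) = sqrt(2/L) sin(n pi x / L): a standing wave with n antinodes."""
    return np.sqrt(2.0 / L) * np.sin(n * np.pi * x / L)

L = 1e-9  # a 1 nm box, roughly atomic scale (illustrative)
for n in (1, 2, 3):
    print(f"n={n}: E = {energy_level(n, L) / EV:.3f} eV")

# |psi|^2 integrates to 1 over the box: the electron is delocalized
# across the whole well, as the post describes.
x = np.linspace(0.0, L, 100_000)
norm = np.sum(wavefunction(1, L, x) ** 2) * (x[1] - x[0])
print(f"normalization of the ground state: {norm:.4f}")
```

Note how the allowed energies scale as n^2, just as the discrete states discussed later in this thread.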

- #10


The uncertainty principle states that we cannot simultaneously measure certain pairs of physical properties of particles with arbitrary accuracy, for example position and momentum, or energy and time. The more accurately we measure one of the pair, the less accurately we can measure the other.

These principles appear to me to be predicated on the rather ethereal life of particles, which appear to reside in a universe of total uncertainty whereby one can only gauge the likelihood of their appearance in our physical universe. Furthermore, they appear to be able to travel on all paths between points, which, given that they travel at c and therefore have infinite time available, seems quite possible. It's only when we look for them that this behavior is modified and they appear in our world.

- #11

Science Advisor

Gold Member


The uncertainty principle is primarily a relationship on the statistical results of measurements. We are perfectly able to measure the position and momentum of a particle simultaneously. What the uncertainty principle states, though, is that if we were to take a large number of identical measurements, the variances (or spreads) of the position and the momentum are related: if we achieve a low variance in the position measurements, this places a limit on the variance in the momentum measurements. This does not mean that it is a completely causal phenomenon. For example, people often ascribe the electron cloud to a consequence of uncertainty. However, that could not be further from the truth: the electron cloud is precisely predicted by the wavefunction for a definite energy of the electrons. The uncertainty principle is only involved in terms of measurements.

In Bell's inequality, there is no means for us to set the entangled state to our liking, and so no means for us to transfer information. For that to occur, there would have to be a way for us to set, a priori, the measured state of our half of the entangled pair. In this manner we would be able to force the other half of the pair into a desired state, and thus transmit ones and zeros simply by manipulating the measurement of our entangled particles. However, this is not possible: when we measure our half of the entangled pair, there is no way for us to force a particular measured state.

> The uncertainty principle states that we cannot simultaneously measure certain pairs of physical properties of particles with arbitrary accuracy, for example position and momentum, or energy and time. The more accurately we measure one of the pair, the less accurately we can measure the other.
>
> These principles appear to me to be predicated on the rather ethereal life of particles, which appear to reside in a universe of total uncertainty whereby one can only gauge the likelihood of their appearance in our physical universe. Furthermore, they appear to be able to travel on all paths between points, which, given that they travel at c and therefore have infinite time available, seems quite possible. It's only when we look for them that this behavior is modified and they appear in our world.

Particles do not follow all possible paths, nor are they allowed to violate special relativity (any formulation that allows such a violation is one that does not include special relativity; most of quantum mechanics uses non-relativistic theory). The path formulations are a mathematical tool and are not considered to have any true physical counterpart.

Particles can appear and disappear by virtue of special relativity, and this does not have anything to do with quantum mechanics: if we have a system with a certain amount of energy, then that energy can convert into a particle by virtue of mass-energy equivalence (E = mc^2). If we couple special relativity with quantum mechanics, we simply provide a mechanism for the creation and annihilation of particles via that equivalence.

The only truly quantum behavior here lies with virtual particles. Virtual particles are particles that are created from, and annihilated into, energy on very short time scales, to the effect that they are not considered to be real particles. This occurs because observation of a system over a very small time interval requires a large variance in the observed energy states, by virtue of the uncertainty principle. Since the energy can vary, the system could momentarily have a large enough energy to create a particle; but since this energy spike is fleeting, so is the particle's lifetime. At the same time, virtual particles are another mathematical tool. They are not considered to be truly physical, and it is important to note that we are not saying that a bunch of energy is created from nothing over these short time spans, only that the variance in the observed energy is large.
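The "fleeting" lifetime of such an energy spike can be put into numbers with the energy-time uncertainty relation. A back-of-the-envelope sketch, an order-of-magnitude estimate rather than a proper QFT calculation:

```python
# Order-of-magnitude estimate: how long can an energy fluctuation large
# enough to create an electron-positron pair persist? The energy-time
# uncertainty relation dE * dt >= hbar / 2 sets the scale.

HBAR = 1.054571817e-34  # J*s
C = 299792458.0         # m/s
M_E = 9.1093837015e-31  # electron mass, kg

dE = 2 * M_E * C ** 2   # "borrowed" energy: rest energy of the pair
dt = HBAR / (2 * dE)    # roughly how long the fluctuation can last

print(f"borrowed energy: {dE:.3e} J")
print(f"lifetime scale:  {dt:.3e} s")
print(f"range at c:      {C * dt:.3e} m")  # sub-atomic, hence 'virtual'
```

The lifetime comes out around 10^-22 s, and even at light speed the pair cannot travel farther than a fraction of an atomic diameter before it must vanish.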


- #12


The wave-particle duality comes from the famous double-slit experiment, in which electrons as well as photons were shot at a wall with two slits and a detector on the other side. With classical particles one would expect to see two bands of hits on the detector; however, an interference pattern appears, which is characteristic of waves. This happens even when the particles travel through the slits one at a time. If any of this is incorrect please set me straight. Hope that helps the OP.

Joe
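The statistics Joe describes can be simulated directly: draw single "dots" from the two-slit probability density and watch the fringes emerge only in the histogram. The wavelength, slit separation, and screen distance below are illustrative numbers, not taken from any particular experiment:

```python
import numpy as np

# Each particle lands at ONE point; the interference pattern |psi1+psi2|^2
# only shows up in the distribution of many single-particle detections.

rng = np.random.default_rng(0)

wavelength = 50e-9  # de Broglie wavelength (illustrative)
d = 2e-6            # slit separation
D = 1.0             # distance from slits to screen
x = np.linspace(-0.05, 0.05, 2001)  # candidate positions on the screen

# Far-field two-slit probability density ~ cos^2(pi d x / (lambda D)).
p = np.cos(np.pi * d * x / (wavelength * D)) ** 2
p /= p.sum()

# 100,000 independent single-particle detections ("one dot at a time").
hits = rng.choice(x, size=100_000, p=p)

# Bin the dots: bright and dark fringes appear in the histogram.
counts, _ = np.histogram(hits, bins=50, range=(-0.05, 0.05))
print("least / most populated bins:", counts.min(), counts.max())
```

No single detection shows a fringe; the contrast between nearly empty and heavily populated bins is the wave behavior.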

- #13

Science Advisor

Gold Member


> The wave-particle duality comes from the famous double-slit experiment, in which electrons as well as photons were shot at a wall with two slits and a detector on the other side. With classical particles one would expect to see two bands of hits on the detector; however, an interference pattern appears, which is characteristic of waves. This happens even when the particles travel through the slits one at a time. If any of this is incorrect please set me straight. Hope that helps the OP.
>
> Joe

The actual mechanism of observation is not considered. However I have seen explanations that follow along such lines. For example, in Feynman's path integral text he explains the uncertainty in terms of purely mechanical methods. I can't remember it well enough now to say more on the matter but I will say that the uncertainty principle can be borne out completely by the mathematics of quantum mechanics.

When we talk about quantum mechanics in terms of states and such, what we are really talking about is the fact that there are many many solutions to the Schroedinger equation. In classical mechanics, these solutions would generally be a continuum but in quantum mechanics they are quantized into discrete states.

So for example, if I have a quantum oscillator, the analog of the classical mass on a spring, we find that unlike the classical oscillator, the quantum one has discrete energy levels. So mathematically we can describe the quantum oscillator as a combination of its energy states. But there are other observables that we can ascribe to the oscillator: position is one, and so is momentum. We find that the system is quantized with respect to these observables as well. However, the big thing is that the state where the energy is E_0 may not be described as a single momentum state p_0. Instead, state E_0 will be a combination of momentum states. In a way, we are saying that if the oscillator has an energy E_0, then there is a set of momenta that the oscillator can have, say p_0, p_1, p_5, ... Thus, if you measure a system that was precisely prepared in energy state E_0, there are a variety of momentum states that it can be in. This is the uncertainty. When we measure a system where the observables do not share the same eigenfunctions (that is, states of the Schroedinger equation), then if we narrow one observable down to a definite eigenfunction, the other observable will be described by a set of eigenfunctions (and thus we can measure multiple values of that observable). The Heisenberg uncertainty principle looks at the mathematics and gives us a limit on this variance.
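The oscillator example can be checked numerically: take the ground-state wavefunction, compute the spread in position and the spread in momentum, and see that their product sits exactly at the Heisenberg minimum. A sketch in natural units (hbar = m = omega = 1, chosen purely for convenience):

```python
import numpy as np

# The oscillator's ground state has a definite energy E_0 but NOT a
# definite momentum; its spreads saturate the bound dx * dp = hbar / 2.

x = np.linspace(-10, 10, 4096)
dx = x[1] - x[0]

# Ground-state wavefunction psi_0(x) = pi^(-1/4) * exp(-x^2 / 2).
psi = np.pi ** -0.25 * np.exp(-x ** 2 / 2)

# <x> = 0 by symmetry, so Var(x) = <x^2>.
var_x = np.sum(x ** 2 * psi ** 2) * dx

# <p> = 0, and <p^2> = integral of |d psi / dx|^2 dx (with hbar = 1).
dpsi = np.gradient(psi, x)
var_p = np.sum(dpsi ** 2) * dx

product = np.sqrt(var_x * var_p)
print(f"dx = {np.sqrt(var_x):.4f}, dp = {np.sqrt(var_p):.4f}")
print(f"dx * dp = {product:.4f} (Heisenberg minimum: 0.5)")
```

A single energy eigenstate thus carries a whole distribution of momenta, exactly as described above.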

Another way to think of it is this. In signal processing, we can look at a signal in the time domain or the frequency domain. If I have a wave packet, like a Gaussian pulse, in the time domain, then in the frequency domain we observe a wide band of frequencies that make up this packet. The narrower the packet is in time (the more localized), the wider the bandwidth needed to create the pulse. So we can think of the width of the pulse in the time domain as the variance of our measurement in time, and the bandwidth in the frequency domain as related to our variance in the measurement of frequency. If we took a single measurement of the time and frequency of the signal, say by reading a single photon from the packet, we would have a very localized time measurement, since the pulse is highly localized in time. However, the frequency would be all over the map, since the signal is very wideband. But this behavior is only borne out when we look at the measurements of the many, many photons that make up the signal.
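The signal-processing analogy can be run in code with an FFT: a Gaussian pulse's time spread and angular-frequency spread always multiply to 1/2, no matter how narrow or wide the pulse. The pulse widths below are arbitrary illustrative choices:

```python
import numpy as np

# Time-bandwidth product of a Gaussian pulse: narrower in time means
# wider in frequency, with sigma_t * sigma_w = 1/2 exactly for a Gaussian.

def spreads(sigma_t, n=2 ** 14, span=200.0):
    t = np.linspace(-span / 2, span / 2, n, endpoint=False)
    dt = t[1] - t[0]
    s = np.exp(-t ** 2 / (2 * sigma_t ** 2))      # Gaussian envelope
    s /= np.sqrt(np.sum(s ** 2) * dt)             # normalize to unit energy
    w = 2 * np.pi * np.fft.fftfreq(n, d=dt)       # angular frequencies
    S = np.fft.fft(s) * dt / np.sqrt(2 * np.pi)   # spectrum (also unit energy)
    dw = w[1] - w[0]
    st = np.sqrt(np.sum(t ** 2 * s ** 2) * dt)          # time spread
    sw = np.sqrt(np.sum(w ** 2 * np.abs(S) ** 2) * dw)  # frequency spread
    return st, sw

products = []
for sigma in (0.5, 2.0, 8.0):
    st, sw = spreads(sigma)
    products.append(st * sw)
    print(f"sigma_t={st:.3f}  sigma_w={sw:.3f}  product={st * sw:.3f}")
```

This is the same mathematics as the position-momentum relation: the wavefunction and its momentum representation are a Fourier-transform pair.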

So going back to the ability of particles to pop out of the vacuum: if we treat the system over a short time frame, what happens is that we can have a large variety of possible energy states. The higher energy states are excited enough that the energy can be converted into a particle via mass-energy equivalence. So we are not saying that the energy of the vacuum is oscillating wildly; we are saying that we are observing the vacuum in such a manner that we can find a large range of perfectly valid energies. Thus, given the probability that we would observe energies of certain values, there is an associated probability that a particle can be created out of the energy. This is only borne out by the fact that quantum mechanics is observed over a statistically meaningful set; that is, we observe the system a large number of times. Most of the time we will see the low energies, but once in a while we will see the high energy.


- #14

Science Advisor


I gave you a serious answer.

- #15

Science Advisor


> Things that behave as waves or particles depending on whether they are being observed; things that seem to exist in some ethereal manner, popping in and out of our existence, for which we cannot identify both speed and position; apparent transfer of information faster than light over any distance. "Nothing funny going on" has to be the greatest understatement of all time.

Most of what you just wrote down there doesn't make any sense. There's lots funny going on with your statements. My point, if you read my post ever so carefully, is that wave-particle duality is not some spooky statement regarding our inability to assign an identity to subatomic particles.

- #16


> The uncertainty principle is primarily a relationship on the statistical results of a measurement. We are perfectly able to measure the position and momentum of a particle simultaneously.

> In Bell's inequality, there is no means for us to set the entangled state to our liking. There is no means for us to transfer information.


> Particles do not follow all possible paths nor are they allowed to violate special relativity (any formulation that allows such violation is one that does not include special relativity, most of quantum mechanics uses non-relativistic theory). The path formulations are a mathematical tool and are not considered to have any true physical correlation. [...]

In quantum mechanics, the Heisenberg uncertainty principle states that certain pairs of physical properties, like position and momentum, cannot both be known to arbitrary precision. That is, the more precisely one property is known, the less precisely the other can be known. Could you expand on the manner in which you have managed to overcome this, please?

I did not suggest that we could transmit information faster than light by means of entanglement, merely that it has been demonstrated to occur in the Aspect experiments and others that followed.

In QED, light (or any other particle, like an electron or a proton) passes over every possible path allowed by apertures or lenses. The observer (at a particular location) simply detects the mathematical result of all the wave functions added up.

- #17


> In quantum mechanics, the Heisenberg uncertainty principle states that certain pairs of physical properties, like position and momentum, cannot both be known to arbitrary precision. That is, the more precisely one property is known, the less precisely the other can be known. Could you expand on the manner in which you have managed to overcome this, please?
>
> I did not suggest that we could transmit information faster than light by means of entanglement, merely that it has been demonstrated to occur in the Aspect experiments and others that followed.
>
> In QED, light (or any other particle, like an electron or a proton) passes over every possible path allowed by apertures or lenses. The observer (at a particular location) simply detects the mathematical result of all the wave functions added up.

FTL transmission of information has been DEMONSTRATED? Please cite.

- #18

Science Advisor

Gold Member


> In quantum mechanics, the Heisenberg uncertainty principle states that certain pairs of physical properties, like position and momentum, cannot both be known to arbitrary precision. That is, the more precisely one property is known, the less precisely the other can be known. Could you expand on the manner in which you have managed to overcome this, please?
>
> I did not suggest that we could transmit information faster than light by means of entanglement, merely that it has been demonstrated to occur in the Aspect experiments and others that followed.
>
> In QED, light (or any other particle, like an electron or a proton) passes over every possible path allowed by apertures or lenses. The observer (at a particular location) simply detects the mathematical result of all the wave functions added up.

For the last two points: information has not been transferred faster than light in any experiments. The reason, as I explained previously, is that we cannot predetermine what the measurement will be, so there is no mechanism by which we can send a desired signal. It is essentially like stating that Professor Busybee always wears one red and one green sock. If you know the left sock is red, then you automatically know that the other sock is green. But you cannot force the Professor to wear a red sock on his left foot, so there is no information transferred here. If we could force the left sock to turn up red or green at our behest, then we could come up with a way to somehow ship a succession of the poor Professor's right feet to some recipient far away and send a coded message by manipulating the color of the socks that he would see. Using Professor Busybee in this manner is incorrect (and inhumane; I'm sure he's rather attached to his appendages) because the Professor makes a conscious choice in his sock selection: he predetermined the colors in what can be described as a hidden variable. The Bell inequality is what challenges such hidden variables, but I only use him as a visual aid.

QED's treatment of the path integral is not physical, as I stated previously. It is a mathematical tool and is not meant as a physical mechanism of propagation. In addition, the QED path integral does not allow faster-than-light paths, because it is a relativistic theory; such paths in the integration acquire special properties that differentiate them from valid paths. QED treats light in the quantum-field-theory manner: the scalar and vector potentials are treated as fields, and the excitations of these fields are the photons. Whenever a field interacts, it does so through a point-like interaction of a quantum of energy/momentum, the photon. So photons are not treated as particles having trajectories; that is merely a mathematical tool for calculations. Instead, photons are "created" in the sense that we excite the fields, and "annihilated" when the fields interact and give up a quantum of energy. In between these events of creation and annihilation we make no physical assumptions about photons; in effect, we do not really consider the particles to exist. What we regard as particles are the interactions of the fields with the measurement/observer, and these interactions behave like particles.

The Heisenberg uncertainty principle makes NO assertions about the precision of a measurement. It is a consequence that describes the relationship between the statistics of measurements of what are called incompatible observables. It does not mean that the measurements are inaccurate or incorrect; it just means that, for certain pairs of observables, we will not get the same measurements over and over again across a statistical set.

Think of a machine that measures some quantum state. The machine reports the measurement in the form of marbles: each measurement produces a sack of marbles that varies in color, number, and size. We will consider color and size to be compatible (commuting) observables; that is, every marble that is 0.5 in. in diameter is always green, and vice versa. However, color (and by extension size) is incompatible (non-commuting) with number. That is, if we measure the state and have the machine make our sack of marbles, we might get 5 green marbles, or 3 green marbles, or 3 red marbles, and so on.

So make 10,000 measurements and we get 10,000 sacks of marbles of varying colors and numbers. If we were to separate out the sacks by number, we might find that for sacks of 5 marbles we have 10% red, 50% green, and 40% blue; for sacks of 6 marbles, 20% red, 30% green, and 50% blue; and so on. Thus, we measure the number of marbles EXACTLY, but because a sack of five marbles can be red, green, or blue, we get a spread of colors in our set of measurements. This spread is described by the wavefunction in terms of color for the given eigenvalue of N marbles. This is the same as when we get an eigenfunction of position for a given eigenvalue of energy E. If the system has energy E_0, then the eigenfunction describes the positional distribution of measurements of the system in this state (assuming time-independence). So if we measure the position of a particle in a system in state E_0, there are many, many positions that it could be in, and thus we get a statistical spread of position measurements.

Likewise, if we arrange the sacks by color, we may find that green came in sacks of 5 10% of the time, in sacks of 6 40% of the time, and so on. This is another eigenfunction, giving the distribution of number for an eigenvalue of color. In this manner, we see that the eigenfunctions that describe the system in terms of color are different from the eigenfunctions that describe the system in terms of number.

Heisenberg's uncertainty principle then gives us the relationship between the variances of color and number over all our measurements. If we go back to our sacks and map out the number and color of the marbles in each, we will find a mean color and a mean number (to allow for a mean color, we could let the color transition gradually over the visible spectrum instead of taking three discrete values). However, there will be a spread in the measurements of numbers and colors, and the minimum spread is related by the uncertainty principle.

The problems of measurement and precision do not enter the argument yet; this is purely a consequence of the mathematics of quantum mechanics.
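The marble machine can be simulated to make the point concrete. A sketch using the illustrative conditional probabilities from the post (sacks of 5: 10% red / 50% green / 40% blue; sacks of 6: 20% / 30% / 50%), with sacks of 5 and 6 assumed equally likely:

```python
import random

# "Number" comes out exact on every measurement, yet for a fixed number
# the "color" still has a spread: the two observables are incompatible.

random.seed(1)
COLORS = ["red", "green", "blue"]
COND = {5: [0.10, 0.50, 0.40], 6: [0.20, 0.30, 0.50]}  # P(color | number)

def measure():
    """One run of the machine: a sack with a definite number and one color."""
    n = random.choice([5, 6])
    color = random.choices(COLORS, weights=COND[n])[0]
    return n, color

tallies = {5: {c: 0 for c in COLORS}, 6: {c: 0 for c in COLORS}}
for _ in range(10_000):
    n, color = measure()
    tallies[n][color] += 1

for n in (5, 6):
    total = sum(tallies[n].values())
    freqs = {c: round(tallies[n][c] / total, 2) for c in COLORS}
    print(f"sacks of {n}: {freqs}")  # close to the conditional weights
```

Every single sack has an exact number and an exact color; the "uncertainty" lives entirely in the spread across the 10,000-run statistical set.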

- #19


- #20


- #21


Do read the first chapter of The Feynman Lectures on Physics, Volume III.

- #22


When we do experiments with single electrons or photons, they always are detected as particles. i.e. they are localized in space and time. For example, if the particles are detected on a fluorescent screen they appear as dots, one dot at a time for each particle. There is no evidence of any wave behavior in a single dot.

When we repeat the experiment many times, different particles hit the screen in different locations. It is the angular distribution of all the particles that looks like an interference pattern familiar to us from wave optics. But, it is important to remember that a particle is not a wave. Rather, it is the probability distribution of many particles that we identify as a “wave property” i.e. interference.

We have no idea what the electrons "are doing" in an atom. They do not have classical motion, and quantum mechanics does not describe their behavior in classical terms. No one has ever seen atomic electrons oscillating up and down or moving translationally around the atom. Unfortunately, there is no answer to the question, "What is the motion of an atomic electron?"

Unfortunately, we cannot describe quantum objects in a classical way, as you would like.

- #23


> When we do experiments with single electrons or photons, they always are detected as particles. i.e. they are localized in space and time. For example, if the particles are detected on a fluorescent screen they appear as dots, one dot at a time for each particle. There is no evidence of any wave behavior in a single dot.

What about experiments suggesting single particles can interfere with themselves and yield interference patterns? (We also have the experiments that say a particle, even a molecule, can be in two places at the same time, which would suggest how it could interfere with itself by passing through two slits at once.)

As for the description of uncertainty, according to wiki

> The amount of left-over uncertainty can never be reduced below the limit set by the uncertainty principle, no matter what the measurement process...
>
> Today, logical positivism has become unfashionable in many cases, so the explanation of the uncertainty principle in terms of observer effect can be misleading. For one, this explanation makes it seem to the non-positivist that the disturbances are not a property of the particle, but a property of the measurement process: the particle secretly does have a definite position and a definite momentum, but the experimental devices we have are not good enough to find out what these are. This interpretation is not compatible with standard quantum mechanics. In quantum mechanics, states which have both definite position and definite momentum at the same time just don't exist.
>
> This was a surprising prediction of quantum mechanics, and not yet accepted. Many people would have considered it a flaw that there are no states of definite position and momentum. Heisenberg was trying to show this was not a bug, but a feature: a deep, surprising aspect of the universe.

If we take this to be right, even an individual particle cannot have both properties at the same time; even if it is not measured, it exists without both of these things being definite. The wiki article describes even stranger quantum behavior, such as that a failure to measure (e.g., the particle fails to hit the detector, if I'm not mistaken), a failure to interact with something, can also disturb the particle.

Since most hidden variable theories assume nonlocality and keep CFD (counterfactual definiteness), since one of the two has to go according to the experimental data, and as these hidden variable theories are usually dismissed, I assume what was tossed out of mainstream quantum physics was CFD, and locality was kept in. (If I recall correctly, I've heard locality was kept for compatibility with relativity.) Thus all unmeasured properties are believed not to be definite for something that has not been measured (or, to exaggerate with a metaphor: "the moon is not there when you don't look"). It seems more sensible to me to throw out locality and keep CFD, and sort out any conflicts that may arise with relativity.

> Counterfactual definiteness is a basic assumption, which, together with locality, leads to Bell inequalities. In their derivation it is explicitly assumed that every possible measurement, even if not performed, would have yielded a single definite result.

Bell's theorem actually proves that every quantum theory must violate either locality or CFD.
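Bell's theorem can be made concrete with the CHSH form of the inequality. A sketch using the standard quantum prediction for a polarization singlet, with the textbook optimal analyzer angles (the correlation formula and angles are the usual ones, stated here as assumptions):

```python
import math

# CHSH test: local theories with counterfactual definiteness obey
# |S| <= 2; the quantum singlet prediction E(a, b) = -cos(2(a - b))
# reaches 2 * sqrt(2), Tsirelson's bound.

def E(a, b):
    """Quantum correlation for polarization analyzers at angles a, b (radians)."""
    return -math.cos(2 * (a - b))

# Standard optimal angle choices.
a0, a1 = 0.0, math.pi / 4
b0, b1 = math.pi / 8, 3 * math.pi / 8

S = E(a0, b0) - E(a0, b1) + E(a1, b0) + E(a1, b1)
print(f"CHSH |S| = {abs(S):.4f} (classical bound 2, "
      f"quantum bound {2 * math.sqrt(2):.4f})")
```

The computed |S| exceeds 2, which is exactly why locality-plus-CFD cannot both survive the experimental data.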


- #24

Science Advisor


> What about experiments suggesting single particles can interfere with themselves and yield interference patterns? (We also have the experiments that say a particle, even a molecule, can be in two places at the same time, which would suggest how it could interfere with itself by passing through two slits at once.)

There is no contradiction ... the photon is always *detected* at a single point (i.e. pixel). You do not detect interference patterns for a single experiment with a single photon ... interference patterns are built up out of dots (i.e. single photon detection events), over a series of repeated experiments, each with a single photon.

So yes, as long as the experiment does not detect which-path information, then each single photon travels along both paths, interferes with itself, and is detected at some point on the detector screen. The probability of it being detected at any given point is determined by the quantum interference.

> As for the description of uncertainty, according to wiki ...
>
> If we take this to be right, even an individual particle cannot have both properties at the same time; even if it is not measured, it exists without both of these things being definite.

That is correct for undetected particles (according to standard QM); however, it is *not* correct for particles that have interacted with a detector. When you measure a single particle, you measure its properties with a precision that is determined *only* by the measurement precision, which can be taken to be infinitely good for the sake of this argument. There is *no* fundamental limit on the measurement precision for a single particle; this is because the phenomenon of quantum decoherence that occurs upon detection causes the system (particle and detector) to be resolved into a single measurement state (i.e. an eigenstate of the property being measured).

If the particle happened to exist in a single eigenstate of the measured property before measurement, then only that particular eigenstate will be measured, no matter how many times the experiment is repeated for identical conditions. However, if the particle existed in a superposition of eigenstates (as is true in the case we are considering for position and momentum measurements), then a series of measurements will observe a range of results, where the distribution is determined by the probability envelope of the superposition, which is in turn limited according to the HUP.
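That last paragraph can be simulated: every single measurement returns one sharp eigenvalue, and the HUP-limited spread only appears across repeated identical preparations. A sketch with made-up amplitudes (0.6 and 0.8 are illustrative, chosen so the Born-rule probabilities sum to 1):

```python
import numpy as np

# A particle prepared in c0|e0> + c1|e1> yields a single definite result
# per shot, but repeated preparations distribute the results as |c_i|^2.

rng = np.random.default_rng(7)

c = np.array([0.6, 0.8])    # amplitudes (illustrative); 0.36 + 0.64 = 1
probs = np.abs(c) ** 2      # Born-rule probabilities

# A particle already in a single eigenstate always gives that eigenvalue:
eigenstate_results = rng.choice([0, 1], size=1000, p=[1.0, 0.0])
print("pure eigenstate outcomes:", set(eigenstate_results.tolist()))

# A superposition gives one sharp value per shot, spread over the set:
shots = rng.choice([0, 1], size=100_000, p=probs)
freqs = np.bincount(shots) / shots.size
print("superposition frequencies:", freqs)  # near [0.36, 0.64]
```

Each individual shot is perfectly definite; only the ensemble shows the probability envelope of the superposition.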

- #25


> There is no contradiction ... the photon is always *detected* at a single point (i.e. pixel). You do not detect interference patterns for a single experiment with a single photon ... interference patterns are built up out of dots (i.e. single photon detection events), over a series of repeated experiments, each with a single photon.
>
> So yes, as long as the experiment does not detect which-path information, then each single photon travels along both paths, interferes with itself, and is detected at some point on the detector screen. The probability of it being detected at any given point is determined by the quantum interference.

According to one of the later replies in https://www.physicsforums.com/showthread.php?t=57528, the data suggests that each individual particle must be interfering with itself to give the observed results. It seems you agree that it can interfere with itself, so this indirectly shows that even individual single photons and particles can show wavelike behavior.

> That is correct for undetected particles (according to standard QM), however it is *not* correct for particles that have interacted with a detector. When you measure a single particle, you measure its properties with a precision that is determined *only* by the measurement precision, which can be taken to be infinitely good for the sake of this argument. There is *no* fundamental limit on the measurement precision for a single particle; this is because the phenomenon of quantum decoherence that occurs upon detection causes the system (particle and detector) to be resolved into a single measurement state (i.e. eigenstate of the property being measured).
>
> If the particle happened to exist in a single eigenstate of the measured property before measurement, then only that particular eigenstate will be measured, no matter how many times the experiment is repeated for identical conditions. However, if the particle existed in a superposition of eigenstates (as is true in the case we are considering for position and momentum measurements), then a series of measurements will observe a range of results, where the distribution is determined by the probability envelope of the superposition, which is in turn limited according to the HUP.

If the lack of interaction with a detector can disturb a particle and thus bring uncertainty (see the Renninger negative-result experiment, http://en.wikipedia.org/wiki/Renninger_negative-result_experiment), how can you be sure that your particular interaction with a particular detector is not affected by the lack of interactions with other parts disturbing the particle?


- #26

Science Advisor


According to one of the last https://www.physicsforums.com/showthread.php?t=57528 the data suggests that each individual particle must be interfering with itself to give the observed results. It seems you agree that it can interfere with itself, so it indirectly shows that even individual single photons and particles can showcase wavelike behavior.

Yup.

If, as in the http://en.wikipedia.org/wiki/Renninger_negative-result_experiment, even the lack of interaction with a detector can disturb a particle and thus bring uncertainty, how can you be sure that your particular interaction with a particular detector is not affected by the lack of interactions with other parts disturbing the particle?

To quote the chicken farmer from Napoleon Dynamite, "I don't understand a word you just said."

Seriously though ... I really don't understand how any of that is relevant to what we were discussing. Yes, there are many different perturbations that may end up changing the particular value observed for a given measurement. My point was that there is only ever *one* value that is measured (and with arbitrary precision).


- #27


Obviously, this is somewhat strange. Personally I find that it is natural that matter has a wavelength (dependent on momentum) because E = mc^2, and also from Fourier series descriptions of particles. E = mc^2 says that all waves have an associated mass (since observable waves must have energy), and Fourier series say that the probability density of finding a particle as a function of position can be represented as a wave.

What Schrödinger did was simply to say that there is a wave function associated with each particle. This came from two quantum principles:

(1) E = h[tex]\nu[/tex]

(2) p = [tex]h/\lambda[/tex]

Using (1) the time frequency of the wave can be found (i.e. the "speed" of propagation)

Using (2) the spatial frequency of the wave can be found.

By adding many waves together, it is possible to create many different shapes of probability distributions. The end result however is Heisenberg's uncertainty principle:

It turns out that the Schrödinger equation describes the probability density of finding a particle as a function of time and space. If you localize the wave to a single point, then you don't know what the wavelengths are. Likewise, if you know the wavelength exactly, then the wave function is so spread out that it is impossible to tell where the particle is if you were to observe it. This is all of course according to wave-function mechanics.
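As a quick numerical illustration of the Fourier point above (my own sketch, with arbitrary numbers): superpose plane waves with Gaussian amplitude weights around a central wavenumber k0 and compute the spreads. The product of the position spread and the wavenumber spread comes out at the minimum-uncertainty value of 1/2.

```python
import numpy as np

# Superpose plane waves exp(i*k*x) with Gaussian amplitude weights
# around k0 (all numbers are arbitrary, for illustration only).
x = np.linspace(-30.0, 30.0, 2048)
k = np.linspace(0.0, 10.0, 400)
k0, sigma = 5.0, 0.5
w = np.exp(-((k - k0) ** 2) / (2 * sigma ** 2))  # amplitude weights

# psi(x) = sum_k w(k) exp(i k x): a localized wavepacket
psi = (w[:, None] * np.exp(1j * np.outer(k, x))).sum(axis=0)

def spread(grid, p):
    """Standard deviation of a discrete probability distribution."""
    p = p / p.sum()
    mean = (grid * p).sum()
    return np.sqrt(((grid - mean) ** 2 * p).sum())

sig_x = spread(x, np.abs(psi) ** 2)  # spread in position
sig_k = spread(k, w ** 2)            # spread in wavenumber
print(sig_x * sig_k)                 # ~0.5: the minimum-uncertainty case
```

Making the weight distribution narrower in k widens the packet in x, and vice versa; the Gaussian case saturates the bound, which is the uncertainty-principle trade-off in miniature.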

- #28


@flashprogram: I made this same error of reasoning a few months ago (can't find the thread), and was roundly spanked. SpectraCat is giving you a very good answer. The wavefunction interferes with itself, but to detect the interference requires the "image" (classic pattern) be built dot by dot. If a single particle is interfering with itself, no test thus far can show that WITH a single particle. The wavefunction, yes, the particle, no.

EDIT: Damn, beaten to the punch! :rofl:

@Couchyam: Welcome, and very nice first post! You wouldn't believe how most people (myself included) make a poor first impression.



- #29


To quote the chicken farmer from Napoleon Dynamite, "I don't understand a word you just said."

Seriously though ... I really don't understand how any of that is relevant to what we were discussing. Yes, there are many different perturbations that may end up changing the particular value observed for a given measurement. My point was that there is only ever *one* value that is measured (and with arbitrary precision).

Well, the example there shows that lack of interaction with a particular hypothetical detector can disturb the particle; I assume that would introduce uncertainty and affect measurements. Given that particles are part of the universe, there are countless things that one could imagine being 'considered' detectors, and if that is so, this should interfere with any measurement simply due to the lack of interaction with these other parts of the system.

That is, unless there are stringent requirements on what can be considered a valid detector, i.e. a detector that can disturb a particle even when there is no interaction. Or unless said disturbances somehow don't affect your measurements.

@flashprogram: I made this same error of reasoning a few months ago (can't find the thread), and was roundly spanked. SpectraCat is giving you a very good answer. The wavefunction interferes with itself, but to detect the interference requires the "image" (classic pattern) be built dot by dot. If a single particle is interfering with itself, no test thus far can show that WITH a single particle. The wavefunction, yes, the particle, no.

True, but the point was that although you couldn't see it directly the data from current experiments seems to imply that it is indeed taking place even in single particles.

Right now I don't recall where, but I think I heard about an experiment with a stationary molecule that seemed to be in two places at once.

- #30

Science Advisor

Gold Member


Obviously, this is somewhat strange. Personally I find that it is natural that matter has a wavelength (dependent on momentum) because E = mc^2, and also from Fourier series descriptions of particles. E = mc^2 says that all waves have an associated mass (since observable waves must have energy), and Fourier series say that the probability density of finding a particle as a function of position can be represented as a wave.

What Schrödinger did was simply to say that there is a wave function associated with each particle. This came from two quantum principles:

(1) E = h[tex]\nu[/tex]

(2) p = [tex]h/\lambda[/tex]

Using (1) the time frequency of the wave can be found (i.e. the "speed" of propagation)

Using (2) the spatial frequency of the wave can be found.

By adding many waves together, it is possible to create many different shapes of probability distributions. The end result however is Heisenberg's uncertainty principle:

It turns out that the Schrödinger equation describes the probability density of finding a particle as a function of time and space. If you localize the wave to a single point, then you don't know what the wavelengths are. Likewise, if you know the wavelength exactly, then the wave function is so spread out that it is impossible to tell where the particle is if you were to observe it. This is all of course according to wave-function mechanics.

I agree with Frame Dragger: Great first post! Welcome to PhysicsForums!

- #31


According to one of the last https://www.physicsforums.com/showthread.php?t=57528 the data suggests that each individual particle must be interfering with itself to give the observed results. It seems you agree that it can interfere with itself, so it indirectly shows that even individual single photons and particles can showcase wavelike behavior.

There is no wave-like behavior until many photons have been detected and we observe the probability distribution of those many photons. Quantum mechanics is about probabilities and it is the probability distribution that we identify as an interference pattern.

It might help to read A. Tonomura, J. Endo, T. Matsuda, T. Kawasaki, and H. Ezawa, “Demonstration of single-electron build-up of an interference pattern”, Am. J. Phys. 57, 117-120 (1989).

This experiment is discussed at length in Mark P. Silverman, "More Than One Mystery" (Springer-Verlag, New York, 1995), pp. 1-8, and also in George Greenstein and Arthur G. Zajonc, "The Quantum Challenge" (Jones and Bartlett, Boston, 1997), pp. 1-4.

If we consider only what is observed in real experiments, then, in my opinion, much of the confusion goes away. It is when we try to explain what is happening that things get "weird". The Tonomura paper cited above describes a real experiment. There are many others, since it is now commonplace to do experiments where there is only one photon in the apparatus at any one time. None of the experiments show a single photon interfering with itself, passing through two slits at the same time, smearing itself all over a detection screen, or doing other strange things often attributed to it. Such comments are our attempts to explain "what the results really mean." If something cannot be verified experimentally, we should be very suspicious. And, unfortunately, the experimental results do not pertain to the photon before it is detected.


- #32


There is no wave-like behavior until many photons have been detected and we observe the probability distribution of those many photons. Quantum mechanics is about probabilities and it is the probability distribution that we identify as an interference pattern.

It might help to read A. Tonomura, J. Endo, T. Matsuda, T. Kawasaki, and H. Ezawa, “Demonstration of single-electron build-up of an interference pattern”, Am. J. Phys. 57, 117-120 (1989).

This experiment is discussed at length in Mark P. Silverman, "More Than One Mystery" (Springer-Verlag, New York, 1995), pp. 1-8, and also in George Greenstein and Arthur G. Zajonc, "The Quantum Challenge" (Jones and Bartlett, Boston, 1997), pp. 1-4.

If we consider only what is observed in real experiments, then, in my opinion, much of the confusion goes away. It is when we try to explain what is happening that things get "weird". The Tonomura paper cited above describes a real experiment. There are many others, since it is now commonplace to do experiments where there is only one photon in the apparatus at any one time. None of the experiments show a single photon interfering with itself, passing through two slits at the same time, smearing itself all over a detection screen, or doing other strange things often attributed to it. Such comments are our attempts to explain "what the results really mean." If something cannot be verified experimentally, we should be very suspicious. And, unfortunately, the experimental results do not pertain to the photon before it is detected.

I will have to read up on that, but the comments do suggest that the overall result must depend on individual photons having exhibited such behavior.

- #33


As for the description of uncertainty, according to wiki

If we take this to be right, even an individual particle cannot have both properties at the same time even if it is not measured; it exists without both these things being definite. The wiki article has even more strange quantum behavior, such as that the failure to measure (e.g. it fails to hit the detector, if I'm not mistaken), i.e. the failure to interact with something, can also disturb the particle.

We need to discuss quantum uncertainty. SpectraCat has it right! The accuracy of a measurement is determined by the skill of the experimenter and the limitations of his apparatus. We can imagine an ideal experiment that yields exact values for the observable being measured without being in conflict with quantum theory.

Classically, if we repeat the experiment we always get the same experimental result; classical physics is deterministic and the result is certain.

Not so in a quantum experiment! When we repeat the quantum experiment, in general, we get a different result, even if everything is done perfectly. If we repeat the experiment many times then we get the entire eigenvalue spectrum of the observable being measured.

The probability distribution of those results can be used to calculate the uncertainty in those results. Quantum mechanics is indeterminate and there is an uncertainty in the measured observable.

Of course, there is the special case (where the particle is in an eigenstate) where the experiment is set up so we always get the same result, as in the classical case. In this special quantum case there is no uncertainty.
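A small sketch of that point (the state and eigenvalues here are purely illustrative): draw repeated ideal measurements via the Born rule. A superposition yields a spread of eigenvalues over many runs; an eigenstate yields the same single value every time.

```python
import numpy as np

# Repeated ideal measurements of an observable with eigenvalues a_n
# on |psi> = sum_n c_n |a_n>: each run yields exactly ONE eigenvalue,
# with probability |c_n|^2 (Born rule).
eigenvalues = np.array([-1.0, 0.0, 2.0])

def measure_many(c, n_runs, seed=0):
    c = np.asarray(c, dtype=complex)
    p = np.abs(c) ** 2
    p = p / p.sum()                    # normalize the state
    rng = np.random.default_rng(seed)
    return rng.choice(eigenvalues, size=n_runs, p=p)

# Superposition: a range of results, spread set by the |c_n|^2 envelope.
runs = measure_many([1, 1j, 1], 100_000)
print("superposition:", np.unique(runs), "std =", runs.std())

# Eigenstate: the same single result every run -- no uncertainty.
runs_eig = measure_many([0, 0, 1], 100_000)
print("eigenstate:", np.unique(runs_eig), "std =", runs_eig.std())
```

Note that every individual run, in either case, produces one definite number; the "uncertainty" is a property of the distribution over repeated identical experiments, not of any single detection.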

- #34


Trust me on this... save yourself the pain and just do the research into this. You'll be happier without the continued protestations, and frankly, you'll have a new appreciation for the current experimental limitations on exploring the quantum world.

- #35


We need to discuss quantum uncertainty. SpectraCat has it right! The accuracy of a measurement is determined by the skill of the experimenter and the limitations of his apparatus. We can imagine an ideal experiment that yields exact values for the observable being measured without being in conflict with quantum theory.

Classically, if we repeat the experiment we always get the same experimental result; classical physics is deterministic and the result is certain.

Not so in a quantum experiment! When we repeat the quantum experiment, in general, we get a different result, even if everything is done perfectly. If we repeat the experiment many times then we get the entire eigenvalue spectrum of the observable being measured.

The probability distribution of those results can be used to calculate the uncertainty in those results. Quantum mechanics is indeterminate and there is an uncertainty in the measured observable.

Of course, there is the special case (where the particle is in an eigenstate) where the experiment is set up so we always get the same result, as in the classical case. In this special quantum case there is no uncertainty.

I believe counterfactual definiteness (CFD) refers to the idea that the values exist without measuring; that is, there is a definite value for every property that would have been obtained had it been measured. The Bell inequalities say that you have to discard either CFD or locality: hidden-variable theories keep CFD and discard locality, while mainstream quantum physics discards CFD and keeps locality to better agree with relativity.

I think that is one of the reasons why it is said that even a failure to interact with some detector constitutes a measurement and disturbs the particle. If you could gain information about the state of a particle through a series of non-interactions, then you could probably obtain all of its properties without disturbing it; but if even a lack of interaction can disturb the particle and be considered a measurement, then you cannot do this. That would at least protect the idea that there are no definite values for all the properties when not measured.

