# Wave particle duality

SpectraCat
According to one of the last posts in https://www.physicsforums.com/showthread.php?t=57528, the data suggests that each individual particle must be interfering with itself to give the observed results. It seems you agree that it can interfere with itself, so it indirectly shows that even individual photons and particles can exhibit wave-like behavior.
Yup.

If the lack of interaction with a detector can disturb a particle and thus introduce uncertainty (see the Renninger negative-result experiment, http://en.wikipedia.org/wiki/Renninger_negative-result_experiment), how can you be sure that your particular interaction with a particular detector is not affected by the lack of interactions with other parts of the system disturbing the particle?
To quote the chicken farmer from Napoleon Dynamite, "I don't understand a word you just said."

Seriously though ... I really don't understand how any of that is relevant to what we were discussing. Yes, there are many different perturbations that may end up changing the particular value observed for a given measurement. My point was that there is only ever *one* value that is measured (and with arbitrary precision).

Wave-particle duality comes as a result of de Broglie's hypothesis that matter has an associated wavelength. It has been verified that matter does in fact have a wavelength associated with it, since interference effects from electron scattering have been observed.
Obviously, this is somewhat strange. Personally, I find it natural that matter has a wavelength (dependent on momentum) because of E = mc^2, and also from Fourier-series descriptions of particles. E = mc^2 says that all waves have an associated mass (since observable waves must have energy), and Fourier series say that the probability density of finding a particle as a function of position can be represented as a wave.

What Schrödinger did was simply to say that there is a wave function associated with each particle. This came from two quantum relations:

(1) $$E = h\nu$$
(2) $$p = h/\lambda$$

Using (1), the temporal frequency of the wave can be found (i.e. the "speed" of propagation).
Using (2), the spatial frequency of the wave can be found.

By adding many waves together, it is possible to create many different shapes of probability distributions. The end result however is Heisenberg's uncertainty principle:

It turns out that the Schrödinger equation describes the probability density of finding a particle as a function of time and space. If you localize the wave to a single point, then you don't know what the wavelengths are. Conversely, if you know the wavelength exactly, then the wave function is so spread out that it is impossible to tell where the particle would be found if you were to observe it. This is all, of course, according to wave mechanics.
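The "adding many waves together" step can be sketched numerically. The snippet below is my own illustration, not from the thread; the grid sizes, the center wavenumber, and the width `sigma` are arbitrary choices. It superposes plane waves with Gaussian-weighted wavenumbers and checks the position/wavenumber trade-off that Heisenberg's principle describes:

```python
import numpy as np

# Superpose plane waves exp(i k x) with Gaussian weights around k0, then
# compare the spread in position with the spread in wavenumber.
x = np.linspace(-50.0, 50.0, 4096)
k = np.linspace(-5.0, 5.0, 512)
dx, dk_step = x[1] - x[0], k[1] - k[0]

k0, sigma = 1.0, 0.5                      # center and width of the k-amplitudes
weights = np.exp(-(k - k0) ** 2 / (2.0 * sigma ** 2))

# psi(x) = sum_k w(k) exp(i k x): a localized wave packet
psi = (weights[:, None] * np.exp(1j * np.outer(k, x))).sum(axis=0)

prob_x = np.abs(psi) ** 2
prob_x /= prob_x.sum() * dx               # normalize |psi|^2 in x
prob_k = weights ** 2
prob_k /= prob_k.sum() * dk_step          # momentum-space probability |w(k)|^2

mean_x = (x * prob_x).sum() * dx
delta_x = np.sqrt(((x - mean_x) ** 2 * prob_x).sum() * dx)
mean_k = (k * prob_k).sum() * dk_step
delta_k = np.sqrt(((k - mean_k) ** 2 * prob_k).sum() * dk_step)

# A Gaussian packet is the minimum-uncertainty case: delta_x * delta_k = 1/2
print(round(delta_x * delta_k, 3))
```

Mixing a wider band of wavenumbers (larger `sigma`) narrows the packet in position, and vice versa; the product stays at the Gaussian minimum of 1/2.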

@flashprogram: I made this same error of reasoning a few months ago (can't find the thread), and was roundly spanked. SpectraCat is giving you a very good answer. The wavefunction interferes with itself, but detecting the interference requires that the "image" (the classic pattern) be built up dot by dot. Even if a single particle is interfering with itself, no test so far can show that with a single particle. The wavefunction, yes; the particle, no.

EDIT: Damn, beaten to the punch! :rofl:
@Couchyam: Welcome, and very nice first post! You wouldn't believe how most people (myself included) make a poor first impression.

> To quote the chicken farmer from Napoleon Dynamite, "I don't understand a word you just said."
>
> Seriously though ... I really don't understand how any of that is relevant to what we were discussing. Yes, there are many different perturbations that may end up changing the particular value observed for a given measurement. My point was that there is only ever *one* value that is measured (and with arbitrary precision).
Well, the example there shows that the lack of interaction with a particular hypothetical detector can disturb the particle; I assume that would introduce uncertainty and affect measurements. Given that particles are part of the universe, there are countless things one could imagine being 'considered' detectors, and if so, this should interfere with any measurement simply through the lack of interaction with these other parts of the system.

That is, unless there are stringent requirements on what can be considered a valid detector (one that can disturb a particle even when there is no interaction), or unless said disturbances somehow don't affect your measurements.

> @flashprogram: I made this same error of reasoning a few months ago (can't find the thread), and was roundly spanked. SpectraCat is giving you a very good answer. The wavefunction interferes with itself, but to detect the interference requires the "image" (classic pattern) be built dot by dot. If a single particle is interfering with itself, no test thus far can show that WITH a single particle. The wavefunction, yes, the particle, no.
True, but the point was that although you can't see it directly, the data from current experiments seems to imply that it is indeed taking place even for single particles.

Right now I don't recall where, but I think I heard about an experiment with a stationary molecule that seemed to be in two places at once.

DrChinese
Gold Member
> Wave-particle duality comes as a result of de Broglie's hypothesis that matter has an associated wavelength. It has been verified that matter does in fact have a wavelength associated with it, since interference effects from electron scattering have been observed.
>
> Obviously, this is somewhat strange. Personally, I find it natural that matter has a wavelength (dependent on momentum) because of E = mc^2, and also from Fourier-series descriptions of particles. E = mc^2 says that all waves have an associated mass (since observable waves must have energy), and Fourier series say that the probability density of finding a particle as a function of position can be represented as a wave.
>
> What Schrödinger did was simply to say that there is a wave function associated with each particle. This came from two quantum relations:
>
> (1) $$E = h\nu$$
> (2) $$p = h/\lambda$$
>
> Using (1), the temporal frequency of the wave can be found (i.e. the "speed" of propagation).
> Using (2), the spatial frequency of the wave can be found.
>
> By adding many waves together, it is possible to create many different shapes of probability distributions. The end result, however, is Heisenberg's uncertainty principle:
>
> It turns out that the Schrödinger equation describes the probability density of finding a particle as a function of time and space. If you localize the wave to a single point, then you don't know what the wavelengths are. Conversely, if you know the wavelength exactly, then the wave function is so spread out that it is impossible to tell where the particle would be found if you were to observe it. This is all, of course, according to wave mechanics.
I agree with Frame Dragger: Great first post! Welcome to PhysicsForums!!

> According to one of the last posts in https://www.physicsforums.com/showthread.php?t=57528, the data suggests that each individual particle must be interfering with itself to give the observed results. It seems you agree that it can interfere with itself, so it indirectly shows that even individual photons and particles can exhibit wave-like behavior.
There is no wave-like behavior until many photons have been detected and we observe the probability distribution of those many photons. Quantum mechanics is about probabilities and it is the probability distribution that we identify as an interference pattern.

It might help to read A. Tonomura, J. Endo, T. Matsuda, T. Kawasaki, and H. Ezawa, "Demonstration of single-electron buildup of an interference pattern", Am. J. Phys. 57, 117-120 (1989).
This experiment is discussed at length in Mark P. Silverman, "More Than One Mystery" (Springer-Verlag, New York, 1995), pp. 1-8, and also in George Greenstein and Arthur G. Zajonc, "The Quantum Challenge" (Jones and Bartlett, Boston, 1997), pp. 1-4.

If we consider only what is observed in real experiments, then, in my opinion, much of the confusion goes away. It is when we try to explain what is happening, that things get "weird". The Tonomura paper cited above describes a real experiment. There are many others, since it is now commonplace to do experiments where there is only one photon in the apparatus at any one time. None of the experiments show a single photon interfering with itself, passing through two slits at the same time, smearing itself all over a detection screen, or doing other strange things often attributed to it. Such comments are our attempts to explain "what the results really mean." If something cannot be verified experimentally, we should be very suspicious. And, unfortunately, the experimental results do not pertain to the photon before it is detected.
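The dot-by-dot build-up described above can be mimicked with a toy simulation. This is my own sketch, not from the post; the slit geometry and all parameters are made up. Each "detection" is a single position sampled from a fixed fringe-shaped probability density, and the fringes only appear in the accumulated histogram of many dots:

```python
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(-10.0, 10.0, 2000)       # positions on the screen (arb. units)
d, sigma = 3.0, 4.0                      # fringe spacing parameter, envelope width
# |psi1 + psi2|^2 for two overlapping slit amplitudes -> cos^2 fringes
prob = np.exp(-x ** 2 / (2 * sigma ** 2)) * np.cos(np.pi * x / d) ** 2
prob /= prob.sum()

def detect(n):
    """Sample n single-particle detection positions (one dot each)."""
    return rng.choice(x, size=n, p=prob)

one_dot = detect(1)                      # a single run: one dot, no pattern visible
many = detect(100_000)                   # many runs: fringes emerge

counts, edges = np.histogram(many, bins=80, range=(-10, 10))
# The histogram has deep minima near the dark fringes (x = d/2, 3d/2, ...)
# even though every individual event was just one dot.
```

Each sample is a single localized hit; only the statistics of many hits reproduce the interference pattern, which is the distinction being made in the post above.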

> There is no wave-like behavior until many photons have been detected and we observe the probability distribution of those many photons. Quantum mechanics is about probabilities, and it is the probability distribution that we identify as an interference pattern.
>
> It might help to read A. Tonomura, J. Endo, T. Matsuda, T. Kawasaki, and H. Ezawa, "Demonstration of single-electron buildup of an interference pattern", Am. J. Phys. 57, 117-120 (1989). This experiment is discussed at length in Mark P. Silverman, "More Than One Mystery" (Springer-Verlag, New York, 1995), pp. 1-8, and also in George Greenstein and Arthur G. Zajonc, "The Quantum Challenge" (Jones and Bartlett, Boston, 1997), pp. 1-4.
>
> If we consider only what is observed in real experiments, then, in my opinion, much of the confusion goes away. It is when we try to explain what is happening that things get "weird". The Tonomura paper cited above describes a real experiment. There are many others, since it is now commonplace to do experiments where there is only one photon in the apparatus at any one time. None of the experiments show a single photon interfering with itself, passing through two slits at the same time, smearing itself all over a detection screen, or doing other strange things often attributed to it. Such comments are our attempts to explain "what the results really mean." If something cannot be verified experimentally, we should be very suspicious. And, unfortunately, the experimental results do not pertain to the photon before it is detected.
I will have to read up on that, but the comments do suggest that the overall result must depend on individual photons having exhibited such behavior.

As for the description of uncertainty, according to wiki

If we take this to be right, even an individual particle cannot have both properties at the same time even when it is not measured; it exists without both of these things being definite. The wiki article describes even stranger quantum behavior, such as that a failure to measure (e.g. the particle fails to hit the detector, if I'm not mistaken), a failure to interact with something, can also disturb the particle.
We need to discuss quantum uncertainty. SpectraCat has it right! The accuracy of a measurement is determined by the skill of the experimenter and the limitations of his apparatus. We can imagine an ideal experiment that yields exact values for the observable being measured without being in conflict with quantum theory.

Classically, if we repeat the experiment we always get the same experimental result; classical physics is deterministic and the result is certain.

Not so in a quantum experiment! When we repeat the quantum experiment, in general, we get a different result, even if everything is done perfectly. If we repeat the experiment many times then we get the entire eigenvalue spectrum of the observable being measured.
The probability distribution of those results can be used to calculate the uncertainty in those results. Quantum mechanics is indeterminate and there is an uncertainty in the measured observable.

Of course, there is the special case (where the particle is in an eigenstate) where the experiment is set up so we always get the same result, as in the classical case. In this special quantum case there is no uncertainty.
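The contrast between the general case (a spread of results) and the eigenstate special case (no uncertainty) can be illustrated with a toy Born-rule sampler. This is my own sketch, not from the post; the spin-1/2 observable with eigenvalues ±1 is just a convenient example:

```python
import numpy as np

rng = np.random.default_rng(1)
eigenvalues = np.array([+1.0, -1.0])     # eigenvalue spectrum of the observable

def measure(state, n):
    """Sample n repeated-measurement outcomes with the Born rule |amplitude|^2."""
    probs = np.abs(state) ** 2
    return rng.choice(eigenvalues, size=n, p=probs)

plus = np.array([1.0, 1.0]) / np.sqrt(2)   # superposition: 50/50 outcomes
up = np.array([1.0, 0.0])                  # eigenstate: always the same outcome

r_super = measure(plus, 10_000)            # spread of results -> nonzero uncertainty
r_eigen = measure(up, 10_000)              # identical results -> zero uncertainty
print(round(r_super.std(), 2), r_eigen.std())   # spread ~1.0 vs exactly 0.0
```

Repeating the "experiment" on the superposition state yields both eigenvalues with the predicted frequencies, while the eigenstate reproduces the classical situation: every run gives the same value and the computed uncertainty is zero.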

@Flashprogram: I told you: spanking... get ready for a quantized spanking! I ran into this same misconception and it was rapidly squeezed out of me. It will come down to: "Show me one instance where a single particle is detected in two places at once, they are always dots on a screen, which form a pattern based on statistics and a number of hits."

Trust me on this... save yourself the pain and just do the research into this. You'll be happier without the continued protestations, and frankly, you'll have a new appreciation for the current experimental limitations on exploring the quantum world.

> We need to discuss quantum uncertainty. SpectraCat has it right! The accuracy of a measurement is determined by the skill of the experimenter and the limitations of his apparatus. We can imagine an ideal experiment that yields exact values for the observable being measured without being in conflict with quantum theory.
>
> Classically, if we repeat the experiment we always get the same experimental result; classical physics is deterministic and the result is certain.
>
> Not so in a quantum experiment! When we repeat the quantum experiment, in general, we get a different result, even if everything is done perfectly. If we repeat the experiment many times then we get the entire eigenvalue spectrum of the observable being measured. The probability distribution of those results can be used to calculate the uncertainty in those results. Quantum mechanics is indeterminate and there is an uncertainty in the measured observable.
>
> Of course, there is the special case (where the particle is in an eigenstate) where the experiment is set up so we always get the same result, as in the classical case. In this special quantum case there is no uncertainty.
I believe counterfactual definiteness (CFD) refers to the idea that values exist without being measured; that is, there is a definite value for every property that would have been obtained had it been measured. The Bell inequalities say that you must discard either CFD or locality: hidden-variable theories keep CFD and discard locality, while mainstream quantum physics discards CFD and keeps locality, to better agree with relativity.

I think that is one of the reasons why even failure to interact with some detectors is said to constitute a measurement and to disturb a particle. If you could gain information about the state of a particle through a series of non-interactions, then you could probably obtain all the properties without disturbing the particle; but if even a lack of interaction can disturb the particle and be considered a measurement, then you cannot do this. That would at least protect the idea that there are no definite values for all the properties when not measured.
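The CFD-versus-locality trade-off is usually quantified with the CHSH form of Bell's inequality. As a quick arithmetic illustration (my sketch; E(a, b) = -cos(a - b) is the standard quantum prediction for a spin singlet), any theory with both CFD and locality obeys |S| ≤ 2, while quantum mechanics reaches 2√2:

```python
import numpy as np

def E(a, b):
    """Singlet correlation between measurements at analyzer angles a and b."""
    return -np.cos(a - b)

# Standard CHSH angle choices (in radians)
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(round(abs(S), 3))   # 2.828 = 2*sqrt(2) > 2, violating the local-CFD bound
```

Since the measured value exceeds 2, at least one of the two assumptions has to go, which is exactly the choice described above.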

DrChinese
Gold Member
> I believe counterfactual definiteness (CFD) refers to the idea that values exist without being measured; that is, there is a definite value for every property that would have been obtained had it been measured. The Bell inequalities say that you must discard either CFD or locality: hidden-variable theories keep CFD and discard locality, while mainstream quantum physics discards CFD and keeps locality, to better agree with relativity.
>
> I think that is one of the reasons why even failure to interact with some detectors is said to constitute a measurement and to disturb a particle. If you could gain information about the state of a particle through a series of non-interactions, then you could probably obtain all the properties without disturbing the particle; but if even a lack of interaction can disturb the particle and be considered a measurement, then you cannot do this. That would at least protect the idea that there are no definite values for all the properties when not measured.
Consider a photon emitted from a free excited atom in outer space. That photon may only be detected at a single spot (say on the Earth), yet its wave packet expands to much of the universe... long after it was detected. But because of CFD, you would be hard pressed to prove it. Yet with entangled particle pairs, you can do "tricks" that are not possible with single particles. And those tricks show that probability amplitudes (wave packets) are very "real".

My point being that you need to consider the entire context, as you say. That context can be very big. Or it can be small and controlled.

I am going to start another thread (since I don't want to meander off-topic in this one) in the next couple of days to discuss an interesting way to consider the "reality" of a probability amplitude. Assuming, of course, that anyone wonders about this...

> Consider a photon emitted from a free excited atom in outer space. That photon may only be detected at a single spot (say on the Earth), yet its wave packet expands to much of the universe... long after it was detected. But because of CFD, you would be hard pressed to prove it. Yet with entangled particle pairs, you can do "tricks" that are not possible with single particles. And those tricks show that probability amplitudes (wave packets) are very "real".
>
> My point being that you need to consider the entire context, as you say. That context can be very big. Or it can be small and controlled.
>
> I am going to start another thread (since I don't want to meander off-topic in this one) in the next couple of days to discuss an interesting way to consider the "reality" of a probability amplitude. Assuming, of course, that anyone wonders about this...
Considering how often this very question seems to arise, I'd say starting a thread might even warrant a sticky if it goes well.

Things that behave as waves or particles depending on whether they are being observed. Things that seem to exist in some ethereal manner, popping in and out of our existence, for which we cannot identify both speed and position. Transfer of information seemingly faster than light, over any distance. "Nothing funny going on" has to be the greatest understatement of all time.
But true indeed nevertheless.

My suggestion is to reintegrate finite difference ideas.

Hello, no one is giving me answers. Seriously, can anyone help me get a picture of wave-particle duality, and perhaps explain in detail the uncertainty principle and the wave function, as has been stated in some replies?
rued 7n, check into cellular automata: finite, yet with instantaneous speed across the whole system and the possibility of simultaneous action across an infinity of squares.

I personally read A New Kind of Science by Wolfram, and Tolkien's books to fill in the details later.

For more information, talk to the guy behind Minecraft and ask him how you scale it up with infinite computational resources within finite time on a Turing machine. I suggest using the cloud for computational resources at a planetary scale; with perfectly encrypted maximal compression, transmission costs are negligible. After all, Zeno proved it, and later Newton and some other guy re-proved it: motion appears impossible, yet is possible.

Wave-particle duality: possession by physical entities (such as light and electrons) of both wave-like and particle-like characteristics. On the basis of experimental evidence, the German physicist Albert Einstein first showed (1905) that light, which had been considered a form of electromagnetic waves, must also be thought of as particle-like, or localized in packets of discrete energy. The French physicist Louis de Broglie proposed (1924) that electrons and other discrete bits of matter, which until then had been conceived only as material particles, also have wave properties such as wavelength and frequency. Later (1927) the wave nature of electrons was experimentally established. An understanding of the complementary relation between the wave aspects and the particle aspects of the same phenomenon was announced in 1928 (see the complementarity principle).