What does the probabilistic interpretation of QM claim?

In summary, the conversation discusses the probabilistic interpretation of quantum mechanics and how it relates to observable operators and measurements. The probabilistic interpretation states that the probability of observing a particular eigenvalue equals the squared magnitude of the state function's component along the corresponding eigenstate. However, this interpretation does not determine which operators can be measured; that is a matter of theoretical and experimental development. The conversation also delves into the issue of measuring position and momentum, and how quantum uncertainty rules out a continuously defined position. The double-slit experiment is used to illustrate the wave-mechanics interpretation of position and momentum, and the discussion ends with a mention of experiments done in strong magnetic fields and how they measure position and momentum along particle tracks.
  • #36
Physics Monkey said:
I'm sorry, but this statement is simply false. I just gave an example where particle position is relevant. One can mention quantum fields all one wants, but that doesn't change the fact that as a practical matter particle positions can be meaningful and useful approximations. Even in quantum field theory.

I have nothing against particle positions as meaningful and useful semiclassical _approximations_, as is appropriate for particles assumed to have collapsed already, and hence described by an effective particle picture along a track. This is a change of the representation, simplifying the picture and the analysis in cases where the physics allows this.

Nevertheless, even in a track, one only has a measurement of the projection of the position on the plane transversal to the momentum.

But before the detector is reached, there is just a radially expanding quantum field for each particle kind involved in the decay (before and after), and Mott's analysis applies. The secondary bubble traces start at random positions along the track, for the same reason that the primary trace starts at a random position anywhere on the surface of the detector where the field density is large enough and continues inside the detector.
 
  • #37
meopemuk said:
the most spectacular failure is related to electrons registered by a photographic plate. If you describe the incident electron by a plane wave or other continuous charge density field, you will have a hard time explaining how this distributed charge density condenses to a single location of one emulsion grain.

Mott even explains how a complete particle track appears in a bubble chamber - caused by a classical external electromagnetic field reaching the detector from a particular direction.
 
  • #38
meopemuk said:
I agree with you that parameter x in quantum field [tex] \psi(x,t) [/tex] has absolutely no relationship to physically measurable position.
We completely disagree. There is indeed no relation to an alleged particle position.

But the parameters x and t in a quantum field have the definite meaning of position and time - not of a particle, but of the point where the field strength is measured. The rate of response of the detector at position x and time t is, for a photon, proportional to the intensity <|E(x,t)|^2>, where E(x,t) is the complex analytic signal of the electric field operator, and, for an electron, proportional to the intensity <|Psi(x,t)|^2>, where Psi(x,t) is the Dirac field operator.
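In formula form (just restating the sentence above, with w denoting the detector response rate and angle brackets the expectation in the given state):

[tex] w_{\rm photon}(x,t) \;\propto\; \langle |E(x,t)|^2 \rangle , \qquad w_{\rm electron}(x,t) \;\propto\; \langle \Psi^\dagger(x,t)\,\Psi(x,t) \rangle . [/tex]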
meopemuk said:
You were correct to point out that in the case of indistinguishable particles this does not allow one to form a Hermitian "particle position" operator. But the above construction of n-particle localized states is sufficient to describe position measurements in the Fock space.
The radial wave produced by a double slit is not a localized state.
meopemuk said:
Another point is that refusing the measurability of positions you are not saving yourself from the "weird" quantum collapse. You've mentioned elsewhere that the momentum-space wavefunction [tex] \psi(p) [/tex] does have a measurable probabilistic interpretation. So, it does require a collapse. This time in the momentum space.
There is only an apparent collapse due to changing the description before and after reaching the detector. Discontinuities caused by changes in the description level are ubiquitous in physics - whether classical or quantum.
meopemuk said:
Our difference is that I believe that the blackening of silver atoms or the formation of bubbles are direct local effects of incident particles. So, by measuring positions of exposed grains of photoemulsion or bubbles we measure (albeit indirectly) positions of particles, which created these effects.
I know. Whereas I interpret it in terms of quantum fields, which have a much more benign intuitive interpretation, and also apply to electromagnetic radiation, where your interpretation breaks down.
meopemuk said:
If I understand correctly, your position is that the blackened grain of photoemulsion or the formed bubble is not a proof that the particle really hit that spot.
Instead, it is proof that there is an incident quantum field.
meopemuk said:
creation of the local photographic image or a small bubble is "explained" by a sequence of non-trivial condensation events happening in the bulk of the detector. These events require migration of charge over macroscopic distances
Of charge density. But charge density migrates over macroscopic distances also during the flight from the source to the detector - there is nothing strange about it.
meopemuk said:
If I understand correctly, your motivation for applying these non-trivial models of particle detection is to avoid using the quantum-mechanical wave function collapse.
No. My motivation is to have a consistent intuitive view of quantum field theory, which has for over half a century been regarded as the correct description of microscopic physics, with ''particles are just bundles of energy and momentum of the fields'' (Weinberg).

That one doesn't need the collapse is just a welcome byproduct of this view.
 
  • #39
A. Neumaier said:
I know. Whereas I interpret it in terms of quantum fields, which have a much more benign intuitive interpretation, and also apply to electromagnetic radiation, where your interpretation breaks down.

Is there a single example where the corpuscular interpretation "breaks down", as you say?

A. Neumaier said:
Of charge density. But charge density migrates over macroscopic distances also during the flight from the source to the detector - there is nothing strange about it.

I can understand a charge wave that propagates and spreads out. However, I have difficulty imagining a wave that collapses to a point spontaneously. Which physical mechanism can be responsible for such a collapse?


A. Neumaier said:
That one doesn't need the collapse is just a welcome byproduct of this view.

I think that the discovery of the quantum nature of things, sometimes dubbed "collapse", was the single most important discovery in 20th century physics. I know that we disagree about that.

Eugene.
 
  • #40
meopemuk said:
Is there a single example where the corpuscular interpretation "breaks down", as you say?
Photons have no position; they disappear upon the attempt to measure any of their properties. It is only continuous brainwashing that calls such ghost-like objects particles.
meopemuk said:
I can understand a charge wave that propagates and spreads out. However, I have difficulty imagining a wave that collapses to a point spontaneously. Which physical mechanism can be responsible for such a collapse?
In my understanding there is no collapse and there need not be one. The collapse is an artifact of the point particle interpretation of quantum mechanics.
meopemuk said:
I think that the discovery of the quantum nature of things, sometimes dubbed "collapse", was the single most important discovery in 20th century physics. I know that we disagree about that.
Collapse is not the quantum nature of things, but among the least understood aspects of quantum mechanics. QM is an extremely successful description of Nature whether or not one believes in collapse.
 
Last edited:
  • #41
A. Neumaier said:
Photons have no position; they disappear upon the attempt to measure any of their properties. It is only continuous brainwashing that calls such ghost-like objects particles.

There is nothing strange in the fact that photons can be created and absorbed easily. This is described naturally in QFT, which is a theory designed to work with systems where the number of particles can change.

A. Neumaier said:
In my understanding there is no collapse and there need not be one. The collapse is an artifact of the point particle interpretation of quantum mechanics.

Collapse is not the quantum nature of things, but among the least understood aspects of quantum mechanics. QM is an extremely successful description of Nature whether or not one believes in collapse.

I think we agree that in the double-slit setup the locations of marks on the photographic plate are random. I hope we also agree that the clicks produced by a Geiger counter attached to a piece of radioactive material occur at random times. At least, it is fair to say that nobody was able to predict locations of individual marks or timings of individual clicks.

In my understanding, quantum mechanics says that these kinds of events are not predictable as a matter of principle. Nature has an inherently random component, which cannot be explained. The best we can do is to calculate the probabilities of these random events. That's what quantum mechanics is doing, and it is doing it brilliantly. Once we agree on the fundamental randomness of quantum events, there is no other way but to accept the idea of collapse: the outcomes are not known to us before observations; they are described only as probability distributions. After the observation is made, a single outcome emerges, so the probability distribution collapses.
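As a toy numerical sketch of the "calculate the probabilities of these random events" statement (the slit geometry and wavelength below are invented purely for illustration, not taken from any experiment discussed here): detection positions are drawn at random from a two-slit intensity pattern, so each run gives one unpredictable mark, while the histogram of many runs reproduces the fringes.

[code]
# Toy Monte Carlo for the probabilistic interpretation: each detection position is
# one random draw from |psi(x)|^2. All numbers below are made up for illustration.
import numpy as np

rng = np.random.default_rng(0)

x = np.linspace(-5e-3, 5e-3, 2001)      # screen coordinate in metres (hypothetical)
wavelength = 50e-12                     # de Broglie wavelength (made-up value)
d, L = 1e-6, 1.0                        # slit separation and slit-to-screen distance (made up)

# Far-field two-slit intensity (no single-slit envelope): I(x) ~ cos^2(pi d x / (lambda L))
intensity = np.cos(np.pi * d * x / (wavelength * L)) ** 2
p = intensity / intensity.sum()         # discretized probability distribution over screen bins

hits = rng.choice(x, size=10, p=p)      # ten independent single-detection outcomes
print(np.sort(hits))                    # individual marks are random; many runs show the fringes
[/code]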

There is nothing there to understand about the collapse. Things that are fundamentally random cannot be explained or understood any better than simply saying that they are random.

From my discussions with you I've understood that you have a different view on the origin of randomness. You basically believe that nature obeys deterministic field-like equations. The appearance of a mark on the photographic plate has a mechanistic explanation in which the impacting electron field interacts with the fields of atoms in the plate. This interaction leads to some physical migration of the field energy and charge density to one specific point, which appears to us as a blackened AgBr microcrystal. These migration processes involve a huge number of atoms, so they are "stochastic" or "chaotic", and their outcomes cannot be predicted at our current level of knowledge. Nevertheless, you maintain that at the fundamental level there are knowable field equations as opposed to pure chance.

These are two different philosophies, two different world views, which could be completely equivalent as far as specific experimental observations are concerned. In general, I find it not fruitful to argue about one's philosophy, religion or political preferences. These kinds of convictions cannot be changed by logical arguments. So, perhaps we should agree to disagree.

Eugene.
 
  • #42
meopemuk said:
There is nothing strange in the fact that photons can be created and absorbed easily. This is described naturally in QFT, which is a theory designed to work with systems where the number of particles can change.
An entity about which we can say nothing at all during its flight from the source to the detector, which never has a position, produces a spot on a plate and at this moment disappears forever. This is a perfect description of a ghost, whereas calling it a particle is an unfortunate historical accident. Everyone beginning to study quantum mechanics finds this extremely strange and un-particle-like. Not to find that strange is the result of years of indoctrination by famous and less famous kindergarten storytellers. That the most famous of them had won a Nobel prize helped make the brainwashing more efficient.
meopemuk said:
Nature has an inherently random component, which cannot be explained.
I explain it as microscopic chaos in the detector.
meopemuk said:
The best we can do is to calculate probabilities of these random events. That's what quantum mechanics is doing and it is doing it brilliantly. Once we agreed on the fundamental randomness of quantum events, there is no other way, but to accept the idea of collapse: The outcomes are not known to us before observations, they are described only as probability distributions. After the observation is made a single outcome emerges, so the probability distribution collapses.
Nobody but you calls the change of prior probabilities into posterior certainties a collapse.

Collapse _always_ refers to the collapse of the state - that after the measurement, the state of the measured system is in an eigenstate of the measured observable!
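For reference, the textbook projection postulate that this standard usage refers to: if the observable A is measured and the eigenvalue a is obtained, then

[tex] |\psi\rangle \;\longrightarrow\; \frac{P_a |\psi\rangle}{\lVert P_a |\psi\rangle \rVert} , \qquad \mathrm{Prob}(a) = \langle \psi | P_a | \psi \rangle , [/tex]

where [tex]P_a[/tex] is the projector onto the eigenspace of A with eigenvalue a.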
meopemuk said:
You basically believe that nature obeys deterministic field-like equations.
No. Nature obeys the rules of QFT, and all macroscopic information arrives in the form of expectation values of appropriate fields, as given by statistical thermodynamics, the quantum theory of macroscopic matter. This is enough to explain everything without assuming a collapse of the state. (What you call collapse, and what others would describe as a change of probabilities into certainties, is fully explained by our subjective inability to predict a chaotic system with zillions of degrees of freedom.)
meopemuk said:
Nevertheless, you maintain that at the fundamental level there are knowable field equations as opposed to pure chance.
Field equations are operator equations. What is knowable are the field expectations at macroscopic resolutions. Engineers measure them routinely.
meopemuk said:
These are two different philosophies, two different world views, which could be completely equivalent as far as specific experimental observations are concerned. In general, I find it not fruitful to argue about one's philosophy, religion or political preferences.
I find it _very_ fruitful to argue about one's philosophy, religion or political preferences.
This is the only way to influence people's convictions.
meopemuk said:
These kinds of convictions cannot be changed by logical arguments. So, perhaps we should agree to disagree.
We always agreed that we disagree, from the start of this thread. But we draw different consequences from this fact.
 
  • #43
A. Neumaier said:
Nobody but you calls the change of prior probabilities into posterior certainties a collapse.

Collapse _always_ refers to the collapse of the state - that after the measurement, the state of the measured system is in an eigenstate of the measured observable!

I've forgotten to mention that I am not interested in the state of the quantum system after it has "interacted" with the measuring device and produced the measurement outcome. So, I am agnostic about the state after the measurement. Yes, I understand that there are situations in which one can repeatedly measure different things on the same copy of the system. However, I would like to avoid discussions of such situations. So, I would prefer to think that after the measurement is done and its result is recorded, the system is discarded. Dealing only with such one-time measurements makes my life a bit easier.

So, I agree that collapse = "the change of prior probabilities into posterior certainties". However, I disagree that the collapse ever happens in classical physics, because in classical physics everything is determined and predictable. If somebody has encountered a "probability" in classical physics, that's only because this somebody was too lazy or ignorant to specify exactly all the necessary initial conditions. Somebody's ignorance and laziness cannot be accounted for in a rigorous theory. "Zillions of degrees of freedom" is also not a good excuse to introduce probabilities, because we are talking about principles here, not about practical realizations.

Eugene.
 
Last edited:
  • #44
It should be pointed out that A. Neumaier's suggestion that a deterministic chaotic dynamics may underlie quantum randomness is not the standard view, and even to be consistent with modern experimental results it requires some additional weird assumptions such as explicit non-locality (Bohm) or information loss behind event horizons ('t Hooft).

http://www.nature.com/news/2007/070416/full/news070416-9.html
 
  • #45
meopemuk said:
I've forgotten to mention that I am not interested in the state of the quantum system after it has "interacted" with the measuring device and produced the measurement outcome. So, I am agnostic about the state after the measurement.
But this means that you are agnostic about collapse, as the term is traditionally understood: ''In quantum mechanics, wave function collapse (also called collapse of the state vector or reduction of the wave packet) is the phenomenon in which a wave function—initially in a superposition of several different possible eigenstates—appears to reduce to a single one of those states after interaction with an observer.'' http://en.wikipedia.org/wiki/Wavefunction_collapse
meopemuk said:
So, I agree that collapse = "the change of prior probabilities into posterior certainties".
You only agree to your own nonstandard interpretation of the word ''collapse''. I don't agree at all with this usage.
meopemuk said:
"Zillions of degrees of freedom" is also not a good excuse to introduce probabilities, because we are talking about principles here, not about practical realizations.
''Unperformed experiments have no results'' (A. Peres, Amer. J. Phys. 46 (1978), 745).
This holds even more for unperformable measurements or preparations.
 
  • #46
unusualname said:
It should be pointed out that A. Neumaier's suggestion that a deterministic chaotic dynamics may underlie quantum randomness is not the standard view, and even to be consistent with modern experimental results it requires some additional weird assumptions such as explicit non-locality (Bohm) or information loss behind event horizons ('t Hooft).
It is enough to assume information loss to the part of the universe not visible from our planetary system (where all our experiments are done). Radiation goes there all the time; so this assumption is satisfied.
unusualname said:
http://www.nature.com/news/2007/070416/full/news070416-9.html
The paper says nothing about requiring weird assumptions. But the author states: ''But for objects governed by the laws of quantum mechanics, like photons and electrons, it may make no sense to think of them as having well defined characteristics. Instead, what we see may depend on how we look.''

depend on how we look = depend on the measurement apparatus (here our eye).

Thus his statement confirms my hypothesis.
 
  • #47
A. Neumaier said:
It is enough to assume information loss to the part of the universe not visible from our planetary system (where all our experiments are done). Radiation goes there all the time; so this assumption is satisfied.

Radiation travels via a local mechanism; are you saying your deterministic model is local and real?

A. Neumaier said:
The paper says nothing about requiring weird assumptions. But the author states: ''But for objects governed by the laws of quantum mechanics, like photons and electrons, it may make no sense to think of them as having well defined characteristics. Instead, what we see may depend on how we look.''

depend on how we look = depend on the measurement apparatus (here our eye).

Thus his statement confirms my hypothesis.

Shouldn't you say that statement doesn't contradict your model, rather than asserting it confirms it?

I added that link to a mainstream science article to point out the mainstream view on quantum interpretation, just in case people think your "science advisor" tag adds credibility to your nonstandard view.

But I'm not saying you're wrong, just that it's an unusual model to be promoting.
 
  • #48
A. Neumaier said:
But this means that you are agnostic about collapse, as the term is traditionally understood: ''In quantum mechanics, wave function collapse (also called collapse of the state vector or reduction of the wave packet) is the phenomenon in which a wave function—initially in a superposition of several different possible eigenstates—appears to reduce to a single one of those states after interaction with an observer.'' http://en.wikipedia.org/wiki/Wavefunction_collapse

I've possibly created a confusion by using my own definition of collapse, which is different from Wikipedia's. To clarify, I would like to mention that I am interested only in single measurements of observables. I am not interested in what the state of the system is after the measurement is completed. I am not sure if a wave function is a good description of such states.

Eugene.
 
  • #49
meopemuk said:
I agree that some aspects of particle detection can be explained by Mandel & Wolf type arguments. However, there are situations where these arguments fail completely. I think the most spectacular failure is related to electrons registered by a photographic plate. If you describe the incident electron by a plane wave or other continuous charge density field, you will have a hard time explaining how this distributed charge density condenses to a single location of one emulsion grain. I think it is well established that after "observation" the entire electron charge is located in the neighborhood of the blackened emulsion grain. Apparently, there should be a mechanism by which the distributed charge density condenses to a point and overcomes a strong Coulomb repulsion in the process. This doesn't look plausible even from the point of view of energy conservation.

In that case, what is wrong with Mott's or Schiff's analyses (which apply to an incident field carrying charge)? To me these seem adequate to account for the experimental observations.
 
  • #50
unusualname said:
Radiation travels via a local mechanism; are you saying your deterministic model is local and real?
My interpretation is not deterministic, since it is based on standard QFT. But like the latter it is local.

By the way, there are no no-go theorems against deterministic field theories underlying quantum mechanics. Indeed, local field theories have no difficulties violating Bell-type inequalities. See http://arnold-neumaier.at/ms/lightslides.pdf , starting with slide 46.
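For readers who want the terminology spelled out: the CHSH form of a Bell-type inequality bounds the correlations obtainable under the traditional hidden variable assumptions by

[tex] |S| = |E(a,b) - E(a,b') + E(a',b) + E(a',b')| \;\le\; 2 , [/tex]

whereas standard quantum mechanics predicts values up to [tex]2\sqrt{2}[/tex] for suitably chosen settings. The slides argue that such violations only rule out a particular (point-particle-like) class of classical models.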

unusualname said:
Shouldn't you say that statement doesn't contradict your model, rather than asserting it confirms it.
If a key statement that wasn't known to the proposer of some model doesn't contradict this model, it is usually considered a confirmation of the model. In the present case, since you brought the paper as an argument to caution readers against my views, and my main assumption was that the results of measurements depend on the detector, and the author of the paper made precisely this point (for the special detector called us - or our eyes), it is a significant confirmation.
unusualname said:
I added that link to a mainstream science article to point out the mainstream view on quantum interpretation, just in case people think your "science advisor" tag adds credibility to your nonstandard view.
I am not responsible for having this tag.
unusualname said:
But I'm not saying you're wrong, just that it's an an unusual model to be promoting.
I am only taking quantum field theory seriously. It is not that unusual: People working on dynamic reduction models have a very similar view:

G. Ghirardi,
Quantum dynamical reduction and reality:
Replacing probability densities with densities in real space,
Erkenntnis 45 (1996), 349-365.
http://www.jstor.org/stable/20012735

My only new point compared to them is that one doesn't need the dynamic reduction once one has the field density ontology.
 
Last edited by a moderator:
  • #51
@A. Neumaier, I don't understand you. Make it simple for me: is deterministic chaotic dynamics the fundamental mathematical description of reality in your model?
 
  • #52
unusualname said:
is deterministic chaotic dynamics the fundamental mathematical description of reality in your model?
The fundamental mathematical description of reality is standard quantum field theory, _not_ deterministic chaos. The latter is an emergent feature.

In my thermal interpretation of quantum physics, the directly observable (and hence obviously ''real'') features of a macroscopic system are the expectation values of the most important fields Phi(x,t) at position x and time t, as they are described by statistical thermodynamics. If it were not so, thermodynamics would not provide the good macroscopic description it does.

However, the expectation values have only a limited accuracy; as discovered by Heisenberg, quantum mechanics predicts its own uncertainty. This means that <Phi(x)> is objectively real only to an accuracy of order 1/sqrt(V) where V is the volume occupied by the mesoscopic cell containing x, assumed to be homogeneous and in local equilibrium. This is the standard assumption for deriving from first principles hydrodynamical equations and the like. It means that the interpretation of a field gets more fuzzy as one decreases the size of the coarse graining - until at some point the local equilibrium hypothesis is no longer valid.
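A rough way to read the 1/sqrt(V) scaling (standard statistical reasoning, assuming the cell contains roughly N ~ nV weakly correlated microscopic degrees of freedom at number density n):

[tex] \frac{\Delta\langle\Phi\rangle}{|\langle\Phi\rangle|} \;\sim\; \frac{1}{\sqrt{N}} \;\sim\; \frac{1}{\sqrt{nV}} , [/tex]

so the relative uncertainty of a field expectation grows as the coarse-graining cell is made smaller.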

This defines the surface ontology of the thermal interpretation. There is also a deeper ontology concerning the reality of inferred entities - the thermal interpretation declares as real but not directly observable any expectation <A(x,t)> of operators with a space-time dependence that satisfy Poincare invariance and causal commutation relations.
These are distributions that produce exact numbers when integrated over sufficiently smooth localized test functions.

Approximating a multiparticle system in a semiclassical way (mean field theory or a little beyond) gives an approximate deterministic system governing the dynamics of these expectations. This system is highly chaotic at high resolution. This chaoticity seems enough to enforce the probabilistic nature of the measurement apparatus. Neither an underlying exact deterministic dynamics nor an explicit dynamical collapse needs to be postulated.
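A purely generic toy illustration of this chaoticity argument (a textbook logistic map, not the mean-field system described above; the numbers are arbitrary): two initial conditions differing far below any realistic resolution diverge to completely different outcomes within a few dozen steps, so individual outcomes are practically unpredictable even though the map itself is deterministic.

[code]
# Generic illustration of sensitive dependence on initial conditions,
# not A. Neumaier's actual model.
import numpy as np

def logistic(x, r=4.0, steps=60):
    """Iterate the fully chaotic logistic map x -> r*x*(1-x)."""
    out = [x]
    for _ in range(steps):
        x = r * x * (1.0 - x)
        out.append(x)
    return np.array(out)

a = logistic(0.300000000)
b = logistic(0.300000001)    # initial difference of 1e-9, far below any realistic resolution

print(np.abs(a - b)[::10])   # separation grows from ~1e-9 to order 1 within a few dozen steps
[/code]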
 
  • #53
Sorry, but chaotic dynamics is an exact mathematical model - that's the whole point of it - so you can't say it's "emergent". Sensitive dependence on infinitesimally small changes in the dynamical parameters is part of the definition of chaotic dynamics. If you have stochastic dynamics then you have stochastic dynamics, and if you have deterministic dynamics then you have deterministic dynamics; there's no in-between "emergent" type of system.
 
  • #54
unusualname said:
Sorry, but chaotic dynamics is an exact mathematical model - that's the whole point of it - so you can't say it's "emergent". Sensitive dependence on infinitesimally small changes in the dynamical parameters is part of the definition of chaotic dynamics. If you have stochastic dynamics then you have stochastic dynamics, and if you have deterministic dynamics then you have deterministic dynamics; there's no in-between "emergent" type of system.
The world is not as black and white as you paint it!

The same system can be studied at different levels of resolution. When we model a dynamical system classically at high enough resolution, it must be modeled stochastically since the quantum uncertainties must be taken into account. But at a lower resolution, one can often neglect the stochastic part and the system becomes deterministic. If it were not so, we could not use any deterministic model at all in physics but we often do, with excellent success.

This also holds when the resulting deterministic system is chaotic. Indeed, all deterministic chaotic systems studied in practice are approximate only, because of quantum mechanics. If it were not so, we could not use any chaotic model at all in physics but we often do, with excellent success.
 
  • #55
A. Neumaier said:
The world is not as black and white as you paint it!

The same system can be studied at different levels of resolution. When we model a dynamical system classically at high enough resolution, it must be modeled stochastically since the quantum uncertainties must be taken into account. But at a lower resolution, one can often neglect the stochastic part and the system becomes deterministic. If it were not so, we could not use any deterministic model at all in physics but we often do, with excellent success.

This also holds when the resulting deterministic system is chaotic. Indeed, all deterministic chaotic systems studied in practice are approximate only, because of quantum mechanics. If it were not so, we could not use any chaotic model at all in physics but we often do, with excellent success.

You either have deterministic laws at the fundamental level or you don't. Why don't you just say you believe the universe is deterministic at the fundamental level? Then I would understand you.
 
  • #56
unusualname said:
You either have deterministic laws at the fundamental level or you don't. Why don't you just say you believe the universe is deterministic at the fundamental level? Then I would understand you.
On the fundamental level, we have textbook quantum field theory. It doesn't matter for my interpretation whether or not there is an even deeper underlying deterministic level. So there is no need to commit myself.
 
  • #57
A. Neumaier said:
On the fundamental level, we have textbook quantum field theory. It doesn't matter for my interpretation whether or not there is an even deeper underlying deterministic level. So there is no need to commit myself.

Ok, then if you don't mind I'll answer the thread title: the probabilistic interpretation of QM claims nature is fundamentally probabilistic, and this claim has stood the test of time since the late 1920s, ok? :smile:
 
  • #58
strangerep said:
In that case, what is wrong with Mott's or Schiff's analyses (which apply for incident
field carrying charge)? To me these seem adequate to account for the experimental
observations.

I don't have access to Mott's and Schiff's writings. My only point was that it is unreasonable to represent 1 (one) electron by a continuous charge density wave. When we look at the electron experimentally, we often find it well-localized, i.e., within the space of one atom. And I find it rather difficult to imagine how a spread-out charge wave can condense to the atomic-size volume all by itself.

Eugene.
 
  • #59
unusualname said:
Ok, then if you don't mind I'll answer the thread title: the probabilistic interpretation of QM claims nature is fundamentally probabilistic, and this claim has stood the test of time since the late 1920s, ok? :smile:

If this were the only thing the probabilistic interpretation of QM claimed, there would be no point in doing QM, and there would be no point for this thread.

By the way, the url in your profile is spelled incorrectly.
 
  • #60
meopemuk said:
I don't have access to Mott's and Schiff's writings.
http://books.google.com/books?hl=en...=vuOXpTH8m8gxB4s-WO8q8--oCSQ#v=onepage&q=Mott The wave mechanics tracks&f=false
meopemuk said:
My only point was that it is unreasonable to represent 1 (one) electron by a continuous charge density wave. When we look at the electron experimentally, we often find it well-localized, i.e., within the space of one atom. And I find it rather difficult to imagine how a spread-out charge wave can condense to the atomic-size volume all by itself.
You are confusing assumptions and knowledge.

We never ''look at an electron experimentally'' - we only infer its presence from a measured current or ionization track. Mott shows that this track is produced by a classical spherical wave impinging on the cloud chamber from a certain direction, which will determine the direction of the track produced at the atom that happens to fire. There is nothing counterintuitive about that. The uncertainty in the charge density inside the detector is much larger than the charge of one electron.

You _assume_ instead that this is caused by a single electron. And then you say that you find it because of the track. This is a simple instance of a self-fulfilling prophecy. http://en.wikipedia.org/wiki/Self-fulfilling_prophecy
 
  • #61
A. Neumaier said:
By the way, the url in your profile is spelled incorrectly.

thanks! probably just as well since I haven't constructed M yet (need linear groups and propagation of states throughout the universe ;) )
 
  • #62
A. Neumaier said:
You are confusing assumptions and knowledge.

We never ''look at an electron experimentally'' - we only infer its presence from a measured current or ionization track. Mott shows that this track is produced by a classical spherical wave impinging on the cloud chamber from a certain direction, which will determine the direction of the track produced at the atom that happens to fire. There is nothing counterintuitive about that. The uncertainty in the charge density inside the detector is much larger than the charge of one electron.

You _assume_ instead that this is caused by a single electron. And then you say that you find it because of the track. This is a simple instance of a self-fulfilling prophecy. http://en.wikipedia.org/wiki/Self-fulfilling_prophecy

Instead of a cloud chamber bombarded by a dense electron flow I would like to think about a cleaner setup in which definitely one and only one electron was emitted and then captured by an array of tiny detectors, such as a CCD device. After the measurement we can be sure that only one detector in the array has "clicked" and the entire electron charge has been deposited inside this detector. If I apply your "charge density field" theory to this situation, I'll get the surprising conclusion that the extended charge distribution has collapsed to the volume of one micrometer-size detector all by itself and against the resistance of the Coulomb repulsion. I find this rather amazing.

Eugene.
 
  • #63
meopemuk said:
Instead of a cloud chamber bombarded by a dense electron flow I would like to think about a cleaner setup in which definitely one and only one electron was emitted and then captured by an array of tiny detectors, such as a CCD device. [...]

You still haven't given a reference to an actual experimental setup that does this.
I.e., emits definitely one and only one electron, presumably with a momentum uncertainty
corresponding to a small solid angle that exactly encompasses the CCD device.
 
  • #64
A. Neumaier said:
[...] there are no no-go theorems against deterministic field theories
underlying quantum mechanics. Indeed, local field theories have no
difficulties violating Bell-type inequalities. See
http://arnold-neumaier.at/ms/lightslides.pdf ,
starting with slide 46.

On slide 49, you begin a hidden-variable analysis of a particular
experiment with the following assumptions:

(i) The source of beam 1 produces an ensemble of photons which is in
the classical (but submicroscopic) state [tex]\lambda[/tex] with
probability density [tex]p(\lambda)[/tex].

(ii) Whether a photon created at the source in state [tex]\lambda[/tex] reaches
the detector after passing the kth filter depends only on [tex]B_k[/tex] and
[tex]\lambda[/tex]. (This is reasonable since one can make a beam
completely dark, in which case it carries no photons.)

(iii) The conditional probability of detecting a photon which is in state [tex]\lambda[/tex]
and passes through filter k when [tex]B_k = B[/tex] and [tex]B_{3 − k} = 0[/tex] is given
by a functional expression [tex]p_k(B,\lambda)[/tex].

Later (after a QM analysis, etc), you say on slide 57:

The experiment can be explained by the classical Maxwell
equations, upon interpreting the photon number detection rate as
proportional to the beam intensity. This is a classical description,
not by classical particles (photons) but by classical waves.

I didn't see where you "explained the experiment by the classical
Maxwell equations" in these slides. (Or are you implicitly referring
to the arguments given in Mandel & Wolf?)

You go on to say:

Thus a classical wave model for quantum mechanics is not ruled out
by experiments demonstrating the violation of the traditional hidden
variable assumptions.

Therefore the traditional hidden variable assumptions only amount
to a hidden point particle assumption.

And the experiments demonstrating [Bell inequality] violation
only disprove classical models with point particle structure.

It's not clear to me where, in the hidden variable assumptions you
listed, one has assumed point particle structure.
 
Last edited:
  • #65
strangerep said:
emits definitely one and only one electron,

I am not an experimentalist, so I can be mistaken. But I think that it should be possible to arrange emission of exactly one electron in a controlled fashion. For example, one can use a single radioactive nucleus, which undergoes beta decay.

You can say that the emitted electron flies in a random direction, so, most likely, it will not be found in our measuring device. But if we are persistent and prepare another radioactive nucleus, then another one... Eventually, we will be able to catch the electron and perform the experiment.


strangerep said:
presumably with a momentum uncertainty
corresponding to a small solid angle that exactly encompasses the CCD device.


If this electron passes through a crystal, then the effect of "electron diffraction" occurs, which is basically similar to the appearance of the interference pattern in the famous double-slit experiment. There are probability peaks in certain directions of electron propagation. The preferential angles depend on (1) the initial electron momentum, (2) the type of the crystal lattice, and (3) the orientation of the crystal. So, it should not be difficult to arrange all components in such a way that the diffraction (or interference) picture covers the entire surface of the CCD device.
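The relevant relations here are the de Broglie wavelength and the Bragg condition, which together fix the diffraction angles from the electron momentum and the lattice geometry:

[tex] \lambda = \frac{h}{p} , \qquad n\lambda = 2 d \sin\theta \quad (n = 1, 2, \dots) , [/tex]

where d is the lattice-plane spacing and [tex]\theta[/tex] the angle between the beam and the lattice planes.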

Eugene.
 
  • #66
meopemuk said:
[...] I think that it should be possible to arrange emission of exactly one electron in a controlled fashion. For example, one can use a single radioactive nucleus, which undergoes beta decay.

You can say that the emitted electron flies in a random direction, so, most likely, it will not be found in our measuring device. But if we are persistent and prepare another radioactive nucleus, then another one... Eventually, we will be able to catch the electron and perform the experiment.

This is almost the "preparation" arrangement assumed in Mott's analysis (except
he uses alpha particles instead of electrons).

But you have not specified a way to observe the electron on its way from nucleus
to target (and I'm not sure what you meant by "catch the electron").

Instead, you're relying on random emission by nuclear decay. I don't see how this
is practical except by having a sample of the radioactive material with many nuclei,
and this leads to an ensemble of emitted electrons, with nonzero probability of
more than 1 electron in any given time interval.
 
  • #67
strangerep said:
But you have not specified a way to observe the electron on its way from nucleus
to target (and I'm not sure what you meant by "catch the electron").

There is no need to observe the electron on its way from nucleus to target. "Catching the electron" means registration of the hit by one detector in the CCD array.

strangerep said:
Instead, you're relying on random emission by nuclear decay. I don't see how this
is practical except by having a sample of the radioactive material with many nuclei,
and this leads to an ensemble of emitted electrons, with nonzero probability of
more than 1 electron in any given time interval.

This is not easy, but in principle possible. One can make sure that the radioactive sample contains one and only one nucleus of the desired unstable type. As an exotic possibility I can suggest using C60 buckyballs. Currently there are techniques that allow placing one foreign (e.g., radioactive) atom inside the buckyball sphere. Then, I guess, it might be possible to deposit just one such "stuffed" buckyball on the surface of the sample. This would constitute a "single electron" emitter.

Eugene.


EDIT: I've googled for "single electron source" and found a number of interesting references. So, I guess that preparation of one-electron states is a solved technical problem. See, for example,


J.-Y. Chesnel, A. Hajaji, R. O. Barrachina, and F. Frémont, Young-Type Experiment Using a Single-Electron Source and an Independent Atomic-Size Two-Center Interferometer. Phys. Rev. Lett. 98, 100403 (2007). http://prl.aps.org/abstract/PRL/v98/i10/e100403
 
Last edited by a moderator:
  • #68
meopemuk said:
I've googled for "single electron source" and found a number of interesting
references. So, I guess that preparation of one-electron states is a solved
technical problem.

Although such papers are indeed interesting, I came to the opposite conclusion about it being a "solved technical problem" in the way you seem to mean. I think such a description is an over-claim.

The various setups I saw appear quite elaborate, specific to particular
applications that don't correspond easily to what you wanted (imho).

See, for example,

J.-Y. Chesnel, A. Hajaji, R. O. Barrachina, and F. Frémont,
"Young-Type Experiment Using a Single-Electron Source and an Independent
Atomic-Size Two-Center Interferometer."
Phys. Rev. Lett. 98, 100403 (2007).
http://prl.aps.org/abstract/PRL/v98/i10/e100403

Umm,... did you actually read this paper?
As usual, the devil is in the detail...

The experiment consists of an incident beam of alpha particles,
striking a gas of [tex]H_2[/tex] molecules. There's a particle reaction
chain in which the alpha particle becomes a doubly-excited helium atom
by capturing two electrons from a hydrogen molecule. The two resultant
protons move apart a little, and the doubly-excited helium atom decays,
re-emitting the electrons. Sometimes, one of the electrons is
re-emitted back towards the 2-proton target which acts like a 2-centre
scatterer. The resultant scattering pattern of such back-emitted
electrons is recorded. The "result" of the experiment is thus a
scattering cross section.

The experiment is called "single-electron" only because the probability
is extremely low that more than one electron is scattered by a given
2-proton scatterer. I.e., it's "single-electron" within the lifetime of
the 2-proton scatterer.

On the 2nd page, the authors clarify further that:

Chesnel et al said:
Since these individual scattering processes are repeated with
similar initial conditions many times, what is actually measured
here is the ensemble probability of the diffraction of just one single
electron by one single two-center scatterer.

The results seem adequately accounted for by statistical field-theoretic analysis.
 
  • #69
strangerep said:
The experiment is called "single-electron" only because the probability
is extremely low that more than one electron is scattered by a given
2-proton scatterer. I.e., it's "single-electron" within the lifetime of
the 2-proton scatterer.

I am not sure why you insist that only one electron must be emitted in 100% of cases. What if there is some non-zero probability of 2 or 3 electrons being emitted? I see no problem with that from the point of view of the corpuscular interpretation. This is still something completely different from the continuous charge density field that you are arguing for.

Eugene.

EDIT: Well, if you don't like single electron sources, then we can return to the discussion of interference experiments with single atoms. I hope you wouldn't deny that individual atoms can be produced one-by-one and that double-slit-type experiments are possible with them?
 
Last edited:
  • #70
It claims MWI is full of crap basically. :smile:

I.e., that there is no real wave function, no hidden variables that are local, and that the evolution of the wave is therefore not deterministic.
 

1. What is the probabilistic interpretation of QM?

The probabilistic interpretation of QM is a fundamental principle of quantum mechanics that states that the behavior of particles at the quantum level cannot be predicted with certainty, but only with a certain probability. This means that the outcome of any measurement or observation of a quantum system is not determined, but rather described in terms of probabilities.

2. How does the probabilistic interpretation of QM differ from classical mechanics?

Classical mechanics, which describes the behavior of macroscopic objects, is based on deterministic principles where the future state of a system can be predicted with certainty. In contrast, the probabilistic interpretation of QM introduces an element of randomness and uncertainty at the quantum level, which is not present in classical mechanics.

3. What evidence supports the probabilistic interpretation of QM?

There is a wealth of experimental evidence that supports the probabilistic interpretation of QM. For example, the famous double-slit experiment demonstrates the wave-like behavior of particles at the quantum level, which can only be described in terms of probabilities. Additionally, various other experiments have shown that particles can exist in multiple states simultaneously, further supporting the probabilistic nature of quantum mechanics.

4. Does the probabilistic interpretation of QM apply to all physical systems?

Yes, the probabilistic interpretation of QM applies to all physical systems, regardless of their size or complexity. However, the effects of quantum mechanics are usually only noticeable at the microscopic level, and classical mechanics is still a highly accurate and useful model for describing the behavior of macroscopic objects.

5. How does the probabilistic interpretation of QM impact our understanding of reality?

The probabilistic interpretation of QM challenges our traditional understanding of reality, as it suggests that the behavior of particles is inherently uncertain and unpredictable. It also raises philosophical questions about the nature of reality and our ability to truly understand and describe it. However, despite its counterintuitive nature, the probabilistic interpretation of QM has been extensively tested and has been shown to be a highly accurate and useful model for describing the behavior of particles at the quantum level.
