How do entanglement experiments benefit from QFT (over QM)?

Summary
Entanglement experiments can benefit from Quantum Field Theory (QFT) due to its ability to incorporate relativistic effects, which are crucial when reference frames impact outcomes. While non-relativistic Quantum Mechanics (QM) suffices for many entanglement scenarios, QFT is necessary for processes involving particle creation and annihilation, particularly in high-energy contexts. Discussions highlight that QFT is often implicitly used in quantum optics, even if not explicitly referenced in entanglement experiments. The consensus is that while QFT provides a more comprehensive framework, the fundamental aspects of entanglement remain consistent across both QM and QFT. Understanding the interplay between relativity and quantum mechanics is essential for addressing questions about causality and information exchange in entangled systems.
  • #271
DarMM said:
It doesn't. In the probability theory for Bohmian Mechanics the total law holds.

How can that be if the predictions of QM and BM are the same? For example, if A and B are the outcomes of position and momentum measurements in QM, then the probabilities of the outcomes should be the same in QM and BM. What is different in the formula between QM and BM?
 
  • #272
atyy said:
How can that be if the predictions of QM and BM are the same? For example, if A and B are the outcomes of position and momentum measurements in QM, then the probabilities of the outcomes should be the same in QM and BM. What is different in the formula between QM and BM?
Bohmian Mechanics in equilibrium is equivalent to QM because of a posited strict restriction on epistemic reasoning within its probability theory. In Bohmian Mechanics in general the total law holds.

When we demand equilibrium we impose a very specific restriction on access to, and the ability to reason about, the hidden variables. Provided this epistemic block always holds, the probability theory effectively reduces to that of QM: the effective Bayesian reasoning about observations becomes a probability calculus like QM's. The resultant probability theory then breaks Kolmogorov's axioms, since breaking the total law is not consistent with classical probability theory. Blocking statistical inference in a specific way on a theory that normally obeys classical probability can thus leave an effective theory that does not.

Note that equilibrium cannot hold exactly but must be a thermalisation effect, so in essence if Bohmian Mechanics were true one should be able to see the total law restored.

There is a much broader point here that most of the interpretations of QM are not really interpretations but actually different theories. All hidden variable theories replicate QM under some kind of epistemic restriction that cannot hold in general and in some scenarios even with that restriction will have divergent predictions. Other views such as Many Worlds make conjectures about the formal structure of the theory that have yet to be verified.

The only actual interpretations proper are things like Quantum Bayesianism vs Copenhagen where really it's a purely philosophical thing, e.g. how do you view probabilities. You'll see a similar remark from Rudolf Peierls in "The Ghost in the Atom" from Cambridge University Press.

EDIT:
Note also that even in the underlying Bohmian theory preparations do not prepare ensembles of most quantities; we still have contextuality after all. Only a position ensemble is prepared.
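For concreteness, the failure of the total law can be seen in a minimal qubit sketch (standard textbook formalism, nothing Bohmian-specific): for a system prepared in ##|+\rangle##, the direct probability of an X-basis outcome differs from the classically expected sum over intermediate Z-basis outcomes.

```python
import numpy as np

# Total law check for a qubit prepared in |+> = (|0> + |1>)/sqrt(2).
# Classically one would expect: P(+x) = P(+z) P(+x|+z) + P(-z) P(+x|-z).
# Quantum mechanically the left side is 1, the right side is 1/2.

plus = np.array([1.0, 1.0]) / np.sqrt(2)        # |+>, the +1 eigenstate of X
zero = np.array([1.0, 0.0])                      # |+z>
one = np.array([0.0, 1.0])                       # |-z>

def prob(state, outcome):
    """Born-rule probability of projecting `state` onto `outcome`."""
    return abs(np.vdot(outcome, state)) ** 2

# Direct X measurement on |+>:
p_direct = prob(plus, plus)

# "Total law" route: sum over the outcomes of an unperformed Z measurement.
p_total = (prob(plus, zero) * prob(zero, plus)
           + prob(plus, one) * prob(one, plus))

print(p_direct, p_total)   # 1.0 versus 0.5
```

The total-law sum would be correct if a Z measurement had actually been performed first; it fails precisely when the intermediate measurement is only counterfactual.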
 
  • #273
DarMM said:
Bohmian Mechanics in equilibrium is equivalent to QM because of a posited strict restriction on epistemic reasoning within its probability theory. In Bohmian Mechanics in general the total law holds.

When we demand equilibrium we impose a very specific restriction on access to, and the ability to reason about, the hidden variables. Provided this epistemic block always holds, the probability theory effectively reduces to that of QM: the effective Bayesian reasoning about observations becomes a probability calculus like QM's. The resultant probability theory then breaks Kolmogorov's axioms, since breaking the total law is not consistent with classical probability theory. Blocking statistical inference in a specific way on a theory that normally obeys classical probability can thus leave an effective theory that does not.

Note that equilibrium cannot hold exactly but must be a thermalisation effect, so in essence if Bohmian Mechanics were true one should be able to see the total law restored.

There is a much broader point here that most of the interpretations of QM are not really interpretations but actually different theories. All hidden variable theories replicate QM under some kind of epistemic restriction that cannot hold in general and in some scenarios even with that restriction will have divergent predictions. Other views such as Many Worlds make conjectures about the formal structure of the theory that have yet to be verified.

The only actual interpretations proper are things like Quantum Bayesianism vs Copenhagen where really it's a purely philosophical thing, e.g. how do you view probabilities. You'll see a similar remark from Rudolf Peierls in "The Ghost in the Atom" from Cambridge University Press.

EDIT:
Note also that even in the underlying Bohmian theory preparations do not prepare ensembles of most quantities; we still have contextuality after all. Only a position ensemble is prepared.

Yes, I agree. Maybe the only difference is that I would say that all the routes here are also open to the minimal interpretation, so we don't have to say that the minimal interpretation goes beyond classical probability, any more than BM does. We could also say the minimal interpretation is contained within classical probability.
 
  • #274
atyy said:
Yes, I agree. Maybe the only difference is that I would say that all the routes here are also open to the minimal interpretation, so we don't have to say that the minimal interpretation goes beyond classical probability, any more than BM does. We could also say the minimal interpretation is contained within classical probability.
QM mathematically violates classical probability, thus is not contained in it. It is contained in a classical theory with a certain kind of epistemic restriction, such as Bohmian Mechanics at equilibrium. However note that in equilibrium we are violating Kolmogorov's axioms anyway due to how the epistemic restriction functions. I'll say more about this in a while as it links into Spekkens model. Bohmian Mechanics and other hidden variable theories replicate much of QM by this restriction alone; you only need the nonlocality/retrocausality to violate CHSH or Bell inequalities.
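The CHSH violation mentioned here is easy to verify numerically. A minimal numpy sketch (standard textbook Bell-state setup, with the usual optimal measurement angles) gives the Tsirelson value ##2\sqrt{2}##, above the classical bound of 2:

```python
import numpy as np

# CHSH correlator for the Bell state |Φ+> = (|00> + |11>)/sqrt(2)
# with the standard optimal measurement angles.

Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def obs(theta):
    """Spin observable cos(theta)*Z + sin(theta)*X, eigenvalues ±1."""
    return np.cos(theta) * Z + np.sin(theta) * X

phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)   # |Φ+>

def E(a, b):
    """Expectation value <Φ+| A(a) ⊗ B(b) |Φ+>."""
    return np.vdot(phi_plus, np.kron(obs(a), obs(b)) @ phi_plus).real

a0, a1 = 0.0, np.pi / 2          # Alice's two settings
b0, b1 = np.pi / 4, -np.pi / 4   # Bob's two settings

S = E(a0, b0) + E(a0, b1) + E(a1, b0) - E(a1, b1)
print(S)   # ≈ 2.828..., i.e. 2*sqrt(2) > 2
```

For this state E(a, b) = cos(a − b), so each of the first three correlators contributes ##\sqrt{2}/2## and the last contributes ##-\sqrt{2}/2## with a minus sign in front.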

So in no sense is QM contained in classical probability. Mathematically classical probability is a subset of quantum probability not the other way around. It's like saying curved spacetime is contained in flat spacetime because the latter might turn out to be the correct description of nature.

I think more so one should say that a truly minimal view is neutral to there being a deeper theory where classical probability theory holds. However it would have to acknowledge that as far as we can tell now and operationally in labs preparations do not constitute ensembles. Regarding your previous statement:
atyy said:
so I'm not sure it's quite correct to say that a breakdown of the total law of probability is not consistent with classical probability
Breaking the total law is not consistent with classical probability mathematically. It may be the case that there is a deeper theory which uses classical probability, but that is a separate statement. Similarly a Newton-Cartan bundle is not consistent with a Lorentzian metric theory, but the deeper gravitational theory turned out to involve such.

Also note that, due to contextuality, even in such a deeper theory a preparation does not constitute an ensemble for most observables. Although classical probability is restored, we cannot view our preparation as an ensemble for observables like angular momentum, but only for the hidden ##\lambda##.
 
  • #275
A. Neumaier said:
Suppose a beam is split into a superposition of two beams. At positions where the two beams are very far apart, a beam dump collapse is obtained if one destroys one of the resulting beams (by position measurements there) and makes measurements on the other one. This is a bilocal activity created by coordinated local interactions at two far apart places.

Such activities, together with a comparison of the joint measurement statistics at a later time,
are at the heart of all nonlocality experiments. It is not ''spooky action at a distance'' but ''spooky passion at a distance''.
What do you mean by "superposition of two beams"? If you talk about superposition you have to tell the basis, according to which the state ket is a superposition.

The beam dump is due to a local interaction between the particles in the one partial beam dumped and the material the particles are bumped into. This is with very good will something like a position measurement, though nobody cares about the precise position where the beam is dumped ;-)). Then you do experiments with the other beam, which is also due to usual local interactions of the particles in this beam with the various elements of the experiment (in the SGE the magnet and the particle detector, like in the original experiment the glass plates on which the silver atoms were caught and then developed to be measured under a microscope afterwards). Of course, in principle the beam dump and the experiment with the other beam can be as far apart as you wish. This has nothing to do with spooky actions at a distance.

One should clearly distinguish between nonlocal interactions, which according to standard relativistic QFT (aka the standard model) do not exist, and correlations between far-distant parts of quantum systems described by entanglement. What you mean by "spooky passion at a distance" I can't say.
 
  • #276
vanhees71 said:
What do you mean by "superposition of two beams"? If you talk about superposition you have to tell the basis, according to which the state ket is a superposition.
I had done so in an earlier post to the same topic:
A. Neumaier said:
take the ensemble of prepared systems, each in the state given by a symmetric superposition of (spin-up, momentum-up) and (spin-down, momentum-down);
A. Neumaier said:
Such activities, together with a comparison of the joint measurement statistics at a later time, are at the heart of all nonlocality experiments. It is not ''spooky action at a distance'' but ''spooky passion at a distance''.
vanhees71 said:
What you mean by "spooky passion at a distance" I can't say.
It is a meaningful play with words. It means that something happens at a distance - namely that nature cooperates globally at long distance to ensure that the perfect nonclassical correlations predicted by quantum mechanics in certain experiments actually happen. But it cannot be controlled hence is a passive happening (a ''passion'') rather than an active one (an ''action''). In spite of (and consistent with) the locally induced interactions!
 
  • #277
vanhees71 said:
I see. The "potentiality interpretation" of the "wave function" (or more generally a quantum state) is due to Schrödinger.
It's important to note that this is interpretation neutral. Due to contextuality some of the quantities we measure have to arise during interaction with the measurement device and can only be taken as properties of the device-system pair. Thus the state preparation has not prepared an ensemble for these quantities.
 
  • #278
vanhees71 said:
Sure, and if you call a beam-dump a collapse, fine with me.
Situations like these are precisely what induced Heisenberg in 1927 to talk about state reduction (aka reduction of the state vector, aka collapse). That you don't like the commonly used words for it doesn't mean that you don't make use of the same concept.
 
  • #279
A. Neumaier said:
I had done so in an earlier post to the same topic:
It is a meaningful play with words. It means that something happens at a distance - namely that nature cooperates globally at long distance to ensure that the perfect nonclassical correlations predicted by quantum mechanics in certain experiments actually happen. But it cannot be controlled hence is a passive happening (a ''passion'') rather than an active one (an ''action''). In spite of (and consistent with) the locally induced interactions!
This is the gibberish I fight against. What do you mean by "nature cooperates globally"?

Here you have a very clear preparation procedure consisting of entirely local physics: A beam of silver atoms comes through a hole from an oven, which gives a beam of unpolarized particles. Then it runs through a magnetic field such that you get an entanglement between the measured spin component and position. The entanglement refers to one and the same particle, and thus it's a "local property" of the single particle. Here you thus don't even have the long-distant correlations via entanglement as in the Bell experiments with two photons!

Of course you cannot control which spin state a single particle in the beam takes, that's the irreducible randomness of QT, but it allows you to prepare states with a definite spin component in the measured direction by selection of the wanted partial beam, thanks to the spin-component-position entanglement.

It's of course right that everything is consistent with local interactions. Otherwise we'd have to find a new theory instead of the Standard Model, which is difficult, because the Standard Model works better than wanted by the majority of particle physicists who are dissatisfied with it for various reasons.
 
  • #280
DarMM said:
It's important to note that this is interpretation neutral. Due to contextuality some of the quantities we measure have to arise during interaction with the measurement device and can only be taken as properties of the device-system pair. Thus the state preparation has not prepared an ensemble for these quantities.
Are you saying the protons in the LHC do not have very well determined momenta? This claim contradicts the very functioning of the entire device!
 
  • #281
A. Neumaier said:
Situations like these are precisely what induced Heisenberg 1927 to talk about state reduction (aka reduction of the state vector, aka collapse). That you don"t like the commonly used words for it doesn't mean that you don't make use of the same concept.
The important difference between my view and Heisenberg's is Heisenberg's claim that this is something outside of quantum theory. I have not seen a single convincing argument that the understanding of a "beam dump" needs any other laws than the usual quantum theoretical laws about the interaction of particles with other particles forming the beam-dumping material.
 
  • #282
DarMM said:
QM mathematically violates classical probability, thus is not contained in it. It is contained in a classical theory with a certain kind of epistemic restriction such as Bohmian Mechanics at equilibrium. However note that in equilibrium we are violating Kolmogorov's axioms anyway due to how the epsitemic restriction functions. I'll say more about this in a while as it links into Spekkens model. Bohmian Mechanics and other hidden variable theories replicate much of QM by this restriction alone, you only need the nonlocality/retrocausality to violate CHSH or Bell inequalities.

Hmmm, does BM at equilibrium really break the Kolmogorov axioms? Why can't we just put it down to contextuality, which if I understand correctly, just means that if you set up an experiment to measure position, then you cannot also measure momentum.

DarMM said:
So in no sense is QM contained in classical probability. Mathematically classical probability is a subset of quantum probability not the other way around. It's like saying curved spacetime is contained in flat spacetime because the latter might turn out to be the correct description of nature.

Hmmm, I do tend to think that curved spacetime is contained in flat spacetime.

DarMM said:
I think more so one should say that a truly minimal view is neutral to there being a deeper theory where classical probability theory holds.

Yes, that is what I mean, though I guess I'm not sure what the distinction is between that and saying that classical probability contains QM.

DarMM said:
However it would have to acknowledge that as far as we can tell now and operationally in labs preparations do not constitute ensembles. Regarding your previous statement:

Breaking the total law is not consistent with classical probability mathematically. It may be the case that there is a deeper theory which uses classical probability but that is a separate statement. Similarly a Newton-Cartan bundle is not consistent with a Lorentzian metric theory, but the deeper gravitational turned out to involve such.

Generally, my instinctive understanding of why QM does not prepare ensembles in the same sense as classical probability is that in classical probability a mixed state is a unique combination of pure states, whereas in QM a mixed state is not a unique combination of pure states. Thus I would say that the ensemble in QM is underspecified in terms of what subensembles constitute it, not that the subensembles cannot exist until measurement.
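The non-uniqueness of the decomposition is a two-line computation. A minimal sketch (standard density-matrix formalism): the maximally mixed qubit state arises both as a 50/50 mixture of Z eigenstates and as a 50/50 mixture of X eigenstates, so the preparation alone does not fix the subensembles.

```python
import numpy as np

def proj(v):
    """Projector onto the (normalised) state vector v."""
    v = v / np.linalg.norm(v)
    return np.outer(v, v.conj())

zero, one = np.array([1.0, 0.0]), np.array([0.0, 1.0])   # Z eigenstates
plus, minus = np.array([1.0, 1.0]), np.array([1.0, -1.0])  # X eigenstates

# Two physically different-looking ensembles...
rho_z = 0.5 * proj(zero) + 0.5 * proj(one)
rho_x = 0.5 * proj(plus) + 0.5 * proj(minus)

# ...give the identical density matrix I/2.
print(np.allclose(rho_z, rho_x))   # True
```

Since all measurement statistics are determined by the density matrix alone, no experiment can distinguish which "subensemble" decomposition was used.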

DarMM said:
Also note that from contextuality even still in such a deeper theory a preparation does not constitute an ensemble for most observables. Although classical probability is restored we cannot view our preparation as an ensemble for observables like angular momentum, but only the hidden ##\lambda##.

Yes, I agree.
 
  • #283
vanhees71 said:
Are you saying the protons in the LHC do not have the very well determined momentum? This claim contradicts the very functioning of the entire device!
No, I'm not saying that. I'm just stating basic aspects of contextuality and quantum probability.

LHC beams have very well determined momenta for momentum measurements as shown by the tightness of the resulting momentum distribution.

However you cannot consider the beams as being an ensemble of different momenta independent of momentum measurements purely from the preparation.
 
  • #284
atyy said:
Hmmm, I do tend to think that curved spacetime is contained in flat spacetime
Well then it's just a different use of the word "contain".

I would have considered curved spaces to be mathematically more general than flat spaces thus not contained in flat spacetime. You're using it to mean "May ultimately be a physical limiting case in some sense of...".

Mathematically the theory of curved spaces is not contained in the theory of flat spaces, but physically the flat space theory could be correct. It's a separate notion.

atyy said:
Hmmm, does BM at equilibrium really break the Kolmogorov axioms?
Yes, there are restrictions in the lattice of events that you don't have in Kolmogorov's axioms.

atyy said:
Yes, that is what I mean, though I guess I'm not sure what the distinction is between that and saying that classical probability contains QM
It's as I said above.

Mathematically quantum probability is more general. However you are discussing physically how any given mathematical structure may only arise in a specific physical limit of another theory. Our two notions of "contain" were different.

In the case you're talking about we don't find out that classical probability theory contains quantum probability theory, that's impossible as the latter is more general. Rather we find that the correct hidden variable theory contained an epistemic special case isomorphic to a quantum probability theory.

My only problem is that under this definition in some sense any theory is contained by almost anything as it could be wrong and be a limit of something else entirely different.

My statement more considered QM as it is now where we seem to not have a common sample space for our observables from their operational statistics and thus we currently have no grounds to accept a preparation as constituting an ensemble.

atyy said:
Generally, my instinctive understanding of why QM does not prepare ensembles in the same sense as classical probability is that for classical probability, a mixed state is a unique combination of pure states, whereas in QM a mixed state is not a unique combination of pure states. Thus I would say that the ensemble in QM is underspecified in terms of what subensembles constitute it, not that the subensembles cannot exist until measurement.
I wouldn't say this, as even for pure states we lack a common sample space, which prevents one from thinking of preparations as ensembles.
 
  • #285
vanhees71 said:
The important difference of my view to Heisenberg's is Heisenberg's claim that this is something outside of quantum theory. I have not seen a single convincing argument that the understanding of a "beam dump" needs any other laws than the usual quantum theoretical laws about the interaction of particles with other particles forming the beam-dumping material.
Heisenberg didn't think of state reduction as being outside of quantum theory but (like most physicists since him) as being an aspect of it.
Werner Heisenberg (1927) said:
Jede Ortsbestimmung reduziert also das Wellenpaket wieder auf seine ursprüngliche Grösse [Every determination of position thus reduces the wave packet back to its original size]
Paul Dirac (1930) said:
The state of the system after the observation must be an eigenstate of [the operator corresponding to the observable] ##\alpha##, since the result of a measurement of ##\alpha## for this state must be a certainty.
 
  • #286
A. Neumaier said:
It means that something happens at a distance - namely that nature cooperates globally at long distance to ensure that the perfect nonclassical correlations predicted by quantum mechanics in certain experiments actually happen. But it cannot be controlled hence is a passive happening (a ''passion'') rather than an active one (an ''action''). In spite of (and consistent with) the locally induced interactions!
vanhees71 said:
What do you mean by "nature cooperates globally".
Nature ensures in perfect correlation experiments with entangled photon pairs (the ''certain experiments'') that whenever Alice measures ##A_k## to get ##a_k## then Bob measures ##B_k## and also gets ##a_k##, while for Bob it seems that his results are random. In spite of (and consistent with) the locally induced interactions!
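These perfect correlations with locally random marginals can be checked directly. A minimal sketch for the Bell state ##|\Phi^+\rangle = (|00\rangle + |11\rangle)/\sqrt{2}## with both parties measuring in the same (Z) basis:

```python
import numpy as np

# Joint outcome probabilities for |Φ+> = (|00> + |11>)/sqrt(2) when Alice
# and Bob both measure in the Z basis.

phi_plus = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

basis = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]   # Z eigenstates
joint = np.zeros((2, 2))
for i, a in enumerate(basis):
    for j, b in enumerate(basis):
        # Born rule for the joint outcome (a for Alice, b for Bob)
        joint[i, j] = abs(np.vdot(np.kron(a, b), phi_plus)) ** 2

print(joint)              # [[0.5, 0.0], [0.0, 0.5]]: outcomes always agree
print(joint.sum(axis=1))  # [0.5, 0.5]: Alice alone sees a fair coin
```

The off-diagonal entries vanish (the "cooperation"), yet each marginal distribution is uniform, so neither party can signal to the other.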
 
  • #287
DarMM said:
No I'm not saying that. I'm just saying basic aspects of contextuality and quantum probability.

(a) LHC beams have very well determined momenta for momentum measurements as shown by the tightness of the resulting momentum distribution.

(b) However you cannot consider the beams as being an ensemble of different momenta independent of momentum measurements purely from the preparation.
For me (a) and (b) are contradicting each other since for me (a) is what I understand as an ensemble of protons with pretty well defined momenta. It's something I'd expect to be quite well described by a wave function sharply peaked in momentum space (or the appropriate formulation in QFT as the correspondingly smeared creation operator applied to the vacuum state).

How can the state concept of QT make physical sense, if (a) doesn't define an ensemble of protons with pretty well determined momentum?
 
  • #288
vanhees71 said:
For me [a] and [b] are contradicting each other, since for me [a] is what I understand as an ensemble of protons with pretty well defined momenta. It's something I'd expect to be quite well described by a wave function sharply peaked in momentum space (or the appropriate formulation in QFT as the correspondingly smeared creation operator applied to the vacuum state).
That's just Kochen-Specker contextuality and quantum probability though. It's not my personal view or interpretation.

How do you view the Kochen-Specker theorem then? That might be an easier way to pinpoint the misunderstanding.
 
  • #289
Morbert said:
Is the protean nature of ensembles in QM a weakness in the minimalist ensemble interpretation?

My understanding so far: The theory of a given system is the double ##(H,\rho)##, the dynamics and the preparation. I.e. All physical content is contained in these terms. The triple ##(H,\rho,\sigma)## describes an ensemble in terms of possible outcomes of a measurement (or possible outcomes of a sequence of measurements), where ##\sigma## is the set of possibilities. The triple ##(H,\rho,\sigma')## describes an ensemble in terms of a different, incompatible set of measurement possibilities ##\sigma'##.

Could we say the physical content of the triples ##(H,\rho,\sigma)## and ##(H,\rho,\sigma')## is the same, and the choice of one over the other is merely a choice of appropriate descriptive terms for a measurement context. I.e. A choice of measurement context does not change any physical content of the preparation. It merely constrains the physicist to use a description appropriate for that context.

[edit] - Added some clarification.
Forgot to respond to this. Yes indeed and it's essentially that constraint that prevents one viewing it as an ensemble. Only after such a "choice" does QM give a well defined statistical population.
 
  • #290
vanhees71 said:
for me [a] is what I understand as an ensemble of protons with pretty well defined momenta.
But before you had said,
vanhees71 said:
In my example the LHC is a "preparation machine" for (unpolarized) proton beams with a quite well-defined momentum and energy. These beams are prepared such that they collide in specific points along the beam line. For me the preparation procedure delivers a well-defined ensemble of colliding proton beams.
An ensemble of proton beams (prepared moving blobs consisting of many protons) is not an ensemble of protons. Protons are never seen in the LHC experiments; prepared are the blobs and measured are the traces of the collision products. In the spirit of the quote by Peres, this is what you have in the labs, not protons.
 
  • #291
I don't see why my notion of ensembles as the interpretation of quantum states should violate the KS theorem at all. My point is that the state by itself defines an ensemble, since I'm still free to measure whatever I can measure (restricting myself to precise PV measurements, I can always measure any set of compatible observables I like, independent of the state preparation).

E.g., if I prepare a proton beam polarized in ##z## direction, I'm still free to measure any spin component of each proton in this beam I like. Accordingly, given this state I know for any spin component the probabilities for measuring one of the two values ##\pm \hbar##. So no matter, which (quantum theoretically sensible) measurement I perform on my "ensemble" I have well-defined probabilities. That's why I think it's too narrow to say the ensemble is not only given by the quantum state alone but only in context of the measurement to be performed on it. I think that's also demonstrated by the correct prediction for the (probabilistic) outcome of "delayed-choice measurements".
 
  • #292
A. Neumaier said:
But before you had said,

An ensemble of proton beams (prepared moving blobs consisting of many protons) is not an ensemble of protons. Protons are never seen in the LHC experiments; prepared are the blobs and measured are the traces of the collision products. In the spirit of the quote by Peres, this is what you have in the labs, not protons.
This is semantics. Like any object, a proton is defined by its properties. In fact in QT objects are much more definitely defined than in classical physics, since any proton is completely indistinguishable from any other proton. A beam of protons consists of many protons forming an ensemble. You can of course argue whether a specific bunch in the LHC is an ensemble of independently prepared single protons.
 
  • #293
vanhees71 said:
for me [a] is what I understand as an ensemble of protons with pretty well defined momenta.
But before you had said that it is an ensemble of proton beams with pretty well defined momenta,
not an ensemble of protons. Protons are never seen in the experiment, prepared are the bunches.
vanhees71 said:
This is semantics.
Yes, and semantics (meaning) counts in arguments about the meaning of concepts.
vanhees71 said:
A beam of protons consists of many protons forming an ensemble.
A single beam does not, according to your definition:
vanhees71 said:
An ensemble is a collection of independent equally prepared systems.
The protons in a single beam are neither independent nor a collection. They are not even distinguishable, one can point to none of them, only to the time-dependent multiparticle state formed by the whole bunch.
 
  • #294
Well, then how do you explain that the LHC measures outcomes precisely in accordance with the standard model assuming two protons in the initial state? Of course the answer is simply that the bunches are still dilute enough that FAPP you can assume that only a single pp collision occurs in each interaction of the bunches.
 
  • #295
vanhees71 said:
Well, then how do you explain that the LHC measures outcomes precisely in accordance with the standard model assuming two protons in the initial state. Of course the answer is simply that the bunches are still dilute enough that for FAPP you can assume that only a single pp collision occurs in each interaction of the bunches.
I explain it by ''shut up and calculate''. In that mode all inquiries about the precise meaning of the concepts involved are meaningless, as the meaning is left to the discretion of everyone. On this level it is alright to equate ensemble with preparation, as such ''equations'' are only proxies for intuitive reasoning.

But if one starts to inquire into the meaning of the concepts used (as in many of these foundational threads), one finds them problematic and often inconsistent in how they are used.
 
  • #296
Well, one can also over-problematize the problems.
 
  • #297
vanhees71 said:
My point is that the state by itself defines an ensemble
But it simply doesn't as a mathematical fact. An ensemble is an approximate realisation of a sample space. In quantum probability the state alone does not give one a well defined sample space due to Kochen-Specker contextuality. That's all there is to it.

What would help me is understanding what you think the Kochen-Specker theorem implies. Then we could more easily discuss this since currently what you are saying seems in direct contradiction to it.
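A brute-force sketch of the sample-space point: if the four CHSH observables ##A_0, A_1, B_0, B_1## all had pre-assigned ##\pm 1## values on a common sample space, every attainable CHSH correlator would be a convex mixture of the 16 deterministic assignments, and these are all bounded by 2, short of the quantum ##2\sqrt{2}##.

```python
import itertools

# Enumerate all deterministic ±1 assignments to the four CHSH observables.
# Any theory with a common sample space can only mix these, so its CHSH
# value is a convex combination bounded by the extremes found here.
values = []
for a0, a1, b0, b1 in itertools.product([-1, 1], repeat=4):
    values.append(a0 * b0 + a0 * b1 + a1 * b0 - a1 * b1)

print(max(values), min(values))   # 2 -2
```

Algebraically this is immediate: the correlator equals ##a_0(b_0 + b_1) + a_1(b_0 - b_1)##, and one of the two brackets always vanishes while the other is ##\pm 2##.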
 
  • #298
DarMM said:
Well then it's just a different use of the word "contain".

I would have considered curved spaces to be mathematically more general than flat spaces thus not contained in flat spacetime. You're using it to mean "May ultimately be a physical limiting case in some sense of...".

Mathematically the theory of curved spaces is not contained in the theory of flat spaces, but physically the flat space theory could be correct. It's a separate notion.

Yes, there are restrictions in the lattice of events that you don't have in Kolmogorov's axioms.

It's as I said above.

Mathematically quantum probability is more general. However you are discussing physically how any given mathematical structure may only arise in a specific physical limit of another theory. Our two notions of "contain" were different.

In the case you're talking about we don't find out that classical probability theory contains quantum probability theory, that's impossible as the latter is more general. Rather we find that the correct hidden variable theory contained an epistemic special case isomorphic to a quantum probability theory.

My only problem is that under this definition in some sense any theory is contained by almost anything as it could be wrong and be a limit of something else entirely different.

My statement concerned QM as it stands now, where the operational statistics of our observables seem to admit no common sample space, and thus we currently have no grounds to accept a preparation as constituting an ensemble.

I guess it is the difference between saying that determinism is a special case of randomness (which is mathematically true, since one can use the delta measure), or saying that randomness is a special case of determinism (which is not mathematically true, but is physically true in the sense that we can consider statistical mechanics as arising from Newton's laws for many particles and ignorance of the exact state).
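The delta-measure remark can be made concrete with a few lines of code (my own sketch, not part of the original post): a deterministic quantity is just a random variable whose distribution is concentrated on a single point.

```python
# Sketch of the delta-measure point: determinism as a special case of
# randomness. Sampling from the delta measure at x0 always returns x0,
# so the mean equals the deterministic value and the variance is zero.
def delta_sample(x0):
    """Draw from the delta measure at x0: every draw returns x0 itself."""
    return x0

samples = [delta_sample(3.0) for _ in range(1000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)  # 3.0 0.0: all probability mass sits on one point
```

The converse direction (randomness from determinism) is the physical, not mathematical, inclusion described above: many deterministic trajectories plus ignorance of the exact initial state.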

DarMM said:
I wouldn't say this, as even for a pure state we lack a common sample space, which prevents one thinking of preparations as ensembles.

I guess, reading your replies to @vanhees71, that this is due to the KS theorem. But I thought the point of the KS theorem is that QM is contextual? What has contextuality got to do with an inability to consider preparations as ensembles?
 
  • Like
Likes vanhees71
  • #299
DarMM said:
QM mathematically violates classical probability, thus is not contained in it. It is contained in a classical theory with a certain kind of epistemic restriction such as Bohmian Mechanics at equilibrium.
This makes no sense. BM is conceptually a completely classical, deterministic theory, and quantum equilibrium states are well-defined probabilistic states defined on the configuration space.
DarMM said:
However note that in equilibrium we are violating Kolmogorov's axioms anyway due to how the epistemic restriction functions. I'll say more about this in a while as it links into Spekkens' model. Bohmian Mechanics and other hidden variable theories replicate much of QM by this restriction alone; you only need the nonlocality/retrocausality to violate CHSH or Bell inequalities.
All you need is, indeed, a preferred frame. Everything else (logic, probability theory, local configurations) is completely classical. Epistemic restrictions (as far as that means that we are unable to prepare some states but are restricted to preparing only states in quantum equilibrium) do not lead to any violations of Kolmogorovian probability.

By the way, the quantum theory fits into Kolmogorovian probability in a quite trivial way, which is described in Kochen, S., Specker, E.P. (1967). The Problem of Hidden Variables in Quantum Mechanics, J. Math. Mech. 17(1), 59-87 on page 63. They have combined this with some bad words about it, to motivate some additional restrictions (non-contextuality) for "good" hidden variables, which they prove are incompatible with quantum theory.
 
  • #300
Elias1960 said:
This makes no sense. BM is conceptually a completely classical, deterministic theory, and quantum equilibrium states are well-defined probabilistic states defined on the configuration space.
All true. I don't know how it affects what I say though.

Quantum theory generalizes classical probability theory. That observation is decades old by now.
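One concrete way to see this generalization (my own numerical sketch, assuming only numpy, not part of the original post) is the failure of the classical law of total probability, P(B) = Σᵢ P(Aᵢ)P(B|Aᵢ), when the intermediate observable does not commute with the final one. This is the "total law" discussed earlier in the thread.

```python
import numpy as np

# Sketch (not from the thread): total probability law for the state |+>,
# with intermediate observable A = Z-basis measurement and final event
# B = projector onto |+>. A and B do not commute.
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)        # state |+>
B = np.outer(plus, plus)                          # projector onto |+>
A = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]    # Z-basis projectors A_0, A_1

direct = plus @ B @ plus                          # P(B) with no A measured
total = 0.0
for Ai in A:
    pA = plus @ Ai @ plus                         # P(A_i)
    post = (Ai @ plus) / np.sqrt(pA)              # state after outcome A_i
    total += pA * (post @ B @ post)               # P(A_i) * P(B | A_i)

print(round(direct, 6), round(total, 6))  # 1.0 vs 0.5: total law violated
```

Measuring A first destroys the interference that makes P(B) = 1, which is why blocking or permitting such intermediate inferences changes the effective probability theory.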

Elias1960 said:
By the way, the quantum theory fits into Kolmogorovian probability in a quite trivial way, which is described in Kochen, S., Specker, E.P. (1967). The Problem of Hidden Variables in Quantum Mechanics, J. Math. Mech. 17(1), 59-87 on page 63. They have combined this with some bad words about it, to motivate some additional restrictions (non-contextuality) for "good" hidden variables, which they prove are incompatible with quantum theory.
Can you show me where in that paper they show that quantum theory "trivially" fits into Kolmogorovian probability in a way that isn't essentially the sense @atyy and I have mentioned?
 
