Von Neumann QM Rules Equivalent to Bohm?

  • #51
vanhees71 said:
You keep repeating this every time, but I've not seen a single example of such an experimental observation, which would imply that either Einstein causality or QT must be wrong. Before I believe either of these, I need very convincing experimental evidence for a collapse!

The Bell tests need collapse when Alice's and Bob's observations are timelike separated, or when they are calculated in a frame in which their observations are not simultaneous.

vanhees71 said:
It causes trouble not for any whatever-logy but for the overwhelming evidence for the correctness of relativistic space-time for all (at least all local) observations made so far. Either you believe in the existence of a collapse, or in Einstein causality and locality. The most successful model ever, the Standard Model of elementary particle physics, obeys both. One doesn't need a collapse to derive all of its observable predictions, and these predictions are validated by all observations made so far (to the dismay of the particle theorists, who'd like to find evidence for physics beyond the Standard Model in order to see how to overcome some of its difficulties, including the hierarchy problem and the description of dark matter, to find a hint where to look for direct evidence of what it is made of).

Collapse causes no trouble for relativistic spacetime, unless one believes in a special relativistic ontology. Einstein causality and locality are beliefs in ontology. Special relativity does not require Einstein causality and locality - that is one of the great lessons of collapse.
 
  • #52
rubi said:
The idea of the algebraic framework is to extract the relevant part of QM (observable facts) and get rid of the mathematical parts that have no relevance (like the choice of a Hilbert space). In QM, we are interested in the behaviour of certain sets of observables (position, momentum, ...) and these observables form an algebra (they can be multiplied, for example). A state of a system tells us all physical information that can be extracted in principle (like expectation values, probabilities, ...). In QM, we usually have a Hilbert space with operators and a state is determined by a vector ##\Psi##. Expectation values are given by ##\left<A\right>=\left<\Psi,A\Psi\right>##. A state could also be given by a density matrix ##\rho## and the expectation values would be ##\left<A\right>=\mathrm{Tr}(\rho A)##. So the expectation value functional takes an observable and spits out a number (the expectation value). Now there is a mathematical theorem (GNS) that says that when we have a certain algebra (of observables) and know all the expectation values of these observables, then we can reconstruct a Hilbert space ##\mathcal H##, a representation ##\pi## of the algebra and a vector ##\Omega##, such that the expectation values are given by ##\left<A\right> = \left<\Omega,\pi(A)\Omega\right>##. (The expectation value functional is usually denoted by ##\omega(A)## rather than ##\left<A\right>##.) But that also means that even if we have an algebra of observables and a state given by a density matrix, we can construct a new Hilbert space such that the state that was formerly given by a density matrix is now a plain old vector state (##\Omega##): We just use our old algebra as the algebra and the "algebraic state" ##\omega(A)=\mathrm{Tr}(\rho A)## as the expectation value functional and apply the theorem. (It constructs the new Hilbert space and the new representation of the algebra explicitly.)

Now what does that look like concretely? Let's say we have an algebra of observables ##\mathfrak A## on a concrete Hilbert space ##\mathcal H## and a density matrix ##\rho## on ##\mathcal H##. The density matrix can always be written as ##\rho=\sum_n \rho_n b_n \left<b_n,\cdot\right>##, where ##(b_n)_n## is an ONB for ##\mathcal H##. We can now define a new Hilbert space ##\mathcal H' = \bigoplus_n\mathcal H##, a representation ##\pi(A) (\bigoplus_n v_n) = \bigoplus_n A v_n## and a vector ##\Omega_\rho = \bigoplus_n \sqrt{\rho_n} b_n##. We can verify that we get the same expectation value as before: ##\mathrm{Tr}(\rho A) = \left<\Omega_\rho,\pi(A)\Omega_\rho\right>##. Every density matrix on ##\mathcal H## can be represented this way by a normalized vector ##\Omega_\rho## in ##\mathcal H'## and since they are normalized, they are related by unitary transformations. So if one has two density matrices ##\rho(t_1)## and ##\rho(t_2)## in ##\mathcal H##, there is a unitary operator ##U(t_2,t_1)## in ##\mathcal H'## such that ##\Omega_{\rho(t_2)}=U(t_2,t_1)\Omega_{\rho(t_1)}##.

Edit: I should probably add what the inner product on ##\mathcal H'## is: ##\left<\bigoplus_n v_n, \bigoplus_n w_n\right>_{\mathcal H'} = \sum_n\left<v_n,w_n\right>_{\mathcal H}##
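The construction above can be checked numerically in a small finite-dimensional case. The sketch below is my own illustration (not from the thread; variable names are mine): it takes a random density matrix on ##\mathcal H = \mathbb C^2##, builds ##\Omega_\rho## and the blockwise representation ##\pi(A)## on ##\mathcal H' = \mathcal H \oplus \mathcal H##, and verifies ##\mathrm{Tr}(\rho A) = \left<\Omega_\rho, \pi(A)\Omega_\rho\right>##:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random density matrix rho on H = C^2 (positive, trace 1).
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
rho = M @ M.conj().T
rho /= np.trace(rho)

# Spectral decomposition rho = sum_n rho_n |b_n><b_n|.
rho_n, B = np.linalg.eigh(rho)           # columns of B are the ONB (b_n)

# H' = H (+) H: Omega_rho stacks sqrt(rho_n) b_n, pi(A) acts blockwise.
Omega = np.concatenate([np.sqrt(rho_n[n]) * B[:, n] for n in range(2)])

def pi(A):
    """Representation pi(A) = A (+) A on H' (block diagonal)."""
    return np.kron(np.eye(2), A)

# An arbitrary self-adjoint observable A on H.
A = rng.normal(size=(2, 2))
A = A + A.T

lhs = np.trace(rho @ A)                  # Tr(rho A) on H
rhs = Omega.conj() @ pi(A) @ Omega       # <Omega, pi(A) Omega> on H'
assert np.allclose(lhs, rhs)
assert np.isclose(np.linalg.norm(Omega), 1.0)   # Omega is a unit vector
```

The same check works for any dimension; the point is only that the density matrix's data has been repackaged as a single unit vector in the larger space.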

As far as I can tell, this corresponds to
(1) A proper mixture and an improper mixture (reduced density matrix) are indistinguishable if one only looks at local observables
(2) Every mixture can be interpreted as a reduced density matrix, and purified.

The physical picture behind this is decoherence. However, decoherence does not derive the collapse. No matter what mathematical tricks one plays, deterministic unitary evolution and the Born rule are insufficient, because
(1) There are two types of time evolution: deterministic and probabilistic
(2) The Born rule does not give the joint probabilities for observations carried out at different times

In addition to standard physics texts, rigorous texts like Holevo deal extensively with collapse, and it is mentioned in the rigorous text by Dimock.
Holevo https://www.amazon.com/dp/3540420827/?tag=pfamazon01-20
Dimock https://www.amazon.com/dp/1107005094/?tag=pfamazon01-20
 
  • #53
atyy said:
As far as I can tell, this corresponds to
(1) A proper mixture and an improper mixture (reduced density matrix) are indistinguishable if one only looks at local observables
(2) Every mixture can be interpreted as a reduced density matrix, and purified.
No, I'm not restricting the set of observables anywhere. Every observable ##A## on ##\mathcal H## corresponds to an observable ##\pi(A)## on ##\mathcal H'## and every pure and mixed state (proper or improper) on ##\mathcal H## corresponds to a vector state ##\Omega## in ##\mathcal H'##. (##\sum_n \rho_n b_n \left<b_n,\cdot\right>## is sent to ##\bigoplus_n \sqrt{\rho_n} b_n##.)

The physical picture behind this is decoherence. However, decoherence does not derive the collapse. No matter what mathematical tricks one plays, deterministic unitary evolution and the Born rule are insufficient, because
(1) There are two types of time evolution: deterministic and probabilistic
It's independent of decoherence. Both types of time evolution can be described by a general Lindblad equation in ##\mathcal H## and this induces a family of time evolution operators ##U(t_2,t_1)## on ##\mathcal H'## as described above. In fact, one doesn't even need a Lindblad equation. It's enough to know the density matrices in ##\mathcal H## at all times to get the family.
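To illustrate the last point: given only the density matrices at two times, one can explicitly build a unitary on ##\mathcal H'## connecting the corresponding vector states, since both are unit vectors. A minimal numpy sketch (my own, with hypothetical helper names; each ##\Omega## is extended to an orthonormal basis via QR):

```python
import numpy as np

rng = np.random.default_rng(1)

def omega(rho):
    """Density matrix on C^d -> unit vector Omega_rho in C^(d*d)."""
    rho_n, B = np.linalg.eigh(rho)
    rho_n = np.clip(rho_n, 0.0, None)          # guard tiny negative eigenvalues
    return np.concatenate([np.sqrt(rho_n[n]) * B[:, n]
                           for n in range(len(rho_n))])

def extend_to_onb(v):
    """An orthonormal basis of C^d whose first vector is the unit vector v."""
    d = len(v)
    Q, _ = np.linalg.qr(np.column_stack([v, np.eye(d)]))
    Q[:, 0] *= Q[:, 0].conj() @ v              # undo QR's sign/phase on column 0
    return Q

def random_density_matrix(d, rng):
    M = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
    rho = M @ M.conj().T
    return rho / np.trace(rho)

rho1 = random_density_matrix(2, rng)           # stands in for rho(t_1)
rho2 = random_density_matrix(2, rng)           # stands in for rho(t_2)
O1, O2 = omega(rho1), omega(rho2)

# U(t_2, t_1): a unitary on H' mapping Omega_{rho(t_1)} to Omega_{rho(t_2)}.
U = extend_to_onb(O2) @ extend_to_onb(O1).conj().T

assert np.allclose(U.conj().T @ U, np.eye(4))  # U is unitary
assert np.allclose(U @ O1, O2)                 # U Omega_1 = Omega_2
```

This only demonstrates existence of such a family of unitaries; it says nothing about which dynamical equation (Lindblad or otherwise) produced the density matrices.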

(2) The Born rule does not give the joint probabilities for observations carried out at different times
I'm sure one can write down joint probabilities also in ##\mathcal H'##, since ##\mathcal H'## contains exactly the same information as the set of density matrices on ##\mathcal H##. It's only encoded differently (as an infinite direct sum instead of a matrix). It just needs a little bit of extra work to get the correct formulas. Very roughly speaking, I just put the (square roots) of the entries of a density matrix in a list, rather than in a matrix, so if one can use that information to calculate joint probabilities on ##\mathcal H##, one should also be able to do that on ##\mathcal H'##.

In addition to standard physics texts, rigorous texts like Holevo deal extensively with collapse, and it is mentioned in the rigorous text by Dimock.
Holevo https://www.amazon.com/dp/3540420827/?tag=pfamazon01-20
Dimock https://www.amazon.com/dp/1107005094/?tag=pfamazon01-20
Thanks, I will have a look at them.
 
  • #54
rubi said:
No, I'm not restricting the set of observables anywhere. Every observable ##A## on ##\mathcal H## corresponds to an observable ##\pi(A)## on ##\mathcal H'## and every pure and mixed state (proper or improper) on ##\mathcal H## corresponds to a vector state ##\Omega## in ##\mathcal H'##. (##\sum_n \rho_n b_n \left<b_n,\cdot\right>## is sent to ##\bigoplus_n \sqrt{\rho_n} b_n##.)

Yes, but is it also the case that every observable in ##\mathcal H'## corresponds uniquely to an observable in ##\mathcal H##? If it doesn't, then that would correspond to what physicists call a local observable, since the space ##\mathcal H## is "smaller" or "local" compared to ##\mathcal H'## which is "larger".

Is the theorem you are thinking about what is called the GNS construction on this page about the church of the larger Hilbert space: http://www.quantiki.org/wiki/The_Church_of_the_larger_Hilbert_space? If it is, then I do think it is equivalent to purifications, and the two "churches" of quantum theory. The other denomination is of course the church of the smaller Hilbert space: http://mattleifer.info/wordpress/wp-content/uploads/2008/11/commandments.pdf.

rubi said:
It's independent of decoherence. Both types of time evolution can be described by a general Lindblad equation in ##\mathcal H## and this induces a family of time evolution operators ##U(t_2,t_1)## on ##\mathcal H'## as described above. In fact, one doesn't even need a Lindblad equation. It's enough to know the density matrices in ##\mathcal H## at all times to get the family.

rubi said:
I'm sure one can write down joint probabilities also in ##\mathcal H'##, since ##\mathcal H'## contains exactly the same information as the set of density matrices on ##\mathcal H##. It's only encoded differently (as an infinite direct sum instead of a matrix). It just needs a little bit of extra work to get the correct formulas. Very roughly speaking, I just put the (square roots) of the entries of a density matrix in a list, rather than in a matrix, so if one can use that information to calculate joint probabilities on ##\mathcal H##, one should also be able to do that on ##\mathcal H'##.

What I mean is that if I know the density matrices at all times and the Born rule, there is still experimental data that I cannot predict without collapse or some other postulate. For example, if I know the state is f(x) at t1 and g(x) at t2, I can calculate the probability of the system being at x=y at t1, and the probability of it being at x=z at t2. However, I cannot calculate the probability of being at z at t2 given that it was at y at t1, i.e., I cannot calculate p(y,z) or p(z|y).
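A toy qubit calculation may make this concrete (my own sketch, not from the thread): the density matrices at t1 and t2 fix the single-time marginals via the Born rule, but the two-time joint probability comes from the projection postulate, via the two-time formula ##p(a,b) = \|P_b U P_a \psi\|^2##, and it differs from the product of the marginals:

```python
import numpy as np

# Qubit example: state |+> at t1, evolve by a rotation U, measure sigma_z
# at both times.  Marginals come from rho(t1), rho(t2) alone; the joint
# p(a at t1, b at t2) needs the projection postulate (collapsed state).
plus = np.array([1.0, 1.0]) / np.sqrt(2)
theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])       # evolution t1 -> t2

P = {+1: np.outer([1.0, 0.0], [1.0, 0.0]),            # projector onto |0>
     -1: np.outer([0.0, 1.0], [0.0, 1.0])}            # projector onto |1>

# Marginals from the Born rule and the two density matrices:
rho1 = np.outer(plus, plus)
rho2 = U @ rho1 @ U.T
p1 = {a: np.trace(P[a] @ rho1) for a in (+1, -1)}     # p(a at t1)
p2 = {b: np.trace(P[b] @ rho2) for b in (+1, -1)}     # p(b at t2)

# Joint probabilities via the projection postulate:
# p(a, b) = || P_b U P_a psi ||^2
joint = {(a, b): np.linalg.norm(P[b] @ U @ P[a] @ plus)**2
         for a in (+1, -1) for b in (+1, -1)}

assert np.isclose(sum(joint.values()), 1.0)
# The joint is NOT the product of the marginals: it carries information
# that the two density matrices alone do not fix.
assert not np.isclose(joint[(+1, +1)], p1[+1] * p2[+1])
```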

rubi said:
Thanks, I will have a look at them.

In Holevo's book, it is in the chapter on repeated and continuous measurements, and the concept that is called "collapse" is dealt with by the concept of an instrument. In Dimock's book it is just mentioned, and there is not extensive discussion about it.
 
  • #55
atyy said:
The Bell tests need collapse when Alice's and Bob's observations are timelike separated, or when they are calculated in a frame in which their observations are not simultaneous.
This I don't understand. In Bell tests you start with an entangled pair of photons (biphotons), created by some local process, e.g., parametric down conversion in a crystal. They are mostly emitted back to back and you simply have to wait long enough to be able to detect the photons at large distances (making sure that nothing disturbs them, to prevent decoherence of the state). The single-photon polarizations are maximally random (unpolarized photons), but the 100% correlation between polarizations measured in the same direction is inherent in this state. So it's a property of the biphotons, and the correlations are thus "caused" by their production in this state and not due to the measurement of one of the single-photon polarization states. It doesn't matter whether the registration events by A and B are time-like or space-like separated. You'll always measure the correlation due to the entanglement, provided there was no disturbance in between that destroys the entanglement. This shows that no collapse is necessary in the analysis of these experiments (the same holds of course when you use non-aligned setups of the two polarizers at A's and B's places, as necessary to demonstrate the violation of Bell's inequality or variations of it).

Collapse causes no trouble for relativistic spacetime, unless one believes in a special relativistic ontology. Einstein causality and locality are beliefs in ontology. Special relativity does not require Einstein causality and locality - that is one of the great lessons of collapse.
I don't know what you precisely mean by "special relativistic ontology". The space-time structure described by Minkowski space-time is only consistent with the principle of causality if there cannot be causal influences over space-like distances, and the collapse assumption introduces precisely such a thing, because it states that the biphoton state instantaneously collapses to a two-photon state, whereby letting one of the entangled photons go through a polarization filter at A's place causes B's photon to take the corresponding complementary polarization state. This happens instantaneously in the usual collapse assumptions and clearly violates Einstein causality, and thus directly contradicts the very foundations of QED, which is a very well tested theory. So it's much simpler and more natural not to make this unnecessary assumption but to take for granted what Born's rule tells you about the correlations of the entangled biphoton state, as detailed above. As I said, I don't see where you need a collapse to describe a (theoretically) pretty simple experiment via quantum theory.
 
  • #56
atyy said:
Yes, but is it also the case that every observable in ##\mathcal H'## corresponds uniquely to an observable in ##\mathcal H##? If it doesn't, then that would correspond to what physicists call a local observable, since the space ##\mathcal H## is "smaller" or "local" compared to ##\mathcal H'## which is "larger".
The space ##\mathcal H'## certainly contains more states than ##\mathcal H##, but is that a bad thing? (Even ##\mathcal H## usually contains pathological states that are never physically realized. Think of ##\sum_n \chi_{[n,n+2^{-n}]}\in L^2(\mathbb R)##, which is normalized and doesn't vanish at ##\infty## or some fancy nowhere continuous function or whatever crazy things mathematicians can come up with.) Every physical situation that can be described in ##\mathcal H## can also be described in ##\mathcal H'## and the results are equivalent. Thus it doesn't matter whether I choose to describe my physics in ##\mathcal H## or ##\mathcal H'##. Of course, one would usually choose ##\mathcal H##, since the description is easier there, but this choice is not physically relevant. The point of the example was not to promote the use of ##\mathcal H'##, but rather to explain that whether time evolution is unitary or not is only a matter of how one organizes the available information and not a physical principle. Physics only cares about probability conservation. Whether that happens unitarily or not depends on the way we choose to encode our states. In other words: We cannot "detect" the Hilbert space. It's analogous to the situation in general relativity, where the choice of coordinate system is not relevant either.

Is the theorem you are thinking about what is called the GNS construction on this page about the church of the larger Hilbert space: http://www.quantiki.org/wiki/The_Church_of_the_larger_Hilbert_space? If it is, then I do think it is equivalent to purifications, and the two "churches" of quantum theory. The other denomination is of course the church of the smaller Hilbert space: http://mattleifer.info/wordpress/wp-content/uploads/2008/11/commandments.pdf.
Yes, I was talking about the GNS construction, but my example was not strictly a GNS construction, but equivalent to a GNS construction in many situations. The theorem on that page seems to use tensor products instead of direct sums though, so I don't think it's the same thing, and it's not clear whether that also works in infinitely many dimensions. Contrary to what is said on the page, the GNS construction is a much more general result and it only yields Hilbert spaces that are equivalent to tensor products in special situations. Stinespring's theorem seems to be something completely different in general. In the ##\mathcal H'## I gave, one cannot reconstruct ##\rho## by taking a partial trace, for example.

What I mean is that if I know the density matrices at all time and the Born rule, there is still experimental data that exists that I cannot predict without collapse or some other postulate. For example, if I know the state is f(x) at t1 and g(x) at t2, I can calculate the probability of some being at x=y at t1, and the probability of being at x=z at t2. However, I cannot calculate the probability of being at z at t2 given that I was at y at t1, ie. I cannot calculate p(y,z) or p(z|y).
Well, if you can do it on ##\mathcal H##, then you can also do it on ##\mathcal H'##. No information gets lost in that transition.

In Holevo's book, it is in the chapter on repeated and continuous measurements, and the concept that is called "collapse" is dealt with by the concept of an instrument. In Dimock's book it is just mentioned, and there is not extensive discussion about it.
Thanks. I can't get hold of that book before monday, though. :H
 
  • #57
atyy said:
Yes, but is it also the case that every observable in ##\mathcal H'## corresponds uniquely to an observable in ##\mathcal H##? If it doesn't, then that would correspond to what physicists call a local observable, since the space ##\mathcal H## is "smaller" or "local" compared to ##\mathcal H'## which is "larger".
I also want to add the following to my previous post: For every state ##\Psi## in ##\mathcal H'## and every ##\epsilon>0##, there is a density matrix ##\rho## in ##\mathcal H## such that for all observables ##A##, the following holds: ##|\mathrm{Tr}_{\mathcal H}(\rho A) - \left<\Psi,A\Psi\right>_{\mathcal H'}|<\epsilon##. That means one can find a density matrix in ##\mathcal H## that has expectation values that are arbitrarily close to the expectation values in ##\mathcal H'##, so ##\mathcal H## and ##\mathcal H'## can't be distinguished physically, even if one uses a state in ##\mathcal H'## that doesn't directly come from a state in ##\mathcal H##. This is the content of Fell's theorem.

Edit: Oops. I just realized that you were asking about observables, not states. Well, observables are not something that one gets out of the mathematics, but rather something one has to put in. Just because I use a different Hilbert space, it doesn't mean that I can magically build more measurement devices than I could before. So the right order is to specify what I can possibly measure (for example position, momentum, variances, correlations, ...) and then build a theory that describes these measurements. I can do that in both ##\mathcal H## and ##\mathcal H'##, but both Hilbert spaces also contain a huge amount of excess operators that don't correspond to physically realizable measurement apparata. And both spaces contain equally many of them (as in cardinality of the set of such operators).

P.S. I'm now reading the paper you quoted.
 
  • #58
rubi said:
Thanks. I can't get hold of that book before monday, though. :H

I'll read your other comments too, will reply later. But before Monday, one can also try

http://arxiv.org/abs/0706.3526 (the "collapse" is defined by postulating an instrument as in Eq 3)
http://arxiv.org/abs/0810.3536 (the "collapse" is again defined by postulating an instrument in section 6.2, Eq 6.7 to Eq 6.12)

What is interesting about the presentation by Heinosaari and Ziman is that the argument leading up to Eq 6.7 almost seems to derive it from the Schroedinger equation and the Born rule. However, I think it requires the assumption that if B is measured a little later than A, the same result is obtained as if B and A were measured at the same time. This seems to be some sort of continuity argument, which is how the projection postulate was argued for by Dirac.
 
  • #59
vanhees71 said:
Where do you need a collapse here? I just measure, e.g., a spin component (to have the simple case of a discrete observable) and take notice of the result. If you have a filter measurement (the usually discussed Stern-Gerlach apparati are such), I filter out all partial beams I don't want and am left with a polarized beam in the spin state I want. That's all. I don't need a collapse. The absorption of the unwanted partial beams is due to local interactions of the particles with the absorber. There's no collapse!

Well, in an EPR setup, with experimenters Alice and Bob, Alice performs a measurement on one particle. She gets a result. You can explain that in terms of filters. But, immediately after the measurement, she can compute the probabilities for Bob's result. Now, that can't be due to filtering. She isn't filtering Bob's particles.
 
  • #60
No, it's due to the fact that A knows that the photons are entangled and thus what B must measure on his photon. Again: The result for Bob's photon is not caused by Alice's measurement. The preparation happened before, when the biphoton was created by some process (parametric down conversion).
 
  • #61
vanhees71 said:
This I don't understand. In Bell tests you start with an entangled pair of photons (biphotons), created by some local process, e.g., parametric down conversion in a crystal. They are mostly emitted back to back and you simply have to wait long enough to be able to detect the photons at large distances (making sure that nothing disturbs them, to prevent decoherence of the state). The single-photon polarizations are maximally random (unpolarized photons), but the 100% correlation between polarizations measured in the same direction is inherent in this state. So it's a property of the biphotons, and the correlations are thus "caused" by their production in this state and not due to the measurement of one of the single-photon polarization states. It doesn't matter whether the registration events by A and B are time-like or space-like separated. You'll always measure the correlation due to the entanglement, provided there was no disturbance in between that destroys the entanglement. This shows that no collapse is necessary in the analysis of these experiments (the same holds of course when you use non-aligned setups of the two polarizers at A's and B's places, as necessary to demonstrate the violation of Bell's inequality or variations of it).

Well, immediately before the first measurement, the most complete description of the state of the photons possible is:

Description 1:
  • Statement 1.A: The probability of the first photon passing a filter at angle A is 50%
  • Statement 1.B: The probability of the second photon passing a filter at angle B is 50%
  • Statement 1.C: The probability of both events happening is 0.5 cos^2(A-B)
Now you perform the first measurement, and the result is that the first photon does pass the filter oriented at angle A. Then immediately after the first measurement, but before the second measurement, the most complete description of the photons possible is:

Description 2:
  • Statement 2.A: The first photon is polarized at angle A
  • Statement 2.B: The probability of the second photon passing a filter at angle B is cos^2(A-B)
The "collapse" is simply a name for the transition from Description 1 to Description 2. So it's definitely there. The only issue is, what is the nature of this transition? Is it simply a change of knowledge in the mind of the experimenters? Or, is there some objective facts about the world that change?

Before either measurement is made, is Description 1 an objective fact about the world, or is it simply a statement about our knowledge? Same question for Description 2.

If you say that Description 1 and Description 2 are objective facts about the world, then it seems to me that collapse is a physical process.
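The probabilities in Descriptions 1 and 2 can be checked numerically from the entangled state ##(\left|HH\right> + \left|VV\right>)/\sqrt{2}##. A short numpy sketch (my own illustration; the helper names are mine):

```python
import numpy as np

# Two-photon entangled state (|HH> + |VV>)/sqrt(2) and linear-polarizer
# projectors; a numerical check of the probabilities in Descriptions 1 and 2.
def pol(angle):
    """Polarization unit vector at the given angle (H = 0, V = pi/2)."""
    return np.array([np.cos(angle), np.sin(angle)])

def proj(angle):
    v = pol(angle)
    return np.outer(v, v)

psi = (np.kron([1.0, 0.0], [1.0, 0.0]) + np.kron([0.0, 1.0], [0.0, 1.0])) / np.sqrt(2)

A, B = 0.4, 1.1                                    # arbitrary filter angles
I2 = np.eye(2)

# Statements 1.A / 1.B: single-photon pass probabilities are 50%.
p_A = psi @ np.kron(proj(A), I2) @ psi
p_B = psi @ np.kron(I2, proj(B)) @ psi
assert np.isclose(p_A, 0.5) and np.isclose(p_B, 0.5)

# Statement 1.C: joint pass probability is 0.5 cos^2(A - B).
p_AB = psi @ np.kron(proj(A), proj(B)) @ psi
assert np.isclose(p_AB, 0.5 * np.cos(A - B)**2)

# Description 2: after photon 1 passes the filter at A, the collapsed
# (renormalized) state predicts photon 2 passes at B with cos^2(A - B).
collapsed = np.kron(proj(A), I2) @ psi
collapsed /= np.linalg.norm(collapsed)
p_B_given_A = collapsed @ np.kron(I2, proj(B)) @ collapsed
assert np.isclose(p_B_given_A, np.cos(A - B)**2)
```

The numbers themselves are uncontroversial; the debate in the thread is about what, if anything, changes physically in the transition from the first set of statements to the second.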
 
  • #62
Descriptions 1 and 2 are different experiments! I thus relabel "Description" as "Experiment". No wonder that you get different results. Of course, the state of a system depends on the preparation procedure, which is (in my understanding of quantum theory) a tautology because the state is defined as an equivalence class of preparation procedures (see also this nice book by Strocchi, where this is formalized using the ##C^*## approach to observables).

Experiment 1 considers all biphotons, while Experiment 2 selects only those biphotons where A finds her photon to be polarized at angle A. So the probabilities (relative frequencies!) refer to different ensembles.

Note that Experiment 2 can be achieved even post factum, i.e., if you make precise enough timing at both A and B, you have the information about which B photon belongs to which A photon, i.e., you know which photons belonged to the same biphoton. Then you can make the selection necessary to do Experiment 2 after everything has long happened to the photons. This is the famous post-selection thing, which of course also becomes somewhat "spooky" when one (in my opinion falsely) interprets the measurement at A as the cause for the outcome at B, and not (in my opinion correctly) the preparation in an entangled state as the cause for the correlations described by it.
 
  • #63
vanhees71 said:
No, it's due to the fact that A knows that the photons are entangled and thus what B must measure on his photon.

Saying that "the photons are entangled" is just a description of the initial state of the two photons (my Description 1) above. "Collapse" is about the transition from Description 1 to Description 2.
 
  • #64
atyy said:
What is the status of domain wall fermions and the standard model? Can a lattice standard model be constructed with domain wall fermions, at least in principle, even if it is too inefficient to simulate? Or is the answer still unknown?
I don't think there is a necessity for chiral fermions on the lattice. In the standard model, fermions come in pairs of Dirac particles. These can be put on the lattice.

The point is, of course, that one wants some gauge groups acting only on chiral components. And this seems impossible to do exactly on a simple lattice model. So strange things like domain wall fermions are invented. In fact, there is no need for them - use an approximate gauge symmetry on the lattice - anyway, these gauge fields are massive. Not renormalizable? Who cares, the long distance limit rules out the non-renormalizable elements anyway. (Of course, one has to start with the old Dirac-Fermi approach to gauge field quantization, because the Gupta-Bleuler approach depends on exact gauge symmetry to get rid of negative probabilities.)
 
  • #65
stevendaryl said:
Saying that "the photons are entangled" is just a description of the initial state of the two photons (my Description 1) above. "Collapse" is about the transition from Description 1 to Description 2.

I think vanhees71 considers that one can add Description 2 to Description 1. To do so, one would modify Description 2 to be conditional: "If Alice measures the photon to be polarized at angle A ..."

All that is fine. But what he doesn't realize is that whatever one does, Einstein causality is gone, and Einstein causality is either not meaningful (if the variables of QM are not real) or violated by QED (if the variables of QM are real).
 
  • #66
vanhees71 said:
Again: The result for Bob's photon is not caused by Alice's measurement.
Sure, but only a physical collapse contradicts this; a non-physical collapse follows from the violations of Bell's inequalities and respects the prohibition of FTL causation.
vanhees71 said:
No it's due to the fact that A knows that the photons are entangled and thus what B must measure at his photon.

It seems to me entanglement correlations are empirical embodiments of non-physical collapse.
 
  • #67
What is a "non-physical collapse"? Either there is something collapsing in the real world when a measurement is made or not! In my opinion there's not the slightest evidence for anything collapsing when we observe something.
 
  • #68
stevendaryl said:
Saying that "the photons are entangled" is just a description of the initial state of the two photons (my Description 1) above. "Collapse" is about the transition from Description 1 to Description 2.
But where is anything "collapsing" here? A measures her photon's polarization. If she finds it to be polarized in the A-direction, she considers the probability that B finds his photon polarized in the B-direction; if her photon is absorbed, she doesn't consider anything further. I.e., she uses a sub-ensemble of the complete ensemble originally prepared, which is another preparation procedure, i.e., a different measurement than the one done in Experiment 1, where all photon pairs are considered. Nothing has collapsed; it's just the choice of the ensemble based on Alice's (local!) measurement of her photon's polarization. Everything is calculated by the usual rules of probability theory from the given initial biphoton state. There's no need to assume an instantaneous collapse of B's photon's state by A's measurement on her photon to explain all the probabilistic properties of the two experiments under consideration.
 
  • #69
vanhees71 said:
You keep repeating this every time, but I've not seen a single example for such an experimental observation, which would imply that either Einstein causality or QT must be wrong. Before I believe either of this, I need a very convincing experimental evidence for a collapse!

How do you obtain a wave function as the initial state? You make a measurement; it has a value; that means you have obtained a state with the corresponding eigenstate as the wave function. Without collapse there would be no method of state preparation in quantum theory.

vanhees71 said:
Either you believe in the existence of a collapse, or in Einstein causality and locality.
Anyway, to believe in Einstein causality is nonsensical (causality without Reichenbach's principle of common cause is not causality; one could name it "correlaity" or so). But if you accept common cause, then you need FTL causal influences to explain the violations of Bell's inequalities. So the collapse is not important for this at all; the violation of Bell's inequality is the point, and this point is quite close to loophole-free experimental validation.

vanhees71 said:
The most successful model ever, the Standard Model of elementary particle physics, obeys both.
No, it obeys only what I have named "correlaity" - instead of claims about causality it contains only claims about correlations.
 
  • #70
vanhees71 said:
But where is anything "collapsing" here?

Well, it has to do with whether you think that my Description 1 and Description 2 are objective facts about the world, or whether they are just states of somebody's knowledge. If they are objective facts about the world, then there is a physical transition from the situation described by Description 1 to the situation described by Description 2.

On the other hand, if Description 1 and Description 2 are not objective facts about the world, then that raises other thorny questions. What IS an objective fact about the world? If you say it's only the results of measurements, then that's a little weird, because a measurement is just a macroscopic interaction. Why should facts about macroscopic states be objective if facts about microscopic states are not?

A measures her photon's polarization. If she finds it to be polarized in the A-direction, she considers the probability that B finds his photon to be polarized in the B-direction; if her photon is absorbed, she doesn't consider anything further, i.e., she uses a sub-ensemble of the complete ensemble originally prepared, which is another preparation procedure.

It seems to me that this business of "preparing a sub-ensemble" is equivalent to invoking collapse.
 
  • #71
vanhees71 said:
What is a "non-physical collapse"? Either there is something collapsing in the real world when a measurement is made or not!

I'm not sure who you are responding to, but in classical probability, there is a situation analogous to quantum entanglement, and something analogous to collapse, but it's clearly NOT physical. I have a pair of shoes, and randomly select one to put in a box and send to Alice, and another one to put in a box to send to Bob. Before Alice opens her box, she would describe the situation as "There is a 50/50 chance of my getting a left shoe or a right shoe. There is also a 50/50 chance of Bob getting either shoe." After opening the box and finding a left shoe, she would describe the situation as "I definitely have the left shoe, and Bob definitely has the right shoe". So, the probability distribution "collapses" when she opens the box.

But that's clearly not physical. The box contained a left shoe before she opened it, she just didn't know it. So the probabilities reflect her knowledge, not the state of the world.

In an EPR-type experiment, the analogous explanation would be that the photon was polarized at angle A before Alice detected it, she just didn't know it. But that interpretation of what's going on is contradicted by Bell's theorem. To me, talking about "ensembles" and "filtering a sub-ensemble" is another way of talking about hidden variables, so it seems equally inconsistent with Bell's theorem.
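The classical shoe story can be made concrete with a quick simulation (my own sketch; the function and variable names are not from the thread). The point it illustrates: the outcome is fixed at packing time, so opening the box only updates knowledge.

```python
import random

def run_trials(n=10_000, seed=1):
    """Shoe-box example: the pair is split at packing time, so each
    outcome is fixed before either box is opened.  Returns the number
    of trials where Alice found a left shoe, and how many of those
    had the right shoe in Bob's box."""
    rng = random.Random(seed)
    n_left = n_match = 0
    for _ in range(n):
        alice = rng.choice(["left", "right"])
        bob = "right" if alice == "left" else "left"  # fixed at packing time
        if alice == "left":               # Alice opens her box...
            n_left += 1
            n_match += (bob == "right")   # ...but Bob's shoe never changed
    return n_left, n_match
```

Conditioned on Alice finding a left shoe, Bob's box holds the right shoe in every single trial: the jump from 50/50 to certainty is pure bookkeeping about a fact settled at packing time, which is exactly what Bell's theorem forbids as an explanation in the photon case.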
 
  • #72
vanhees71 said:
What is a "non-physical collapse"? Either there is something collapsing in the real world when a measurement is made or not! In my opinion there's not the slightest evidence for anything collapsing when we observe something.
The wave function is what is collapsing in the usual account, but unless you follow an interpretation with collapse that considers wave functions as physical entities, you probably see wave functions just as mathematical tools, and mathematical tools don't "collapse" in any real-world sense. So you are left with the concept of a non-physical collapse: just non-unitary evolution (you talked about it in #29, remember?) that is fully compatible with microcausality ((anti)commutation of spacelike-separated fields).
 
  • #73
stevendaryl said:
Saying that "the photons are entangled" is just a description of the initial state of the two photons (my Description 1) above. "Collapse" is about the transition from Description 1 to Description 2.
Yes, and thus it's not a physical process named collapse, but the mere adaptation of the state by A due to information gained from the outcome of her measurement on her photon. It's epistemic, not ontological, to put it in this philosophical language (which I personally don't like very much, because it's not very sharply defined).
 
  • #74
Ilja said:
How do you obtain a wave function as the initial state? You make a measurement, it has a value, and that means you have obtained a state with the corresponding eigenstate as the wave function. Without collapse there would be no method of state preparation in quantum theory.
I associate the initial state (not wave function, because there's no sensible description of photons as wave functions) to the system under consideration due to the preparation procedure. I don't need a collapse but a laser and an appropriate birefringent crystal for parametric down conversion. Of course, there's a filtering involved to filter out the entangled photon pairs.

I don't know of any paper deriving this photon-pair production process from first principles. It's of course experimental evidence ensuring that you prepare these states. For the effective theory describing it see the classical paper

Hong, C. K., Mandel, L.: Theory of parametric frequency down conversion of light, Phys. Rev. A 31, 2409, 1985
http://dx.doi.org/10.1103/PhysRevA.31.2409
 
  • #75
vanhees71 said:
Yes, and thus it's not a physical process named collapse, but the mere adaptation of the state by A due to information gained from the outcome of her measurement on her photon. It's epistemic, not ontological, to put it in this philosophical language (which I personally don't like very much, because it's not very sharply defined).

I would say that it's definitely NOT that. I suppose there are different interpretations possible, but the way I read Bell's theorem is that the purely epistemic interpretation of the wave function is not viable.

Once again, I want to point out the implications of the claim that the updating is purely epistemic. Again, we assume that both Alice and Bob have their filters oriented at the same angle, A. We ask what Alice knows about the state of Bob's photon. Immediately before measuring her photon's polarization, the most that Alice knows is: "There is a 50/50 chance that Bob's photon has polarization A". Immediately afterward, she knows "There is a 100% chance that Bob's photon has polarization A".

It seems to me that if you want to say that the change is purely epistemic, then that means that the state of Bob's photon wasn't changed by Alice's measurement, only Alice's information about it changed. Okay, that's fine. But let's go through the reasoning here:
  1. After Alice's measurement, Bob's photon has definite polarization state A.
  2. Alice's measurement did not change the state of Bob's photon.
  3. Therefore, Bob's photon had definite polarization state A BEFORE Alice's measurement.
So it seems to me that assuming that measurements are purely epistemic implies that photons have definite (but unknown) polarizations even before they are measured. But that's a "hidden variables" theory of the type ruled out by Bell's theorem.
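That last step can be checked quantitatively. A sketch (my own; the setting angles are the standard optimal CHSH choices for photons, and I use the same-polarization correlation convention of the examples in this thread): QM predicts the correlation E(a, b) = cos 2(a − b), which violates the CHSH bound, while any model assigning each photon definite ±1 answers for every setting cannot exceed 2.

```python
import itertools, math

# QM prediction for the polarization correlation of an entangled pair
# (same-polarization convention, as in the examples above).
def E_qm(a, b):
    return math.cos(2 * (a - b))

# Standard optimal CHSH setting angles for photons (radians).
a, a2, b, b2 = 0.0, math.pi / 4, math.pi / 8, 3 * math.pi / 8
S_qm = abs(E_qm(a, b) - E_qm(a, b2) + E_qm(a2, b) + E_qm(a2, b2))
# S_qm = 2*sqrt(2), about 2.83

# If each photon carried a definite +/-1 answer for each setting before
# measurement, the CHSH value would be bounded by 2 for every assignment:
S_hv = max(abs(A1 * B1 - A1 * B2 + A2 * B1 + A2 * B2)
           for A1, A2, B1, B2 in itertools.product([-1, 1], repeat=4))
# S_hv = 2
```

So 2√2 > 2 is precisely the gap that rules out the "definite but unknown polarizations" picture, given locality.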
 
Last edited:
  • #76
stevendaryl said:
So it seems to me that assuming that measurements are purely epistemic implies that photons have definite (but unknown) polarizations even before they are measured. But that's a "hidden variables" theory of the type ruled out by Bell's theorem.
I disagree. What about hidden nonlocal influences?

Alice's measurement makes a local random choice of the direction; this choice is somehow transferred to Bob's particle, which changes its hidden internal state correspondingly. This would be a non-local interaction in reality, of course, but not excluded by Bell's theorem. And the wave function could be, nonetheless, purely epistemic.
 
  • #77
Ilja said:
I disagree. What about hidden nonlocal influences?

Yes, you're right. I meant making the auxiliary assumption of locality.

Alice's measurement makes a local random choice of the direction; this choice is somehow transferred to Bob's particle, which changes its hidden internal state correspondingly. This would be a non-local interaction in reality, of course, but not excluded by Bell's theorem. And the wave function could be, nonetheless, purely epistemic.
 
  • #78
Non-physical collapse, the way I see it, is equivalent to a version of decoherence that, contrary to the usual account, cannot be made reversible even in principle, i.e., there is no possibility of combining system plus environment in any meaningful way. This is what an intrinsic cut in QM is, whether the cut refers to system/apparatus, system/environment, microscopic/macroscopic degrees of freedom in coarse-graining, or probabilistic/deterministic evolution. This should be common to any interpretation that takes single measurements seriously.
 
  • #79
TrickyDicky said:
Non-physical collapse, the way I see it, is equivalent to a version of decoherence that, contrary to the usual account, cannot be made reversible even in principle, i.e., there is no possibility of combining system plus environment in any meaningful way. This is what an intrinsic cut in QM is, whether the cut refers to system/apparatus, system/environment, microscopic/macroscopic degrees of freedom in coarse-graining, or probabilistic/deterministic evolution. This should be common to any interpretation that takes single measurements seriously.

I disagree. The most detailed consideration of the measurement process which is known is that of de Broglie-Bohm theory. So, to say that it does not take measurements seriously would be unjust. But it does not have a cut.

It has an effective collapse: putting the trajectory of the measurement device into the wave function of device and system defines the effective wave function of the system. You can do this at every moment, before, after and during the measurement, and obtain a nice picture of a non-Schroedinger evolution for the collapsing effective wave function. But where to make the cut between device and system remains your free choice.
 
  • #80
stevendaryl said:
I would say that it's definitely NOT that. I suppose there are different interpretations possible, but the way I read Bell's theorem is that the purely epistemic interpretation of the wave function is not viable.

Once again, I want to point out the implications of the claim that the updating is purely epistemic. Again, we assume that both Alice and Bob have their filters oriented at the same angle, A. We ask what Alice knows about the state of Bob's photon. Immediately before measuring her photon's polarization, the most that Alice knows is: "There is a 50/50 chance that Bob's photon has polarization A". Immediately afterward, she knows "There is a 100% chance that Bob's photon has polarization A".

It seems to me that if you want to say that the change is purely epistemic, then that means that the state of Bob's photon wasn't changed by Alice's measurement, only Alice's information about it changed. Okay, that's fine. But let's go through the reasoning here:
  1. After Alice's measurement, Bob's photon has definite polarization state A.
  2. Alice's measurement did not change the state of Bob's photon.
  3. Therefore, Bob's photon had definite polarization state A BEFORE Alice's measurement.
So it seems to me that assuming that measurements are purely epistemic implies that photons have definite (but unknown) polarizations even before they are measured. But that's a "hidden variables" theory of the type ruled out by Bell's theorem.
No, that's not what's implied, although the "change of state" due to A's measurement is in my opinion indeed purely epistemic. Before any measurement, both A and B simply have unpolarized photons, which however are known, due to the preparation procedure, to be entangled in a biphoton. Let's write down the math, because that helps here. I simplify (somewhat too much) by noting only the polarization states of the photons, and restrict to the case that both measure the polarization in the same direction.

So initially we have the two-photon polarization state
$$|\Psi_0 \rangle=\frac{1}{\sqrt{2}} (|H V \rangle-|VH \rangle).$$
The single photon states are given by tracing out the other photon respectively, and both Alice and Bob describe it by
$$\hat{\rho}_{\text{Alice}}=\frac{1}{2} \mathbb{1}, \quad \hat{\rho}_{\text{Bob}}=\frac{1}{2} \mathbb{1}.$$
Now we assume that Alice measures the polarization of her photon and finds that it is horizontally polarized, while nothing happens with Bob's photon, which may be detected very far away from Alice at a space-like distance, provided that QED microcausality holds (which I think is a very weak assumption given the great success of QED). Then the state after this measurement is described (for Alice!) by the pure two-photon polarization state (which I leave unnormalized to store the probability for this outcome conveniently in the norm of the new representing state ket; the state itself is of course the ray):
$$|\Psi_1 \rangle = |H \rangle \langle H| \otimes \mathbb{1}|\Psi_0 \rangle=\frac{1}{\sqrt{2}} |H V \rangle.$$
This, of course, happens with the probability
$$\|\Psi_1 \|^2=1/2,$$
which was already clear from the reduced state for A's single photon derived above.

Now, what Bob finds is with probability 1/2 H and with probability 1/2 V, because he cannot know (faster than allowed by the speed of light via communicating with Alice) what Alice has found. So Bob will still describe his single photon's state as ##\hat{\rho}_{\text{Bob}}=\frac{1}{2} \mathbb{1}##. Nothing has changed for Bob, and according to the usual understanding of relativistic causality he cannot know more about his photon before measuring its polarization unless he exchanges information about Alice's result, which (again using the standard interpretation of relativistic causality) he can only obtain via some signal from Alice, arriving at the speed of light and not quicker.

Alice knows after her measurement that Bob must find a vertically polarized photon, i.e., the conditional probability given Alice's result is 100% V polarization for Bob's photon. That this is true can be verified after exchanging the measurement protocols between Alice and Bob, given that via precise timing it is possible to know which of the photons measured by A and B belong to one biphoton. That's why Bob can "post-select" his photons by only considering the roughly 50% of photons where Alice found an H-polarized photon, and then finds 100% V-polarized ones.

This clearly shows that the notion of state is an epistemic one in this interpretation, because A and B describe the same situation with different states, depending on their knowledge. Note that there can never be contradictions between these two descriptions: if A didn't know that B's photon is entangled with hers in the way described by ##|\Psi_0 \rangle##, A would never be able to say that B finds V polarization with 100% probability when she has found H polarization. This is what's stated in the linked-cluster theorem, and this of course holds for any local microcausal relativistic QFT. Whether, on the other hand, a theory obeying the linked-cluster theorem (which is the minimum assumption you must make to stay in accordance with usual relativistic causality) must necessarily be such a local microcausal relativistic QFT is not clear to me, and I've not seen any attempt to prove this (see Weinberg, Quantum Theory of Fields, vol. 1).

As a "minimal interpreter" I stay silent about the question whether or not there is an influence of Alice's measurement on Bob's photon, as already mentioned by Ilja above. I only say that there is no such influence in standard QED, which is by construction a local microcausal relativistic QFT. Whether there is an extension of QT in which you can describe non-local interactions, in the sense of a non-local deterministic hidden-variable theory consistent with the relativistic space-time structure, I don't know; at least I've not seen a convincing one yet in the published literature. But a "naive" instantaneous collapse assumption is for sure at odds with the relativistic space-time description, and it is, as the above argument (hopefully convincingly) shows, not necessary to understand the probabilistic outcomes according to QT.
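The arithmetic of the post above can be checked numerically. A sketch with numpy (variable names are mine): build ##|\Psi_0\rangle##, verify that Bob's reduced state is ##\frac{1}{2}\mathbb{1}##, apply Alice's unnormalized projection, and read off the conditional probability for Bob.

```python
import numpy as np

H, V = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Entangled biphoton |Psi_0> = (|HV> - |VH>)/sqrt(2), Alice's photon first.
psi0 = (np.kron(H, V) - np.kron(V, H)) / np.sqrt(2)

# Reduced state of Bob's photon: trace out Alice's index.
rho4 = np.outer(psi0, psi0.conj()).reshape(2, 2, 2, 2)
rho_bob = np.einsum('kikj->ij', rho4)       # equals (1/2) * identity

# Alice finds H: apply (|H><H| x 1) without renormalizing, as in the post.
P_H = np.kron(np.outer(H, H), np.eye(2))
psi1 = P_H @ psi0
p_alice_H = np.vdot(psi1, psi1).real        # ||Psi_1||^2 = 1/2

# Conditional state for Bob, given Alice's H result: V with certainty.
p_bob_V = abs(np.vdot(np.kron(H, V), psi1 / np.sqrt(p_alice_H))) ** 2  # = 1
```

This reproduces the three numbers in the post: the unpolarized reduced state, the probability 1/2 for Alice's H outcome, and the conditional certainty of V for Bob.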
 
  • #81
vanhees71 said:
As a "minimal interpreter" I stay silent about the question whether or not there is an influence of Alice's measurement on Bob's photon, as already mentioned by Ilja above. I only say that there is no such influence in standard QED, which is by construction a local microcausal relativistic QFT. Whether there is an extension of QT in which you can describe non-local interactions, in the sense of a non-local deterministic hidden-variable theory consistent with the relativistic space-time structure, I don't know; at least I've not seen a convincing one yet in the published literature. But a "naive" instantaneous collapse assumption is for sure at odds with the relativistic space-time description, and it is, as the above argument (hopefully convincingly) shows, not necessary to understand the probabilistic outcomes according to QT.

It has nothing to do with a minimal interpretation. Any relativistic QFT is not consistent with the classical meaning of "relativistic space-time structure" or "Einstein causality". Relativistic QFT does not allow faster than light signalling of classical information, and that is the meaning of "local microcausal relativistic QFT".
 
Last edited:
  • Like
Likes TrickyDicky and Ilja
  • #82
Microcausality means that local observables commute at space-like distances, and this implies that there is no action at a distance on the quantum level. In our example, the local interaction of A's photon with her polarizer doesn't affect instantaneously Bob's photon.
 
  • #83
vanhees71 said:
Microcausality means that local observables commute at space-like distances, and this implies that there is no action at a distance on the quantum level. In our example, the local interaction of A's photon with her polarizer doesn't affect instantaneously Bob's photon.
No, QFT simply does not tell us anything about this question.

QFT in the minimal interpretation is not a realistic theory; thus it makes no claim that the polarizer instantaneously affects Bob's photon, but also no claim that it doesn't.
 
  • #84
According to standard QFT the Hamilton density commutes with any other local observable at any spacelike separated argument. Thus a local interaction doesn't affect any observable instantaneously (or at speeds faster than light).

Whether or not there are non-local deterministic theories (which I think is what's meant by "realistic theories" by philosophers) which are as successful as standard QFT, I don't know.
 
  • Like
Likes TrickyDicky
  • #85
vanhees71 said:
Now, what Bob finds is with probability 1/2 H and with probability 1/2 V, because he cannot know (faster than allowed by the speed of light via communicating with Alice) what Alice has found. So Bob will still describe his single-photon's state as ##\hat{\rho}_{\text{Bob}}=1/2 \mathbb{1}##. Nothing has changed for Bob,

Okay, but Alice knows what result Bob will get, with 100% certainty, before Bob makes the measurement. So, from her point of view, Bob's information is incomplete. The more complete story is that he will definitely get the same polarization as Alice (assuming their filters are aligned).

So if there is such a thing as "the objective state of Bob's photon", then that state is NOT 50/50 chance of passing Bob's filter.

You could deny that there is such a thing as the state of Bob's photon. But that's pretty weird, too. Alice can certainly reason as if Bob's photon is in a definite state of polarization, and that reasoning gives correct results.

I don't see how it makes sense to say that Alice's updating is purely epistemic.
 
  • #86
But that's very common in probability theory. It's just Bayes' formula for conditional probability. This is no more mysterious in QT than in any "classical" probabilistic description.

This example for me makes it very clear that it's purely epistemic, because Alice's measurement updates her information and thus she changes her description of the state of the system. Bob doesn't have this information and thus stays with the description he assigns to the situation due to his knowledge. Physically nothing has changed for his photon by Alice's measurement. So the association of the state is determined by the preparation procedure and can vary for Alice and Bob due to the different information available to them about this system. This for me clearly shows the epistemic character of probabilistic descriptions (not restricted to QT; the difference between QT and classical probabilistic models are the strong correlations described by entangled states, which are stronger than ever possible in classical deterministic local theories, as shown by Bell).
 
  • #87
stevendaryl said:
Okay, but Alice knows what result Bob will get, with 100% certainty, before Bob makes the measurement. So, from her point of view, Bob's information is incomplete. The more complete story is that he will definitely get the same polarization as Alice (assuming their filters are aligned).

So if there is such a thing as "the objective state of Bob's photon", then that state is NOT 50/50 chance of passing Bob's filter.

You could deny that there is such a thing as the state of Bob's photon. But that's pretty weird, too. Alice can certainly reason as if Bob's photon is in a definite state of polarization, and that reasoning gives correct results.

I don't see how it makes sense to say that Alice's updating is purely epistemic.

Just a little expansion on this:

After Alice measures her photon to be polarized horizontally, she would describe Bob's photon as being in the PURE state ##|H\rangle##. As you say, Bob would describe his own photon as being in the mixed state ##\rho = \frac{1}{2}(|H\rangle\langle H| + |V\rangle\langle V|)##. But this disagreement is completely explained by saying that Bob's photon is REALLY in state ##|H\rangle\langle H|##, he just doesn't know it. Density matrices reflect both quantum superpositions and classical uncertainty (due to lack of information).

You (vanhees71) say that Bob's photon is still in the state ##\rho = \frac{1}{2}(|H\rangle\langle H| + |V\rangle\langle V|)##, even after Alice finds her photon to be horizontally polarized. That doesn't make sense to me. There is zero probability of Bob detecting polarization V, while his density matrix would say it's 1/2. His density matrix is wrong (or is less informative than the one Alice is using for Bob's photon).
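The disagreement between the two density matrices is easy to exhibit numerically (my own sketch; variable names are mine): they give different predictions for the probability that Bob's photon is found V-polarized, in the aligned-filters scenario where Alice has found H.

```python
import numpy as np

H, V = np.array([1.0, 0.0]), np.array([0.0, 1.0])
P_V = np.outer(V, V)                                   # projector onto V

rho_bob_own = 0.5 * (np.outer(H, H) + np.outer(V, V))  # Bob's own description
rho_alice_for_bob = np.outer(H, H)                     # Alice's, after her H result

p_own = np.trace(rho_bob_own @ P_V).real               # 1/2
p_alice = np.trace(rho_alice_for_bob @ P_V).real       # 0
```

Both are computed from the same Born rule ##p = \mathrm{Tr}(\rho P_V)##; only the state assignment differs, which is exactly the point in dispute.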
 
Last edited:
  • #88
Ilja said:
I disagree. The most detailed consideration of the measurement process which is known is that of de Broglie-Bohm theory. So, to say that it does not take measurements seriously would be unjust. But it does not have a cut.

It has an effective collapse: putting the trajectory of the measurement device into the wave function of device and system defines the effective wave function of the system. You can do this at every moment, before, after and during the measurement, and obtain a nice picture of a non-Schroedinger evolution for the collapsing effective wave function. But where to make the cut between device and system remains your free choice.
It does have a cut in the sense I described above: pilot wave/particle trajectories.
 
  • #89
vanhees71 said:
But that's very common in probability theory. It's just Bayes' formula for conditional probability. This is no more mysterious in QT than in any "classical" probabilistic description.

No, it's not the same. In classical probability, there is a distinction between what is true and what my knowledge of the truth is. Someone randomly puts a left shoe into one box and a right shoe into the other box. One box is sent to Alice, and the other box is sent to Bob. When Alice opens her box, she finds a left shoe. She updates her epistemic probabilities for Bob's box to be 100% chance of a right shoe. There's clearly no nonlocal influence going on. HOWEVER, Alice knows that Bob actually had a right shoe BEFORE she opened the box. She just didn't know it until she opened her box.

In the EPR experiment, Alice finds out that Bob's photon has polarization H. If it's purely epistemic updating of Alice's information, that means that Bob's photon had polarization H BEFORE she measured her photon.

You can't have it both ways. If it's purely epistemic, then the objective state of the photon cannot be changed by Alice's updating. If the objective state after updating is H, then it must have been H beforehand. I don't see how it could be otherwise.

I guess you could say that the state H that Alice deduces for Bob's photon isn't objective, it's subjective, for Alice only. But that's hard to maintain. Would you then say that the polarization state of a photon is NEVER objective?

As Einstein, Rosen and Podolsky said, if you can predict the result of a future measurement with 100% accuracy, it sure seems like it's something objective.
 
  • #90
vanhees71 said:
Microcausality means that local observables commute at space-like distances, and this implies that there is no action at a distance on the quantum level. In our example, the local interaction of A's photon with her polarizer doesn't affect instantaneously Bob's photon.

No, that is wrong (well, this particular quote is ambiguous, but I'm taking it in the context of your earlier remarks on EPR). The commutation of spacelike-separated observables says that there is no faster than light transfer of classical information. That is a different issue from Einstein causality, which means that the nonlocal correlations are entirely explained by each event having a cause in its past light cone. Relativistic quantum field theory means that Einstein causality is either empty or false.
 
  • #91
I think vanhees' point is that the statistical information about the 100% correlation is already contained in the quantum state ##\left|\Psi\right>=\left|HV\right>-\left|VH\right>## and one doesn't need to collapse it to extract that information: ##\left<HH|\Psi\right>=\left<VV|\Psi\right>=0## (and so on). We just prepare the state ##\left|\Psi\right>## and repeat the experiment a thousand times, and the statistics will agree with the QM predictions.

Also, I think we should be careful with the words correlation and causation. QM predicts non-local correlation. That's different from non-local causation. Correlation doesn't imply causation, even in the case of 100% correlation. This is just a logical leap that cannot be made. It is also true that "the sun will rise tomorrow" will be 100% correlated with "humans have two legs" for example, but that doesn't mean that one causes the other or that there is a common cause.
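The claim that the perfect (anti-)correlation is already stored in the state can be read off directly from the amplitudes (a sketch, with the state normalized; variable names are mine):

```python
import numpy as np

H, V = np.array([1.0, 0.0]), np.array([0.0, 1.0])
psi = (np.kron(H, V) - np.kron(V, H)) / np.sqrt(2)

amp = lambda a, b: np.vdot(np.kron(a, b), psi)   # amplitude <ab|Psi>

p_HH = abs(amp(H, H)) ** 2   # 0: HH coincidences never occur
p_VV = abs(amp(V, V)) ** 2   # 0: VV coincidences never occur
p_HV = abs(amp(H, V)) ** 2   # 1/2
p_VH = abs(amp(V, H)) ** 2   # 1/2
```

All four coincidence probabilities follow from ##\left|\Psi\right>## alone, with no collapse step anywhere in the computation.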
 
  • #92
rubi said:
I think vanhees' point is that the statistical information about the 100% correlation is already contained in the quantum state ##\left|\Psi\right>=\left|HV\right>-\left|VH\right>## and one doesn't need to collapse it to extract that information: ##\left<HH|\Psi\right>=\left<VV|\Psi\right>=0## (and so on). We just prepare the state ##\left|\Psi\right>## and repeat the experiment a thousand times, and the statistics will agree with the QM predictions.

Also, I think we should be careful with the words correlation and causation. QM predicts non-local correlation. That's different from non-local causation. Correlation doesn't imply causation, even in the case of 100% correlation. This is just a logical leap that cannot be made. It is also true that "the sun will rise tomorrow" will be 100% correlated with "humans have two legs" for example, but that doesn't mean that one causes the other or that there is a common cause.

The part about vanhees71's point that is wrong is that he is using the EPR objection to collapse. EPR's version of causality is not consistent with quantum field theory.

Furthermore, as long as we use the Schroedinger picture, the collapse is how we extract the information that is contained in the wave function in order to predict the nonlocal correlations.

Of course, one doesn't have to accept the wave function or the collapse as real, so one may say that Einstein causality is empty in quantum field theory. If one accepts the wave function and collapse as real, then Einstein causality is violated. There is no choice of saying that quantum field theory fulfills Einstein causality.
 
Last edited:
  • #93
vanhees71 said:
Whether or not there are non-local deterministic theories (which I think is what's meant by "realistic theories" by philosophers) which are as successful as standard QFT, I don't know.
First, no: classical stochastic theories are also realistic, and Nelsonian stochastics is an example. Then, the first example of a deterministic theory for the EM field was given already in Bohm's original paper. And, given that such theories are, as interpretations of QT in the particular domain, equivalent to QT in this domain, there is no difference in success between a QFT and a QFT in a realistic interpretation.
 
  • #94
atyy said:
[1)] The part about vanhees71's point that is wrong is that he is using the EPR objection to collapse. EPR's version of causality is not consistent with quantum field theory.

[2)] Furthermore, as long as we use the Schroedinger picture, the collapse is how we extract the information that is contained in the wave function in order to predict the nonlocal correlations.

[3)] Of course, one doesn't have to accept the wave function or the collapse as real, so one may say that Einstein causality is empty in quantum field theory. If one accepts the wave function and collapse as real, then Einstein causality is violated. There is no choice of saying that quantum field theory fulfills Einstein causality.

Ad 1) What's wrong?

Ad 2) You cannot argue with a specific picture of time evolution, because all are equivalent (modulo mathematical quibbles a la Haag's theorem ;-)).

Ad 3) This I don't understand. The usual local microcausal QFTs are precisely constructed such that they fulfill Einstein causality (among other things it makes the S-matrix with its time-ordered products of field operators manifestly covariant wrt. special orthochronous Poincare transformations). Last but not least, if the collapse isn't considered real, it's just a sloppy abbreviation for what the minimal interpretation states more carefully, and there's nothing to argue about it anymore. Then all our debates are pretty empty ;-).
 
  • #95
Ilja said:
First, no: classical stochastic theories are also realistic, and Nelsonian stochastics is an example. Then, the first example of a deterministic theory for the EM field was given already in Bohm's original paper. And, given that such theories are, as interpretations of QT in the particular domain, equivalent to QT in this domain, there is no difference in success between a QFT and a QFT in a realistic interpretation.
Ok, as you well know, I don't understand what philosophers mean by "realistic", particularly as it seems as if there are as many notions of this word as there are philosophers. Then, if everything is "solved" with Bohm's original paper, why is it then always stated, also by followers of the Bohmian interpretation, that there are problems with Bohm and relativistic QFT?
 
  • #96
stevendaryl said:
No, it's not the same. In classical probability, there is a distinction between what is true and what my knowledge of the truth is. Someone randomly puts a left shoe into one box and a right shoe into the other box. One box is sent to Alice, and the other box is sent to Bob. When Alice opens her box, she finds a left shoe. She updates her epistemic probabilities for Bob's box to be 100% chance of a right shoe. There's clearly no nonlocal influence going on. HOWEVER, Alice knows that Bob actually had a right shoe BEFORE she opened the box. She just didn't know it until she opened her box.
But in the quantum case A also knew beforehand that the two photons are in this entangled state. That is as good as in the classical example. The only difference is that in classical physics you can't have such correlations. It's clear that the single-photon polarizations are completely undetermined before A's measurement according to standard QT, while the single-shoe states in the classical example are always definite, but there's no difference concerning a collapse between the two ensembles. In both cases the probabilities describe the knowledge of the observers about the system, and that is adapted after new information is gained.
 
  • #97
atyy said:
The part about vanhees71's point that is wrong is that he is using the EPR objection to collapse. EPR's version of causality is not consistent with quantum field theory.

Furthermore, as long as we use the Schroedinger picture, the collapse is how we extract the information that is contained in the wave function in order to predict the nonlocal correlations.

Of course, one doesn't have to accept the wave function or the collapse as real, so one may say that Einstein causality is empty in quantum field theory. If one accepts the wave function and collapse as real, then Einstein causality is violated. There is no choice of saying that quantum field theory fulfills Einstein causality.
As I see it, the problem is the following: We have a state ##\left|\Psi\right> = \left|HV\right>-\left|VH\right>##. This state contains all information that is obtained in an EPR experiment, so a collapse is not necessary: the collapse is not needed to explain the results of an EPR experiment.

However, we also know that if we measure any of the same photons again, we will not get the same correlations again. Therefore, after the measurement, the state cannot be ##\left|\Psi\right>## anymore, but needs to be something different. This is the real reason why we usually assume that the system has collapsed into ##\left|HV\right>## or ##\left|VH\right>##, and this would indeed be a non-local interaction.

However, it doesn't need to be so. There is another option that is only available if we are willing to include the measurement devices into the description: the local interaction with the measurement device could have made the correlations spill over into some atoms of the measurement device, so the correlations are still there, but not easily accessible. One only needs local interactions for this to happen. I'm convinced that if we could ever control all the degrees of freedom of the measurement apparatuses, we could recover the information about the correlations. It's basically analogous to the quantum eraser.
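The claim that the state alone already fixes the EPR correlations can be checked directly from Born's rule. A minimal numpy sketch (the basis ordering ##|HH\rangle, |HV\rangle, |VH\rangle, |VV\rangle## and the function names are this example's own conventions): the correlation function computed below is just a sum of ##(\pm 1)(\pm 1)## times outcome probabilities, with no collapse invoked anywhere.

```python
import numpy as np

# Polarization singlet |Psi> = (|HV> - |VH>)/sqrt(2),
# basis ordering |HH>, |HV>, |VH>, |VV> (this sketch's convention).
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

def analyzer(theta):
    """Pass/fail states of a linear polarizer at angle theta."""
    passed = np.array([np.cos(theta), np.sin(theta)])
    blocked = np.array([-np.sin(theta), np.cos(theta)])
    return passed, blocked

def correlation(a, b):
    """E(a,b) from Born's rule alone: sum over the four joint outcomes."""
    E = 0.0
    for sign_a, state_a in zip((+1, -1), analyzer(a)):
        for sign_b, state_b in zip((+1, -1), analyzer(b)):
            amp = np.kron(state_a, state_b) @ psi
            E += sign_a * sign_b * amp ** 2
    return E

# Equal analyzer angles give perfect anticorrelation at any angle.
print(round(correlation(0.3, 0.3), 6))  # -> -1.0
```

The general result of this computation is ##E(a,b) = -\cos 2(a-b)##, the standard quantum prediction for the polarization singlet.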
 
  • #98
vanhees71 said:
The usual local microcausal QFTs are precisely constructed such that they fulfill Einstein causality (among other things it makes the S-matrix with its time-ordered products of field operators manifestly covariant wrt. special orthochronous Poincare transformations).
No, it does not care at all about Einstein causality - this would require caring about the EPR argument - it cares only about correlations.

vanhees71 said:
Last but not least, if the collapse isn't considered real, it's just a sloppy abbreviation for what the minimal interpretation states more carefully, and there's nothing to argue about it anymore. Then all our debates are pretty empty ;-).
"The collapse" is, of course, not the point, the point which proves nonlocality is the violation of Bell's inequality. And this violation exists in QFT too, and that means, QFT is not compatible with the EPR criterion of reality, thus, with Einstein's understanding of causality.
 
  • #99
vanhees71 said:
Ok, as you well know, I don't understand what philosophers mean by "realistic", particularly as it seems as if there are as many notions of this word as there are philosophers. Then, if everything is "solved" with Bohm's original paper, why is then always stated, also by followers of the Bohmian interpretation, that there are problems with Bohm and relativistic QFT?
I said EM theory is presented in Bohm's original paper, not that everything is solved in that paper.

Then, the problem is, of course, that dBB theory requires a preferred frame. The interpretation of relativity we are forbidden to talk about does not have a problem with this, but to talk about it is forbidden not only here, so some people indeed think this is a problem.

The first proposal for fermion fields I know about is from Bell. In my paper http://arxiv.org/abs/0908.0591 I obtain equations for a pair of Dirac fermions from those of a scalar field with broken symmetry, which reduces Bohmian versions of fermions (as long as they appear in pairs) to the unproblematic case of scalar fields, which can use the same scheme used by Bohm for the EM field. For gauge fields, one should not use the Gupta-Bleuler approach with an indefinite Hilbert space, but the older Fermi-Dirac one, and what remains is unproblematic too. And even if some part of it were problematic, there is a less beautiful but possible variant where only a part of the degrees of freedom has dBB trajectories.
 
  • #100
vanhees71 said:
Ad 1) What's wrong?

Ad 2) You cannot argue with a specific picture of time evolution, because all are equivalent (modulo mathematical quibbles a la Haag's theorem ;-)).

Ad 3) This I don't understand. The usual local microcausal QFTs are precisely constructed such that they fulfill Einstein causality (among other things it makes the S-matrix with its time-ordered products of field operators manifestly covariant wrt. special orthochronous Poincare transformations). Last but not least, if the collapse isn't considered real, it's just a sloppy abbreviation for what the minimal interpretation states more carefully, and there's nothing to argue about it anymore. Then all our debates are pretty empty ;-).

Yes, of course one does not privilege a particular picture of time evolution. However, one also cannot disallow it. So if one allows the Schroedinger picture, there is collapse.

The place I think you are wrong is that Einstein causality is not the causality that is fulfilled by explicit construction in quantum field theory. Here by Einstein causality, I mean the causality in EPR and in classical special relativity, in which the cause of an event is in its past light cone - I am using this definition of Einstein causality because I think this is what you are using by bringing up EPR. The "causality" in quantum field theory is a different thing from Einstein causality - it forbids faster-than-light transfer of classical information. So your mistake is that you are confusing two types of causality - signal causality (which is present in relativistic QFT) and Einstein causality (like EPR, which is not present in relativistic QFT).
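The distinction between the two causalities can be illustrated numerically: a (non-selective) measurement by Alice leaves Bob's marginal statistics exactly unchanged, so no classical information is transferred, even though the joint state is updated. A minimal numpy sketch (basis ordering, angle, and function names are this example's own assumptions):

```python
import numpy as np

# Polarization singlet, basis ordering |HH>, |HV>, |VH>, |VV>.
psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)
rho = np.outer(psi, psi)

def bob_marginal(r):
    """Partial trace over Alice's subsystem."""
    r4 = r.reshape(2, 2, 2, 2)        # axes: (a, b, a', b')
    return np.einsum('abad->bd', r4)  # trace over a = a'

def alice_measured(r, theta):
    """Non-selective polarization measurement by Alice at angle theta:
    sum over both outcomes of P rho P, with P acting on Alice's side only."""
    u = np.array([np.cos(theta), np.sin(theta)])
    v = np.array([-np.sin(theta), np.cos(theta)])
    out = np.zeros_like(r)
    for w in (u, v):
        P = np.kron(np.outer(w, w), np.eye(2))
        out += P @ r @ P
    return out

before = bob_marginal(rho)
after = bob_marginal(alice_measured(rho, 0.7))
# Bob's local statistics are identical: signal causality holds.
print(np.allclose(before, after))  # -> True
```

This is the no-signaling property that microcausal QFT guarantees by construction; it is strictly weaker than the Einstein/EPR notion that every event has its cause in its past light cone.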
 