Undergrad Murray Gell-Mann on Entanglement

  • Thread starter: Thecla
  • Tags: Entanglement

Summary
Murray Gell-Mann discusses quantum entanglement, emphasizing that measuring one photon does not affect the other, a statement that aligns with many physicists' views but remains interpretation-dependent. The conversation highlights the complexity of defining "doing something" in the context of entanglement and measurement. While some argue that measurement collapses the wave function of both photons, others assert that this does not imply a causal effect between them. The discussion also touches on the implications of non-locality and hidden variables, with differing opinions on whether Gell-Mann's interpretation adequately addresses the nuances of quantum mechanics. Overall, the debate reflects ongoing complexities in understanding quantum entanglement and measurement.
  • #271
atyy said:
So if the moon has a trajectory, there are hidden variables.
Again, the moon doesn't need to have a trajectory merely in order to "be there". Strictly speaking, the moon has no trajectory, since nothing has an exact trajectory: that would contradict the position-momentum uncertainty relation. As a macroscopic object, in the sense of the classical approximation, its center of mass of course has a trajectory (not easy to calculate, as it already drove Kepler crazy ;-)).

I don't believe in hidden variables, but of course I cannot disprove their existence. Maybe nature is, after all, deterministic with non-local interactions, but we are not clever enough (yet?) to find an adequate theory of such a possibility, nor any experiment to observe the hidden variables (yet?).
 
  • Like
Likes Demystifier
  • #272
vanhees71 said:
Again, the moon doesn't need to have a trajectory merely in order to "be there". Strictly speaking, the moon has no trajectory, since nothing has an exact trajectory: that would contradict the position-momentum uncertainty relation. As a macroscopic object, in the sense of the classical approximation, its center of mass of course has a trajectory (not easy to calculate, as it already drove Kepler crazy ;-)).

I don't believe in hidden variables, but of course I cannot disprove their existence. Maybe nature is, after all, deterministic with non-local interactions, but we are not clever enough (yet?) to find an adequate theory of such a possibility, nor any experiment to observe the hidden variables (yet?).

The moon having a trajectory does not contradict the position-momentum uncertainty. It just means that x(t) exists, where x is the position of the object.

If the moon being "there" does not mean it has a position, then what do you mean by "there"?
 
  • #273
But ##x(t)## doesn't exist in the classical sense, because it's uncertain anyway. Suppose you start with a fairly well-localized particle. Then it has a fairly unsharp momentum, and thus the position uncertainty also grows with time. So there are trajectories only in a coarse-grained sense, i.e., not in the sense of exact values ##x(t)## as in classical physics!
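For a free Gaussian wave packet this growth is quantitative: ##\Delta x(t) = \sqrt{\Delta x_0^2 + (\hbar t / 2m\Delta x_0)^2}##. A minimal Python sketch, where the localization lengths and times chosen are purely illustrative assumptions:

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J s

def spread(dx0, m, t):
    """Position uncertainty of a free Gaussian packet of mass m (kg)
    after time t (s), starting from initial uncertainty dx0 (m)."""
    return math.sqrt(dx0**2 + (HBAR * t / (2 * m * dx0))**2)

# An electron localized to 1 nm delocalizes over kilometers within a second:
dx_electron = spread(1e-9, 9.109e-31, 1.0)   # ~6e4 m

# The moon's center of mass, "localized" to 1 m, stays put for eons:
dx_moon = spread(1.0, 7.342e22, 1e9)         # still ~1 m after ~30 years
```

This is why "coarse-grained" trajectories are perfectly fine for the moon but meaningless for a long-isolated electron.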
 
  • #274
vanhees71 said:
I don't believe in hidden variables, but of course, I cannot disprove their existence.
A hypothetical question: if you had lived at the time when Boltzmann had a theory that thermodynamics is a consequence of the motion of atoms, while Mach argued against the existence of atoms because there was no direct experimental evidence for them, whose side would you have been on (given the evidence at that time)?
 
  • #275
vanhees71 said:
But ##x(t)## doesn't exist in the classical sense, because it's uncertain anyway. Suppose you start with a fairly well-localized particle. Then it has a fairly unsharp momentum, and thus the position uncertainty also grows with time. So there are trajectories only in a coarse-grained sense, i.e., not in the sense of exact values ##x(t)## as in classical physics!

But what are you coarse graining?
 
  • #276
atyy said:
If one believes the moon is there when one is not looking, then one should believe the quantum description is incomplete
This is a statement without any logical support.

It is enough to believe that the mass density (an expectation value computable in principle from the density operator of the solar system) is positive in the volume occupied by the moon. Since this is a macroscopic observation, one can work to a very good approximation in the limit where Planck's constant is set to zero, and the classical description is therefore valid. No discrepancy at all with quantum physics!

It only conflicts with the ridiculous view that the highly idealized axioms introduced in an introductory quantum mechanics course or textbook define quantum mechanics.
 
  • Like
Likes vanhees71
  • #277
Demystifier said:
A hypothetical question: if you had lived at the time when Boltzmann had a theory that thermodynamics is a consequence of the motion of atoms, while Mach argued against the existence of atoms because there was no direct experimental evidence for them, whose side would you have been on (given the evidence at that time)?
That's difficult to say. As a physicist I would probably have taken Mach's side, although had I been a bit more open-minded and taken into account the vast evidence for "atomism" from the chemistry of the time, maybe I would have taken Boltzmann's side, since his model at least didn't contradict any evidence (and it was mathematically appealing).
 
  • Like
Likes Demystifier
  • #278
atyy said:
If the moon is always there, then the moon has a trajectory, so particles have trajectories.
The moon need only have a mean trajectory, given by the expectation of the center of mass built from the position operators of its atoms. Its standard deviation is far below the radius of the moon and hence negligible.
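A back-of-the-envelope sketch of why that standard deviation is so tiny, assuming the ##1/\sqrt{N}## scaling for the center of mass of ##N## roughly independent atoms (the atomic mass and localization length below are rough assumptions, not data from this thread):

```python
import math

m_moon = 7.342e22           # kg
m_atom = 3.6e-26            # kg, rough average atomic mass (assumption)
n_atoms = m_moon / m_atom   # ~2e48 atoms

sigma_atom = 1e-10          # m, each atom localized to ~1 Angstrom (assumption)
# For roughly independent atoms the center-of-mass spread scales as 1/sqrt(N):
sigma_com = sigma_atom / math.sqrt(n_atoms)
# sigma_com ~ 7e-35 m: dozens of orders of magnitude below the moon's radius.
```

Correlations between atoms change the prefactor but not the conclusion: the quantum uncertainty of the mean trajectory is unobservably small.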
 
  • Like
Likes vanhees71
  • #279
vanhees71 said:
Given the state ##\hat{\rho}_{\sigma_x=1/2}=|\sigma_x=1/2 \rangle\langle \sigma_x=1/2 |## with ##|\sigma_x=1/2 \rangle=\frac{1}{\sqrt{2}} (|\sigma_z=1/2 \rangle+|\sigma_z=-1/2 \rangle)##, minimally interpreted QT says, with regard to a measurement of ##\sigma_z##, neither more nor less than this: you'll find up with 50% probability and down with 50% probability when measuring ##\sigma_z##. That's it.

Under what circumstances does an electron measure its own spin? Never, right? So it doesn't make any sense at all to say that an isolated electron has a 50% probability of being spin-up in the z-direction. What about a pair of electrons? When does one electron measure the spin of another electron? Never, right? So for a pair of electrons, probability doesn't make any sense.

Probability only makes sense for an interaction in which one of the subsystems is a macroscopic measuring device.
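The quoted 50/50 statement itself is just the Born rule applied to ##|\sigma_x=1/2\rangle## in the ##\sigma_z## basis, which a few lines of numpy can check:

```python
import numpy as np

# z-basis eigenstates and the sigma_x = +1/2 state from the quoted post
up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])
psi = (up + down) / np.sqrt(2)

# Born rule: P(outcome) = |<outcome|psi>|^2
p_up = abs(up @ psi) ** 2      # 0.5
p_down = abs(down @ psi) ** 2  # 0.5
```

The formalism yields these numbers unconditionally; whether they refer to anything before a macroscopic device interacts with the electron is exactly the interpretational question being debated here.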
 
  • #280
vanhees71 said:
If I have prepared a photon with some (pretty well defined) momentum, then it's there due to this preparation procedure and it has a (pretty well defined) momentum, no matter whether I detect it or not. Maybe I'm again too naive to understand (and I've never understood this argument) why this is a problem at all.

To me, if you and your equipment are all described by the same physics as electrons and photons, etc., then to say that "I prepared things in such-and-such a way" means "Me and my equipment were put into such and such a macroscopic state". So there is a notion of "state" for macroscopic objects that does not depend on yet another system to prepare them in that state. They can put themselves into a particular state. But you're saying that for an electron, or a photon, or any microscopic system, the only notion of state is a preparation procedure by a macroscopic system. That seems incoherent to me. At best, it's a heuristic, but it can't possibly be an accurate description of what's going on. If macroscopic systems have properties without being observed, then why can't microscopic systems?
 
  • Like
Likes ddd123
  • #281
A. Neumaier said:
The moon need only to have a mean trajectory, given by the expectation of the center of mass of the position operators of its atoms. Its standard deviation is far below the radius of the moon and hence negligible.

Yes. If there were actually a proof that the laws of quantum mechanics imply that macroscopic objects have negligible standard deviation in their position, then there wouldn't be a measurement problem. But it doesn't seem to me that there could be such a proof. Imagine an isolated system consisting of an experimenter, a Stern-Gerlach device, and a source of electrons. The experimenter puts an electron into a state of spin-up in the x-direction, then later measures the spin in the z-direction. If it's spin-up, he goes to Rome, and if it's spin-down, he goes to Paris. It seems to me that the quantum mechanical evolution of the entire system would result in a 50% probability of the experimenter going to Rome, and a 50% probability of the experimenter going to Paris. The standard deviation of his position would be huge.
 
  • #282
stevendaryl said:
Yes. If there were actually a proof that the laws of quantum mechanics imply that macroscopic objects have negligible standard deviation in their position, then there wouldn't be a measurement problem.
For properly normalized extensive macroscopic properties (and this includes the center of mass operator), there is such a proof in many treatises of statistical mechanics. It is the quantum analogue of the system size expansion for classical stochastic processes. For example, see Theorem 9.3.3 and the subsequent discussion in my online book. But you can find similar statements in all books on stochastic physics where correlations are discussed in a thermodynamic context if you care to look, though usually for different, thermodynamically relevant variables.

This property (essentially a version of the law of large numbers) is indispensable for the thermodynamic limit that justifies thermodynamics microscopically, since in this limit all uncertainties disappear and classical thermodynamics and hydromechanics appear as effective theories.
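The law-of-large-numbers behavior invoked here is easy to see numerically. As a toy classical stand-in (not the full quantum derivation), take the mean of ##n## independent ##\pm 1## variables as the "properly normalized extensive quantity": its standard deviation shrinks like ##1/\sqrt{n}##.

```python
import numpy as np

rng = np.random.default_rng(0)

stds = {}
for n in (100, 10_000):
    # 2000 realizations of the mean of n independent +/-1 "spins"
    samples = rng.choice([-1.0, 1.0], size=(2000, n))
    stds[n] = samples.mean(axis=1).std()

# Each std is close to 1/sqrt(n): growing n a hundredfold
# shrinks the fluctuations of the mean tenfold.
```

Extrapolated to ##n \sim 10^{23}## constituents, the fluctuations of thermodynamic variables become unmeasurably small, which is the point of the thermodynamic limit.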

The measurement problem appears only because people mistake the highly idealized von Neumann measurement (treated in introductory texts) - which applies only to very specific collapse-like measurements such as that of electron spin - for the general notion of a measurement, and are therefore led to interpret the reading from a macroscopic instrument in these terms, inventing for it a collapse that has no scientific basis.

And unfortunately, physics education today is so fragmented, and research so specialized, that people working on resolving issues in the quantum foundations typically never had an in-depth education in statistical mechanics. As a consequence they believe that the textbook foundations are the real ones...

As for your thought experiment, the experimenter cannot travel if the system you describe is truly isolated. But once it is not isolated, your argument breaks down.
 
  • #283
stevendaryl said:
Imagine an isolated system consisting of an experimenter, a Stern-Gerlach device, and a source of electrons. The experimenter puts an electron into a state of spin-up in the x-direction, then later measures the spin in the z-direction. If it's spin-up, he goes to Rome, and if it's spin-down, he goes to Paris. It seems to me that the quantum mechanical evolution of the entire system would result in a 50% probability of the experimenter going to Rome, and a 50% probability of the experimenter going to Paris. The standard deviation of his position would be huge.
A. Neumaier said:
As for your thought experiment, the experimenter cannot travel if the system you describe is truly isolated. But once it is not isolated, your argument breaks down.
Yes, I think Arnold has a point here. The closest we come to an isolated system in this case is the Earth itself, and the experimenter going to Rome or Paris would not influence the Earth's center of gravity trajectory, nor its standard deviation.
 
  • #284
Demystifier said:
But one of the reasons it [QM] hasn't failed so far is because it remained agnostic on many interesting questions.

A. Neumaier said:
on many interesting questions that can be checked experimentally? What would be an example?

Demystifier said:
What orientation of the Stern-Gerlach apparatus will the experimentalist freely choose in the next experimental run. :biggrin:

That's not a good example! :biggrin:

The canonical example of an interesting question that can be checked experimentally, and on which QM is agnostic, is simply: what value will this measurement give? For instance, consider a particle with definite z spin. When measured in the x direction, it will be spin up or down, 50/50. QM is agnostic regarding which of these will happen. Indeed, standard QM says it's impossible to predict; but that's an unprovable over-statement. You may think this is trivial, but it's not. It's the key difference between QM and classical physics.

Note that according to QM we could predict the result perfectly IF we had access to info outside the particle's past light cone. In particular, if we could access the next second of its future light cone. The typical Bell-type experimental situation is similar. If Bob had access to Alice's measurement, outside his past light cone, he could predict his own measurement (perfectly, if at the same angle). From this point of view we can say that the essential peculiarity of QM, compared to classical, is that in order to completely predict results, info beyond the past light cone is required.

Anyway Demystifier's statement is justified. A traditional classical physicist - such as Einstein - considers it "cheating" for QM to simply refuse to predict (one single) experimental result. If we ever come up with a new, deeper, theory that can do that, Demystifier's (and Einstein's) point would become obvious and accepted by all. Until then, it remains rather subtle and requires some cogitation to appreciate.
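The perfect same-angle prediction mentioned above is the standard singlet correlation ##E(a,b) = -\cos(a-b)##. A short numpy check (measurement directions taken in the x-z plane):

```python
import numpy as np

sz = np.array([[1.0, 0.0], [0.0, -1.0]])
sx = np.array([[0.0, 1.0], [1.0, 0.0]])

def spin(theta):
    """Spin observable along angle theta in the x-z plane."""
    return np.cos(theta) * sz + np.sin(theta) * sx

# Two-particle singlet state (|ud> - |du>)/sqrt(2)
singlet = (np.kron([1.0, 0.0], [0.0, 1.0])
           - np.kron([0.0, 1.0], [1.0, 0.0])) / np.sqrt(2)

def correlation(a, b):
    """Expectation <spin(a) x spin(b)> in the singlet state."""
    return singlet @ np.kron(spin(a), spin(b)) @ singlet

e_same = correlation(0.3, 0.3)         # -1: Alice's result fixes Bob's
e_general = correlation(0.0, np.pi/3)  # -cos(pi/3) = -0.5
```

So QM predicts the correlations exactly while staying agnostic about each individual outcome, which is the asymmetry being discussed.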
 
  • Like
Likes Demystifier
  • #285
secur said:
the essential peculiarity of QM, compared to classical, is that in order to completely predict results, info beyond the past light cone is required.
This is a misunderstanding. In classical relativistic physics, in order to completely predict results, info beyond the past light cone of the here-and-now is also required!
 
  • #286
A. Neumaier said:
As for your thought experiment, the experimenter cannot travel if the system you describe is truly isolated. But once it is not isolated, your argument breaks down.

We've been through this before, and it still doesn't make any sense to me. There is nothing in quantum mechanics that bounds the standard deviation of a variable such as position. A single electron can be in a superposition of being here, and being 1000 miles away. A single atom can be in such a superposition. A single molecule can be in such a superposition. There is nothing in quantum mechanics that says that a macroscopic object can't be in such a superposition.

Some people say that decoherence prevents such superpositions, but the way I understand decoherence, what it really does is to rapidly cause the superposition to spread, to eventually "infect" the entire causally connected universe.
 
  • #287
Heinera said:
Yes, I think Arnold has a point here. The closest we come to an isolated system in this case is the Earth itself, and the experimenter going to Rome or Paris would not influence the Earth's center of gravity trajectory, nor its standard deviation.

The only significance of being "isolated" is that isolation is needed to be able to talk about the state of a subsystem. Because of decoherence, if you tried to place a macroscopic object into a macroscopic superposition, the superposition would rapidly spread to the entire universe. So we can't actually analyze macroscopic superpositions unless (a la many-worlds) we are willing to consider the wave function of the entire universe.

But conceptually, we can imagine a composite system consisting of an electron plus a macroscopic measuring device. If the electron being spin-up results in the measuring device going into macroscopic state U, and the electron being spin-down results in the measuring device going into macroscopic state D, then the electron being in a superposition of spin-up and spin-down would result in the measuring device going into a superposition of those two states. That's a consequence of linearity.
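That linearity argument can be spelled out with a toy two-level "device", modeling the measurement interaction, as an assumption, by a CNOT-like unitary:

```python
import numpy as np

# Electron basis |up>, |down>; device basis |U>, |D> (|U> doubles as "ready").
# Assumed measurement interaction: |up, ready> -> |up, U>,
#                                  |down, ready> -> |down, D>
# i.e. a CNOT with the electron as control:
U = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
ready = np.array([1.0, 0.0])

# Superposed electron, device ready:
initial = np.kron((up + down) / np.sqrt(2), ready)
final = U @ initial
# final = (|up, U> + |down, D>)/sqrt(2): by linearity alone, the device
# ends up entangled in a superposition of its two pointer states.
```

Nothing in the unitary evolution picks one branch; that is the content of the linearity argument.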
 
  • Like
Likes eloheim
  • #288
Stevendaryl, to my understanding, decoherence is just the result of the reversibility of a system becoming extremely unlikely through chains of interactions. The farther apart the states you refer to are, the more difficult it is to maintain said reversibility.
 
  • Like
Likes eloheim
  • #289
Jilang said:
Stevendaryl, to my understanding, decoherence is just the result of the reversibility of a system becoming extremely unlikely through chains of interactions. The farther apart the states you refer to are, the more difficult it is to maintain said reversibility.

I agree with that. But decoherence figures into discussions about macroscopic superpositions in the following way: once decoherence happens, it becomes mathematically intractable to describe the quantum state as a superposition, so it is instead described as a mixed state. But my claim is that there is nothing in quantum mechanics that would then select a single alternative out of the set of possibilities described by that mixed state.
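A concrete illustration of that point: tracing out the environment from an entangled pure state gives a diagonal, mixed-looking reduced density matrix, yet the composite state remains a single pure superposition, with no branch selected.

```python
import numpy as np

# Entangled system+environment state (|0,E0> + |1,E1>)/sqrt(2)
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi)  # density matrix of the composite: pure

# Partial trace over the environment (second factor):
# reshape to indices (sys, env, sys', env') and contract env with env'
rho_sys = np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))
# rho_sys = [[0.5, 0], [0, 0.5]]: off-diagonal coherences are gone,
# so the subsystem looks like a classical 50/50 mixture -- while the
# composite still has purity Tr(rho^2) = 1, i.e. it is one superposition.
```

Decoherence explains why the coherences become locally inaccessible, not why one alternative occurs, which is the distinction being made here.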
 
  • Like
Likes eloheim, secur and ddd123
  • #290
stevendaryl said:
Some people say that decoherence prevents such superpositions, but the way I understand decoherence, what it really does is to rapidly cause the superposition to spread, to eventually "infect" the entire causally connected universe.
No it can't. The more things that become 'infected', the higher the probability that one of them will interact and bring it all to an end.
The natural spread extent is very small in space and time.
 
  • #291
Isn't that where the Born rule comes into play? Doesn't it just select the appropriate one for the detector?
 
  • #292
Mentz114 said:
No it can't. The more things that become 'infected', the higher the probability that one of them will interact and bring it all to an end.
The natural spread extent is very small in space and time.

This sounds like GRW.

But I agree with stevendaryl on everything here, it's unclear where the pure superposition is supposed to end.
 
  • Like
Likes eloheim
  • #293
ddd123 said:
This sounds like GRW.
I'll look up GRW.
I was extending the viral analogy. It probably won't work unless there are fewer interactions that multiply the spread than interactions that fix it.

..., it's unclear where the pure superposition is supposed to end.
I wish I knew. Is the 'end' even defined?
 
  • #294
stevendaryl said:
But my claim is that there is nothing in quantum mechanics that would then select a single alternative out of the set of possibilities described by that mixed state.
If you allow dissipative sub-systems in QT then it is the initial conditions that decide the outcome.
 
  • #295
Mentz114 said:
I wish I knew. Is the 'end' even defined ?

In usual quantum theory, when you look at a measurement instrument's pointer, it's pretty well defined at that point :D but you have Avogadro's-number-like orders of magnitude in between to narrow it down further.
 
  • #296
ddd123 said:
This sounds like GRW.
I looked up the Ghirardi-Rimini-Weber theory (GRW) and it is sort of similar to what I posted. Thanks for telling me about it.
 
  • #297
Mentz114 said:
If you allow dissipative sub-systems in QT then it is the initial conditions that decide the outcome.

Do they? That would seem to mean that if you are trying to measure the spin of an electron, then initial conditions in the measuring device determine the final measurement result. That's a kind of hidden-variable theory, except that the variable is not in the thing being measured, but in the thing doing the measurement.

I would think that that would cause problems for EPR. There, you produce a pair of correlated spin-1/2 particles. I don't see how initial conditions in the two distant measuring devices could conspire to always produce anti-correlated results.
 
  • #298
Mentz114 said:
No it can't. The more things that become 'infected', the higher the probability that one of them will interact and bring it all to an end.
The natural spread extent is very small in space and time.

An interaction doesn't reduce a superposition to a single value; it instead causes one subsystem that is in a superposition to cause a second subsystem to also be in a superposition. That's what I mean by the superposition spreading to infect the rest of the universe.
 
  • #299
Mentz114 said:
I looked up the Ghirardi-Rimini-Weber theory (GRW) and it is sort of similar to what I posted. Thanks for telling me about it.

But that theory isn't standard QM, it's a proposed alternative theory.
 
  • #300
secur said:
... the essential peculiarity of QM, compared to classical, is that in order to completely predict results, info beyond the past light cone is required.

A. Neumaier said:
This is a misunderstanding. In classical relativistic physics, in order to completely predict results, info beyond the past light cone of the here-and-now is also required!

I certainly thought that in classical relativistic physics the past light cone(s) of the objects in question (including the space, of course, with its curvature, and the stress-energy tensor) contain all the info that could possibly affect the physics. And theoretically, perfect prediction is possible. (In fact, given that the theory is local, all you really need is "here-and-now" information - anything in contact - but that's not relevant at the moment.) Can you please explain further?

[EDIT] assume there's only one inertial frame used for both observations and predictions ... I can't think of any other loopholes I might be missing
 
Last edited:
