Murray Gell-Mann on Entanglement

  • Thread starter: Thecla
  • Tags: Entanglement
Summary
Murray Gell-Mann discusses quantum entanglement, emphasizing that measuring one photon does not affect the other, a statement that aligns with many physicists' views but remains interpretation-dependent. The conversation highlights the complexity of defining "doing something" in the context of entanglement and measurement. While some argue that measurement collapses the wave function of both photons, others assert that this does not imply a causal effect between them. The discussion also touches on the implications of non-locality and hidden variables, with differing opinions on whether Gell-Mann's interpretation adequately addresses the nuances of quantum mechanics. Overall, the debate reflects ongoing complexities in understanding quantum entanglement and measurement.
  • #331
Demystifier said:
But why then did Ballentine make a wrong prediction about the quantum Zeno effect? Is it just a little mistake that could happen to anyone? Or is it a deep disagreement with the others?
Yes, I think that's simply a mistake. It is not clear to me why the minimal interpretation should lead one to deny the quantum Zeno effect.
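For reference, the effect itself is easy to exhibit in a minimal model (a sketch, not from the thread: an idealized two-level system driven by a resonant ##\pi##-pulse and interrupted by ##N## ideal projective measurements):

```python
import numpy as np

# Idealized quantum Zeno setup: a two-level system is driven from |0> to |1>
# by a resonant pi-pulse of total duration T (Rabi frequency Omega, Omega*T = pi).
# The probability of still being in |0> after driving for time t is cos(Omega*t/2)**2,
# so N projective measurements at intervals T/N give
#   P_survive(N) = [cos(pi/(2N))**2]**N  ->  1  as N grows.
for N in (1, 10, 100, 1000):
    p = np.cos(np.pi / (2 * N)) ** (2 * N)
    print(N, p)   # 0.0, ~0.78, ~0.976, ~0.9975
```

Sufficiently frequent measurement freezes the transition; that is the effect whose denial is at issue.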
 
  • #332
Did Ballentine actually change his mind, though? The last time I read the wiki article, it seemed he was still arguing against the Zeno effect, which gives the impression that something is at stake.
 
  • #333
stevendaryl said:
I don't understand how your remarks address what I said. For a microscopic system, the "state" is meaningful in two ways: (1) the preparation procedure needed to put the system in that state, and (2) the probabilities that state gives for future measurements. But for a macroscopic system, there is a notion of state that doesn't have either of those features. A macroscopic system simply is in some definite state or another.
The macroscopic observables, which are averages over a vast number of microscopic observables##^*##, appear classical. Of course, on the microscopic level a macroscopic system is described by QT. There's no contradiction between these two levels of description.

##^*## Corrected from "states" due to the hint in #340.
 
  • #334
ddd123 said:
Did Ballentine actually change his mind, though? The last time I read the wiki article, it seemed he was still arguing against the Zeno effect, which gives the impression that something is at stake.
Can you share the link to that wiki?
 
  • #336
vanhees71 said:
The macroscopic observables, which are averages over a vast number of microscopic states, appear classical. Of course, on the microscopic level a macroscopic system is described by QT. There's no contradiction between these two levels of description.

Claiming it doesn't make it so. If each of the microscopic states is only meaningful in that it makes predictions for future measurements, then how does a macroscopic state have meaning that doesn't involve future measurements?
 
  • #337
stevendaryl said:
Claiming it doesn't make it so. If each of the microscopic states is only meaningful in that it makes predictions for future measurements, then how does a macroscopic state have meaning that doesn't involve future measurements?

More specifically: why does an average over a vast number of microscopic states, each of which has meaning only in terms of future measurements, produce a macroscopic value that has meaning independent of measurements? That seems like an outlandishly improbable claim. That doesn't make it false, but it shouldn't be a default assumption without further argument supporting it.
 
  • #338
ddd123 said:
OK, I cannot see a problem with [46]. Indeed, there's no "collapse", just the interaction between the atom (simplified to a three-level toy model) and the RF field, which causes the "quantum Zeno effect". So, of course, Ballentine is not denying the measured facts.
 
  • #339
stevendaryl said:
[The claim that] the state of a system is only meaningful in predicting probabilities for what a measuring device would measure is not true for macroscopic systems.
The (always mixed) state of a measurement device indeed fully determines the measurement reading to a very high accuracy, not only probabilistically. That's how measurement devices are made.
 
  • #340
vanhees71 said:
The macroscopic observables, which are averages over a vast number of microscopic states
over a vast number of microscopic observables, not states!
 
  • #341
A. Neumaier said:
The (always mixed) state of a measurement device indeed fully determines the measurement reading to a very high accuracy, not only probabilistically. That's how measurement devices are made.

I think you misunderstood what I said. There are two states involved here: the state of the measuring device, and the state of the system being measured. The first has a meaning that does not depend on measurements by yet other measuring devices.
 
  • #342
Macroscopic state variables such as the position of the center of mass of a macroscopic object have two features that are different from microscopic state variables: (1) There are no observed interference effects between different states, and (2) they have a small standard deviation (relative to the appropriate scale for the variable; for example, the standard deviation for the position of a brick is typically small compared to the size of the brick). Decoherence explains the first effect, but not the second. Pure quantum mechanics in the minimal interpretation cannot explain why macroscopic state variables have definite (up to a small standard deviation) values.

Bohmian mechanics halfway explains it. According to that interpretation, all objects have definite positions at all times. However, in Bohmian mechanics, the state, or wave function, evolves smoothly at all times, so in those cases where quantum mechanics would predict a large standard deviation, Bohmian mechanics gives (or seems to give--maybe I'm misunderstanding something) schizophrenic results: the macroscopic object such as a brick is well-localized, since each of its constituent particles is well-localized. On the other hand, the standard deviation, as computed using the wave function, may still be quite large.

Many-worlds attempts (and I'm not sure how successful it is) to say that even though a macroscopic object can have a large standard deviation for its position, that spread is unobservable. Rather than "seeing" a brick with a large standard deviation, the state of the world splits into different branches, each of which sees the brick as localized.
 
  • #343
stevendaryl said:
I think you misunderstood what I said. There are two states involved here: the state of the measuring device, and the state of the system being measured. The first has a meaning that does not depend on measurements by yet other measuring devices.
Well, your formulation invited the misunderstanding. Anyway, whether the state of a single electron has a meaning at all is one of the controversial points in the foundations. It is generally agreed only that an ensemble of many identically prepared electrons has a state. And this automatically leads to a probabilistic framework.
 
  • #345
A. Neumaier said:
over a vast number of microscopic observables, not states!
True! I've corrected it.
 
  • #347
stevendaryl said:
That's just incorrect. The law of large numbers is not sufficient to explain this effect. You are mistaken.

I think that this might be an insurmountable obstacle to reaching a conclusion, because to me, your [A. Neumaier's] efforts to prove that macroscopic objects have definite positions (give or take a small standard deviation) are assuming your conclusion. It's circular reasoning. You want to launch into the use of density matrices of a particular form that only make sense under the assumption that you're trying to prove.

On the other side, I think I could demonstrate definitively that you are wrong by considering the pure state of an isolated system that includes macroscopic objects. You would refuse to even look at such an argument, because you insist that macroscopic systems can't have pure states.

So that's an impasse. You reject out of hand the reasoning that would prove you wrong, and I find your reasoning to be circular.
 
  • #348
But could we not consider a variant of the cat paradox, where a brick sits on a trap door and falls its full height if a nucleus decays? Decoherence would then ensure that we never observe the brick in a superposition, but both outcomes still occur over many runs of the experiment, so we do get a large standard deviation in the brick's location.
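To put numbers on it (a simple worked example; the fall height ##h## is illustrative): if the brick ends up at height ##0## or ##-h## with probability ##1/2## each, then ##\langle x\rangle = -h/2## and ##\sigma_x = \sqrt{\langle x^2\rangle - \langle x\rangle^2} = h/2##, i.e. the standard deviation is of the order of the macroscopic fall height itself.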
 
  • #349
stevendaryl said:
You want to launch into the use of density matrices of a particular form that only make sense under the assumption that you're trying to prove.
It is legitimate to start with different basic assumptions on which to erect the edifice of quantum mechanics. The only condition is that the basic assumptions are consistent with experiment. Everything else is a matter of choice, and the quality of the choice is measured by the conclusions one can draw from it and how well they fit the real world.

You start with the traditional textbook assumptions and get into all the trouble with meaningless superpositions of macroscopic objects, for which nobody has been able to give a meaning in reality. Note that the superposition principle is already known to be inconsistent with physics, as it leads to an immediate contradiction with rotations when you superpose a spin-0 and a spin-1/2 state. (Try to rotate by ##2\pi## and observe what happens to inner products of two arbitrary such superpositions.)
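This is easy to check numerically (a minimal sketch; the three-component state vector and its amplitudes are illustrative): on the direct sum of a spin-0 and a spin-1/2 sector, a rotation by ##2\pi## acts as ##\mathrm{diag}(1,-1,-1)##.

```python
import numpy as np

# Hilbert space C + C^2: one spin-0 amplitude and two spin-1/2 amplitudes.
# Under a 2*pi rotation, the spin-0 sector picks up +1, the spin-1/2 sector -1.
R_2pi = np.diag([1.0, -1.0, -1.0])

# An illustrative normalized superposition of the two sectors:
psi = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)

# A 2*pi rotation should be physically trivial, yet the overlap with the
# original state is not a pure phase:
print(psi @ R_2pi @ psi)        # ~0.0 -- the rotated state is orthogonal!

# A state confined to a single sector only picks up an overall phase:
phi = np.array([0.0, 1.0, 0.0])
print(abs(phi @ R_2pi @ phi))   # 1.0
```

A superposition across the two sectors changes under a rotation that must act trivially, which is the superselection argument.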

stevendaryl said:
I find your reasoning to be circular.

I start with the algebraic approach to quantum mechanics, where quantities are elements of a ##C^*## algebra (e.g. the algebra of linear operators on a Schwartz space, which encodes Dirac's bra-ket setting) and states are positive linear functionals - the natural analogue of what one has in classical stochastic physics. This is a far better starting point than the unrealistic axioms used in introductory textbooks. Nothing is circular in this approach.
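In finite dimensions the setup is easy to spell out (a toy sketch; the ##2\times 2## density matrix is illustrative, and the full theory of course uses infinite-dimensional algebras): a state is a positive, normalized linear functional ##\omega(A)=\mathrm{Tr}(\rho A)## on the algebra of matrices.

```python
import numpy as np

# Toy algebraic setup: the C*-algebra of 2x2 complex matrices.
# A state is a positive, normalized linear functional  A -> Tr(rho A).
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])        # Hermitian, positive definite, trace 1

def omega(A):
    """Expectation value of the quantity A in the state."""
    return np.trace(rho @ A)

print(np.isclose(omega(np.eye(2)), 1.0))   # normalization: omega(1) = 1

# Positivity: omega(A^dagger A) >= 0 for arbitrary A.
rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
print(omega(A.conj().T @ A).real >= 0)     # True
```

Note that no superposition of states appears anywhere: the state space is a convex set, not a linear one, which is what leaves room for superselection sectors.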

In the algebraic approach there is no superposition principle, and it naturally accounts for superselection sectors such as that for integral/half-integral spin. Moreover, it gives a far simpler approach to statistical mechanics compared to the standard approach. Finally, and most importantly, it leads to exactly the same predictions as the shut-up-and-calculate part of quantum mechanics and hence is a fully trustworthy foundation.

So my approach cannot be proved wrong, while the superposition principle is proved wrong by the existence of spin 1/2.
 
  • #350
stevendaryl said:
The law of large numbers is not sufficient to explain this effect.
If ##A_1,\ldots,A_N## are uncorrelated operators with the same standard deviation ##\sigma## then ##X:=N^{-1}(A_1+\ldots+A_N)## has standard deviation ##N^{-1/2}\sigma##, as a simple calculation reveals. The arguments in statistical mechanics are similar, except that they account (in many important instances) for the typical correlations between the ##A_k##.
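A quick Monte Carlo check of this scaling (a sketch with illustrative numbers, using classical i.i.d. random variables as stand-ins for commuting, uncorrelated ##A_k##):

```python
import numpy as np

rng = np.random.default_rng(42)
sigma = 1.0                    # common standard deviation of the A_k

for N in (100, 1_000, 10_000):
    # 1000 realizations of the average X = (A_1 + ... + A_N) / N:
    X = rng.normal(0.0, sigma, size=(1_000, N)).mean(axis=1)
    print(N, X.std(), sigma / np.sqrt(N))   # empirical vs. predicted spread
```

The empirical spread of the average tracks ##N^{-1/2}\sigma##; for ##N## of the order of ##10^{23}## constituents it is unmeasurably small.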

Please point out where the argument is faulty. If successful, all books on statistical mechanics must be rewritten to account for your revolutionary insight.
 
  • #351
If I may, I think stevendaryl's point is not that it is surprising that a million measurements of a brick's location will average to a number very likely to fall well within the brick, but rather that the individual measurements themselves yield a distribution that is sharply peaked there. My example was meant to show that this is not due to the way we measure the location of bricks, but rather to the way we cull those measurements by correlating them against other information that we generally have access to macroscopically-- information we do not have access to, and cannot cull by, microscopically.
 
  • #352
To continue that point, what it means is that in any situation where measurements on bricks do give a wide standard deviation, we can always attribute that to a lack of complete information about the brick-- we can always imagine having "the lights on" in such a way that we can cull that broad distribution into subsets with much smaller standard deviations. That's just what we cannot do with electrons. So is it that bricks behave differently than electrons, or is the different behaviour our own? We analyze the situation differently because we have access to richer information for the bricks, and we use that richer information to correlate the measurements and look at the standard deviations within those correlated subsets. When we have more information, we act differently, and there's the "cut" right there. This doesn't explain why the cut is there-- why we get richer information about bricks than electrons-- but it does show where the cut comes from: it comes from how we think, how we process information, and what we mean by "everything it is possible to know about a system."
 
  • #353
A. Neumaier said:
The second is explained by the law of large numbers and the standard procedures in statistical mechanics.
I don't see where you get the large numbers. Do you take all the particles that make up the brick as an ensemble?
 
  • #354
zonde said:
I don't see where you get the large numbers. Do you take all the particles that make up the brick as an ensemble?
Not as an ensemble. The "ensemble" is just a buzzword for the density operator, visualized with a popular picture of many repetitions - a picture that is misleading in the macroscopic case, where statistical mechanics predicts properties of single systems such as a particular brick.

The many particles appear instead in the sums that define the various macroscopic observables!
 
  • #355
If the many particles appear in those sums, then they are certainly not uncorrelated ##A_k## operators. The brick is a solid object; those measurements have correlations (and consider the significance of that in my cat analog).
 
  • #356
Ken G said:
If the many particles appear in those sums, then they are certainly not uncorrelated ##A_k## operators. The brick is a solid object; those measurements have correlations (and consider the significance of that in my cat analog).
That's why I included in my statement the phrase ''except that they account (in many important instances) for the typical correlations''. You need to do the real calculations (for the canonical ensemble, say) that accompany the derivation of the thermodynamic limit in every textbook to see that the correlations contribute to a lower order than the sum, so that the simple argument I gave for the uncorrelated case still remains qualitatively valid.

Without the law of large numbers there would be no thermodynamic limit, and thermodynamics would not be valid.
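That the ##1/\sqrt{N}## suppression survives short-ranged correlations can also be checked numerically (a sketch; the exponentially correlated AR(1) chain is an illustrative stand-in for the "typical correlations" of statistical mechanics):

```python
import numpy as np

rng = np.random.default_rng(1)
phi = 0.9    # corr(A_k, A_{k+d}) = phi**d: exponentially decaying correlations

def chain_average(N):
    """Average of N exponentially correlated, unit-variance variables."""
    a = np.empty(N)
    a[0] = rng.normal()
    for k in range(1, N):
        a[k] = phi * a[k - 1] + np.sqrt(1.0 - phi**2) * rng.normal()
    return a.mean()

for N in (1_000, 10_000, 100_000):
    means = np.array([chain_average(N) for _ in range(100)])
    # sqrt(N) * std stays near sqrt((1+phi)/(1-phi)) ~ 4.4 instead of growing:
    print(N, np.sqrt(N) * means.std())
```

The correlations change only the constant in front of ##N^{-1/2}##, not the decay itself - the "lower order" contribution mentioned above.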
 
  • #357
So what I'm saying is, the "ensemble" concept is also applicable to macro systems, like a bunch of decks of cards that have all been shuffled. The only difference is, we have access to lots of other ways to get information about those various decks of cards, such that we can regard the situation as more than a density matrix if we do access that other information. Ironically, a card player does not have access to that information unless they cheat, so they do in fact treat a single deck exactly as though it were represented by a diagonal density matrix.

We only encounter problems when we ask "but what is the deck really doing", or some such thing, but those questions are of no value to the card player-- they are really just errors in failing to track the difference between having information and not having it. We should stop thinking that we are talking about the systems, and simply recognize that we are always talking about our information about the system. After all, that is all the scientist ever uses.

When you do that, ensembles and density matrices are exactly the same in quantum and classical theory; the classical ones are just decohered versions. So that's the first type that stevendaryl was talking about-- the second type is just an artifact of the different quality of the information we have access to classically. When we don't have access to that information to cull our results by, we only get the decoherence type-- and we do get the large standard deviations.
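The deck analogy can be made explicit (a toy sketch; the 4-card "deck" is illustrative): the card player's ignorance is a diagonal density matrix, and full decoherence in the card basis produces exactly the same object from a quantum superposition.

```python
import numpy as np

d = 4                                    # a toy 4-card "deck"

# Classical ignorance of which card is on top: a diagonal density matrix.
rho_classical = np.eye(d) / d

# A quantum superposition over the same basis states is a pure state...
psi = np.ones(d) / np.sqrt(d)
rho_pure = np.outer(psi, psi)

# ...and full decoherence in the card basis (zeroing the off-diagonal terms)
# turns it into precisely the card player's density matrix:
rho_decohered = np.diag(np.diag(rho_pure))
print(np.allclose(rho_decohered, rho_classical))   # True

# Card-basis measurement statistics were identical all along:
print(np.diag(rho_pure), np.diag(rho_classical))   # both uniform 1/d
```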
 
  • #358
A. Neumaier said:
That's why I included in my statement the phrase ''except that they account (in many important instances) for the typical correlations''. You need to do the real calculations (for the canonical ensemble, say) that accompany the derivation of the thermodynamic limit in every textbook to see that the correlations contribute to a lower order than the sum, so that the simple argument I gave for the uncorrelated case still remains qualitatively valid.
That cannot be true because it doesn't work for the cat analog-- that standard deviation is not small at all.
A. Neumaier said:
Without the law of large numbers there would be no thermodynamic limit, and thermodynamics would not be valid.
I certainly agree with that: thermodynamics needs the law of large numbers. But it needs much more: it needs the way we cull by the information we have.
 
  • #359
Ken G said:
we are always talking about our information about the system.
You may be always talking about your information about the system. But physics models the behavior of systems independent of anyone's information. The nuclear processes inside the sun happen in the way modeled by physics even though nobody has ever looked inside.

From measurements, one can get information about the outside only. But the model is about the inside, and predicts both the inside and what it radiates to the outside.
 
  • #360
Ken G said:
That cannot be true because it doesn't work for the cat analog-- that standard deviation is not small at all.
The single cat is not an ensemble - it remains macroscopic at all times, and its measurable aspects therefore have a small standard deviation.
 
