Murray Gell-Mann on Entanglement

  • Thread starter: Thecla
  • Tags: Entanglement
  • #351
If I may, I think stevendaryl's point is not that it is surprising that a million measurements of a brick's location will average to a number very likely to fall well within the brick, but rather that the individual measurements themselves will yield a distribution that is just as sharply peaked. My example was meant to show that this is not due to the way we measure the location of bricks, but rather to the way we cull those measurements by correlating them against other information that we generally have access to macroscopically-- information we do not have access to, and cannot cull by, microscopically.
 
  • #352
To continue that point, what it means is that in any situation where measurements on bricks do give a wide standard deviation, we can always attribute that to a lack of complete information about the brick-- we can always imagine having "the lights on" in such a way that we can cull that broad distribution into subsets with much smaller standard deviations. That's just what we cannot do with electrons. So is it that bricks behave differently than electrons, or is the different behaviour our own? We analyze the situation differently because we have access to richer information for the bricks, and we use that richer information to correlate the measurements and look at the standard deviations within those correlated subsets. When we have more information, we act differently, and there's the "cut" right there. This doesn't explain why the cut is there-- why we get richer information about bricks than about electrons-- but it does show where the cut comes from: it comes from how we think, how we process information, and what we mean by "everything it is possible to know about a system."
 
  • #353
A. Neumaier said:
The second is explained by the law of large numbers and the standard procedures in statistical mechanics.
I don't see where you get the large numbers. Do you take all the particles that make up the brick as an ensemble?
 
  • #354
zonde said:
I don't see where you get the large numbers. Do you take all the particles that make up the brick as an ensemble?
Not as an ensemble - the "ensemble" is just a buzzword for the density operator, visualized with a popular picture of many repetitions - a picture that is misleading in the macroscopic case, where statistical mechanics predicts properties of single systems such as a particular brick.

The many particles appear instead in the sums that define the various macroscopic observables!
 
  • #355
If the many particles appear in those sums, then they are certainly not uncorrelated A operators. The brick is a solid object, those measurements have correlations (and consider the significance of that in my cat analog).
 
  • #356
Ken G said:
If the many particles appear in those sums, then they are certainly not uncorrelated A operators. The brick is a solid object, those measurements have correlations (and consider the significance of that in my cat analog).
That's why I included in my statement the phrase ''except that they account (in many important instances) for the typical correlations''. You need to do the real calculations (for the canonical ensemble, say) that accompany the derivation of the thermodynamic limit in every textbook to see that the correlations contribute to a lower order than the sum, so that the simple argument I gave for the uncorrelated case still remains qualitatively valid.

Without the law of large numbers there would be no thermodynamic limit, and thermodynamics would not be valid.
 
  • #357
So what I'm saying is, the "ensemble" concept also applies to macro systems, like a bunch of decks of cards that have all been shuffled. The only difference is that we have access to lots of other ways to get information about those various decks, so we can regard the situation as more than a density matrix if we do access that other information. Ironically, a card player does not have access to that information unless they cheat, so they do in fact treat a single deck exactly as though it were represented by a diagonal density matrix. We only encounter problems when we ask "but what is the deck really doing", or some such thing, and those questions are of no value to the card player-- they are really just errors in failing to track the difference between having information and not having it. We should stop thinking that we are talking about the systems themselves, and simply recognize that we are always talking about our information about the system. After all, that is all the scientist ever uses. When you do that, ensembles and density matrices are exactly the same in quantum and classical theory, the classical ones being just decohered versions. So that's the first type that stevendaryl was talking about-- the second type is just an artifact of the different quality of the information we have access to classically. When we don't have access to that information to cull our results by, we only get the decoherence type-- and we do get the large standard deviations.
 
  • #358
A. Neumaier said:
That's why I included in my statement the phrase ''except that they account (in many important instances) for the typical correlations''. You need to do the real calculations (for the canonical ensemble, say) that accompany the derivation of the thermodynamic limit in every textbook to see that the correlations contribute to a lower order than the sum, so that the simple argument I gave for the uncorrelated case still remains valid.
That cannot be true because it doesn't work for the cat analog-- that standard deviation is not small at all.
Without the law of large numbers there would be no thermodynamic limit, and thermodynamics would not be valid.
I certainly agree with that: thermodynamics needs the law of large numbers. But it needs much more: it needs the way we cull by the information we have.
 
  • #359
Ken G said:
we are always talking about our information about the system.
You may be always talking about your information about the system. But physics models the behavior of systems independently of anyone's information. The nuclear processes inside the sun happen in the way modeled by physics even though nobody has ever looked inside.

From measurements, one can get information about the outside only. But the model is about the inside, and predicts both the inside and what it radiates to the outside.
 
  • #360
Ken G said:
That cannot be true because it doesn't work for the cat analog-- that standard deviation is not small at all.
The single cat is not an ensemble - it remains macroscopic at all times, and its measurable aspects therefore have a small standard deviation.
 
  • #361
A. Neumaier said:
You may be always talking about your information about the system. But physics models the behavior of systems independent of anyone's information. The nuclear processes inside the sun happen in the way modeled by physics even though nobody ever looked into this inside.
Ah, but look more carefully at what you are saying here. Does what you mean by the nuclear processes in the Sun include which nucleons have fused and which ones haven't, or do you just mean what you care about regarding the Sun: the total amount of fusion energy that has been released? You have to know what information you care about before you can assert what you mean by the fusion processes, and this contradicts your claim that physics models the behavior independently of our information. Look at an actual model of the core of the Sun, and what you will instantly see is that nowhere does that model include which nucleons have fused and which ones haven't, so it's still just a density matrix in that model! We model what we care about-- is that not always so? That's why it is always about the information we are choosing to track.
 
  • #362
A. Neumaier said:
The single cat is not an ensemble - it remains macroscopic at all times, and its measurable aspects therefore have a small standard deviation.
I'm not talking about our language about the brick, I'm talking about setting up an experiment and looking at the standard deviation in the outcome. It's purely observational, it's not some picture we are invoking. The experiment is a thousand bricks on trap doors attached to unstable nuclei with a half-life of an hour, followed by measurements of how far the bricks have been displaced after one hour. That will produce a distribution (in this case bimodal) with a large standard deviation, even though all the systems were prepared identically. The only way to reduce that standard deviation is to cull it based on other information, like turning the lights on, watching which trap doors trigger, and correlating via that new information. Of course that's just what we do, but the problem is-- we forget that that's what we do! We lose track of our own role, the role of how we are culling and correlating the data based on other information we are in some sense taking for granted. But a complete description should never take anything for granted; every step, every correlation being used, must be tabulated and explicitly included. Then there's no difference between classical and quantum systems any more, except the decoherence and the richness of the additional information, both of which are perfectly natural ramifications of macro systems.
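To put rough numbers on that thought experiment, here is a minimal toy simulation in Python (the brick count, drop distance, and measurement noise are illustrative values of my own, not from any real setup). It compares the standard deviation of the raw displacement data with the standard deviations inside the two subsets obtained by culling on the "lights on" record of which trap doors opened.

[CODE=Python]
import numpy as np

rng = np.random.default_rng(0)

N_BRICKS = 1000
HALF_LIFE_HR = 1.0                        # half-life of the trigger nucleus, in hours
P_DECAY = 1 - 2 ** (-1.0 / HALF_LIFE_HR)  # probability of decay within one hour (= 0.5)
DROP_DISTANCE = 2.0                       # metres a brick falls if its trap door opens (illustrative)
MEASUREMENT_NOISE = 1e-3                  # metres of jitter in locating each brick

# Which trap doors triggered within the hour?
decayed = rng.random(N_BRICKS) < P_DECAY

# Measured displacement of each identically prepared brick after one hour
displacement = decayed * DROP_DISTANCE + rng.normal(0.0, MEASUREMENT_NOISE, N_BRICKS)

print("std over all bricks (no culling):  ", displacement.std())           # ~1 m: bimodal, large
print("std, culled on 'door opened':      ", displacement[decayed].std())  # ~1 mm
print("std, culled on 'door stayed shut': ", displacement[~decayed].std()) # ~1 mm
[/CODE]

The raw distribution is bimodal, with a standard deviation of the order of the drop distance; only after culling on the extra record of which doors opened do the narrow, brick-like standard deviations reappear.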
 
  • #363
Ken G said:
which nucleons have fused and which ones haven't
This is a meaningless statement since nucleons are indistinguishable.
 
  • #364
Ken G said:
The experiment is a thousand bricks
That makes an ensemble of bricks. I was talking about a single brick. Statistical mechanics makes assertions about every single brick.
 
  • #365
secur said:
... the essential peculiarity of QM, compared to classical, is that in order to completely predict results, info beyond the past light cone is required.

A. Neumaier said:
One needs the information on a Cauchy surface, not on the past light cone, to make predictions. More precisely, to predict classically what happens at a point in the future of a given observer, the latter's present defines (at least in sufficiently nice spacetimes) a Cauchy surface where all information must be available to infer the desired information.

First let's get Closed Timelike Curves out of the way. It occurs to me that their presence might vitiate my statement, depending how you look at it. They're not supported by experiment and thoroughly irrelevant to this discussion. So let's ignore such pathological spacetimes.

Then we can, as you say, define a Cauchy Surface for any observer, for instance Alice or Bob in typical Bell experiments. But this contributes nothing to the discussion.

Cauchy Surface is used to formulate an "Initial" Value Problem in GR or SR, to determine a complete solution (both past and future) for an entire space. The spacetime point or event where/when Alice makes her observation is one point on a Cauchy Surface which constitutes her "instant" (loosely speaking) throughout space. The info at that specific point comes only from her past light cone (assuming "forward" time). To predict her result, all the rest of the Cauchy Surface is irrelevant (classically) since by definition it can't causally affect her. If we're interested in solving Einstein Field Equation for the entire block universe we'd need it - but we're not.

So (ignoring Closed Timelike Curves) you're simply wrong. One does NOT need the entire Cauchy Surface (which includes detailed info on the Bullet Cluster, for instance, which won't affect her for 3.7 billion years) to predict what Alice's SG says about her particle today in a lab here on Earth!

A. Neumaier said:
it is no different in quantum mechanics when one makes (probabilistic) predictions.

Sorry, that's irrelevant, since my statement is about exact (implied by the word "completely"), not probabilistic, predictions.

A. Neumaier said:
The apex of the light cone is the point in space-time at which all information needed to do the statistics is available.

Correct. That apex is precisely Alice's measurement's spacetime event. All the info there is determined by her past light cone, and nothing else - classically.

To summarize - arguably, with GR, you can produce a contradiction to my statement. In a pathological spacetime one could argue that classical predictions at a point require info outside the past light cone - maybe. If so, I'll concede the point. The fact one must go to such extremes demonstrates the basic validity of my statement.

Cauchy Surfaces have nothing to do with the discussion of my statement (which, I claim, is quite illuminating), or of Bell-type experiments (excepting GR-related Orch-OR and Joy Christian :-), or of Gell-Mann's video. Let's not muddy the waters with irrelevancies, it's muddy enough already.
 
  • #366
The bottom line of what I'm saying is, the collapse of the wavefunction (the second type stevendaryl was talking about, not decoherence, which is mundane) occurs when we lose track of how our minds are processing information, how we are correlating and culling by a lot of information we take for granted as obvious to us. Each interpretation sees what is happening there differently. Many worlds refuses to imagine that our minds are playing a role and making choices about what to track and what to ignore, so the mind ends up trapped in a coherent subspace of a much larger but mutually incoherent reality that has no consequences for that subspace. Copenhagen sees the unknowns in what the mind is doing as reality itself, so the collapse is real because the mind just works that way. Bohm sees hidden information that we have no access to, which determines all these things. But the scientist only cares about the information processing, so it always comes back to what their mind is doing with the information they have.
 
  • #367
A. Neumaier said:
That makes an ensemble of bricks. I was talking about a single brick. Statistical mechanics makes assertions about every single brick.
Statistical mechanics makes assertions about the location of a brick? How does that work? I could see a claim that it makes assertions about the center of mass of a gas of free particles, but that's the kind of uncorrelated system you were talking about above-- that's not at all a brick. You still need to explain what these A operators are, and how they are uncorrelated in a brick.
A. Neumaier said:
This is a meaningless statement since nucleons are indistinguishable.
But notice that the way we model the core of the Sun does not care if the nucleons are distinguishable or not, which is exactly my point about the information that we choose to track. I wager that nothing you just said about fusion in the Sun would be different if the nuclei were distinguishable, since you never in any way invoked indistinguishability. If that is correct, it follows immediately that your objection is not relevant, it obfuscates the key issue here.
 
  • #368
A. Neumaier said:
If ##A_1,\ldots,A_N## are uncorrelated operators with the same standard deviation ##\sigma## then ##X:=N^{-1}(A_1+\ldots+A_N)## has standard deviation ##N^{-1/2}\sigma##, as a simple calculation reveals. The arguments in statistical mechanics are similar, except that they account (in many important instances) for the typical correlations between the ##A_k##.

Please point out where the argument is faulty. If successful, all books on statistical mechanics must be rewritten to account for your revolutionary insight.

The law of large numbers doesn't imply anything about the possibility or impossibility of superpositions of macroscopically distinguishable states. It doesn't imply anything about whether unitary evolution can result in such a superposition. It doesn't say anything of relevance to this discussion.
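For readers following the algebra, the ##N^{-1/2}\sigma## in the quoted statement is just the standard variance calculation for uncorrelated observables:
$$\operatorname{Var}(X)=\frac{1}{N^2}\operatorname{Var}\!\left(\sum_{k=1}^N A_k\right)=\frac{1}{N^2}\sum_{k=1}^N\operatorname{Var}(A_k)=\frac{N\sigma^2}{N^2}=\frac{\sigma^2}{N},$$
so ##X## has standard deviation ##\sigma/\sqrt{N}##. The second equality is exactly where the assumption of vanishing cross-covariances enters; whether the correlation terms in a real brick contribute only at lower order, as A. Neumaier argues above, is the point in dispute.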
 
  • #369
stevendaryl said:
The law of large numbers doesn't imply anything about the possibility or impossibility of superpositions of macroscopically distinguishable states. It doesn't imply anything about whether unitary evolution can result in such a superposition. It doesn't say anything of relevance to this discussion.

The issue is whether there is any reason to believe that the standard deviation of a macroscopic variable such as position must remain small. That seems like a bizarre claim. It's certainly not true classically.

Take the dynamics of some sufficiently complex classical system. In phase space, pick out a small neighborhood. All the relevant physical variables such as position will then have a small standard deviation, if the initial neighborhood is small enough. Now, let the system evolve with time. Typically, for complex systems, the evolution will result in the neighborhood being stretched out and distorted. If the system is ergodic, then that initially compact neighborhood will spread out until it is dense in the subspace of the phase space consisting of all points with the same values for conserved quantities such as energy and angular momentum, etc. There is absolutely no guarantee that the standard deviation will remain small.

I don't know why you [A. Neumaier] think that it will remain small in the case of quantum dynamics.
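As a quick numerical illustration of that spreading (a toy sketch using the fully chaotic logistic map in place of any realistic dynamics; the initial spread and iteration count are arbitrary choices of mine), an ensemble started in a tiny neighborhood has its standard deviation grow roughly exponentially until it saturates at the size of the accessible region:

[CODE=Python]
import numpy as np

rng = np.random.default_rng(1)

# Ensemble of initial conditions confined to a tiny neighborhood of "phase space"
x = 0.3 + 1e-9 * rng.standard_normal(100_000)

# Evolve with the fully chaotic logistic map x -> 4 x (1 - x)
for step in range(60):
    x = 4.0 * x * (1.0 - x)
    if step % 10 == 0:
        print(f"step {step:2d}: std = {x.std():.3e}")

# The standard deviation grows roughly exponentially (at a rate set by the
# Lyapunov exponent, ln 2 for this map) until it saturates near the spread of
# the invariant distribution on [0, 1] -- it does not stay small.
[/CODE]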
 
  • #370
Ken G said:
Statistical mechanics makes assertions about the location of a brick?
about the macroscopic properties of the brick in its rest frame; the motion of the rest frame itself is already given by classical mechanics, which applies to macroscopic objects with good accuracy. If you take a photodetector (with a pointer) in place of a brick, it makes predictions about the pointer location in the detector's rest frame, given the incident current once it has been sufficiently amplified. If you register a microscopic phenomenon, only the little subsystem that magnifies the microscopic event into a macroscopic one needs a more detailed stochastic treatment via decoherence, where the microscopic event is modeled by a true ensemble.
 
  • #371
stevendaryl said:
The law of large numbers doesn't imply anything about the possibility or impossibility of superpositions of macroscopically distinguishable states. It doesn't imply anything about whether unitary evolution can result in such a superposition. It doesn't say anything of relevance to this discussion.
In post #342 you said that ''(2) they have a small standard deviation'' was not explained by decoherence, and I argued that it was explained by statistical mechanics. Why did you bring it up if it wasn't relevant to this discussion?
stevendaryl said:
the standard deviation of a macroscopic variable such as position must remain small. That seems like a bizarre claim. It's certainly not true classically.
Classical statistical mechanics predicts this, too, for the center of mass of a single macroscopic solid body. The trajectory of the center of mass may be a complicated curve, but everyday experience already shows that the uncertainty in predicting the path is tiny, unless one plays billiards or the like, where the motion is ergodic. Even then it holds for short times, almost up to the order of the inverse first Lyapunov exponent. But ergodic motion is not the typical case; if it were, Galilei would never have found the dynamical laws based on which Newton formulated his mechanics. And life would probably be impossible.
 
  • #372
A. Neumaier said:
about the macroscopic properties of the brick in its rest frame; the motion of the rest frame itself is already given by classical mechanics, which applies to macroscopic objects with good accuracy.

The issue is this: Imagine a situation in which the eventual location of the brick is extremely sensitive to initial conditions. You can make up your own example, but maybe the brick is balanced on the end of a pole, and that pole is balanced on its end. The slightest push in any direction will result in the pole falling in that direction. If the setup is sensitive enough, then a random quantum event, such as the decay of a radioactive atom, could be used to influence the final location of the brick.

In such a circumstance, if you try to compute the probability distribution of the final location of the brick, then it will have a sizable standard deviation. So there is definitely no mechanism that confines the probability distributions of macroscopic objects to small standard deviations in position.

Now, maybe you want to say that the large standard deviation is due to our ignorance of the details of the initial conditions. That doesn't make any sense to me. What determines the final position is (by hypothesis) whether the atom decays or not. Of course, this is so far talking about a semi-classical notion of "standard deviation", where you treat the brick and pole classically. But I can't see how treating the brick quantum mechanically would make much difference. There is no guarantee that the standard deviation for position of a brick will remain small. It almost certainly will not, in cases where microscopic quantum events are amplified to have macroscopic consequences.
 
  • #373
A. Neumaier said:
In post #342 you said that ''(2) they have a small standard deviation'' was not explained by decoherence, and I argued that it was explained by statistical mechanics. Why did you bring it up if it wasn't relevant to this discussion?

Because what you were saying was false, and I was pointing out that it was false.
 
  • #374
A. Neumaier said:
Classical statistical mechanics predicts this, too, for the center of mass of a single macroscopic solid body.

No, it does not. If you have a chaotic system, and you have an ensemble of systems that initially are confined to a small region of phase space, then as time progresses, that region will spread out. The standard deviation for variables such as position will not remain small.
 
  • #375
A. Neumaier said:
about the macroscopic properties of the brick in its rest frame; the motion of the rest frame itself is already given by classical mechanics, which applies to macroscopic objects with good accuracy.
Exactly, and the way classical mechanics gives that accuracy is by bringing in and correlating with all kinds of extra information from the environment. It won't work at all for the bricks on trap doors unless you correlate the outcomes with that extra information! It's always information processing, even classically.
 
  • #376
stevendaryl said:
No, it does not. If you have a chaotic system, and you have an ensemble of systems that initially are confined to a small region of phase space, then as time progresses, that region will spread out. The standard deviation for variables such as position will not remain small.
Yes, chaos is an excellent alternative to quantum coupling for seeing this effect. In both cases, we always drive down the uncertainty by correlating with additional outside information, just as with the chaos in shuffling a deck of cards. We so automatically say, without even thinking about it or tracking it formally, "I don't know what the cards are, but I could if I just gained access to information I am not privy to but which has been determined," that we don't even realize one can always go from a diagonal density matrix to a definite outcome by correlating with additional information: you simply cull the outcomes into bins and, poof, the diagonal density matrix is a bunch of definite outcomes. We don't even seem to realize it is we who have accomplished that "collapse" via information correlation, yet we can tell this is true by simply not doing the correlation-- immediately we are right back to the diagonal density matrix, exactly as actual card players are.
 
  • #377
stevendaryl said:
In such a circumstance, if you try to compute the probability distribution of the final location of the brick, then it will have a sizable standard deviation. So there is definitely no mechanism that confines the probability distributions of macroscopic objects to small standard deviations in position.
See my answer here.
stevendaryl said:
If you have a chaotic system, and you have an ensemble of systems
Note that I was talking about a single solid body. Statistical mechanics of macroscopic bodies does not make any prediction for probabilities for what happens to a collection of single solid bodies.
 
  • #378
If the single body is a kite in the wind, then classical mechanics does not tell you where the kite will be a minute after its string breaks-- except to within a broad distribution that will have a large standard deviation. If we do a measurement of the kite's location a minute later, the uncertainty we face in predicting that outcome is no more avoidable than the uncertainty in an electron's location in an atom. So it's not about the theories we use, it is about the information we are plugging in as we go along. Decoherence removes the quantum coherences the electron would show, but that's not collapse-- the collapse still happens when we correlate with other information, in either case. Collapse is culling.
 
  • #379
Thecla said:
In this video ... Murray Gell-Mann discusses Quantum Mechanics and at 11:42 he discusses entanglement. At 14:45 he makes the following statement:

"People say loosely, crudely, wrongly that when you measure one of the photons it does something to the other one. It doesn't."

Do most physicists working in this field agree with the above statement ?

It's interesting to note that OP's question is already more-or-less answered by Gell-Mann's quote. Who are these "people" who "loosely, crudely, wrongly" disagree with him? They are, in fact, "physicists working in this field"! As the video mentions, they include other Nobel Prize winners. The truth is, most physicists don't buy his "Consistent Histories", which provides the justification for his stance.

What he should say is something like "Many other highly respected physicists believe this statement, but I don't. Currently there's no experimental evidence one way or the other; it's just a matter of opinion. Time will tell who's right."
 
  • #380
I think the essential disagreement between A. Neumaier vs. stevendaryl and Ken G can be described as follows. I'll use the language of "superposition" and "collapse" - because, when you get right down to it, it's the only interpretation I understand. A. Neumaier doesn't like that language, of course, but I hope he'll agree with the essence of my explanation.

Suppose a random event (radium decay) sends a brick to two very different locations: an unstable pole falls in a random direction, or a trap-door opens / doesn't. Before "collapse" occurs, we compute the average location of the brick (its center-of-mass) over the superposed states represented by ( |radium decays> + |radium doesn't decay> ). Now, stevendaryl figures the standard deviation of this average can become large. A. Neumaier says no, it remains very small by macro-world standards.

The essential difference is that A. Neumaier figures the collapse (a.k.a. "collapse", with scare quotes) will happen as soon as the superposed states decohere sufficiently, which will be very quick. Again, to be clear, he doesn't use that language, but - looking at it the Copenhagen way - I think that's a correct statement of his position. Therefore we'll never be averaging over two brick-positions that are macroscopically far apart.

stevendaryl, OTOH, figures the collapse won't happen automatically or spontaneously, just because of decoherence. Some sort of measurement event is required. Therefore we will be averaging over superposed brick positions separated by arbitrarily large macroscopic distances.

In A. Neumaier's approach we get sensible macro-world answers for brick locations and the standard deviation thereof, but not in the other approach.

Both positions are reasonable, on their own terms, and neither violates current experimental data. To end the fruitless dispute we must simply agree to wait for future experiments to decide. Unfortunately I have no idea how to design such experiments.
 
  • #381
secur said:
What he should say is something like "Many other highly respected physicists believe this statement, but I don't. Currently there's no experimental evidence one way or the other; it's just a matter of opinion. Time will tell who's right."

Yes, it's pretty mysterious how interpretation is the main source of dogmatism among physicists, when it is also the topic with the least chance of being subjected to experimental verification. Not even string theorists are this self-assured.
 
  • #382
secur said:
Both positions are reasonable, on their own terms, and neither violates current experimental data. To end the fruitless dispute we must simply agree to wait for future experiments to decide. Unfortunately I have no idea how to design such experiments.

Stevendaryl's position is more modest though, because he simply says something isn't proven - he doesn't exclude the other possibility.
 
  • #383
secur said:
The essential difference is that A. Neumaier figures the collapse (a.k.a. "collapse", with scare quotes) will happen as soon as the superposed states decohere sufficiently, which will be very quick. Again, to be clear, he doesn't use that language, but - looking at it the Copenhagen way - I think that's a correct statement of his position. Therefore we'll never be averaging over two brick-positions that are macroscopically far apart.
As I understand it, A. Neumaier argues about some averaging over all the particles of a macroscopic system ... but I don't get how it is relevant. And it does not seem that anybody else gets it.
 
  • #384
secur said:
What he should say is something like "Many other highly respected physicists believe this statement, but I don't. Currently there's no experimental evidence one way or the other; it's just a matter of opinion. Time will tell who's right."
You are of course right, though Gell-Mann is not known for diplomacy or humility! Still, I think one can go farther and still be correct-- one can add "in my opinion, the key lesson that quantum entanglement, a theory that of course will eventually be modified or replaced (not because it got the wrong answer for Bell experiments, which it didn't, but because that's what happens in physics), is trying to help us see, and thereby help motivate whatever will replace it, is not that the parts of the system influence other parts in nonlocal ways, but rather that the behavior of the full system is not well characterized in the first place by the concept of influences between its parts."

I don't know if Gell-Mann would agree to this, but it seems to me that the reason entanglement is not well characterized by influences between parts (or at least, it gets awkward in that area) is because the concept of influences between parts is itself a behavior that appears only due to the breakdown of entanglements. So our mission is not to understand how the parts influence each other when entangled, but rather to understand why we get away with imagining that parts influence each other when entanglement is absent. It's like with decoherence, our goal is not to figure out how coherences support superpositions, but rather how interactions diagonalize the density matrix. Only the Bohmians start with the definite outcomes and try to figure out how ignoring the pilot wave produces the illusion of populating coherences across a density matrix-- the rest of us take those off-diagonal coherences for granted and try to figure out how they went away in a measurement!
 
  • #385
Ken G said:
You are of course right, though Gell-Mann is not known for diplomacy or humility!

For me, that isn't really a problem in itself. I haven't read Gell-Mann's book which he refers to, so I'll have to consider only the posted video, and this is surely a limitation. In the video, Gell-Mann tries to characterize the objection that one can choose, say, the polarizer angle as something "to confuse us". But that is the crux of the whole matter! That's not a confusing objection, it's the problem that an interpretation must answer! It's fine if he has a really strong argument against some position, but first he must acknowledge the position. In fact, he says right away that the explanation is that the choices of the different angles belong to different histories - he jumps straight to his own interpretation. I think what we're doing here is a little different, because to argue that there's no "influence between parts" doesn't require any marriage with a specific interpretation but can be held on a more general basis instead, using fewer very reasonable assumptions.
 
  • #386
secur said:
Both positions are reasonable, on their own terms, and neither violates current experimental data. To end the fruitless dispute we must simply agree to wait for future experiments to decide. Unfortunately I have no idea how to design such experiments.
What I'm saying is that we already can do experiments to see what is happening here, but the experiments are on the scientists! All we have to do is watch how the scientist is using information, and we can observe exactly where collapse occurs-- it occurs when the elements in the density matrix (which is quickly diagonalized by decoherence) are correlated against other information, such as experimental outcomes. The correlations cull the data into subsets, which are regarded as independent and show small standard deviations within those subsets, but of course it was our choice to look only at the subsets in the first place. We can see exactly when the choice to do that culling occurred: it occurred when we culled the data into bins we call "what happened this time or that time." Quantum theory was never built to be culled that way; that's why it looks like collapse.
 
  • #387
ddd123 said:
In fact, he says right away that the explanation is that the choices of the different angles belong to different histories - he jumps straight to his own interpretation. I think what we're doing here is a little different, because to argue that there's no "influence between parts" doesn't require any marriage with a specific interpretation but can be held on a more general basis instead, using fewer very reasonable assumptions.
I agree, I think we can do better than just buy off on one interpretation and discard the rest. We can watch the process of ourselves doing experiments and correlating data, and see which parts of that process quantum mechanics is designed to treat, and which parts come from us in a more "manual" kind of way. It is the parts we take for granted that create our confusion, so different interpretations get confused at different places because they take different things for granted. Just as you noticed where Gell-Mann plugged in an interpretation, we need to see all of our modes of experimentation and analysis as examples of interpretation choices.
 
  • #388
Ken G said:
The correlations cull the data into subsets, which are regarded as independent and show small standard deviations within those subsets, but of course it was our choice to look only at the subsets in the first place. We can see exactly when the choice to do that culling occurred: it occurred when we culled the data into bins we call "what happened this time or that time." Quantum theory was never built to be culled that way; that's why it looks like collapse.

What do you think about this other paper that's been posted earlier: https://arxiv.org/abs/1412.6987

Specifically the argument at chapter 9.1, do you think it is more or less what you're saying now? Just curious, because I was unable to form an opinion on this paper.
 
  • #389
I do see some parallels. I believe the author is making the case that Kolmogorov's approach to probability was just one type of analysis, like choosing Euclid's approach for processing geometric information. But neither can be said to be an "absolute" structure that is axiomatic to reality; instead we use them when they work and discard them when they don't. We can also gain an understanding of the requirements needed for them to be useful, but other types of probability analysis may be needed to account for things like irrationality in the players of a game-- you could have weird correlations show up that would not appear in a formal analysis in which all players were rational. The parallel I see is that he seems to be saying that there are no "absolute probabilities"; rather, probabilities are what you make of them based on your assumptions and constraints, and more bizarre probability structures may work better in some contexts, which you cannot always tell in advance without very carefully tracking which assumptions are valid. That seems to jibe with the perspective of Scott Aaronson that bhobba often cites-- that you can understand quantum theory by using a probability structure that allows probabilities to be negative at various places in the calculation, but which never end up negative when you combine them into a final result. That would be anti-axiomatic for a Kolmogorov probability structure, much like a triangle with three right angles would be anti-axiomatic for Euclid.

So I agree that many of the paradoxes we get in QT are when we try to plug square pegs into round holes, as with a particular version of probability theory, but I suspect the problem traces more specifically to us taking for granted certain steps in the data analysis, steps that we did not think needed to be included in the formalism because they were just so obviously the way we think about things. Perhaps these steps in the scientific method are as obvious to us as Euclidean geometry, so shouldn't need to be included in the axioms of the formal system-- making the formal system incomplete and vulnerable to paradoxes like collapses and nonlocal influences.
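As a minimal illustration of the "negative only in the intermediate steps" idea (a toy quasi-probability table made up for illustration-- it is not Aaronson's actual construction, nor a genuine Wigner function), the joint table below has a negative entry, yet every marginal one could actually observe is an ordinary probability distribution:

[CODE=Python]
import numpy as np

# Toy 2x2 quasi-probability table over two binary variables (a, b).
# One entry is negative, so it is not a Kolmogorov joint distribution...
W = np.array([[0.5, -0.1],
              [0.1,  0.5]])

print("total:", W.sum())                  # 1.0, as required
print("marginal for a:", W.sum(axis=1))   # [0.4, 0.6] -> valid probabilities
print("marginal for b:", W.sum(axis=0))   # [0.6, 0.4] -> valid probabilities
# ...yet everything one actually "measures" here (the marginals) is non-negative,
# so the negativity lives only in the intermediate bookkeeping.
[/CODE]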
 
  • #390
secur said:
The essential difference is that A. Neumaier figures the collapse (a.k.a. "collapse", with scare quotes) will happen as soon as the superposed states decohere sufficiently, which will be very quick. Again, to be clear, he doesn't use that language, but - looking at it the Copenhagen way - I think that's a correct statement of his position. Therefore we'll never be averaging over two brick-positions that are macroscopically far apart.

stevendaryl, OTOH, figures the collapse won't happen automatically or spontaneously, just because of decoherence. Some sort of measurement event is required. Therefore we will be averaging over superposed brick positions separated by arbitrarily large macroscopic distances.

I'm not actually making any wild claim about what happens. I believe that whenever you look at a brick, it will be in a more or less definite location (to within some tiny standard deviation). So I believe the same thing as A. Neumaier about what actually happens. The disagreement is over whether what actually happens is (easily) explained by quantum mechanics without collapse.

To me, there are several alternative explanations for why a brick is in a sort-of definite position at all time, and they are all sort-of plausible to me:
  1. The Bohmian explanation: all particles have definite positions at all times, and so of course a brick does, as well.
  2. The Many-Worlds explanation: the brick doesn't actually have a definite position, but within a single "branch" of the universal wave function, it does have a definite position.
  3. The collapse explanation: as soon as you measure a brick's location (or look at it), the brick's wave function collapses into a state of definite position.
A. Neumaier seems to be denying all three possible explanations, and claiming that ordinary quantum mechanics, without collapse, predicts that the brick is in a (more or less) definite location at all times. That to me is completely implausible, and in my opinion, probably provably wrong. (Not provable by me, but maybe by someone smarter than me.)
 
  • #391
vanhees71 said:
It's definitely not true that I think that all the "founding fathers" of QT are wrong or haven't understood their own theory. Ballentine, in my opinion, also follows just standard QT. He's even emphasizing the bare physics content of it, and there's no contradiction between the "minimal interpretation" and the Copenhagen flavor without collapse. As I said, I never understood Bohr completely, but as far as I can see he had a pretty similar view, taking the quantum states as epistemic.

If you believed the quantum state were epistemic, you would not object to collapse, and you would not object to collapse conceived as nonlocal.
 
  • #392
It requires no "belief" to regard the state as epistemic, that can be observed simply by watching a physicist apply the concept. So it only requires belief that the demonstrably epistemic use of the state concept corresponds to, or represents, something ontic. One chooses to either believe that if it works epistemically, there must be an ontic reason for that, or else one looks at things like Newton's force of gravity, which might not seem so ontic after all, and just says "oh yeah, it's a concept."
 
  • #393
ddd123 said:
Yes, it's pretty mysterious how interpretation is the main source of dogmatism among physicists, when it is also the topic with the least chance of being subjected to experimental verification. Not even string theorists are this self-assured.

Actually I'd say string theorists are that self-assured! Anyway it's not that mysterious. It's precisely when your case is weak that you can't give an inch. Lawyers, politicians, rhetoricians and debaters know this well. If you can't blind them with brilliance, obfuscate. And always remember: ad hominem is your friend.

Ken G said:
You are of course right, though Gell-Mann is not known for diplomacy or humility! Still, I think one can go farther and still be correct-- one can add "in my opinion, the key lesson that quantum entanglement, a theory that of course will eventually be modified or replaced ... the behavior of the full system is not well characterized in the first place by the concept of influences between its parts."

That would be fine also. Just acknowledge that your position is not proven, with a phrase like "in my opinion"; after that you can be as assertive as you like. But when you insist that anyone who disagrees is "loose, crude and wrong", civilized discussion becomes impossible. Think that, but don't say it.

Brief personal aside: both my parents were diplomats, so it's in my DNA :-)

Your "holistic" idea is attractive - something like that must be right. It's more-or-less compatible with any interpretation, if you look at it the right way (although you may not agree). But I don't accept that the state is entirely epistemic. Can't formulate a clear objection yet, though.

stevendaryl said:
I'm not actually making any wild claim about what happens.

Didn't mean to imply you made a wild claim.

stevendaryl said:
The disagreement is over whether what actually happens is (easily) explained by quantum mechanics without collapse.

I figure A. Neumaier must be postulating some process that corresponds to collapse. Something like GRW, maybe. I wish he'd say "IMO there is no collapse, but there is a process which you mistake for collapse. It happens as follows: (*** insert explanation here ***)".

One of these days I'll study his paper on the subject. I glanced at it and no question, it contains a lot of good stuff. If he'd provide some explanatory comments using the standard language it would be easier to absorb. Use terms like what-we-mistakenly-call-collapse (wwmc-collapse) if you like. Compare and contrast to MWI, GRW or whatever, as applicable.

Your objection - assuming I understand it - is correct. Decoherence can (arguably) explain why off-diagonal terms get close to zero, eliminating interference. So far, only unitary evolution is required. But it doesn't address, at all, why we wind up seeing one particular outcome and not others. That's the vital issue.
 
  • #394
Ken G said:
... Gell-Mann is not known for diplomacy or humility!
Well no, I would say he's not... lol
Murray Gell-Mann said:
If I have seen further than others, it is because I am surrounded by dwarfs.
Some more Murray Gell-Mann classic quotes...
secur said:
Actually I'd say string theorists are that self-assured!
Some certainly are ... or were... ?

Continue! - excellent thread! ... it was not my intention to butt in... :blushing:
 
  • #395
OCR said:
Continue! - excellent thread!

Yes - I'd second that - and also add my thanks to the many contributors. Wonderful stuff :smile:
 
  • #396
...looks like anyone can make mistakes
 
  • #397
  • #398
OCR said:
Those quotes give us a clear look at how Gell-Mann thinks about entanglement:
"If on one branch of history, the plane polarization of one photon is measured and thereby specified with certainty, then on the same branch of history the circular polarization of the other photon is also specified with certainty. On a different branch of history the circular polarization of one of the photons may be measured, in which case the circular polarization of both photons is specified with certainty. On each branch, the situation is like that of Bertlmann's socks"

So he sees classical consistent histories woven together into a whole that exhibits bizarre correlations: the entanglement is not between the contributing parts of the system, but rather between the contributing histories of the whole system. It's an interesting take on "holism"-- the "whole thing" is this entangled history. I'm not sure that's any less bizarre than entangling the parts of the system, but either way, the main lesson of entanglement is that the whole is not a simple amalgamation of parts, and the amalgamation is not well characterized by a simple sum with "influences between parts" enforcing the emergent properties. Instead, an "influence" is merely a decohered version of those more general entanglements of histories. Of course, if one rejects the idea that "histories" can be different things that come together to support a classical concept of a single decohered history (a "collapse" of histories, if you like), then to those people Gell-Mann's view is just as objectionable as Copenhagen's view of collapse. I'll bet Gell-Mann has the same problem with the question "but how does the history we perceive get culled from all the others" that Bohr had with "but how does the outcome we perceive get culled from all those that could have happened." Either way, I return to my earlier conclusion that collapse is culling.
 
  • #399
Thecla said:
"People say loosely ,crudely,wrongly that when you measure one of the photons it does something to the other one. It doesn't."
I am wondering if the interpretation I present below is of any use to anyone other than myself.

A measurement with respect to one particle does not have any effect on any property of the other particle. However, it does affect the probability distribution of the expected value of a future measurement with respect to the other particle.

Consider the following experimental environment. There is a large collection of urns, and N balls, some white and some black, are put into each of them. For each urn, the number of balls of each color, W and B, with W+B=N, is randomly selected from a given probability distribution. A measurement involves randomly choosing an urn and randomly drawing K balls without replacement. A second, future measurement involves randomly drawing K more balls from the same urn. Looking at the color of the balls from the first drawing has no effect on the color of any of the remaining balls in the urn. However, it does affect the probability distribution of the expected number of white (or black) balls for the second measurement.

Regards,
Buzz
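Here is a small Monte Carlo sketch of that urn setup (the values of N and K and the uniform prior over compositions are illustrative choices, not anything from the post above beyond its general structure). It shows the expected number of white balls in the second draw shifting once we condition on what the first draw revealed, even though no ball in the urn is disturbed:

[CODE=Python]
import numpy as np

rng = np.random.default_rng(2)

N, K = 20, 5            # balls per urn, balls per draw (illustrative)
TRIALS = 200_000

# Prior over the urn composition: the number of white balls W is uniform on 0..N
W = rng.integers(0, N + 1, TRIALS)

# First draw of K balls without replacement, then a second draw of K from what remains
first_whites = rng.hypergeometric(W, N - W, K)
second_whites = rng.hypergeometric(W - first_whites,
                                   (N - W) - (K - first_whites),
                                   K)

print("E[whites in 2nd draw], unconditional:       ", second_whites.mean())
print("E[whites in 2nd draw | 1st draw all white]: ",
      second_whites[first_whites == K].mean())
print("E[whites in 2nd draw | 1st draw all black]: ",
      second_whites[first_whites == 0].mean())
[/CODE]

With this uniform prior the unconditional expectation is K/2; conditioning on an all-white or all-black first draw pulls it up or down purely through updated information about the urn, while the remaining balls are untouched.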
 
  • #400
Buzz Bloom said:
A second, future measurement involves randomly drawing K more balls from the same urn. Looking at the color of the balls from the first drawing has no effect on the color of any of the remaining balls in the urn. However, it does affect the probability distribution of the expected number of white (or black) balls for the second measurement.
It doesn't sound like the ontology you are picturing to demonstrate that interpretation would exhibit correlations that violate the Bell inequality. So that's really the rub here-- it's not that we can't picture an ontology that would allow the outcome of one measurement to alter our expectations for another, it's that we can't picture an ontology that does it in a way that can violate the Bell inequality without there being any influences between the parts of the system, or something else strange going on (like the whole system being more than the simple sum of its parts).
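To make that "rub" concrete, here is a rough sketch (a toy local model of my own, with each side's outcome fixed deterministically by its own setting and a shared hidden angle-- not a simulation of any particular experiment) comparing the CHSH combination for such a no-influence model with the quantum singlet-state prediction:

[CODE=Python]
import numpy as np

rng = np.random.default_rng(3)

# Detector settings (radians) that maximize the quantum CHSH value
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

def lhv_correlation(alpha, beta, n=500_000):
    """Toy local model: each pair carries a hidden angle lam, and each side's
    +/-1 outcome depends only on its own setting and lam (no influence between parts)."""
    lam = rng.uniform(0.0, 2.0 * np.pi, n)
    A = np.sign(np.cos(alpha - lam))
    B = -np.sign(np.cos(beta - lam))
    return float((A * B).mean())

def quantum_correlation(alpha, beta):
    """Singlet-state prediction E(alpha, beta) = -cos(alpha - beta)."""
    return -np.cos(alpha - beta)

def chsh(E):
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

print("CHSH, toy local model:", chsh(lhv_correlation))     # ~2, at the classical bound
print("CHSH, quantum value  :", chsh(quantum_correlation)) # 2*sqrt(2) ~ 2.83
[/CODE]

Any model of this no-influence type obeys ##|S|\le 2##; this particular one saturates that bound but cannot reach the quantum value ##2\sqrt{2}##, and that gap, rather than the mere updating of expectations, is what the Bell inequality is about.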
 