Murray Gell-Mann on Entanglement

  • Context: Undergrad 
  • Thread starter: Thecla
  • Tags: Entanglement
SUMMARY

Murray Gell-Mann asserts that measuring one photon does not affect the other in an entangled pair, a statement that has sparked debate among physicists. While many agree with Gell-Mann's interpretation, the discussion highlights the complexities of quantum mechanics, particularly regarding non-locality and the collapse of the wave function. The conversation emphasizes that interpretations of entanglement are varied and often depend on individual perspectives on quantum theory, particularly in relation to relativistic quantum field theory (QFT) and hidden variables.

PREREQUISITES
  • Understanding of Quantum Mechanics principles
  • Familiarity with Quantum Entanglement concepts
  • Knowledge of Relativistic Quantum Field Theory (QFT)
  • Awareness of Bell's Theorem and its implications
NEXT STEPS
  • Explore the implications of Bell's Theorem in quantum mechanics
  • Study the concept of wave function collapse in quantum systems
  • Investigate different interpretations of quantum mechanics, including hidden variable theories
  • Learn about decoherent histories and their role in quantum theory
USEFUL FOR

Physicists, quantum mechanics students, and researchers interested in the foundations of quantum theory and the philosophical implications of entanglement and measurement.

  • #361
A. Neumaier said:
You may always be talking about your information about the system. But physics models the behavior of systems independent of anyone's information. The nuclear processes inside the Sun happen in the way modeled by physics even though nobody has ever looked inside.
Ah, but look more carefully at what you are saying here. Does what you mean by the nuclear processes in the Sun include which nucleons have fused and which ones haven't, or do you just mean what you care about regarding the Sun: the total amount of fusion energy that has been released? You have to know what information you care about before you can assert what you mean by the fusion processes, and this contradicts your claim that physics models the behavior independently of our information. Look at an actual model of the core of the Sun, and what you will instantly see is that nowhere does that model include which nucleons have fused and which ones haven't, so it's still just a density matrix in that model! We model what we care about, is that not always so? That's why it is always about the information we are choosing to track.
 
  • #362
A. Neumaier said:
The single cat is not an ensemble - it remains all the time macroscopic and its measurable aspects therefore have a small standard deviation.
I'm not talking about our language about the brick, I'm talking about setting up an experiment and looking at the standard deviation in the outcome. It's purely observational, it's not some picture we are invoking. The experiment is a thousand bricks on trap doors attached to unstable nuclei with a half-life of an hour, followed by measurements of how far the bricks have been displaced after 1 hour. That will produce a distribution (in this case bimodal) with a large standard deviation, even though all the systems are prepared identically. The only way to reduce that standard deviation is to cull the data based on other information, like turning the lights on, watching which trap doors trigger, and correlating via that new information. Of course that's just what we do, but the problem is-- we forget that that's what we do! We lose track of our own role, the role of how we are culling and correlating the data based on other information we are in some sense taking for granted. But a complete description should never take anything for granted; every step, every correlation being used, must be tabulated and explicitly included. Then there's no difference between classical and quantum systems any more, except the decoherence and the richness of the additional information, both of which are perfectly natural ramifications of macro systems.
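The trap-door experiment can be sketched as a toy Monte Carlo. All numbers here (the 3 m drop, the seed) are made up for illustration and are not from the thread:

```python
import random
import statistics

random.seed(0)
N = 1000          # bricks, each resting on a trap door wired to one unstable nucleus
HALF_LIFE = 1.0   # hours
T = 1.0           # observation time in hours
DROP = 3.0        # metres a brick falls if its trap door triggers

p_decay = 1 - 2 ** (-T / HALF_LIFE)   # = 0.5 after one half-life
decayed = [random.random() < p_decay for _ in range(N)]
displacement = [DROP if d else 0.0 for d in decayed]

# Identically prepared systems, yet the outcome distribution is bimodal
# with a large standard deviation...
sigma_all = statistics.pstdev(displacement)

# ...until we cull on extra information (which trap doors we saw trigger):
sigma_triggered = statistics.pstdev([x for x, d in zip(displacement, decayed) if d])
sigma_silent = statistics.pstdev([x for x, d in zip(displacement, decayed) if not d])

print(sigma_all > 1.0, sigma_triggered, sigma_silent)  # True 0.0 0.0
```

Within each culled bin the standard deviation is exactly zero; the large spread only belongs to the uncorrelated description.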
 
  • #363
Ken G said:
which nucleons have fused and which ones haven't
This is a meaningless statement since nucleons are indistinguishable.
 
  • #364
Ken G said:
The experiment is a thousand bricks
That makes an ensemble of bricks. I was talking about a single brick. Statistical mechanics makes assertions about every single brick.
 
  • #365
secur said:
... the essential peculiarity of QM, compared to classical, is that in order to completely predict results, info beyond the past light cone is required.

A. Neumaier said:
One needs the information on a Cauchy surface, not on the past light cone, to make predictions. More precisely, to predict classically what happens at a point in the future of a given observer, the latter's present defines (at least in sufficiently nice spacetimes) a Cauchy surface where all information must be available to infer the desired information.

First let's get Closed Timelike Curves out of the way. It occurs to me that their presence might vitiate my statement, depending on how you look at it. They're not supported by experiment and thoroughly irrelevant to this discussion. So let's ignore such pathological spacetimes.

Then we can, as you say, define a Cauchy Surface for any observer, for instance Alice or Bob in typical Bell experiments. But this contributes nothing to the discussion.

Cauchy Surface is used to formulate an "Initial" Value Problem in GR or SR, to determine a complete solution (both past and future) for an entire space. The spacetime point or event where/when Alice makes her observation is one point on a Cauchy Surface which constitutes her "instant" (loosely speaking) throughout space. The info at that specific point comes only from her past light cone (assuming "forward" time). To predict her result, all the rest of the Cauchy Surface is irrelevant (classically) since by definition it can't causally affect her. If we're interested in solving Einstein Field Equation for the entire block universe we'd need it - but we're not.

So (ignoring Closed Timelike Curves) you're simply wrong. One does NOT need the entire Cauchy Surface (which includes detailed info on the Bullet Cluster, for instance, which won't affect her for 3.7 billion years) to predict what Alice's Stern-Gerlach apparatus says about her particle today in a lab here on Earth!

A. Neumaier said:
it is no different in quantum mechanics when one makes (probabilistic) predictions.

Sorry, that's irrelevant, since my statement is about exact (implied by the word "completely"), not probabilistic, predictions.

A. Neumaier said:
The apex of the light cone is the point in space-time at which all information needed to do the statistics is available.

Correct. That apex is precisely Alice's measurement's spacetime event. All the info there is determined by her past light cone, and nothing else - classically.

To summarize - arguably, with GR, you can produce a contradiction to my statement. In a pathological spacetime one could argue that classical predictions at a point require info outside the past light cone - maybe. If so, I'll concede the point. The fact one must go to such extremes demonstrates the basic validity of my statement.

Cauchy Surfaces have nothing to do with the discussion of my statement (which, I claim, is quite illuminating), or of Bell-type experiments (excepting GR-related Orch-OR and Joy Christian :-), or of Gell-Mann's video. Let's not muddy the waters with irrelevancies; they're muddy enough already.
 
  • #366
The bottom line of what I'm saying is, the collapse of the wavefunction (the second type stevendaryl was talking about, not decoherence which is mundane) occurs when we lose track of how our minds are processing information, how we are correlating and culling by a lot of information we take for granted as obvious to us. Each interpretation sees what is happening there differently. Many worlds refuses to imagine that our minds are playing a role and making choices about what to track and what to ignore, so then the mind is trapped in a coherent subspace of a much larger but mutually incoherent reality that has no consequences for that subspace. Copenhagen sees the unknowns in what the mind is doing as reality itself, so the collapse is real because the mind just works that way. Bohm sees hidden information that we have no access to that determines all these things. But the scientist only cares about the information processing, so it always comes back to what their mind is doing with the information they have.
 
  • #367
A. Neumaier said:
That makes an ensemble of bricks. I was talking about a single brick. Statistical mechanics makes assertions about every single brick.
Statistical mechanics makes assertions about the location of a brick? How does that work? I could see a claim that it makes assertions about the center of mass of a gas of free particles, but that's the kind of uncorrelated system you were talking about above-- that's not at all a brick. You still need to explain what these A operators are, and how they are uncorrelated in a brick.
A. Neumaier said:
This is a meaningless statement since nucleons are indistinguishable.
But notice that the way we model the core of the Sun does not care if the nucleons are distinguishable or not, which is exactly my point about the information that we choose to track. I wager that nothing you just said about fusion in the Sun would be different if the nuclei were distinguishable, since you never in any way invoked indistinguishability. If that is correct, it follows immediately that your objection is not relevant, it obfuscates the key issue here.
 
  • #368
A. Neumaier said:
If ##A_1,\ldots,A_N## are uncorrelated operators with the same standard deviation ##\sigma##, then ##X:=N^{-1}(A_1+\ldots+A_N)## has standard deviation ##N^{-1/2}\sigma##, as a simple calculation reveals. The arguments in statistical mechanics are similar, except that they account (in many important instances) for the typical correlations between the ##A_k##.

Please point out where the argument is faulty. If successful, all books on statistical mechanics must be rewritten to account for your revolutionary insight.

The law of large numbers doesn't imply anything about the possibility or impossibility of superpositions of macroscopically distinguishable states. It doesn't imply anything about whether unitary evolution can result in such a superposition. It doesn't say anything of relevance to this discussion.
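As an aside, the ##N^{-1/2}## scaling in the quoted formula is itself uncontroversial and easy to check numerically. A toy sketch, treating the ##A_k## as independent Gaussian random variables (all numbers illustrative):

```python
import math
import random
import statistics

random.seed(1)
N = 100        # number of uncorrelated quantities A_1 ... A_N
SIGMA = 2.0    # their common standard deviation
TRIALS = 5000

# Sample X = (A_1 + ... + A_N) / N many times and estimate its spread.
xs = [sum(random.gauss(0.0, SIGMA) for _ in range(N)) / N for _ in range(TRIALS)]
sigma_x = statistics.pstdev(xs)

# sigma_x should come out close to SIGMA / sqrt(N) = 0.2
print(sigma_x, SIGMA / math.sqrt(N))
```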
 
  • #369
stevendaryl said:
The law of large numbers doesn't imply anything about the possibility or impossibility of superpositions of macroscopically distinguishable states. It doesn't imply anything about whether unitary evolution can result in such a superposition. It doesn't say anything of relevance to this discussion.

The issue is whether there is any reason to believe that the standard deviation of a macroscopic variable such as position must remain small. That seems like a bizarre claim. It's certainly not true classically.

Take the dynamics of some sufficiently complex classical system. In phase space, pick out a small neighborhood. All the relevant physical variables such as position will then have a small standard deviation, if the initial neighborhood is small enough. Now, let the system evolve with time. Typically, for complex systems, the evolution will result in the neighborhood being stretched out and distorted. If the system is ergodic, then that initially compact neighborhood will spread out until it is dense in the subspace of the phase space consisting of all points with the same values for conserved quantities such as energy and angular momentum, etc. There is absolutely no guarantee that the standard deviation will remain small.

I don't know why you [A. Neumaier] think that it will remain small in the case of quantum dynamics.
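The phase-space spreading described above can be illustrated with a toy chaotic system. The logistic map at r = 4 is my choice of example, not from the thread; the ensemble starts packed into a tiny neighborhood and its standard deviation grows to the size of the accessible interval:

```python
import random
import statistics

random.seed(2)
# Ensemble of identically prepared systems: initial conditions packed into
# a tiny "phase-space neighborhood" of width 1e-8 around x = 0.3.
ensemble = [0.3 + random.uniform(-5e-9, 5e-9) for _ in range(2000)]
sigma_initial = statistics.pstdev(ensemble)

# Evolve each member under the fully chaotic logistic map x -> 4x(1-x).
for _ in range(60):
    ensemble = [4 * x * (1 - x) for x in ensemble]
sigma_final = statistics.pstdev(ensemble)

# The initially tiny standard deviation grows to order the size of the whole
# accessible interval [0, 1] -- it does not remain small.
print(sigma_initial, sigma_final)
```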
 
  • #370
Ken G said:
Statistical mechanics makes assertions about the location of a brick?
about the macroscopic properties of the brick in its rest frame; the motion of the rest frame itself is already given by classical mechanics, which applies to macroscopic objects with good accuracy. If you take a photodetector (with a pointer) in place of a brick it makes predictions about the pointer location in the detector's rest frame given the incident current after it was magnified enough. If you register a microscopic phenomenon, only the little subsystem that magnifies the microscopic event to a macroscopic one needs a more detailed stochastic treatment via decoherence, where the microscopic event is modeled by a true ensemble.
 
  • #371
stevendaryl said:
The law of large numbers doesn't imply anything about the possibility or impossibility of superpositions of macroscopically distinguishable states. It doesn't imply anything about whether unitary evolution can result in such a superposition. It doesn't say anything of relevance to this discussion.
In post #342 you said that ''(2) they have a small standard deviation'' was not explained by decoherence, and I argued that it was explained by statistical mechanics. Why did you bring it up if it wasn't relevant to this discussion?
stevendaryl said:
the standard deviation of a macroscopic variable such as position must remain small. That seems like a bizarre claim. It's certainly not true classically.
Classical statistical mechanics predicts this, too, for the center of mass of a single macroscopic solid body. The trajectory of the center of mass may be a complicated curve, but everyday experience already shows that the uncertainty in predicting the path is tiny, unless one plays billiards or the like, where the motion is ergodic. Even then it holds for short times, almost up to the order of the inverse first Lyapunov exponent. But ergodic motion is not the typical case; if it were, Galilei would never have found the dynamical laws on the basis of which Newton formulated his mechanics. And life would probably be impossible.
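The predictability horizon set by the first Lyapunov exponent can be sketched with made-up numbers (none taken from the thread). Since an initial error grows like ##\delta_0 e^{\lambda t}##, the horizon grows only logarithmically in the initial precision:

```python
import math

# Illustrative, assumed numbers:
lyapunov = 2.0      # 1/s, first Lyapunov exponent of the ergodic motion
delta0 = 1e-9       # m, initial uncertainty in the center of mass
tolerance = 1e-2    # m, prediction error we are willing to tolerate

# delta(t) ~ delta0 * exp(lyapunov * t)  =>  horizon t* where delta(t*) = tolerance
t_horizon = math.log(tolerance / delta0) / lyapunov
print(t_horizon)  # about 8 seconds: 7 orders of magnitude of precision buy ~8 s
```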
 
  • #372
A. Neumaier said:
about the macroscopic properties of the brick in its rest frame; the motion of the rest frame itself is already given by classical mechanics, which applies to macroscopic objects with good accuracy.

The issue is this: Imagine a situation in which the eventual location of the brick is extremely sensitive to initial conditions. You can make up your own example, but maybe the brick is balanced on the end of a pole, and that pole is balanced on its end. The slightest push in any direction will result in the pole falling in that direction. If the setup is sensitive enough, then a random quantum event, such as the decay of a radioactive atom, could be used to influence the final location of the brick.

In such a circumstance, if you try to compute the probability distribution of the final location of the brick, then it will have a sizable standard deviation. So there is definitely no mechanism that confines probability distributions for macroscopic objects to small standard deviations in position.

Now, maybe you want to say that the large standard deviation is due to our ignorance of the details of the initial conditions. That doesn't make any sense to me. What determines the final position is (by hypothesis) whether the atom decays or not. Of course, this is so far talking about a semi-classical notion of "standard deviation", where you treat the brick and pole classically. But I can't see how treating the brick quantum mechanically would make much difference. There is no guarantee that the standard deviation for position of a brick will remain small. It almost certainly will not, in cases where microscopic quantum events are amplified to have macroscopic consequences.
 
  • #373
A. Neumaier said:
In post #342 you said that ''(2) they have a small standard deviation'' was not explained by decoherence, and I argued that it was explained by statistical mechanics. Why did you bring it up if it wasn't relevant to this discussion?

Because what you were saying was false, and I was pointing out that it was false.
 
  • #374
A. Neumaier said:
Classical statistical mechanics predicts this, too, for the center of mass of a single macroscopic solid body.

No, it does not. If you have a chaotic system, and you have an ensemble of systems that initially are confined to a small region of phase space, then as time progresses, that region will spread out. The standard deviation for variables such as position will not remain small.
 
  • #375
A. Neumaier said:
about the macroscopic properties of the brick in its rest frame; the motion of the rest frame itself is already given by classical mechanics, which applies to macroscopic objects with good accuracy.
Exactly, and the way classical mechanics gives that accuracy is by bringing in and correlating with all kinds of extra information from the environment. It won't work at all for the bricks on trap doors unless you correlate the outcomes with that extra information! It's always information processing, even classically.
 
  • #376
stevendaryl said:
No, it does not. If you have a chaotic system, and you have an ensemble of systems that initially are confined to a small region of phase space, then as time progresses, that region will spread out. The standard deviation for variables such as position will not remain small.
Yes, chaos is an excellent alternative to quantum coupling for seeing this effect. In both cases, we always drive down the uncertainty by correlating with additional outside information, just as with the chaos in shuffling a deck of cards. We so automatically say "I don't know what the cards are, but I could if I just gained access to information I am not privy to but which has been determined," without even thinking about it or tracking it formally, that we don't even realize you can always go from a diagonal density matrix to a definite outcome by correlating with additional information: you simply cull the outcomes into bins and, poof, the diagonal density matrix is a bunch of definite outcomes. We don't even seem to realize it is we who have accomplished that "collapse" via information correlation, but we can tell that is true by simply not doing the information correlation: immediately we are right back to the diagonal density matrix, exactly like actual card players do.
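The culling step being described can be written out as a toy calculation (the labels and the 50/50 split are hypothetical):

```python
# A decohered (diagonal) density matrix for the brick: 50% "dropped", 50% "stayed".
rho = {"dropped": 0.5, "stayed": 0.5}

def condition(rho, record):
    """Correlate with an external record: keep only the matching bin and renormalize."""
    total = sum(p for outcome, p in rho.items() if outcome == record)
    return {outcome: (p / total if outcome == record else 0.0)
            for outcome, p in rho.items()}

# Before correlating with extra information: a genuine mixture.
# After: a definite outcome -- the "collapse" is just this culling step,
# and undoing it (ignoring the record) gives back the original mixture.
rho_after = condition(rho, "dropped")
print(rho_after)  # {'dropped': 1.0, 'stayed': 0.0}
```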
 
  • #377
stevendaryl said:
In such a circumstance, if you try to compute the probability distribution of the final location of the brick, then it will have a sizable standard deviation. So there is definitely no mechanism that confines probability distributions for macroscopic objects to small standard deviations in position.
See my answer here.
stevendaryl said:
If you have a chaotic system, and you have an ensemble of systems
Note that I was talking about a single solid body. Statistical mechanics of macroscopic bodies does not make any prediction for probabilities for what happens to a collection of single solid bodies.
 
  • #378
If the single body is a kite in the wind, then classical mechanics does not tell you where the kite will be a minute after its string breaks-- except to within a broad distribution that will have a large standard deviation. If we do a measurement of the kite's location a minute later, the uncertainty we face in predicting that outcome is no more avoidable than the uncertainty in an electron's location in an atom. So it's not about the theories we use, it is about the information we are plugging in as we go along. Decoherence removes the quantum coherences the electron would show, but that's not collapse-- the collapse still happens when we correlate with other information, in either case. Collapse is culling.
 
  • #379
Thecla said:
In this video ... Murray Gell-Mann discuses Quantum Mechanics and at 11:42 he discuses entanglement. At 14:45 he makes the following statement:

"People say loosely, crudely, wrongly that when you measure one of the photons it does something to the other one. It doesn't."

Do most physicists working in this field agree with the above statement ?

It's interesting to note that OP's question is already more-or-less answered by Gell-Mann's quote. Who are these "people" who "loosely, crudely, wrongly" disagree with him? They are, in fact, "physicists working in this field"! As the video mentions, they include other Nobel Prize winners. The truth is, most physicists don't buy his "Consistent Histories", which provides the justification for his stance.

What he should say is something like "Many other highly respected physicists believe this statement, but I don't. Currently there's no experimental evidence one way or the other; it's just a matter of opinion. Time will tell who's right."
 
  • #380
I think the essential disagreement between A. Neumaier vs. stevendaryl and Ken G can be described as follows. I'll use the language of "superposition" and "collapse" - because, when you get right down to it, it's the only interpretation I understand. A. Neumaier doesn't like that language, of course, but I hope he'll agree with the essence of my explanation.

Suppose a random event (radium decay) sends a brick to two very different locations: an unstable pole falls in a random direction, or a trap-door opens / doesn't. Before "collapse" occurs, we compute the average location of the brick (its center-of-mass) over the superposed states represented by ( |radium decays> + |radium doesn't decay> ). Now, stevendaryl figures the standard deviation of this average can become large. A. Neumaier says no, it remains very small by macro-world standards.
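For concreteness, here is a toy version of that average; the two positions and the equal weights are made up for illustration:

```python
import math

# Toy model: an equal superposition sends the brick to x = 0 m or x = 3 m,
# so the Born probabilities are 1/2 each for the two macroscopically
# distinct positions.
positions = [0.0, 3.0]
probs = [0.5, 0.5]

mean = sum(p * x for p, x in zip(probs, positions))
var = sum(p * (x - mean) ** 2 for p, x in zip(probs, positions))
sigma = math.sqrt(var)

print(mean, sigma)  # 1.5 1.5 -- the standard deviation is macroscopically large
```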

The essential difference is that A. Neumaier figures the collapse (a.k.a. "collapse", with scare quotes) will happen as soon as the superposed states decohere sufficiently, which will be very quick. Again, to be clear, he doesn't use that language, but - looking at it the Copenhagen way - I think that's a correct statement of his position. Therefore we'll never be averaging over two brick-positions that are macroscopically far apart.

stevendaryl, OTOH, figures the collapse won't happen automatically or spontaneously, just because of decoherence. Some sort of measurement event is required. Therefore we will be averaging over superposed brick positions separated by arbitrarily large macroscopic distances.

In A. Neumaier's approach we get sensible macro-world answers for brick locations and the standard deviation thereof, but not in the other approach.

Both positions are reasonable, on their own terms, and neither violates current experimental data. To end the fruitless dispute we must simply agree to wait for future experiments to decide. Unfortunately I have no idea how to design such experiments.
 
  • #381
secur said:
What he should say is something like "Many other highly respected physicists believe this statement, but I don't. Currently there's no experimental evidence one way or the other; it's just a matter of opinion. Time will tell who's right."

Yes, it's pretty mysterious how interpretation is the main source of dogmatism among physicists, when it is also the topic with the least chance of being subjected to experimental verification. Not even string theorists are this self-assured.
 
  • #382
secur said:
Both positions are reasonable, on their own terms, and neither violates current experimental data. To end the fruitless dispute we must simply agree to wait for future experiments to decide. Unfortunately I have no idea how to design such experiments.

Stevendaryl's position is more modest though, because he simply says something isn't proven - he doesn't exclude the other possibility.
 
  • #383
secur said:
The essential difference is that A. Neumaier figures the collapse (a.k.a. "collapse", with scare quotes) will happen as soon as the superposed states decohere sufficiently, which will be very quick. Again, to be clear, he doesn't use that language, but - looking at it the Copenhagen way - I think that's a correct statement of his position. Therefore we'll never be averaging over two brick-positions that are macroscopically far apart.
As I understand it, A. Neumaier argues about some averaging over all particles of a macroscopic system ... but I don't see how it is relevant. And it does not seem that anybody else gets it.
 
  • #384
secur said:
What he should say is something like "Many other highly respected physicists believe this statement, but I don't. Currently there's no experimental evidence one way or the other; it's just a matter of opinion. Time will tell who's right."
You are of course right, though Gell-Mann is not known for diplomacy or humility! Still, I think one can go farther and still be correct-- one can add "in my opinion, the key lesson that quantum entanglement, a theory that of course will eventually be modified or replaced (not because it got the wrong answer for Bell experiments, which it didn't, but because that's what happens in physics) is trying to help us see, and thereby help motivate whatever will replace it, is not that the parts of the system influence other parts in nonlocal ways, but rather that the behavior of the full system is not well characterized in the first place by the concept of influences between its parts."

I don't know if Gell-Mann would agree to this, but it seems to me that the reason entanglement is not well characterized by influences between parts (or at least, it gets awkward in that area) is because the concept of influences between parts is itself a behavior that appears only due to the breakdown of entanglements. So our mission is not to understand how the parts influence each other when entangled, but rather to understand why we get away with imagining that parts influence each other when entanglement is absent. It's like with decoherence, our goal is not to figure out how coherences support superpositions, but rather how interactions diagonalize the density matrix. Only the Bohmians start with the definite outcomes and try to figure out how ignoring the pilot wave produces the illusion of populating coherences across a density matrix-- the rest of us take those off-diagonal coherences for granted and try to figure out how they went away in a measurement!
 
  • #385
Ken G said:
You are of course right, though Gell-Mann is not known for diplomacy or humility!

For me, that isn't really a problem in itself. I haven't read Gell-Mann's book which he refers to, so I'll have to only consider the posted video and this is surely a limitation. In the video, Gell-Mann tries to characterize the objection that one can choose, say, the polarizer angle as something "to confuse us". But that is the crux of the whole matter! That's not a confusing objection, it's the problem that an interpretation must answer! It's fine if he has a really strong argument against some position, but first he must acknowledge the position. In fact, he says right away that the explanation is that the choices of the different angles belong to different histories - he jumps straight to his own interpretation. I think what we're doing here is a little different, because to argue that there's no "influence between parts" doesn't require any marriage with a specific interpretation but can be held on a more general basis instead, using only a few very reasonable assumptions.
 
  • #386
secur said:
Both positions are reasonable, on their own terms, and neither violates current experimental data. To end the fruitless dispute we must simply agree to wait for future experiments to decide. Unfortunately I have no idea how to design such experiments.
What I'm saying is that we already can do experiments to see what is happening here, but the experiments are on the scientists! All we have to do is watch how the scientist is using information, and we can observe exactly the place where collapse occurs-- it occurs when the elements in the density matrix (which is quickly diagonalized by decoherence) are correlated against other information, such as experimental outcomes. The correlations cull the data into subsets, which are regarded as independent and show small standard deviations within those subsets, but of course it was our choice to look only at the subsets in the first place. We can see exactly when the choice to do that culling occurred: it occurred when we culled the data into bins we call "what happened this time or that time." Quantum theory was never built to be culled that way, and that's why it looks like collapse.
 
  • #387
ddd123 said:
In fact, he says right away that the explanation is that the choices of the different angles belong to different histories - he jumps straight to his own interpretation. I think what we're doing here is a little different, because to argue that there's no "influence between parts" doesn't require any marriage with a specific interpretation but can be held on a more general basis instead, using only a few very reasonable assumptions.
I agree, I think we can do better than just buy off on one interpretation and discard the rest. We can watch the process of ourselves doing experiments and correlating data, and see what parts of that quantum mechanics is designed to treat, and what parts are coming from us in a more "manual" kind of way. It is the parts we take for granted that create our confusion, so different interpretations get confused at different places because they take different things for granted. Just as you noticed where Gell-Mann plugged in an interpretation, we need to see all of our modes of experimentation and analysis as examples of interpretation choices.
 
  • #388
Ken G said:
The correlations cull the data into subsets, which are regarded as independent and show small standard deviations within those subsets, but of course it was our choice to look only at the subsets in the first place. We can see exactly when the choice to do that culling occurred, it occurred when we culled the data into bins we call "what happened this time or that time." Quantum theory was never built to be culled that way, that's why it looks like collapse.

What do you think about this other paper that's been posted earlier: https://arxiv.org/abs/1412.6987

Specifically, the argument in section 9.1: do you think it is more or less what you're saying now? Just curious, because I was unable to form an opinion on this paper.
 
  • #389
I do see some parallels. I believe the author is making the case that Kolmogorov's approach to probability was just one type of analysis, like choosing Euclid's approach for processing geometric information. But neither can be said to be an "absolute" structure that is axiomatic to reality; instead we use them when they work and discard them when they don't. We can also gain an understanding of the requirements needed for them to be useful, but other types of probability analyses may be needed to account for things like irrationality in players of a game-- you could have weird correlations that show up that would not appear in a formal analysis in which all players were rational. The parallel I see is that he seems to be saying that there are no "absolute probabilities"; rather, probabilities are what you make of them based on your assumptions and constraints, and more bizarre probability structures may work better in some contexts, though you cannot always tell which in advance without very carefully tracking what assumptions are valid. That seems to jibe with the perspective of Scott Aaronson that bhobba often cites-- that you can understand quantum theory by using a probability structure that allows probabilities to be negative at various places in the calculation, but which never end up negative when you combine them into a final result. That would be anti-axiomatic for a Kolmogorov probability structure much like a triangle with three right angles would be anti-axiomatic for Euclid.
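A minimal numeric illustration of that idea, in the spirit of the Wigner-function picture (the table entries are made up, not taken from Aaronson or the paper): one intermediate quasi-probability is negative, yet every observable marginal is a perfectly ordinary probability distribution.

```python
# A 2x2 quasi-probability table with one negative entry (values hypothetical).
quasi = [[0.6, -0.1],
         [0.1,  0.4]]

row_marginals = [sum(row) for row in quasi]                             # ~ [0.5, 0.5]
col_marginals = [sum(quasi[i][j] for i in range(2)) for j in range(2)]  # ~ [0.7, 0.3]

assert any(q < 0 for row in quasi for q in row)          # negative "in the middle"
assert all(0 <= m <= 1 for m in row_marginals + col_marginals)  # valid marginals
assert abs(sum(row_marginals) - 1) < 1e-12               # total probability is 1
print(row_marginals, col_marginals)
```

The negativity is never observable on its own; only the combined (marginal) results are, and those stay non-negative.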

So I agree that many of the paradoxes we get in QT are when we try to plug square pegs into round holes, as with a particular version of probability theory, but I suspect the problem traces more specifically to us taking for granted certain steps in the data analysis, steps that we did not think needed to be included in the formalism because they were just so obviously the way we think about things. Perhaps these steps in the scientific method are as obvious to us as Euclidean geometry, so shouldn't need to be included in the axioms of the formal system-- making the formal system incomplete and vulnerable to paradoxes like collapses and nonlocal influences.
 
  • #390
secur said:
The essential difference is that A. Neumaier figures the collapse (a.k.a. "collapse", with scare quotes) will happen as soon as the superposed states decohere sufficiently, which will be very quick. Again, to be clear, he doesn't use that language, but - looking at it the Copenhagen way - I think that's a correct statement of his position. Therefore we'll never be averaging over two brick-positions that are macroscopically far apart.

stevendaryl, OTOH, figures the collapse won't happen automatically or spontaneously, just because of decoherence. Some sort of measurement event is required. Therefore we will be averaging over superposed brick positions separated by arbitrarily large macroscopic distances.

I'm not actually making any wild claim about what happens. I believe that whenever you look at a brick, it will be in a more or less definite location (up to within some tiny standard deviation). So I believe the same thing as A. Neumaier about what actually happens. The disagreement is over whether what actually happens is (easily) explained by quantum mechanics without collapse.

To me, there are several alternative explanations for why a brick is in a sort-of definite position at all time, and they are all sort-of plausible to me:
  1. The Bohmian explanation: all particles have definite positions at all times, and so of course a brick does, as well.
  2. The Many-Worlds explanation: the brick doesn't actually have a definite position, but within a single "branch" of the universal wave function, it does have a definite position.
  3. The collapse explanation: as soon as you measure a brick's location (or look at it), the brick's wave function collapses into a state of definite position.
A. Neumaier seems to be denying all three possible explanations, and claiming that ordinary quantum mechanics, without collapse, predicts that the brick is in a (more or less) definite location at all times. That to me is completely implausible, and in my opinion, probably provably wrong. (Not provable by me, but maybe by someone smarter than me.)
 
