A Assumptions of the Bell theorem

  • #351
No, I'm saying that you can't solve the "measurement problem" by considering only closed quantum systems. This is the one thing where Bohr was right: measuring a quantum system means using a macroscopic apparatus to gain information about it, and a measurement implies an irreversible process that lets me read off a pointer on that instrument. This you cannot describe by a closed system (neither in classical mechanics nor field theory, nor in quantum (field) theory).

In classical as well as quantum physics you derive the behavior of macroscopic systems by a plethora of methods, leading to an effective description that exhibits irreversibility, dissipation and, particularly in the quantum case, decoherence.

What is empty philosophy is to claim you can describe macroscopic systems in all microscopic detail as closed quantum systems. It's also empty philosophy to claim that there's a measurement problem only because of this impossibility. If you wish, it's the same empty philosophy as to claim that there is a problem because we are able to describe nature with mathematical models at all. It's just an observed fact that we can, to an amazing extent, as it is an observed fact that for the last 400+x years we have been able to invent better and better instruments to measure, and thus quantify with higher and higher accuracy, all kinds of phenomena. This is enabled by these experimental and engineering achievements and their interplay with theory, which lets us think and talk about phenomena that exceed our direct abilities by several orders of magnitude in scale (from microscopic subatomic/subnuclear dimensions below 1 fm up to very large astronomical, if not cosmological, scales). It's empty philosophy (though pretty entertaining sometimes) to ask why this quantitative description of our universe is possible at all.
 
  • #352
Kolmo said:
As far as I can see from studying things like the Allahverdyan paper and other more detailed treatments, the only real issues such as the absence of conditionalization are covered by more detailed models.
I don’t think so. You have deterministic evolution of microsystems, but when you include macroscopic systems (measurement devices), it becomes nondeterministic? How is that possible if the macro system is governed by the same laws as the micro system?
 
  • #353
vanhees71 said:
No, I'm saying that you can't solve the "measurement problem" by considering only closed quantum systems.
Does that mean that you can’t have measurements in a closed system?
 
  • #354
stevendaryl said:
I don’t think so. You have deterministic evolution of microsystems, but when you include macroscopic systems (measurement devices), it becomes nondeterministic? How is that possible if the macro system is governed by the same laws as the micro system?
I don't really understand this comment. This is how I see it.

Well say we treat the relevant degrees of freedom of the device as ##D_i## following your notation. They form some commutative algebra ##\mathcal{M}##. The quantum system then has some non-commutative algebra ##\mathcal{A}## so that the total system has the algebra ##\mathcal{M}\lor\mathcal{A}## or if you are in non-relativistic QM ##\mathcal{M}\otimes\mathcal{A}##.

Then if we have some state ##\omega## on ##\mathcal{M}\lor\mathcal{A}## that is essentially a product state and some time evolution ##\alpha_{t}## that couples them we end up with a mixed state after measurement of the form:
##\omega = \sum_{i} p_{i}\otimes\rho_{i}##
with ##p_{i}## a state on the device's algebra and ##\rho_{i}## a state of the quantum system, potentially even a mixed one. Since ##p_{i}## is just a state on a commutative operator algebra then (by Gelfand's representation theorem if one wants to fully prove it) it's just some classical probability distribution and thus just ignorance of the current device reading. Upon seeing the device reading you condition on the outcome ##i## via Bayes's rule which has an effect on the total state such that the state for the quantum system is now ##\rho_i##.

The only issue remaining would be whether we can in fact treat macroscopic degrees of freedom as commutative in this sense. To me this was answered in sufficient detail, in the affirmative, in older works by Ludwig, Gottfried's textbook, and the long papers of Loinger et al from the 60s about ergodicity in quantum theory. There were other works by Peres, Omnes, Lockhart and others in the 80s that come to the same conclusion via properties of the observable algebra in QFT, decoherence studies, or properties of coarse-grained observables. Even more recently you have the enormous synthesizing and summarizing papers of Allahverdyan et al.
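As a toy numerical illustration of this scheme (my own made-up amplitudes and coupling, a qubit plus a two-state pointer, rather than the full algebraic treatment in the cited papers):

```python
import numpy as np

# Toy version of the scheme above: a qubit "system" coupled to a two-state
# "pointer" by a CNOT-style premeasurement interaction. Illustrative only;
# amplitudes and coupling are invented for the example.

alpha, beta = np.sqrt(0.3), np.sqrt(0.7)
system = np.array([alpha, beta])           # a|0> + b|1>
pointer = np.array([1.0, 0.0])             # pointer "ready" state |0>
psi = np.kron(system, pointer)             # product state on H_S (x) H_D

# Coupling unitary: flip the pointer iff the system is in |1>
U = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
psi_out = U @ psi
rho = np.outer(psi_out, psi_out)

# Reduced pointer state: diagonal, i.e. a classical distribution p_i
rho_D = np.trace(rho.reshape(2, 2, 2, 2), axis1=0, axis2=2)
print(np.diag(rho_D))                      # pointer statistics: [0.3, 0.7]

# Conditioning on a pointer reading (Bayes updating) picks out rho_i
def condition_on(psi_total, reading):
    amps = psi_total.reshape(2, 2)[:, reading]   # system amplitudes given reading
    return amps / np.linalg.norm(amps)

print(condition_on(psi_out, 0))            # system state given reading 0: |0>
```

The reduced pointer state comes out exactly diagonal here, which is the toy analogue of the claim that the device's algebra is commutative and its state is just a classical probability distribution.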
 
  • #355
Kolmo said:
Well say we treat the relevant degrees of freedom of the device as ##D_i## following your notation. They form some commutative algebra ##\mathcal{M}##. The quantum system then has some non-commutative algebra ##\mathcal{A}## so that the total system has the algebra ##\mathcal{M}\lor\mathcal{A}## or if you are in non-relativistic QM ##\mathcal{M}\otimes\mathcal{A}##.

Then if we have some state ##\omega## on ##\mathcal{M}\lor\mathcal{A}## that is essentially a product state and some time evolution ##\alpha_{t}## that couples them we end up with a mixed state after measurement of the form:
But why should a product state evolve into a mixed state?
 
  • #356
stevendaryl said:
But why should a product state evolve into a mixed state?
I don't get this. It just does, given the evolution (super)operator; I'm not sure what "why" I could provide.
 
  • #357
stevendaryl said:
But why should a product state evolve into a mixed state?

Decoherence does not describe the evolution of a pure state into a mixed state. It becomes a mixed state when we trace over the environmental degrees of freedom. That's not something that happens; it's a mathematical choice that the analyst makes. To me, something can't go from (1) a superposition of a state in which A is true and a state in which B is true to (2) "either A or B is true, we just don't know which until we gather more information", just because we have performed a mathematical operation on a state.
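This point is easy to verify numerically; the following sketch (a Bell state, chosen just for illustration) shows the global state staying pure while the analyst's partial trace yields a mixed reduced state:

```python
import numpy as np

# The global pure state stays pure; only the reduced state produced by the
# analyst's partial trace is mixed. Nothing in the dynamics itself turned
# pure into mixed.

bell = np.array([1, 0, 0, 1.0]) / np.sqrt(2)   # entangled pure state
rho = np.outer(bell, bell)
print(np.trace(rho @ rho))                     # global purity: 1.0

rho_A = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)  # trace out B
print(np.trace(rho_A @ rho_A))                 # reduced purity: 0.5 (mixed)
```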
 
  • #358
stevendaryl said:
Decoherence does not describe the evolution of a pure state into a mixed state
I didn't discuss decoherence, except tangentially at the end. The process described above doesn't involve decoherence.
 
  • #359
It’s my opinion that many people believe just contradictory things about quantum mechanics. When it comes to something simple like an electron, there is a clear difference between being in a superposition of states A and B and being in a mixed state with a probability of being in state A and a probability of being in state B. But when it comes to macroscopic objects, people ignore the distinction.
 
  • #360
Kolmo said:
I don't get this. It just does given the evolution (super)operator

I don’t think it does.
 
  • #361
stevendaryl said:
I don’t think it does.
Okay, I don't get why it wouldn't, since it would be a generic effect of coupling between an Abelian and a non-Abelian algebra. I don't see what prevents the state from evolving into non-product states.

stevendaryl said:
It’s my opinion that many people believe just contradictory things about quantum mechanics. When it comes to something simple like an electron, there is a clear difference between being in a superposition of states A and B and being in a mixed state with a probability of being in state A and a probability of being in state B. But when it comes to macroscopic objects, people ignore the distinction.
In the model I had above the device has an Abelian observable algebra so it doesn't display superposition, the pointer states live in different superselection sectors.

It's like how the state ##\frac{1}{\sqrt{2}}\left(\ket{a} + \ket{b}\right)## when ##\ket{a}, \ket{b}## have different electric charges is not a superposition, since charge is superselected and has an Abelian algebra.
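A small numerical illustration of this point (toy one-dimensional charge sectors, my own choice): if every admissible observable commutes with the superselected charge, then no measurement statistics distinguish the "superposition" across sectors from the corresponding mixture.

```python
import numpy as np

# If the admissible observables all commute with a superselected two-valued
# "charge" (so they are block-diagonal; here, diagonal, since the toy
# sectors are one-dimensional), then (|a> + |b>)/sqrt(2) across sectors is
# statistically indistinguishable from the 50/50 mixture.

rng = np.random.default_rng(1)
a = np.array([1.0, 0.0])                     # state in charge sector 0
b = np.array([0.0, 1.0])                     # state in charge sector 1

sup = (a + b) / np.sqrt(2)
rho_sup = np.outer(sup, sup)                 # "superposition" across sectors
rho_mix = 0.5 * np.outer(a, a) + 0.5 * np.outer(b, b)

for _ in range(100):
    O = np.diag(rng.normal(size=2))          # generic charge-commuting observable
    assert np.isclose(np.trace(rho_sup @ O), np.trace(rho_mix @ O))
print("no charge-commuting observable distinguishes the two states")
```

The two density matrices differ as matrices (the off-diagonal coherence is there), but within the restricted algebra that difference carries no operational meaning.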
 
  • #362
Kolmo said:
I didn't discuss decoherence, except tangentially at the end. The process described above doesn't involve decoherence.

What process?

Do you agree that in the evolution of a single particle, with no interactions, a pure state never evolves into a mixed state?

What about the evolution of two particles that only interact with each other?

What about the interaction of 10 particles? Or 1000?

If the system is simple enough to actually analyze, pure states always evolve into pure states. So what reason is there for pure states to evolve into mixed states, once they get to be too complex?
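One can check this directly for as many particles as fit in a simulation. The sketch below uses a random Hermitian Hamiltonian as a hypothetical stand-in for an interacting 6-qubit system:

```python
import numpy as np

# Unitary (closed-system) evolution never degrades purity, regardless of
# how many particles interact. The random Hamiltonian is just a stand-in
# for any interacting system small enough to simulate.

rng = np.random.default_rng(0)
n = 6                                        # 6 qubits -> 64-dim Hilbert space
d = 2 ** n
H = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
H = (H + H.conj().T) / 2                     # Hermitian Hamiltonian

w, V = np.linalg.eigh(H)                     # U = exp(-i H t)
U = V @ np.diag(np.exp(-1j * w * 1.7)) @ V.conj().T

psi = np.zeros(d, dtype=complex)
psi[0] = 1.0                                 # pure initial state
rho_t = np.outer(U @ psi, (U @ psi).conj())
print(np.trace(rho_t @ rho_t).real)          # purity stays 1.0
```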
 
  • #363
Kolmo said:
It's like how the state ##\frac{1}{\sqrt{2}}\left(\ket{a} + \ket{b}\right)## when ##\ket{a}, \ket{b}## have different electric charges is not a superposition, since charge is superselected and has an Abelian algebra.

No, it’s not like that. There is no superselection rule having to do with complexity, such that states involving 1000 or fewer particles can be in superpositions but states with many more particles can’t be.
 
  • #364
stevendaryl said:
No, it’s not like that. There is no superselection rule having to do with complexity, such that states involving 1000 or fewer particles can be in superpositions but states with many more particles can’t be.
It's not directly to do with complexity; it depends on the analysis and the observable as to what the effect is. It might be due to certain ergodic conditions being obeyed, to being driven into the double centralizer of the algebra by infrared effects as in QED, to conditions of locality removing non-commuting operators as in Allahverdyan or Gottfried's old analysis, and so on. In the literature, quite a few papers work out that collective coarse-grained coordinates are superselected.
 
  • #365
Kolmo said:
It's not directly to do with complexity; it depends on the analysis and the observable as to what the effect is.

I don’t think that really makes any sense. There might be reasons that for certain systems, we just can’t get a pure state description, and so we are forced to use mixed states.

But there is a sleight of hand going on here. It’s true that a mixed state can be used to represent “ignorance” where either this is true, or that is true, but we don’t know which. But if the mixed state arises from ordinary quantum dynamics, then we know that that interpretation is WRONG.
 
  • #366
Demystifier said:
The point is, if you don't take into account the effect of measurement (intermediate collapse), then the correlator you compute does not correspond to the measured correlation. See e.g. https://arxiv.org/abs/1610.03161
Oh, now I see your point. If ##A(t_i)## and ##A(t_j)## don't commute, then ##\left<A(t_j)A(t_i)\right>## is not the correct expression for the correlator. I agree with that. In principle, one could construct an example as in the Bell-CHSH setting, where the set ##\{A_1, A_2, A_3, A_4\}## (where ##A_i = A(t_i)##) could be decomposed into two sets ##\{A_1,A_3\}## and ##\{A_2,A_4\}## of non-commuting operators that commute with each other. In such a situation the correlators would be the measured ones and my previous CHSH argument would go through, which is already enough to demonstrate the inviability of Kolmogorov probability. However, this is of course a very special situation and in general, one can't expect such a decomposition into commuting sets. Let's discuss the general case:

In the general case, one needs the joint probability distribution ##P_{ij}(u,v)## on ##\mathrm{spec}(A_j)\times\mathrm{spec}(A_i)## in order to compute the correlator ##C_{ij} = \sum_{u,v\in\mathrm{spec}(A)} u v P_{ij}(u,v)##. This probability distribution is given by ##P_{ij}(u,v) = \left<\pi_j(u)\pi_i(v)\Psi,\pi_j(u)\pi_i(v)\Psi\right>##, where the ##\pi_i(u)## are the corresponding projectors, but it is only a legitimate probability distribution that obeys Kolmogorov's axioms if the histories ##\pi_j(u)\odot \pi_i(v)## are consistent, e.g. if some decoherence has happened. In that case, we have ##C_{ij} = \sum_{u,v\in\mathrm{spec}(A)} u v\left<\pi_j(u)\pi_i(v)\Psi,\pi_j(u)\pi_i(v)\Psi\right>##, but since different sets of consistent histories are still incompatible in general, these correlators will still violate the CHSH inequality in general.

If ##A_i## and ##A_j## commute, this correlator reduces to the usual expression:
$$C_{ij} = \sum_{u,v\in\mathrm{spec}(A)} u v\left<\pi_j(u)\pi_i(v)\Psi,\pi_j(u)\pi_i(v)\Psi\right>$$
$$= \sum_{u,v\in\mathrm{spec}(A)} u v\left<\Psi,\pi_i(v)\pi_j(u)\pi_j(u)\pi_i(v)\Psi\right> = \sum_{u,v\in\mathrm{spec}(A)} u v\left<\Psi,\pi_j(u)^2\pi_i(v)^2\Psi\right>$$
$$= \sum_{u,v\in\mathrm{spec}(A)} u v\left<\Psi,\pi_j(u)\pi_i(v)\Psi\right>= \left<\Psi,\left(\sum_u u\pi_j(u)\right)\left(\sum_v v\pi_i(v)\right)\Psi\right>$$
$$= \left<\Psi,A_jA_i\Psi\right> = \left<A_j A_i\right>$$
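The reduction above is easy to check numerically. Below is a small sketch (my own toy observables and state, two qubits, numpy) verifying both that the projector formula reproduces ##\left<A_j A_i\right>## in the commuting case, and that the naive expression isn't even real when the operators fail to commute:

```python
import numpy as np

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def spectral(A):
    """(eigenvalue, spectral projector) pairs of a Hermitian matrix."""
    w, V = np.linalg.eigh(A)
    pairs = []
    for v in np.unique(np.round(w, 10)):
        P = sum(np.outer(V[:, k], V[:, k].conj())
                for k in range(len(w)) if abs(w[k] - v) < 1e-9)
        pairs.append((float(v), P))
    return pairs

def proj_corr(Ai, Aj, psi):
    """C = sum_{u,v} u v <pi_j(u) pi_i(v) Psi, pi_j(u) pi_i(v) Psi> (real)."""
    return sum(u * v * np.vdot(Pj @ Pi @ psi, Pj @ Pi @ psi).real
               for v, Pi in spectral(Ai) for u, Pj in spectral(Aj))

# Commuting case: observables on different qubits of a Bell state;
# the projector formula agrees with <Psi, A_j A_i Psi>.
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
Ai, Aj = np.kron(sz, I2), np.kron(I2, sz)
print(proj_corr(Ai, Aj, bell), np.vdot(bell, Aj @ Ai @ bell).real)  # 1.0 1.0

# Non-commuting case: the naive <A_j A_i> is not even real.
theta = np.pi / 8
psi = np.array([np.cos(theta), 1j * np.sin(theta)])
print(proj_corr(sz, sx, psi), np.vdot(psi, sx @ sz @ psi))  # 0.0 vs -0.707j
```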
 
  • #367
stevendaryl said:
I don’t think that really makes any sense. There might be reasons that for certain systems, we just can’t get a pure state description, and so we are forced to use mixed states

But there is a sleight of hand going on here. It’s true that a mixed state can be used to represent “ignorance” where either this is true, or that is true, but we don’t know which. But if the mixed state arises from ordinary quantum dynamics, then we know that that interpretation is WRONG
First of all, technically the states are non-factor states, not mixed states. "Mixed" strictly speaking refers to a non-pure state within a sector, not a mixture of states across superselection sectors.

Secondly, I don't see how it is wrong. When you work out the details, the macroscopic collective coarse-grained coordinates are superselected, thus states like:
##\frac{1}{\sqrt{2}}\left(\ket{a} + \ket{b}\right)##
where ##\ket{a}## and ##\ket{b}## have different values for the collective coordinates are actually mixed states because these observables are superselected. Same with a sum of kets of different electric charge. Whether a sum of kets is a superposition or a mixture depends on the features of the algebra.

No "sleight of hand", it's just how states relate to their operator algebra.

stevendaryl said:
But if the mixed state arises from ordinary quantum dynamics, then we know that that interpretation is WRONG
In QFT states are always mixed due to the Type III nature of the observable algebra. I don't really see where this "the state must be pure" thing is coming from. It's not a necessary feature of quantum theory. All one needs for quantum theory is that the state is some element of the dual of the operator algebra and that the dynamics are some completely positive trace preserving map.
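To illustrate that last remark (a minimal sketch with a made-up dephasing strength, nothing specific to Type III algebras): a completely positive trace-preserving map in Kraus form takes a pure state to a mixed one, with no appeal to purity-preserving unitarity.

```python
import numpy as np

# A CPTP map readily sends pure states to mixed ones. Example: a dephasing
# channel with Kraus operators K0, K1 and an invented dephasing strength p.

p = 0.5                                     # hypothetical dephasing strength
K0 = np.sqrt(1 - p) * np.eye(2)
K1 = np.sqrt(p) * np.diag([1.0, -1.0])
assert np.allclose(K0.T @ K0 + K1.T @ K1, np.eye(2))   # trace preservation

def channel(rho):
    return K0 @ rho @ K0.T + K1 @ rho @ K1.T

plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho_in = np.outer(plus, plus)               # pure state |+><+|
rho_out = channel(rho_in)

print(np.trace(rho_in @ rho_in))            # 1.0: pure in
print(np.trace(rho_out @ rho_out))          # 0.5: mixed out
```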
 
  • #368
Kolmo said:
In QFT states are always mixed due to the Type III nature of the observable algebra. I don't really see where this "the state must be pure" thing is coming from.
It's probably just a clash of terminology. You are talking about algebraic states that are mixed if they can be written as a convex combination and pure otherwise. Stevendaryl just means vector states when he talks about pure states. Of course every algebraic state (pure or mixed) can be written as a vector state due to GNS, but it is still mixed if viewed as an algebraic state.
 
  • #369
Nullstein said:
It's probably just a clash of terminology. You are talking about algebraic states that are mixed if they can be written as a convex combination and pure otherwise. Stevendaryl just means vector states when he talks about pure states. Of course every algebraic state (pure or mixed) can be written as a vector state due to GNS, but it is still mixed if viewed as an algebraic state.
Yeah, my point would be then that vector states which are sums of different values of coarse-grained macroscopic collective coordinates are actually mixtures of each individual term as algebraic states. To me this solves the only real issue with measurement processes: how one can condition on their results, i.e. it's a consistency check of the formalism.

I'm not saying the outcome of measurement can be determined in advance or anything.
 
  • #370
vanhees71 said:
What is empty philosophy is to claim you can describe macroscopic systems in all microscopic detail as closed quantum systems.
The question is, is this a matter of principle or of practicality? Before QM, most people thought it was a matter of practicality. This view isn't tenable in QM anymore (unless one uses certain interpretations which equip QM with a non-probabilistic ontology).

Your posts on this always leave me with the impression that you consider this to be unremarkable. Sure, one can be happy with an instrumentalist point of view and say that the classical physicists were misguided with their hope of finding out how Nature works instead of having only modeling tools. But that one has to take such an instrumental point of view if one doesn't consider the strangeness of dBB, MWI, etc. to be viable seems really remarkable to me.
 
  • #371
Kolmo said:
First of all technically the states are non-factor states not mixed states. "Mixed" strictly speaking refers to a non-pure state within a sector, not a mixture of states across a superselection sectors.

Secondly I don't see how it is wrong. When you do the details the macroscopic collective coarse grained coordinates are superselected, thus states like:
##\frac{1}{\sqrt{2}}\left(\ket{a} + \ket{b}\right)##
Why do you say they are superselected? That doesn't seem at all right to me.
 
  • #372
Kolmo said:
No "sleight of hand", it's just how states relate to their operator algebra.

I very much consider it sleight of hand.

There are two moves involved in the conclusion that the measurement problem is solved:
  1. Showing how mixed states come about.
  2. The interpretation of mixed states as "Either this is true, or that is true, but we don't know which"

You can't use "operator algebra" to derive the second.
 
  • #373
stevendaryl said:
Why do you say they are superselected? That doesn't seem at all right to me.
Because there are various papers and some books working this out in detail. There are a few different avenues to discussing it. Despite it being a usual talking point online, decoherence is often a subdominant effect in classicality (see the Allahverdyan et al paper mentioned above for quantitative estimates on this).

stevendaryl said:
You can't use "operator algebra" to derive the second.
I already covered this. The operator algebra approach generates non-factor states. Mixed states are not ignorance readable due to the decomposition ambiguity and several other reasons. Non-factor states however can be read as ignorance since they have a unique decomposition and are formally equivalent (via Gelfand's theorem) to probability distributions over the superselected quantities.
 
  • #374
Also, if you're going to use some feature of QFT to solve the measurement problem, does that amount to saying that there is no solution as long as one sticks to nonrelativistic quantum mechanics?
 
  • #375
stevendaryl said:
Also, if you're going to use some feature of QFT to solve the measurement problem, does that amount to saying that there is no solution as long as one sticks to nonrelativistic quantum mechanics?
Operator algebras occur in NRQM as well, and the relevant features hold there too. The argument is more involved in the NRQM case: the characterisation of superselected quantities is cleaner in QFT, where it can be tied to fundamental infrared effects rather than to long arguments about the non-physicality of operators not commuting with the macroscopic collective coordinates.
 
  • #376
Kolmo said:
Because there are various papers and some books working this out in detail. There are a few different avenues to discussing it. Despite it being a usual talking point online, decoherence is often a subdominant effect in classicality (see the Allahverdyan et al paper mentioned above for quantitative estimates on this)

Well, what I have heard along those lines (and decoherence fits in with this) is that anything like "superselection" is a "for all practical purposes" effect. There is no real superselection; it's just that the interference between terms is greatly suppressed (going to zero in the macroscopic limit). If you are talking about an effect in which there is real superselection going on, along the lines of suppressing states with different total charges, well, that's interesting. My feeling is that it can't possibly be true.

Kolmo said:
I already covered this. The operator algebra approach generates non-factor states. Mixed states are not ignorance readable due to the decomposition ambiguity and several other reasons. Non-factor states however can be read as ignorance since they have a unique decomposition and are formally equivalent (via Gelfand's theorem) to probability distributions over the superselected quantities.

Being "formally equivalent" doesn't cut it. That's the sleight of hand that I'm talking about.
 
  • #377
Kolmo said:
Operator algebras occur in NRQM as well, and the relevant features hold there too. The argument is more involved in the NRQM case: the characterisation of superselected quantities is cleaner in QFT, where it can be tied to fundamental infrared effects rather than to long arguments about the non-physicality of operators not commuting with the macroscopic collective coordinates.

Okay, then leave QFT out of it. I don't believe what you're claiming about NRQM. Are you saying that NRQM forbids the existence of superpositions of states that are macroscopically distinguishable?
 
  • #378
stevendaryl said:
Being "formally equivalent" doesn't cut it. That's the sleight of hand that I'm talking about.
Before I continue, could you explain this? If the state decomposes as a classical probability distribution over the values of the superselected quantities, how is it a "sleight of hand" to read it as a classical probability distribution? I can't make any sense of this.
Why isn't its being a classical probability distribution over the macroscopic quantities enough?
 
  • #379
stevendaryl said:
Okay, then leave QFT out of it. I don't believe what you're claiming about NRQM. Are you saying that NRQM forbids the existence of superpositions of states that are macroscopically distinguishable?
Certainly not forbids them. You have SQUIDs and similar systems.
 
  • #380
Kolmo said:
Certainly not forbids them. You have SQUIDs and similar systems.

I don't know if that counts. But anyway, let's take a typical experimental result: You pass an electron through a Stern-Gerlach device, and the electron either goes left, and makes a spot on the left side of a photographic plate, or goes right, and makes a spot on the right side. Are you saying that there can't be a superposition of those two possibilities? There is a rigorous superselection rule preventing it?

I certainly believe that you can't in practice observe interference effects between the two possibilities.
 
  • #381
You'll have to answer my question in #378 first, as otherwise I wouldn't be sure where this is headed.
 
  • #382
Kolmo said:
Before I continue, could you explain this? If the state decomposes as a classical probability distribution over the values of the superselected quantities, how is it a "sleight of hand" to read it as a classical probability distribution?

Well, let me try to flesh out the issue here.

Let's just look at a particular experiment. You have an electron which you somehow put into a superposition of being at point ##A##, where Alice awaits with her particle detector, and point ##B##, thousands of miles away, where Bob has his particle detector. Before the electron is detected, would you say that it is either at Bob's detector or at Alice's detector, and we just don't know which? I would say no, according to orthodox quantum theory: the belief that a particle has a definite location at all times, but that we just don't know what that location is, is a hidden-variable theory. It's one that's endorsed by the Bohm interpretation, but at the cost of FTL interactions. So if you reject such a hidden-variable theory, then the answer is no: the electron does not have a definite position until it is detected.

Now, let the electron interact with the detectors. Presumably, we have some superselection rule that forbids the world from entering a superposition in which Alice detects the electron and one in which Bob detects the electron. That means that the universe makes a nondeterministic choice, to either go with "The electron is detected by Alice" or "The electron is detected by Bob". (In a Many-Worlds type interpretation, you get both possibilities).

Presumably, early on, it's still a superposition, with both possibilities present, and later, there is only one possibility. So is there a moment where one possibility becomes actual?
 
  • #383
stevendaryl said:
Presumably, early on, it's still a superposition, with both possibilities present, and later, there is only one possibility. So is there a moment where one possibility becomes actual?
If I understand your terminology, once they make contact with a superselected quantity, i.e. once each term of the initially superposed state leads to different values of some superselected quantity.
 
  • #384
Kolmo said:
If I understand your terminology, once they make contact with a superselected quantity, i.e. once each term of the initially superposed state leads to different values of some superselected quantity.

Okay, so this seems like a more sophisticated way to say "measurement collapses the wave function". There were two problems with the old-fashioned statement. First, it seemed nonlocal. Second, it seemed that it relied on a fuzzy notion of when a "measurement" has been done.

So you're saying that these superselection rules (which I'm definitely not convinced about, but for the sake of argument) address the second one. The measurement is done when the system "makes contact with a superselected quantity". I don't have any idea what that means, but let it go...

The nonlocality is another issue. If something happening at Alice's detector (the electron making contact with a superselected quantity) makes it impossible for Bob to detect an electron, when it was possible the moment before, that seems to be a nonlocal effect.
 
  • #385
I looked up environmentally induced superselection, and definitely, in the past when people talked about superselection, they meant the rapid decay of the off-diagonal elements of the density matrix due to decoherence. That isn't truly superselection; it's a "for all practical purposes" superselection.

The specific paper by Allahverdyan that you mention that talks about another type of superselection doesn't seem to be available for free download. Or is it?
 
  • #386
stevendaryl said:
I looked up environmentally induced superselection, and definitely, in the past when people talked about superselection, they meant the rapid decay of the off-diagonal elements of the density matrix due to decoherence. That isn't truly superselection; it's a "for all practical purposes" superselection.

The specific paper by Allahverdyan that you mention that talks about another type of superselection doesn't seem to be available for free download. Or is it?

There is a paper that discusses the application of superselection rules to resolve the measurement problem:
http://jamesowenweatherall.com/SCPPRG/EarmanJohn2008Man_SuperselectionforPhilosophers.pdf
(Chapter 11)
 
  • #387
vanhees71 said:
Quantum jumps and/or collapse are just FAPP descriptions for pretty fast transition processes due to the interaction of the investigated system with the environment/measurement device, leading to decoherence and irreversibly defined measurement results.
This is interpretation dependent; in some interpretations collapse is not just a FAPP description.
 
  • #388
vanhees71 said:
No, I'm saying that you can't solve the "measurement problem" by considering only closed quantum systems. This is the one thing where Bohr was right: measuring a quantum system means using a macroscopic apparatus to gain information about it, and a measurement implies an irreversible process that lets me read off a pointer on that instrument. This you cannot describe by a closed system (neither in classical mechanics nor field theory, nor in quantum (field) theory).

In classical as well as quantum physics you derive the behavior of macroscopic systems by a plethora of methods, leading to an effective description that exhibits irreversibility, dissipation and, particularly in the quantum case, decoherence.

What is empty philosophy is to claim you can describe macroscopic systems in all microscopic detail as closed quantum systems. It's also empty philosophy to claim that there's a measurement problem only because of this impossibility.
For me, it's impossible to discuss quantum foundations with you because you use double standards. You use one set of reasoning standards in the argument above, but a totally different set of reasoning standards when you claim that there is no collapse of the wave function. With the standards of reasoning as above, one could just as well claim that collapse happens in open systems, that we cannot understand it in detail because the measuring apparatus has too many degrees of freedom, and that there is a plethora of methods leading to an effective description leading to collapse. But when it comes to collapse, you just shift to another, more fundamental way of thinking which easily dismisses the argument for collapse above. And yet, when one wants to talk about the measurement problem with you, you again retreat to the effective, non-fundamental mode of thinking.

You are like a lawyer who argues that those under age 18 are not guilty of their actions because their actions are determined by the fundamental laws of physics, while those who are older than 18 are guilty because guilt is an emergent phenomenon and we cannot understand all the details of the physical determination of their actions.
 
  • #389
stevendaryl said:
The nonlocality is another issue. If something happening at Alice's detector (the electron making contact with a superselected quantity) makes it impossible for Bob to detect an electron, when it was possible the moment before, that seems to be a nonlocal effect
This is a paper that goes through a model where infrared effects of light drive quantities into the centre of the centralizer of the algebra, making the state over them simply ignorance:
https://arxiv.org/abs/2101.01044
See Section 6. It's probably the most explicit model that's still reasonably short. Allahverdyan's paper is over 100 pages long. "ETH approach" just means using the infrared properties of QED to derive removal of interference.

As for the nonlocality I don't see it. Following the evolution of the state it would predict either Alice's detector clicked or Bob's detector clicked, i.e. the possible histories are click here or click there. Just because the possible events are far apart doesn't to me indicate nonlocality.

stevendaryl said:
I looked up environmentally induced superselection, and definitely in the past when people talked about superselection, they meant the rapid decaying of the off-diagonal elements of the density matrix due to decoherence. That isn't truly superselection. It's an "for all practical purposes" superselection
Decoherence is a subdominant effect in classicality, but even in this case: suppose one wanted to measure the off-diagonal terms, say for a macroscopic body of ##10^{27}## particles whose macroscopic quantities decohere into the internal and external environment, with the latter involving, say, scattered light. What quantity would you measure to detect interference? That is, give me a measurable quantity that does not commute with the macroscopic collective coordinates. Detailed calculations usually show such a quantity cannot be given an operational meaning.
 
  • #390
stevendaryl said:
Does that mean that you can’t have measurements in a closed system?
No, because you have to irreversibly store the measurement result in order to read it off. That argument is already due to Bohr. Unfortunately, it came with all the confusing philosophy characteristic of him and, even worse, of Heisenberg.
 
  • Like
Likes Kolmo
  • #391
stevendaryl said:
I don't know if that counts. But anyway, let's take a typical experimental result: You pass an electron through a Stern-Gerlach device, and the electron either goes left, and makes a spot on the left side of a photographic plate, or goes right, and makes a spot on the right side. Are you saying that there can't be a superposition of those two possibilities? There is a rigorous superselection rule preventing it?

I certainly believe that you can't in practice observe interference effects between the two possibilities.
Of course there can be superpositions. E.g., if you measure the spin component in another direction. What the SGE realizes is (almost perfect) entanglement between spin and momentum (or position) in the sense you described. This is well understood with (unitary!) time evolution for the electron in the Stern-Gerlach magnet.

This has nothing to do with a superselection rule. Superselection rules result from some symmetry principle. E.g., the impossibility within non-relativistic QT of superpositions of states with different mass (due to the fact that mass in non-relativistic QT is a central charge of the Galilei group's Lie algebra), or the superselection rule forbidding superpositions of states with half-integer and integer spin, etc.
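The entanglement picture of the SGE described above can be sketched numerically (my own toy construction, with a coarse three-state position basis): the magnet acts as a unitary correlating spin with position, the global state stays pure, and only the reduced spin state loses its off-diagonal terms.

```python
# Toy Stern-Gerlach sketch (my construction): unitary action
#   |up, center>  -> |up, left>,   |down, center> -> |down, right>.
import numpy as np

up, down = np.eye(2)                 # spin basis vectors
center, left, right = np.eye(3)      # coarse position basis vectors

psi_in = np.kron((up + down) / np.sqrt(2), center)

# Permutation (hence unitary) matrix implementing the mapping above;
# flat index = 3*spin + position, so it swaps 0<->1 and 3<->5.
U = np.eye(6)[[1, 0, 2, 5, 4, 3]]
psi_out = U @ psi_in                 # still a pure state

M = psi_out.reshape(2, 3)            # amplitudes psi[spin, position]
rho_spin = M @ M.conj().T            # reduced spin density matrix
print(rho_spin)                      # diag(1/2, 1/2): no spin interference
```

The point of the sketch: nothing here is a superselection rule; the vanishing off-diagonals of `rho_spin` come purely from tracing out the position degree of freedom after unitary entanglement.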
 
  • #392
vanhees71 said:
No, because you have to irreversibly store the measurement result in order to read it off.
So there are no measurements in the Universe, because the Universe is a closed system by definition. :oldlaugh:
 
  • #393
Demystifier said:
For me, it's impossible to discuss quantum foundations with you because you use double standards. You use one set of reasoning standards in the argument above, but a totally different set of reasoning standards when you claim that there is no collapse of the wave function. With the standards of reasoning as above, one could just as well claim that collapse happens in open systems, that we cannot understand it in detail because the measuring apparatus has too many degrees of freedom, and that there is a plethora of methods leading to an effective description that includes collapse. But when it comes to collapse, you just shift to another, more fundamental, way of thinking which easily dismisses the argument for collapse above. And yet, when one wants to talk about the measurement problem with you, you again retreat to the effective non-fundamental mode of thinking.

You are like a lawyer who argues that those under age 18 are not guilty of their actions because their actions are determined by the fundamental laws of physics, while those older than 18 are guilty because guilt is an emergent phenomenon and we cannot understand all the details of the physical determination of their actions.
There is no collapse and I think my argument is consistent. It is just a wrong attitude to say that the QT of macroscopic open quantum systems by statistical means is less fundamental than the treatment of closed systems. To the contrary, it's very fundamental to understand the "classicality" of the behavior of macroscopic systems, and thus also measurement devices, as an emergent phenomenon. It's not enough to know the Standard Model of elementary particles or some future "better theory beyond the Standard Model". You also have to understand the phenomena of all kinds of "condensed matter" (from the QGP at the high-energy end to the matter surrounding us at the low-energy end).

It's obvious that you cannot even write down the state of a macroscopic system consisting of ##\sim 10^{24}## "molecules/atoms/particles" in all detail, let alone solve the full unitary time evolution of its dynamics as a closed system. This is not even possible in classical physics.
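A back-of-envelope check makes the impossibility concrete (my own illustrative numbers): a pure state of N two-level systems needs ##2^N## complex amplitudes, which already outruns the roughly ##10^{80}## atoms in the observable universe at N = 300, let alone at N ~ ##10^{24}##.

```python
# Counting amplitudes for N two-level systems (illustrative numbers):
# a pure state needs 2**N complex amplitudes ~ 10**(N*log10(2)).
import math

for n_qubits in (50, 100, 300):
    exponent = n_qubits * math.log10(2)   # amplitudes ~ 10**exponent
    print(f"N = {n_qubits:4d}   ~10^{exponent:.0f} amplitudes")
```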
 
  • #394
vanhees71 said:
Of course there can be superpositions
He's talking about superpositions of the macroscopic collective coordinates of the device, i.e. a superposition of the location of marks on the photographic plates in a Stern-Gerlach device, not whether one can later go on to see superposition of the spin of the particle.

vanhees71 said:
This has nothing to do with a superselection rule. Superselection rules result from some symmetry principle.
The absence of interference for the macroscopic collective coordinates is equivalent to, and often called, a superselection principle. Only some superselected quantities result from symmetry principles, e.g. the easiest ones to derive, like mass in NRQM. Dynamical ones are usually much harder and require solving detailed models.
 
  • #395
I prefer the expression "environment induced selection" (eins).

It's also true that there is no limit in the size of a system to show "quantum behavior" like interference/superposition and even entanglement. It's just a matter of being able to isolate the system enough from "the environment" to prevent decoherence, and this is a very challenging task for macroscopic systems.

I think Feynman would have been very pleased about the newest example with drums (at a size in the micrometer region):

https://www.nature.com/articles/d41586-021-01223-4
 
  • Like
Likes Kolmo
  • #396
vanhees71 said:
I prefer the expression "environment induced selection" (eins).
That's a fine term. Working in this area, we often wouldn't use it, as not all of these effects are actually driven by the environment or decoherence. If you read Allahverdyan et al.'s long paper, they discuss how decoherence is actually a subdominant source of classicality in the Curie-Weiss model of measurement. For that reason I prefer a more generic term.
 
  • Like
Likes vanhees71
  • #397
vanhees71 said:
it's very fundamental ... as an emergent phenomenon.
By definition, emergent means not fundamental.
 
  • #398
Emergent means to explain a phenomenon from an underlying fundamental theory, using an appropriate approximation to find the adequate description in terms of an "effective theory".
 
  • #399
vanhees71 said:
Emergent means to explain a phenomenon from an underlying fundamental theory.
Is the Born rule fundamental or emergent?
 
  • #400
The Born rule is fundamental. How often do you want to hear this answer?
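For the record, the rule being called fundamental here is the standard textbook statement: the probability of outcome ##|\phi\rangle## in state ##|\psi\rangle## is ##|\langle\phi|\psi\rangle|^2##. A minimal sketch:

```python
# Born rule (standard textbook statement): probability of outcome |phi>
# in state |psi> is |<phi|psi>|**2.
import numpy as np

psi = np.array([1, 1j]) / np.sqrt(2)          # (|0> + i|1>)/sqrt(2)
basis = [np.array([1, 0]), np.array([0, 1])]  # measurement basis |0>, |1>

probs = [abs(np.vdot(phi, psi)) ** 2 for phi in basis]
print(probs)                                  # [0.5, 0.5], summing to 1
```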
 