Assumptions of the Bell theorem

  • #481
Ok, then you think quantum theory is incomplete and you need some extension of it to solve the measurement problem. That's a legitimate opinion, but there's no empirical evidence whatsoever hinting at the necessity of such an extension, let alone a clue in which direction one should look for it.
 
  • #482
vanhees71 said:
There's no way to understand phenomena like friction and dissipation, the approach to thermodynamic equilibrium, the "arrow of time" and all such phenomena by considering only "closed systems".
I disagree.

vanhees71 said:
As Anderson famously put it: "more is different".
He didn't say "coarse grained is different".
 
  • #483
Well, if condensed matter physicists didn't "coarse grain", they couldn't understand anything of their subject ;-).
 
  • #484
vanhees71 said:
Ok, then you think quantum theory is incomplete and you need some extension of it to solve the measurement problem. That's a legitimate opinion, but there's no empirical evidence whatsoever hinting at the necessity of such an extension, let alone a clue in which direction one should look for it.
I agree that there is no empirical evidence, but I think there is logical evidence: if something cannot be explained by considering the full closed system (on which we agree), then it also cannot be explained by considering its open subsystem.
 
  • #485
vanhees71 said:
Well, if condensed matter physicists didn't "coarse grain", they couldn't understand anything of their subject ;-).
Some things they would. For instance, energy conservation in a closed system is a general theorem which is valid for any number of atoms and does not depend on coarse graining. Likewise, the principle that closed-system QM cannot produce definite outcomes is another such general theorem.
 
  • #486
Of course, if you draw wrong conclusions, you are in danger of tilting at windmills!
 
  • #487
Demystifier said:
Some things they would. For instance, energy conservation in a closed system is a general theorem which is valid for any number of atoms and does not depend on coarse graining. Likewise, the principle that closed-system QM cannot produce definite outcomes is another such general theorem.
Yes, but measurements by necessity involve open systems of a certain form, so that you get an irreversible process resulting in a measurement result/event to which quantum theory assigns a probability.
 
  • #488
Kolmo said:
Yes but measurements by necessity involve open systems
Not necessarily. If you include the measuring apparatus as a part of the closed system, then measurement can be described by the closed system.
 
  • #489
Demystifier said:
Not necessarily. If you include the measuring apparatus as a part of the closed system, then measurement can be described by the closed system.
Then, when you look at the macroscopic coarse-grained collective coordinates within that closed system, you see that they don't show interference terms and that the evolution is irreversible, and thus you have a measurement result.

*Not due to some "breakdown" of quantum theory though, before anybody starts suggesting that.
 
  • #490
Kolmo said:
Then, when you look at the macroscopic coarse-grained collective coordinates within that closed system, you see that they don't show interference terms and that the evolution is irreversible, and thus you have a measurement result.
Coarse graining indeed explains how the interference terms apparently disappear and how the evolution becomes apparently irreversible. However, and this is the crucial point, it does not explain how the measurement results appear. See e.g. my http://thphys.irb.hr/wiki/main/images/5/50/QFound3.pdf pages 21-22.
 
  • #491
Demystifier said:
Coarse graining indeed explains how the interference terms disappear and how the evolution becomes irreversible. However, and this is the crucial point, it does not explain how the measurement results appear.
I wasn't really discussing decoherence. Coarse-grained macroscopic DOFs also lack interference due to effects which dominate over decoherence.

However regardless, I would say that of course quantum theory doesn't say which specific result arises because it is a probabilistic theory. This isn't really an inconsistency though.
 
  • #492
Kolmo said:
I wasn't really discussing decoherence. Coarse-grained macroscopic DOFs also lack interference due to effects which dominate over decoherence.
In this case I have no idea how that is supposed to explain measurement results, i.e. definite measurement outcomes.

Kolmo said:
However regardless, I would say that of course quantum theory doesn't say which specific result arises because it is a probabilistic theory. This isn't really an inconsistency though.
I don't have a problem with that. There are, for instance, stochastic versions of the "Bohmian" interpretation, which are fundamentally probabilistic too, and I'm fine with that. The question is not which specific result arises; the question is why only one specific result arises. If you don't adopt some specific interpretation (Bohm, many worlds, fundamental collapse, ...), then the answer is not clear. If you just postulate it, without attempting to explain it from something deeper, then there are problems which I can only discuss if you specify some details of this postulate.
 
  • #493
Demystifier said:
In this case I have no idea how that is supposed to explain measurement results, i.e. definite measurement outcomes.
It's just that there are effects other than decoherence that suppress interference. Decoherence wasn't even the first one discovered.

Demystifier said:
The question is not which specific result arises. The question is why only one specific result arises
I find this question hard to understand. If you end up with a classical probability distribution over the outcomes, then you know an outcome occurred but not which one. QM will only tell you each outcome's probability.
 
  • #494
Demystifier said:
The question is not which specific result arises. The question is why only one specific result arises.
What is an example where more than one result arises?
 
  • #495
Kolmo said:
I find this question hard to understand. If you end up with a classical probability distribution over the outcomes, then you know an outcome occurred but not which one.
It's a bit subtle to explain where the problem is. To get this classical probability, you first need to do the coarse graining. And to do the coarse graining, you first need to decide which degrees of freedom are irrelevant. The problem is that this decision is arbitrary, subjective and anthropomorphic. For instance, who or what makes such a decision in the absence of conscious beings? How can nature itself, which by definition is the whole of nature (not an open part of it), do the coarse graining? If two agents define the coarse graining differently, does it imply that for one agent the outcome occurs while for the other the same outcome does not occur? This leads to various Wigner's-friend-type paradoxes.
 
  • #496
martinbn said:
What is an example where more than one result arises?
An example is the many-worlds interpretation. (But that's probably not what you meant by "example".)
 
  • #497
Demystifier said:
It's a bit subtle to explain where the problem is. To get this classical probability, you first need to do the coarse graining. And to do the coarse graining, you first need to decide which degrees of freedom are irrelevant. The problem is that this decision is arbitrary, subjective and anthropomorphic. For instance, who or what makes such a decision in the absence of conscious beings? How can nature itself, which by definition is the whole of nature (not an open part of it), do the coarse graining? If two agents define the coarse graining differently, does it imply that for one agent the outcome occurs while for the other the same outcome does not occur? This leads to various Wigner's-friend-type paradoxes.
The decision about which macroscopic observables are relevant is not arbitrary but simply specified by the system under consideration, and it is of course subject to experimental test. If you choose the wrong observables in your theoretical description, you will not be successful in describing the observed phenomenon.

Of course it can also be that you have a different resolution. Compare the CMBR measurement by the COBE satellite with those by WMAP and PLANCK, each with higher resolution ("less coarse graining") than the previous one. The resolution, the "coarseness of the coarse graining", is not arbitrary or subjective but given by the macroscopic system (in this case a measurement device).
 
  • #498
Demystifier said:
To get this classical probability, you first need to do the coarse graining
If two agents define coarse graining differently, does it imply that for one agent the outcome occurs and for the other the same outcome does not occur?
There's not really any ambiguity here. Macroscopic observables are represented by well-defined sums of individual atomic observables (in general, sums of products), resulting in an aggregate operator. In measurement models it's a lot of work, but you can show that the DOFs associated with such observables don't have interference and irreversibly store a result. Older papers show that experiments designed to demonstrate interference for such functional sum observables necessarily "melt" the device back into an atomic soup.

The functional sum of atomic observables giving the macroscopic pointer DOF is well-defined given the particular measurement setup. It's not some random free-for-all with every theoretician permitted to do whatever they want.
 
  • #499
vanhees71 said:
The resolution, "coarseness of the coarse graining", is not arbitrary or subjective but given by the macroscopic system (in this case a measurement device).
Let us try to understand this with an example. In a system of ##10^{23}## atoms, which of them form the "small open system" and which the "macroscopic environment"?
 
  • #500
Kolmo said:
There's not really any ambiguity here. Macroscopic observables are represented by well defined sums of individual atomic observables (in general sums of products) resulting in an aggregate operator.
There is an ambiguity. Show me a concrete example of such a "well defined" sum and I will explain why is it ambiguous.
 
  • #501
Demystifier said:
Let us try to understand this with an example. In a system of ##10^{23}## atoms, which of them form the "small open system" and which the "macroscopic environment"?
I think you have the picture wrong.

Say we have an ##n##-site operator ##a(q_{1}, \ldots , q_{n})##, which is some operator over the Hilbert space ##\mathcal{H}_{1...n}## of these ##n## particles. We can then form macroscopic operators via sums like:
$$A = \frac{1}{C}\sum_{f} a(f)$$
where ##f## is one of the collections of ##n## particles defining ##a## and we perform the sum over all such partitions into ##n## particles. Certain checks with relativity, material physics, etc. can in addition show that a perfectly fine-grained measurement of ##A## is not possible, i.e. it is not physically possible to distinguish all of its eigenvalues, so for a realistic model you replace ##A## with a coarse-grained ##\bar{A}##.

One can then show that ##\bar{A}## doesn't display interference terms, since it commutes with all other such macro-observables ##\bar{B}## and with microscopic observables. A very early, simple proof is given in the first edition of Gottfried's text.

These macro-observables are then your pointer variables. Each one is given by a particular well-defined sum. Seems clear to me.

It's also just one particular method for showing this.
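The approximate commutativity claimed for such macro-observables can be illustrated with a small numerical sketch. This is my own toy construction in Python/NumPy, not the construction from Gottfried's text: the single-site Pauli operators ##X## and ##Z## fail to commute, but for the intensive collective operators ##m_x = (1/n)\sum_i X_i## and ##m_z = (1/n)\sum_i Z_i## the spectral norm of the commutator falls off as ##2/n##:

```python
import numpy as np
from functools import reduce

# Single-qubit operators
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def collective(op, n):
    """Intensive collective observable (1/n) * sum_i op_i on n qubits."""
    total = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n):
        total += reduce(np.kron, [op if j == i else I2 for j in range(n)])
    return total / n

# ||[X, Z]|| = 2 on a single site, but the intensive collective
# versions nearly commute: ||[m_x, m_z]|| = 2/n -> 0 as n grows.
for n in (2, 4, 8):
    m_x, m_z = collective(X, n), collective(Z, n)
    comm_norm = np.linalg.norm(m_x @ m_z - m_z @ m_x, ord=2)
    print(n, comm_norm)  # prints 1.0, 0.5, 0.25
```

So in the large-##n## limit the pointer variables behave like a mutually commuting, effectively classical, set of observables, which is one way of phrasing why no interference shows up between them.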
 
  • #502
Kolmo said:
These macro-observables are then your pointer variables. Each one is given by a particular well-defined sum. Seems clear to me.
With the theory you presented, can you explain why typical macro pointers don't distinguish a cat in the state ##|dead\rangle+|alive\rangle## from the cat in the state ##|dead\rangle-|alive\rangle##?
 
  • #503
Demystifier said:
With the theory you presented, can you explain why typical macro pointers don't distinguish a cat in the state ##|dead\rangle+|alive\rangle## from the cat in the state ##|dead\rangle-|alive\rangle##?
Yes, although it's of course just one of a few methods. Essentially there's no physical observable which fails to commute with aggregate observables like "alive" and "dead", taking them here as shorthand for more well-defined macro-quantities.

A coupling which would attempt to measure such an observable would "melt" the cat into soup via the couplings that enact it. So as long as the cat is left as a macroscopic body, it has pointer variables.
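This can be checked directly in a toy model. In the sketch below (my own construction: an ##n##-qubit stand-in for the cat, not a realistic model), the collective magnetization, playing the role of a typical macro pointer, gives identical statistics for the two phase-differing cat states, while the observable that does distinguish them is the ##n##-body parity ##X^{\otimes n}##, exactly the kind of operator whose measurement would require addressing every constituent at once:

```python
import numpy as np
from functools import reduce

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

n = 6  # toy "cat" made of 6 qubits
dead = np.zeros(2**n); dead[0] = 1.0     # |00...0>
alive = np.zeros(2**n); alive[-1] = 1.0  # |11...1>
cat_plus = (dead + alive) / np.sqrt(2)   # |dead> + |alive>
cat_minus = (dead - alive) / np.sqrt(2)  # |dead> - |alive>

# collective magnetization (1/n) sum_i Z_i: a typical macro pointer
m_z = sum(reduce(np.kron, [Z if j == i else I2 for j in range(n)])
          for i in range(n)) / n
# global parity X (x) X (x) ... (x) X: an n-body observable that sees the phase
parity = reduce(np.kron, [X] * n)

for psi in (cat_plus, cat_minus):
    print((psi @ m_z @ psi).real, (psi @ parity @ psi).real)
# the pointer expectation is 0.0 for both cat states (their full m_z
# statistics coincide as well), while parity gives +1.0 vs -1.0
```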
 
  • #504
Kolmo said:
Essentially there's no physical observable which fails to commute with aggregate observables like "alive" and "dead",
Why? How do you know that there is no such observable?

Kolmo said:
A coupling which would attempt to measure such an observable would "melt" the cat into soup via the couplings that enact it. So as long as the cat is left as a macroscopic body, it has pointer variables.
I don't understand the second sentence. A macroscopic body melted into a soup is still a macroscopic body. A body does not need to be solid; a liquid is a body too. I know you were using a metaphor, but I don't understand the metaphor.
 
  • #505
Demystifier said:
Because the open system is a subsystem of the full closed system. Hence the properties of the open system can be derived from the properties of the closed system, and not the other way around.

Demystifier said:
Great, we finally agree that a closed system cannot resolve the measurement problem. What we disagree about is that you think that an open system (which, in my understanding, is a subsystem of the full closed system) can resolve it.

Demystifier said:
I agree that there is no empirical evidence, but I think there is logical evidence: if something cannot be explained by considering the full closed system (on which we agree), then it also cannot be explained by considering its open subsystem.
But an (open) subsystem does provide additional structure not present in the full closed system alone. Even that additional structure is still not enough to resolve the measurement problem. But that additional structure is implicitly present in many arguments, so highlighting the importance of open systems for the measurement problem seems reasonable. (And it makes sense to me, because Heisenberg and other founders also stressed its importance.)
 
  • #506
Demystifier said:
Why? How do you know that there is no such observable?
It can be proven, but it's a long argument. There are shorter arguments if one uses the abstract C*-formalism, but they sacrifice ease for brevity.

Demystifier said:
I don't understand the second sentence. A macroscopic body melted into a soup is still a macroscopic body. I know you were using a metaphor, but I don't understand the metaphor.
If you try to measure observables that don't commute with ##\bar{A}##, then you necessarily reduce the device to a dispersed plasma of individual atoms and subatomic particles, which is not normally referred to as a macroscopic body or a device.

Any observable ##Q## obeying ##[Q,\bar{A}] \neq 0## is not compatible with the device remaining as a solid stable composite body.

Now there are old arguments from Ludwig in:
G. Ludwig: "Die Grundlagen der Quantenmechanik", Springer, Berlin 1954 [1]
that such ##Q## in most cases probably can't be performed at all as the coupling Hamiltonians needed to enact them aren't physical at all. Gottfried says something similar in Sections 18-20 of his old text.
There are similar parallel arguments if one follows other approaches to measurement, such as decoherence or the more abstract treatments with C*-algebras.
There are similar parallel arguments if one follows other approaches to measurement such as decoherence or the more abstract treatments with C*-algebras.

Either way it's clear that a measurement gives rise to irreversible storage of an outcome, with the pointer variables being well-defined expressions and no real breakdown of quantum theory. The only novelty relative to the classical case is the fundamental probabilism of the outcomes and that we cannot ascribe a well-defined value to those quantities left unmeasured (which is the non-Kolmogorov nature of QM mentioned earlier).

[1] I learned to read German just to read this, so I have fond memories of it!
 
  • #507
Demystifier said:
Of course, I meant closed system including the measuring apparatus and the environment.
Which means that, while in principle yes, all the information is in that system, in practice most of that information is inaccessible to us. We certainly can't do quantum state tomography "from the outside" on a whole ensemble of identically prepared system + measuring apparatus + environment copies in order to find out exactly which pure state is being prepared.

Also, considering this closed system doesn't solve the measurement problem either: since it is not interacting with anything, it should just undergo unitary evolution all the time, and therefore we end up at something like the MWI. I see that this is more or less where you ended up in your exchange with @vanhees71.
 
  • #508
Trying to think of a way to phrase this, but if that unitary evolution, given the observable algebra of the device, results in a classical probability distribution over the observed outcomes, isn't that all you need? Each term can just be read off as a probabilistic weighting of given values of the pointer variable(s). It's not deterministic, but I don't see the issue.
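A minimal sketch of that reading (my own toy model: one system qubit entangled with a single qubit standing in for the pointer/environment, not a realistic measurement model) shows the reduced state of the system after the interaction is diagonal, and its diagonal entries are exactly the classical probability distribution over outcomes:

```python
import numpy as np

# after a measurement-type interaction: (|0>|pointer-up> + |1>|pointer-down>)/sqrt(2)
psi = np.zeros(4, dtype=complex)
psi[0b00] = psi[0b11] = 1 / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# reduced state of the system: trace out the pointer (second qubit)
rho_sys = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(np.round(rho_sys.real, 3))
# diagonal entries 0.5 and 0.5, zero off-diagonals: no interference
# terms remain, and the diagonal is the classical probability
# distribution over the two pointer readings
```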
 
  • #509
Kolmo said:
Any observable ##Q## obeying ##[Q,\bar{A}] \neq 0## is not compatible with the device remaining as a solid stable composite body.
Without going into detailed mathematical proofs, which is not my main point: would you agree that the reason for this is that your axiomatic framework considers PERFECTLY optimal inferences only?

I.e., let's say the information-theoretic summary is: "We show that QM is the optimal inference theory of a 'classical agent' (given certain conditions)." Then the above conclusion does not allow agents to have some variation AROUND the optimal value.

This seems unnatural to me, and too strong an assumption from a perspective where you consider agents to evolve and require the existence of variation. If you relax this a bit, then the observables are not impossible, just corresponding to "unstable agents", or equivalently implying that if such a crazy agent were backed up, it would act destructively towards its own environment.

Does this informal summary make sense to you, or would you disagree with it?

I certainly haven't read that book, but "probably can't be performed at all as the coupling Hamiltonians needed to enact them aren't physical at all" sounds like it means just the above?

/Fredrik
 
  • #510
Fra said:
Without going into detailed mathematical proofs which is not my main point, would you agree that the reason for this, is that your axiomatic framework considers PERFECTLY optimal inferences only?
No. Models of measurement equally cover POVMs, weak measurements and so on. The Curie-Weiss model of measurement, which is the default "very detailed" measurement model, naturally produces POVMs. So it's not restricted to optimal measurements.
 
