Copenhagen: Restriction on knowledge or restriction on ontology?

  • #91
DarMM said:
That's just a mathematical fact of QM. There's no injective Gelfand representation for the algebra of observables

My point is that the word "observable" is misapplied. Microscopic observables are not observables. The fact that the operators ##x## and ##\frac{\partial}{\partial x}## don't have simultaneous eigenstates is not a fact about quantum mechanics, it's a fact about functions which was true before quantum mechanics was ever invented.
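As a quick illustration of the purely mathematical point (just the standard commutator calculation, spelled out for concreteness): for any differentiable ##f##,

##\left(x\,\frac{\partial}{\partial x} - \frac{\partial}{\partial x}\,x\right)f(x) = x f'(x) - \frac{\partial}{\partial x}\big(x f(x)\big) = -f(x),##

so ##\left[x, \frac{\partial}{\partial x}\right] = -1## on functions, and since this commutator is a nonzero multiple of the identity, the two operators cannot share even a single eigenfunction.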
 
  • #92
stevendaryl said:
Microscopic observables are not observables.
This rejects all traditional interpretations and brings you close to the thermal interpretation, where operators are q-observables only, and what is observed are inaccurate values of ##\langle A\rangle##.
 
  • #93
stevendaryl said:
My point is that the word "observable" is misapplied. Microscopic observables are not observables. The fact that the operators ##x## and ##\frac{\partial}{\partial x}## don't have simultaneous eigenstates is not a fact about quantum mechanics, it's a fact about functions which was true before quantum mechanics was ever invented.

Actually, ##x## doesn't have any eigenstates at all, but you know what I mean.
 
  • #94
stevendaryl said:
My point is that the word "observable" is misapplied. Microscopic observables are not observables. The fact that the operators ##x## and ##\frac{\partial}{\partial x}## don't have simultaneous eigenstates is not a fact about quantum mechanics, it's a fact about functions which was true before quantum mechanics was ever invented.
Well, certainly the mathematical properties of those operators existed prior to QM, but I'm not sure that means they have no physical content with regard to QM. I mean, it was a fact that there were non-trivial adjoint bundles prior to Yang-Mills theories, but that doesn't mean those topological sectors are of no physical import in Yang-Mills theories. Otherwise we'd be close to saying the mathematics of a theory has no physical content.

Or like saying the Riemann tensor would have a decomposition into Weyl and Ricci curvature prior to GR. Sure, but it still means something physically via GR where such tensors have a physical meaning.

It directly tells you that there is no state in which those two observables both have sharp values beyond a certain limit.
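(For reference, the quantitative form of that limit is the Robertson uncertainty relation: for any state ##\psi## and self-adjoint ##A##, ##B##,

##\Delta_\psi A\,\Delta_\psi B \ge \tfrac{1}{2}\,\bigl|\langle\psi|[A,B]|\psi\rangle\bigr|,##

which for ##A = x## and ##B = -i\hbar\,\partial_x## gives ##\Delta x\,\Delta p \ge \hbar/2## in every state.)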
 
  • #95
A. Neumaier said:
The point is that after the whole statistics is collected, one knows the empirical distribution of all measurement results exactly; thus this defines a probability measure. In the limit of an infinite number of measurements you get a limiting measure for all measured variables. This measure, and only this, is sufficient to determine the outcomes - no matter how many sample spaces are used in the microscopic description.
Over what space though? Over the space of all trials, i.e. ##\omega_i## with ##i## a trial index?

I think I get what you mean, but regardless there won't be a common ##S_z## and ##S_x## sample space.
 
  • #96
Do either of you know of a reference for this macroscopic configuration probability measure idea?
 
  • #97
DarMM said:
Over what space though? Over the space of all trials, i.e. ##\omega_i## with ##i## a trial index?
Yes.
DarMM said:
I think I get what you mean, but regardless there won't be a common ##S_z## and ##S_x## sample space.
But there is a common ##S_z'## and ##S_x'## sample space, where ##S_z'## and ##S_x'## refer to the macroscopic pointer variables actually read when taking a measurement. @stevendaryl's claim is that only these need to be explained by an interpretation of QM, since these are what is actually read, whereas the connection between them and the measured system is (in principle) a matter of theory.
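A minimal sketch of what such a common sample space over trials could look like (my own toy illustration, not a worked-out proposal from this thread; the state and measurement settings below are assumed for definiteness): each trial record ##\omega_i## is the pair (which apparatus was used, what the pointer showed), and the empirical frequencies over all records define a single classical probability measure.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

# Illustrative spin-1/2 state |psi> = cos(theta)|+z> + sin(theta)|-z>
theta = np.pi / 5
psi = np.array([np.cos(theta), np.sin(theta)])

def pointer_probs(setting):
    """Born probabilities for the pointer to show +1/2 or -1/2 in the given setting."""
    if setting == "z":
        plus = np.array([1.0, 0.0])
    else:  # "x" setting
        plus = np.array([1.0, 1.0]) / np.sqrt(2)
    p_plus = abs(plus @ psi) ** 2
    return p_plus, 1.0 - p_plus

# One classical sample space: each trial omega_i = (setting used, pointer reading)
records = []
for _ in range(100_000):
    setting = rng.choice(["z", "x"])            # which apparatus was used on this trial
    p_plus, _ = pointer_probs(setting)
    reading = "+1/2" if rng.random() < p_plus else "-1/2"
    records.append((setting, reading))

# Empirical measure on {("z","+1/2"), ("z","-1/2"), ("x","+1/2"), ("x","-1/2")}
counts = Counter(records)
for outcome in sorted(counts):
    print(outcome, counts[outcome] / len(records))
```

Note that no record ever contains both an ##S_z'## and an ##S_x'## reading for the same trial, which is where the contextuality question re-enters below.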
 
  • #98
DarMM said:
Do either of you know of a reference for this macroscopic configuration probability measure idea?
It seems to be @stevendaryl's original idea, which he tried to communicate here on PF.
 
  • #99
A. Neumaier said:
It seems to be @stevendaryl's original idea, which he tried to communicate here on PF.
Thank you. I'll need to think about it a bit, as I'm not so sure it is correct. Despite ##S_z'## and ##S_x'## being macroscopic quantities, I'm not convinced they really do have a common sample space in a meaningful way that contradicts, or renders irrelevant, the observation that the microscopic quantities don't due to contextuality. Especially in light of some discussions by Jeffrey Bub, who considered exactly this in his papers on the interpretation of QM.

@stevendaryl has had a lot of original ideas on this thread:
  1. The foundational irrelevancy of Contextuality
  2. The physical irrelevancy of the operators representing observables not commuting
  3. The irrelevancy of different sample spaces due to the quantities from each being amplified up to the macroscopic realm
Each of these ideas alone runs counter to most thinking in quantum foundations, quantum probability and quantum information. So I'll have to stop there to absorb and respond to them due to their novelty.
 
  • #100
DarMM said:
Thank you. I'll need to think about it a bit, as I'm not so sure it is correct. Despite ##S_z'## and ##S_x'## being macroscopic quantities, I'm not convinced they really do have a common sample space in a meaningful way that contradicts, or renders irrelevant, the observation that the microscopic quantities don't due to contextuality. Especially in light of some discussions by Jeffrey Bub, who considered exactly this in his papers on the interpretation of QM.

@stevendaryl has had a lot of original ideas on this thread:
  1. The foundational irrelevancy of Contextuality
  2. The physical irrelevancy of the operators representing observables not commuting
  3. The irrelevancy of different sample spaces due to the quantities from each being amplified up to the macroscopic realm
Each of these ideas alone runs counter to most thinking in quantum foundations, quantum probability and quantum information. So I'll have to stop there to absorb and respond to them due to their novelty.
I believe that 1. and 2. are a consequence of 3., and my previous remarks apply to 3.
 
  • #101
A. Neumaier said:
I believe that 1. and 2. are a consequence of 3., and my previous remarks apply to 3.
Yes that is correct, they're all ultimately the same thing in a sense.
 
  • #102
stevendaryl said:
There is no need for multiple probability spaces; the only probability space you need is the probability space for those macroscopic configurations.
I've tried to work this out, but I'm not seeing it. Can you give an example with some mathematical details, even just a sketch, not necessarily in full detail?

Like what exactly are the macroscopic degrees of freedom, what sample space do they use and how do you still end up with CHSH violations despite the single sample space?
 
  • #103
DarMM said:
I still think this is too strong. You are using the lack of a solution to the measurement problem to dismiss any insight from the different probabilistic structure of the theory. The fact of different sample spaces has many implications in quantum information; it isn't nothing, or a red herring, just because it doesn't provide a solution to the measurement problem.

stevendaryl said:
Well, it seems to me that the various ways of saying that quantum mechanics is local is just a matter of shunting the issues that are of interest elsewhere---onto the measurement problem, or the single outcome problem.

Is this related?
Eric G. Cavalcanti
https://arxiv.org/abs/1602.07404
 
  • #104
DarMM said:
I've tried to work this out, but I'm not seeing it. Can you give an example with some mathematical details, even just a sketch, not necessarily in full detail?

Like what exactly are the macroscopic degrees of freedom, what sample space do they use and how do you still end up with CHSH violations despite the single sample space?
Have you seen Eberhard's proof of Bell inequalities? It might be the thing you are asking for. It is contained in this paper: https://journals.aps.org/pra/abstract/10.1103/PhysRevA.47.R747. The paper is behind a paywall, but I have posted the Bell-inequality part of the proof here: https://www.physicsforums.com/threa...y-on-probability-concept.944672/#post-5977632
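Not Eberhard's derivation itself, but as a quick numerical reference point for the CHSH value that any single-sample-space account has to reproduce, here is a short check of the quantum prediction (standard singlet state and textbook angles, nothing specific to Eberhard's paper):

```python
import numpy as np

# Pauli matrices and the singlet state (|01> - |10>)/sqrt(2)
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
singlet = np.array([0, 1, -1, 0], dtype=complex) / np.sqrt(2)

def spin(angle):
    """Spin observable along a direction at `angle` in the x-z plane."""
    return np.cos(angle) * sz + np.sin(angle) * sx

def E(a, b):
    """Quantum correlation <psi| (sigma_a x sigma_b) |psi> for the singlet."""
    O = np.kron(spin(a), spin(b))
    return np.real(singlet.conj() @ O @ singlet)

# Standard CHSH settings
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S), 2 * np.sqrt(2))   # ~2.828..., exceeding the classical single-sample-space bound of 2
```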
 
  • #105
vanhees71 said:
What's local are the interactions
Interactions between what? Between field operators, obviously. But according to the statistical ensemble interpretation, the field operator is a tool to analyze the ensembles of systems, not the individual systems. Hence the local interactions are interactions between the ensembles of systems, not between the individual systems.

So what then are the interactions between the individual systems? The statistical ensemble interpretation of QFT does not tell. But the Bell theorem tells us that, if interactions between individual systems exist at all, then those interactions are nonlocal.
 
  • #106
This is also just personal opinion. My personal opinion is that the interactions described by QFT are local for each individual system. For the same reasons you give for your opinion, this cannot be ruled out within QFT.

The Bell theorem is about a fictitious local deterministic theory obeying Bell's inequalities, and such theories are ruled out with tremendous significance by experiments that instead verify the predictions of (local!) QFT. It's also about the probabilistic statements of both the fictitious local deterministic theory and QFT. To Bell's dismay, QFT is valid, while any local deterministic theory is ruled out. That's the great achievement of Bell: his work turned philosophical gibberish a la EPR into a scientific statement that can be tested by experiment, and QFT, not any local deterministic theory, delivers the correct description. Whether or not there is a deterministic non-local theory that describes these well-established facts, I cannot say, because there seems to be no such thing yet. That's understandable, because it's very hard to conceive of a non-local description that is consistent with the causality structure of special (let alone general) relativity.

Note that above, I mean local/non-local in the sense of interactions!
 
  • #107
  • #108
DarMM said:
I've tried to work this out, but I'm not seeing it. Can you give an example with some mathematical details, even just a sketch, not necessarily in full detail?

Okay, here's a sketch of the idea, which isn't all that profound. The minimal interpretation of quantum mechanics basically says that if you set up a system so that it is described by the state ##|\psi\rangle## and you measure some observable ##A## of the system, then you will get an eigenvalue ##a## of the corresponding operator with a probability given by: ##P_a = \langle \psi|\Pi_{A,a}|\psi\rangle##, where ##\Pi_{A,a}## is the projection operator, which has eigenvalue 1 on any state in which ##A## has definite value ##a##, and has eigenvalue 0 on any state in which ##A## has a definite value other than ##a##.

Now, the above prescription has the phrase "you measure some observable ##A##". What does that mean? Well, to measure an observable of a system means to set up an interaction between that system and a measuring device so that distinct values of that observable lead to macroscopically different states of the measuring device. In other words, for every possible value ##a## of the microscopic observable, there is a macroscopic configuration ##C_a## of the measuring device, such that the measuring device reliably ends up in configuration ##C_a## whenever the system of interest has value ##a##. For now, a macroscopic configuration is basically whatever description of a macroscopic system one could obtain by inspection: This red light is on. There is a black dot here rather than there. This display shows such and such value. The Geiger counter is clicking.

Now, although it's enormously difficult to nail down the details completely, I think that most physicists are fairly confident that measuring devices themselves are described by the same quantum mechanics as the systems being studied. In principle, even if intractable in practice, one could do a full-fledged quantum mechanical analysis of the system + measuring device + relevant environment, and one would find something like this:

##|\psi_a\rangle \otimes |start\rangle \Rightarrow |\psi_a\rangle \otimes |C_a\rangle##

where ##|\psi_a\rangle## is the state of the system of interest when it is an eigenstate of ##A## with eigenvalue ##a##, and ##|start\rangle## is the initial state of the measuring device + environment, and ##|C_a\rangle## is the state of the measuring device plus environment after the measurement takes place.

I say "something like this" rather than exactly this because the reality is much more complicated. There is no single state of measuring device + environment; there are enormously many microscopically distinguishable states corresponding to any macroscopic description, and a measurement process is an irreversible change, which is hard to describe using quantum mechanics. But all difficulties aside, I think most people are confident that there is nothing going on in a measurement process that isn't in principle describable by quantum mechanics.

So to the extent that my sketch can be accepted as approximately correct, with a large grain of salt, we can ask what happens if you use the same measuring setup to measure the microscopic system when it is not in an eigenstate of ##A##. Well, since the evolution equations of quantum mechanics are linear, it follows that a superposition of initial states would lead to a superposition of final states:

##\sum_a \alpha_a |\psi_a\rangle \otimes |start\rangle ##
##\Rightarrow \sum_a \alpha_a |\psi_a\rangle \otimes |C_a\rangle##

Then the Born rule saying that there is probability ##|\alpha_a|^2## that the microscopic system will be measured to have eigenvalue ##a## is essentially the same as saying that the measuring device will later be found to be in the configuration ##C_a## with probability ##|\alpha_a|^2##. So the Born rule for the microscopic system presumably follows from the Born rule for the measuring device plus the definition of what it means to measure something.

So my claim is that the empirical content of the minimal interpretation is equivalent (in principle) to the following recipe:
  1. Describe the whole universe (or the part that's relevant) as a quantum system.
  2. Let that system evolve according to the usual unitary rule.
  3. Then decompose the final state into a superposition of macroscopically distinguishable states, each of which has definite values for all macroscopic properties.
  4. Assume that the macroscopic system will be found in exactly one of those states, with a probability given by the square of the corresponding amplitude.

The Born rule for measurements would (I claim) follow from the above recipe, together with the definition of what it means to measure a microscopic quantity.

Mathematically, I think the above recipe could be formulated in terms of projection operators. Presumably any fact about the world that could be verified by observation, such as "there is a black spot on the left photographic plate", corresponds to a claim that some coarse-grained observable has a value in some range. Such claims can be formulated in terms of projection operators. So for every such macroscopic statement ##c##, there is presumably a corresponding projection operator ##\Pi_c##. If the observables are taken to be of low enough precision and coarse-grained enough, then all the corresponding projection operators are approximately commuting. This means we can talk in terms of macroscopic configurations, which are just maximal collections of compatible macroscopic claims. So we can in principle come up with projection operators ##\Pi_j## on the state of the universe such that for ##j \neq k##, the projection operators ##\Pi_j## and ##\Pi_k## correspond to macroscopically distinguishable states of the universe.

Then given an initial state of the universe ##|\psi_0\rangle##, the probability that the universe will be in macroscopic configuration ##j## at a later time ##t## would be given by:

##P_j(t) = \langle \psi_0 | e^{+iHt} \Pi_j e^{-iHt} |\psi_0\rangle \equiv \langle \psi_0 |\Pi_j(t)| \psi_0 \rangle##

where ##\Pi_j(t)## is the time-dependent Heisenberg operator corresponding to ##\Pi_j##:

##\Pi_j(t) = e^{+iHt}\Pi_j e^{-iHt}##

If we could actually calculate ##P_j(t)##, that would give the entire empirical content of quantum mechanics.
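To make the structure of the recipe concrete, here is a toy numerical sketch (the two-level system, the single-qubit "pointer", the coupling Hamiltonian and the interaction time below are all illustrative assumptions of mine, not anything derived above): the measurement interaction entangles the system with macroscopically distinguishable pointer configurations, and ##P_j(t) = \langle\psi_0|\Pi_j(t)|\psi_0\rangle## then reproduces the Born weights ##|\alpha_a|^2##.

```python
import numpy as np
from scipy.linalg import expm

# Toy "universe": a two-level measured system plus a two-level pointer.
alpha, beta = 0.6, 0.8                               # illustrative amplitudes, alpha^2 + beta^2 = 1
psi_sys = np.array([alpha, beta], dtype=complex)
pointer_ready = np.array([1, 0], dtype=complex)      # pointer in its "ready" configuration
psi0 = np.kron(psi_sys, pointer_ready)               # initial state of system + apparatus

# Measurement interaction: if the system is in |1>, swing the pointer from "ready" to "fired".
P1 = np.array([[0, 0], [0, 1]], dtype=complex)       # projector |1><1| on the system
sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)
g = 1.0
H = g * np.kron(P1, sigma_y)
t = np.pi / (2 * g)                                  # time at which the pointer has fully swung
U = expm(-1j * H * t)

# Projectors onto the two macroscopically distinct pointer configurations
I2 = np.eye(2, dtype=complex)
Pi_ready = np.kron(I2, np.diag([1.0, 0.0]).astype(complex))
Pi_fired = np.kron(I2, np.diag([0.0, 1.0]).astype(complex))

def P_j(Pi):
    """P_j(t) = <psi_0| U^dag Pi U |psi_0>, i.e. the expectation of the Heisenberg-picture projector."""
    Pi_t = U.conj().T @ Pi @ U
    return float(np.real(psi0.conj() @ Pi_t @ psi0))

print(P_j(Pi_ready), alpha**2)   # ~0.36: pointer still "ready"  <-> Born weight of |0>
print(P_j(Pi_fired), beta**2)    # ~0.64: pointer has "fired"    <-> Born weight of |1>
```

In this toy model the configuration projectors commute exactly; in the realistic case they would only be the approximately commuting, coarse-grained projectors described above.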
 
  • #109
vanhees71 said:
This is also just personal opinion. My personal opinion is that the interactions described by QFT are local for each individual system. For the same reasons you give for your opinion, this cannot be ruled out within QFT.

The empirical content of QFT is, like the empirical content of nonrelativistic quantum mechanics, composed of two distinct pieces: (1) You calculate amplitudes for processes using the Schrodinger equation (non-relativistically), or using the S-matrix (relativistically). (2) You square the amplitudes to get probabilities for measurement results. The issue for locality in quantum mechanics is not, and has never been, about (1). It's about (2).
 
  • #110
@stevendaryl I'll just need to think for a bit, but how does this differ from consistent histories?
 
  • #111
vanhees71 said:
This is also just personal opinion. My personal opinion is that the interactions described by QFT are local for each individual system. For the same reasons you give for your opinion, this cannot be ruled out within QFT.
My "personal" opinion is supported by many published papers and books, including the book by Ballentine. What published work can be used to support your opinion? Namely, the opinion that, in the statistical ensemble interpretation, the interactions are local not only on the ensemble level, but also on the individual level.
 
  • #112
DarMM said:
@stevendaryl I'll just need to think for a bit, but how does this differ from consistent histories?

I think it's basically the same thing. But I don't see it as a different interpretation of quantum mechanics. It's basically the same as the minimal interpretation, but restricted to macroscopic observables. The nice thing about macroscopic observables is that they don't need a second system measuring them, so instead of saying "the observable will be measured to have a certain eigenvalue with such-and-such probability", you can just say "the observable will have this value with such-and-such probability". Measurement becomes irrelevant, at the cost of explicitly treating macroscopic variables as more special than microscopic variables.

Consistent histories doesn't explicitly make the microscopic/macroscopic distinction. It says you can take any collection of mutually commuting observables and compute the probabilities associated with their history. But I find that a little unsatisfying. What determines which collection is used? Does consistent histories imply a doubly-multiple Many Worlds, where not only are there different possible histories corresponding to different values for a fixed set of observables, but also different possible histories corresponding to different choices of the commuting observables?
 
  • #113
Okay, that makes sense. Consistent histories still has multiple sample spaces, though, in the sense of history sets which cannot be combined or reasoned about together. Chapter 25 of Griffiths' book has some good examples.
 
  • #114
DarMM said:
Okay, that makes sense. Consistent histories still has multiple sample spaces, though, in the sense of history sets which cannot be combined or reasoned about together. Chapter 25 of Griffiths' book has some good examples.
But why is this relevant? Only one history can be realized. The others don't matter.

If the history includes all repetitions of experiments, then the probabilistic aspects must be deducible from the single realized history, just as we learn about empirical probabilities by looking at the history available to us.

In classical mechanics, only selective histories are permitted, which makes classical mechanics highly predictive and free of paradox. On the other hand, in quantum mechanics, all histories are permitted, which makes quantum mechanics, strictly speaking, totally nonpredictive unless you add an external selection criterion for which histories to permit.
 
  • #115
A. Neumaier said:
But why is this relevant? Only one history can be realized.
Of course, but that doesn't remove the mathematical structure from the theory, namely the counterfactual indefiniteness. Even consistent-histories authors present it as a major feature of the theory.

The point is that even if only one history occurs, it cannot be considered a history in which, for spin measurements, the whole vector ##(S_x, S_y, S_z)## occurs; it will only contain an ##S_z## event, say. This is what makes QM different from a theory where the world is truly random but driven by a classical stochastic process.
 
  • #116
DarMM said:
Of course, but that doesn't remove the mathematical structure from the theory, namely the counterfactual indefiniteness. Even consistent-histories authors present it as a major feature of the theory.

The point is that even if only one history occurs, it cannot be considered a history in which, for spin measurements, the whole vector ##(S_x, S_y, S_z)## occurs; it will only contain an ##S_z## event, say. This is what makes QM different from a theory where the world is truly random but driven by a classical stochastic process.

It's a stochastic theory for a commuting subset of observables. As I sketched, if you have an enumeration of all possible macroscopic configurations, then quantum mechanics gives you a probability ##P_j(t)## of the universe being in state ##j## at time ##t##. The macroscopic configuration doesn't say anything about microscopic observables such as the components of spin of individual electrons (except to the extent that those can be inferred from macroscopic information).
 
  • #117
stevendaryl said:
It's a stochastic theory for a commuting subset of observables. As I sketched, if you have an enumeration of all possible macroscopic configurations, then quantum mechanics gives you a probability ##P_j(t)## of the universe being in state ##j## at time ##t##. The macroscopic configuration doesn't say anything about microscopic observables such as the components of spin of individual electrons (except to the extent that those can be inferred from macroscopic information).
Yes, but you have multiple stochastic theories, one for each set of commuting macro-observables, unlike classical probability theory. You don't have a single sample space for all the macroscopic configurations; you can only form sample spaces for mutually commuting ones.

Would it be right to say that what you are getting at is that only ##S_z## is amplified up to the macroscopic level and thus only it constitutes a macroscopic outcome, i.e. that if you take QM as a theory of stochastic macroscopic observables, the lack of a common sample space arises purely from the fact that only ##S_z## "rises up"?
 
  • #118
DarMM said:
Yes, but you have multiple stochastic theories, one for each set of commuting macro-observables, unlike classical probability theory.

But macroscopic variables all commute (at least approximately). You can't know the position and momentum of an electron at the same time, but you know the approximate position and approximate momentum of a baseball at the same time.

DarMM said:
Would it be right to say that what you are getting at is that only ##S_z## is amplified up to the macroscopic level and thus only it constitutes a macroscopic outcome, i.e. that if you take QM as a theory of stochastic macroscopic observables, the lack of a common sample space arises purely from the fact that only ##S_z## "rises up"?

If ##S_z## is measured, then its value becomes part of the macroscopic configuration. If it isn't measured, then it isn't a part of the macroscopic configuration. Microscopic variables are involved in computing the macroscopic probabilities, ##P_j(t)##, but they aren't assumed to have values (or associated probability distributions). Probability distributions only apply to values of macroscopic observables, and they are all approximately commuting.
 
  • #119
stevendaryl said:
But macroscopic variables all commute (at least approximately).
Yes, but this is essentially answered by the next part of your post. The macroscopic observables in a history where ##S_z## becomes part of the macroscopic configuration don't commute with the macroscopic observables in a history where ##S_x## does. This of course is really just a statement that there is no history in which the whole vector ##(S_x,S_y,S_z)## becomes part of the macroscopic configuration.

However, this doesn't really invalidate what you are saying, of which I will say more in a second; I just want to check that you agree with the paragraph above.
 
  • #120
DarMM said:
Yes, but this is essentially answered by the next part of your post. The macroscopic observables in a history where ##S_z## becomes part of the macroscopic configuration don't commute with the macroscopic observables in a history where ##S_x## does. This of course is really just a statement that there is no history in which the whole vector ##(S_x,S_y,S_z)## becomes part of the macroscopic configuration.

That's true.

There is no joint probability distribution describing ##S_x## and ##S_y##. However, if you lift it to the macroscopic level, there can be a joint probability distribution describing the two situations: "I measured ##S_x## and found it to be ##+\frac{1}{2}##" and "I measured ##S_y## and found it to be ## +\frac{1}{2}##". Those two situations are exclusive: if one is true, the other is false. This is in contrast to the microscopic situation, where it is not possible to give simultaneous truth values to ##S_x = + \frac{1}{2}## and ##S_y = + \frac{1}{2}##.
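Spelled out (with the probability of choosing each measurement taken as an input about the experimenter, which is an assumption added here, not something supplied by QM), the macroscopic sample space is

##\Omega' = \bigl\{(x,+\tfrac12),\,(x,-\tfrac12),\,(y,+\tfrac12),\,(y,-\tfrac12)\bigr\},\qquad P(m,s) = P(\text{setting } m)\,\bigl|\langle s_m|\psi\rangle\bigr|^2.##

The two quoted situations are the disjoint events ##\{(x,+\tfrac12)\}## and ##\{(y,+\tfrac12)\}##, so they do get a joint classical description, whereas "##S_x = +\tfrac12## and ##S_y = +\tfrac12##" corresponds to no event of ##\Omega'## at all.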
 
