An argument against Bohmian mechanics?

  • #301
stevendaryl said:
The rule that a measurement results in an eigenvalue with probabilities given by the square of the amplitude IS an extra rule. It only applies to measurement interactions, and not to other types of interactions.
Yes, it's part of the standard rules of QT in the minimal interpretation, but there's nothing special about interactions between the system and the measurement apparatus. The same rules apply, since a measurement apparatus consists of the same microscopic building blocks as any other object.

So that's an example of a rule that applies to measurements and not to other interactions. If it applied to other types of interactions, then you wouldn't have to use the phrase "when measured".
I still don't understand how one can come to such a conclusion. It's just a tautology: if I want to know a (more or less precise) value of an observable, I have to measure it. No matter whether I'm thinking in terms of classical or quantum theory.
 
  • #302
stevendaryl said:
There are no probabilities in QM without a choice of a basis. The microscopic evolution doesn't select a basis.
In the standard formalism there's a large freedom in choosing the time-evolution picture. Thus the states, represented by the statistical operator ##\hat{\rho}(t)##, and the eigenvectors ##|a(t) \rangle## of a complete set of compatible observables, represented by a corresponding set of self-adjoint operators ##\hat{A}(t)##, have by themselves no physical meaning, i.e., they do not refer to directly observable/measurable phenomena. Only the corresponding probability (distributions) do, $$P(t,a)=\langle a(t)|\hat{\rho}(t)|a(t) \rangle.$$
Thus there are probabilities in standard QT from the very beginning. They are part of the postulates and the only connection between the mathematical formalism and observable/measurable phenomena in nature.
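For concreteness, a minimal numerical sketch of this formula, with an arbitrarily chosen qubit state and observable (illustrative assumptions, not anything from the posts):

```python
# Sketch: Born-rule probabilities P(a) = <a|rho|a> for a spin-1/2 system.
import numpy as np

# Statistical operator rho for the pure state |psi> = (|0> + |1>)/sqrt(2)
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Observable: Pauli sigma_z, whose eigenvectors play the role of |a(t)>
sigma_z = np.array([[1.0, 0.0], [0.0, -1.0]])
eigvals, eigvecs = np.linalg.eigh(sigma_z)

for a, vec in zip(eigvals, eigvecs.T):
    P = np.real(vec.conj() @ rho @ vec)      # P(a) = <a|rho|a>
    print(f"P(sigma_z = {a:+.0f}) = {P:.2f}")  # 0.50 each
```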
 
  • #303
vanhees71 said:
Without Born's rule, I've no clue what quantum theory is about. Then it's a funny mathematical game to play without any contact to measurements and observations in the real world.
There was a time when the universe was playing that "funny mathematical game"! I'm sure even now there are places in the universe where that game is still being played.
 
  • #305
vanhees71 said:
?
I suppose that's for me!
I meant there was a time when no people were around to observe anything, and yet the universe was behaving quantum mechanically. And even in the present, there are parts of the universe that we can't observe, and still quantum mechanics applies to them. Unless you're willing to assume quantum mechanics applies only when people are around to observe the result!
 
  • #306
PeterDonis said:
It depends on how you define these terms. Or you could just realize that ordinary language is not well suited to this kind of discussion, and specify theories in terms of their actual math and their actual predictions, which is how we distinguish them in practice. It's easy to test whether a theory's predictions satisfy the Bell inequalities or not.
I did define the terms local and realist in previous posts, and both definitions have math content (the math content of being realist is summarized in the Bell inequalities), while local/nonlocal is defined in terms of whether FTL signals are physically possible or not.
Whether you want to call a theory that violates them "nonlocal" or "non-realist" is a matter of words, not physics.
Well, it seems that for some the distinction is about physics, although in fact it seems more about philosophy the way it is presented here: http://fetzer-franklin-fund.org/media/emqm15-reconcile-non-local-reality-local-non-reality-3/
 
  • #307
vanhees71 said:
Yes, it's part of the standard rules of QT in the minimal interpretation, but there's nothing special about interactions between the system and the measurement apparatus.

Then why does the rule single out measurements?

I still don't understand how one can come to such a conclusion. It's just a tautology: if I want to know a (more or less precise) value of an observable, I have to measure it. No matter whether I'm thinking in terms of classical or quantum theory.

There is no special interaction associated with measurements in classical theory. Look, try to formulate the Born rule without mentioning "measurement" or a macroscopic/microscopic distinction. It is not possible. In contrast, the rules for classical mechanics can be formulated without mentioning measurement. That doesn't mean that there are no measurements in classical mechanics, but that measurements are a derived concept, not a primitive concept.

We could try to make measurements a derived concept in QM, as well.

What is a measurement? Well, a first cut at this is that a measurement is an interaction between one system (the system to be measured) and a second system (the measuring device) so that the quantity to be measured causes a macroscopic change in the measuring device. Roughly speaking, we have:
  • A system being measured that has some operator ##O## with eigenvalues ##o_i##.
  • A measuring device, which has macroscopic states ##S_{\text{ready}}## (for the state in which it has not yet measured anything) and ##S_i## (for the state of having measured value ##o_i##); each macroscopic state will correspond to a huge number of microscopic states.
  • The device is metastable when in the "ready" state, meaning that a small perturbation away from the ready state will lead to an irreversible, entropy-increasing transition to one of the states ##S_i##.
  • The interaction between system and measuring device is such that if the system is in a state having eigenvalue ##o_i##, then the device is overwhelmingly more likely to end up in the state ##S_i## than in any other macroscopic state ##S_j##. (There might be other macroscopic states representing a failed or ambiguous measurement, but I'll ignore those for simplicity.)
Note: There is a bit of circularity here, in that to really make sense of the Born rule, I have to define what a measuring device is, and to define a measuring device, I have to refer to probability (which transitions are much more likely than which others), which in quantum mechanics has to involve the Born rule. What we can do, though, is treat the fact that the device makes a transition to state ##S_i## when it interacts with a system in a pure state with eigenvalue ##o_i## as initially being a matter of empirical observation, or it can be derivable from classical or semiclassical physics.

Then the Born rule implies that if the system to be measured is initially in a superposition of states of the form ##\sum_i c_i |\psi_i\rangle##, where ##|\psi_i\rangle## is an eigenstate of ##O## with eigenvalue ##o_i##, then the measuring device will, upon interacting with the system, make a transition to one of the macroscopic states ##S_i## with a probability given by ##|c_i|^2##.
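A minimal sketch of this rule in action (the amplitudes below are arbitrary illustrative choices): sampling pointer states ##S_i## with the Born weights ##|c_i|^2##.

```python
# Sketch: sampling macroscopic pointer states S_i with Born weights |c_i|^2.
import numpy as np

rng = np.random.default_rng(0)
c = np.array([0.6, 0.8j])            # amplitudes c_i of sum_i c_i |psi_i>
p = np.abs(c)**2                     # Born weights: [0.36, 0.64]
assert np.isclose(p.sum(), 1.0)

outcomes = rng.choice(len(c), size=10_000, p=p)
freq = np.bincount(outcomes) / len(outcomes)
print(freq)                          # ~[0.36, 0.64]: frequencies of S_0, S_1
```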

Now, at this point, I think we can see where the talk about "measurement" in quantum mechanics is something of a red herring. Presumably, the fact that a pure eigenstate ##|\psi_i\rangle## of the system triggers the measuring device to go into macroscopic state ##S_i## is in principle derivable from applying Schrödinger's equation to the composite system, if we know the interaction Hamiltonian. But because of the linearity of quantum evolution, one would expect that if the initial state of the system were a superposition ##\sum_i c_i |\psi_i\rangle##, then the final state of the measuring device would be a superposition of different macroscopic states as well. (Okay, decoherence will actually prevent the device from being in a superposition, but the combination system + device + environment would be in a superposition.) So the Born rule for measurements can be re-expressed in what I think is an equivalent form that doesn't involve measurements at all:

If a composite system is in a state of the form ##|\Psi\rangle = \sum_i c_i |\Psi_i\rangle##, where for ##i \neq j##, ##|\Psi_i\rangle## and ##|\Psi_j\rangle## represent macroscopically distinguishable states, then that means that the system is in exactly one of the states ##|\Psi_i\rangle##, with probability ##|c_i|^2##.

Note the difference with the usual statement of the Born rule: I'm not saying that the system will be measured to have some eigenvalue ##\lambda_i## with probability ##|c_i|^2##, because that would lead to an infinite regress. You would need a microscopic system, a measuring device to measure the state of the microscopic system, a second measuring device to measure the state of the first measuring device, etc. The infinite regress must stop at some point where something simply HAS some value, not "is measured to have some value".

But with this reformulation of Born's rule, which I'm pretty sure is equivalent to the original, you can see that the macroscopic/microscopic distinction has to be there. You can't apply the rule without the restriction that ##i \neq j## implies ##|\Psi_i\rangle## is macroscopically distinguishable from ##|\Psi_j\rangle##. To see this, take the simple case of a single spin-1/2 particle, where we only consider spin degrees of freedom. A general state can be written as ##|\psi\rangle = \alpha |u\rangle + \beta |d\rangle##. Does that mean that the particle is "really" in the state ##|u\rangle## or the state ##|d\rangle##, and we just don't know which? No, it doesn't mean that. The state ##|\psi\rangle## is a different state from either ##|u\rangle## or ##|d\rangle##, and it has observably different behavior.
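A quick numerical check of that last point (a sketch with the arbitrary choice ##\alpha=\beta=1/\sqrt{2}##): the superposition and the 50/50 ignorance mixture agree in the ##z## basis but differ observably in the ##x## basis.

```python
# Sketch: the superposition (|u> + |d>)/sqrt(2) versus the 50/50 ignorance
# mixture of |u> and |d>. Identical z-statistics, different x-statistics.
import numpy as np

u, d = np.eye(2)
psi = (u + d) / np.sqrt(2)

rho_pure = np.outer(psi, psi)                            # superposition
rho_mixed = 0.5 * np.outer(u, u) + 0.5 * np.outer(d, d)  # ignorance mixture

x_plus = (u + d) / np.sqrt(2)                            # S_x = +1 eigenstate
for name, rho in (("pure ", rho_pure), ("mixed", rho_mixed)):
    print(name,
          "P(z+) =", round(float(u @ rho @ u), 2),             # 0.5 for both
          "P(x+) =", round(float(x_plus @ rho @ x_plus), 2))   # 1.0 vs 0.5
```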

If you eliminate "measurement" as a primitive, then you can't apply the Born rule without making a macroscopic/microscopic distinction.
 
  • #308
ShayanJ said:
I suppose that's for me!
I meant there was a time when no people were around to observe anything, and yet the universe was behaving quantum mechanically. And even in the present, there are parts of the universe that we can't observe, and still quantum mechanics applies to them. Unless you're willing to assume quantum mechanics applies only when people are around to observe the result!

Yeah, to me, the minimal interpretation is schizophrenic (whoops! I guess that's no longer a socially or medically acceptable term for split personality). On the one hand, it denies that there is anything special about measurement, and on the other hand, it seems to declare that the whole theory is meaningless without measurements.
 
  • #309
stevendaryl said:
Yeah, to me, the minimal interpretation is schizophrenic (whoops! I guess that's no longer a socially or medically acceptable term for split personality). On the one hand, it denies that there is anything special about measurement, and on the other hand, it seems to declare that the whole theory is meaningless without measurements.
To be fair, one must admit that the origin of this situation lies in the mathematical formulation: the Born postulate gives non-classical probabilities as the only way to get predictions of measurements, while the rest of the postulates, based on a Hilbert space, are classical. The only way to avoid a directly schizophrenic contradiction is by adding an ad hoc macro-classical/micro-quantum cut to the theory, which establishes a dependency on the classical theory and, of course, an unsolvable enigma about the difference between measurement interactions and the rest of interactions.

But in the minimal interpretation (and others) this cut is understood as a continuous, smooth approximation from the micro to the macro situation, so that in principle there isn't really a cut between measurements and the rest of interactions, only the practical difficulties of dealing with micro-quantum sizes using our big measurement apparatuses, which are completely quantum too. This actually hinges on the correspondence principle of the classical limit of QM (defined as: a more general theory can be formulated in a logically complete manner independently of the less general theory which forms a limiting case of it), based on certain requisites like the Ehrenfest equations, among others. The problem is that these requisites depend on the truth of the classical postulates, so the principle is not met, and if those postulates really contradict the Born postulate they are of no use for grounding the correspondence limit and its smooth connection from micro to macro. One is then obliged to acknowledge the introduction of the cut in the theory in order to avoid inconsistency, even if one doesn't think it exists in nature. In the words of Landau: "It is impossible in principle to formulate the basic concepts of QM without using classical mechanics."

Now try to explain this to an experimentally minded person. As long as the Born-rule postulate works, they couldn't care less whether or not it contradicts the rest of the postulates; to them it is none of their business.
 
  • #310
stevendaryl said:
Then why does the rule single out measurements?

There is no special interaction associated with measurements in classical theory. Look, try to formulate the Born rule without mentioning "measurement" or a macroscopic/microscopic distinction. It is not possible. [...]

If you eliminate "measurement" as a primitive, then you can't apply the Born rule without making a macroscopic/microscopic distinction.
This is again an example of philosophical misunderstandings. The Born rule doesn't single out interactions between a measurement device and the system with regard to any other interaction. The same fundamental interactions of the Standard Model are at work always. Of course quantum theory, as well as classical theory, is about what we are able to observe and measure. Thus the probabilities described by quantum theory are probabilities for the outcome of measurements of a given observable on a system whose state is given by previous observations or a preparation procedure.

What you describe further is the collapse hypothesis, which I think is only a very special case that almost never applies, and when it does, it's because the measurement device was carefully constructed to enable (a good approximation of) a von Neumann filter measurement. I thus don't say that the system undergoes a transition to the state ##|\psi_i \rangle \langle \psi_i|## with probability ##|c_i|^2##, but simply that I measure ##o_i## with this probability. It may well be that the system is destroyed by the measurement process (e.g., a photon is absorbed when registered by a photodetector or an electromagnetic calorimeter).

That the measurement device must have the classical properties you describe is also pretty clear, since we have to "amplify" the microscopic properties to be able to measure them, but I don't think that there is a distinction between classical and quantum laws on a fundamental level. The classical behavior is derivable from quantum theory by appropriate averaging procedures in the usual sense of quantum statistics. A "macro state" is thus describable as the average over a large number of "micro states". You mention entropy production yourself, but that indeed makes it necessary to neglect information, i.e., to coarse-grain to the relevant macroscopic observables.
 
  • #311
atyy said:
Anyway, the basic idea is that unless there is fine tuning, it is unlikely the universe was created in equilibrium.
I think that this idea misses the idea of statistical equilibrium. A system in statistical equilibrium tends to stay close to it, because the majority of all possible states are close to the equilibrium. Statistical equilibrium is nothing but the state of largest entropy. Therefore one does not need fine tuning to have an equilibrium. Just the opposite: one needs fine tuning to be in a state far from equilibrium.
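A toy illustration of "the majority of all possible states are close to equilibrium" (a sketch; the system sizes ##N## are arbitrary choices): for ##N## two-valued spins or coins, the fraction of microstates within 10% of an even split rapidly approaches 1.

```python
# Sketch: for N coins, the overwhelming majority of the 2^N microstates lie
# close to the equal split, i.e. close to "equilibrium" (maximum entropy).
from math import comb

for N in (10, 100, 1000):
    near_eq = sum(comb(N, k) for k in range(int(0.4 * N), int(0.6 * N) + 1))
    print(f"N={N:5d}: fraction near equilibrium = {near_eq / 2**N:.6f}")
# The fraction tends to 1 as N grows: no fine tuning is needed to be near
# equilibrium, but fine tuning IS needed to be far from it.
```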
 
  • #312
rubi said:
BM is possibly one of the least rational explanations that people have come up with in the history of science.
So what's the most rational interpretation of QM in your opinion? Consistent histories? With non-classical logic (see Griffiths)? Changing the rules of logic is the least rational thing to do for my taste.
 
  • #313
Demystifier said:
I think that this idea misses the idea of statistical equilibrium. A system in statistical equilibrium tends to stay close to it, because the majority of all possible states are close to the equilibrium. Statistical equilibrium is nothing but the state of largest entropy. Therefore one does not need fine tuning to have an equilibrium. Just the opposite: one needs fine tuning to be in a state far from equilibrium.

But this assumes a discrete state space. If the state space is not discrete, then there is no unique notion of majority.

Also, it makes no sense to use "majority" as an argument. It is dynamics that is fundamental, not statistical mechanics.
 
  • #314
atyy said:
But this assumes a discrete state space. If the state space is not discrete, then there is no unique notion of majority.
Sure, you have to fix some measure. But in many cases there is a natural choice of measure. For instance, in classical statistical physics for one particle in 3 dimensions the natural measure is ##d^3x d^3p##, which is related to the fact that the phase volume is conserved owing to the Liouville theorem. A similar measure exists for Bohmian mechanics.
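A minimal numerical check of the Liouville point (a sketch in one dimension, with a harmonic oscillator as an arbitrary illustrative choice): the Hamiltonian flow map has unit Jacobian determinant, so the measure ##dx\,dp## is conserved.

```python
# Sketch: phase-space volume conservation (Liouville) for H = (p^2 + x^2)/2.
# The time-t flow is a rotation in the (x, p) plane; its Jacobian det is 1.
import numpy as np

t = 0.73                                    # arbitrary evolution time
flow = np.array([[np.cos(t), np.sin(t)],    # (x, p) -> (x cos t + p sin t,
                 [-np.sin(t), np.cos(t)]])  #           -x sin t + p cos t)
print(np.linalg.det(flow))                  # 1.0: dx dp is preserved
```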
 
  • #315
Demystifier said:
Sure, you have to fix some measure. But in many cases there is a natural choice of measure. For instance, in classical statistical physics for one particle in 3 dimensions the natural measure is ##d^3x d^3p##, which is related to the fact that the phase volume is conserved owing to the Liouville theorem. A similar measure exists for Bohmian mechanics.

But the world is manifestly not in thermodynamic equilibrium.
 
  • #316
atyy said:
But the world is manifestly not in thermodynamic equilibrium.
And that's one of the greatest mysteries in statistical physics. A natural state is thermodynamic equilibrium, and nobody knows why exactly we are not in equilibrium.
 
  • #317
Demystifier said:
And that's one of the greatest mysteries in statistical physics. A natural state is thermodynamic equilibrium, and nobody knows why exactly we are not in equilibrium.

Well, at least you are consistent. I've always thought the mystery was why silly things like the canonical ensemble actually work :)
 
  • #318
vanhees71 said:
What you describe further is the collapse hypothesis,

Tell me how my story changes if you don't assume collapse. I think that without collapse, measurements become only subjective, which is basically the Many Worlds idea.

I would distinguish three aspects of the collapse hypothesis:
  1. After measurement, there is no longer observable interference between alternatives.
  2. Measurement reveals a fact about the world (that a certain observable has a certain value).
  3. After measurement, the composite system (system measured plus measuring device plus environment) is in a state consistent with the value measured, and so from then on, evolves from that state, not the original superposition.
Number 1 is not a necessary assumption, because it is presumably derivable from ordinary quantum mechanics, if you take into account decoherence.

So in denying collapse, are you denying 2? Measurements don't actually reveal anything about the world? I can't believe that. How could a result count as a measurement if it doesn't tell us anything about the world? But if it does tell us something about the world, what is it telling us about the world?

As for #3, my discussion didn't mention anything about evolution after the measurement, so that's not relevant.

This is again an example of philosophical misunderstandings.

On the contrary, I think it shows that the minimal interpretation is incoherent as it stands. What am I misunderstanding?
 
  • #319
vanhees71 said:
This is again an example of philosophical misunderstandings. The Born rule doesn't single out interactions between a measurement device and the system with regard to any other interaction. The same fundamental interactions of the Standard Model are at work always. Of course quantum theory, as well as classical theory, is about what we are able to observe and measure.
Classical theory is about what is really there. With measurements we check that the theory faithfully describes what is there. So measurements are special in classical theory, i.e., measurements can falsify the theory.
In QT the Born rule eliminates solipsism from the theory and puts it on grounds where it can be checked against reality. If you discard the Born rule, you can't perform a scientific test of QT.
And this is how I understand this phrase from stevendaryl's post:
stevendaryl said:
The infinite regress must stop at some point where something simply HAS some value, not "is measured to have some value".
 
  • #320
vanhees71 said:
You mention entropy production yourself, but that indeed makes it necessary to neglect information, i.e., to coarse-grain to the relevant macroscopic observables.

Coarse graining does not explain how measurement results in only one outcome from a superposition of several possible outcomes. That's a misunderstanding on your part.
 
  • #321
vanhees71 said:
What you describe further is the collapse hypothesis, which I think is only a very special case that almost never applies, and when it does, it's because the measurement device was carefully constructed to enable (a good approximation of) a von Neumann filter measurement. I thus don't say that the system undergoes a transition to the state ##|\psi_i \rangle \langle \psi_i|## with probability ##|c_i|^2##, but simply that I measure ##o_i## with this probability. It may well be that the system is destroyed by the measurement process (e.g., a photon is absorbed when registered by a photodetector or an electromagnetic calorimeter).

I went through that already. Yes, that's what you say, and I claim that it is nonsense. That's the infinite regress that I'm talking about. For the microscopic system, it's not that it is ACTUALLY in a state with eigenvalue ##o_i##, it is only that the measuring device MEASURES it to have eigenvalue ##o_i##. But the meaning of "the measuring device measures ##o_i##" is that it makes a transition to some corresponding macroscopic state ##S_i##. Now, you want to say that the device isn't ACTUALLY in state ##S_i##, it's just that I OBSERVE it to be in that state. But since I'm a quantum mechanical system as well, presumably I'm not ACTUALLY in the state of "observing the device to be in state ##S_i##"; it's that a third person observes me to be observing the device to be in state ##S_i##. Etc.

If measurements are ordinary interactions, then saying that "X observes Y" is a statement about the state of X. X is in the particular state of observing measurement outcome Y. So there is no need to bring in measurements again--it's simply a fact about the system X.
 
  • #322
Demystifier said:
So what's the most rational interpretation of QM in your opinion? Consistent histories? With non-classical logic (see Griffiths)? Changing the rules of logic is the least rational thing to do for my taste.
Consistent histories doesn't change the rules of logic. It specifies rules for how to obtain a single framework. All interpretations need to do this, or how do you calculate the probability for ##S_x=+1\wedge S_y=-1## in BM? Quantum theory is contextual, and no interpretation can deny this fact without changing the predictions of the theory. CH is not really an interpretation of QT. It rather accomplishes the necessary job of telling us how to obtain probabilities that add up to 1. The interpretational part of CH is to view time evolution as a stochastic process. What you call "changing the rules of logic" is common to all interpretations of QT, even BM. It's just that the CH people first understood how to deal with it mathematically.

I don't know what the most rational explanation is, but if an interpretation requires a conspiracy of cosmic extent, then practically everything is more rational. The analogy between BM and epicycles is really quite obvious.

stevendaryl said:
Coarse graining does not explain how measurement results in only one outcome from a superposition of several possible outcomes. That's a misunderstanding on your part.
I don't understand why you think that a stochastic theory needs to explain how one outcome is selected. If I have a theory about a classical coin tossing experiment, which specifies the probabilities ##p_{h/t}=\frac{1}{2}##, then "nature is genuinely random and will randomly select one of the possibilities" is a perfectly fine explanation. If that's true, then no further explanation can be possible.
 
  • #324
Demystifier said:
Are you saying that Griffiths
https://arxiv.org/abs/1110.0974
is wrong?
No. The deviation from classical reasoning for quantum phenomena is dictated by experiments. CH explains how to obtain single frameworks, within which one can resort to classical reasoning and use classical probability theory. No interpretation gets around this. "##S_x=+1\wedge S_y=-1##" is a completely valid conjunction of propositions according to classical logic, so if you claim that it is a valid statement in BM, you should be able to tell me what probability BM assigns to it.
 
  • #325
rubi said:
The analogy between BM and epicycles is really quite obvious.

It's a poor analogy. Epicycles have been shown to be right. BM remains conjectural.
 
  • #326
atyy said:
It's a poor analogy. Epicycles have been shown to be right. BM remains conjectural.
Well, there are two possibilities:
1. BM makes the same predictions as QM, which is what the Bohmians usually claim. In that case, the analogy is spot on.
2. BM makes different predictions than QM. In that case it either contradicts experiments, or the different predictions concern only situations that have not been experimentally tested yet. Then most physicists would expect the QM predictions to be right and the BM predictions to be wrong. If the QM predictions turned out to be wrong, people would be more likely to just adjust the QM model (e.g. modify the Hamiltonian) than to adopt BM (e.g. see neutrino oscillation).
 
  • #327
rubi said:
No. The deviation from classical reasoning for quantum phenomena is dictated by experiments. CH explains how to obtain single frameworks, within which one can resort to classical reasoning and use classical probability theory. No interpretation gets around this. "##S_x=+1\wedge S_y=-1##" is a completely valid conjunction of propositions according to classical logic, so if you claim that it is a valid statement in BM, you should be able to tell me what probability BM assigns to it.
"##S_x=+1\wedge S_y=-1## (at the same time)" is an invalid statement in BM. It is also an invalid statement in the standard Copenhagen interpretation. And more importantly, there is no experiment which gives ##S_x=+1\wedge S_y=-1## (at the same time). So how can this be dictated by experiments?
 
  • #328
rubi said:
I don't understand why you think that a stochastic theory needs to explain how one outcome is selected. If I have a theory about a classical coin tossing experiment, which specifies the probabilities ##p_{h/t}=\frac{1}{2}##, then "nature is genuinely random and will randomly select one of the possibilities" is a perfectly fine explanation. If that's true, then no further explanation can be possible.

I think you're arguing something different than vanhees is. I am not arguing against the possibility of a stochastic description of physics.

The point about coarse-graining is that if you could actually do the computations to figure out how the composite wave function for system + measuring device + observer + environment (however far it goes) evolves, then you would find that:

If
  • when the system being measured is in a state corresponding to eigenvalue ##o_1##, the measuring device makes a transition to a macroscopic state ##S_1##, and
  • when the system being measured is in a state corresponding to eigenvalue ##o_2##, the measuring device makes a transition to a macroscopic state ##S_2##, then
  • when the system is in a superposition of those two states, then the composite wave function makes a transition to a superposition of those macroscopic states (or mixture, if you like, but I'm using superposition because I'm including the environment in the wave function)
That follows from the linearity of the evolution equations for quantum mechanics. Coarse graining is a mathematical tool for extracting a macroscopic state from a microscopic state. It isn't going to produce a single outcome if the underlying microscopic state reflects a superposition of macroscopically different outcomes.

You can certainly have an additional, stochastic step in which one coarse-grained macroscopic state is selected from the superposition or mixture, but that is an additional step.
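Here is a minimal sketch of the linearity argument (the two-level system and the three-level toy "device" with states ##S_{\text{ready}}, S_1, S_2## are illustrative stand-ins for a macroscopic apparatus): a unitary implementing ##|o_i\rangle|S_{\text{ready}}\rangle \to |o_i\rangle|S_i\rangle## sends a superposition input to an entangled superposition of pointer states, never to a single outcome.

```python
# Sketch: a linear "premeasurement" U with system basis {|o_1>, |o_2>} and
# toy device basis {|ready>, |S_1>, |S_2>}, acting as
#   U |o_1>|ready> = |o_1>|S_1>,  U |o_2>|ready> = |o_2>|S_2>,
# completed to a permutation on the remaining basis states.
import numpy as np

perm = [1, 0, 2, 5, 4, 3]           # images of basis states (index = s*3 + d)
U = np.zeros((6, 6))
for src, dst in enumerate(perm):
    U[dst, src] = 1.0

alpha, beta = 0.6, 0.8              # arbitrary amplitudes
system = np.array([alpha, beta])    # alpha|o_1> + beta|o_2>
ready = np.array([1.0, 0.0, 0.0])   # |ready>
out = (U @ np.kron(system, ready)).reshape(2, 3)
print(out)
# [[0.  0.6 0. ]    i.e. alpha|o_1>|S_1> + beta|o_2>|S_2>: an entangled
#  [0.  0.  0.8]]   superposition; linearity alone never picks one outcome.
```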
 
  • #329
stevendaryl said:
Tell me how my story changes if you don't assume collapse. I think that without collapse, measurements become only subjective, which is basically the Many Worlds idea. [...]

On the contrary, I think it shows that the minimal interpretation is incoherent as it stands. What am I misunderstanding?

After a measurement you usually have a "pointer reading", i.e., a macroscopic observable, but you are far from knowing the precise quantum state of the composite system. There's no collapse in the sense of some Copenhagen flavors, because that would violate the locality and causality implemented in relativistic QFT.
 
  • #330
vanhees71 said:
After a measurement you usually have a "pointer reading", i.e., a macroscopic observable, but you are far from knowing the precise quantum state of the composite system.

That's what the macroscopic states ##S_i## are: pointer readings. So are you saying that after the measurement, the device is in one of the states ##S_i## with probability ##|c_i|^2##? How is that not a collapse?
 
  • #331
It's not a collapse in the sense of Copenhagen. It's due to a local interaction between the measured object and the device but not an instantaneous interaction at a distance violating causality!
 
  • #332
Demystifier said:
"##S_x=+1\wedge S_y=-1## (at the same time)" is an invalid statement in BM. It is also an invalid statement in the standard Copenhagen interpretation. And more importantly, there is no experiment which gives ##S_x=+1\wedge S_y=-1## (at the same time). So how can this be dictated by experiments?
This is exactly what CH says. It is logically invalid to form the conjunction ##A\wedge B## if ##A## is ##S_x=+1## and ##B## is ##S_y=-1##, i.e. you cannot apply the rules of classical logic to propositions about quantum systems. BM of course (like every other interpretation) doesn't get around this. The CH rules tell you which propositions are logically meaningful and can be combined into a single framework. My example about ##S_x## and ##S_y## is one such meaningless proposition. It's not meaningful in CH, Copenhagen, or BM. CH doesn't violate the rules of classical logic more than BM does. In classical logic, you can always form conjunctions like ##A\wedge B##. If you give me two meaningful propositions ##A## and ##B## and I'm not allowed to take their conjunction ##A\wedge B##, then I'm not dealing with classical logic.

stevendaryl said:
I think you're arguing something different than vanhees is. I am not arguing against the possibility of a stochastic description of physics.
Okay, I see.

when the system is in a superposition of those two states, then the composite wave function makes a transition to a superposition of those macroscopic states (or mixture, if you like, but I'm using superposition because I'm including the environment in the wave function)
Maybe in vanhees' POV, these superpositions will cancel each other, leaving only one macroscopic possibility (with ##P>\epsilon##). I don't think this is impossible, but one needs a big enough Hilbert space, and there will of course always remain some variables (concerning the whole system) which are completely non-classical.
 
  • #333
vanhees71 said:
It's not a collapse in the sense of Copenhagen. It's due to a local interaction between the measured object and the device but not an instantaneous interaction at a distance violating causality!

This is the point that I have been making: I don't have a philosophical problem with what you're saying; I have a technical problem with it. It's factually incorrect. You seem to be saying that local interactions are sufficient to explain the occurrence of definite results for quantum measurements. That's provably false. Bell proved it to be false.
 
  • #334
rubi said:
This is exactly what CH says. It is logically invalid to form the conjunction ##A\wedge B## if ##A## is ##S_x=+1## and ##B## is ##S_y=-1##, i.e. you cannot apply the rules of classical logic to propositions about quantum systems. BM of course (like every other interpretation) doesn't get around this. The CH rules tell you which propositions are logically meaningful and can be combined into a single framework. My example about ##S_x## and ##S_y## is one such meaningless proposition. It's not meaningful in CH, Copenhagen, or BM. CH doesn't violate the rules of classical logic more than BM does. In classical logic, you can always form conjunctions like ##A\wedge B##. If you give me two meaningful propositions ##A## and ##B## and I'm not allowed to take their conjunction ##A\wedge B##, then I'm not dealing with classical logic.
So a physical impossibility in other interpretations is promoted to logical nonsense in the CH interpretation. But one could do that even in classical physics. For example, let
A = there are two free massive particles at distance r
B = these two particles are not attracted by a force
Due to the gravitational force, it is never the case that both A and B are true. In a CH interpretation of Newtonian mechanics, one would say that ##A\wedge B## is logical nonsense. But that's not a good approach, because science must be testable. One must consider the statement ##A\wedge B## as a logical possibility, and then make experiments to see whether ##A\wedge B## is true. (The experiments show that it isn't.)
 
  • #335
stevendaryl said:
This is the point that I have been making: I don't have a philosophical problem with what you're saying; I have a technical problem with it. It's factually incorrect. You seem to be saying that local interactions are sufficient to explain the occurrence of definite results for quantum measurements. That's provably false. Bell proved it to be false.

Let's go through this for the EPR experiment.

Alice has a device that has two pointer states: ##S_u## and ##S_d##. If an electron that is spin-up along the z-axis interacts with her device, then the device will almost certainly make a transition to the pointer state ##S_u##. If an electron is spin-down, the device will almost certainly make a transition to the pointer state ##S_d##. If an electron is in a superposition or mixed state of spin-up and spin-down, then the device will make a nondeterministic transition to either the state ##S_u## or ##S_d## (depending on the coefficients of the superposition or mixture). So far, it seems that everything is perfectly well described by local interactions. But now, we add one more constraint on Alice's device:
  • If, far far away, Bob's device, interacting with the electron's twin, already made the transition to the pointer state ##S_u##, then Alice's device will definitely make the transition to ##S_d##
You can't account for this additional fact with only local interactions.
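To put a number on this (a sketch; the angles are the standard CHSH choices): the singlet correlations ##E(a,b)=-\cos(a-b)## give a CHSH value of ##2\sqrt{2}##, while any locally explainable correlations are bounded by 2.

```python
# Sketch: CHSH value for the singlet state, E(a, b) = -cos(a - b).
# Local accounts obey |S| <= 2; QM predicts 2*sqrt(2) for these settings.
import numpy as np

def E(a, b):
    return -np.cos(a - b)              # singlet spin correlation

a1, a2 = 0.0, np.pi / 2                # Alice's two settings
b1, b2 = np.pi / 4, 3 * np.pi / 4      # Bob's two settings
S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S), ">", 2)                  # 2.828... > 2: no local account
```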
 
  • #336
Demystifier said:
So a physical impossibility in other interpretations is promoted to logical nonsense in the CH interpretation.
If the problem were only physical impossibility, then the statement ##S_x=+1\wedge S_y=-1## would be completely unproblematic. We would just assign ##P=0## to it and be happy. However, no possible assignment of probabilities to such a proposition is consistent with QM, so it must be the case that taking this conjunction is an invalid operation. (I know the ##d=2## loophole. Let's stick to ##d=2## for simplicity.) This is all CH says. You must restrict yourself to a single framework if you want to apply classical logic. If your framework includes ##S_x=+1##, then it can't include ##S_y=-1##. Nothing more and nothing less. Bohmians should agree with this.

But one could do that even in classical physics. For example, let
A = there are two free massive particles at distance r
B = these two particles are not attracted by a force
Due to the gravitational force, it is never the case that both A and B are true. In a CH interpretation of Newtonian mechanics, one would say that ##A\wedge B## is logical nonsense.
No, CH has nothing to say about this example. The CH rules apply to propositions that are modeled as projectors in a Hilbert space. The problem with your propositions is that they are self-referential. This is not possible in classical logic either. You are dealing with an unformalized problem in natural language here. Translate it into formalized logic and the problem should vanish.

But that's not a good approach, because science must be testable. One must consider the statement ##A\wedge B## as a logical possibility, and then make experiments to see whether ##A\wedge B## is true. (The experiments show that it isn't.)
Whether ##S_x=+1\wedge S_y=-1## is a valid proposition can be tested experimentally and the experiment says that it isn't (again, let's not harp on about the ##d=2## loophole).
 
  • #337
vanhees71 said:
It's not a collapse in the sense of Copenhagen. It's due to a local interaction between the measured object and the device but not an instantaneous interaction at a distance violating causality!

Just so we're clear: You are agreeing that after a measurement, the measurement device has a definite pointer state. Is that correct? If so, I don't understand how that is not a "collapse". Whether or not it is mediated by local interactions, the result is that a single outcome is selected out of a set of possible outcomes. That's what people mean by "collapse".
 
  • #338
stevendaryl said:
Let's go through this for the EPR experiment. [...]
  • If, far far away, Bob's device, interacting with the electron's twin, already made the transition to the pointer state ##S_u##, then Alice's device will definitely make the transition to ##S_d##
You can't account for this additional fact with only local interactions.
Yes, we can! The answer is relativistic QFT. The correlations are not due to non-local interactions but due to the preparation in an entangled state at the very beginning, and it's indeed incompatible with local deterministic models a la Bell.
 
  • #339
stevendaryl said:
Whether or not it is mediated by local interactions, the result is that a single outcome is selected out of a set of possible outcomes. That's what people mean by "collapse".

Perhaps it would help to state a concrete example. Suppose we put an electron through a Stern-Gerlach device oriented in the ##z## direction. The spin eigenstates of this measurement are ##\vert z+ \rangle## and ##\vert z- \rangle##. The "pointer variable" here is the direction in which the electron is moving; it starts out moving horizontally, which we will call pointer state ##\vert R \rangle## (for "ready"), and ends up moving either up or down, which we will call pointer states ##\vert U \rangle## and ##\vert D \rangle##.

We know that the Hamiltonian is such that the spin eigenstates will induce evolution as follows:

$$
\vert z+ \rangle \vert R \rangle \rightarrow \vert z+ \rangle \vert U \rangle
$$

$$
\vert z- \rangle \vert R \rangle \rightarrow \vert z- \rangle \vert D \rangle
$$

Therefore, a superposition of spin eigenstates ##\vert \psi \rangle = a \vert z+ \rangle + b \vert z- \rangle##, where ##\vert a \vert^2 + \vert b \vert^2 = 1##, will induce evolution as follows:

$$
\vert \psi \rangle \vert R \rangle \rightarrow a \vert z+ \rangle \vert U \rangle + b \vert z- \rangle \vert D \rangle
$$

This state does not describe "a single outcome"; it describes a superposition of "outcomes". But this state is what unitary evolution predicts. So if in fact the final state is not the above, but either

$$
\vert z+ \rangle \vert U \rangle
$$

or

$$
\vert z- \rangle \vert D \rangle
$$

with probabilities ##\vert a \vert^2## and ##\vert b \vert^2## respectively, then some other process besides unitary evolution must be involved, and this other process is what is referred to by the term "collapse". Decoherence doesn't change this; all decoherence does is ensure that there are no "cross terms" of the form ##\vert z+ \rangle \vert D \rangle## or ##\vert z- \rangle \vert U \rangle## in the superposition.
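For completeness, a numerical version of this final state (a sketch; the coefficients are arbitrary): tracing the spin out of ##a \vert z+ \rangle \vert U \rangle + b \vert z- \rangle \vert D \rangle## leaves a pointer state that is diagonal (no cross terms) with weights ##\vert a \vert^2, \vert b \vert^2##, while the global state remains a pure superposition, i.e. still not a single outcome.

```python
# Sketch: analyze a|z+>|U> + b|z->|D> with spin basis {|z+>, |z->} and
# pointer basis {|R>, |U>, |D>}. The coefficients a, b are arbitrary.
import numpy as np

a, b = 0.6, 0.8
zp, zm = np.eye(2)
R, Up, Dn = np.eye(3)
state = a * np.kron(zp, Up) + b * np.kron(zm, Dn)

rho = np.outer(state, state)                            # global 6x6 state
rho_pointer = np.trace(rho.reshape(2, 3, 2, 3), axis1=0, axis2=2)
print(np.round(rho_pointer, 2))   # diag(0, 0.36, 0.64): no U/D cross terms
print(np.trace(rho @ rho))        # 1.0: globally still a pure superposition
```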
 
  • #340
stevendaryl said:
[...]
  • when the system is in a superposition of those two states, then the composite wave function makes a transition to a superposition of those macroscopic states (or mixture, if you like, but I'm using superposition because I'm including the environment in the wave function)
That follows from the linearity of the evolution equations for quantum mechanics. Coarse graining is a mathematical tool for extracting a macroscopic state from a microscopic state. It isn't going to produce a single outcome if the underlying microscopic state reflects a superposition of macroscopically different outcomes.

You can certainly have an additional, stochastic step in which one coarse-grained macroscopic state is selected from the superposition or mixture, but that is an additional step.
Looking at the third point, "... the composite wave function makes a transition to a superposition of those macroscopic states ... or mixture ...".

If you mean a statistical mixture, then that implies that on every run of the experiment (ensemble member) a definite outcome was achieved, and no further step is required. Nor any explanation of how the state was selected. But you probably don't mean that, do you?
 
  • #341
stevendaryl said:
But in QM, the difference is not simply a matter of what to choose to ignore. Different rules apply to measurements than to other types of interactions.
This may be only a minor point, but I don't like the distinction between "ordinary interactions" and "measurement interactions". The Heisenberg cut can be shifted. So whether a specific interaction is of the first or of the second type depends on the person who does the analysis. Measurements are not merely physical.

stevendaryl said:
In what I would consider a coherent formalism, you would describe how the world works, independently of observers, and then add physical-phenomenal axioms saying that such-and-such a condition of such-and-such subsystem counts as a measurement of such-and-such a property. There would be no additional physics to the measurement process, since it would just be an ordinary process.
I agree that this sounds desirable, but the chain of logic doesn't reflect how science is done. Our theories are distilled from observations, so it isn't a priori clear that we can take the observer out of the picture. Sure, it did work for classical physics, but it doesn't for theories with entanglement-like properties.

So given that we have a theory with entanglement, we should ask what a possible alteration of the Born rule could look like in order that we wouldn't consider it ad hoc. If separating the observer from the system changes the system in a non-trivial way, there doesn't seem to be an easy way to do this. So for me, the weird thing about QM is that I cannot imagine what a non-weird version of it would look like, and I take that as a sign that I don't understand well enough what exactly is weird.
 
  • #342
atyy said:
But why can we ignore part of the universe? Is it because of locality? Why is the universe operationally local, even though reality is nonlocal (or retrocausal etc ...)?
I'm not sure how this relates to what I wrote.

Your starting point seems to be that out of all possible theories, we got one which is demonstrably nonlocal on a fundamental level, but at the same time, everything we as humans can think of to exploit this nonlocality is equally demonstrably impossible. How strange! (Please correct me if I'm wrong.)

The starting point of my post #281 was us doing experiments. From this point of view, your first question doesn't make sense. We cannot not ignore part of the universe because in order to observe something, we have to exclude at least the part of ourselves which experiences the observation.

(Also, more fundamental theories must include the older ones as limiting cases. So shouldn't you condition your second question on the fact that classical physics is local? But here I see even less connection to my post)
 
  • #343
stevendaryl said:
Let's go through this for the EPR experiment. [...] If an electron is in a superposition or mixed state of spin-up and spin-down, then the device will make a nondeterministic transition to either the state ##S_u## or ##S_d## (depending on the coefficients of the superposition or mixture). [...]

I can't quite follow this (but I'm trying my best). If the electron were in a superposition of u and d states, wouldn't it have needed to be prepared in a definite state of l or r to be guaranteed to produce a random result in the u/d direction? If the up coefficient were bigger than the down coefficient (so to speak), would it not be expected to migrate one way rather than the other?
 
  • #344
Mentz114 said:
Looking at the third point, "... the composite wave function makes a transition to a superposition of those macroscopic states ... or mixture ...".

If you mean a statistical mixture, then that implies that on every run of the experiment (ensemble member) a definite outcome was achieved, and no further step is required. Nor any explanation of how the state was selected. But you probably don't mean that, do you?

It's a little difficult to discuss without getting into endless levels of detail. But the use of mixtures is not limited to the case in which a system has a definite (though unknown) state. You also get a mixture by taking a composite system and "tracing out" unobservable degrees of freedom.

This is what makes the discussion a little complicated. On the one hand, people say that realistically, you shouldn't use pure states to describe macroscopic objects, you should use mixtures. But the use of mixtures already blurs the distinction between probabilities that are inherent in the quantum formalism and probabilities that are due to lack of knowledge. Some people say that there is no distinction, but that seems wrong to me. To say that an electron is in a superposition of spin-up and spin-down is not to say that it is in one state or the other and we just don't know which.
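A small numerical illustration of the "tracing out" point (a sketch; the singlet is an arbitrary choice of entangled state): the improper mixture obtained by tracing out the partner is numerically identical to a proper 50/50 ignorance mixture, which is exactly why the two kinds of probability are so easily blurred.

```python
# Sketch: tracing out particle B of a singlet gives a reduced state for A
# that equals a proper 50/50 ignorance mixture, matrix for matrix.
import numpy as np

up, dn = np.eye(2)
singlet = (np.kron(up, dn) - np.kron(dn, up)) / np.sqrt(2)
rho_AB = np.outer(singlet, singlet)

rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)  # trace out B
rho_ignorance = 0.5 * np.outer(up, up) + 0.5 * np.outer(dn, dn)
print(np.allclose(rho_A, rho_ignorance))   # True: locally indistinguishable,
# though the global singlet is not an ignorance mixture of product states.
```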
 
  • #345
PeterDonis said:
Therefore, a superposition of spin eigenstates ##\vert \psi \rangle = a \vert z+ \rangle + b \vert z- \rangle##, where ##\vert a \vert^2 + \vert b \vert^2 = 1##, will induce evolution as follows:

$$
\vert \psi \rangle \vert R \rangle \rightarrow a \vert z+ \rangle \vert U \rangle + b \vert z- \rangle \vert D \rangle
$$

[...] then some other process besides unitary evolution must be involved, and this other process is what is referred to by the term "collapse". Decoherence doesn't change this; all decoherence does is ensure that there are no "cross terms" of the form ##\vert z+ \rangle \vert D \rangle## or ##\vert z- \rangle \vert U \rangle## in the superposition.

Unless I'm misunderstanding you, you seem to be agreeing with me. Decoherence, or irreversibility, is not going to result in a definite outcome.

So if someone says that a measurement results in either the state ##|U\rangle##, with such-and-such probability, or the state ##|D\rangle##, with such-and-such probability, then does that imply that something nonunitary is involved?
 
  • #346
vanhees71 said:
Yes, we can! The answer is relativistic QFT.

This is a misconception on your part. There are two parts to QFT, just as there are two parts to nonrelativistic quantum mechanics: (1) evolution of the quantum state, and (2) the Born rule.

QFT affects the first part, but not the second. We're only talking about the second.
 
  • #347
stevendaryl said:
you seem to be agreeing with me.

Yes, I am. But I'm trying to state your point using a concrete example in the hope that it will help to make it clearer to others.

stevendaryl said:
So if someone says that a measurement results in either the state ##|U\rangle##, with such-and-such probability, or the state ##|D\rangle##, with such-and-such probability, then does that imply that something nonunitary is involved?

To me it obviously must, since unitary evolution gives the superposition. So a no-collapse interpretation, like MWI, would say that the measurement results in the superposition--i.e., "measurement" is not a non-unitary collapse but simply a unitary entanglement interaction between the measured system and the "pointer" system (the one whose state becomes entangled with the measured system's state). In the Stern-Gerlach case, the measured system is the electron's spin and the pointer system is the electron's momentum.
 
  • #348
vanhees71 said:
Yes, we can! The answer is relativistic QFT. The correlations are not due to non-local interactions but due to the preparation in an entangled state at the very beginning, and it's indeed incompatible with local deterministic models a la Bell.

No, relativistic QFT is not the answer. Look, if you described everything in EPR---the twin pair, the detectors, Bob, Alice, the environment, etc.---using QFT, what you would NOT find is that smooth unitary evolution would result in Alice's detector having a definite pointer state. What you would find is that the composite wave function of everything relevant would evolve into a superposition of different worlds with different pointer states. It doesn't matter whether you describe the detectors using QFT or nonrelativistic quantum mechanics. The evolution equations are not stochastic; they are deterministic.

So if Alice's detector ends up in a definite pointer state, that is NOT described by QFT's unitary evolution.
 
  • #349
kith said:
This may be only a minor point, but I don't like the distinction between "ordinary interactions" and "measurement interactions". The Heisenberg cut can be shifted. So whether a specific interaction is of the first or of the second type depends on the person who does the analysis. Measurements are not merely physical.

Okay, that's a possible answer, which is that there is nothing objective about measurement results--for one person, an object might be in a definite state, while for another person, the same object might be in a superposition.

So given that we have a theory with entanglement, we should ask what a possible alteration of the Born rule could look like in order that we wouldn't consider it ad hoc. If separating the observer from the system changes the system in a non-trivial way, there doesn't seem to be an easy way to do this. So for me, the weird thing about QM is that I cannot imagine what a non-weird version of it would look like, and I take that as a sign that I don't understand well enough what exactly is weird.

Fair enough. I'm not sure what a nonweird version of QM would be like, either.
 
  • #350
stevendaryl said:
It's a little difficult to discuss without getting into endless levels of detail. But the use of mixtures is not limited to the case in which a system has a definite (though unknown) state. You also get a mixture by taking a composite system and "tracing out" unobservable degrees of freedom.

This is what makes the discussion a little complicated. On the one hand, people say that realistically, you shouldn't use pure states to describe macroscopic objects, you should use mixtures. But the use of mixtures already blurs the distinction between probabilities that are inherent in the quantum formalism and probabilities that are due to lack of knowledge. Some people say that there is no distinction, but that seems wrong to me. To say that an electron is in a superposition of spin-up and spin-down is not to say that it is in one state or the other and we just don't know which.
As far as I understand this, I can't see anything to disagree with. The point I've emphasized is worth exploring further, so I'll do that for a bit.
 