Can the Born Rule Be Derived in the Many Worlds Interpretation?

  • #51
Fredrik said:
Why would we need to assign a wavefunction to the universe? I agree that we can't, but I don't see a reason to think of that as a problem. We're talking about a generalized probability theory. All it does is to assign probabilities to results of measurements, so we can't expect it to be useful in scenarios where measurements aren't possible in principle.

That's what doesn't make sense to me. If there is nothing special about measurement, if it's just a complicated interaction between one system (the system of interest) and another system (the recording device/observer), then giving probabilities for measurement results seems like it must amount to giving probabilities for certain configurations of physical systems. It seems to me that either measurement is special, or it's not. If measurement is not special, then why would the theory specifically describe probabilities for results of measurements, and not describe other sorts of interactions?
 
  • #52
atyy said:
Well, the idea is that in classical physics, we don't talk about measurements in formulating a theory. We write down, e.g., an action specifying the fields and their interactions. We assume that the whole universe can be described by the theory, and because the measurement device is in the universe it is also described by the theory.
Right, these theories describe fictional universes with properties in common with our own. (If you prefer, they are approximate descriptions of our universe). But when you think about what a theory must be like in order to be falsifiable, you see that it doesn't need to describe a universe. It just needs to assign probabilities to possible results of measurements.

atyy said:
However, in quantum theory, we seem to have a problem in extending the wave function to the whole universe.
Yes, but there's no need to, if you just stop thinking of QM as a theory of the first kind, and accept that it's a theory of the second kind. The problem that you're referring to, and all the other "problems" with QM, are consequences of the unnecessary assumption that I'm rejecting.

To clarify: The unnecessary assumption is the identification of pure states with possible configurations of the system. This assumption takes a theory of the second kind (an assignment of probabilities) and pretends that it's a theory of the first kind (a description). And the result is a disaster. Suddenly we have a "measurement problem", and a need to be able to associate a pure state with the universe. This leads inevitably to many worlds. (As I mentioned in a previous post, I don't think we can avoid many worlds by adding a collapse axiom. That just makes everything even worse).

atyy said:
Because of this, in Copenhagen and Ensemble interpretations we do have to make this classical/quantum cut.
All we have to do is to say which measuring device is supposed to tell us the result of the experiment. I wouldn't describe this as a classical/quantum cut.

atyy said:
It is true that in decoherence we can shift the classical quantum cut so that the environment+measuring device+system are quantum, but then we still need a classical realm outside of that, unless the wave function of the universe makes sense.
If what you mean by a classical realm is something like semi-stable records of the results of certain interactions, then yes. If you mean something truly classical, then no.
 
  • #53
stevendaryl said:
then why would the theory specifically describe probabilities for results of measurements, and not describe other sorts of interactions?

It describes probabilities that particular states will be reached after an interaction. Those states that are especially useful to us, we call "measurement results".
 
  • #54
stevendaryl said:
To me, that seems kind of weird reasoning, to say such and such must be true, because otherwise, we would have a hard time doing science. The world isn't required to accommodate our needs.
I didn't say that reality is a certain way because of science. I only said that a scientific theory needs to be falsifiable. It's the idea that a good theory has to be more than that that's wishful thinking. QM is a perfectly fine generalized probability theory, but people are still looking for ways to interpret it as a description of a universe, presumably because they really want QM to be a description of a universe.

stevendaryl said:
If measurement is not special, then why would the theory specifically describe probabilities for results of measurements, and not describe other sorts of interactions?
The theory has to assign probabilities to something that people can think of as "results", in order to be falsifiable. (A "theory" that doesn't do that isn't a theory). Some interactions produce "results", and some don't. OK, strictly speaking, none of them does, but some interactions produce states that are for practical purposes indistinguishable from classical superpositions. This is the sort of interaction I have in mind when I (somewhat sloppily) say that some interactions produce results. The ones that do can be considered measurements. So in that specific sense, measurements are "special", but they're not fundamentally different. We just put the "measurement" label on those interactions that are the most useful when we test the theory.

I can't answer the question of why our best theory would be one that assigns probabilities to measurements, and describes what's happening just after a state preparation and just after a measurement, but doesn't describe what's happening between state preparation and measurement. It's certainly counterintuitive, but so are the alternatives.
 
  • #55
Fredrik said:
Right, these theories describe fictional universes with properties in common with our own. (If you prefer, they are approximate descriptions of our universe). But when you think about what a theory must be like in order to be falsifiable, you see that it doesn't need to describe a universe. It just needs to assign probabilities to possible results of measurements.

Yes, that is enough for quantum mechanics to be a successful theory. That is what the Copenhagen interpretation says. The measurement problem then is:

(1) Is there a theory of the first type that can underlie quantum mechanics? Historically, von Neumann claimed to prove that this is impossible, i.e., that there is no "real state" of the system represented by the wave function. In fact, von Neumann's proof was in error, and Bohmian mechanics supplied a concrete example of a theory of the first type for non-relativistic quantum mechanics.

(2) In Copenhagen the wave function is just a calculating device for making predictions about subsystems of the universe, and does not represent the "real state" of the system. However, is it really the case that the wave function cannot represent the complete real state of the system? Many-worlds investigates this possibility.

Fredrik said:
Yes, but there's no need to, if you just stop thinking of QM as a theory of the first kind, and accept that it's a theory of the second kind. The problem that you're referring to, and all the other "problems" with QM, are consequences of the unnecessary assumption that I'm rejecting.

To clarify: The unnecessary assumption is the identification of pure states with possible configurations of the system. This assumption takes a theory of the second kind (an assignment of probabilities) and pretends that it's a theory of the first kind (a description). And the result is a disaster. Suddenly we have a "measurement problem", and a need to be able to associate a pure state with the universe. This leads inevitably to many worlds. (As I mentioned in a previous post, I don't think we can avoid many worlds by adding a collapse axiom. That just makes everything even worse).

As long as the wave function does not represent the real state of the system, there is no problem with adding collapse. In fact, collapse or an equivalent axiom is necessary in quantum mechanics to describe the results of filtering experiments.

The measurement problem then is: Why can't the wave function represent the state of the system?

Is it because the system has no state (no quantum reality)? This would be an extremely surprising answer. If this answer is acceptable, there is no measurement problem (but there is a difficulty with this answer if one believes that there is a common-sense "classical" reality in the part of the universe not covered by the wave function). If the answer is not acceptable, then there is a measurement problem and the question is to supply examples of the state underlying the wave function.

Fredrik said:
All we have to do is to say which measuring device is supposed to tell us the result of the experiment. I wouldn't describe this as a classical/quantum cut.

Fredrik said:
If what you mean by a classical realm is something like semi-stable records of the results of certain interactions, then yes. If you mean something truly classical, then no.

Yes, I agree, but by convention these are called classical/quantum cuts. In a sense, it isn't reasonable to object to the classical/quantum cut as misleading while also acknowledging that the wave function cannot extend to the whole universe. If the wave function cannot extend to the whole universe, and there is always a cut, why is it not reasonable to place the cut between the measuring device and the quantum system?

The measurement problem is the conflict between:
1) the idea that a cut somewhere seems necessary, since we cannot write a wave function of the universe
2) the idea that it is unreasonable to place the cut between the measuring device and quantum system, since the measuring device is also quantum
 
  • #56
vanhees71 said:
The only difference between classical physics and quantum physics is that in classical physics all observables always have definite values, be they known or unknown to a physicist, while in quantum theory an observable can have a definite value or not, depending on how the system was prepared. It's of course a bit unintuitive, when beginning to learn quantum theory, that observables can be undetermined, but that's how Nature is.

It's much weirder than having a definite value because of the way it is prepared. In a spin-1/2 EPR experiment, Alice can measure the spin of the electron in the z-direction and find that it is spin-up. At that point, the twin positron is as definitely spin-down in the z-direction as if it had been prepared that way.

That's a lot different than classical probabilities. I certainly agree that that's how Nature is, but what exactly it means is pretty mysterious, to me, at any rate.
 
  • #57
Jilang said:
Thanks. Would it be fair to say that the record made was non-reversible then?

That's exactly what makes something a measurement: irreversibility. Actually, there doesn't have to be a measurement made at all, in the strict sense. That's what decoherence says. If a quantum system undergoes an irreversible interaction with the environment, or any system, then after that point, for all intents and purposes, the wave function has collapsed.

But at a fundamental level, irreversibility is not a very satisfying condition for wave function collapse, because nothing is REALLY irreversible. It's just that as the number of particles involved grows, the conditions necessary to reverse the interaction rapidly become impossible to arrange in practice.

That's another thing that I find unsatisfying about the foundations of quantum mechanics. If the axioms involve measurement, then that means that they indirectly involve irreversibility. It seems a little weird to me that the fundamental axioms describing electrons (say) must involve macroscopic concepts such as irreversibility.
 
  • #58
Yes, it's very different. Quantum theory implies correlations (called entanglement) that cannot be described within a local deterministic theory (Bell's theorem).

On the other hand, given quantum theory there's nothing "mysterious" about the spin-1/2 EPR experiment you describe. The electron positron pair was created to be in an entangled state like
$$|\Psi \rangle=\frac{1}{\sqrt{2}} \left(|{+}1/2,{-}1/2 \rangle-|{-}1/2,{+}1/2 \rangle\right).$$
Then the single electron at Alice's detector is described by a totally unpolarized state, i.e., through the statistical operator
$$\hat{\rho}_{\text{Alice}}=\frac{1}{2} \hat{1}.$$
The same holds true for Bob's positron.
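This claim is easy to check numerically. Here is a minimal sketch (using numpy, with the usual convention that the first tensor factor is Alice's electron) that builds the singlet state, forms its density matrix, and traces out Bob's positron; the reduced state comes out as half the identity, i.e., totally unpolarized:

```python
import numpy as np

# Single-particle basis: |+1/2> = [1, 0], |-1/2> = [0, 1].
up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])

# Singlet state |Psi> = (|+,-> - |-,+>) / sqrt(2) on the 4-dim tensor space.
psi = (np.kron(up, down) - np.kron(down, up)) / np.sqrt(2)

# Density matrix of the pair, then partial trace over Bob's positron:
# reshape to indices (a, b, a', b') and sum over b = b'.
rho = np.outer(psi, psi.conj())
rho_alice = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)

print(rho_alice)  # 0.5 * identity: Alice's electron is totally unpolarized
```

The same partial trace over Alice's index (`axis1=0, axis2=2`) gives the identical result for Bob's positron.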

Nevertheless the two-particle system is in an entangled state due to some preparation procedure (say the decay of a neutral particle like a ##\rho## meson). This indeed implies what you write, namely that as soon as Alice measures the electron's spin-z component to be +1/2, Bob's positron must have a spin-z component of -1/2. There is, however, nothing mysterious here. The correlation was due to the preparation of the electron-positron pair and not to Alice's measurement of the electron's spin-z component. There is no "spooky action at a distance", as Einstein put it.

This example also illustrates quite well why a collapse interpretation is at least problematic. (I think the assumption of a spontaneous collapse, which is outside of the dynamics of quantum theory, is inconsistent with Einstein causality, as is a "cut" between a quantum and a classical world; everything is quantum, and the appearance of classical behavior is due to coarse-grained observation of macroscopic observables of objects on macroscopic scales, well understood from quantum many-body theory.) If you assume that it is Alice's measurement which causes Bob's positron to spontaneously acquire a determined spin-z component, you indeed violate Einstein causality, because no signal can travel faster than the speed of light to make Bob's spin determined although initially it was completely undetermined. Within the minimal interpretation there is no problem, because you take the Born interpretation of states really seriously: before Alice's measurement the spin-z components of both the electron and the positron were (even maximally) undetermined, but due to the preparation in an entangled state, the correlations were already implemented when the electron-positron pair was prepared. Of course, such a thing is not describable with local deterministic hidden-variable theories, and as long as nobody finds a consistent non-local deterministic theory which is as successful as QT, I stick to (minimally interpreted) QT :-).
 
  • #59
Nugatory said:
It describes probabilities that particular states will be reached after an interaction. Those states that are especially useful to us, we call "measurement results".

I was just responding to Fredrik's comment that quantum mechanics can't be applied in situations that don't involve measurement. To the extent that measurement is an ordinary interaction, we can certainly apply quantum mechanics anywhere, whether there are any observers or not. If we're dealing with something like the effects of quantum mechanics on the early universe, soon after the Big Bang, then we certainly can't talk about measuring devices and observers.

The nice thing about decoherence is that you don't need observers or measuring devices. Probabilities show up in the transition from pure states to mixed states after tracing over environmental degrees of freedom. Of course, the final step, from a mixed state to a definite outcome, never happens without observers/measurement devices. Which makes me think it never happens at all. (Which is the many-worlds view.)
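This pure-to-mixed transition can be illustrated in a toy model. The sketch below (my own illustration, not from the thread: one system qubit, one environment qubit, with a CNOT standing in for the decohering interaction) shows that tracing over the environment leaves the system in a diagonal, mixed-looking reduced state even though the joint state stays pure:

```python
import numpy as np

# System qubit in a superposition; environment qubit starts in |0>.
alpha, beta = np.sqrt(0.3), np.sqrt(0.7)
sys0 = np.array([alpha, beta])
env0 = np.array([1.0, 0.0])

# CNOT-style interaction records the system state in the environment
# (a toy model of decoherence: the environment "measures" the system).
cnot = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
joint = cnot @ np.kron(sys0, env0)  # still a pure state of system+environment

# Trace out the environment: the off-diagonal coherences vanish and the
# system is left in an (improper) mixed state diag(|alpha|^2, |beta|^2).
rho = np.outer(joint, joint).reshape(2, 2, 2, 2)
rho_sys = np.trace(rho, axis1=1, axis2=3)
print(rho_sys)  # approximately diag(0.3, 0.7), no off-diagonal terms
```

Before the interaction, the system's density matrix has off-diagonal terms of size ##\alpha\beta##; after tracing over the environment they are gone, which is exactly the mixed-state appearance described above.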
 
  • #60
vanhees71 said:
Within the minimal interpretation there is no problem, because you take the Born interpretation of states really seriously: before Alice's measurement the spin-z components of both the electron and the positron were (even maximally) undetermined, but due to the preparation in an entangled state, the correlations were already implemented when the electron-positron pair was prepared. Of course, such a thing is not describable with local deterministic hidden-variable theories, and as long as nobody finds a consistent non-local deterministic theory which is as successful as QT, I stick to (minimally interpreted) QT :-).

That kind of correlation/entanglement is present in classical probability, as well. I have a red ball and a black ball, and I put each into a sealed box, and mix up the boxes. I send one box to Alice and one box to Bob. Then before either opens his or her box, we can describe the situation as follows:

  • The probability that Alice has a red ball is 1/2.
  • The probability that Alice has a black ball is 1/2.
  • The probability that Bob has a red ball is 1/2.
  • The probability that Bob has a black ball is 1/2.
  • The probability that they both have red balls is 0.

So the probability distribution is "entangled". It's nonlocal, in the sense that the last probability involves distant events. If either Alice or Bob opens his or her box, immediately the appropriate probability distribution "collapses".
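The point that this joint distribution cannot be written as a product of its marginals is easy to verify directly. A minimal sketch (my own illustration of the box example above):

```python
# Joint distribution over (Alice's ball, Bob's ball): only the two
# anti-correlated outcomes have nonzero probability.
colors = ["red", "black"]
joint = {("red", "black"): 0.5, ("black", "red"): 0.5,
         ("red", "red"): 0.0, ("black", "black"): 0.0}

# Marginal distributions for Alice and Bob.
p_alice = {c: sum(joint[(c, b)] for b in colors) for c in colors}
p_bob = {c: sum(joint[(a, c)] for a in colors) for c in colors}

print(p_alice["red"], p_bob["red"])   # 0.5 0.5: each marginal is uniform
print(joint[("red", "red")])          # 0.0
print(p_alice["red"] * p_bob["red"])  # 0.25: the joint is NOT the product
```

The product of marginals assigns probability 0.25 to "both red", while the actual joint distribution assigns 0, which is exactly the "entanglement" (non-factorability) being described.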

However, such entanglement is understood classically by saying that "probabilities" don't refer to anything objective, but instead refer to our lack of information about the true state of the world. The true state either has Alice with a red ball and Bob with a black ball, or vice versa. We just don't know the true state.

In certain ways, probabilities in QM seem very similar to the subjective probabilities of classical probability theory. Alice measures spin-up in the x-direction and immediately knows that Bob is going to measure spin-down in the x-direction. That seems exactly analogous to the classical case of Alice opening her box and finding a red ball, and immediately knowing that Bob has a black ball. So the quantum case should be no more mysterious than the classical case...

Except that we don't have the classical explanation for the correlation. Classically, the explanation is that the ball already had a color prior to opening the box, and opening it only revealed its color. But we don't have the same resolution in the quantum case. We can't assume that the electron already has a spin in the x-direction, we just don't know what it is.
 
  • #61
stevendaryl said:
So the probability distribution is "entangled".

Standard probability theory does not exhibit entanglement - in fact, QM's ability to do that is what distinguishes it from probability theory:
http://arxiv.org/pdf/0911.0695v1.pdf

Thanks
Bill
 
  • #62
Fredrik said:
I didn't say that reality is a certain way because of science. I only said that a scientific theory needs to be falsifiable. It's the idea that a good theory has to be more than that that's wishful thinking.

Well, in a certain sense, it's wishful thinking that it's possible to do science, at all. There is no reason for the universe to be predictable or comprehensible in any way. The idea behind science is that we optimistically hope that things are comprehensible. There is no reason for them to be.

You are making the distinction between science as a means of making predictions, and science as something more than that. You're calling the second "wishful thinking". But both are wishful thinking, in the sense that there is no necessary reason for nature to be humanly comprehensible, or predictable, at all.
 
  • #63
vanhees71 said:
(I think the assumption of a spontaneous collapse, which is outside of the dynamics of quantum theory, is inconsistent with Einstein causality, as is a "cut" between a quantum and a classical world; everything is quantum, and the appearance of classical behavior is due to coarse-grained observation of macroscopic observables of objects on macroscopic scales, well understood from quantum many-body theory.) If you assume that it is Alice's measurement which causes Bob's positron to spontaneously acquire a determined spin-z component, you indeed violate Einstein causality, because no signal can travel faster than the speed of light to make Bob's spin determined although initially it was completely undetermined. Within the minimal interpretation there is no problem, because you take the Born interpretation of states really seriously: before Alice's measurement the spin-z components of both the electron and the positron were (even maximally) undetermined, but due to the preparation in an entangled state, the correlations were already implemented when the electron-positron pair was prepared. Of course, such a thing is not describable with local deterministic hidden-variable theories, and as long as nobody finds a consistent non-local deterministic theory which is as successful as QT, I stick to (minimally interpreted) QT :-).

If everything is quantum and there is only unitary evolution, then there would be unitary evolution of the wave function of the universe. The "minimal interpretation" without collapse and without a classical quantum cut is not minimal - it is making a huge claim - that the unitary evolution of the wave function of the universe makes sense. If this were true, the minimal interpretation would have solved the problem that the many-worlds approach investigates.
 
  • #64
bhobba said:
Standard probability theory does not exhibit entanglement - in fact, QM's ability to do that is what distinguishes it from probability theory:
http://arxiv.org/pdf/0911.0695v1.pdf

Thanks
Bill

Maybe I'm using the wrong word, but it seems to me that in QM, when we say that two particles are entangled, what we mean is that the composite state is not expressible as a product of one-particle states. That concept has a direct analogy in classical probability theory: you have a probability distribution describing a composite system that cannot be expressed as a product of probability distributions of the component systems.

What I would say is different about quantum mechanics is not entanglement, but the fact that it is not possible to understand entanglement as lack of information about an unknown un-entangled state.

This is really the basis of Bell's inequality. We start with a joint probability distribution for Alice and Bob:

$$P(R_A, R_B \mid \alpha, \beta)$$

(The probability that Alice gets result ##R_A## and Bob gets result ##R_B##, given that Alice performs measurement ##\alpha## and Bob performs measurement ##\beta##.)

There is a special class of joint probability distributions, the "factorable" ones, that can be written as follows:

$$P(R_A, R_B \mid \alpha, \beta) = P_A(R_A \mid \alpha)\, P_B(R_B \mid \beta)$$

I was using the word "entangled" to mean any joint probability distribution that cannot be factored that way.

A fact about classical joint probabilities, if there is no causal influence between the two measurements, is that even when the probabilities don't factor, they can be understood in terms of lack of information about factorable distributions. That is, there is some more detailed description of the probabilities as follows:

$$P(R_A, R_B \mid \alpha, \beta) = \sum_\lambda P_C(\lambda)\, P_A(R_A \mid \alpha, \lambda)\, P_B(R_B \mid \beta, \lambda)$$

In other words, classically, we can always find some fact, represented in the formula by the value of the parameter ##\lambda##, such that if we knew that fact, we could factor the joint probability distributions for distant, causally disconnected measurements.
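This is exactly what Bell's theorem rules out for quantum correlations. A standard way to see it (my own illustration, using the textbook CHSH form of the inequality rather than anything stated in this thread) is to note that any distribution of the ##\lambda##-averaged form above must satisfy the CHSH bound ##|S| \le 2##, while the singlet-state correlation ##E(a,b) = -\cos(a-b)## reaches ##2\sqrt{2}##:

```python
import numpy as np

# Singlet-state correlation for spin measurements along directions a, b.
# Any hidden-variable model of the factorable, lambda-averaged form must
# satisfy the CHSH bound |S| <= 2; quantum mechanics reaches 2*sqrt(2).
def E(a, b):
    return -np.cos(a - b)

a1, a2 = 0.0, np.pi / 2          # Alice's two measurement settings
b1, b2 = np.pi / 4, 3 * np.pi / 4  # Bob's two measurement settings

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(S)  # 2.828..., violating the classical bound of 2
```

So no choice of ##\lambda##, however detailed, can factor the quantum joint distribution for these settings, which is the precise sense in which quantum entanglement differs from the classical kind.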

I believe that the use of the word "entangled" in QM is such that it always means a composite state that cannot be factored into a product of component states.
 
  • #65
stevendaryl said:
I believe that the use of the word "entangled" in QM is such that it always means a composite state that cannot be factored into a product of component states.

Actually, immediately after writing that, I realized that it's slightly wrong. For a pair of identical particles, the state is ALWAYS entangled in this sense, because of Bose or Fermi statistics. That is, if I have a two-electron state ##|\Psi \rangle##, I can never write it in product form ##|\Psi\rangle = |\phi\rangle |\psi\rangle##, because Fermi statistics require that the state be antisymmetric under swapping the two electrons. So the closest I can get to a product state is one of the form:

$$|\Psi\rangle = \frac{1}{\sqrt{2}}\left(|\phi\rangle|\psi\rangle - |\psi\rangle|\phi\rangle\right)$$

So it looks like I have to amend my definition of an "entangled" state to mean a state that cannot be expressed as the (anti)symmetrization of a product of component states. That's kind of a messy definition, I know.
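Both properties of the antisymmetrized state are easy to check numerically. A minimal sketch (my own illustration, with ##|\phi\rangle## and ##|\psi\rangle## taken as orthonormal basis states of a toy two-dimensional single-particle space): the state flips sign under particle exchange, and its Schmidt rank is 2, so it is not a product state.

```python
import numpy as np

# Two orthonormal single-particle states (a hypothetical 2-dim toy space).
phi = np.array([1.0, 0.0])
psi = np.array([0.0, 1.0])

# Antisymmetrized two-fermion state: (|phi>|psi> - |psi>|phi>) / sqrt(2).
state = (np.kron(phi, psi) - np.kron(psi, phi)) / np.sqrt(2)

# Swapping the particles (transposing the coefficient matrix) flips the
# sign, as Fermi statistics require.
swapped = state.reshape(2, 2).T.reshape(4)
print(np.allclose(swapped, -state))  # True

# The Schmidt rank (number of nonzero singular values of the coefficient
# matrix) is 2, so the state cannot be written as a single product.
singular_values = np.linalg.svd(state.reshape(2, 2), compute_uv=False)
print(np.count_nonzero(singular_values > 1e-12))  # 2
```

A genuine product state would have Schmidt rank 1, so this check distinguishes the mere antisymmetrization of a product from states with no product structure at all.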
 
  • #66
bhobba said:
Standard probability theory does not exhibit entanglement - in fact, QM's ability to do that is what distinguishes it from probability theory:
http://arxiv.org/pdf/0911.0695v1.pdf

Thanks
Bill

Actually, they are using almost the same definition of "entangled" that I am using:

We call the pure state entangled if it is not a product state.

I was using an analogous definition to distinguish between entangled and un-entangled classical probability distributions: A probability distribution is entangled if it is not a product of component probability distributions.

The difference in terminology is that the authors only apply the word "entangled" to pure states, not mixtures. But that makes the notion of "entangled" not just false for classical probability, but meaningless (or maybe trivial). The only pure states in classical probability theory are states where all probabilities are 0 and 1. It's pretty obvious that you can't have entangled pure states using only probabilities 0 and 1.
 
  • #67
stevendaryl said:
You are making the distinction between science as a means of making predictions, and science as something more than that. You're calling the second "wishful thinking". But both are wishful thinking, in the sense that there is no necessary reason for nature to be humanly comprehensible, or predictable, at all.
It would be wishful thinking to believe that science can find all the answers, but I haven't advocated that view. There's no wishful thinking involved in thinking that QM makes very accurate predictions about results of experiments, or in explaining what a theory must do in order to be falsifiable. But there's wishful thinking involved in thinking that QM must be "more than that" (in the sense discussed above). It doesn't bother me that people are interested in exploring that option too, but it should be viewed as a long shot.
 
  • #68
Fredrik said:
It would be wishful thinking to believe that science can find all the answers, but I haven't advocated that view. There's no wishful thinking involved in thinking that QM makes very accurate predictions about results of experiments, or in explaining what a theory must do in order to be falsifiable. But there's wishful thinking involved in thinking that QM must be "more than that" (in the sense discussed above). It doesn't bother me that people are interested in exploring that option too, but it should be viewed as a long shot.

But using the word "wishful thinking" is just not helpful. Given any unsolved problem, it's wishful thinking to a certain extent to believe that we will ever solve it. So whether something is wishful thinking is not much of a guide to what we should be working on in science.
 
  • #69
stevendaryl said:
But using the word "wishful thinking" is just not helpful. Given any unsolved problem, it's wishful thinking to a certain extent to believe that we will ever solve it. So whether something is wishful thinking is not much of a guide to what we should be working on in science.

What's different about the mysteries of quantum mechanics is not whether it's wishful thinking. It's really the (almost) complete lack of progress, and (almost) complete lack of any hints as to where a solution might be found. People give up because they are tired of beating their heads against a wall. I think it's something akin to "sour grapes" to retroactively adjust your view of what science is all about so that the problems that you have no idea how to solve are excluded as not really science, in the first place. I guess there is a practical reason for drawing such a boundary, and to consider something a scientific question if there is some hope of answering it. But in lots of cases, the only way we know whether there is any hope of answering something is by trying, and either succeeding or failing.
 
  • #70
atyy said:
If everything is quantum and there is only unitary evolution, then there would be unitary evolution of the wave function of the universe.
I assume that what you mean by "everything is quantum" is that every physical system is such that a pure state (a mathematical thing) can represent what you previously called the system's "real state" (a real-world thing). Since the universe is a physical system, it follows that we can assign a state to the universe. But to me, "everything is quantum" just means that there's no experiment in which QM will not work, and that doesn't imply that we can assign a state to the universe.

atyy said:
The "minimal interpretation" without collapse and without a classical quantum cut is not minimal - it is making a huge claim - that the unitary evolution of the wave function of the universe makes sense. If this were true, the minimal interpretation would have solved the problem that the many-worlds approach investigates.
If someone who advocates a minimal interpretation disagrees with this, it's not because they're making some huge assumption. It's because they disagree with you about the meaning of concepts like "collapse" or "classical/quantum cut", as I did above.
 
  • #71
Fredrik said:
I assume that what you mean by "everything is quantum" is that every physical system is such that a pure state (a mathematical thing) can represent what you previously called the system's "real state" (a real-world thing). Since the universe is a physical system, it follows that we can assign a state to the universe. But to me, "everything is quantum" just means that there's no experiment in which QM will not work, and that doesn't imply that we can assign a state to the universe.

In your minimal interpretation, does the universe have a "real state"?

Fredrik said:
If someone who advocates a minimal interpretation disagrees with this, it's not because they're making some huge assumption. It's because they disagree with you about the meaning of concepts like "collapse" or "classical/quantum cut", as I did above.

I was replying to vanhees71 there, not to you, because I am not sure that your and vanhees71's ideas of a "minimal interpretation" are the same. For example, I am pretty sure that bhobba's ensemble interpretation is not the same as Ballentine's, and there is no substantial disagreement between his ensemble interpretation and Copenhagen. So far, I am not sure whether you and I disagree about the meaning of a "classical/quantum cut" and "collapse"; maybe we disagree only about the naming of the concepts.

Edit: bhobba's Ensemble interpretation differs from Ballentine's because bhobba explicitly acknowledges as axioms a classical/quantum cut, and the equivalence of proper and improper density matrices. That's why I believe bhobba's interpretation makes sense, while Ballentine's is misleading or wrong.
 
  • #72
atyy said:
In your minimal interpretation, does the universe have a "real state"?

I can let Fredrik answer for himself, but I certainly wouldn't call any assumption about a "real state" part of a minimal interpretation. What I think of as the minimal interpretation is purely an input/output relation: set up the initial conditions, let things evolve, make a measurement. Quantum mechanics gives you the probability of each possible output (measurement result) as a function of the input (initial setup). That's minimal in that you don't need to assume anything else in order to apply QM.

I guess what's not minimal about this minimal interpretation is that it assumes that the input and output can be understood in pre-quantum terms.
 
  • #73
atyy said:
Edit: bhobba's Ensemble interpretation differs from Ballentine's because bhobba explicitly acknowledges as axioms a classical/quantum cut, and the equivalence of proper and improper density matrices. That's why I believe bhobba's interpretation makes sense, while Ballentine's is misleading or wrong.

I'm a little uncomfortable with the ensemble interpretation, in that it seems to me that there is an element of pretense involved. After decoherence, you perform a trace over unobservable environmental degrees of freedom, and then what's left is a density matrix that looks like a mixed state. Then you can go on to pretend that this mixed state represents an ensemble. But I call it a pretense, because you know that really, pure states never evolve into mixed states. You're pretending it's a mixed state so that you can give an ensemble interpretation.
 
  • #74
stevendaryl said:
I'm a little uncomfortable with the ensemble interpretation, in that it seems to me that there is an element of pretense involved. After decoherence, you perform a trace over unobservable environmental degrees of freedom, and then what's left is a density matrix that looks like a mixed state. Then you can go on to pretend that this mixed state represents an ensemble. But I call it a pretense, because you know that really, pure states never evolve into mixed states. You're pretending it's a mixed state so that you can give an ensemble interpretation.

It's fine, because once you make the classical/quantum cut, you acknowledge that it is all pretense - or in more conventional language - quantum mechanics is an instrumental theory and only tells us how to predict the outcomes of measurements, where a measuring device is a fundamental concept.

ie: it's fine, because we acknowledge the problems (or limitations) upfront.
 
  • #75
Well, I think this thread proves my hypothesis that there are as many interpretations of quantum theory as there are physicists using it ;-)).

The question of whether there exists a (pure or mixed) state of the whole universe is, of course, a challenge to the ensemble representation, because you cannot prepare an ensemble of universes; there is only one (unless you adhere to some "parallel universes" picture, which in my opinion is unscientific, because by definition you cannot observe these parallel universes at all).

I don't think that the notion of a quantum state of the entire universe makes sense, because a probabilistic description can only be checked by doing measurements on an ensemble of independently and identically prepared setups of a system, and that cannot be done.
 
  • #76
vanhees71 said:
Well, I think this thread proves my hypothesis that there are as many interpretations of quantum theory as there are physicists using it ;-)).

The question of whether there exists a (pure or mixed) state of the whole universe is, of course, a challenge to the ensemble representation, because you cannot prepare an ensemble of universes; there is only one (unless you adhere to some "parallel universes" picture, which in my opinion is unscientific, because by definition you cannot observe these parallel universes at all).

I don't think that the notion of a quantum state of the entire universe makes sense, because a probabilistic description can only be checked by doing measurements on an ensemble of independently and identically prepared setups of a system, and that cannot be done.

As I said in another post, people can certainly apply physics to the early universe where there were no observers or measurement devices. Of course, measurements and observations are critical in testing theories of science, but the theories themselves have a usefulness beyond testability. We can use physics for reasoning about "what-if" scenarios: What if the matter in the universe were arranged with perfect spherical symmetry? What would the gravity be like? What if the universe were filled with noninteracting dust? What if all the mass in a star were concentrated into a volume of say 60 cubic kilometers?

Saying that it's not science if there are no observers or measurement devices makes for an overly constrained notion of what counts as science.
 
  • #77
vanhees71 said:
Well, I think this thread proves my hypothesis that there are as many interpretations of quantum theory as there are physicists using it ;-)).

The question of whether there exists a (pure or mixed) state of the whole universe is, of course, a challenge to the ensemble representation, because you cannot prepare an ensemble of universes; there is only one (unless you adhere to some "parallel universes" picture, which in my opinion is unscientific, because by definition you cannot observe these parallel universes at all).

I don't think that the notion of a quantum state of the entire universe makes sense, because a probabilistic description can only be checked by doing measurements on an ensemble of independently and identically prepared setups of a system, and that cannot be done.

OK, that makes sense - but in that case, why do you object to a classical/quantum cut? If there is no wave function of the universe, and quantum mechanics only applies to subsystems of the universe, then the cut between the measuring device and the quantum system is a classical/quantum cut.

The coarse graining doesn't eliminate the cut, because if we take the measuring device and the quantum system together as one quantum system, there is nothing left over to coarse grain, yet the measuring device is classical. We could coarse grain the measuring device and measured system by extending the quantum boundary once more - but there is a limit to this, since the wave function doesn't apply to the universe. So at some point, in the Ensemble interpretation, you have to make a cut - one can debate where - but there is a cut.

Strictly speaking, I don't think a successful ensemble interpretation can require a real ensemble, because otherwise the calculation of Mukhanov and Chibisov and the test by Planck (or BICEP2?) would not make sense in the ensemble interpretation.
 
  • #78
atyy said:
In your minimal interpretation, does the universe have a "real state"?
It's impossible to assign a state (in the sense of QM) to it.

A more interesting question is if any system has a "real state". This question can be split in two: 1. Does a (pure) state in QM represent a "real state"? 2. If no, does a system have a "real state" at all?

If we are to remain truly minimal, we should leave question 1 unanswered. But since "yes" is the starting point of a many-worlds interpretation, most "minimalists" probably think that the answer is "no".

Question 2 is of course impossible to answer without a better theory to replace QM, but it's interesting to think about it. I'm thinking that systems probably do have "real states", and that a "theory" that can describe them may have some very undesirable features. It might describe what's going on in terms of things that are unobservable in principle, and be falsifiable only in the sense that under certain conditions, it makes essentially the same predictions as QM.

atyy said:
I was replying to vanhees71 there, not to you, because I am not sure that your and vanhees71's ideas of a "minimal interpretation" are the same. For example, I am pretty sure that bhobba's ensemble interpretation is not the same as Ballentine's, and that there is no substantial disagreement between his Ensemble interpretation and Copenhagen.
Yes, there are varieties, and no standardized terminology. Ballentine doesn't even agree with Ballentine. His 1970 article assumes that, regardless of what the wavefunction is, every particle has a well-defined position at all times. I haven't seen anything like that in his book. My only issue with that part of the book is that his wording makes it sound like he's proving the ensemble interpretation.

Copenhagen is typically defined only to ridicule it, by people who have misunderstood it, so I prefer not to use that term when I can avoid it. I think that a sensible definition of Copenhagen would be identical to a sensible definition of a minimal statistical interpretation.

atyy said:
So far, I am not sure whether you and I disagree about the meaning of a "classical/quantum cut" and "collapse", or maybe just about the naming of the concepts.
Probably just terminology.
 
Last edited:
  • #79
stevendaryl said:
I can let Fredrik answer for himself, but I certainly wouldn't call any assumption about a "real state" part of a minimal interpretation. What I think of as the minimal interpretation is purely an input/output relation: Set up the initial conditions, let things evolve, make a measurement. Quantum mechanics gives you the probability for each possible output (measurement results) as a function of the input (initial setup). That's minimal in that you don't need to assume anything else in order to apply QM.
Agreed.

stevendaryl said:
I'm a little uncomfortable with the ensemble interpretation, in that it seems to me that there is an element of pretense involved. After decoherence, you perform a trace over unobservable environmental degrees of freedom, and then what's left is a density matrix that looks like a mixed state. Then you can go on to pretend that this mixed state represents an ensemble. But I call it a pretense, because you know that really, pure states never evolve into mixed states. You're pretending it's a mixed state so that you can give an ensemble interpretation.
Pure states never evolve into mixed states under unitary time evolution (i.e. the Schrödinger equation), but only isolated systems evolve that way. If such a system has two interacting subsystems, the only way to assign states to them is through the partial trace operation, and when you do, you will find that a pure state (of a subsystem that isn't isolated) does evolve into a mixed state. Further, this evolution is irreversible in the sense that it can't be reversed by unitary evolution of that subsystem alone.

At least that's my naive understanding of the methods of positive operator valued measures and similar techniques that I only recently began to look at. So far I have only skimmed Flory's article "POVMs and superoperators" (I can't find it online...weird), and read a few pages in the book by Busch, Grabowski & Lahti.
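As an aside, the pure-to-mixed evolution of a subsystem described above is easy to demonstrate numerically. The following is a small NumPy sketch of my own (not from the thread; the function names are just illustrative labels): two qubits start in a product state, a unitary entangles them, and the reduced state of one qubit obtained by the partial trace comes out mixed even though the total state stays pure.

```python
import numpy as np

def partial_trace_B(rho, dA=2, dB=2):
    # Trace out subsystem B from a density matrix on A⊗B.
    return np.trace(rho.reshape(dA, dB, dA, dB), axis1=1, axis2=3)

def purity(rho):
    # Tr(rho^2) equals 1 for a pure state, < 1 for a mixed state.
    return np.real(np.trace(rho @ rho))

# Two qubits, both starting in |0>: the joint state is pure and separable.
psi0 = np.kron([1, 0], [1, 0]).astype(complex)

# A Hadamard on qubit A followed by a CNOT entangles the pair.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
psi = CNOT @ np.kron(H, np.eye(2)) @ psi0  # (|00> + |11>)/sqrt(2)

rho_total = np.outer(psi, psi.conj())  # still pure: unitary evolution
rho_A = partial_trace_B(rho_total)     # reduced state of subsystem A

print(round(purity(rho_total), 6))  # 1.0 -> the total system stays pure
print(round(purity(rho_A), 6))      # 0.5 -> the subsystem is now maximally mixed
```

The subsystem's evolution from purity 1 to purity 0.5 is exactly the non-unitary, irreversible-by-local-means behavior the post describes.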
 
  • #80
Fredrik said:
Pure states never evolve into mixed states under unitary time evolution (i.e. the Schrödinger equation), but only isolated systems evolve that way. If such a system has two interacting subsystems, the only way to assign states to them is through the partial trace operation, and when you do, you will find that a pure state (of a subsystem that isn't isolated) does evolve into a mixed state. Further, this evolution is irreversible in the sense that it can't be reversed by unitary evolution of that subsystem alone.

At least that's my naive understanding of the methods of positive operator valued measures and similar techniques that I only recently began to look at. So far I have only skimmed Flory's article "POVMs and superoperators" (I can't find it online...weird), and read a few pages in the book by Busch, Grabowski & Lahti.

Yes, you're right, tracing produces a mixed state, but the origin of the mixed state is from the fact that you're doing a trace. You might start with a description of a single electron (say), then at some later time, it interacts with the environment, and you do a trace to get a mixed state representation. But the state came from a single electron. It doesn't really represent an ensemble.
 
  • #81
stevendaryl said:
I'm a little uncomfortable with the ensemble interpretation, in that it seems to me that there is an element of pretense involved. After decoherence, you perform a trace over unobservable environmental degrees of freedom, and then what's left is a density matrix that looks like a mixed state. Then you can go on to pretend that this mixed state represents an ensemble. But I call it a pretense, because you know that really, pure states never evolve into mixed states. You're pretending it's a mixed state so that you can give an ensemble interpretation.
At least in Ballentine's book, already pure states are interpreted as referring to ensembles. So in a measurement, the reduced mixed state refers to an ensemble of systems because the pure entangled state refers to an ensemble of apparatuses+systems.
 
  • #82
atyy said:
The coarse graining doesn't eliminate the cut, because if we take the measuring device and the quantum system together as one quantum system, there is nothing left over to coarse grain, yet the measuring device is classical. We could coarse grain the measuring device and measured system by extending the quantum boundary once more - but there is a limit to this, since the wave function doesn't apply to the universe. So at some point, in the Ensemble interpretation, you have to make a cut - one can debate where - but there is a cut.
But why should this cut be called quantum / classical cut? If you acknowledge that you can move the boundary, how do you verify that the far side of the cut behaves according to classical mechanics? For every possible experiment which investigates something at the far side, you could simply shift the boundary by using the quantum description of this something and you would be in the quantum domain again.
 
  • #83
kith said:
But why should this cut be called quantum / classical cut? If you acknowledge that you can move the boundary, how do you verify that the far side of the cut behaves according to classical mechanics? For every possible experiment which investigates something at the far side, you could simply shift the boundary by using the quantum description of this something and you would be in the quantum domain again.

We can call it the Heisenberg cut if you prefer, or the quantum/common-sense reality cut or the quantum/macroscopic cut (or whatever, if it is just a matter of naming).
 
  • #84
kith said:
At least in Ballentine's book, already pure states are interpreted as referring to ensembles. So in a measurement, the reduced mixed state refers to an ensemble of systems because the pure entangled state refers to an ensemble of apparatuses+systems.

The ensemble interpretation doesn't allow one to derive that proper and improper mixed states are the same. It must be postulated, which is equivalent to postulating collapse.

A proper mixed state is when Alice makes Ensemble A in pure state |A> and Ensemble B in pure state |B>, then she makes a Super-Ensemble C consisting of equal numbers of members of Ensemble A and Ensemble B. If she hands me C without labels A and B, I can use a mixed density matrix to describe the statistics of my measurements on C. But if in addition I receive the labels A and B, then I can divide C into two sub-ensembles, each with its own density matrix, since C was just a mixture of A and B. Here C is a "proper" mixture, which can be naturally divided into sub-ensembles.

An improper mixed state is when I have an ensemble C in a pure state, each member of which consists of a subsystem A entangled with subsystem B. If I do a partial trace over B, I get a density matrix (the reduced density matrix) which describes the statistics of all measurements that are "local" to A. This reduced density matrix for A is not a pure state, and is an "improper" mixed state. There is no natural way to partition this into sub-ensembles, since there is only one ensemble C.
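The two definitions above can be checked numerically. In this NumPy sketch (my own illustration, not from the thread; the variable names are just labels), a proper 50/50 mixture of |0> and |1> and the improper mixture obtained by tracing one half out of a Bell pair yield exactly the same density matrix, which is why no measurement on the subsystem alone can distinguish the two cases.

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# Proper mixture: Alice prepares |0> half the time and |1> half the time.
rho_proper = 0.5 * np.outer(ket0, ket0.conj()) + 0.5 * np.outer(ket1, ket1.conj())

# Improper mixture: a Bell pair (|00> + |11>)/sqrt(2), with B traced out.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
rho_AB = np.outer(bell, bell.conj())
rho_improper = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# The two density matrices are identical, so no measurement on A alone
# can tell a proper mixture from an improper one.
print(np.allclose(rho_proper, rho_improper))  # True
```

The mathematical identity of the two matrices is what makes the proper/improper distinction an interpretive one: it lives in how the ensemble was prepared, not in any local statistics.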
 
  • #85
atyy said:
We can call it the Heisenberg cut if you prefer, or the quantum/common-sense reality cut or the quantum/macroscopic cut (or whatever, if it is just a matter of naming).

Why is it not referred to as the "no going back due to irreversible increase in entropy" cut?
 
  • #86
Jilang said:
Why is it not referred to as the "no going back due to irreversible increase in entropy" cut?

I think Weinberg's term, "common sense reality", is the best; I notice bhobba has also adopted it.

Technically, entropy can be defined on a quantum system. There is the entropy of a mixed state. There is even the entanglement entropy of a subsystem of a pure state. There are attempts to show that the second law of thermodynamics can be derived from an increase in entanglement entropy. So I think within a Copenhagen/Ensemble interpretation where there is a cut, I would reserve the word "entropy" for something else.

In an MWI approach there is no cut, so the apparent cut would be derived from decoherence, which I think leads to increased entropy in the subsystem via entanglement.
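To make the entropy remarks concrete, here is a small NumPy sketch of my own (not from the thread): the von Neumann entropy of a pure Bell state is zero, while the entanglement entropy of one of its subsystems - the entropy of the reduced state - is ln 2.

```python
import numpy as np

def von_neumann_entropy(rho):
    # S(rho) = -Tr(rho ln rho), computed from the eigenvalues.
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # treat 0 * ln(0) as 0
    return float(-np.sum(evals * np.log(evals)))

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# The total system is in a pure Bell state, so its entropy is zero.
bell = (np.kron(ket0, ket0) + np.kron(ket1, ket1)) / np.sqrt(2)
rho_AB = np.outer(bell, bell.conj())

# Entanglement entropy: the entropy of the reduced state of one subsystem.
rho_A = np.trace(rho_AB.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# abs() guards against tiny floating-point noise around zero.
print(round(abs(von_neumann_entropy(rho_AB)), 6))  # 0.0: the total state is pure
print(round(von_neumann_entropy(rho_A), 6))        # 0.693147, i.e. ln 2
```

This is the sense in which entanglement generates entropy in a subsystem even though the total state never stops being pure.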
 
Last edited:
  • #87
Fredrik said:
Question 2 is of course impossible to answer without a better theory to replace QM, but it's interesting to think about it. I'm thinking that systems probably do have "real states", and that a "theory" that can describe them may have some very undesirable features. It might describe what's going on in terms of things that are unobservable in principle, and be falsifiable only in the sense that under certain conditions, it makes essentially the same predictions as QM.

Ok, I think I agreed with everything you said in that post, so let me just try to use this bit to say what the measurement problem is in these terms. If we believe that a theory beyond QM with "real states" is possible in principle, can we show that this possibility really exists? Historically, the problem arose because of von Neumann's erroneous proof that such a theory cannot exist even in principle. The achievement of Bohm was to provide a concrete example showing that the proof was wrong, i.e. that there is no way to correct von Neumann's proof to make it right.

A Bohmian-type view even supports your intuition that such a theory might have very undesirable features, explaining why we prefer to use QM in practice as long as experiments allow. For example, Montina showed that "any ontological Markovian theory of quantum mechanics requires a number of variables which grows exponentially with the physical size." http://arxiv.org/abs/0711.4770

However, although a Bohmian-type theory is unwieldy and under-constrained without experimental input, it can be falsified in a sense that goes beyond reproducing the predictions of QM. This is because BM says that QM is a "quantum equilibrium" situation, and to fully solve the measurement problem BM has to postulate that at some point there was "quantum nonequilibrium", so that in principle there are experiments that will show QM to be an incorrect description of the universe.

So in a sense, the measurement problem is to show that "real states" can exist, and to construct some possibilities. BM constructs some possibilities by adding things, MWI tries to construct it by removing things.
 
  • #88
atyy said:
We can call it the Heisenberg cut if you prefer, or the quantum/common-sense reality cut or the quantum/macroscopic cut (or whatever, if it is just a matter of naming).
Most of these expressions suggest that there's a domain where QM is valid and a domain where QM is wrong and classical mechanics is valid instead. My point is that this statement can't be justified if there is no definitive limit to shifting the boundary.
 
  • #89
atyy said:
There is no natural way to partition this into sub-ensembles, since there is only one ensemble C.
If we want to perform an experiment on a sub-ensemble, we select the sub-ensemble in a physical way (by blocking one beam in a SG apparatus for example). This way obviously depends on the experimental setting. Why do we need an additional, "natural" way to partition the ensemble?
 
  • #90
kith said:
Most of these expressions suggest that there's a domain where QM is valid and a domain where QM is wrong and classical mechanics is valid instead. My point is that this statement can't be justified if there is no definitive limit to shifting the boundary.

Yes, it doesn't mean that. It means that every user of quantum mechanics known so far must make this cut, and have as a fundamental notion a measuring device that registers a macroscopic mark. MWI tries to make it such that that statement might be false for future users of QM.
 
  • #91
kith said:
If we want to perform an experiment on a sub-ensemble, we select the sub-ensemble in a physical way (by blocking one beam in a SG apparatus for example). This way obviously depends on the experimental setting. Why do we need an additional, "natural" way to partition the ensemble?

In an improper mixture, there is no notion of a sub-ensemble. Since the experiment clearly shows that sub-ensembles exist, you have to add the notion of a sub-ensemble to the improper mixture. Adding this notion is the statement that an improper mixture can be treated like a proper mixture, or collapse.

Edit: One can have sub-ensembles for the pure state ensemble if one assumes hidden variables. But if one does this then the interpretation is not minimal. At any rate, at this point one must add something: equivalence of proper and improper mixtures, collapse, or hidden variables in order to define the notion of a sub-ensemble.

Edit: Once a Heisenberg cut has been made, and if one is agnostic about the reality of the wave function, then there is no problem with collapse, since you are just collapsing an unreal thing.
 
Last edited:
  • #92
atyy said:
A proper mixed state is when Alice makes Ensemble A in pure state |A> and Ensemble B in pure state |B>, then she makes a Super-Ensemble C consisting of equal numbers of members of Ensemble A and Ensemble B. If she hands me C without labels A and B, I can use a mixed density matrix to describe the statistics of my measurements on C. But if in addition I receive the labels A and B, then I can divide C into two sub-ensembles, each with its own density matrix, since C was just a mixture of A and B. Here C is a "proper" mixture, which can be naturally divided into sub-ensembles.

An improper mixed state is when I have an ensemble C in a pure state, each member of which consists of a subsystem A entangled with subsystem B. If I do a partial trace over B, I get a density matrix (the reduced density matrix) which describes the statistics of all measurements that are "local" to A. This reduced density matrix for A is not a pure state, and is an "improper" mixed state. There is no natural way to partition this into sub-ensembles, since there is only one ensemble C.
I've been wondering what you guys meant by proper and improper mixed states. The distinction and terminology seem odd to me. The distinction only makes sense to someone who believes that a pure state represents the system's "real state". (Such a person is an MWI advocate, whether they understand it or not). Such a person would say that a mixed state is only used when we don't know what the correct pure state is.

Regarding the terminology, it seems to make at least as much sense to call the first kind "improper" and the other "proper", because the first kind involves a degree of ignorance that's been introduced artificially, and the second kind involves something fundamentally unknowable. But I guess the terms "proper" and "improper" are only supposed to be labels anyway. The terms might as well be "blue" and "green".

atyy said:
The ensemble interpretation doesn't allow one to derive that proper and improper mixed states are the same. It must be postulated, which is equivalent to postulating collapse.
It doesn't allow you to say that they're not the same. So the "postulate" would have to be implicit in the complete lack of additional postulates on top of QM.

You appear to be saying that even the minimal ensemble/statistical/Copenhagen interpretation automatically (implicitly) includes collapse. Since you're saying that without explaining what you mean by "collapse", it looks like you're referring to an exact collapse, which requires modifications of the theory. In that case, I strongly disagree. "Collapse" in a minimal ensemble/statistical interpretation is just decoherence, and that's of course included, since the "minimal interpretation" isn't really an interpretation. It's just QM without unnecessary assumptions.
 
  • #93
I am sure that I must be missing something obvious here. My understanding is that wave function evolution is a reversible process. When there is some sort of event that increases entropy, would that not make the evolution non-reversible? Isn't that the cut?
 
  • #94
Jilang said:
I am sure that I must be missing something obvious here. My understanding is that wave function evolution is a reversible process. When there is some sort of event that increases entropy, would that not make the evolution non-reversible? Isn't that the cut?

There is never a precise, objective moment where anything irreversible happens. It's just that as time goes on, and a particle interacts with more and more particles, the practical possibility of reversing the interaction drops exponentially fast.
 
  • #95
Fredrik said:
I've been wondering what you guys meant by proper and improper mixed states. The distinction and terminology seem odd to me. The distinction only makes sense to someone who believes that a pure state represents the system's "real state". (Such a person is an MWI advocate, whether they understand it or not). Such a person would say that a mixed state is only used when we don't know what the correct pure state is.

Regarding the terminology, it seems to make at least as much sense to call the first kind "improper" and the other "proper", because the first kind involves a degree of ignorance that's been introduced artificially, and the second kind involves something fundamentally unknowable. But I guess the terms "proper" and "improper" are only supposed to be labels anyway. The terms might as well be "blue" and "green".

Yes, "proper" and "improper" are just labels. The definition does assume that a pure state represents the maximum information one can have about a quantum system, without recourse to hidden variables. The pure state is privileged because it obeys unitary Schroedinger evolution. Yes, this is a bit like a secret Many-Worlds. :smile: But it isn't technically, because there is a classical/quantum cut, and collapse. The proper mixed state, being constructed from pure states, is also privileged because each component in the proper mix obeys Schroedinger evolution. The improper mixed state is not, because its evolution is governed by Schroedinger evolution of the pure state of the total system, and the subsystem does not usually evolve by Schroedinger evolution.

Fredrik said:
It doesn't allow you to say that they're not the same. So the "postulate" would have to be implicit in the complete lack of additional postulates on top of QM.

You appear to be saying that even the minimal ensemble/statistical/Copenhagen interpretation automatically (implicitly) includes collapse. Since you're saying that without explaining what you mean by "collapse", it looks like you're referring to an exact collapse, which requires modifications of the theory. In that case, I strongly disagree. "Collapse" in a minimal ensemble/statistical interpretation is just decoherence, and that's of course included, since the "minimal interpretation" isn't really an interpretation. It's just QM without unnecessary assumptions.

Yes, I am saying that the minimal ensemble/statistical/Copenhagen interpretation must explicitly or implicitly include collapse or an equivalent axiom in order to be considered correct quantum mechanics. Actually, Copenhagen explicitly includes collapse, which is the Born rule in the form that the probability to observe a state |a>, given that the system is in state |ψ>, is |<a|ψ>|².

The Ensemble interpretation without collapse usually says that the probability to observe the eigenvalue corresponding to state |a>, given that the system is in state |ψ>, is |<a|ψ>|². Thus this form of the Born rule without collapse doesn't give you the probabilities of the sub-ensembles that are formed. Yet we know that the probability of obtaining sub-ensemble |a> after a measurement is |<a|ψ>|². The Born rule without collapse is unable to make this prediction.

Incidentally, I should say that one who thinks that collapse or an equivalent axiom is not needed, and that only decoherence is needed, is also secretly a Many-Worlds advocate, because they are trying to do everything with unitary evolution of a pure state. :smile:
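The two readings of the Born rule contrasted above can be spelled out in a short NumPy sketch of my own (not from the thread; the state and numbers are made up for illustration): the collapse-free form yields only the outcome probabilities |<a|ψ>|², while the collapse reading additionally assigns the post-measurement sub-ensemble the state |a>, reproduced here by projecting and renormalizing.

```python
import numpy as np

# A qubit state |psi> = a|0> + b|1> with |a|^2 = 0.3 and |b|^2 = 0.7.
psi = np.array([np.sqrt(0.3), np.sqrt(0.7)], dtype=complex)

# Measurement basis {|0>, |1>} (eigenvectors of the measured observable).
basis = [np.array([1, 0], dtype=complex), np.array([0, 1], dtype=complex)]

# Born rule without collapse: probability of outcome k is |<k|psi>|^2.
probs = [abs(np.vdot(vec, psi)) ** 2 for vec in basis]
print([round(p, 3) for p in probs])  # [0.3, 0.7]

# Collapse reading: the sub-ensemble that produced outcome 0 is in state |0>.
# Projecting onto |0> and renormalizing reproduces exactly that state.
post = basis[0] * np.vdot(basis[0], psi)
post = post / np.linalg.norm(post)
print(np.allclose(post, basis[0]))  # True
```

The projection-and-renormalization step is the extra content of the collapse postulate: nothing in the probability formula alone forces the post-measurement sub-ensemble to be in state |0>.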
 
Last edited:
  • #96
atyy said:
In an improper mixture, there is no notion of a sub-ensemble. Since the experiment clearly shows that sub-ensembles exist [...]
I don't think that these statements are obviously true. I don't see a quick way to resolve this, so... what is the definition of a sub-ensemble?

/edit: maybe this belongs in its own thread
 
Last edited:
  • #97
atyy said:
A proper mixed state is when Alice makes Ensemble A in pure state |A> and Ensemble B in pure state |B>, then she makes a Super-Ensemble C consisting of equal numbers of members of Ensemble A and Ensemble B. If she hands me C without labels A and B, I can use a mixed density matrix to describe the statistics of my measurements on C. But if in addition I receive the labels A and B, then I can divide C into two sub-ensembles, each with its own density matrix, since C was just a mixture of A and B. Here C is a "proper" mixture, which can be naturally divided into sub-ensembles.

An improper mixed state is when I have an ensemble C in a pure state, each member of which consists of a subsystem A entangled with subsystem B. If I do a partial trace over B, I get a density matrix (the reduced density matrix) which describes the statistics of all measurements that are "local" to A. This reduced density matrix for A is not a pure state, and is an "improper" mixed state. There is no natural way to partition this into sub-ensembles, since there is only one ensemble C.

kith said:
I don't think that these statements are obviously true. I don't see a quick way to resolve this, so... what is the definition of a sub-ensemble?

/edit: maybe this belongs in its own thread

Yes, what is a sub-ensemble? Maybe there are several ways to define it.

1. The way I did it above, I only defined an ensemble. Then I defined a super-ensemble for a proper mixed state. So a sub-ensemble is an ensemble that is part of a super-ensemble, which leaves the notion of a sub-ensemble for a pure state undefined, and likewise for an improper mixed state.

2. The other way of doing it is to say that for a pure state ensemble |a>, each individual member of the ensemble has a hidden variable x, so that the state is really (|a>, x), with a conventional probability distribution over x. Since there is a conventional probability distribution here, the ensemble |a> can be divided into physical sub-ensembles. We know that x must be a "Bohmian" hidden variable, and cannot be a quantum variable, because if the quantum state is |a> = |b1> + |b2>, and we measure in the B basis, the sub-ensembles will be in definite states of b, even though |a> could not have been drawn from a distribution over b1 and b2. Another way of saying that the hidden variables cannot be quantum variables is that for a generic pure state, there is no classical probability distribution over [x,p], since the Wigner function has negative regions.

In option 1, since the sub-ensemble for a pure state is undefined, to derive a sub-ensemble we have to add the postulate by hand. In option 2, the sub-ensembles are defined by "Bohmian" hidden variables and even a pure state corresponds to a conventional probability distribution, so we can form the sub-ensembles. But option 2 is not minimal.

Is there another way?
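The claim in option 2 - that a pure superposition cannot have been drawn from a distribution over the basis states - can be verified directly. In this NumPy sketch of my own (not from the thread), the density matrix of the superposition differs from the 50/50 classical mixture in its off-diagonal terms, and a measurement in a rotated basis exposes the difference.

```python
import numpy as np

b1 = np.array([1, 0], dtype=complex)
b2 = np.array([0, 1], dtype=complex)

# A single pure superposition |a> = (|b1> + |b2>)/sqrt(2).
a = (b1 + b2) / np.sqrt(2)
rho_pure = np.outer(a, a.conj())

# A classical 50/50 mixture over |b1> and |b2>.
rho_mix = 0.5 * np.outer(b1, b1.conj()) + 0.5 * np.outer(b2, b2.conj())

# The off-diagonal (coherence) terms differ, so |a> cannot have been
# drawn from a probability distribution over b1 and b2.
print(np.allclose(rho_pure, rho_mix))  # False

# Measuring in the rotated basis exposes the difference: the pure state
# gives |+> = (|b1> + |b2>)/sqrt(2) with certainty, the mixture gives 50/50.
plus = (b1 + b2) / np.sqrt(2)
print(round(abs(np.vdot(plus, a)) ** 2, 3))             # 1.0
print(round(np.real(plus.conj() @ rho_mix @ plus), 3))  # 0.5
```

This is the density-matrix version of the argument: sub-ensembles of a pure state cannot be classical, which is why option 2 needs "Bohmian" hidden variables.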
 
Last edited:
  • #98
atyy said:
Yes, what is a sub-ensemble? Maybe there are several ways to define it.

1. The way I did it above, I only defined an ensemble. Then I defined a super-ensemble for a proper mixed state. So a sub-ensemble is an ensemble that is part of a super-ensemble, which leaves the notion of a sub-ensemble for a pure state undefined, and likewise for an improper mixed state.
How does the experiment show the existence of such sub-ensembles - or proper mixed states to begin with? Wouldn't this falsify the MWI?

I think the introduction of proper mixed states corresponds to defining the location of the Heisenberg cut. In principle, we could shift the boundary and track the correlations with the state of the instruments which produced the mixture.
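The boundary-shifting kith describes can be illustrated with a toy sketch (my own illustration, assuming a two-state system premeasured by a two-state apparatus): the joint state of system plus instrument is pure and entangled, and tracing out the instrument leaves the system in an improper mixed state whose local statistics match a proper mixture.

```python
import numpy as np

# Premeasurement entangles system S with apparatus A:
# |psi> = a|up>|A_up> + b|down>|A_down>
a_amp, b_amp = np.sqrt(0.3), np.sqrt(0.7)
up, down = np.eye(2)  # up = [1, 0], down = [0, 1]
psi_SA = a_amp * np.kron(up, up) + b_amp * np.kron(down, down)

# The joint state of S+A is pure
rho_SA = np.outer(psi_SA, psi_SA)

# Tracing out the apparatus leaves S in an improper mixed state
rho_S = rho_SA.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
print(rho_S)  # diag(0.3, 0.7): locally indistinguishable from a proper mixture
```

Nothing in rho_S alone says whether the mixture is proper or improper; that information sits in the correlations with the apparatus that were traced out, which is why the cut can be shifted.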
 
  • #99
kith said:
How does the experiment show the existence of such sub-ensembles - or proper mixed states to begin with? Wouldn't this falsify the MWI?

Experiment shows the existence of sub-ensembles because if we take all the individual systems whose measurement produced the eigenvalue k corresponding to the vector |k>, and form an ensemble from those systems, that ensemble has state |k>. For example, suppose a measurement splits the particles into two beams, one measured to be up and the other down, and you block the beam measured to be down. The ensemble formed from the beam measured to be up will have state |up>. This is why in Copenhagen the Born rule is stated as: the probability of finding a particle in state |k>, given that it is in state |ψ>, is |<k|ψ>|².
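This post-selection can be sketched as a toy simulation (my own illustration, not part of the post): outcomes are sampled with Born-rule frequencies, the "down" beam is blocked, and the surviving sub-ensemble is in the definite state |up>.

```python
import numpy as np

rng = np.random.default_rng(0)

up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])

# Ensemble prepared in |psi> = cos(theta)|up> + sin(theta)|down>
theta = 0.4
psi = np.cos(theta) * up + np.sin(theta) * down

# Born rule: P(up) = |<up|psi>|^2
p_up = abs(up @ psi) ** 2

# Measure N members of the ensemble; block the "down" beam
N = 100_000
measured_up = rng.random(N) < p_up
frac_up = measured_up.sum() / N  # observed frequency, close to p_up

# Each member of the post-selected sub-ensemble is now in state |up>,
# so a repeated measurement yields "up" with certainty
p_up_again = abs(up @ up) ** 2
print(p_up, frac_up, p_up_again)
```

The observed frequency converges to |<up|ψ>|², while the blocked-beam construction is exactly what defines the sub-ensemble.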

I don't know whether it would falsify Many-Worlds, but Many-Worlds is the programme of trying to derive (among other things) the collapse as only an apparent collapse due to unitary evolution of the wave function.

kith said:
I think the introduction of proper mixed states corresponds to defining the location of the Heisenberg cut. In principle, we could shift the boundary and track the correlations with the state of the instruments which produced the mixture.

Yes, you can do that. But unless Many-Worlds works, we ultimately must place a cut somewhere in order to use quantum mechanics and obtain measurements with definite outcomes.
 
  • #100
kith said:
I think the introduction of proper mixed states corresponds to defining the location of the Heisenberg cut. In principle, we could shift the boundary and track the correlations with the state of the instruments which produced the mixture.
atyy said:
Yes, you can do that.
Then I don't see how experiments can say anything definite about sub-ensembles.

Your definition of sub-ensembles relies on proper mixed states. The question whether a mixed state is proper or not depends on the location of the Heisenberg cut. If the experiment shows that some states are proper mixtures, it also shows where the cut is. So we would not be able to do what I have written above.
 