Collapse and Peres' Coarse Graining

  • Thread starter: atyy
  • Tags: Collapse
  • #51
Isn't the collapse issue just another name for the state preparation vs. observable measurement difference? I mean, states should be simple bookkeeping devices which have a probabilistic interpretation, once one goes from the abstract (rigged) Hilbert space to lab experiments.
How would one go about explaining to an experimentalist that quantum mechanics is essentially a statistical theory? I've learned QM from a mixture of (the so-called) orthodox Copenhagen formulation (Born rule separated from SEq separated from von Neumann's state reduction after measurement) and the virtual statistical ensemble approach, which used numerical probabilities for results of experiments done on an infinite number of (tricky issue coming!) "identically prepared real quantum systems". How would you reconcile the von Neumann state reduction with the virtual statistical ensemble?
 
  • #52
dextercioby said:
Isn't the collapse issue just another name for the state preparation vs. observable measurement difference? I mean, states should be simple bookkeeping devices which have a probabilistic interpretation, once one goes from the abstract (rigged) Hilbert space to lab experiments.
How would one go about explaining to an experimentalist that quantum mechanics is essentially a statistical theory? I've learned QM from a mixture of (the so-called) orthodox Copenhagen formulation (Born rule separated from SEq separated from von Neumann's state reduction after measurement) and the virtual statistical ensemble approach, which used numerical probabilities for results of experiments done on an infinite number of (tricky issue coming!) "identically prepared real quantum systems". How would you reconcile the von Neumann state reduction with the virtual statistical ensemble?

I don't understand why there is any need to reconcile state reduction with the virtual statistical ensemble.

atyy's point, which seems valid to me, is that the whole idea that a preparation procedure produces a system in a known state implies collapse of a sort. The alternative view (which I guess is equivalent) is that instead of the orthodox view of preparing the system in an initial state and then later measuring some observable, you could just talk about relative probabilities of histories of observations, which doesn't explicitly involve preparation or collapse.
 
  • #53
OK, then how about the conflict between the Schrödinger equation for the time evolution of states and the reduction postulate, which necessarily involves a time evolution of the state, too.
 
  • #54
stevendaryl said:
Actually, in that equation, there is still a ρ(t₀) reflecting the initial preparation, but maybe that can be replaced by an initial observation?

Yes, there is still the initial state, which doesn't evolve in this form of Heisenberg picture. It could be linked to the initial observation, but it would be cumbersome, since the state represents an equivalence class of different classical operations which we call preparations. I can accept that the presence of the initial quantum state is not a form of collapse, in the sense that collapse links preparations and measurement, and says that measurement can result in two outcomes: a classical outcome and a quantum state, and that the classical outcome indicates the quantum state, so both are given by the same Born rule. For consistency, rejecting collapse means that the Schroedinger picture is invalid, and that measurement cannot be used as state preparation, both of which are contrary to standard quantum mechanics, but I am willing to accept that the view is at least consistent (and thus a plausible interpretation).
 
  • #55
dextercioby said:
Isn't the collapse issue just another name for the state preparation vs. observable measurement difference?

In the orthodox Copenhagen interpretation, measurement can be used as a means of state preparation. A measurement can potentially have two outcomes: a classical outcome which is the reading of the apparatus, and a quantum state. The collapse postulate says that the quantum outcome and the classical reading are linked, and both are given by the Born rule.

So not all preparations result from measurement, but some preparations can result from measurement.
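As a concrete sketch of this collapse postulate (illustrative only — the helper `measure` and the restriction to real amplitudes are mine, not anything from the thread or from Peres): for a projective measurement, each classical reading occurs with Born probability, and the associated quantum outcome is the normalized projection of the state.

```python
import math

def measure(state, basis):
    """Projective measurement on a real, normalized state vector.

    Returns, for each basis vector, the classical reading i, its Born
    probability, and the 'collapsed' quantum state (von Neumann
    projection, normalized). Complex amplitudes are omitted for brevity.
    """
    outcomes = []
    for i, e in enumerate(basis):
        amp = sum(e[k] * state[k] for k in range(len(state)))
        p = amp ** 2                       # Born rule for reading i
        if p > 0:
            post = tuple(amp * e[k] / math.sqrt(p) for k in range(len(state)))
            outcomes.append((i, p, post))  # classical + quantum outcome together
    return outcomes

# A state with unequal weights: readings 0 and 1 occur with probabilities
# ~0.36 and ~0.64, and each reading indicates the corresponding post state.
for i, p, post in measure((3 / 5, 4 / 5), [(1, 0), (0, 1)]):
    print(i, p, post)
```

Keeping only the runs with a given reading, together with its post state, is exactly the "measurement as a means of state preparation" use of collapse described above.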

dextercioby said:
I mean, states should be simple bookkeeping devices which have a probabilistic interpretation, once one goes from the abstract (rigged) Hilbert space to lab experiments.

Yes, in the orthodox Copenhagen interpretation quantum states are just bookkeeping devices, as are unitary evolution and collapse.

dextercioby said:
How would one go about explaining to an experimentalist that quantum mechanics is essentially a statistical theory?

In the orthodox Copenhagen interpretation, quantum mechanics only makes statistical predictions by postulation. Thus for a given initial quantum state, the classical and quantum outcomes of a measurement are probabilistic and given by the Born rule.

dextercioby said:
I've learned QM from a mixture of (the so-called) orthodox Copenhagen formulation (Born rule separated from SEq separated from von Neumann's state reduction after measurement) and the virtual statistical ensemble approach, which used numerical probabilities for results of experiments done on an infinite number of (tricky issue coming!) "identically prepared real quantum systems". How would you reconcile the von Neumann state reduction with the virtual statistical ensemble?

In the orthodox Copenhagen interpretation, we can label each member of the virtual ensemble by a pure quantum state: the virtual ensemble and the quantum state are both bookkeeping devices.

dextercioby said:
OK, then how about the conflict between the Schrödinger equation for the time evolution of states and the reduction postulate, which necessarily involves a time evolution of the state, too.

In the orthodox Copenhagen interpretation, we have to divide the world into a classical portion and a quantum portion. This is subjective, but for all practical purposes, we do know what a classical measurement apparatus is, and we can time stamp our observations (as is done in experimental Bell tests). Since we know when measurements occur for all practical purposes, we can also deal with the unitary evolution between measurements, and the non-unitary evolution that occurs when a measurement is made. This division is not absolute, but each user of quantum theory must make this division. In the orthodox Copenhagen interpretation, we do not know whether there is any meaning to the "wave function of the universe".
 
  • #56
atyy said:
We are not discussing terminology. vanhees71 and I have agreed that collapse is not necessarily physical. Collapse is found in almost all standard texts. As far as I know, collapse or an equivalent postulate must be added to {unitary evolution + Born rule without collapse}. I think vanhees71 is saying that collapse can be derived from {unitary evolution + Born rule without collapse}, but I don't believe this is true.
But what would be your objection to having epistemic collapse implicit in the ensemble concept plus the matter of fact of performing and using measurements, instead of explicitly in the form of a postulate? Your initial question seemed to admit the possibility that explicit collapse could be replaced with coarse-graining (implicit collapse), making the explicit postulate unnecessary.
To me, even if such a view only expressed explicitly unitary evolution + Born rule, it would be including the non-unitary phase of evolution as well. I might very well be missing some subtlety, and that's why I ask.
 
  • #57
TrickyDicky said:
But what would be your objection to having epistemic collapse implicit in the ensemble concept plus the matter of fact of performing and using measurements, instead of explicitly in the form of a postulate? Your initial question seemed to admit the possibility that explicit collapse could be replaced with coarse-graining (implicit collapse), making the explicit postulate unnecessary.
To me, even if such a view only expressed explicitly unitary evolution + Born rule, it would be including the non-unitary phase of evolution as well. I might very well be missing some subtlety, and that's why I ask.

Demystifier pointed out that the coarse-graining in Peres's view only acts on the apparatus, not on the quantum system. Hence for the quantum system, the coarse-graining (as far as we can understand Peres) is only equivalent to decoherence. Decoherence does not do away with the need for collapse, because the {system+apparatus+environment} is still in a pure state, which presumably corresponds to an "ensemble". In order to have sub-ensembles, the individual members of the ensemble must be labelled with different labels (eg. an ensemble of identical balls has no natural sub-ensembles, but a mixture of red and green balls is an ensemble with natural sub-ensembles). If the only label that the individual members of the ensemble have is the pure state, then there are no natural sub-ensembles. If there is a label that is not the pure state, that label is a hidden variable, which is an additional postulate.

Another problem is that decoherence is not perfect. But let's suppose decoherence is perfect, in which case it can be argued that decoherence does pick a preferred basis, and thus picks natural sub-ensembles. But this would mean that an ensemble with no sub-ensembles is suddenly divisible into sub-ensembles at the moment of perfect decoherence. This sudden appearance of sub-ensembles is equivalent to collapse. It is clear that this sudden appearance of sub-ensembles needs an additional postulate.

The basic way to see that an additional postulate is needed is that a measurement potentially has two outcomes - a classical reading of the apparatus and a quantum state. If the Born rule has to be specified as a postulate for the classical reading, then it also has to be specified as a postulate for the quantum state. There are of course other postulates that are equivalent, such as noncontextuality for the measurement outcomes, from which the Born rule can be derived via Gleason's theorem. Still the noncontextuality has to be specified as an additional postulate for the classical and quantum outcomes of the measurement.

In any case, I am willing to consider coarse-graining as an additional postulate. However, it is only discussed vaguely in one book, and as far as I can tell, Peres does not specify well enough how the coarse-graining is done that it can replace collapse. So it should be considered speculative research, not mainstream quantum mechanics.
 
  • #58
vanhees71 said:
Ok, I'll see that I get this done over the weekend, but I'll not use the Schroedinger picture, because that's very inconvenient in relativistic QFT, but of course, there are only free fields as usual in quantum optics. Then you only need a "wave-packet description" for the photons. The polarizer is described as ideal in terms of a projection operator located at Alice's and Bob's place. Everything works of course in the two-photon Fock space.

If the polarizer is modeled as a projection operator, one already has non-unitary time evolution.
 
  • #59
atyy said:
Demystifier pointed out that the coarse-graining in Peres's view only acts on the apparatus, not on the quantum system. Hence for the quantum system, the coarse-graining (as far as we can understand Peres) is only equivalent to decoherence.

Decoherence does not do away with the need for collapse, because the {system+apparatus+environment} is still in a pure state, which presumably corresponds to an "ensemble". In order to have sub-ensembles, the individual members of the ensemble must be labelled with different labels (eg. an ensemble of identical balls has no natural sub-ensembles, but a mixture of red and green balls is an ensemble with natural sub-ensembles). If the only label that the individual members of the ensemble have is the pure state, then there are no natural sub-ensembles. If there is a label that is not the pure state, that label is a hidden variable, which is an additional postulate.

Another problem is that decoherence is not perfect. But let's suppose decoherence is perfect, in which case it can be argued that decoherence does pick a preferred basis, and thus picks natural sub-ensembles. But this would mean that an ensemble with no sub-ensembles is suddenly divisible into sub-ensembles at the moment of perfect decoherence. This sudden appearance of sub-ensembles is equivalent to collapse. It is clear that this sudden appearance of sub-ensembles needs an additional postulate.
I don't have access to Peres book so I'm not really commenting on his view. But saying that quantum coarse-graining only acts on the apparatus introduces the quantum-classical distinction as something real when it is just epistemic. See below.
The basic way to see that an additional postulate is needed is that a measurement potentially has two outcomes - a classical reading of the apparatus and a quantum state. If the Born rule has to be specified as a postulate for the classical reading, then it also has to be specified as a postulate for the quantum state. There are of course other postulates that are equivalent, such as noncontextuality for the measurement outcomes, from which the Born rule can be derived via Gleason's theorem. Still the noncontextuality has to be specified as an additional postulate for the classical and quantum outcomes of the measurement.
In any case, I am willing to consider coarse-graining as an additional postulate. However, it is only discussed vaguely in one book, and as far as I can tell, Peres does not specify well enough how the coarse-graining is done that it can replace collapse. So it should be considered speculative research, not mainstream quantum mechanics.
Bohr said that there is no quantum world, just an abstract quantum description. Even though there is confusion about what he meant by that, I take it to mean that there is no real quantum-classical cut; it is just a graphic way to talk about the measurement problem or the existence of hidden variables, but if one insists on taking it literally it can be misleading. Now the minimal interpretation is agnostic about hidden variables (I think it is the only interpretation that is) and allows one to ignore the quantum-classical cut: the wave function is not real in it, and the coarse-graining acts on the quantum system as a whole, since there is no quantum-classical distinction. It also gives one the freedom to consider the probabilities as subjective (irreversibility or non-unitary evolution). I'd say it is in this sense that the collapse postulate is redundant in this interpretation. But, as you say, this might not be a strictly mainstream QM take on it, since it seems to assume hidden variables rather than being agnostic.
 
  • #60
TrickyDicky said:
Bohr said that there is no quantum world, just an abstract quantum description. Even though there is confusion about what he meant by that, I take it to mean that there is no real quantum-classical cut; it is just a graphic way to talk about the measurement problem or the existence of hidden variables, but if one insists on taking it literally it can be misleading. Now the minimal interpretation is agnostic about hidden variables (I think it is the only interpretation that is) and allows one to ignore the quantum-classical cut: the wave function is not real in it, and the coarse-graining acts on the quantum system as a whole, since there is no quantum-classical distinction. It also gives one the freedom to consider the probabilities as subjective (irreversibility or non-unitary evolution). I'd say it is in this sense that the collapse postulate is redundant in this interpretation. But, as you say, this might not be a strictly mainstream QM take on it, since it seems to assume hidden variables rather than being agnostic.

My view is that vanhees71 and Ballentine are just wrong (and that the standard textbooks like Landau and Lifshitz and Weinberg are right). (Maybe Peres is also wrong, but it is a bit vague whether he rejects collapse or not, since he seems to use it extensively in http://arxiv.org/abs/quant-ph/9906034. If you don't have his book, you can see http://arxiv.org/abs/quant-ph/9712044 and http://arxiv.org/abs/quant-ph/9906023 for his remarks on coarse graining. Peres seems to accept the classical/quantum cut and the notion that a measurement outcome must be irreversible.) My view is that the minimal interpretation is the "orthodox" Copenhagen interpretation with a classical/quantum cut and collapse (unless one means that MWI is minimal). Some Ensemble Interpretations such as bhobba's are (AFAICT) correct and equivalent to the "orthodox" Copenhagen interpretation.

In the "orthodox" flavour of Copenhagen, the enigmatic "there is no quantum world" is taken to mean that we are agnostic about whether the wave function is ontic or epistemic. In fact, the traditional wording of the "orthodox" flavour uses both conceptions of the wave functions as conceptual tools. On the one hand, by taking a classical/quantum cut, where the classical world is taken to be absolute reality, while the wave function does not have this status and is taken to be an FAPP tool, the wave function is already "epistemic" or at least "non-ontic" in some sense. The "epistemic" nature of the wave function is especially clear when one considers that in this interpretation, the classical/quantum cut is not absolute and can be shifted. On the other hand, it is acknowledged that we make no mistake in predictions if, having taken the cut, the wave function is taken to be FAPP the complete physical state of an individual system, so the wave function is also taken to be "ontic" in some sense. I like the terms "absolute reality" and "relative reality" that Tsirelson uses in his discussion of the measurement problem http://www.tau.ac.il/~tsirel/download/nonaxio.ps.
 
  • #61
atyy said:
But without collapse, how can measurement be used as a means of quantum state preparation, where we use the classical result obtained to figure out the quantum state of the selected sub-ensemble? (I do understand there is a more general collapse rule than projective measurements, but let's keep things simple here, since there is still collapse in the more general rule.) Does this mean that measurement cannot be used as a form of state preparation in the minimal interpretation?
In classical probability, a probability distribution represents alternate "possibilities". A measurement "actualizes" a sub-ensemble. Would you say there is collapse involved?
The sub-ensemble could still be described with a different probability distribution in which case you could say the measurement "prepared" the new state by selecting a subset of the possibilities. However, if probability distributions are understood as information and not physical, there is absolutely no need to invent a concept of "collapse".
 
  • #62
It seems the collapse and quantum-classical cut you refer to are just the non-commutativity of observables, the quantum hypothesis itself, so they should be present in any interpretation of QM, since all interpretations must obey the HUP. A minimum volume is given to the classical phase space, which assures the irreversibility of measured observables; whether one postulates it formally or not, the math is there.
 
  • #63
billschnieder said:
In classical probability, a probability distribution represents alternate "possibilities". A measurement "actualizes" a sub-ensemble. Would you say there is collapse involved?
The sub-ensemble could still be described with a different probability distribution in which case you could say the measurement "prepared" the new state by selecting a subset of the possibilities. However, if probability distributions are understood as information and not physical, there is absolutely no need to invent a concept of "collapse".

Yes, if it's just a subjective way to describe our information about the system, then whether you call it a collapse or not, there is nothing very weird about it. But consider an example such as EPR with anti-correlated twin pairs (electron/positron). Alice measures spin-up for the electron, then she knows that Bob will measure spin-down for the corresponding positron. So, she just acquired the information, and there's nothing weird about that. But if it's only a matter of acquiring information, then one would think that Bob's particle was spin-down in the z-direction before Alice's measurement. So the view of "collapse" as being purely information would (it seems to me) imply pre-existing values for such things as spin, which is basically hidden variables, which is ruled out by Bell's theorem.

The corresponding "collapse" in the classical case really does mean hidden variables. You have two pieces of paper, one white and one black. You put each into an envelope and mix them up, and give one to Alice to open and another to Bob to open. The second that Alice opens her envelope and finds a white piece of paper, she knows that Bob's envelope contains a black piece of paper. In that case, it is completely consistent (and perfectly natural) for Alice to assume that Bob's envelope contained a black piece of paper even before either of them opened their envelopes.
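The envelope example can be simulated directly (a minimal sketch; `run_pairs` and the seeding are illustrative choices of mine): the anti-correlation is fixed at preparation, so Alice's "collapse" on opening is a pure information update.

```python
import random

def run_pairs(n, seed=0):
    """Prepare n envelope pairs with pre-existing, anti-correlated colors."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(n):
        alice = rng.choice(["white", "black"])          # hidden variable,
        bob = "black" if alice == "white" else "white"  # set at preparation
        pairs.append((alice, bob))
    return pairs

# Opening Alice's envelope never disturbs Bob's paper; conditioning on her
# result just reveals what was already there:
assert all(a != b for a, b in run_pairs(1000))
```

This is exactly the local hidden-variable picture that works for Bertlmann's socks but, by Bell's theorem, cannot reproduce the quantum correlations for non-commuting spin measurements.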
 
  • #64
atyy said:
I like the terms "absolute reality" and "relative reality" that Tsirelson uses in his discussion of the measurement problem
I think the wave function is "possible reality" while experimental results are "actual reality". What selects one of the "possibilities" to be actualized, then is what some people call "collapse", but this is obviously not restricted to quantum mechanics.
 
  • #65
TrickyDicky said:
It seems the collapse and quantum-classical cut you refer to are just the non-commutativity of observables, the quantum hypothesis itself, so they should be present in any interpretation of QM, since all interpretations must obey the HUP.

In the minimal interpretation, the notion of measurement is fundamental. A measurement is the interaction between a classical measurement apparatus and a quantum system, resulting in a definite classical outcome. So the classical/quantum cut is fundamental to the minimal interpretation, unless the notion of measurement can be removed as fundamental in quantum mechanics.

TrickyDicky said:
A minimum volume is given to the classical phase space, which assures the irreversibility of measured observables; whether one postulates it formally or not, the math is there.

There is no minimum volume or even a classical probability distribution over the classical phase space in quantum mechanics. It is true that some books such as Reif use this concept, but although it is useful, I don't think (maybe I'm wrong) it is fundamental, more a very good heuristic like wave-particle duality.
 
  • #66
stevendaryl said:
The corresponding "collapse" in the classical case really does mean hidden variables. You have two pieces of paper, one white and one black. You put each into an envelope and mix them up, and give one to Alice to open and another to Bob to open. The second that Alice opens her envelope and finds a white piece of paper, she knows that Bob's envelope contains a black piece of paper. In that case, it is completely consistent (and perfectly natural) for Alice to assume that Bob's envelope contained a black piece of paper even before either of them opened their envelopes.
It is not as easy as you make it sound. If we move away from the Bertlmann's socks type variables of "paper color" to the more relevant space-type, dynamically changing variables, then it is impossible for Alice to say, by opening her envelope, what Bob will see. For example, let the papers be dynamically changing colors between black and white (in opposite sequence between the two) at a given hidden frequency, such that once you open the envelope, the exposure to light stops the dynamics instantly and locks it to one color. Your claim that Alice will know the result of Bob's paper by opening her envelope is then false. However, we know for a fact that the two pieces of paper have perfectly anti-correlated colors. Often when we discuss these things, we easily gloss over the fact that the discussion requires that Alice and Bob opened their envelopes at the exact same time. But it is impossible to test this experimentally without post-selection. You have to take time-tags on both sides and use the time at one end to filter the results at the other end. Only then will the experiment match what is actually happening. But post-selection invalidates the derivation of the Bell inequalities, since the joint probability distribution for post-selected experiments is non-factorable. It is not surprising that all experiments to date claiming violation of inequalities employ one form of post-processing or another.
 
  • #67
@billschnieder, please discuss your post-selection issue by starting another thread, not here.
 
  • #68
billschnieder said:
However, if probability distributions are understood as information and not physical, there is absolutely no need to invent a concept of "collapse".
And what if it is understood as a formal mathematical measure (http://en.wikipedia.org/wiki/Measure_(mathematics)), based on an axiomatic framework independent of any application, leaving aside any semantic notion (just a formal writing game)?

Patrick
 
  • #69
atyy said:
In the minimal interpretation, the notion of measurement is fundamental. A measurement is the interaction between a classical measurement apparatus and a quantum system, resulting in a definite classical outcome. So the classical/quantum cut is fundamental to the minimal interpretation, unless the notion of measurement can be removed as fundamental in quantum mechanics.
There is no minimum volume or even a classical probability distribution over the classical phase space in quantum mechanics. It is true that some books such as Reif use this concept, but although it is useful, I don't think (maybe I'm wrong) it is fundamental, more a very good heuristic like wave-particle duality.
Measurement is fundamental to any empirical science, not specifically to the minimal interpretation of QM. If you define it as an interaction between a classical apparatus and a quantum system, you are already introducing a specific heuristic or interpretation as fundamental, when the measurement problem is basically the lack of consensus about what measurement in QM entails.
Sometimes I think it would be more productive to turn to the Schrödinger equation for puzzlement, instead of being surprised at measuring classical observables and getting classical outcomes.
 
  • #70
TrickyDicky said:
Measurement is fundamental to any empirical science, not specifically to the minimal interpretation of QM. If you define it as an interaction between a classical apparatus and a quantum system, you are already introducing a specific heuristic or interpretation as fundamental, when the measurement problem is basically the lack of consensus about what measurement in QM entails.

In classical physics (Newtonian physics, special and general relativity), measurement is not a fundamental concept. Historically, Einstein did postulate measurement as fundamental in special relativity: the speed of light measured by any inertial observer is the same. However, we have removed that, and nowadays we say that special relativity means the laws have Poincare symmetry. Historically, measurement was also important in the genesis of general relativity: test particles follow geodesics. The test particle is a sort of measurement apparatus that is apart from the laws of physics because it does not cause spacetime curvature, in contrast to all other forms of matter. However, in the full formulation of general relativity, test particles are not fundamental. So quantum mechanics is different from classical physics in needing to specify measurement as a fundamental concept.

I am using a particular interpretation to define QM, but it is the minimal interpretation. The measurement problem is that we have to put this classical/quantum cut to define the minimal interpretation. The other interpretations are then approaches to solving the measurement problem by removing the need for measurement to be a fundamental concept in the mathematical specification of a theory. Examples of such interpretations are consistent histories (flavour of Copenhagen), hidden variables (generally predicting deviations from QM), or Many-Worlds.

TrickyDicky said:
Sometimes I think it would be more productive to turn to the Schrödinger equation for puzzlement, instead of being surprised at measuring classical observables and getting classical outcomes.

Another way of stating the measurement problem, is that if everything is quantum and we have a wave function of the universe, how can we make sense of such an idea? The minimal interpretation cannot make sense of such an idea, and always needs a classical/quantum cut. Bohmian Mechanics and Many-Worlds are two approaches to solving the measurement problem, in which the wave function of the universe is proposed to make sense.
 
  • #71
atyy said:
However, in the full formulation of general relativity, test particles are not fundamental. So quantum mechanics is different from classical physics in needing to specify measurement as a fundamental concept.
Measurement is needed to "actualize" one of the "possibilities" (or a small set of the non-mutually exclusive possibilities -- aka commuting observables). Without measurement, there is nothing actual to talk about, just possibilities. Measurement will not be so important in a theory that does not rely on a device which represents simultaneously all the "possible realities", like general relativity. In probability theory however, measurement is very important.

I am using a particular interpretation to define QM, but it is the minimal interpretation. The measurement problem is that we have to put this classical/quantum cut to define the minimal interpretation.
We do not. In the minimal interpretation, the classical/quantum cut is simply an unnecessary fiction, which only appears once you choose to interpret the wave function as a real physical thing. But the minimal interpretation is that it is a device for cataloging information about possible states within an ensemble.

Another way of stating the measurement problem, is that if everything is quantum and we have a wave function of the universe, how can we make sense of such an idea?
Understanding the wave function as a catalog of information about possible realities of the universe, there is no difficulty in making sense of it. It already makes sense in the minimal interpretation. No classical/quantum cut is needed. MWI and BM are attempts to solve a problem introduced because their proponents insist on interpreting the wavefunction as a real physical thing: MWI by suggesting that the possibilities are all actualities (including the mutually exclusive ones), BM by suggesting that the possibilities exist as "guiding waves" that orchestrate observations.
 
  • #72
billschnieder said:
Understanding the wave function as a catalog of information about possible realities of the universe, there is no difficulty in making sense of it.

If one actually tries to construct the catalogue of all the possible realities in the wave function, one ends up with the consistent histories approach.
 
  • #73
Say we have two orthogonal polarizers and we shine light through them. Practically no light passes through.
Then we put an additional polarizer at 45 deg. between the first two, and we get a quarter of the light through.
Considering this, I don't understand how one can view measurement as an information update.
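The three-polarizer numbers follow from Malus's law, I = I₀ cos²(Δθ) at each stage (a quick check of my own; `transmitted` is an illustrative helper, and intensities are relative to the light exiting the first polarizer):

```python
import math

def transmitted(angles_deg):
    """Fraction of intensity surviving a chain of ideal polarizers,
    relative to the beam exiting the first one (Malus's law)."""
    frac = 1.0
    for a, b in zip(angles_deg, angles_deg[1:]):
        frac *= math.cos(math.radians(b - a)) ** 2
    return frac

print(transmitted([0, 90]))      # ~0: crossed polarizers block everything
print(transmitted([0, 45, 90]))  # ~0.25: inserting 45 deg lets light through
```

Inserting the middle polarizer changes the outcome precisely because it acts on the beam, which is why reading each polarizer as a mere information update is hard to sustain — zonde's point.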
 
  • #74
@zonde and @billschnieder, could you start a new thread about whether collapse is update or physical? I agree it's an interesting question, but not so relevant at the moment. I would like to focus on the hard science question I wrote in post #47. Is Weinberg's Eq 2.1.7 in Vol 1 of his QFT text part of quantum theory? If it is, is it postulated, or derivable from only {unitary evolution + Born rule without collapse}? If you'd like to discuss whether Eq 2.1.7 represents information update or a physical process in this thread, why don't we wait a bit until we understand vanhees71's view of Weinberg's Eq 2.1.7?
 
  • #75
billschnieder said:
In classical probability, a probability distribution represents alternate "possibilities". A measurement "actualizes" a sub-ensemble. Would you say there is collapse involved?
... However, if probability distributions are understood as information and not physical, there is absolutely no need to invent a concept of "collapse".
In this case, of course, there would be no need for a physical process identified with a collapse. But this picture is incompatible with the violation of Bell's inequalities.

And interpreting something as "information" always requires an answer to the question "information about what?".
 
  • Like
Likes stevendaryl
  • #76
vanhees71 said:
Well, it's at least not a physical process, as claimed by some collapse proponents in the case of quantum theory. It's then even less needed in classical than in quantum theory.
Indeed, in classical statistics we can explain the uncertainty of the statistics as completely being a problem of insufficient information. Getting more information about the real process does not mean that the process is physically influenced.

In quantum theory such an explanation is impossible.

You have an eigenstate of A with eigenvalue a1. You can repeat the measurement of A as much as you like, the result is always a1, never a2, a3, ...

Now you measure some non-commuting B. Then, you choose the subgroup of those with result b1. This operation differs from choosing a subgroup in classical statistics given some additional information that B has value b1, as can be easily seen: Measuring A gives now, with nonzero probability, other values than a1. Instead, restricting to a subensemble of those with value b1 would not modify the result of A being a1.
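This difference from classical subensemble selection can be checked numerically with a qubit, taking ##A = \sigma_z## and ##B = \sigma_x## (my own choice of operators, purely for illustration):

```python
import numpy as np

# Projectors onto the +1 eigenvectors of sigma_z and sigma_x.
P_z_up = np.array([[1.0, 0.0], [0.0, 0.0]])   # |0><0|
plus = np.array([1.0, 1.0]) / np.sqrt(2)      # +1 eigenvector of sigma_x
P_x_plus = np.outer(plus, plus)               # |+><+|

psi = np.array([1.0, 0.0])  # eigenstate of A = sigma_z with eigenvalue a1 = +1

# Repeating the A measurement: probability of a1 stays 1.
p_a1_before = np.abs(psi @ P_z_up @ psi)

# Filter on B = sigma_x outcome b1 = +1 and renormalize the subensemble.
psi_b1 = P_x_plus @ psi
psi_b1 /= np.linalg.norm(psi_b1)

# Measuring A again now gives a1 only half of the time.
p_a1_after = np.abs(psi_b1 @ P_z_up @ psi_b1)

print(p_a1_before, p_a1_after)  # 1.0 and 0.5
```

In classical statistics, conditioning on extra information about B would leave the value of A untouched; here the B-filtering changes the A statistics.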
 
  • #77
atyy said:
@zonde and @billschneider, could you start a new thread about whether collapse is update or physical? I agree it's an interesting question, but not so relevant at the moment. I would like to focus on the hard science question I wrote in post #47. Is Weinberg's Eq 2.1.7 in Vol 1 of his QFT text part of quantum theory? If it is, is it postulated, or derivable from only {unitary evolution + Born rule without collapse}? If you'd like to discuss whether Eq 2.1.7 represents information update or a physical process in this thread, why don't we wait a bit until we understand vanhees71's view of Weinberg's Eq 2.1.7?
Born rule without collapse could work only if a filtering measurement represents an information update. So it's relevant to your question. Besides, didn't vanhees71 already express his viewpoint in post #45?
 
  • #78
atyy said:
In classical physics (Newtonian physics, special and general relativity), measurement is not a fundamental concept. Historically, Einstein did postulate measurement as fundamental in special relativity: the speed of light measured by any inertial observer is the same. However, we have removed that, and nowadays we say that special relativity means the laws have Poincare symmetry. Historically, measurement was also important in the genesis of general relativity: test particles follow geodesics. The test particle is a sort of measurement apparatus that is apart from the laws of physics because it does not cause spacetime curvature, in contrast to all other forms of matter. However, in the full formulation of general relativity, test particles are not fundamental. So quantum mechanics is different from classical physics in needing to specify measurement as a fundamental concept.

I am using a particular interpretation to define QM, but it is the minimal interpretation. The measurement problem is that we have to put this classical/quantum cut to define the minimal interpretation. The other interpretations are then approaches to solving the measurement problem by removing the need for measurement to be a fundamental concept in the mathematical specification of a theory. Examples of such interpretations are consistent histories (flavour of Copenhagen), hidden variables (generally predicting deviations from QM), or Many-Worlds.
Another way of stating the measurement problem is this: if everything is quantum and we have a wave function of the universe, how can we make sense of such an idea? The minimal interpretation cannot make sense of such an idea, and always needs a classical/quantum cut. Bohmian Mechanics and Many-Worlds are two approaches to solving the measurement problem, in which the wave function of the universe is proposed to make sense.
But you don't seem to be talking about what is usually known as the minimal or ensemble interpretation; you are giving too much reality to the quantum state. Not even Copenhagen is so realistic about the state: it only considers the state well defined in the context of observation, which is why it needs a well-defined collapse of the wave function at the moment of measurement. If you don't consider physical states at all, then collapse doesn't arise in such explicit form, because there is no wave function to collapse in any actual sense to begin with. Of course you still have irreversibility and the Born rule in your post #47 form; by definition the basic math of QM must be the same for all interpretations, otherwise it would be a different theory.
This doesn't mean the ensemble interpretation solves the measurement problem any more than magical collapse, decoherence, or many-worlds do; they all leave the preferred-basis side of it basically untouched. But as we saw recently in the paper from the thrashing thread, that problem is inherent to the first postulate, so it can't be cured just by looking at measurement.
 
Last edited:
  • #79
atyy said:
Collapse is the statement of the Born rule in the form ##P(\phi) = |\langle \phi | \psi \rangle|^{2}##, which is what happens in a filtering measurement. This is Eq 2.1.7 in volume 1 of Weinberg's QFT text. I think you agreed that measurement can be used as a means of state preparation, so in that sense I thought you said that collapse exists. If collapse does not exist, then are you saying that Weinberg's Eq 2.1.7 does not exist? If collapse does exist, then it seems you are saying that collapse can be derived, ie. Weinberg's Eq 2.1.7 can be derived.
The only thing I need is the statement that, if a quantum system is prepared in a state, represented by a normalized vector ##|\psi \rangle##, the probability (density) to find the value ##a## of the observable ##A## in the discrete (continuous) part of the spectrum of its representing self-adjoint operator ##\hat{A}##, is given by
##P(a)=\sum_{\beta} |\langle a,\beta|\psi \rangle|^2,##
where ##\beta## labels a complete set of orthonormalized eigenvectors of ##\hat{A}## to the eigenvalue ##a##. Of course, ##\beta## can also be continuous. Then the sum has to be substituted with the corresponding integral.

Where do I need a collapse for Born's postulate?
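For what it's worth, the stated rule, summing over the degeneracy label ##\beta##, can be illustrated with a deliberately degenerate toy observable (the matrices below are my own example, not from the post):

```python
import numpy as np

# A 3-level example: A has eigenvalue a = 1 with a two-fold degeneracy spanned by
# |a,beta=1> = (1,0,0) and |a,beta=2> = (0,1,0), and eigenvalue 2 on (0,0,1).
A = np.diag([1.0, 1.0, 2.0])
eigvecs_a = [np.array([1.0, 0.0, 0.0]),
             np.array([0.0, 1.0, 0.0])]

# A normalized state with equal overlap on all three basis vectors.
psi = np.array([1.0, 1.0, 1.0]) / np.sqrt(3)

# Born rule: P(a) = sum over beta of |<a,beta|psi>|^2.
P_a = sum(np.abs(v @ psi) ** 2 for v in eigvecs_a)
print(P_a)  # 2/3
```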
 
  • #80
vanhees71 said:
The only thing I need is the statement that, if a quantum system is prepared in a state, represented by a normalized vector ##|\psi \rangle##, the probability (density) to find the value ##a## of the observable ##A## in the discrete (continuous) part of the spectrum of its representing self-adjoint operator ##\hat{A}##, is given by
##P(a)=\sum_{\beta} |\langle a,\beta|\psi \rangle|^2,##
where ##\beta## labels a complete set of orthonormalized eigenvectors of ##\hat{A}## to the eigenvalue ##a##. Of course, ##\beta## can also be continuous. Then the sum has to be substituted with the corresponding integral.

Where do I need a collapse for Born's postulate?

So you disagree with Weinberg that his Eq 2.1.7 is a postulate?
 
  • #82
dextercioby said:
What is equation 2.1.7?

Here is my transcription of Weinberg's Eq 2.1.7 in http://books.google.com/books?id=doeDB3_WLvwC&source=gbs_navlinks_s (p50).

## P(\mathscr{R} \rightarrow\mathscr{R_{n}}) = |(\Psi,\Psi_{n})|^{2} ##

It is the probability that a system prepared in state ##\mathscr{R}## is found in state ##\mathscr{R_n}##, if a test is done to find out whether the system is in one of several orthogonal states ##\{ \mathscr{R_1}, \mathscr{R_2}, .. \}##, and ##\Psi## is a (unit) vector representing the state ##\mathscr{R}##.
 
Last edited:
  • #83
Oh, you mean his QFT book; I thought you meant his regular QM book. OK, that makes sense now, even though it's odd that it appears in his QFT book.
 
Last edited:
  • #84
atyy said:
Here is my transcription of Weinberg's Eq 2.1.7 in http://books.google.com/books?id=doeDB3_WLvwC&source=gbs_navlinks_s (p50).

## P(\mathscr{R} \rightarrow\mathscr{R_{n}}) = |(\Psi,\Psi_{n})|^{2} ##

It is the probability that a system prepared in state ##\mathscr{R}## is found in state ##\mathscr{R_n}##, if a test is done to find out whether the system is in one of several orthogonal states ##\{ \mathscr{R_1}, \mathscr{R_2}, .. \}##, and ##\Psi## is a (unit) vector representing the state ##\mathscr{R}##.
Is it applicable to a dissipative system, which is put in a mixed quantum state?

Patrick
 
  • #85
microsansfil said:
Is it applicable for a dissipative system, that is put in a mixed quantum state ?

When the state is mixed, it has to be represented by a density operator. The Born rule and collapse (state reduction) for a density operator are given in
http://arxiv.org/abs/1110.6815 (Eq II.3, II.4 on p9)
http://arxiv.org/abs/0706.3526 (Eq 2, 3 on p4)
 
Last edited:
  • #86
I guess, you mean Eq. (2.1.7) in Quantum Theory of Fields Vol. 1? Then, of course it's Born's rule and thus one of the postulates of quantum theory. I still don't see, where you need a collapse for its statement.

There's no difference when you have a general (mixed) state; no collapse is needed there either. You have a mixed state whenever, in addition to the irreducible indeterminacy of observables (which is present even when the state of the system is completely determined), there is also incomplete knowledge about which state the system is prepared in.

A pure state is of course also uniquely and equivalently described by a statistical operator, not only a ray in Hilbert space. A statistical operator represents a pure state if and only if it is a projection operator.
 
  • #87
vanhees71 said:
I guess, you mean Eq. (2.1.7) in Quantum Theory of Fields Vol. 1? Then, of course it's Born's rule and thus one of the postulates of quantum theory. I still don't see, where you need a collapse for its statement.

Yes, in Weinberg's QFT volume 1. Eq 2.1.7 is the definition of collapse, because it gives the probability for the system to jump from state R to Rn. For example, given the Bell state ##(|00\rangle + |11\rangle)/\sqrt{2}##, when Alice gets the measurement outcome "0", the state transitions to ##|00\rangle##. That is the collapse: just a mathematical statement that is postulated.
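A quick numerical sketch of this reading of Eq 2.1.7, using the projective-update rule ##\Psi \to \hat{P}\Psi / \|\hat{P}\Psi\|## on the Bell state (my own illustration, not Weinberg's notation):

```python
import numpy as np

# Bell state (|00> + |11>)/sqrt(2), basis order |00>, |01>, |10>, |11>,
# with Alice the first qubit and Bob the second.
psi = np.zeros(4)
psi[0] = psi[3] = 1.0 / np.sqrt(2)

# Alice's outcome "0": projector |0><0| on her qubit, identity on Bob's.
P0 = np.array([[1.0, 0.0], [0.0, 0.0]])
P = np.kron(P0, np.eye(2))

p = psi @ P @ psi               # Born probability for outcome "0"
post = P @ psi / np.sqrt(p)     # state after the transition

print(p)     # 0.5
print(post)  # [1, 0, 0, 0], i.e. |00>
```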
 
  • #88
The formulation in Weinberg may be a bit misleading, but there's no collapse implied as far as I understand him. Also one should be more careful in formulating Born's rule to make it compatible with the dynamics. You have to distinguish between the kets representing the state (rays in Hilbert space) and the (generalized) eigenvectors of observables. In a general picture (Dirac picture) of time evolution they evolve with different unitary time-evolution operators
$$|\psi,t \rangle=\hat{C}(t) |\psi,0 \rangle, \quad |\vec{a},t \rangle=\hat{A}(t) |\vec{a},0 \rangle.$$
Here ##\vec{a}## is in the common spectrum of a complete compatible set of observables, and the time-evolution operators obey the equations of motion
$$\frac{\mathrm{d}}{\mathrm{d} t} \hat{C}(t)=-\mathrm{i} \hat{H}_1(t) \hat{C}(t), \quad \frac{\mathrm{d}}{\mathrm{d} t} \hat{A}(t)=\mathrm{i} \hat{H}_2(t) \hat{A}(t)$$
with the initial conditions
$$\hat{C}(0)=\hat{A}(0)=\hat{1}.$$
The operators ##\hat{H}_1## and ##\hat{H}_2## are local in ##t## and self-adjoint operators, related to the Hamiltonian of the system by
$$\hat{H}=\hat{H}_1+\hat{H}_2.$$
Then the probability (density) to find the outcome ##\vec{a}## at time ##t## when measuring the complete set of observables on the system, prepared in the state represented by ##|\psi,t \rangle## is
$$P(\vec{a},t|\psi)=|\langle \vec{a},t|\psi,t \rangle|^2.$$
By construction the time evolution of this probability (density) is independent of the choice of the picture of time evolution.

This does not imply that the quantum system automatically somehow collapses into a state represented by ##|\vec{a},t \rangle## at the time of the measurement. In the case of observables with continuous spectrum this even contradicts the "kinematical" part of the quantum postulates, because then this is not a normalizable state in Hilbert space but a generalized state in the dual space of the nuclear space, where the observable operators are densely defined. Already this formal argument shows that the collapse hypothesis makes no proper sense.
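The claimed picture-independence of ##P(\vec{a},t|\psi)## is easy to check numerically for a time-independent Hamiltonian, comparing the Schrödinger choice ##\hat{H}_1=\hat{H},\ \hat{H}_2=0## with the Heisenberg choice ##\hat{H}_1=0,\ \hat{H}_2=\hat{H}## (my own sketch with a random two-level system):

```python
import numpy as np

def evolve(H, t):
    """U(t) = exp(-i H t) for a time-independent Hermitian H, via eigendecomposition."""
    w, V = np.linalg.eigh(H)
    return V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

rng = np.random.default_rng(0)
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
H = (M + M.conj().T) / 2                  # random Hermitian Hamiltonian
psi0 = np.array([1.0, 0.0])               # |psi, 0>
a0 = np.array([1.0, 1.0]) / np.sqrt(2)    # eigenvector |a, 0> of some observable
t = 0.7

# Schroedinger picture: H1 = H, H2 = 0 -> the state moves, eigenvectors stay fixed.
p_schr = np.abs(a0.conj() @ evolve(H, t) @ psi0) ** 2

# Heisenberg picture: H1 = 0, H2 = H -> eigenvectors move as |a,t> = exp(+iHt)|a,0>.
a_t = evolve(-H, t) @ a0                  # exp(+iHt) = exp(-i(-H)t)
p_heis = np.abs(a_t.conj() @ psi0) ** 2

print(p_schr, p_heis)  # identical up to rounding
```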
 
  • #89
vanhees71 said:
The only thing I need is the statement that, if a quantum system is prepared in a state, represented by a normalized vector ##|\psi \rangle##, the probability (density) to find the value ##a## of the observable ##A## in the discrete (continuous) part of the spectrum of its representing self-adjoint operator ##\hat{A}##, is given by
##P(a)=\sum_{\beta} |\langle a,\beta|\psi \rangle|^2,##
where ##\beta## labels a complete set of orthonormalized eigenvectors of ##\hat{A}## to the eigenvalue ##a##. Of course, ##\beta## can also be continuous. Then the sum has to be substituted with the corresponding integral.

Where do I need a collapse for Born's postulate?

I think we already went around once about this. The question is: How does measurement constitute a preparation procedure for a particular state? Take the simplest example of preparing an electron in the state with spin-up in the z-direction. The usual assumption is that measuring the spin, and finding it to be spin-up means that afterward, the electron is in the spin-up state. Isn't that basically the collapse hypothesis? Without some assumption along those lines, how would you prepare an electron in the spin-up state?
 
  • #90
stevendaryl said:
I think we already went around once about this. The question is: How does measurement constitute a preparation procedure for a particular state? Take the simplest example of preparing an electron in the state with spin-up in the z-direction. The usual assumption is that measuring the spin, and finding it to be spin-up means that afterward, the electron is in the spin-up state. Isn't that basically the collapse hypothesis? Without some assumption along those lines, how would you prepare an electron in the spin-up state?
I answered this already, but again:

Take a Stern-Gerlach (SG) apparatus to "measure" the spin-z direction. A particle running through this apparatus is deflected in one of two possible directions. After this deflection, which is described by unitary time evolution of a particle with spin running through an appropriate inhomogeneous magnetic field (with a large homogeneous component in the z direction, which sorts out the spin-z components as the measured observable), the particles at the respective exits are prepared, with in principle arbitrary accuracy, in the corresponding spin state. Nowhere do you need a collapse. It's simply unitary time evolution, in this example even of a single-particle Schrödinger-Pauli equation.
 
  • #91
vanhees71 said:
I answered this already, but again:

Take a Stern-Gerlach (SG) apparatus to "measure" the spin-z direction. A particle running through this apparatus is deflected in one of two possible directions. After this deflection, which is described by unitary time evolution of a particle with spin running through an appropriate inhomogeneous magnetic field (with a large homogeneous component in the z direction, which sorts out the spin-z components as the measured observable), the particles at the respective exits are prepared, with in principle arbitrary accuracy, in the corresponding spin state. Nowhere do you need a collapse. It's simply unitary time evolution, in this example even of a single-particle Schrödinger-Pauli equation.

I'm not sure about that. If you have a single electron, and you send it through a Stern-Gerlach device, then afterward the electron is describable as a superposition of a left-going spin-up electron and a right-going spin-down electron. How do you then get a prepared state ##|U\rangle## consisting of only a spin-up electron?
 
  • #92
I just take into account only the electrons in the region of space where they have the desired spin-z component, and ignore the ones in the other region. That's the paradigmatic example for a filter measurement. Of course you are right, after the SG apparatus the state is
$$|\Psi \rangle=|\Phi_{\vec{x}},\sigma_z=1/2 \rangle+|\Phi_{\vec{y}},\sigma_z=-1/2 \rangle,$$
where ##\Phi_{\vec{x}}## and ##\Phi_{\vec{y}}## are states corresponding to wave packets located around ##\vec{x}## and ##\vec{y}##, respectively, i.e., the SG apparatus entangles the position of the electron with its ##\sigma_z## value.

In terms of wave mechanics (i.e., the spin-position representation) this state reads
$$\Psi(\vec{r})=\begin{pmatrix}
\Phi_{\vec{x}}(\vec{r}) \\ \Phi_{\vec{y}}(\vec{r})
\end{pmatrix}.$$
 
  • #93
stevendaryl said:
I'm not sure about that. If you have a single electron, and you send it through a Stern-Gerlach device, then afterward the electron is describable as a superposition of a left-going spin-up electron and a right-going spin-down electron. How do you then get a prepared state ##|U\rangle## consisting of only a spin-up electron?

Just to reiterate: Suppose you have some source of electrons in an unknown spin state. You would describe this as the density matrix:

##\rho = \frac{1}{2} ( |U\rangle \langle U | + |D\rangle \langle D | )##

What could you do to the electron to put it into the pure spin-up state ##|U\rangle \langle U|## using only unitary evolution? I think that you can't.
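This "I think that you can't" can be backed up numerically: the purity ##\mathrm{Tr}\,\rho^2## is invariant under ##\rho \to U \rho U^\dagger##, so no unitary turns the maximally mixed state (purity 1/2) into a pure state (purity 1). A sketch with a random unitary (my own illustration):

```python
import numpy as np

rho = 0.5 * np.eye(2)   # maximally mixed spin state: (|U><U| + |D><D|)/2

def purity(r):
    """Tr(rho^2); equals 1 for a pure state, 1/2 for a maximally mixed qubit."""
    return np.real(np.trace(r @ r))

rng = np.random.default_rng(1)
M = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
U, _ = np.linalg.qr(M)                  # QR of a random complex matrix gives a unitary Q

rho_evolved = U @ rho @ U.conj().T      # unitary evolution of the density operator

print(purity(rho), purity(rho_evolved))  # both 0.5; a pure state would give 1.0
```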
 
  • #94
vanhees71 said:
I just take into account only the electrons in the region of space where they have the desired spin-z component, and ignore the ones in the other region. That's the paradigmatic example for a filter measurement. Of course you are right, after the SG apparatus the state is
$$|\Psi \rangle=|\Phi_{\vec{x}},\sigma_z=1/2 \rangle+|\Phi_{\vec{y}},\sigma_z=-1/2 \rangle,$$
where ##\Phi_{\vec{x}}## and ##\Phi_{\vec{y}}## are states corresponding to wave packets located around ##\vec{x}## and ##\vec{y}##, respectively, i.e., the SG apparatus entangles the position of the electron with its ##\sigma_z## value.

In terms of wave mechanics (i.e., the spin-position representation) this state reads
$$\Psi(\vec{r})=\begin{pmatrix}
\Phi_{\vec{x}}(\vec{r}) \\ \Phi_{\vec{y}}(\vec{r})
\end{pmatrix}.$$

That doesn't prepare a pure spin-up state. I suppose you could say that because the experiment involves a region of space in which the spin-down component is negligible, then effectively, we can pretend that there is only a spin-up component. That's fine, but then the notion of preparing a system in a particular state is a shorthand for something more complicated.
 
  • #95
Why is it too complicated to just consider only the electrons in a certain region of space?

Of course, also if your initial state is a mixture, a Stern-Gerlach apparatus can be used as described. In the original experiment, Stern and Gerlach used thermal silver atoms from an oven, extracting a beam through a small opening. That's clearly a mixture of the type you've written down.
 
  • #96
vanhees71 said:
Why is it too complicated to just consider only the electrons in a certain region of space?

I'm just pointing out that when people say that they are starting with electrons in some particular state ##|\psi\rangle##, that is not what they are literally doing, if you don't invoke a collapse hypothesis. You don't get to pick the starting state, if you are only allowed unitary evolution.
 
  • #97
But in my gedanken experiment I simply take a particle in a certain region of space after it has run through an SG apparatus sorting the particles according to their spin-z components into different regions. I could set up a "beam dump" to absorb all the particles except the ones in the region associated with the wanted ##\sigma_z## value. Then, of course, the time evolution of the system as a whole is not described by unitary time evolution, if you don't take the whole beam dump into account. What's wrong with that?
 
  • #98
vanhees71 said:
But in my gedanken experiment I simply take a particle in a certain region of space after it has run through an SG apparatus sorting the particles according to their spin-z components into different regions. I could set up a "beam dump" to absorb all the particles except the ones in the region associated with the wanted ##\sigma_z## value. Then, of course, the time evolution of the system as a whole is not described by unitary time evolution, if you don't take the whole beam dump into account. What's wrong with that?

I didn't say there was anything wrong with it, only that it isn't accurate to describe such a thing as "preparing the system in state ##|\psi\rangle##". In another post, I suggested an alternative description in terms of histories of observations. You use unitary evolution, plus a generalization of the Born rule to give the relative probability of getting history ##H## given that the initial part of the history is ##H_0## (some initial segment of history ##H##).
 
  • #99
Ok, so how would you describe the working of the SG apparatus?
 
  • #100
vanhees71 said:
Ok, so how would you describe the working of the SG apparatus?

I don't have a suggestion for the right way to describe things, I'm only questioning what are the implications of describing things in terms of "preparing a system in a particular initial state". If you assume "collapse of the wave function", then you can describe things this way: If you measure the position of the electron, then afterwards, the electron is in a state in which position is localized. Then you pass that electron through a Stern-Gerlach device, and the spin becomes correlated with position. So if you later measure its position, then the wave function collapses to a state of localized position and definite spin. Now you've prepared an electron in the state spin-up in whatever direction.

If you don't assume collapse of the wave function, then it's less clear to me exactly how things should be described. Possibly, as I suggested, it should be described in terms of the relative probabilities of histories of observations, rather than "prepare in state ##|\psi\rangle##, measure observable ##O##, get eigenvalue ##\lambda## with probability such and such, computable using ##\psi##, ##O##, ##\lambda##, and unitary evolution". It could be that the latter description is derivable from the description in terms of histories.
 