Under what circumstances does the wave function collapse?

In summary, the wave function is said to collapse when a measurement or interaction with a quantum system forces a superposition of multiple states into a single definite state. The exact mechanism of this collapse is still a subject of debate and research in quantum mechanics, but it is a fundamental concept that plays a crucial role in understanding the behavior of quantum particles.
  • #1
jaydnul
The hypothesis that a conscious observer collapses the wave function has been discarded, right? The real reason is that the particle you use to measure the other disrupts the wave function, forcing it to choose an eigenvalue.

So since we are able to remove the conscious observer as the cause, I'm confused about how to interpret everyday reality. Wouldn't almost all particles be forced to constantly choose an eigenvalue because they are so closely interacting with everything else (like my foot and the floor)? This would lead me to speculate that only the rare particle that is isolated enough, like in the vacuum of space or in the laboratory, would be in a superposition of eigenvalues, and that the majority of matter around us is constantly at a defined eigenvalue because the wave function is always collapsing.
 
  • #2
1) There is everyday reality. But everything to do with the wave function - including collapse - is not necessarily real, being just a tool to calculate the probabilities of the outcomes of measurements. I especially recommend the textbooks of Landau and Lifshitz or Weinberg on this point, in which they stress that to use quantum mechanics we need to put a commonsense cut between the macroscopic measuring apparatus, which registers a particular outcome, and the quantum world, which is not necessarily real. Because the cut requires common sense, it is acceptable, if unfashionable, to say that a "conscious observer" or an "observer with commonsense" is needed to collapse the wave function. Here is an example of treating the wave function and collapse as not necessarily real: http://arxiv.org/abs/1007.3977.

2) As a variant of the above, one can use the formalism of continuous measurements: http://arxiv.org/abs/quant-ph/0611067.

In the past, some people have said that maybe quantum mechanics means that there is no commonsense reality that extends throughout all spacetime, since part of it is always described by the wave function, which is not real. This was partly due to an erroneous proof by von Neumann, which claimed that there cannot be "hidden variables" that describe reality underlying quantum mechanics. However, Bohmian Mechanics provides one sort of counterexample, so there is no problem with simply saying there is commonsense reality throughout all spacetime, but we just do not know the true underlying degrees of freedom, and we use quantum mechanics as an effective theory, until it is falsified experimentally and experimental data guides us to discovering the degrees of freedom.

An alternative approach to reality is to say that all possible outcomes occur. This is the Many-Worlds approach. It is not yet known if this is without technical problems. One should also say that the Bohmian approach looks very ugly if special relativity is exact.
 
  • #3
Jd0g33 said:
The hypothesis that a conscious observer collapses the wave function has been discarded, right?

Well actually, wave-function collapse is not part of the formalism of QM - it's only part of some interpretations, and even then only for so-called filtering-type observations.

Jd0g33 said:
The real reason is that the particle you use to measure the other disrupts the wave function, forcing it to choose an eigenvalue.

The real reason is decoherence:
http://www.ipod.org.uk/reality/reality_decoherence.asp

Insofar as we have an explanation, that is - it doesn't explain the so-called problem of outcomes, without going into exactly what that is.

The modern view would be that an observation occurs once decoherence happens, which causes apparent collapse. The classical, common-sense world we see around us is explained by the fact that everything is observed by the environment.

Some issues do remain, however, which I will be happy to go into if you are interested.

Thanks
Bill
 
  • #4
atyy said:
An alternative approach to reality is to say that all possible outcomes occur. This is the Many-Worlds approach. It is not yet known if this is without technical problems. One should also say that the Bohmian approach looks very ugly if special relativity is exact.
bhobba said:
The modern view would be that an observation occurs once decoherence happens, which causes apparent collapse. The classical, common-sense world we see around us is explained by the fact that everything is observed by the environment.

Some issues do remain, however, which I will be happy to go into if you are interested.

So is this to say that the majority of matter that we humans encounter (or are immediately aware of from our limited senses) is in fact being "observed" or collapsed by nature, giving it a definite eigenvalue?
 
  • #5
Jd0g33 said:
So is this to say that the majority of matter that we humans encounter (or are immediately aware of from our limited senses) is in fact being "observed" or collapsed by nature, giving it a definite eigenvalue?

No, your question seems to assume that the wave function is real. What we observe is real. The wave function is not necessarily real, and its collapse is not necessarily real. As an example: in relativity different observers have different notions of simultaneity, and collapse occurs simultaneously across all space for each observer, which means different observers do not agree on collapse. So collapse is not an invariant event. The probabilities of measurement outcomes are invariant, even though each observer collapses the wave function along a different slice of spacetime. http://arxiv.org/abs/1007.3977

If you want an example of the sort of reality that may underlie the quantum world, you can take a look at Bohmian mechanics.

I recommend reading Bell's 'Against Measurement' http://www.tau.ac.il/~quantum/Vaidman/IQM/BellAM.pdf.
 
  • #6
Another point to bear in mind is that when a conscious observer does a measurement, they are making a choice. They are choosing an apparatus that effects decoherence in a very particular way, and there is no reason to think that reality itself yields the same type of decoherence. For example, a person doing a measurement might want to know the momentum of a particle, so would intentionally couple the quantum system to an apparatus that decoheres the different momentum states, or they might want to know the location, so they intentionally couple to an apparatus that decoheres position states. Nature doesn't need to make those choices; it just lets the particle interact with its environment in ways that might create very complicated decoherences that put the particle into an eigenstate of neither position nor momentum. So I think it is rather incorrect to hold that nature is constantly putting particles into eigenstates, without the participation of physicists.
 
  • #7
bhobba said:
Well actually, wave-function collapse is not part of the formalism of QM - it's only part of some interpretations, and even then only for so-called filtering-type observations.
Could you please suggest a reference for this statement? I am very interested. Thanks.
 
  • #8
fluidistic said:
Could you please suggest a reference for this statement? I am very interested. Thanks.

It's to do with the axiomatic basis of QM.

You will find a correct explanation of those axioms (there are only two and they do not contain collapse) and their consequences in Ballentine - Quantum Mechanics - A Modern Development:
https://www.amazon.com/dp/9810241054/?tag=pfamazon01-20

To elaborate further, see a recent post I did (post 137):
https://www.physicsforums.com/showthread.php?t=763139&page=8

It's a strange but true fact that one can derive all of QM from a single axiom (which is what I show in the post above, where I derive Ballentine's two axioms).

The axiom is:
An observation/measurement with possible outcomes i = 1, 2, 3 ... is described by a POVM Ei such that the probability of outcome i is determined by Ei, and only by Ei, in particular it does not depend on what POVM it is part of.

As you can see, there is nothing about collapse in the axiom. That axiom implies the existence of this thing called a state, but it's simply an aid to calculating the probabilities of outcomes.

Most of the time, when you observe something it's destroyed by the observation, so speaking of the state collapsing in that case is meaningless. Only in the case where it's not destroyed does continuity imply the state of the system is an eigenvector of the observable after the observation, but really all you have done is prepared the system in a different way - these are called filtering-type observations. The modern view is to associate a state with a preparation procedure, so naturally it changes if you prepare a system differently. Since, from the formalism, the state is simply an aid to calculating the probability of observation outcomes, the fact that it changes when you prepare a system differently doesn't mean anything - naturally it will change.
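As a rough numerical illustration of the axiom above (a sketch of my own - the state and POVM chosen are arbitrary): the probability of each outcome is fixed by its POVM element alone, via the Born rule p_i = Tr(E_i ρ).

```python
import numpy as np

# Sketch: outcome probabilities from a POVM via p_i = Tr(E_i rho).
# The qubit state and POVM elements below are illustrative choices.

plus = np.array([1, 1]) / np.sqrt(2)          # the state |+>
rho = np.outer(plus, plus.conj())             # density matrix |+><+|

E0 = np.diag([1.0, 0.0])                      # projector onto |0>
E1 = np.diag([0.0, 1.0])                      # projector onto |1>
assert np.allclose(E0 + E1, np.eye(2))        # POVM elements sum to identity

for i, E in enumerate((E0, E1)):
    p = np.trace(E @ rho).real
    print(f"p_{i} = {p:.3f}")                 # 0.500 each
```

Note the calculation never mentions collapse; the state is only an input to the probability rule.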

It's the same if you throw a die. Before throwing, each side had a 1/6 chance of coming up. Throw it, and one side gets a probability of one, the rest zero. Nothing collapsed - the knowledge about the die simply changed.

Ballentine also carefully explains this as well.

This is from the formalism. Interpretations have their own take.

Thanks
Bill
 
  • #9
bhobba said:
You will find a correct explanation of those axioms (there are only two and they do not contain collapse) and their consequences in Ballentine - Quantum Mechanics - A Modern Development:
https://www.amazon.com/dp/9810241054/?tag=pfamazon01-20

Well, bhobba and I disagree on this. My opinion is that Ballentine is misleading and does not get it correct. In fact, as far as I can tell, bhobba does get it correct, because he explicitly postulates the equivalence of proper and improper mixed states, which is equivalent to collapse. But Ballentine has no such postulate, as far as I can tell.

One can get away without collapse, but the interpretation is usually quite weird - for example, if one denies the existence of successive measurements. This strategy also denies the existence of Bell tests, because measurements that are simultaneous in one frame will be successive in another frame. Nonetheless, since the cut is subjective, there is nothing to prevent one from denying the reality of distant measurements. Milburn and Wiseman's book has some formulations without collapse: https://www.amazon.com/dp/0521804426/?tag=pfamazon01-20.

Some people also say that the Bohmian interpretation and Many-Worlds do not have collapse, which is correct if they are exactly equivalent to quantum mechanics. I would not say that, because the Bohmian interpretation as a solution of the measurement problem implicitly goes beyond quantum mechanics. Many-Worlds would be exactly equivalent to quantum mechanics, but I think its correctness is still debated.
 
  • #10
atyy said:
Well, bhobba and I disagree on this. My opinion is that Ballentine is misleading and does not get it correct.

Mate we all know you have a different view of Ballentine than most around here.

I have zero problem with you explaining your view - in fact I think it's great for people to get different takes.

Thanks
Bill
 
  • #11
Ken G said:
Another point to bear in mind is that when a conscious observer does a measurement, they are making a choice.

Yea Ken - all true mate.

But I think it also needs to be pointed out that nature is making observations all the time without any kind of conscious intervention, e.g. a dust particle is decohered by a few stray photons from the CMBR, giving it an effectively definite position.

Thanks
Bill
 
  • #12
bhobba said:
As you can see, there is nothing about collapse in the axiom. That axiom implies the existence of this thing called a state, but it's simply an aid to calculating the probabilities of outcomes.

To me, the case where "collapse" seems appropriate is a case of a composite system, such as the typical EPR thought experiment: We prepare two spin-1/2 particles in an entangled state, with opposite spins. Alice measures the spin of one particle in direction [itex]\vec{a}[/itex] to be spin-up. Later, Bob measures the spin of the other particle in some other direction, [itex]\vec{b}[/itex].

After Alice measures spin-up for her particle, but before Bob measures his particle's spin, she can predict the probability of Bob measuring spin-up by assuming that Bob's particle is in a definite state that is spin-down in direction [itex]\vec{a}[/itex]. So from Alice's point of view, Bob's chances of measuring spin-up in direction [itex]\vec{b}[/itex] changed discontinuously when she performed the measurement of her particle.

It seems strange to view that as a matter of "filtering".

Of course, one way to view this that avoids the issue of collapse is to say this is a composite measurement: Alice measures spin of one particle at angle [itex]\vec{a}[/itex], and Bob measures spin of another particle at angle [itex]\vec{b}[/itex]. QM gives the probability distribution for possible outcomes of this composite measurement. However, if the measurements are not simultaneous (suppose, as I said, Alice does hers first), then Alice can consider the situation between her measurement and Bob's measurement, and ask: What is the state of affairs in this in-between time? She can certainly characterize the state by a probability distribution on Bob's outcomes, and this distribution will be different before and after Alice's measurement. To me, that change is what is meant by "collapse".
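To make the numbers in this example concrete, here is a minimal sketch (my own illustration; the angle is hypothetical): for the singlet state, once Alice gets spin-up along [itex]\vec{a}[/itex], the probability that Bob gets spin-up along [itex]\vec{b}[/itex] at angle θ to [itex]\vec{a}[/itex] is sin²(θ/2).

```python
import numpy as np

# Sketch of the EPR example above (illustrative, not from the post).
# Two qubits in the singlet state; Alice measures along a = z, Bob along b
# at angle theta to a (in the x-z plane).

def spin_up(theta):
    """Spin-up eigenvector along an axis at angle theta to z."""
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

singlet = (np.kron([1, 0], [0, 1]) - np.kron([0, 1], [1, 0])) / np.sqrt(2)

theta = np.pi / 3                      # hypothetical angle between a and b
amp = np.kron(spin_up(0.0), spin_up(theta)) @ singlet
P_joint = abs(amp) ** 2                # P(Alice up AND Bob up)
P_alice_up = 0.5                       # each side of a singlet is unpolarized

print(P_joint / P_alice_up)            # conditional P(Bob up) = 0.25
print(np.sin(theta / 2) ** 2)          # sin^2(theta/2) = 0.25, matches
```

The discontinuous change described above is exactly the jump of Bob's probability from 1/2 (before Alice's result is known) to sin²(θ/2) (after).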
 
  • #13
stevendaryl said:
Of course, one way to view this that avoids the issue of collapse is to say this is a composite measurement: Alice measures spin of one particle at angle [itex]\vec{a}[/itex], and Bob measures spin of another particle at angle [itex]\vec{b}[/itex]. QM gives the probability distribution for possible outcomes of this composite measurement.

It all depends on how you view it.

The formalism is neutral on that.

In EPR all you have done is disentangle an entangled state - and that still is simply a different preparation procedure.

Thanks
Bill
 
  • #14
bhobba said:
It all depends on how you view it.

The formalism is neutral on that.

In EPR all you have done is disentangle an entangled state - and that still is simply a different preparation procedure.

Yes, but if one defines a rule of preparation with given probabilities (in other words, a rule of collapse), then a measurement is defined.

For example, in section 6.2.2 of http://arxiv.org/abs/0810.3536 they state that a measurement model defines an instrument which defines an observable. Here we only need that an instrument defines an observable, since defining an instrument defines a rule of collapse or preparation from measurement (Eq 6.8).

A similar statement is found in http://arxiv.org/abs/0706.3526, where an instrument - something that amounts to a rule of collapse - is defined in Eqs. 3 and 4, followed by the comment "The above equation shows that every instrument defines a unique POVM."

At this point, the state evolution changes from deterministic to probabilistic, and there is a new rule of evolution. It is true that one need not view collapse as non-unitary, but there is a new rule here which Ballentine is missing, or about which he misleadingly suggests that unitary deterministic evolution is sufficient.
 
  • #15
bhobba said:
But I think it also needs to be pointed out that nature is making observations all the time without any kind of conscious intervention, e.g. a dust particle is decohered by a few stray photons from the CMBR, giving it an effectively definite position.
But why would nature conspire to give it a definite position and not, for example, a definite momentum? I think nature is decohering every which way from Sunday, and we just pick out the examples where it decoheres around a position basis for what we call a position measurement, and where it decoheres around a momentum basis for what we call a momentum measurement. Most of the time, nature isn't doing either of those, not being a careful physicist after all! So my main point is, just because we have an eigenstate of one type of measurement does not mean the state "has collapsed", since it's still in a superposition state in regard to any measurement that is more of the complementary type. This is just like your point about when measurements are more properly characterized as preparations, but the careful physicist chooses the preparation he/she desires, whereas nature makes no such choices so has preparations that are more of a mish-mash.

For example, nature might prepare a mishmash of spins, and the careful (and conscious) physicist passes them through a Stern-Gerlach apparatus to separate up and down spins. So we have a preparation/collapse in regard to up and down spins, but we built the Stern-Gerlach for that very purpose. Now, nature might accidentally make what is in effect a Stern-Gerlach apparatus too, but more likely it won't, due to certain entropic considerations. The thinking mind is in some sense locally bucking entropy for some particular purpose, and that is quite important in how measurement/preparations work in physics. So I think we could rightly claim that collapse, in the careful sense of preparation that you point out, is indeed a consequence of the functioning of the conscious mind of the physicist. Not in some magico-mystical way that requires some strange explanation, but in the very clear choices that physicists make when they do physics, which requires no explanation that goes beyond neuroscientific understanding of how we figure stuff out.
 
  • #16
Ken G said:
But why would nature conspire to give it a definite position and not, for example, a definite momentum?

It's to do with the radial nature of most interactions - they always single out position.

The detail can be found in Schlosshauer's text:
https://www.amazon.com/dp/3540357734/?tag=pfamazon01-20

Thanks
Bill
 
  • #17
Jd0g33 said:
The hypothesis that a conscious observer collapses the wave function has been discarded, right?

Wrong.

"A poll was conducted at a quantum mechanics conference in 2011 using 33 participants (including physicists, mathematicians, and philosophers). Researchers found that 6% of participants indicated that they believed the observer "plays a distinguished physical role (e.g., wave-function collapse by consciousness)".

https://en.wikipedia.org/wiki/Von_Neumann–Wigner_interpretation#Reception
 
  • #18
atyy said:
At this point, the state evolution changes from deterministic to probabilistic, and there is a new rule of evolution

Again, you are interpreting it in a particular way not implied by the formalism.

The formalism is this - a system's state is defined as synonymous with a preparation procedure. It evolves according to Schrödinger's equation until subject to another preparation procedure - or destroyed. Naturally, since it's the definition of a state, it will change when subject to a different preparation procedure.

I mentioned before throwing a die. It has one side up - so that side has probability one and the rest zero. The reason is that it has been prepared that way. I throw it, and afterwards another side may come up; that side now has probability one and the rest zero. I have subjected it to a different preparation procedure. Has anything collapsed? No.

In fact the above lies right at the heart of QM.

Suppose we have a system in 2 states represented by the vectors [0,1] and [1,0]. These states are called pure. These can be randomly presented for observation and you get the vector [p1, p2], where p1 and p2 give the probabilities of observing each pure state. This is like throwing the die as above. Such states are called mixed. Probability theory is basically the theory of such mixed states. Presenting the states for observation is a preparation procedure. Now consider the matrix A that, say, after 1 second transforms one pure state to the other, with rows [0, 1] and [1, 0]. But what happens when A is applied for half a second? That would be a matrix U with U^2 = A. You can work this out and, lo and behold, U is complex. Apply it to a pure state and you get a complex vector. This is something new. It's not a mixed state - but you are forced to it if you want continuous transformations between pure states.

QM is basically the theory that makes sense of such complex states. It's done by the Born rule - and Gleason's theorem strongly suggests it's the right way to go, but doesn't prove it. But the situation is the same. A state is simply some preparation procedure. The very foundation of QM is that the state can change continuously - in fact that, as explained above, is what separates it from standard probability theory, which can't.
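For the curious, here is a quick numerical check of the square-root argument above (my own sketch):

```python
import numpy as np
from scipy.linalg import sqrtm

# A swaps the two pure states [1,0] and [0,1] after one second.
A = np.array([[0., 1.],
              [1., 0.]])

# The "half-second" evolution is a matrix U with U @ U = A.
U = sqrtm(A)
print(U)
# [[0.5+0.5j  0.5-0.5j]
#  [0.5-0.5j  0.5+0.5j]]  -- necessarily complex: A has eigenvalue -1,
#                            so no real 2x2 square root exists.

assert np.allclose(U @ U, A)

# Applied to a pure state, U gives a complex vector - an amplitude
# vector, not a classical probability vector:
print(U @ np.array([1., 0.]))   # [0.5+0.5j  0.5-0.5j]
```

(U also turns out to be unitary, which connects to the reversibility point discussed further below.)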

IMHO this collapse thing masks the real issue - it's the problem of outcomes - but that should really be another thread.

Thanks
Bill
 
  • #19
mal4mac said:
Wrong.

You consider 6% a mainstream interpretation?

The issue isn't that such an interpretation exists, it's that the vast majority, by your number 94%, dismiss it. They can do that because, unlike what some bits of junk, such as What The Bleep Do We Know Anyway, will tell you, plenty of interpretations exist that don't require it, and views that do are very much in the minority. That's one reason that book is a load of bollocks - it uses a minority view to justify new-age mysticism. I in fact view this as a strong reason to reject such views - it can be used as justification for mystical nonsense.

Thanks
Bill
 
  • #20
bhobba said:
It's to do with the radial nature of most interactions - they always single out position.

The detail can be found in Schlosshauer's text:
https://www.amazon.com/dp/3540357734/?tag=pfamazon01-20
I feel that might be begging the question though, because we agree that it is the preparation type of "collapse" that we are really interested in, so it's not the position per se that is important, it is what the position tells us about the state we are trying to understand. So for a spectrometer, the position on the CCD tells us the momentum of the original state we are studying, and in a Stern-Gerlach, the position tells us the spin of the original state. So while at the end of the day we are always decohering position (indeed, we are probably reading a dial, so that's already a position), what matters is the meaning of the dial reading - and that's what needs to be selected carefully by the conscious intelligence. Nature can do this accidentally, like how raindrops can break sunlight into the spectrum of a rainbow (which is a momentum preparation that will correlate with decohered position), but it is rare, as required by entropy considerations. Decoherence that reliably or consistently conveys information about the original system (which is what happenstance position decoherences of the type Schlosshauer must be talking about do not do) is what we really mean by collapse, and that is what requires the careful conscious choices made by an intelligent scientist.
bhobba said:
Suppose we have a system in 2 states represented by the vectors [0,1] and [1,0]. These states are called pure. These can be randomly presented for observation and you get the vector [p1, p2], where p1 and p2 give the probabilities of observing each pure state. This is like throwing the die as above. Such states are called mixed. Probability theory is basically the theory of such mixed states. Presenting the states for observation is a preparation procedure. Now consider the matrix A that, say, after 1 second transforms one pure state to the other, with rows [0, 1] and [1, 0]. But what happens when A is applied for half a second? That would be a matrix U with U^2 = A. You can work this out and, lo and behold, U is complex. Apply it to a pure state and you get a complex vector. This is something new. It's not a mixed state - but you are forced to it if you want continuous transformations between pure states.
I absolutely love this; you have pointed this out before, and I don't think I have ever seen a more valuable insight into the nature of quantum mechanics. I think it is also consistent with the view that "collapse" is really just a manifestation of the choices we make when we decide how we are going to frame nature. We choose a die because it gives discrete outcomes, and so we spend our time studying dice. Then we ponder how weird it is that nature gives us discrete outcomes out of continuous phenomena. It's not mystical to say that consciousness is responsible for collapse; it is simply recognizing the role of how we think, of what types of experiments we consider to be telling us something.
 
  • #21
bhobba said:
You consider 6% a mainstream interpretation?

The issue isn't that such an interpretation exists, it's that the vast majority, by your number 94%, dismiss it. They can do that because, unlike what some bits of junk, such as What The Bleep Do We Know Anyway, will tell you, plenty of interpretations exist that don't require it, and views that do are very much in the minority. That's one reason that book is a load of bollocks - it uses a minority view to justify new-age mysticism. I in fact view this as a strong reason to reject such views - it can be used as justification for mystical nonsense.

That figure comes from a mainstream document so it *is* a mainstream interpretation! I was lazy just posting the Wikipedia link, the original is:

http://arxiv.org/abs/1301.1069

Why bring in What The Bleep? Astrologers can use anything to justify mystical nonsense - e.g., planets. Should we not mention planets in case some astrologer uses them to justify nonsense? There's a long tradition behind the "consciousness causes wave-function collapse" idea, with first-rate minds behind it (von Neumann, Heisenberg, Wigner...). The OP prompted me to look and see if it was still prominent. OK, 6% isn't that prominent, but it shows it's still a player! You can't say it has been discarded until that figure hits 0%.
 
  • #22
mal4mac said:
That figure comes from a mainstream document so it *is* a mainstream interpretation! I was lazy just posting the Wikipedia link, the original is:

http://arxiv.org/abs/1301.1069

In this case, the paper is reporting that when a group of 33 people including "physicists, philosophers, and mathematicians" were polled, two of them responded to a vaguely-worded question in a particular way. That's a shaky foundation for a claim that 6% of modern physicists believe that consciousness causes collapse.

(That paper received a fair amount of discussion here immediately after it was published - search for it and you'll find the thread.)
 
  • #23
mal4mac said:
That figure comes from a mainstream document so it *is* a mainstream interpretation!

Your interpretation of mainstream and mine differ somewhat - and from others' as well, I suspect.

But that is not the issue - the issue is the vast majority reject it and for good reason.

I could discuss it here - but the silly view of the world it leads to is well known and has been discussed many times on this forum - e.g. see post 43:
https://www.physicsforums.com/showthread.php?t=765350&page=3

It does not prove such a view wrong - but like I said in the post, if you promulgated it in front of a CS class they would likely leave laughing their heads off.

And the issue with books like the one I mentioned is that they use it to justify new-age nonsense without pointing out that most physicists would not accept such a view and that it leads to all sorts of absurdities.

Thanks
Bill
 
  • #24
bhobba said:
https://www.physicsforums.com/showthread.php?t=765350&page=3


It does not prove ["consciousness causes collapse"] wrong - but like I said in the post, if you promulgated it in front of a CS class they would likely leave laughing their heads off.

So what? If the interpretation is valid, then who cares whether they laugh or not? As someone said on the thread you linked me to, some professors of physics do believe in "consciousness causes collapse", and that paper I quoted has 6% believing it. OK, this is a low percentage, but no one would get into a high-prestige conference on interpretation if they were barmy (would they?). If they are barmy, then "the world of physics" needs to get its act together and get the barmy ones carried away by the men in white coats, or how can the general public trust anything "the world of physics" says?
 
  • #25
Nugatory said:
In this case, the paper is reporting that when a group of 33 people including "physicists, philosophers, and mathematicians" were polled, two of them responded to a vaguely-worded question in a particular way. That's a shaky foundation for a claim that 6% of modern physicists believe that consciousness causes collapse.

Fair enough, but it doesn't get away from the fact that several leading experts on "Fundamental Problems in Quantum Theory" believe "consciousness causes collapse".
 
  • #26
What's more, it's hard to know how each of those experts interprets that phrase. If you ask a question of 33 people, and they interpret the question in 33 different ways, then a "yes" or "no" answer they give is essentially meaningless; what matters is what those 33 different interpretations of the question are.

For example, as I said above, even if I would not be considered an expert, I think I can make a strong case that it is impossible to even give meaning to the concept of collapse, used as a means of conveying the knowledge of a preparation of a system, without referring to consciousness. This is because of the essential role the word "knowledge" is playing in that sentence, and the inseparable connection between knowledge and consciousness invoked in science. Hence, I would not say that consciousness causes collapse, the way dropping a rock on my toe causes my toe to hurt; I would say that collapse requires consciousness, the way a hurting toe requires a consciousness as well. How would an expert who takes my meaning answer that question? I think that is very unclear.
 
  • #27
bhobba said:
Again, you are interpreting it in a particular way not implied by the formalism.

The formalism is this - a system's state is defined as synonymous with a preparation procedure. It evolves according to Schrödinger's equation until subject to another preparation procedure - or destroyed. Naturally, since it's the definition of a state, it will change when subject to a different preparation procedure.

I mentioned before throwing a die. It has one side up - so that side has probability one and the rest zero. The reason is that it has been prepared that way. I throw it, and afterwards another side may come up; that side now has probability one and the rest zero. I have subjected it to a different preparation procedure. Has anything collapsed? No.

In fact the above lies right at the heart of QM.

Suppose we have a system in 2 states represented by the vectors [0,1] and [1,0]. These states are called pure. These can be randomly presented for observation and you get the vector [p1, p2], where p1 and p2 give the probabilities of observing each pure state. This is like throwing the die as above. Such states are called mixed. Probability theory is basically the theory of such mixed states. Presenting the states for observation is a preparation procedure. Now consider the matrix A that, say, after 1 second transforms one pure state to the other, with rows [0, 1] and [1, 0]. But what happens when A is applied for half a second? That would be a matrix U with U^2 = A. You can work this out and, lo and behold, U is complex. Apply it to a pure state and you get a complex vector. This is something new. It's not a mixed state - but you are forced to it if you want continuous transformations between pure states.

QM is basically the theory that makes sense of such complex states. It's done by the Born rule - and Gleason's theorem strongly suggests it's the right way to go, but doesn't prove it. But the situation is the same. A state is simply some preparation procedure. The very foundation of QM is that the state can change continuously - in fact that, as explained above, is what separates it from standard probability theory, which can't.

One can name it whatever one wants, but the point is that Gleason's theorem (or Busch's) also applies to the preparation procedure (what you are calling collapse in the Bell test), which has probabilistic outcomes, because the preparation procedure defines a POVM. So there is a preparation-measurement link, even though it is true that there is not necessarily a measurement-preparation link (an instrument defines an observable, but an observable does not define a unique instrument).
 
  • #28
bhobba said:
Now consider the matrix A that, say, after 1 second transforms one pure state to the other, with rows [0, 1] and [1, 0]. But what happens when A is applied for half a second? That would be a matrix U with U^2 = A. You can work this out and, lo and behold, U is complex. Apply it to a pure state and you get a complex vector. This is something new. It's not a mixed state - but you are forced to it if you want continuous transformations between pure states.
Actually, on further reflection, I think this continuity requirement must not be the whole story, because I think I can offer an interpretation of this situation which is still continuous, but doesn't give complex results and isn't quantum mechanics. Just imagine a normal x-y plane, except interpret the signs of the coordinates as meaningless, i.e., -1/2 = 1/2 and so on, which is natural because we are only ever going to test the probabilities |p1|^2 and so on. Then we still have continuity, but no complex numbers. A is a rotation by 90 degrees followed by ignoring the sign of the amplitudes, and the square root of A is a rotation by 45 degrees, also ignoring the sign of the amplitudes.

So this is a rather different beast from QM, because the square root of A maps the meaningful upper-right quadrant two-to-one onto only half the upper-right quadrant, whereas in QM the mapping is one-to-one and onto, if we mod out by an ignorable common phase of both coordinates. So it seems we need more than just continuity: we also need the one-to-one quality, and we need to mod out by global complex phase rather than just by individual signs.

So there's something deeper here than just continuity of probability; there is something about how these probabilities behave, checked out in practice, that is required to say we have QM. I believe this additional element is unitarity, which is essentially a time-invertible property (the one-to-one requirement). Or you can look at it algebraically, as you do above, by saying that you interpret A = U^2 as a true algebraic constraint on what A and U do, so the "equality" there has to mean more than "gives the same testable probabilities." Then we are not talking about equivalence classes of states under the acts of scientific testing; we have to be talking about algebraically equal states, even though these states are never actually tested as such.

That's why it seems there has to be something ontological about the wave function if you want to derive QM from the continuity postulate. Or you could drop the ontological requirement of an algebraic meaning of equality, and instead say that all A and U have to give not only continuous results, but also invertible results. I think this means you need a continuous group of probability transformations.
 
  • #29
Ken G said:
Actually, on further reflection, I think this continuity requirement must not be the whole story, because I think I can offer an interpretation of this situation which is still continuous, but doesn't give complex results and isn't quantum mechanics. Just imagine a normal x-y plane, except interpret the signs of the coordinates as meaningless, i.e., -1/2 = 1/2 and so on, which is natural because we are only ever going to test the probabilities |p1|^2 and so on. Then we still have continuity, but no complex numbers. A is a rotation by 90 degrees followed by ignoring the sign of the amplitudes, and the square root of A is a rotation by 45 degrees, also ignoring the sign of the amplitudes.

So this is a rather different beast from QM, because the square root of A maps the meaningful upper-right quadrant two-to-one onto only half the upper-right quadrant, whereas in QM the mapping is one-to-one and onto, if we mod out by an ignorable common phase of both coordinates. So it seems we need more than just continuity: we also need the one-to-one quality, and we need to mod out by global complex phase rather than just by individual signs.

So there's something deeper here than just continuity of probability; there is something about how these probabilities behave, checked out in practice, that is required to say we have QM. I believe this additional element is unitarity, which is essentially a time-invertible property (the one-to-one requirement). Or you can look at it algebraically, as you do above, by saying that you interpret A = U^2 as a true algebraic constraint on what A and U do, so the "equality" there has to mean more than "gives the same testable probabilities." Then we are not talking about equivalence classes of states under the acts of scientific testing; we have to be talking about algebraically equal states, even though these states are never actually tested as such.

That's why it seems there has to be something ontological about the wave function if you want to derive QM from the continuity postulate. Or you could drop the ontological requirement of an algebraic meaning of equality, and instead say that all A and U have to give not only continuous results, but also invertible results. I think this means you need a continuous group of probability transformations.

Yes. bhobba is probably referring, in a very summarized way, to the postulates for finite-dimensional quantum mechanics set out by Hardy in http://arxiv.org/abs/quant-ph/0101012. Of course, one can see from there that one doesn't have to postulate collapse, if one adds other postulates. Nonetheless, there is collapse, as Hardy derives. So I am mystified by the claim that collapse is not part of the formalism, if the reference given is Ballentine. (I do accept other references such as Bohmian Mechanics, Many-Worlds as a potential approach, and those described in the text by Wiseman and Milburn.)
 
  • #30
Hardy does seem to reach the same conclusion I was pondering, though I had a question about this: " Axiom 5 (which requires that there exists continuous reversible transformations between pure states) rules out classical probability theory. If Axiom 5 (or even just the word "continuous" from Axiom 5) is dropped then we obtain classical probability theory instead." How can Hardy say that dropping "continuous" from Axiom 5 yields classical probability theory instead of QM? Surely dropping an axiom that QM obeys cannot require you to obtain a probability theory that QM does not obey. Perhaps he means that if you don't require continuity, then you open the door for classical probability theory, but you have to replace it with some other axiom that QM does not obey to actually get classical probability theory, that replaces QM.
 
  • #31
Ken G said:
Hardy does seem to reach the same conclusion I was pondering, though I had a question about this: " Axiom 5 (which requires that there exists continuous reversible transformations between pure states) rules out classical probability theory. If Axiom 5 (or even just the word "continuous" from Axiom 5) is dropped then we obtain classical probability theory instead." How can Hardy say that dropping "continuous" from Axiom 5 yields classical probability theory instead of QM? Surely dropping an axiom that QM obeys cannot require you to obtain a probability theory that QM does not obey. Perhaps he means that if you don't require continuity, then you open the door for classical probability theory, but you have to replace it with some other axiom that QM does not obey to actually get classical probability theory, that replaces QM.

I think you are right, and it should be that without continuity, one has either classical or quantum probability.

Edit: Here's support for Ken G's correction from Hardy's later version (but with slightly different axioms) http://arxiv.org/abs/1303.1538 : "Classical probability theory and quantum theory are only two theories consistent with the following postulates. ... To single out quantum theory it suffices to add anything that is inconsistent with classical probability and consistent with quantum theory."
 
  • #32
Yes, that certainly makes more sense. But I really like this way of characterizing quantum and classical probability theories as just two different classes of probability theories, with remarkably minor differences in the axioms that have such major implications for what seems normal vs. bizarre.
 
  • #33
bhobba said:
The real reason is decoherence:
http://www.ipod.org.uk/reality/reality_decoherence.asp

Insofar as we have an explanation, that is - it doesn't explain the so-called problem of outcomes, without going into exactly what that is.

The modern view would be that an observation occurs once decoherence happens, which causes apparent collapse. The classical, common-sense world we see around us is explained by the fact that everything is observed by the environment.

From Fred Kuttner, co-author of Quantum Enigma:
You make an excellent argument. The resolution is to note that decoherence does not give us a classical world. Rather, it gives us a world which is still in a superposition, but in which the off-diagonal elements of the density matrix, which would allow us to measure the superposition, vary so rapidly in space that, measured with any macroscopic instrument, they average to zero.

The quantum Zeno effect, on the other hand, requires an actual collapse, and no resulting superposition. Thus there is really no contradiction as long as you realize the bogus argument made by some decoherence proponents.

The email is in response to this I sent him:
It is often claimed on www.physicsforums.com that we experience a classical world due to decoherence. However, decoherence is simply ignorance of the whole state of the environment entangled with the system. If it were true that the environment is constantly observing the macroscopic world, then because of the quantum Zeno effect, wouldn't the macroscopic world stay static, because it's being observed all the time?

Wouldn’t the above be a good argument against decoherence producing the classical world around us – because the classical world doesn’t remain static?
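A minimal sketch of the off-diagonal suppression Kuttner describes (my own illustration, not from the email): under dephasing, the coherences of a qubit's density matrix decay while the outcome probabilities on the diagonal are untouched.

```python
import numpy as np

# A qubit starts in the superposition |+> = (|0> + |1>)/sqrt(2).
plus = np.array([1, 1]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())
print(rho)    # all entries 0.5: diagonal = probabilities,
              # off-diagonal = coherences of the superposition

# Phase damping with strength lam in [0, 1] models entanglement with an
# unmonitored environment: coherences shrink, diagonals stay put.
def dephase(rho, lam):
    K0 = np.sqrt(1 - lam) * np.eye(2)
    K1 = np.sqrt(lam) * np.diag([1.0, 0.0])
    K2 = np.sqrt(lam) * np.diag([0.0, 1.0])
    return sum(K @ rho @ K.conj().T for K in (K0, K1, K2))

for _ in range(20):                    # repeated weak environmental "kicks"
    rho = dephase(rho, 0.3)
print(np.round(rho, 4))                # diagonal still 0.5, off-diagonals ~ 0
```

The global state is still a superposition, as Kuttner says; only the locally measurable interference terms have been suppressed.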
 
  • #34
Regarding Kuttner's response:

1) It is true that decoherence is insufficient to explain the classical real world, with its continuous sequence of particular outcomes even when one is not looking - one needs to add an interpretation such as Bohmian Mechanics to do that.

2) However, the quantum Zeno effect is not a good argument against decoherence constantly observing the world - there is the formalism of continuous measurements, in which the system continues to evolve.
http://arxiv.org/abs/quant-ph/0611067
http://arxiv.org/abs/math-ph/0512069
http://arxiv.org/abs/quant-ph/0201115v2

The Belavkin paper is interesting because it obtains equations like those in Continuous Spontaneous Localization theories (which go beyond quantum mechanics) by using the quantum formalism and the idea of continuous measurement.
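For reference, a minimal numeric illustration of the Zeno effect under discussion (my own sketch, with hypothetical numbers): frequent projective measurements of a Rabi-oscillating qubit freeze it near its initial state.

```python
import numpy as np

# A qubit Rabi-oscillates from |0> toward |1>; the amplitude to remain
# in |0> after time t is cos(omega * t). Measuring n times in total time
# T projects back onto |0> each time with probability cos^2(omega*T/n),
# so the survival probability is cos^2(omega*T/n)**n -> 1 as n grows.

omega, T = 1.0, np.pi / 2   # unmeasured, |0> would fully flip to |1>

for n in (1, 10, 100, 1000):
    p_survive = np.cos(omega * T / n) ** (2 * n)
    print(f"n = {n:4d}: P(still |0>) = {p_survive:.4f}")
# n = 1: 0.0000, n = 10: 0.7806, n = 100: 0.9756, n = 1000: 0.9975
```

The continuous-measurement papers cited above address why weak, realistic environmental monitoring need not produce this frozen limit.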
 
  • #35
atyy said:
It is true that decoherence is insufficient to explain the classical real world, with its continuous sequence of particular outcomes even when one is not looking - one needs to add an interpretation such as Bohmian Mechanics to do that.

There seems to be a bit of confusion about what I wrote, so I will repeat it with the key point highlighted:

bhobba said:
The real reason is decoherence:
http://www.ipod.org.uk/reality/reality_decoherence.asp

Insofar as we have an explanation, that is - it doesn't explain the so-called problem of outcomes, without going into exactly what that is.

I didn't want to get into the issue, but the tangent this has gone off on means it looks like I have to.

The problem of outcomes is basically what makes an improper mixture a proper one. That is the extra assumption that allows decoherence to resolve the measurement problem, and it is the key unexplained issue of decoherence - different interpretations tackle it differently. I tackle it head-on and simply declare them the same - how? Blank-out. BM assumes an objective trajectory and position, so trivially an improper mixed state is a proper one. MW assumes each outcome of the improper mixture is a separate world. Consistent Histories doesn't even have observation - QM for that interpretation is the stochastic theory of histories.

Stevie TNZ quoted:
'It is often claimed on www.physicsforums.com that we experience a classical world due to decoherence. However, decoherence is simply ignorance of the whole state of the environment entangled with the system. If it were true that the environment is constantly observing the macroscopic world, then because of the quantum Zeno effect, wouldn't the macroscopic world stay static, because it's being observed all the time?'

I haven't seen that claimed very often, and when it is, I and others correct it.

My claim, and the claim of those who use interpretations incorporating decoherence, is that decoherence, plus SOME OTHER ASSUMPTION, explains the classical world.

I know sometimes I and others forget to mention the second bit - so even though it's annoying having to go over the same territory for the umpteenth time, fair cop - you have to explain it again. But this time I think I was pretty careful about what was said.

Thanks
Bill
 