Can the Born Rule Be Derived in the Many Worlds Interpretation?

  • #122
Fredrik said:
Without a way to map preparation procedures to the mathematical things that are supposed to represent them in the theory, we don't have a theory. So yes, we need something like the rule that says that after a non-destructive measurement of an observable ##A## with non-degenerate result ##a##, the state is ##|a\rangle##.
What I really can't tell from reading your posts: Do you think this is something weird i.e. weirder than the situation in classical mechanics?
 
  • #123
kith said:
What I really can't tell from reading your posts: Do you think this is something weird i.e. weirder than the situation in classical mechanics?
I do. The reason I'm a bit against "interpretations" is that I don't think an explanation for that weirdness is hidden inside QM. The best case scenario is that it can be explained by a new theory. The worst case scenario is that it's beyond the reach of scientific methods, and therefore fundamentally unknowable.

Edit: I think I didn't read your question carefully enough. I read it as "do you think there's something weird in QM?", but you're quoting me and using the word "this", so you were asking specifically if I think that it's weird that we can prepare pure states through non-destructive measurements. That's a bit harder to answer, because now I have the option to focus on the decoherence argument that sort of explains it. But this argument relies on QM, so it takes us back to the fundamental mystery of "what's really happening to the system?".
 
  • #124
Fredrik said:
I read it as "do you think there's something weird in QM?", but you're quoting me and using the word "this", so you were asking specifically if I think that it's weird that we can prepare pure states through non-destructive measurements. That's a bit harder to answer, because now I have the option to focus on the decoherence argument that sort of explains it. But this argument relies on QM, so it takes us back to the fundamental mystery of "what's really happening to the system?".
This was kind of my point. You said that it is essential for a theory to relate preparation procedures with elements of the theory. So this can't really be the weird thing. Another candidate for the weirdness is that the relation of preparation procedures and elements of the theory seems to contradict the time evolution law of the theory. But if we include all relevant parts of the experiment in the quantum description we aren't led to such a contradiction.

So I am inclined to agree with your last sentence that the basic weirdness is "what's really happening to the system?". But does a scientific theory have foundational issues if it doesn't answer this question? After all, I wouldn't call this a scientific question. And QM even answers the question what we get if we apply a simple realistic interpretation: Many Worlds. (I am a bit unsure about whether a full-blown MWI has fundamental issues or not, however)
 
  • #125
kith said:
If we can put the cut wherever we want to, why not include the detectors on the microscopic side? This would mean that the need for collapse depends on the description, so collapse would not be a fundamental notion but a tool we use in the calculations.

What I'm saying is that the state is just a tool we use in our calculations, and collapse is as fundamental as the state as a tool.

kith said:
Now this sounds more like the crucial difference is the observer. If the observer sets up two spin measurement apparatuses and just looks at the final result, no collapse is needed. If he checks the intermediate result, collapse is needed. Do I get you right here?

Yes.
 
  • #126
atyy said:
What I'm saying is that the state is just a tool we use in our calculations, and collapse is as fundamental as the state as a tool.
The state is present in all descriptions, collapse not.

atyy said:
Yes.
Ballentine comments on this in chapter 9.3, where he criticizes different notions of collapse. He writes
"(ii) The observer causes the reduction process (9.9) when he reads the result of the measurement from the apparatus.
This is really just a variant of (i) with the observer, rather than the apparatus, causing the disturbance, and it is refuted simply by redefining (II) [the measurement apparatus] to include both the apparatus and the observer."

/edit: just to state this clearly - your notion of collapse implies that consciousness causes collapse?
 
  • #127
vanhees71 said:
Again, I'm too naive to understand the necessity of a collapse at all! Take a "classical" situation of throwing a die. Without further knowledge about its properties, I use the maximum-entropy method to associate a probability distribution. Of course, the least-prejudice distribution in the Shannon-Jaynes information-theoretical sense is that the occurrence of any certain value is ##p_k=1/6##, ##k \in \{1,2,3,4,5,6\}##.

Now I throw the die once and get "3". Has my probability distribution now somehow collapsed to ##p_3=1## and all other ##p_k=0##? I don't think that anybody would argue in that way.

Now, of course, my probability distribution is based on incomplete knowledge about the die. In this case I assumed I don't know anything about it and used the maximum-entropy principle to make a prediction of the probability distribution based on the objective principle of "least prejudice" (which of course has a certain well-defined meaning). Now, how to check this prediction? Of course, I have to throw the die many times, independently and always under the same conditions, and compile statistics of how often the outcomes occur. Then I may come to the conclusion that the die is somehow biased towards a different distribution, and henceforth I use a new probability distribution based on the gained (statistical) knowledge about the die. Would you now say that the probability distribution has somehow collapsed to the new one, and that this is a physical process involving the die? I don't think that anybody would argue in that way.

Sure, it has long been considered attractive to try to interpret collapse in this way. Cohen-Tannoudji, Diu and Laloe introduce collapse partly in this way.

vanhees71 said:
That's done only in quantum theory. There you also have a very precise definition of how to associate probability distributions to the outcome of measurements, given a certain (equivalence class of) preparation procedures of the (quantum) system under investigation. E.g., if you prepare a complete set of observables to have determined values, you have prepared the system in a unique pure state (represented by a ray in Hilbert space). Now I perform a measurement of any other observable. In the following let's assume that we perform an ideal von Neumann filter measurement.

Is now some collapse occurring in the sense of some flavors of the Copenhagen interpretation? If so, when does it occur?

Note that there are two possible scenarios: (a) I measure the quantity and filter out all systems having a certain given value of this observable. According to standard quantum theory, I have to describe such a system by a pure state, namely the eigenvector given by the projection of the original state onto the eigenspace of the measured eigenvalue. Has the state now collapsed to the new pure state and, if yes, when and how does this collapse occur?

Collapse occurs when you make a measurement. The problem does not enter with collapse, it already enters with the notion of measurement, because we can ask: when does a measurement occur?

vanhees71 said:
(b) I measure the quantity in the sense of an ideal filter measurement but do not take notice of the measured outcome. Then, again according to standard quantum theory in the minimal interpretation, I associate a mixed state with the then newly prepared system. Has the state now collapsed to the new mixed state?

I'd answer "no" to all these collapse questions. For me the association of states is an objective association of mathematical objects like state kets or statistical operators based on a known preparation procedure of the quantum system. This gives me certain probabilities to find a certain value of any observable of the system, implying that the value of this observable is indeterminate if the system is not in an eigenstate of the self-adjoint operator associated with this observable. The measurement is simply an interaction with a measurement apparatus which somehow stores the value of the measured observable so that this value can be read off. Nothing happens beyond this physical interaction, and nothing collapses in the Copenhagen sense as a physical process in nature.

It may be that the system can be followed further after the measurement (e.g., when an ideal filter measurement has been performed), and the interaction with the measurement apparatus (and possibly an associated filter process) can be seen as a new preparation of the system, which I can describe further quantum mechanically in terms of a (pure or mixed) state. In this latter case, there is no collapse either, but just the application of well-defined rules of how to describe our knowledge about the system, which can be checked experimentally. That's all that physics is about: a description of our possible objective knowledge about physical systems. No more and no less. Any further question some philosophers are after, like an "ontology of quantum systems", is metaphysics and not my business as a physicist. This I leave to the philosophers!

Yes, for a non-selective measurement in which you do not take note of the outcome, the Born rule with collapse is not required. It can be modeled by deterministic unitary time-evolution and decoherence. For selective measurements, the time evolution is probabilistic, and collapse must be invoked if a sequence of measurements is described.

vanhees71 said:
More often you even destroy the system you measure; e.g., using photons as quantum systems, they usually end up absorbed by the detector. Are you then saying there must occur a collapse, although there's not even a photon left to be described in any way? I don't think so. I also leave it to philosophers to determine what the "ontology of photons" is beyond the description we use as physicists in terms of QED.

The generalization of projective measurements, including collapse, is the rule for POVMs. This rule, which includes collapse, correctly gives the lack of a photon after a measurement, because the system is left in the ground state of the photon field of QED.
 
  • #128
kith said:
The state is present in all descriptions, collapse not.

Yes, in the sense that if successive measurements are not performed, collapse is not needed. So I agree collapse is not needed if we never describe successive measurements, and we do not predict that a projective measurement can be used to prepare a state. Ballentine, unfortunately, says that filtering measurements can be used to prepare a state, so he is wrong to reject collapse without replacing it with an equivalent postulate.

kith said:
Ballentine comments on this in chapter 9.3, where he criticizes different notions of collapse. He writes
"(ii) The observer causes the reduction process (9.9) when he reads the result of the measurement from the apparatus.
This is really just a variant of (i) with the observer, rather than the apparatus, causing the disturbance, and it is refuted simply by redefining (II) [the measurement apparatus] to include both the apparatus and the observer."

/edit: just to state this clearly - your notion of collapse implies that consciousness causes collapse?

I wouldn't object to that. If we follow, say, Landau and Lifshitz, then measurement - the extraction of a definite outcome by the macroscopic apparatus on one side of the Heisenberg cut from the quantum system on the other side of the cut - is what causes collapse. Since I don't know whether a non-conscious observer will draw a Heisenberg cut, it can be argued that consciousness is indirectly responsible for collapse.

To test this, one would have to build a robot that can do quantum mechanics, and then decide whether that robot is conscious:)
 
  • #129
kith said:
So I am inclined to agree with your last sentence that the basic weirdness is "what's really happening to the system?". But does a scientific theory have foundational issues if it doesn't answer this question?
In my opinion, no. A set of statements about the real world has to be falsifiable in order to be considered a theory, but I think that's it. It doesn't have to describe what's happening between state preparation and measurement. It's great if it does, but OK if it doesn't.
 
  • #130
atyy said:
Yes, in the sense that if successive measurements are not performed, collapse is not needed.
But whether successive measurements are performed or not depends on the description.

Let's assume we have an observer A who puts the Heisenberg cut between a spin system and two SG apparatuses. He performs successive measurements and needs collapse. We as observers B use a different cut which includes the system, the apparatuses and the observer in the quantum domain. Our description doesn't need collapse because we don't perform successive measurements. Both descriptions refer to the same physical situation. One description needs collapse and one doesn't. In this case, collapse wouldn't be a fundamental notion.
 
  • #131
kith said:
But whether successive measurements are performed or not depends on the description.

Let's assume we have an observer A who puts the Heisenberg cut between a spin system and two SG apparatuses. He performs successive measurements and needs collapse. We as observers B use a different cut which includes the system, the apparatuses and the observer in the quantum domain. Our description doesn't need collapse because we don't perform successive measurements. Both descriptions refer to the same physical situation. One description needs collapse and one doesn't. In this case, collapse wouldn't be a fundamental notion.

Sure, if one agrees never to describe successive measurements, and if one agrees never to use measurement as a means of state preparation, then collapse is not needed as a fundamental notion.
 
  • #132
atyy said:
Sure, if one agrees never to describe successive measurements, and if one agrees never to use measurement as a means of state preparation, then collapse is not needed as a fundamental notion.
My point was that all successive measurements can be described in terms of a single measurement. I don't actually have to use such a description.
 
  • #133
kith said:
My point was that all successive measurements can be described in terms of a single measurement. I don't actually have to use such a description.

Sure. So I should say that if one wants to describe successive measurements, and use measurement as a means of state preparation, as Ballentine does, then collapse is needed as a fundamental postulate. Of course this is subjective, since quantum mechanics is subjective. It's just a tool for the observer to use.

An analogy is equilibrium statistical mechanics. The notion of equilibrium is subjective, since we believe the universe is expanding. If one does not observe equilibrium, then there is no need for equilibrium statistical mechanics.

To be a little speculative, let me suggest that the analogy with equilibrium statistical mechanics can be seen in both Bohmian mechanics and Many-Worlds. Bohmian mechanics depends on the concept of a quantum equilibrium, which presumably is subjective in the same way that equilibrium in classical statistical mechanics is. In Many-Worlds (let's assume it works just for discussion), collapse is transformed into branching which occurs when decoherence occurs. Since decoherence is never perfect, there must be an element of subjectivity as to when branching occurs.
 
  • #134
atyy said:
Sure. So I should say that if one wants to describe successive measurements, and use measurement as a means of state preparation, as Ballentine does, then collapse is needed as a fundamental postulate.
I still don't agree with the fundamental postulate part. If I can get rid of something by using an equally valid description, I don't consider this something fundamental to the theory. We probably won't reach an agreement here.
 
  • #135
atyy said:
Since decoherence is never perfect, there must be an element of subjectivity as to when branching occurs.
I think the very notion of a branch depends on the decomposition of the universal Hilbert space which in turn depends on the observer's definition of systems. Maybe the only way to avoid this subjectivity is to take every one dimensional subspace as a world (as Fredrik suggested in an older thread).
 
  • #136
kith said:
I still don't agree with the fundamental postulate part. If I can get rid of something by using an equally valid description, I don't consider this something fundamental to the theory. We probably won't reach an agreement here.

That's fine. I don't think this is a technical disagreement. There's a similar problem with the Bell tests. When does Bob consider Alice's measurement to be complete? Common sense says that Bob considers Alice's measurement complete when it was done at spacelike separation. However, the result was not definite for Bob at spacelike separation. So can Bob consider that Alice's result, and her record of it, became definite for him only when she showed him the results at non-spacelike separation? It seems yes, he could say that. In which case there are no Bell tests. Basically, Bob can choose his Heisenberg cut so that Alice is quantum until she shows him the result.

Apparently this loophole was known to Einstein. In Wiseman's words http://arxiv.org/abs/quant-ph/0509061:
"None of this appears much different from Einstein’s 1935 arguments. But here for the first time he also stated: [3] (p. 85)

One can escape from this conclusion [that statistical quantum theory is incomplete] only by either assuming that the measurement of S1 (telepathically) changes the real situation of S2 or by denying independent real situations as such to things which are spatially separated from each other. Both alternatives appear to me equally unacceptable.

Omitting the opinion (clearly stated as such) about what was acceptable in a physical theory, the logical deduction to which Einstein came in 1946 was that one of the following is false:
(i) the completeness of statistical QM
(ii) locality (that is, the postulates of relativity)
(iii) the independent reality of distant things."
 
  • #137
gill1109 said:
Please tell me your two rules, Bill. I am not familiar with Ballentine. Obviously I should be ... but sorry.

No need to be sorry :thumbs::thumbs::thumbs::thumbs:

It is, however, the BEST book on QM I know, fixing up many issues and misconceptions, and it is very well thought of by many who post here - not atyy though - he has issues with it - but it's best if he explains them.

Since your background is math, and mine is as well, I will build up to the two axioms in a slightly different way than Ballentine does.

First we need to define a Positive Operator-Valued Measure (POVM). A POVM is a set of positive operators Ei with ∑ Ei = I defined, for the purposes of QM, on an assumed complex vector space.

Elements of POVMs are called effects; an effect is a positive operator E with E ≤ I, i.e. all its eigenvalues lie in [0, 1].
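To make these definitions concrete, here is a small numerical illustration - an editor's sketch in Python/NumPy, not part of the original post: the qubit "trine" POVM, three non-projective effects that nevertheless sum to the identity, each with eigenvalues in [0, 1].

```python
# Editor's illustration (not from the post): the qubit "trine" POVM.
# Three effects E_k = (2/3)|psi_k><psi_k| built from states 60 degrees apart
# in Hilbert space; they sum to the identity, and each has eigenvalues in [0, 1].
import numpy as np

angles = [0.0, np.pi / 3, 2 * np.pi / 3]
psis = [np.array([np.cos(a), np.sin(a)]) for a in angles]
E = [(2.0 / 3.0) * np.outer(p, p) for p in psis]

assert np.allclose(sum(E), np.eye(2))            # sum E_k = I, so this is a POVM
for Ek in E:
    eigs = np.linalg.eigvalsh(Ek)
    assert np.all(eigs >= -1e-12) and np.all(eigs <= 1 + 1e-12)   # each E_k is an effect
print(np.round(E, 3))
```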

Now we can state the single foundational axiom QM is based on, in the way I look at it, which is a bit different from Ballentine, who simply states the axioms without discussing why they are true. It's interesting that it can be reduced to basically just one. Of course there is more to QM than just one axiom - but the rest follow in a natural way.

An observation/measurement with possible outcomes i = 1, 2, 3 ... is described by a POVM Ei such that the probability of outcome i is determined by Ei, and only by Ei, in particular it does not depend on what POVM it is part of.

Now I will invoke a very beautiful theorem, which is a modern version of a famous theorem you may have heard of called Gleason's, and will in fact prove it.

"Only by Ei" means that regardless of what POVM the Ei belongs to, the probability is the same. This is the assumption of non-contextuality and is the well-known rock-bottom essence of Born's rule via Gleason. The other assumption, not explicitly stated but used, is the strong law of superposition, i.e. in principle any POVM corresponds to an observation/measurement.

I will let f(Ei) be the probability of Ei. Obviously f(I) = 1, since {I} is a POVM containing only one element. Since {I, 0} is also a POVM, and f(I) = 1 regardless of which POVM it belongs to, we get f(0) = 0.

First additivity of the measure for effects.

Let E1 + E2 = E3, where E1, E2 and E3 are all effects. Then there exists an effect E with E1 + E2 + E = E3 + E = I, so that {E1, E2, E} and {E3, E} are both POVMs. Since f(E) is the same in both and the probabilities in each POVM sum to 1, f(E1) + f(E2) = f(E3).

Next, linearity with respect to the rationals - it's the usual standard argument from additivity in linear algebra, but I will repeat it anyway.

f(E) = f(n(E/n)) = f(E/n + ... + E/n) = n f(E/n), i.e. f(E/n) = (1/n) f(E). Similarly f((m/n)E) = f(E/n + ... + E/n) (m terms) = (m/n) f(E), provided m <= n so that we are still dealing with effects.

We now extend the definition to any positive operator E. If E is a positive operator, there exist an integer n and an effect E1 with E = n E1 (take n at least as large as the largest eigenvalue of E). Define f(E) = n f(E1). To show this is well defined, suppose n E1 = m E2. Then (n/(n+m)) E1 = (m/(n+m)) E2, so f((n/(n+m)) E1) = f((m/(n+m)) E2), i.e. (n/(n+m)) f(E1) = (m/(n+m)) f(E2), and hence n f(E1) = m f(E2).

From this definition it's easy to see that for any positive operators E1, E2 we have f(E1 + E2) = f(E1) + f(E2). Then, just as for effects, one shows that for any rational m/n, f((m/n) E) = (m/n) f(E).

Now we use continuity to extend this to the reals.

If E1 and E2 are positive operators, define E2 <= E1 to mean that a positive operator E exists with E1 = E2 + E. This implies f(E2) <= f(E1). Let r1n be an increasing sequence of rationals whose limit is the irrational number c, and let r2n be a decreasing sequence of rationals whose limit is also c. If E is any positive operator, then r1n E <= cE <= r2n E, so r1n f(E) <= f(cE) <= r2n f(E). Thus, by the pinching (squeeze) theorem, f(cE) = c f(E).

Extending it to any Hermitian operator H.

H can be decomposed as H = E1 - E2, where E1 and E2 are positive operators, for example by separating the positive and negative eigenvalues of H. Define f(H) = f(E1) - f(E2). To show this is well defined: if E1 - E2 = E3 - E4, then E1 + E4 = E3 + E2, so f(E1) + f(E4) = f(E3) + f(E2), i.e. f(E1) - f(E2) = f(E3) - f(E4). Actually there was no need to show uniqueness, because I could have defined E1 and E2 to be the positive operators obtained from separating the eigenvalues - but what the heck, it's not hard to show.

It's easy to show linearity with respect to the reals under this extended definition.

It's pretty easy to see the pattern here, but just to complete it I will extend the definition to any operator O. O can be uniquely decomposed as O = H1 + i H2, where H1 and H2 are Hermitian. Define f(O) = f(H1) + i f(H2). Again it's easy to show linearity with respect to the reals under this new definition, and then to extend it to linearity with respect to complex numbers.

Now the final bit. The hard part - namely linearity with respect to any operator - has been done by extending the f defined on effects. The well-known von Neumann argument can now be used to derive Born's rule, but for completeness I will spell out the detail.

First, it's easy to check that <bi|O|bj> = Trace(O |bj><bi|).

O = ∑ <bi|O|bj> |bi><bj| = ∑ Trace (O |bj><bi|) |bi><bj|

Now we use the linearity that the foregoing extensions of f have established.

f(O) = ∑ Trace (O |bj><bi|) f(|bi><bj|) = Trace (O ∑ f(|bi><bj|)|bj><bi|)

Define P as ∑ f(|bi><bj|)|bj><bi| and we have f(O) = Trace (OP).

P, by definition, is called the state of the quantum system. The following are easily seen. Since f(I) = 1, Trace(P) = 1, so P has unit trace. f(|u><u|) is a number >= 0, since |u><u| is an effect; thus Trace(|u><u| P) = <u|P|u> >= 0, so P is positive.

Hence a positive operator of unit trace P, the state of the system, exists such that the probability of Ei occurring in the POVM E1, E2 ... is Trace (Ei P).

To derive Ballentine's two axioms we need to define what is called a resolution of the identity, which is a POVM whose elements are mutually orthogonal projectors. Such measurements are called von Neumann observations. We know from the spectral theorem that a Hermitian operator H can be uniquely decomposed into a resolution of the identity, H = ∑ yi Ei. So, given any observation based on a resolution of the identity Ei, we can associate a real number yi with each outcome and uniquely define a Hermitian operator O = ∑ yi Ei, called the observable of the observation.

This gives the first axiom found in Ballentine - though the wording I will use is slightly different because of the way I have presented it, which differs from Ballentine's; e.g. he doesn't point out that he is talking about von Neumann measurements. Measurements in general are wider than that, although all measurements can be reduced to von Neumann measurements by considering a probe interacting with a system - but that is another story.

Axiom 1
Associated with each von Neumann measurement we can find a Hermitian operator O, called the observable of the observation, such that the possible outcomes of the observation are its eigenvalues yi.

Axiom 2 - called the Born Rule
Associated with any system is a positive operator of unit trace, P, called the state of the system, such that the expected value of the outcomes of the observation is Trace(PO).

Axiom 2 is easy to see from what I wrote previously: E(O) = ∑ yi probability(Ei) = ∑ yi Trace(P Ei) = Trace(PO).
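As a quick numerical sanity check of these two axioms - an editor's sketch, not part of bhobba's post - the following Python/NumPy snippet builds a random state P and a random observable O on a 3-dimensional space, forms the resolution of the identity from O's eigenvectors, and verifies that the probabilities Trace(P Ei) are non-negative, sum to 1, and reproduce Trace(PO) as the expectation value.

```python
# Editor's sketch (not from the post): numerical check of the two axioms above.
import numpy as np

rng = np.random.default_rng(0)
d = 3

# A random state P: positive operator with unit trace.
A = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
P = A @ A.conj().T
P = P / np.trace(P).real

# A random observable O = sum_i y_i E_i via its spectral decomposition.
B = rng.normal(size=(d, d)) + 1j * rng.normal(size=(d, d))
O = B + B.conj().T                                  # Hermitian
y, vecs = np.linalg.eigh(O)                         # eigenvalues y_i, eigenvectors |b_i>
E = [np.outer(vecs[:, i], vecs[:, i].conj()) for i in range(d)]   # resolution of the identity

probs = np.array([np.trace(P @ Ek).real for Ek in E])             # Trace(P E_i)
assert np.all(probs >= -1e-12) and np.isclose(probs.sum(), 1.0)   # a probability distribution
assert np.isclose((y * probs).sum(), np.trace(P @ O).real)        # E(O) = Trace(P O)
print(np.round(probs, 4), round((y * probs).sum(), 4))
```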

Now using these two axioms Ballentine develops all of QM.

A word of caution, however. Other axioms are introduced as you go - but they occur in a natural way. Schroedinger's equation is developed from probabilities being invariant between frames, i.e. the Principle of Relativity. That the state after a filtering-type observation is the corresponding eigenvector of the observable is a consequence of continuity.

Obviously a lot more can be said, but I will leave it for now - it's a lot to digest already.

Thanks
Bill
 
  • #138
@kith, I'm not sure if one can escape successive measurements in a relativistic context. If a pair of measurements is done simultaneously at spacelike separation in one frame, then the measurements will be non-simultaneous in a different frame.

Some examples regarding collapse and the relativity of simultaneity:
http://arxiv.org/abs/0706.1232
http://arxiv.org/abs/1007.3977
 
  • #139
kith said:
The state is present in all descriptions, collapse not.


Ballentine comments on this in chapter 9.3, where he criticizes different notions of collapse. He writes
"(ii) The observer causes the reduction process (9.9) when he reads the result of the measurement from the apparatus.
This is really just a variant of (i) with the observer, rather than the apparatus, causing the disturbance, and it is refuted simply by redefining (II) [the measurement apparatus] to include both the apparatus and the observer."

/edit: just to state this clearly - your notion of collapse implies that consciousness causes collapse?
So Ballentine does effectively have "collapse"; he just prefers not to name it, and to "hide" it in another postulate. He also has a cut, because he has observers.

/edit Sorry: just saw two more later posts. Important ones!
 
  • #140
gill1109 said:
So Ballentine does effectively have "collapse"; he just prefers not to name it, and to "hide" it in another postulate. He also has a cut, because he has observers.

Just to be sure: kith's comment is Ballentine's rejection of "consciousness causes collapse", which he specifically denies - as he should - since it leads to all sorts of unnecessary complications.

Ballentine specifically associates a state with preparation procedures right from the outset (specifically an ensemble of similarly prepared systems) - see section 2.1 - but he goes into it in more detail in Chapter 9, where kith got his info from. There he critiques other views, such as collapse and the objective reality of the state. I will also state from the outset that I do not agree with all his criticisms, e.g. he doesn't really understand Copenhagen - but that is a thread in its own right.

His view is that the state divides preparation procedures into equivalence classes, and the state is identified with those equivalence classes. For filtering-type observations (observations where the object is not destroyed, which are in fact state preparation procedures) a measurement simply changes the equivalence class the system belongs to. Of course the state changes, so in that sense collapse has occurred - but it doesn't mean anything: all you have done is change the preparation of the system, so obviously the state changes. It's simply because you have prepared it differently.

Thanks
Bill
 
  • #141
Nowadays there is a neat way to combine everything into one object, called an "instrument with settings". It's a black box which takes a classical input (the experimenter can turn a dial or press a button) and a quantum input (described by a density matrix). It has a classical output and a quantum output. We axiomatically state that a probabilistic mixture of quantum inputs is equivalent to an input of the corresponding mixture of density matrices. We assume that independent quantum inputs can be combined using the tensor product formalism. We argue that any instrument must be linear, normalized, and completely positive. It follows by Naimark's theorem that it has to have the Kraus representation form. Preparations are instruments with no inputs. Measurements are instruments with quantum input and classical output only. It's a theorem (called the dilation theorem) that every instrument can be realized by adding an independent auxiliary quantum input and then combining, in turn, unitary evolution, measurement of an observable with transformation of the state according to the Lüders-von Neumann collapse postulate, and finally possible discarding of (some parts of) the outputs.

So one can build the most general kinds of black boxes allowed by a few fundamental principles from "elementary boxes" for unitary evolution and von Neumann measurement, as long as one can also bring in auxiliary quantum systems.

This means that there are three equivalent ways to describe a quantum instrument
(1) by its properties of linearity, complete positivity, and normalization
(2) by Kraus representation (collection of matrices ...)
(3) as a combination of adding an ancillary quantum system, applying a unitary transformation to the composite system, performing a von Neumann measurement, and possibly discarding some parts of the quantum or classical output
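To illustrate description (2) - an editor's sketch, not part of the post above - here is the simplest nontrivial instrument, a projective qubit measurement in the computational basis, written as a pair of Kraus operators: the classical output is the outcome label k, obtained with probability Trace(Mk ρ Mk†), and the quantum output is the Lüders-von Neumann updated state.

```python
# Editor's sketch (not from the post): a projective qubit measurement as an instrument
# in Kraus form, rho -> {(p_k, rho_k)}: classical output k, quantum output rho_k.
import numpy as np

K = [np.diag([1.0, 0.0]).astype(complex),    # M_0 = |0><0|
     np.diag([0.0, 1.0]).astype(complex)]    # M_1 = |1><1|
assert np.allclose(sum(M.conj().T @ M for M in K), np.eye(2))     # normalization

def instrument(rho):
    """Return (probability, post-measurement state) pairs for each outcome."""
    results = []
    for M in K:
        p = np.trace(M @ rho @ M.conj().T).real                   # Born probability
        if p > 1e-12:
            results.append((p, (M @ rho @ M.conj().T) / p))       # Lüders-von Neumann update
    return results

plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())                                 # the pure state |+><+|
for p, post in instrument(rho):
    print(round(p, 3), np.round(post, 3))

# Non-selective version: sum_k p_k * rho_k is a completely positive, trace-preserving
# map, so no collapse is needed when the outcome is ignored.
```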
 
  • #142
Yes that is true.

And the dilation theorem is important in the way I tend to look at things - it's another way you get POVMs from resolutions of the identity.

However, I think the culmination of that way of thinking, i.e. probabilistically in instrumentalist terms, reached a fairly definitive expression in the work of Hardy:
http://arxiv.org/pdf/quantph/0101012.pdf

Basically QM is the most reasonable probabilistic model that allows continuous transformations between pure states, forbidden in ordinary probability theory.

The argument goes something like this. Suppose we have a system with 2 states represented by the vectors [0,1] and [1,0]. These states are called pure. They can be randomly presented for observation, and you get the vector [p1, p2], where p1 and p2 give the probabilities of observing the pure states. Such states are called mixed. Now consider the matrix A that, say after 1 second, transforms one pure state to the other, with rows [0, 1] and [1, 0]. But what happens when A is applied for half a second? That would be a matrix U with U^2 = A. You can work this out and, lo and behold, U is complex. Apply it to a pure state and you get a complex vector. This is something new. It's not a mixed state - but you are forced to it if you want continuous transformations between pure states.
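A quick numerical check of this argument - an editor's sketch, not bhobba's - computes a square root U of the swap matrix A and shows that its entries, and the "half-evolved" state, are necessarily complex.

```python
# Editor's sketch (not from the post): the matrix U with U @ U = A is complex.
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])                       # swaps [1,0] and [0,1] in one second
w, V = np.linalg.eigh(A)                         # eigenvalues -1 and +1 (A is symmetric)
U = V @ np.diag(np.sqrt(w.astype(complex))) @ V.conj().T   # one square root of A
print(np.round(U, 3))                            # entries 0.5 +/- 0.5j: necessarily complex
print(np.round(U @ U, 3))                        # recovers A
print(np.round(U @ np.array([1, 0], dtype=complex), 3))    # the "half-evolved" state
```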

QM is basically the theory that makes sense out of pure states that are complex vectors. There is really only one reasonable way to do it - via the Born rule (you make the assumption of non-contextuality, i.e. the probability is not basis dependent, plus a few other things there's no need to go into here) - as shown by Gleason's theorem.

Thanks
Bill
 
  • #143
vanhees71 said:
Again, I'm too naive to understand the necessity of a collapse at all! Take a "classical" situation of throwing a die. Without further knowledge about its properties, I use the maximum-entropy method to associate a probability distribution. Of course, the least-prejudice distribution in the Shannon-Jaynes information-theoretical sense is that the occurrence of any certain value is ##p_k=1/6##, ##k \in \{1,2,3,4,5,6\}##.

Now I throw the die once and get "3". Has my probability distribution now somehow collapsed to ##p_3=1## and all other ##p_k=0##? I don't think that anybody would argue in that way.

Right. But to push the analogy a little further, suppose that there were a pair of dice such that each die separately seemed to give a random number between 1 and 6, but when you compare the two results, you find that the numbers always add up to 7. I think the way people would reason about such a situation would be either to assume that the dice are not truly random---it's predetermined somehow what number will be rolled, even though we don't know how to calculate it---or to assume that there is a causal influence between the two dice. To use terminology from Bell, the first possibility would be a "hidden variables" assumption, while the second would be a "collapse" of the probability distribution. If one die stopped rolling with result 2, while the second die was still rolling, then you would no longer describe the second die as randomly selecting a number from 1-6; its probability distribution would have collapsed to the single possibility, 5.

So classically these kinds of correlations imply either hidden variables or some kind of nonlocal influence between the two events. In the first case, there is a kind of collapse that is purely subjective due to gaining more information about the situation. In the second case, the collapse really represents something physical happening.

The weird thing about QM is not so much the adjustment of probability in light of new information--that happens classically as well. The weird thing is that both of the explanations for the nonlocal correlations seem to be ruled out (the first by Bell's theorem, and the second by relativity).
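A small simulation of this correlated-dice analogy - an editor's sketch, not part of stevendaryl's post - shows that marginally each die looks like an ordinary fair die, while conditioning on the value of the first die collapses the (subjective) distribution of the second to a single value.

```python
# Editor's sketch: correlated dice that always sum to 7.
import numpy as np

rng = np.random.default_rng(1)
die1 = rng.integers(1, 7, size=100_000)          # first die, uniform on 1..6
die2 = 7 - die1                                  # second die, perfectly anti-correlated

# Marginally, die2 looks like an ordinary fair die ...
print(np.round(np.bincount(die2, minlength=7)[1:] / die2.size, 3))

# ... but given die1 == 2, its distribution "collapses" onto 5.
mask = die1 == 2
print(np.round(np.bincount(die2[mask], minlength=7)[1:] / mask.sum(), 3))
```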
 
  • #144
stevendaryl said:
The weird thing about QM is not so much the adjustment of probability in light of new information--that happens classically as well. The weird thing is that both of the explanations for the nonlocal correlations seem to be ruled out (the first by Bell's theorem, and the second by relativity).
Let's call it *wonderful*, not *weird*. We see from results like Hardy's that if we want various beautiful (and believable) things about nature to be true, and at the same time admit that some things are intrinsically discrete, then the only way to make both happen at once is through quantum theory, thereby necessarily discarding some other cherished intuitions about the world. Note: quantum theory makes some things possible which classically would be impossible (like violating Bell inequalities, for instance), but it pays for this by making other things impossible which classically would be possible. In particular, since the theory is fundamentally stochastic, there are lots of things which *can't* be done.

So quantum theory is *different* from classical physics, and therefore at first sight *weird*, but one can get used to it, and then it just becomes *wonderful*.
 
  • #145
gill1109 said:
Let's call it *wonderful*, not *weird*. We see from results like Hardy's that if we want various beautiful (and believable) things about nature to be true, and at the same time admit that some things are intrinsically discrete, then the only way to make both happen at once is through quantum theory, thereby necessarily discarding some other cherished intuitions about the world. Note: quantum theory makes some things possible which classically would be impossible (like violating Bell inequalities, for instance), but it pays for this by making other things impossible which classically would be possible. In particular, since the theory is fundamentally stochastic, there are lots of things which *can't* be done.

So quantum theory is *different* from classical physics, and therefore at first sight *weird*, but one can get used to it, and then it just becomes *wonderful*.

I'm not sure which result by Hardy you're talking about. You can certainly have discreteness without quantum mechanics (for example, cellular automata).
 
  • #146
gill1109 said:
So quantum theory is *different* from classical physics, and therefore at first sight *weird*, but one can get used to it, and then it just becomes *wonderful*.

That's pretty well it IMHO.

It's wonderful - but looked at the right way - very beautiful and actually reasonable after a fashion.

Thanks
Bill
 
  • #147
stevendaryl said:
I'm not sure which result by Hardy you're talking about. You can certainly have discreteness without quantum mechanics (for example, cellular automata).
Discrete outcomes combined with symmetry under rotations imply it has to be stochastic, because the probabilities of the discrete outcomes can vary continuously ...
 
  • #148
stevendaryl said:
Right. But to push the analogy a little further, suppose that there were a pair of dice such that each die separately seemed to give a random number between 1 and 6, but when you compare the two results, you find that the numbers always add up to 7. I think the way people would reason about such a situation would be either to assume that the dice are not truly random---it's predetermined somehow what number will be rolled, even though we don't know how to calculate it---or to assume that there is a causal influence between the two dice. To use terminology from Bell, the first possibility would be a "hidden variables" assumption, while the second would be a "collapse" of the probability distribution. If one die stopped rolling with result 2, while the second die was still rolling, then you would no longer describe the second die as randomly selecting a number from 1-6; its probability distribution would have collapsed to the single possibility, 5.

So classically these kinds of correlations imply either hidden variables or some kind of nonlocal influence between the two events. In the first case, there is a kind of collapse that is purely subjective due to gaining more information about the situation. In the second case, the collapse really represents something physical happening.

The weird thing about QM is not so much the adjustment of probability in light of new information--that happens classically as well. The weird thing is that both of the explanations for the nonlocal correlations seem to be ruled out (the first by Bell's theorem, and the second by relativity).

But the only thing Bell's theorem says is that nonlocal correlations can't be explained with a local hidden variables theory; it is agnostic about other explanations or hidden variables theories. So the first explanation (the dice not truly random, their correlation being due to a common nonlocal cause, for instance) of nonlocal correlations is not ruled out, as long as it doesn't assume locality.
 
  • #149
TrickyDicky said:
But the only thing Bell's theorem says is that nonlocal correlations can't be explained with a local hidden variables theory; it is agnostic about other explanations or hidden variables theories. So the first explanation (the dice not truly random, their correlation being due to a common nonlocal cause, for instance) of nonlocal correlations is not ruled out, as long as it doesn't assume locality.

But if the outcome is pre-determined, then there is no problem with locality. You only need nonlocality if the results are NOT predetermined.
 
  • #150
stevendaryl said:
But if the outcome is pre-determined, then there is no problem with locality. You only need nonlocality if the results are NOT predetermined.
That's where Bell enters, to tell you that there is indeed a problem with locality, so in the end you need nonlocality.
 
