# Experimental Tests of Projection Postulate

Nicky
Have there been any experiments designed to explicitly test the projection postulate? I mean that part of it that says the measured particle is left in an eigenstate of the measured operator.

The usual devices for measuring particles (photomultipliers, phosphor screens, etc.) don't really allow the postulate to be tested, since the measured particle is absorbed, not re-emitted in a perfect eigenstate. Are there other measurement methods that allow testing the projection postulate?

slyboy
You can't really "test" the postulate, since it is something that applies to some experiments and not others. For some experiments, such as Stern-Gerlach measurements, it works very well.

Quantum opticians use the term "non-demolition" to describe measurements where the state updates according to the projection postulate. There has been considerable work done on engineering these measurements in quantum optical systems. Just look up "non-demolition measurements" on the arXiv for further details.

Nicky
slyboy said:
You can't really "test" the postulate, since it is something that applies to some experiments and not others. For some experiments, such as Stern-Gerlach measurements, it works very well.

Even in a Stern-Gerlach apparatus, I would think there is no actual "measurement" until the electron actually hits a detector. Until that time it would be in a superposition of possible pathways through the magnets, wouldn't it?

statespace101
The perturbative effects of the magnetic field would reduce the state vector, I think. I've been wondering if the postulate could just be done away with by saying the operator itself is observable, but I'm not too sure whether this is correct. It seems to me that the operator has all the properties needed to be considered an observable, and if so, would the potentials therein be considered real? Then again, I'm not sure whether a von Neumann-type measurement would have a meaning in such a scenario, or whether reduction of the state vector would even have any conceptual meaning.

slyboy
Even in a Stern-Gerlach apparatus, I would think there is no actual "measurement" until the electron actually hits a detector. Until that time it would be in a superposition of possible pathways through the magnets, wouldn't it?

There is always ambiguity in exactly where one applies the measurement postulate, if at all (since you might be a fan of the many-worlds viewpoint).

In the case of Stern-Gerlach, it is clear that one can recombine the outputs and perform another experiment that shows it is still a coherent superposition. This is true of the vast majority of "nondemolition" measurements, since one typically needs a great deal of control over the measurement interaction in order to make the projection postulate applicable. Typically, the "measuring device" is actually another quantum system, which can itself be coherently manipulated. In quantum optics, it might be another photon, for example. The second system is then measured destructively, by absorption in a detector or something like that. The entire experiment could be reversed up to the point that the destructive measurement is made.

However, this might be true in principle even of more complicated measurements involving "macroscopic" measuring devices. It is just that the Hamiltonians required to do this are almost impossible to engineer in practice. Evidence that macroscopic superpositions are possible, in the experiments of Zeilinger for example, indicates that this might be true.

von Neumann emphasized that there is a great deal of ambiguity in exactly where the projection postulate is applied (at the level of quantum systems, macroscopic systems, the brain of the conscious observer, etc.). Generally, it is a mathematical idealisation that allows us to calculate what will happen in experiments without having to deal with complicated entangled states of macroscopic systems. However, the probabilistic aspect of QM has to be applied at some stage. Exactly where it has to be applied and what it means is one of the main topics of debate in the foundations of quantum theory.

Nicky
von Neumann emphasized that there is a great deal of ambiguity in exactly where the projection postulate is applied (at the level of quantum systems, macroscopic systems, the brain of the conscious observer, etc.). Generally, it is a mathematical idealisation that allows us to calculate what will happen in experiments without having to deal with complicated entangled states of macroscopic systems.

Is it generally accepted, then, that the projection postulate is not formally true, but is an approximation of the measuring device's unitary evolution, in the limit as the number of particles goes to infinity?

slyboy
Is it generally accepted, then, that the projection postulate is not formally true, but is an approximation of the measuring device's unitary evolution, in the limit as the number of particles goes to infinity?

That's a thorny question. Not much is generally accepted when it comes to the measurement part of quantum theory. If you ask most physicists then they will probably mumble something about decoherence, but there is no universally accepted answer to this.

What is true is that if there is an interaction such that the quantum state is an entangled superposition of two or more systems, and you can guarantee that the two branches will not interfere with one another due to the nature of the interaction Hamiltonians that exist, then, from the perspective of one of the systems, no distinction can be made between using the full unitary dynamics of the whole superposition and first applying the projection postulate to the other systems before continuing with the unitary dynamics.
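This equivalence can be checked numerically in a toy model. Below is a rough sketch of my own (not part of the original post, and the amplitudes and observable are arbitrary choices): a qubit entangled with a two-level "measuring device", where the expectation value of any observable on the system alone comes out the same whether we use the full entangled state or first apply the projection postulate to the device.

```python
import numpy as np

# Entangled state a|0>|d0> + b|1>|d1>, with orthogonal device states d0, d1.
a, b = 0.6, 0.8                      # arbitrary amplitudes, |a|^2 + |b|^2 = 1
psi = a * np.kron([1, 0], [1, 0]) + b * np.kron([0, 1], [0, 1])

O = np.array([[2.0, 0.7],
              [0.7, -1.0]])          # an arbitrary observable on the system alone

# (1) Full unitary picture: expectation value in the entangled state.
full = psi @ np.kron(O, np.eye(2)) @ psi

# (2) Projection-postulate picture: collapse the device first, then
#     average the branch expectation values with weights |a|^2, |b|^2.
up, down = np.array([1.0, 0.0]), np.array([0.0, 1.0])
collapsed = abs(a)**2 * (up @ O @ up) + abs(b)**2 * (down @ O @ down)

print(full, collapsed)   # both equal 0.08 for these numbers
```

The cross terms in the full calculation vanish precisely because the device states are orthogonal, which is the "branches do not interfere" condition in the post above.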

In my opinion, it doesn't have much to do with the limit of a large number of particles, because you can still imagine engineering an interaction that causes the two branches to interfere, even though it may be difficult in practice. On the other hand, macroscopic systems are more likely to cause decoherence under the Hamiltonians that typically exist in nature.

Even if you do accept that sort of answer, there are many unresolved issues. For example, how do quantum probabilities come about if the universe just consists of a massively entangled wavefunction with no collapses?

People who are bothered by this sort of question have proposed alterations to quantum mechanics that resolve the ambiguities, e.g. Bohmian mechanics and spontaneous collapse models. However, these theories have yet to be made fully compatible with relativity - indeed it is difficult to do so, because reproducing violation of the Bell inequalities means that they have to tackle nonlocality head on.

Homework Helper
Hmm, this is a testable notion:

If you run a stream of electrons through such a Stern-Gerlach separator-recombiner in an EPR-like setup, and then check correlations along a perpendicular axis, you should be able to see whether the results do or do not align.

If the correlation along the perpendicular axis is preserved, what happens if you put a 'nilpotent' destructive detector along the path of one of the beams? (By nilpotent I mean a destructive detector that should never detect anything.)

Staff Emeritus
Gold Member
For example, how do quantum probabilities come about if the universe just consists of a massively entangled wavefunction with no collapses?

Instead of saying, for instance, that there's a 50% chance of a particle being in a spin up state, you'd say that 50% of the states in the superposition correspond to spin up registering on the measuring device.

Instead of saying you're very likely to see about 50 spin-ups in 100 experiments, you'd say that most of the states in the superposition correspond to about 50 spin-ups detected in the 100 experiments.

Nicky
slyboy said:
What is true is that if there is an interaction such that the quantum state is an entangled superposition of two or more systems, and you can guarantee that the two branches will not interfere with one another due to the nature of the interaction Hamiltonians that exist, then, from the perspective of one of the systems, no distinction can be made between using the full unitary dynamics of the whole superposition and first applying the projection postulate to the other systems before continuing with the unitary dynamics.

What kind of interaction hamiltonians would prevent interference? Are they non-hermitian?

slyboy
What kind of interaction hamiltonians would prevent interference? Are they non-hermitian?

No, they are just the usual hermitian interaction hamiltonians that occur in nature. The main point is that if you have a macroscopic system, such as the pointer on a measuring device, it is likely to couple to environmental degrees of freedom (the em field, dust particles, etc.) very differently depending on its state in position space. Then, you would have to be able to control all of these environmental degrees of freedom on a quantum level in order to cause the different position states of the pointer to reinterfere. This is impossible in practice, so we can apply the projection postulate to make effective predictions once we know that the system has interacted with the measuring device.
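As a toy illustration of this point (my own sketch, not from the post): entangle a two-level "pointer" with an environment and trace the environment out. When the environment states associated with the two pointer positions are orthogonal, the off-diagonal (interference) terms of the pointer's reduced density matrix vanish.

```python
import numpy as np

def reduced_density(e0, e1):
    """Pointer state (|0>|e0> + |1>|e1>)/sqrt(2); trace out the environment."""
    psi = (np.kron([1, 0], e0) + np.kron([0, 1], e1)) / np.sqrt(2)
    rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)  # axes: (s, e, s', e')
    return np.trace(rho, axis1=1, axis2=3)               # partial trace over env

same = np.array([1.0, 0.0])
orth = np.array([0.0, 1.0])

# Identical environment states: full coherence, off-diagonal term is 1/2.
print(reduced_density(same, same)[0, 1])   # 0.5

# Orthogonal environment states: decohered, off-diagonal term is 0.
print(reduced_density(same, orth)[0, 1])   # 0.0
```

The second case is the situation described above: once the em field, dust particles, etc. "record" the pointer position, the interference terms are gone for all practical purposes.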

Instead of saying, for instance, that there's a 50% chance of a particle being in a spin up state, you'd say that 50% of the states in the superposition correspond to spin up registering on the measuring device. Instead of saying you're very likely to see about 50 spin-ups in 100 experiments, you'd say that most of the states in the superposition correspond to about 50 spin-ups detected in the 100 experiments.

Of course, you can say that, but the fact of the matter is that it appears to us that measurements have actual outcomes, rather than being terms in a superposition, so you have to explain why we have this experience.

More seriously, it works well for the situation that you describe, but what about unequal superpositions, e.g.

$\frac{1}{\sqrt{3}}| \mbox{up}_z \rangle | \mbox{measuring device registers up} \rangle + \frac{\sqrt{2}}{\sqrt{3}} | \mbox{down}_z \rangle | \mbox{measuring device registers down} \rangle$

There are only two terms in the superposition, so by your prescription the probabilities should be 50-50. However, the actual QM probabilities are 1/3 and 2/3. You have to explain why we can give a probability interpretation to the amplitudes of states, rather than just the number of terms.

Another problem is how you decide which basis it is OK to make the probability statement in. I could decompose the spin state in the x-basis, and then the relative states of the measuring device would be superpositions of the "registers up" and "registers down" states.

All these are problems that afflict any interpretation wherein QM is complete and the wavefunction is taken to be a literal specification of the state of reality, such as many worlds. I am not saying that these questions have no good answers, since the many-worlders have come up with several ingenious proposals (albeit proposals that are not universally accepted). The main point is just that there must be more to it than simply reading the probabilities directly from the wavefunction.
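The gap between branch counting and the Born rule is easy to make concrete. A small sketch of my own (using the amplitudes from the unequal superposition above): the Born weights are the squared moduli of the amplitudes, not the fraction of terms in the superposition.

```python
import numpy as np

# Amplitudes of the unequal two-branch superposition discussed above.
amps = np.array([1 / np.sqrt(3), np.sqrt(2) / np.sqrt(3)])

born = np.abs(amps) ** 2                      # Born rule: |amplitude|^2
counting = np.full(len(amps), 1 / len(amps))  # naive branch counting

print(born)      # [0.333..., 0.666...] -> 1/3 and 2/3
print(counting)  # [0.5, 0.5] -> the wrong prediction
```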

Nicky
slyboy said:
[...] The main point is that if you have a macroscopic system, such as the pointer on a measuring device, it is likely to couple to environmental degrees of freedom (the em field, dust particles, etc.) very differently depending on its state in position space. Then, you would have to be able to control all of these environmental degrees of freedom on a quantum level in order to cause the different position states of the pointer to reinterfere. This is impossible in practice, so we can apply the projection postulate to make effective predictions once we know that the system has interacted with the measuring device.

Is it fair to say, then, that the macroscopic pointer states still formally interfere with each other, but the interference effects are so minute that the pointer's behavior cannot be distinguished from classical behavior?

slyboy
Is it fair to say, then, that the macroscopic pointer states still formally interfere with each other, but the interference effects are so minute that the pointer's behavior cannot be distinguished from classical behavior?

Yes, pretty much. Have a look at Zurek's Physics Today article on the subject for more details.

Staff Emeritus
Gold Member
slyboy said:
More seriously, it works well for the situation that you describe, but what about unequal superpositions, e.g.

$\frac{1}{\sqrt{3}}| \mbox{up}_z \rangle | \mbox{measuring device registers up} \rangle + \frac{\sqrt{2}}{\sqrt{3}} | \mbox{down}_z \rangle | \mbox{measuring device registers down} \rangle$

There are only two terms in the superposition, so by your prescription the probabilities should be 50-50. However, the actual QM probabilities are 1/3 and 2/3. You have to explain why we can give a probability interpretation to the amplitudes of states, rather than just the number of terms.

This is indeed THE remark that kills off many "naive" statistical interpretations of the wavefunction, something that Everett and Co never really solved in a satisfactory way. The closest is Deutsch's "rational decider" argument, but even there, he needs additional "reasonable assumptions".

Another problem is how you decide which basis it is OK to make the probability statement in. I could decompose the spin state in the x-basis, and then the relative states of the measuring device would be superpositions of the "registers up" and "registers down" states.

All these are problems that afflict any interpretation wherein QM is complete and the wavefunction is taken to be a literal specification of the state of reality, such as many worlds. I am not saying that these questions have no good answers, since the many-worlders have come up with several ingenious proposals (albeit proposals that are not universally accepted). The main point is just that there must be more to it than simply reading the probabilities directly from the wavefunction.

I think that is very true, and MWI proponents (of which I'm in a way part, with caveats) cannot avoid ADDING extra hypotheses for the Born rule to emerge, no matter their repeated claims to the contrary. However, there's NOTHING WRONG with adding extra hypotheses, as long as they make sense. But I think extra hypotheses are in any case necessary in order to make the Born rule appear, the most important one being that ONLY ONE of the terms (in what basis?) is observed (with what probability?).
The difference with Copenhagen-style interpretations is that no physical process is responsible for a wavefunction collapse. It is here that Copenhagen-style interpretations "do not make much sense": by some magic, there are "measurement processes" in nature which "make the transition from the quantum to the classical world". Only, no such physical process is known (except gravity, perhaps - but this is clearly not explicitly stated in Copenhagen-style interpretations), while the processes happening in the measurement apparatus ARE known - otherwise we wouldn't know what the apparatus was measuring in the first place - and it is left very vague exactly where and how this transition is supposed to take place.

However, all this shouldn't stop you from using the projection postulate "FAPP" (for all practical purposes).

cheers,
Patrick.

Nicky
I am just a grad student, so my understanding of QM is still naive, but I am still not getting why the Born rule needs to be a fundamental postulate. Can't it just be viewed as a heuristic, an approximation to the hopelessly complicated unitary dynamics of a macroscopic system?

seratend
Patrick, the "relational" measurement paper I posted about at https://www.physicsforums.com/showthread.php?t=80769 claims to solve the basis choice problem. Could you maybe take a look at it and tell us what you think?

I have quickly read it. If I am not wrong, it deals only with the additional difficulty of time transformations due to Lorentz invariance or the equivalence principle of GR, but I think it does not solve anything for the measurement problem.
The collapse postulate just defines a property in a given reference frame for the whole system. We have to re-express the projector of this property if we change the frame of reference, in order to have the same property correctly expressed in both frames. Therefore, we still have the preferred-basis problem.

Seratend.

Staff Emeritus
Gold Member
Nicky said:
I am just a grad student, so my understanding of QM is still naive, but I am still not getting why the Born rule needs to be a fundamental postulate. Can't it just be viewed as a heuristic, an approximation to the hopelessly complicated unitary dynamics of a macroscopic system?

Simply said, no, for a very simple reason: let us call that very complicated UNITARY operator, U. If it is unitary, no matter how complicated, it is LINEAR.

Take the system under study in state |s1>, and your measurement apparatus in its pre-measurement state |M0>.
Now, if state |s1> always gives you outcome 17 on the measurement dial, then your hopelessly complicated U has at least the following property:

U |s1>x|M0> = |s_something> x |M-17>

If state |s2> always gives you an outcome 38 on the measurement dial,
then your U has at least also the property:

U |s2> x |M0> = |s_somethingelse> x |M-38>

Purely from the linearity follows then:

U (a |s1> + b |s2> ) x |M0> = a |s_something> x |M-17> + b |s_something_else> x |M-38>

and there's no way in which you can only get ONE of the terms, probabilistically. By linearity, you ALWAYS get BOTH terms.
Born's rule gives you ONE term, with a certain probability.

cheers,
Patrick.
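Patrick's linearity argument can be checked directly in a toy model. A sketch of my own (the pointer readings |M-17> and |M-38> become the two basis vectors of a two-level "device"; the interaction is a CNOT-style unitary chosen for illustration): applying the measurement unitary to a superposition leaves BOTH branches present, with the original amplitudes.

```python
import numpy as np

s1, s2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])   # system basis states
M0 = np.array([1.0, 0.0])                             # device "ready" state

# CNOT-style measurement interaction on system (x) device:
#   U |s1>|M0> = |s1>|M-17>,  U |s2>|M0> = |s2>|M-38>
# where |M-17> = (1,0) and |M-38> = (0,1) are the two pointer states.
U = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

a, b = 1 / np.sqrt(3), np.sqrt(2 / 3)                 # an unequal superposition
psi_out = U @ np.kron(a * s1 + b * s2, M0)

# Linearity: both terms survive with amplitudes a and b -- no collapse occurs.
print(psi_out)   # [a, 0, 0, b]
```

Unitarity never produces one term alone; selecting a single term with probability |a|^2 or |b|^2 is exactly the extra content of the Born rule.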

seratend
vanesch said:
Purely from the linearity follows then:

U (a |s1> + b |s2> ) x |M0> = a |s_something> x |M-17> + b |s_something_else> x |M-38>

and there's no way in which you can only get ONE of the terms, probabilistically. By linearity, you ALWAYS get BOTH terms.
Born's rule gives you ONE term, with a certain probability.

cheers,
Patrick.

You mean the projection postulate gives one term and the Born rule gives the probability :tongue2:

Seratend.

Staff Emeritus
Gold Member
seratend said:
You mean the projection postulate gives one term and the Born rule gives the probability :tongue2:

Eh yes.

slyboy
I think that is very true, and MWI proponents (of which I'm in a way part, with caveats) cannot avoid ADDING extra hypotheses for the Born rule to emerge, no matter their repeated claims to the contrary. However, there's NOTHING WRONG with adding extra hypotheses, as long as they make sense. But I think extra hypotheses are in any case necessary in order to make the Born rule appear, the most important one being that ONLY ONE of the terms (in what basis?) is observed (with what probability?).

I agree that extra hypotheses are needed, and that there is NOTHING WRONG with adding them in principle. However, I think we must be incredibly careful about just what kind of hypotheses we allow ourselves to add. This is because adding hypotheses about probability is very likely to constrain the possible interpretation we can give to quantum probabilities. For example, if we simply say that the probabilities ARE given by the Born rule, and leave it at that, then that strongly suggests that quantum probabilities MUST be taken to be objective probabilities of some sort.

As you probably know, philosophers, statisticians, economists, physicists, etc. have debated the interpretation of probability in the classical case ad infinitum, with no clear consensus having emerged. The three front-runners are the frequentist, propensity, and subjective interpretations of probability.

Now, my personal favourite is the subjective interpretation, but that is pretty much beside the point here. However, I am inclined to believe (hope, pray, etc.) that the interpretational problems of quantum theory can be resolved without having to fix on a particular interpretation of probability. If not, then I don't really see any hope of resolving them, since the debate about interpretation of probability is in many ways even more divisive than the debate about quantum mechanics.

Therefore, I would say that the only sort of hypotheses we should add in order to derive the Born rule are ones that can be formulated in all three interpretations of probability. In particular, hypotheses of the form "the probability is ..." should not be allowed because they cannot be formulated in the subjective interpretation of probability.

Nicky
vanesch said:
[...] Purely from the linearity follows then:

U (a |s1> + b |s2> ) x |M0> = a |s_something> x |M-17> + b |s_something_else> x |M-38>

and there's no way in which you can only get ONE of the terms, probabilistically. By linearity, you ALWAYS get BOTH terms.
Born's rule gives you ONE term, with a certain probability.

Thanks much, Patrick. Very clear explanation.

Staff Emeritus
Gold Member
slyboy said:
Therefore, I would say that the only sort of hypotheses we should add in order to derive the Born rule are ones that can be formulated in all three interpretations of probability. In particular, hypotheses of the form "the probability is ..." should not be allowed because they cannot be formulated in the subjective interpretation of probability.

I'm not sure about that (even convinced of the opposite, say :tongue2: ). In a MWI setting, where objectively, nothing is probabilistic and "everything" happens, the Born-style probabilities are only observer-related (and hence subjective). So what's wrong with: "and for the observer in that branch, the subjective probability is..." ? A bit in the style of the rational decider's probabilities in Deutsch's approach ?

cheers,
Patrick.

slyboy
I'm not sure about that (even convinced of the opposite, say ). In a MWI setting, where objectively, nothing is probabilistic and "everything" happens, the Born-style probabilities are only observer-related (and hence subjective). So what's wrong with: "and for the observer in that branch, the subjective probability is..." ? A bit in the style of the rational decider's probabilities in Deutsch's approach ?

That's not really a thoroughgoing subjective theory of probability. It gels with the Jaynesian view of probability perhaps, since there an agent's probability is determined by the objective information that the agent has, i.e. it is an objective fact about an agent's objective knowledge. In this case, the information that an agent has depends on which branch of the wavefunction they find themselves in, so perhaps everything works out OK.

However, in a more thoroughgoing subjective approach, there is simply no objective fact that constrains the probabilities that an agent should assign. It is a matter of their state of belief, rather than of their information or knowledge. In my opinion, this is the more justifiable subjective theory, since information and knowledge are nebulous concepts to define in this context, but belief has a well defined operational meaning via de Finetti's arguments. In this approach, things that are directly related to probability, such as quantum amplitudes, ought not to appear as part of the "objective state of reality", so this approach is thoroughly incompatible with many-worlds in any case, even without considering what additional hypotheses are needed to derive the Born rule.

seratend
slyboy said:
That's not really a thoroughgoing subjective theory of probability. It gels with the Jaynesian view of probability perhaps, since there an agent's probability is determined by the objective information that the agent has, i.e. it is an objective fact about an agent's objective knowledge. In this case, the information that an agent has depends on which branch of the wavefunction they find themselves in, so perhaps everything works out OK.

However, in a more thoroughgoing subjective approach, there is simply no objective fact that constrains the probabilities that an agent should assign. It is a matter of their state of belief, rather than of their information or knowledge. In my opinion, this is the more justifiable subjective theory, since information and knowledge are nebulous concepts to define in this context, but belief has a well defined operational meaning via de Finetti's arguments. In this approach, things that are directly related to probability, such as quantum amplitudes, ought not to appear as part of the "objective state of reality", so this approach is thoroughly incompatible with many-worlds in any case, even without considering what additional hypotheses are needed to derive the Born rule.

Waow slyboy, this is almost philosophy, where is the physics (the logical deductions and not the subjective or objective ones)?

Seratend.

Staff Emeritus
Gold Member
seratend said:
Waow slyboy, this is almost philosophy, where is the physics

I would like to object to the generally held belief in the physicist community that where philosophy starts, physics ends; this is a view held by "technocratic" physicists who have a caricatural view of what exactly "philosophy" is, which they erroneously seem to associate with futile discussions such as how many angels can dance on the head of a pin. I'd say that talking about foundational issues in physics necessarily touches upon philosophical arguments, in the same way as it touches upon logical (= part of philosophy!) and mathematical arguments.
I have to say that I was also part of the pooh-pooh crowd when philosophy came up (the Feynman lectures helped instill that wrong view - probably one of the rare critiques I have of them), until I read a bit about it. The funny thing is that *certain* ideas touching upon foundational issues in quantum theory are actually OLD STUFF in philosophy: the mind-brain problem, solipsism, objective ontology versus subjective perception, the hard problem of consciousness... all these things which, one way or another, have made their way into certain views on the foundations of quantum theory were in fact "old hat" to philosophers. This doesn't mean that philosophers know more about quantum theory than physicists, but at least we can learn something about those aspects from philosophers. Physicists often have this narcissism of reinventing what other fields have already done for a long time: Heisenberg reinvented linear algebra, Dirac reinvented Hilbert spaces, Murray Gell-Mann reinvented the representations of compact groups...
Sometimes it can be good to look over the fence, to look at what the neighbour is doing, without an a priori.

I think that nobody objects in discussions "degenerating" into pure logic or mathematics ; in the same way, one shouldn't object to discussions degenerating into some philosophy.

cheers,
Patrick.

seratend
Waow Patrick, this is a strong reaction!

I do not deny the use of philosophy, as long as we are able to associate mathematically well-defined objects with the terms/concepts used, for practical use in physics. Hence my remark above.

Seratend

Staff Emeritus
Gold Member
seratend said:
Waow Patrick, this is a strong reaction!

Didn't mean to be mean

My reply wasn't directed at you especially, I just objected to the often given out-of-hand reply that "this is philosophy" comes down to "shut up, you're talking bull****".

On the other hand, I agree with you that one should be careful about such discussions, which DO often degrade into mumbling empty statements.

Staff Emeritus
Gold Member
Dearly Missed
Patrick, the ignorance and caricature work the other way too. Very few philosophers have a correct understanding of what quantum mechanics says. They still think of the uncertainty principle in terms of Heisenberg's microscope.

Staff Emeritus
Gold Member
Patrick, the ignorance and caricature work the other way too. Very few philosophers have a correct understanding of what quantum mechanics says.

I know, I didn't mean to say that philosophers are better at physics than physicists. What I meant was, that when a question in physics touches upon a philosophical issue (such as the distinction between a subjective experience, and the hypothesis of an objective world etc...), this shouldn't be immediately shot down by "pooh, this is philosophy ! Where are the equations ?" ; and it might be interesting to see what philosophers have already said about related questions.
I think that the relationship between physics and philosophy resembles a bit that between physics and mathematics. Physicists are (usually) better at physics than mathematicians, but when physicists, in doing physics, touch upon a mathematical problem, it might be inspiring to go and look at what mathematicians have already done on the issue. That doesn't mean that the physics has to be left to the mathematicians, who will make a mess of it :-)
Now, before the foundational issues in QM, I think the piece of "philosophy" needed in physics was rather simple and intuitive, but the foundational issues in QM have given rise to much deeper questions, which sometimes HAVE already been studied for a few millennia. It is true that by training and group-think, many physicists feel uncomfortable with this matter - and sometimes a bit embarrassed; it doesn't fit the "macho" culture of hard math and "real" measurements somehow, and they fear being ridiculed by their peers. But it can be inspiring to learn at least what has been done on the subject and see if it can be useful. Without leaving the physics to philosophers.

slyboy
I would like to object to the generally held belief in the physicist community that where philosophy starts, physics ends

I agree. I would like to think that the interpretation of quantum mechanics will turn out to be physics (as well as being philosophy). By this I mean that it will lead to new ways of thinking about physics, which in turn will lead to new ways of extending it. Hopefully, this will turn out to be relevant, albeit indirectly, for coming up with the correct theory of quantum gravity.

Waow slyboy, this is almost philosophy, where is the physics (the logical deductions and not the subjective or objective ones)

One thing that philosophers are good at is separating out the individual problems that make up the complex issues we think about as physicists. Then, we can try to analyse and solve them one by one.

My main point is that the interpretational problems of quantum mechanics seem to be very closely tied to the problem of interpreting probability theory. I would like to separate them if possible, because trying to solve one hard problem is usually easier than trying to solve two simultaneously.

The subjective theory of probability presents a problem in this regard, because we are used to thinking of quantum probabilities as the objective predictions of the theory. However, according to the subjective theory, probabilities are just not the right sort of thing to appear at such a fundamental level.

On the other hand, subjective probability does not do away with objective facts entirely. Although, probabilities are not themselves objective, things such as the possible options that an agent has to decide between, and the possible events that can occur, are taken as objective facts.

In my view, any hypothesis used to derive the Born rule should be expressible in terms of the things that are taken to be objective in any theory of probability. In the subjective theory, this means thoroughly grounding things in decision theory.

If this cannot be done, then the hypothesis is not compatible with all the major interpretations of probability, and its proponents have to go and fight the battle about the interpretation of probability before they can convince everyone that it is the correct way to think about quantum theory.

In contrast, if the hypothesis does have an equivalent formulation in all the major interpretations of probability, then we can ignore the issue and just fight about quantum mechanics instead, which is what we wanted to do in the first place.

slyboy said:
The subjective theory of probability presents a problem in this regard, because we are used to thinking of quantum probabilities as the objective predictions of the theory. However, according to the subjective theory, probabilities are just not the right sort of thing to appear at such a fundamental level.

I think you cannot get around this: probabilities seem to be fundamental in quantum theory, and I'd say that if you have a view on probability that cannot take probabilities as fundamental quantities, then you'll always have a problem with QM in that view. I think you are very close to the views of Deutsch, no? I think that anyone who tries to do QM *without* somehow postulating that probabilities will appear will run in a circle, because it is the *only* thing that comes out of the wavefunction! After all, what else is a wavefunction good for? What does it mean to have a system in a state |psi> if there's no link to anything probabilistic? There's a paper by Adrian Kent critiquing the MWI which explains this very well (gr-qc/9703089). I think his critique is too harsh on the MWI, but it illustrates very well that *without* any probability postulate, QM is dead-empty. There is no "emergence" of probabilities simply because there's some vector wobbling in Hilbert space, if that's all there is. Let it wobble. What does it mean?

cheers,
Patrick.

slyboy
I think you are very close to the views of Deutsch, no ?

I don't think so, since I am not actually a fan of many-worlds. The main reason is that you have an object so closely related to probabilities appearing as the "state of reality", i.e. the wavefunction, and I don't think that it can ever be made compatible with all the theories of probability.

However, if one follows an interpretation in which the quantum state is epistemic, then there is not such a big problem. We can have different agents assigning different states to the same system, so it becomes much more like a probability distribution.

Of course, the big problem with this approach is to identify what the "states of reality" are in quantum mechanics if they are not the wavefunction. I believe that there are 4 main contenders for this:

1) There is no "state of reality" - This puts us close to a Copenhagen or an instrumentalist view of quantum mechanics. We have to give up a huge chunk of realism, which I would prefer not to do.

2) Hidden variable theories - The problem here is that in most of the viable contenders, such as Bohmian mechanics, the wavefunction still enters as part of the state of reality. There is currently no plausible hidden variable theory without this property, although one can construct toy theories. However, if we drop the equilibrium hypothesis, then the wavefunction is no longer directly related to probability distributions, so this might be a viable approach.

3) Quantum Logic - Here, the only thing that changes when you go from classical to quantum is the structure of events. Probabilities are introduced in exactly the same way as in the classical theory. Events are objective in all theories of probability, so it is compatible with propensities, frequencies and subjective approaches.

4) Miscellaneous proposals that have yet to be fully worked out.

seratend
slyboy said:
I don't think so, since I am not actually a fan of many-worlds. The main reason is that you have an object so closely related to probabilities appearing as the "state of reality", i.e. the wavefunction, and I don't think that it can ever be made compatible with all the theories of probability.

So you are saying that the Born rule is not a probability law? :tongue2:

For me, once you define a measure on a given set with a sigma algebra (the Borel sets, in the case of the observables) and the eigenvalue-outcome link, you have what I call formally a "classical" probability space. Why ask for more than this?
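As a minimal numerical sketch of this construction (the spin-1/2 case is my own illustrative choice, not from the thread): the spectral decomposition of an observable gives a set of outcomes (eigenvalues) and projectors onto eigenspaces, and the Born rule assigns each outcome a probability. The resulting assignment is non-negative and sums to one, i.e. it is formally a classical probability measure on the finite outcome set.

```python
import numpy as np

# Spin-1/2 example: measure S_x (in units of hbar/2) on a system prepared in |0>.
sx = np.array([[0, 1], [1, 0]], dtype=complex)  # Pauli X as the observable
psi = np.array([1, 0], dtype=complex)           # state vector |0>

# Spectral decomposition: eigenvalues are the possible measurement outcomes;
# the eigenvectors define the projectors of a projection-valued measure.
eigvals, eigvecs = np.linalg.eigh(sx)

# Born rule: P(outcome = lambda_i) = |<e_i|psi>|^2
probs = np.abs(eigvecs.conj().T @ psi) ** 2

# The outcome set with these weights is a classical probability space:
# each probability is >= 0 and they sum to 1.
for lam, p in zip(eigvals, probs):
    print(f"outcome {lam:+.0f}: probability {p:.3f}")
print("total:", probs.sum())
```

Here both outcomes come out at probability 1/2, since |0> is an equal superposition of the S_x eigenstates. Nothing beyond ordinary measure-theoretic probability is used once the outcomes and projectors are fixed, which is the point being made above.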

Seratend.