# Interesting View Of Quantum Mechanics and The Measurement Problem

1. Mar 29, 2013

### Staff: Mentor

I have been going through the following interesting paper on the foundations of Quantum Mechanics:
http://arxiv.org/pdf/0911.0695v1.pdf

'We define the state of a system as that mathematical object from which one can determine the probability for any conceivable measurement. Physical theories can have enough structure that it is not necessary to give an exhaustive list of all probabilities for all possible measurements, but only a list of probabilities for some minimal subset of them. We refer to this subset as fiducial set. Therefore, the state is specified by a list of d (where d depends on dimension N) probabilities for a set of fiducial measurements: p = (p1, . . . , pd). The state is pure if it is not a (convex) mixture of other states. The state is mixed if it is not pure. For example, the mixed state p generated by preparing state p1 with probability λ and p2 with probability 1 − λ, is p = λp1 + (1 − λ)p2. When we refer to an N-dimensional system, we assume that there are N states each of which identifies a different outcome of some measurement setting, in the sense that they return probability one for the outcome. We call this set a set of basis or orthogonal states. Basis states can be chosen to be pure. To see this assume that some mixed state identifies one outcome. We can decompose the state into a mixture of pure states, each of which has to return probability one, and thus we can use one of them to be a basis state. We will show later that each pure state corresponds to a unique measurement outcome.'
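To make the fiducial-state idea concrete, here is a small numerical sketch (my own illustration, not from the paper): for a qubit (N = 2) the fiducial set can be taken as the "up" probabilities for spin measurements along x, y and z (so d = 3), and mixing is just the convex combination p = λp1 + (1 − λ)p2 of those probability lists. The particular vectors below are assumed for illustration only.

```python
import numpy as np

# Hypothetical fiducial representation of a qubit (d = 3):
# probabilities of the "up" outcome for spin measurements along x, y, z.
p1 = np.array([0.5, 0.5, 1.0])   # pure state: spin-up along z
p2 = np.array([0.5, 0.5, 0.0])   # pure state: spin-down along z

lam = 0.3
# The paper's mixture rule: p = lam * p1 + (1 - lam) * p2
p_mixed = lam * p1 + (1 - lam) * p2

print(p_mixed)  # [0.5 0.5 0.3]
```

Note the mixed state is again just a list of fiducial probabilities; nothing in the list itself records how it was prepared, which is exactly the point at issue below.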

After thinking about it, there seems to be an assumption being made here - namely, in associating a mixed state with an ensemble of other states, they are assuming that if the list of probabilities they call a state is exactly the same as that of such an ensemble, then it is to be interpreted that way. I can't see how it follows from the definitions they make; rather, it is an assumption following from their first axiom, 'All systems of the same information carrying capacity are equivalent'. However, if that assumption is at the very foundations of QM, then decoherence solves the measurement problem. For it means the improper mixed state that decoherence transforms a superposition into must be interpreted as an actual ensemble - it's assumed in its foundations.
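The improper/proper distinction can be checked numerically. In this sketch (again my own, not from the paper) a system in the superposition (|0⟩ + |1⟩)/√2 becomes entangled with an orthogonal environment pointer, and tracing out the environment yields a reduced density matrix identical to that of a genuine 50/50 ensemble - which is why the two cannot be distinguished by any measurement on the system alone, and why interpreting one as the other is an extra assumption.

```python
import numpy as np

up = np.array([1, 0], dtype=complex)
down = np.array([0, 1], dtype=complex)

# Decoherence schematically: |psi> = (|0>|e0> + |1>|e1>) / sqrt(2),
# with orthogonal environment states <e0|e1> = 0.
psi = (np.kron(up, up) + np.kron(down, down)) / np.sqrt(2)
rho_total = np.outer(psi, psi.conj())

# Partial trace over the environment -> the "improper" mixed state.
rho_sys = np.trace(rho_total.reshape(2, 2, 2, 2), axis1=1, axis2=3)

# "Proper" ensemble: |0> with probability 1/2, |1> with probability 1/2.
rho_ensemble = 0.5 * np.outer(up, up.conj()) + 0.5 * np.outer(down, down.conj())

print(np.allclose(rho_sys, rho_ensemble))  # True
```

The two density matrices agree element by element, so the probability lists for every conceivable measurement on the system agree too.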

What do others think?

Thanks
Bill

Last edited: Mar 29, 2013
2. Mar 29, 2013

### strangerep

Bhobba,

I haven't read into the paper as far as you have, but I hit a simpler puzzle with their axioms:

They seem not to define "state", nor "pure state" before giving these axioms, which seems a tad sloppy.

Later they seem to assume that a state space is a vector space, but don't include this in their axioms.

Presumably a single electron at rest in the laboratory frame, in a spin eigenstate (wrt some axis), qualifies as an "elementary system" in a pure state in their terminology? If so, then presumably a positron at rest in a similar eigenstate also qualifies?

But their later requirement that those reversible transformations be continuous is not satisfied for converting an electron into a positron (charge superselection), so something seems to be missing.

Or else such an electron or positron does not qualify as "elementary". But in that case, what does??

Probably, I need to read some of their references, such as

[23] A. Zeilinger, A Foundational Principle for Quantum Mechanics, Found. Phys. 29, 631 (1999).

3. Mar 29, 2013

### Staff: Mentor

It is very sloppy indeed - I think you have to go through it a few times to see exactly what's going on. I am doing that right now - however, in its defense, I think it's meant to be read in conjunction with Lucien Hardy's paper, where he develops QM from 5 axioms, which I have gone through before:
http://arxiv.org/pdf/quant-ph/0101012v4.pdf

I don't see the issue with the positron at this stage - in QFT the basis states are all generated by the creation and annihilation operators, so in principle it looks true there.

Thanks
Bill