Interesting View Of Quantum Mechanics and The Measurement Problem

I have been going through the following interesting paper on the foundations of Quantum Mechanics:
http://arxiv.org/pdf/0911.0695v1.pdf

'We define the state of a system as that mathematical object from which one can determine the probability for any conceivable measurement. Physical theories can have enough structure that it is not necessary to give an exhaustive list of all probabilities for all possible measurements, but only a list of probabilities for some minimal subset of them. We refer to this subset as fiducial set. Therefore, the state is specified by a list of d (where d depends on dimension N) probabilities for a set of fiducial measurements: p = (p1, . . . , pd). The state is pure if it is not a (convex) mixture of other states. The state is mixed if it is not pure. For example, the mixed state p generated by preparing state p1 with probability λ and p2 with probability 1 − λ, is p = λp1 + (1 − λ)p2. When we refer to an N-dimensional system, we assume that there are N states each of which identifies a different outcome of some measurement setting, in the sense that they return probability one for the outcome. We call this set a set of basis or orthogonal states. Basis states can be chosen to be pure. To see this assume that some mixed state identifies one outcome. We can decompose the state into a mixture of pure states, each of which has to return probability one, and thus we can use one of them to be a basis state. We will show later that each pure state corresponds to a unique measurement outcome.'
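To make the fiducial-set idea concrete, here is a small numerical sketch (my own illustration, not from the paper) for the standard qubit case, where the fiducial set can be taken as the three 'up' probabilities along the x, y and z axes:

```python
import numpy as np

# Qubit (N = 2): take the fiducial set to be the "up" probabilities along
# x, y, z, so a state is the list p = (px, py, pz).  These relate to the
# Bloch vector by r_i = 2*p_i - 1, and a state is pure iff |r| = 1.

p_up   = np.array([0.5, 0.5, 1.0])   # pure: spin up along z
p_down = np.array([0.5, 0.5, 0.0])   # pure: spin down along z

lam = 0.3
p_mix = lam * p_up + (1 - lam) * p_down   # the convex mixture from the quote

def is_pure(p, tol=1e-9):
    r = 2 * p - 1                     # Bloch vector
    return abs(r @ r - 1.0) < tol

print(p_mix)                          # [0.5 0.5 0.3]
print(is_pure(p_up), is_pure(p_mix))  # True False
```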

After thinking about it, there seems to be an assumption being made here - namely, in associating a mixed state with an ensemble of other states, they are assuming that if the list of probabilities they call a state is exactly the same as that of such an ensemble, then it is to be interpreted that way. I can't see how this follows from the definitions they make; rather, it seems to be an assumption following from their first axiom, 'All systems of the same information carrying capacity are equivalent'. However, if that assumption is at the very foundations of QM, then decoherence solves the measurement problem. For it means the improper mixed state that decoherence transforms a superposition into must be interpreted as an actual ensemble - it's assumed in the foundations.
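To spell out what I mean numerically (a toy example of mine, nothing in the paper depends on it): after decoherence the improper mixture has exactly the same density matrix - and hence the same list of probabilities for every conceivable measurement - as a proper 50/50 ensemble:

```python
import numpy as np

ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)
psi = (ket0 + ket1) / np.sqrt(2)              # superposition (|0> + |1>)/sqrt(2)

rho_pure = np.outer(psi, psi.conj())          # density matrix of the superposition
rho_improper = np.diag(np.diag(rho_pure))     # decoherence kills the off-diagonals

rho_proper = 0.5 * np.outer(ket0, ket0.conj()) \
           + 0.5 * np.outer(ket1, ket1.conj())  # actual 50/50 ensemble

print(np.allclose(rho_improper, rho_proper))  # True - same state, by their definition
```

Since the state is defined purely by its list of probabilities, the two are literally the same mathematical object - which is exactly the point at issue.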

What do others think?

Thanks
Bill
 
Bhobba,

I haven't read into the paper as far as you have, but I hit a simpler puzzle about their axioms, i.e.,

Dakic+Brukner said:
Axiom 1. (Information capacity) An elementary system has the information carrying capacity of at most one bit. All systems of the same information carrying capacity are equivalent.

Axiom 2. (Locality) The state of a composite system is completely determined by local measurements on its subsystems and their correlations.

Axiom 3. (Reversibility) Between any two pure states there exists a reversible transformation.
They seem not to define "state", nor "pure state" before giving these axioms, which seems a tad sloppy.

Later they seem to assume that a state space is a vector space, but don't include this in their axioms.
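For what it's worth, in the ordinary Hilbert-space formulation Axiom 3 is easy to verify for a qubit. Here's a throwaway sketch (my own construction, not theirs) of a reversible (unitary) map between two arbitrary pure states:

```python
import numpy as np

def unitary_taking(psi, phi):
    """Unitary U with U @ psi = phi, for normalized qubit states."""
    # Complete each state to an orthonormal basis, then map basis to basis.
    psi_perp = np.array([-psi[1].conj(), psi[0].conj()])
    phi_perp = np.array([-phi[1].conj(), phi[0].conj()])
    V = np.column_stack([psi, psi_perp])
    W = np.column_stack([phi, phi_perp])
    return W @ V.conj().T

psi = np.array([1, 0], dtype=complex)                 # spin up
phi = np.array([1, 1], dtype=complex) / np.sqrt(2)    # spin along +x

U = unitary_taking(psi, phi)
print(np.allclose(U @ psi, phi))                # True
print(np.allclose(U.conj().T @ U, np.eye(2)))   # True: reversible
```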

Presumably a single electron at rest in the laboratory frame, in a spin eigenstate (with respect to some axis), qualifies as an "elementary system" in a pure state in their terminology? If so, then presumably a positron at rest in a similar eigenstate also qualifies?

But their later requirement that those reversible transformations be continuous is not satisfied for converting an electron into a positron (charge superselection), so something seems to be missing.

Or else such an electron or positron does not qualify as "elementary". But in that case, what does??

Probably, I need to read some of their references, such as

[23] A. Zeilinger, A Foundational Principle for Quantum Mechanics, Found. Phys. 29, 631 (1999).
 
strangerep said:
They seem not to define "state", nor "pure state" before giving these axioms, which seems a tad sloppy.

It is very sloppy indeed - I think you have to go through it a few times to see exactly what's going on. I am doing that right now. However, in its defense, I think it's meant to be read in conjunction with Lucien Hardy's paper, where he develops QM from 5 axioms, which I have gone through before:
http://arxiv.org/pdf/quant-ph/0101012v4.pdf

I don't see the issue with the positron at this stage - in QFT the basis states are all generated by the creation and annihilation operators, so in principle it looks true there.
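Roughly what I have in mind, in a truncated Fock space (a toy sketch of mine, not the paper's formalism) - the number basis is built up by repeatedly applying the creation operator to the vacuum:

```python
import numpy as np

dim = 5
a = np.diag(np.sqrt(np.arange(1, dim)), k=1)   # annihilation operator (truncated)
a_dag = a.conj().T                             # creation operator

state = np.zeros(dim); state[0] = 1.0          # vacuum |0>

# |n> is proportional to (a_dag)^n |0>; normalizing recovers each basis state
for n in range(1, 4):
    state = a_dag @ state
    state /= np.linalg.norm(state)
    print(n, np.real(state))                   # unit vector with a 1 in slot n
```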

Thanks
Bill
 