A POVM is fairly mundane stuff. One doesn't need to talk about cosmology, black holes, Heisenberg cuts or anything else. A POVM models measurements that have errors, thermal noise, false positives, dark counts, indirect measurements via coupling to an ancilla and so on. It's just, as vanhees71 said above, a more general description of measurements than PVMs.
Thus, regarding Couchyam's earlier statement: we don't need to consider randomised operators when discussing imprecise, noisy, etc. measurements. We just use POVMs.
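Just to make that concrete, here is a minimal sketch of a two-outcome POVM, assuming a single-photon detector with efficiency ##\eta## and dark-count probability ##p_d## (the notation is mine, purely for illustration):
$$E_{\text{click}} = p_d\,|0\rangle\langle 0| + \eta\,|1\rangle\langle 1|,\qquad E_{\text{no click}} = (1-p_d)\,|0\rangle\langle 0| + (1-\eta)\,|1\rangle\langle 1|.$$
Both effects are positive and sum to the identity, so they form a POVM, and the click probability in a state ##\rho## is ##\mathrm{Tr}(E_{\text{click}}\,\rho)##. Neither effect is a projector, which is exactly how finite efficiency and dark counts enter the description.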
vanhees71 said:
What's really not yet understood is quantum gravity, but that's a scientific and not a philosophical problem.
I agree.
My own take is that quantum theory was mostly sorted out conceptually by 1935. Heisenberg formulated the beginnings of the theory in 1925, but there were several conceptual points still to be cleared up. These included details like Mott's paper on how the straight tracks observed in a cloud chamber are compatible with wave mechanics, von Neumann putting the theory on a sound mathematical footing, entanglement being first articulated, scattering theory being properly understood, the realisation that non-commuting variables are not simply a matter of "disturbance", and so on.
What wasn't fully appreciated by 1935 were the deeper uses that could be made of entanglement, and why the collective coordinates of macroscopic bodies, e.g. the positions of planets or the motion of a car, obey classical probability theory.
Entanglement has since been much better understood. For the latter, we can now show how coarse-graining, decoherence, the exponential size of the Hilbert space and many other effects suppress coherence between the macro-coordinates of a typical large object to well below an inverse googolplex in magnitude.
Had there really been other problems, they would have shown up in formulating QFT. The actual issues there, however, were completely separate: correctly representing the Poincaré group on Hilbert spaces, renormalisation, the relation between particles and fields, and treating couplings to massless states (i.e. gauge theories).