What I have in mind is some description of the measurement problem in terms of algebraic QFT, the latter being presented in the book by Haag that I already cited.
I just googled a thesis that tries to do so and looks interesting:
http://pub.uni-bielefeld.de/download/2303208/2303211

I think the point about the impossibility of writing down a wavefunction for the whole universe is illustrated by the following example (which completely lacks mathematical rigor):
Consider an infinite lattice of spins ##\sigma_i##. If ##\xi_i## is the wavefunction of spin ##i##, we could formally introduce the wavefunction of the whole universe as ##\prod_i \xi_i##; however, these functions are not well defined, as we have no means to define the convergence of the infinite product.
Assume now that all the ##\xi_i## are the same.
We could create a new wavefunction by applying the same unitary transformation ##U## to each of the ##\xi_i##:
##\xi'_i=U\xi_i##. Now ##|\langle \prod_i \xi_i|\prod_i \xi'_i \rangle|=\prod_i |\langle \xi_i|\xi'_i\rangle|=0##, since ##|\langle \xi_i|\xi'_i\rangle|<1## (as long as ##U## is not just a phase). Hence even infinitesimally transformed wavefunctions would be orthogonal to each other, and in fact no local operation acting on only a finite number of spins can transform one wavefunction into the other.
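To make the vanishing overlap concrete, here is a minimal numerical sketch (my own illustration, not part of the argument above; the angle ##\theta## and the spin-1/2 realization are assumptions chosen only for illustration): for a single spin-1/2 the overlap after a small rotation is ##\cos(\theta/2)<1##, so the overlap of the ##N##-site product states, ##\cos^N(\theta/2)##, dies off exponentially in ##N## and vanishes on the infinite lattice.

```python
# Minimal sketch: overlap of two product states of N spins after a small
# per-site rotation.  theta and the spin-1/2 realization are illustrative
# assumptions, not part of the original argument.
import numpy as np

theta = 0.01                                   # "infinitesimal" rotation angle
xi = np.array([1.0, 0.0])                      # single-site state |up>
U = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
              [np.sin(theta / 2),  np.cos(theta / 2)]])  # rotation about y
xi_prime = U @ xi

single_overlap = abs(np.vdot(xi, xi_prime))    # = cos(theta/2) < 1

for N in [10, 10**3, 10**5, 10**7]:
    print(f"N = {N:>8}   product overlap = {single_overlap**N:.3e}")
# The product overlap decays exponentially in N; on the infinite lattice it is
# exactly zero, and no operation on finitely many spins can undo this.
```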
These ill-defined states wouldn't even live in the same Hilbert space.
We now introduce the following operator: ##S=\lim_{V\to \infty} \frac{1}{V} \sum_{i \in V} \sigma_i##. This operator commutes with all operators constructed from a finite number of the ##\sigma_i##, which form an algebra. The operator ##S## is therefore a classical observable distinguishing the different universes.
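A one-line estimate (my own addition, not part of the original argument) shows why ##S## commutes with every such local operator: if ##A## acts only on a finite set ##F## of sites, then
$$\Big[\tfrac{1}{V}\sum_{i\in V}\sigma_i,\;A\Big]=\tfrac{1}{V}\sum_{i\in V\cap F}[\sigma_i,A],\qquad \Big\|\tfrac{1}{V}\sum_{i\in V\cap F}[\sigma_i,A]\Big\|\le \frac{2|F|\,\|\sigma\|\,\|A\|}{V}\;\longrightarrow\;0 \quad (V\to\infty),$$
so in the limit the commutator with any operator of the local algebra vanishes.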
##S## can be represented by a pure number, according to Schur's lemma, if and only if the representation of the algebra is irreducible; however, reducible representations are also possible. This is a shadow of the superposition principle.
Hence the notion of (ir)reducible representations of the algebra has replaced the ill-defined notion of superpositions of whole universes.
However, we have no means to decide whether we live in a reducible or an irreducible representation.
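In symbols (my own restatement of the last few sentences, nothing beyond them): in an irreducible representation Schur's lemma forces ##S=s\,\mathbf 1## for a single number ##s##, i.e. a definite classical value, while in a reducible representation such as ##\pi = \pi_s \oplus \pi_{s'}## one has
$$S=\begin{pmatrix} s\,\mathbf 1 & 0\\ 0 & s'\,\mathbf 1\end{pmatrix},$$
so ##S## is still central but has a nontrivial spectrum; the different blocks play the role of the different classical outcomes.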
We could now imagine that decoherence asymptotically transforms, in the limit ##t \to \infty##, an initial superposition ##\psi+\psi'## into a reducible representation, i.e. a mixture of the different possible classical outcomes.
The convergence in the limit ##t \to \infty## is typically very rapid, i.e. already after a very short time we can't distinguish a superposition from a statistical mixture of measurement outcomes, since our measurements have only finite precision.
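As a toy illustration of how fast this happens (my own sketch with purely hypothetical parameters, not a model from the thesis or from Haag): couple a system qubit in the superposition ##(|0\rangle+|1\rangle)/\sqrt{2}## to ##N(t)## environment spins, each of which picks up a small state-dependent rotation ##\theta##. The interference term of the reduced density matrix is then ##\tfrac12\cos^{N(t)}(\theta/2)##, which drops below any finite measurement precision very quickly.

```python
# Toy decoherence sketch.  theta (per-spin conditional rotation) and rate
# (environment spins entangled per unit time) are purely illustrative
# assumptions, not parameters from the original discussion.
import numpy as np

theta = 0.1      # conditional rotation angle per environment spin
rate = 100       # environment spins entangled with the system per unit time

def coherence(t):
    """Off-diagonal element |rho_01| of the system's reduced density matrix."""
    n_spins = int(rate * t)                  # N(t) grows linearly with time
    return 0.5 * np.cos(theta / 2) ** n_spins

for t in [0.0, 1.0, 10.0, 100.0, 1000.0]:
    print(f"t = {t:7.1f}   |rho_01| = {coherence(t):.3e}")
# The coherence decays exponentially; long before t -> infinity it lies far
# below any finite measurement precision, so the state is operationally
# indistinguishable from a statistical mixture of the two outcomes.
```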