Rade said:
I would describe this state as being "impossible" unless I was riding the horse inside the grocery store.
I know. But it is THE FOUNDING PRINCIPLE of quantum theory.
It is from this principle (the superposition principle) and the principle of complementarity that the entire machinery of quantum theory is built up.
So it would be totally crazy to deny the principle *in order to interpret quantum theory*. You might reject the principle, or limit its applicability (as Bohr did: only to the microscopic world). But doing so also means you kill off the entire formalism of quantum theory that was built upon it. Fine. Go ahead. But then come back with something to replace it that explains its empirical success.
It is a bit like having the principle of relativity (nature is the same for all inertial observers) - take it from Galileo or from Einstein - and clinging to the existence of an absolute space (such as in ether theory).
If you do so - which you can, of course - then the principle is dead. You can try to build something that is empirically equivalent, like introducing AD HOC rules for "length contraction" and "time dilation", but these are things introduced BY HAND. There's not one single reason for things to be so. For instance, length contraction and time dilation could work for matter made of the even-numbered elements of the periodic table, while failing for matter made of the odd-numbered elements. Nothing requires otherwise.
The power of physical theories has always been that some fundamental principles are taken as their foundation, and from there, everything is built up. For special relativity, these are the principle of relativity and the fact that the speed of light is the same for all observers. For general relativity, these are the equivalence principle and general covariance. And for quantum theory, this is the superposition principle (and the principle of complementarity).
The naturalness with which the formalism is deduced from these fundamental principles is much greater than when one has to start from an opposite paradigm and introduce a lot of tricks BY HAND to obtain the empirically verified formal results. This is, for instance, what happens in ether theory, or what happens in Bohmian mechanics when dealing with 1) relativity and 2) the very existence of the quantum formalism.
The entire machinery of the wavefunction is incorporated in BM *by hand*. It is not required by any fundamental principle, but things are put in this way so that it comes out the same *as if* the superposition principle were true (which it isn't, of course), and so that things come out *as if* the principle of relativity were true (which it isn't, of course).
Of course, all this is just an aesthetic judgment. Nature doesn't have to follow anything we desire. Solipsism is not falsifiable, and there's no obligation for causality either. Things can happen, and if we cannot make sense of them, because there is no deeper sense or no underlying principle, then so be it.
So whether or not we should require there to be a causal ontology, or even an ontology in the first place, or whether we should require some fundamental principles from which we can derive an ultimate theory of nature describing such an ontology, is an open question. After all, the only thing we can really do is find some regularities in our observations. All the rest is hypothesis. Nevertheless, the tradition of taking some basic principles and sticking to them has given us very impressive results in the past. Have we exhausted that line? Who can tell?
So, if we stick to the idea that founding principles of a theory are somehow universal and strict, then the superposition principle, on which all of quantum theory is constructed, tells us that we can be in a state made up of us being in the grocery store, and us riding a horse in the woods.
One doesn't realize from the start how totally crazy and mind-boggling the implication of this statement is when it is formally announced in quantum theory.
It is from this statement that it follows that quantum states span a Hilbert space, and that we should use linear operators on that space, for instance.
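To make that concrete, here is a minimal numerical sketch (the two-level system and the particular operator are my own illustrative assumptions, not anything specific from quantum theory's axioms beyond linearity): states are vectors, superpositions are linear combinations of them, and physical operators act linearly, so an operator applied to a superposition is the superposition of the operator applied to each component.

```python
import numpy as np

# Two basis states of a hypothetical two-level system, |0> and |1>,
# represented as vectors in a 2-dimensional Hilbert space.
ket0 = np.array([1.0, 0.0], dtype=complex)
ket1 = np.array([0.0, 1.0], dtype=complex)

# The superposition principle: any normalized linear combination
# of valid states is also a valid physical state.
psi = (ket0 + ket1) / np.sqrt(2)

# Observables and time evolution are linear operators on the space.
# Here: an arbitrary unitary (a Hadamard-like matrix), chosen only
# for illustration.
U = np.array([[1, 1],
              [1, -1]], dtype=complex) / np.sqrt(2)

# Linearity: acting on the superposition gives the same result as
# superposing the action on each component.
lhs = U @ psi
rhs = (U @ ket0 + U @ ket1) / np.sqrt(2)
assert np.allclose(lhs, rhs)
assert np.isclose(np.linalg.norm(psi), 1.0)  # state stays normalized
```

Nothing deep is happening in the code; the point is that once you grant "states form a vector space and evolution is linear", everything downstream inherits that linearity.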
Then one learns about Hermitian operators, eigenvalues, harmonic oscillators, unitary time evolution, etc... and after a long time, one starts to think again about the "measurement problem": this funny formalism that doesn't allow you to describe a measurement as a physical interaction described by a Hamiltonian, those remaining superpositions, and all that. And one blames the formalism of quantum theory, accuses it of "reifying the mathematics", and so on.
But it was put in from day 1! It was put in that the voltmeter could be in a state which is a superposition of "reading 5V" and "reading -2V" at the same time. That's simply the superposition principle, applied to a voltmeter. It isn't surprising, then, that this comes out of the formalism.
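The voltmeter point can also be sketched numerically. Below, a toy "measurement" is modelled as an ordinary unitary interaction coupling a two-level system to a two-state pointer (the CNOT-style coupling and the reading labels "5V"/"-2V" are illustrative assumptions of mine). If the measured system starts in a superposition, linearity alone forces the voltmeter to end up in a superposition of readings:

```python
import numpy as np

# Basis states of the measured system.
up = np.array([1, 0], dtype=complex)     # would make the meter read "5V"
down = np.array([0, 1], dtype=complex)   # would make the meter read "-2V"

# Pointer states of the voltmeter (labels are purely illustrative).
reads_5V = np.array([1, 0], dtype=complex)   # also the "ready" state here
reads_m2V = np.array([0, 1], dtype=complex)

# The measurement as a physical interaction: a unitary (CNOT-like)
# that flips the pointer if and only if the system is in |down>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

# System in a superposition, meter ready; joint state via tensor product.
psi_sys = (up + down) / np.sqrt(2)
initial = np.kron(psi_sys, reads_5V)

final = CNOT @ initial

# By linearity, the outcome is a superposition of
# (system up, meter "reading 5V") and (system down, meter "reading -2V").
expected = (np.kron(up, reads_5V) + np.kron(down, reads_m2V)) / np.sqrt(2)
assert np.allclose(final, expected)
```

No collapse rule was put in anywhere; the superposed meter state is exactly what the superposition principle, applied to the voltmeter, was always going to give.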
This has nothing to do with "taking the maths too seriously" (like using unphysical solutions to an equation or so).