A. Neumaier said:
I get easily both the standard probabilistic/statistical connection between theory and experiment in cases where it applies (namely for frequently repeated experiments), and the standard deterministic/nonstatistical connection between theory and experiment in cases where it applies (namely for experiments involving only macroscopic variables).
There is no need to assume a fundamental probabilistic feature of quantum mechanics, and no need to postulate anything probabilistic, since it appears as a natural conclusion rather than as a strange assumption about mysterious probability amplitudes and the like that must be put in by hand. Thus it is a significant conceptual advance in the foundations.
I understand QT as the present fundamental theory of matter (with the qualification that we don't have a fully satisfactory QT of gravitation), and thus it should explain both extreme cases you cite from one theory. For me the standard minimal interpretation, used to connect the theory with real-world observations/experiments, is very satisfactory, and its key feature is the probabilistic interpretation. It explains both the meaning of observations on microscopic objects and the quasi-deterministic behavior of macroscopic observables of macroscopic systems. In the latter case, the "averaging" (done in the microscopic case by repeating an experiment many times) is "done" by the measurement apparatus itself; it's a spatial and/or temporal average. All this is well described within the statistical interpretation of the state.
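As a rough sketch of why the same statistical rules give quasi-deterministic macroscopic behavior (the $\hat{a}_i$ here are just illustrative microscopic contributions, not anything specific from the discussion): for a macroscopic observable built as a spatial and/or temporal average over $N \gg 1$ microscopic degrees of freedom, the statistical operator $\hat{\rho}$ gives
$$\langle A \rangle = \mathrm{Tr}(\hat{\rho} \hat{A}), \qquad \hat{A}=\frac{1}{N}\sum_{i=1}^{N} \hat{a}_i,$$
and for approximately uncorrelated contributions the relative fluctuation is suppressed,
$$\frac{\Delta A}{|\langle A \rangle|} \sim \frac{1}{\sqrt{N}} \to 0 \quad (N \to \infty),$$
so a single measurement on the macroscopic system already yields the ensemble average to excellent accuracy.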
There is a very similar way to define such "averages" in classical electrodynamics applied to optics, where you define the apparently time-independent intensity of light in terms of the classical electromagnetic field by a temporal average. If you follow the history of QT, I think it is fair to say that Schrödinger's original thinking on the meaning of the wave function came via the analogy with this case. In optics you define the intensity of light as the energy density averaged over typical periods of the em. field (determined by the typical frequency of the emitted em. wave), and the averaged quantities are quadratic forms of the field, like the energy density itself,
$$\epsilon=\frac{1}{2} (\vec{E}^2+\vec{B}^2),$$
or the energy flow,
$$\vec{S}=c \vec{E} \times \vec{B}.$$
($\vec{S}$ is the em. energy per area and time; both expressions are in Heaviside-Lorentz units).
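As a minimal worked example of this averaging (an illustrative linearly polarized plane wave, with my own choice of amplitude $\vec{E}_0$): for $\vec{E}=\vec{E}_0 \cos(\omega t-\vec{k}\cdot\vec{x})$ with $|\vec{B}|=|\vec{E}|$, averaging over one period $T=2\pi/\omega$ uses $\langle \cos^2 \rangle_T = 1/2$, so
$$\langle \epsilon \rangle_T = \vec{E}_0^2\, \langle \cos^2(\omega t-\vec{k}\cdot\vec{x}) \rangle_T = \frac{1}{2}\vec{E}_0^2, \qquad I=|\langle \vec{S} \rangle_T| = \frac{c}{2}\vec{E}_0^2 = c\,\langle \epsilon \rangle_T.$$
The rapidly oscillating quadratic forms of the field thus average to the time-independent intensity one actually measures.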
Schrödinger originally thought of the wave function as a kind of "density amplitude" and its modulus squared as a density in a classical-field sense, but this was recognized pretty early on as a wrong interpretation and led to Born's probability interpretation, which is the interpretation considered valid today. I still don't understand why you deny the Born interpretation as a fundamental postulate about the meaning of the quantum state, because it satisfactorily describes both extremes you quote above (i.e., microscopic observations on few quanta and macroscopic systems consisting of very many particles, leading to classical mechanics/field theory as an effective description for the macroscopically relevant observables) and also the "mesoscopic systems" lying somehow in between (like quantum dots in cavity QED, ultracold rarefied gases in traps including macroscopic quantum phenomena like Bose-Einstein condensation, etc.).
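For concreteness (this is just the standard textbook statement of the rule): for a single particle with normalized wave function $\psi$, Born's interpretation reads
$$\rho(\vec{x},t)=|\psi(\vec{x},t)|^2, \qquad \int \mathrm{d}^3 x\, |\psi(\vec{x},t)|^2=1,$$
i.e., $|\psi|^2$ is a probability density for the outcome of a position measurement, not a matter or charge density. Schrödinger's original classical-field-like reading of $|\psi|^2$ already fails for $N>1$ particles, where $\psi$ lives on $3N$-dimensional configuration space rather than on ordinary space.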