vanhees71 said:
That confirms my (still superficial) understanding that now I'm allowed to interpret ##\hat{\rho}## and the trace operation as expectation values in the usual statistical sense,
There are two senses: one as a formal mathematical construct giving the quantum expectations, and
the other given by a theorem stating that when you perform actual measurements, the limit of the sample means agrees with these theoretical quantum expectations.
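To make the two senses concrete, here is a minimal numerical sketch (my own illustration with arbitrarily chosen matrices, not taken from the paper): for a qubit state ##\rho## and a Hermitian ##A##, the formal quantum expectation is ##\mathrm{Tr}\,\rho A##, while simulated measurements of ##A## give sample means that approach this value as the sample size grows.

```python
import numpy as np

# Arbitrary qubit density matrix rho and Hermitian observable A (illustrative values only).
rho = np.array([[0.7, 0.2 - 0.1j],
                [0.2 + 0.1j, 0.3]])
A = np.array([[1.0, 0.5],
              [0.5, -1.0]])

# Sense 1: the formal quantum expectation <A> = Tr(rho A).
formal_expectation = np.trace(rho @ A).real

# Sense 2: simulate many measurements of A and take the sample mean.
# A single measurement yields an eigenvalue a_k of A with probability <v_k|rho|v_k>,
# where v_k is the corresponding eigenvector.
eigvals, eigvecs = np.linalg.eigh(A)
probs = np.array([np.real(eigvecs[:, k].conj() @ rho @ eigvecs[:, k])
                  for k in range(len(eigvals))])

rng = np.random.default_rng(0)
samples = rng.choice(eigvals, size=100_000, p=probs)

print(formal_expectation)   # 0.6 for this particular choice of rho and A
print(samples.mean())       # close to 0.6; agreement improves with sample size
```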
vanhees71 said:
and that makes the new approach much more understandable than what you called before "thermal interpretation".
I derive the thermal interpretation from this new approach; see Section 7.3 of my paper. Consider the paper to be a much more understandable pathway to the thermal interpretation: in my book I still had to postulate many things without being able to derive them.
vanhees71 said:
I also think that the entire conception is not much different from the minimal statistical interpretation. The only change to the "traditional" concept seems to be that you use the more general concept of POVM than the von Neumann filter measurements, which are only a special case.
The beginnings are not much different, but they are already simpler than the minimal statistical interpretation, which needs nontrivial concepts from spectral theory and a very nonintuitive assertion called Born's rule.
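For readers less used to POVMs, here is a small self-contained sketch (my own illustration, not from the paper): a POVM is a family of positive semidefinite effects ##E_k## summing to the identity, the probability of outcome ##k## in state ##\rho## is ##\mathrm{Tr}\,\rho E_k##, and a von Neumann measurement is the special case where the ##E_k## are orthogonal projectors.

```python
import numpy as np

# Illustrative qubit state.
rho = np.array([[0.7, 0.2 - 0.1j],
                [0.2 + 0.1j, 0.3]])

# A simple non-projective POVM on a qubit: the "trine" POVM,
# three effects E_k = (2/3)|psi_k><psi_k| with real unit vectors 120 degrees apart.
angles = [2 * np.pi * k / 3 for k in range(3)]
effects = []
for a in angles:
    psi = np.array([np.cos(a), np.sin(a)])
    effects.append((2 / 3) * np.outer(psi, psi))

assert np.allclose(sum(effects), np.eye(2))   # completeness: sum_k E_k = identity

# Outcome probabilities p_k = Tr(rho E_k); they are nonnegative and sum to 1.
probs = [np.trace(rho @ E).real for E in effects]
print(probs, sum(probs))

# Von Neumann filter measurements are the special case of orthogonal projectors,
# here onto the computational basis; this reproduces the textbook Born probabilities.
P0 = np.diag([1.0, 0.0])
P1 = np.diag([0.0, 1.0])
print([np.trace(rho @ P).real for P in (P0, P1)])   # 0.7 and 0.3
```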
vanhees71 said:
The only objection I have is the statement concerning EPR. It cannot be right, because local realistic theories are not consistent with the quantum-theoretical probability theory, which is proven by the violation of Bell's inequalities (and related properties of quantum-mechanically evaluated correlation functions, etc) through the quantum mechanical predictions and the confirmation of precisely these violations in experiments.
Please look at my actual claims in the paper rather than judging from the summary in the Insights article! EPR is discussed in Section 5.4. There I claim elements of reality for quantum expectations of field operators, not for Bell-local realistic theories! Thus Bell inequalities are irrelevant.
vanhees71 said:
I take it that it is allowed also in your new conception to refer to ##\hat{\rho}## as the description of equivalence classes of preparation procedures (i.e., to interpret the word "quantum source" in the standard way)
No. A (clearly purely mathematical) construction of equivalence classes is not involved at all!
A quantum source is a piece of equipment emitting a beam: a particular laser, a fixed piece of radioactive material behind a filter with a hole, etc. Each quantum source has a time-dependent state ##\rho(t)##, which in the stationary case is independent of the time ##t##.
vanhees71 said:
All the quantum state implies are the probabilities for the outcome of measurements.
The quantum state implies known values of all quantum expectations (N-point functions). This includes smeared field expectation values that are (for systems in local equilibrium) directly measurable without any statistics involved. It also includes probabilities for statistical measurements.
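In formulas (standard definitions, written out only to fix the notation): the state ##\rho## determines every expectation ##\langle A\rangle := \mathrm{Tr}\,\rho A##; in quantum field theory this includes the N-point functions and the smeared field expectations,
$$\langle \phi(x_1)\cdots\phi(x_N)\rangle = \mathrm{Tr}\,\big(\rho\,\phi(x_1)\cdots\phi(x_N)\big), \qquad
\langle \phi(f)\rangle = \mathrm{Tr}\,\Big(\rho\int dx\, f(x)\,\phi(x)\Big),$$
and the probabilities of a statistical measurement arise as the special case ##\langle E_k\rangle = \mathrm{Tr}\,\rho E_k## for the effects ##E_k## of a POVM.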
vanhees71 said:
I think within your conceptual framework, "observable" takes a more general meaning as the outcome of some measurement device ("pointer reading"), definable in the most general sense as a POVM.
It takes a meaning independent of POVMs:
- In classical mechanics, observables are the classical phase space variables ##p,q## and everything computable from them, in particular the kinetic and potential energies, forces, etc.
- In quantum mechanics, observables are the quantum phase space variables, i.e., ##\rho## (or its matrix elements) and everything computable from them, in particular the N-point functions of quantum field theory. For example, 2-point functions are often measurable through linear response theory. A toy illustration of this classical/quantum parallel is sketched below.
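As a toy illustration of this parallel (my own sketch with arbitrarily chosen numbers and operators, not from the paper), one computes classical observables directly from ##p,q## and quantum observables, including a 2-point function, directly from ##\rho##:

```python
import numpy as np

# Classical side: observables are functions of the phase space variables p, q.
m, k = 1.0, 2.0          # mass and spring constant of a toy oscillator (illustrative)
p, q = 0.5, 1.2          # a point in phase space
kinetic = p**2 / (2 * m)
potential = 0.5 * k * q**2
force = -k * q
print(kinetic, potential, force)

# Quantum side: observables are rho (its matrix elements) and everything
# computable from it, e.g. expectations and 2-point correlations.
rho = np.array([[0.7, 0.2 - 0.1j],
                [0.2 + 0.1j, 0.3]])
A = np.array([[1.0, 0.5], [0.5, -1.0]])
B = np.array([[0.0, -1.0j], [1.0j, 0.0]])     # two illustrative Hermitian operators

one_point = np.trace(rho @ A)                 # <A>   = Tr(rho A)
two_point = np.trace(rho @ A @ B)             # <A B> = Tr(rho A B), a 2-point function
print(one_point.real, two_point)
```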