A statistical ensemble interpretation done right

  • #151
E.g., why don't you accept the standard answer to the question of how the "classical behavior" of macroscopic systems is understood by the practitioners of the field (e.g., condensed-matter physicists)? What's not satisfactory for you? Why do you think we must still refer to the hand-waving arguments of 80 years ago, like a "quantum-classical cut" or "collapse of the state", etc.?
 
  • #152
vanhees71 said:
The point is that we got totally off topic, although it's fully clear how I meant the term "ensemble" here. Thanks for giving the simple answer. I'll try to call it "statistical sample" in this forum from now on, and we can get back to the really interesting discussions.
I added to my answer an authoritative quote from Landau and Lifschitz.
 
  • #153
vanhees71 said:
why don't you accept the standard answer to the question of how the "classical behavior" of macroscopic systems is understood by the practitioners of the field
It's not just me; there is a whole community of physicists who think that the "standard answer", while it is, as I said, fine and workable in a practical sense, does not actually resolve the measurement problem at a foundational level.

vanhees71 said:
Why do you think we must still refer to the hand-waving arguments of 80 years ago, like a "quantum-classical cut" or "collapse of the state", etc.?
The community of physicists I just referred to is not using hand-waving arguments of 80 years ago. They are looking at the most up-to-date developments in, for example, decoherence theory. And, as I said, they do not think that all those developments have solved the measurement problem.

You can say you disagree with them, but you cannot say their viewpoint doesn't exist.
 
  • Like
Likes vanhees71 and gentzen
  • #154
I don't say their viewpoint doesn't exist, although I don't know that any of my colleagues would think that there's a measurement problem. I may be too naive, but indeed, I don't understand where there is a problem, because QT from the very beginning was very successful in quantitatively describing the phenomena, starting from the black-body spectrum (Planck 1900), the spectra of atoms (Pauli 1925/Schrödinger 1926 for the hydrogen atom and quickly also of many-electron atoms), cross sections of scattering processes (Born 1926), etc. etc.
 
  • Like
Likes Lord Jestocost
  • #155
vanhees71 said:
I don't say their viewpoint doesn't exist, although I don't know that any of my colleagues would think that there's a measurement problem.
So because you don't personally know anyone who thinks it's a problem, you don't see a problem.

vanhees71 said:
I don't understand where there is a problem
Yes, we know that. But other people think there is one, even if they can't explain why to your satisfaction. And you have given no argument at all about why there isn't one; you have simply asserted without argument that the practical methods you describe are, in your opinion, good enough. Yes, we know that's your opinion. But there's no way to have a productive discussion on that basis. So when other people want to discuss what they see as a measurement problem, it does not help at all for you to jump in for the umpteenth time and assert that you don't think there is one. That adds no value.
 
  • #156
My argument is that QT precisely describes what's observed, and that you can't expect more from the natural sciences than precisely describing what's observed. The discussions about this topic are so unproductive because it's not clear what's lacking. You also don't tell us what precisely you think QT is lacking!
 
  • Skeptical
  • Like
Likes weirdoguy and Lord Jestocost
  • #157
vanhees71 said:
The discussions about this topic are so unproductive
Speak for yourself. If you seriously can't see any productive use for such discussions, why do you keep posting in them and hijacking them?

vanhees71 said:
You also don't tell us what precisely you think QT is lacking!
That's because, as I posted just now in the other thread on Gleason's Theorem, where we are having a similar exchange, I have concluded that doing so with you is a waste of my time.
 
  • Like
  • Sad
Likes weirdoguy and Lord Jestocost
  • #158
vanhees71 said:
you can't expect more from the natural sciences than precisely describing what's observed.
From the perspective of theory evolution, the problem I have with this pragmatic perspective is that you always end up with an effective theory that is experimentally fine-tuned. That gives you a description, but very little explanation, meaning it's hard to see the pointers forward.

For me, the better the explanation, the less fine-tuning you need. Explanation to me means showing how all the things that seem tuned really are related.

Among the things that are fine-tuned in QM/QFT is, for example, the 4D spacetime continuum background. Here I am sure we disagree, but I personally associate this background structure with the "macroscopic classical environment" that to Bohr is the "observer". This is a connection between the dynamics of the spacetime background and the foundations of QM (the dynamics of observers). Here, the fact that there are no "infinite ensembles" in nature, and that the ensemble is a fiction, parallels the fact that a fixed spacetime background is also a fiction.

But you denied this connection, as do many others, and maybe on reasonable grounds: the experimental signs of this are out of reach. But could fine-tuning problems be a symptom of this? One can of course hold that naturalness is not a must.

/Fredrik
 
  • Like
Likes gentzen and vanhees71
  • #159
Fra said:
From the perspective of theory evolution, the problem I have with this pragmatic perspective is that you always end up with an effective theory that is experimentally fine-tuned. That gives you a description, but very little explanation, meaning it's hard to see the pointers forward.
Sure, we don't have a theory of everything, and all theories we have so far are as you describe: On a fundamental basis we have the Standard Model of elementary particle physics with 20+x free parameters, which have to be determined by experiment. This, to the dismay of most physicists looking for "physics beyond the Standard Model", describes all known matter in terms of quarks, leptons, gauge bosons, and the Higgs boson.

Then we have General Relativity, which provides the space-time model and describes the gravitational interaction on the level of a classical field theory. The space-time model (usually approximated as Minkowski space, one solution of the Einstein Equations for an "empty" universe) also to a large extent determines what the dynamics of the Standard Model looks like. For that you only need one more parameter, the universal coupling between the "matter energy-momentum tensor" and the gravitational field.
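For concreteness, that single extra coupling is the constant in the Einstein field equations (written here without a cosmological constant):

$$G_{\mu\nu} = \frac{8\pi G}{c^4}\, T_{\mu\nu},$$

where ##G_{\mu\nu}## is built from the metric and ##T_{\mu\nu}## is the matter energy-momentum tensor; Newton's constant ##G## is the one additional parameter beyond those of the Standard Model.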

This together is the effective theory of contemporary physics. It's pretty clear that it's preliminary, as all physical theories we know have always been. With new empirical evidence, maybe one day we'll get a more complete new theory "beyond standard physics". That's how the natural sciences make progress in gaining knowledge about the objective properties of Nature. Philosophy is "incomprehensibly ineffective", as Weinberg once put it.
Fra said:
For me, the better the explanation, the less fine-tuning you need. Explanation to me means showing how all the things that seem tuned really are related.
That's indeed a very common opinion. The hope is that with new theories we find new relations of the kind you imply.
Fra said:
Among the things that are fine-tuned in QM/QFT is, for example, the 4D spacetime continuum background. Here I am sure we disagree, but I personally associate this background structure with the "macroscopic classical environment" that to Bohr is the "observer". This is a connection between the dynamics of the spacetime background and the foundations of QM (the dynamics of observers). Here, the fact that there are no "infinite ensembles" in nature, and that the ensemble is a fiction, parallels the fact that a fixed spacetime background is also a fiction.
Indeed, that's the really big fundamental question, and not fruitless philosophical quibbles about the result of 20th-century physics that Nature is inherently random and to be described by QT rather than classical deterministic models.
Fra said:
But you denied this connection, as do many others, and maybe on reasonable grounds: the experimental signs of this are out of reach. But could fine-tuning problems be a symptom of this? One can of course hold that naturalness is not a must.
I never denied these obvious facts. How do you come to that conclusion? What I deny is the assumption that one makes progress by philosophical speculations about how Nature should behave rather than finding solid empirical hints to find a new ansatz for a new, more comprehensive theory, e.g., discovering new particles to get an idea of how dark matter might be described, or a hint of how to find a satisfactory QT description of the gravitational interaction, maybe implying that the classical space-time description is also an emergent phenomenon, as the classical behavior of macroscopic objects is from the point of view of quantum many-body physics.

Particularly, I don't believe that we find a solution of these problems by thinking about a "measurement problem". For that, QT is simply too successful in describing all empirical facts within the above-described realm of applicability (everything except gravitation and spacetime).
 
  • #160
Fra said:
Among the things that are fine-tuned in QM/QFT is, for example, the 4D spacetime continuum background. Here I am sure we disagree, but I personally associate this background structure with the "macroscopic classical environment" that to Bohr is the "observer". This is a connection between the dynamics of the spacetime background and the foundations of QM (the dynamics of observers). Here, the fact that there are no "infinite ensembles" in nature, and that the ensemble is a fiction, parallels the fact that a fixed spacetime background is also a fiction.
I don't understand this association. A quantum theory of gravity/spacetime would be equally subject to various interpretations and discussions about observers, closed systems, etc., as quantum mechanics is, since you would still presumably have a noncommutative algebra of observables.
 
  • #161
vanhees71 said:
I never denied these obvious facts. How do you come to that conclusion?

Because you often finish in this way :)
vanhees71 said:
(everything except gravitation and spacetime).

It's because, while you agree that we have no unified theory yet, you seem to pragmatically categorize any attempt to analyse the structure of theories, and how different theories may be related in a bigger theory space (i.e., beyond what they simply predict), as fruitless philosophy.

In particular, in discussions about the "foundations of QM", you don't see any problems, because the subtle issues of conceptual and logical coherence in reasoning do not immediately manifest themselves as observable deviations today.

I admit that I like your pragmatic view; the empirical stance is a very important thing even in how I think of this. But I find that you are a bit too pragmatic, to the point where you reject things that are admittedly a bit fuzzy. But to me, the process of inquiry IS fuzzy.

I also agree that it often happens that things get too fruitless, even for me. For example, "interpretations" that have no aspiration to make a difference even in the future, or that give no insight into open problems: those discussions don't interest me. But I don't think that means one has to be either-or. I think one can manage a balance.
vanhees71 said:
What I deny is the assumption that one makes progress by philosophical speculations about how Nature should behave rather than finding solid empirical hints
vanhees71 said:
Philosophy is "incomprehensibly ineffective", as Weinberg once put it.
I rather see it this way: the rate at which we find new empirical hints will increase if we know precisely where to look. And the question then is: what clues do we have from where we are? This is what this is all about for me. Is continuing to spend money on increasing accelerator energies the only way forward? I am not convinced; are you?

Your pragmatism seems to work like a noise filter that rejects some of the clues we get from analysing the structure of the theory and seeing on what ground it rests (premises, axioms, implicit prior information, etc.).
vanhees71 said:
Particularly, I don't believe that we find a solution of these problems by thinking about a "measurement problem". For that, QT is simply too successful in describing all empirical facts within the above-described realm of applicability
Exactly, which is to me another way of saying: as long as we ignore the difference between finite and infinite "observers" or "ensembles". Your arguments are clear to me, and you are consistent, so I think I understand your perspective. I prefer to keep looking for clues, where you seem to "wait for more data". Aren't the fine-tuning and the lack of even a coherent GUT enough food for thought? Do we need more data to realize that we have no clear understanding of how one effective theory merges into another over the energy ranges? That seems to be a conclusion you can draw from looking at the theory itself. The problem isn't Nature; the problem is our theories. We can see it already now, I think.

/Fredrik
 
  • Like
Likes Lord Jestocost
  • #162
I don't wish to divert the thread to elaborate my own associations, as that wasn't the point. My main point was that I, at least, do see a link between the foundations of QM and the foundations of the future theory. And I tried to steer away from twisting words toward the more interesting discussions.

Morbert said:
I don't understand this association. A quantum theory of gravity/spacetime would be equally subject to various interpretations and discussions about observers, closed systems, etc., as quantum mechanics is, since you would still presumably have a noncommutative algebra of observables.
But very briefly:

Your comment here, to me, suggests that you see QG as trying to understand what would happen if we could produce and observe black holes at accelerators; then yes, it would still be relative to the background spacetime where the lab is, but in a way that would also reproduce GR in the low-energy limit. That is, for example, the thinking used in string theory.

But there is another logically possible perspective, which relates more to fine-tuning and how theories scale. You can ask: what theory OTHER than a QFT describes the effective theories of inside observers, in a way that reproduces QFT in the limit of an infinitely massive observer and yields gravity for observers with finite mass? This would then introduce "interactions" between two observers of finite mass (this is the association to finite ensembles).

/Fredrik
 
  • #163
vanhees71 said:
you can't expect more from the natural sciences than precisely describing what's observed.
You don't expect more, but many others (including myself) expect more, namely to have a mathematically coherent explanation of the measurement process in terms of the fundamental dynamics realized in the universe.

I collected here (pp.5-7) a large number of quotes from very influential physicists of the past and the present, indicating that there is more to be expected.
 
Last edited:
  • Like
Likes DrChinese, dextercioby, gentzen and 3 others
  • #164
Sure, it's always nice to understand the observations from descriptions as close to first principles as possible, and I've no doubt that there is a lot to be learnt about the interaction of "quantum systems" with "macroscopic measurement devices" in greater detail than we know today. But I don't think there'll be much more to be learnt concerning at least the rough picture we have today, i.e., the description of the macroscopic measurement device and the probability distributions of the measurement outcomes by some effective "open quantum system" description.
 
  • Like
Likes Lord Jestocost
  • #165
vanhees71 said:
I don't think there'll be much more to be learnt concerning at least the rough picture we have today,
But rough means refinable. Interpretation questions become relevant when one investigates the possibilities for refinement.
 
Last edited:
  • Like
Likes lodbrok, vanhees71 and Lord Jestocost
  • #166
I think it's a tough mathematical problem rather than a matter of philosophical pondering about "interpretation".
 
  • #167
vanhees71 said:
I think it's a tough mathematical problem rather than a matter of philosophical pondering about "interpretation".
Finding the right framework in which to solve tough mathematical problems that have been unsolved for years in spite of many attempts usually involves much philosophical pondering about the "interpretation" of the problem!

The philosophical part goes away only after the problems have been solved.
 
  • Like
Likes Lynch101, lodbrok, gentzen and 2 others
  • #168
A. Neumaier said:
Finding the right framework in which to solve tough mathematical problems that have been unsolved for years in spite of many attempts usually involves much philosophical pondering about the "interpretation" of the problem!

The philosophical part goes away only after the problems have been solved.
In your quantum tomography paper you say "A suggestive notion for what constitutes a quantum detector and for the behavior of its responses leads to a logically impeccable definition of measurement.". Is it your position that the thermal interpretation is a solution to the mathematical problems of measurement? Or more specifically, that it offers "a mathematically coherent explanation of the measurement process in terms of the fundamental dynamics realized in the universe"?
 
  • #169
Morbert said:
In your quantum tomography paper you say "A suggestive notion for what constitutes a quantum detector and for the behavior of its responses leads to a logically impeccable definition of measurement.". Is it your position that the thermal interpretation is a solution to the mathematical problems of measurement? Or more specifically, that it offers "a mathematically coherent explanation of the measurement process in terms of the fundamental dynamics realized in the universe"?
Almost. The definition of measurement is already impeccable, and within the formal framework of quantum mechanics. Thus - unlike in Born's rule, where measurement is an undefined notion - one can prove mathematical facts about measurement processes. The theory in the quantum tomography paper together with the thermal interpretation already goes a long way towards a solution. There are some unsettled issues (discussed towards the end of my paper), but the remaining issues are of a purely mathematical nature, and hence seem tractable (or can be refuted if incorrect).
 
  • #170
A. Neumaier said:
Almost. The definition of measurement is already impeccable, and within the formal framework of quantum mechanics. Thus - unlike in Born's rule, where measurement is an undefined notion - one can prove mathematical facts about measurement processes. The theory in the quantum tomography paper together with the thermal interpretation already goes a long way towards a solution. There are some unsettled issues (discussed towards the end of my paper), but the remaining issues are of a purely mathematical nature, and hence seem tractable (or can be refuted if incorrect).
Do you mean this as being of a purely mathematical nature?

"It was pointed out that to fully solve the quantum measurement problem, more research is needed on the characterization of quantum systems that are nonstationary on experimentally directly accessible time scales."
- page 91 in your paper https://arxiv.org/pdf/2110.05294.pdf

If so, don't you see potential conceptual complications with this, regarding establishing objectivity?

/Fredrik
 
  • #171
Fra said:
Do you mean this as being of a purely mathematical nature?
Yes. Purely mathematical concepts (in the shut-up-and-calculate style) informed by the philosophy of the thermal interpretation.
Fra said:
"It was pointed out that to fully solve the quantum measurement problem, more research is needed on the characterization of quantum systems that are nonstationary on experimentally directly accessible time scales."
- page 91 in your paper https://arxiv.org/pdf/2110.05294.pdf
... on the mathematical characterization of such quantum systems.
Fra said:
If so, don't you see potential conceptual complications with this, regarding establishing objectivity?
Objectivity is properly discussed in Section 10.4 (p.84f) of my paper. Of course, observations of unique events of fleeting duration are only as objective as the observer taking notes of the event is. But this is in the nature of objectivity, and not a conceptual weakness.
 
  • #172
A. Neumaier said:
yes. Purely mathematical concepts (in the shut up and calculate style) informed by the philosophy of the thermal interpretation.
Do we presume your interpretation (which everyone may not), so that it's purely mathematical given your framework? If so, I may understand better.
A. Neumaier said:
Objectivity is properly discussed in Section 10.4 (p.84f) of my paper. Of course, observations of unique events of fleeting duration are only as objective as the observer taking notes of the event is. But this is in the nature of objectivity, and not a conceptual weakness.
Not sure I follow. In 10.4 you refer repeatedly in the arguments to stationarity.

"Through quantum tomography, the quantum state of a sufficiently stationary source, the quantum measure of a measurement device, and the transmission operator of a sufficiently linear and stationary filter can in principle be determined with observer-independent protocols. Thus they are objective properties of the source, the measurement device, or the filter, both before and after measurement."

/Fredrik
 
  • #173
Fra said:
Do we presume your interpretation (which everyone may not), so that it's purely mathematical given your framework? If so, I may understand better.
Assumed are the definitions given in the paper, which are formal and mathematical, together with their interpretation, which is informal and philosophical, also given in the paper.
Fra said:
Not sure I follow. In 10.4 you refer repeatedly in the arguments to stationarity.

"Through quantum tomography, the quantum state of a sufficiently stationary source, the quantum measure of a measurement device, and the transmission operator of a sufficiently linear and stationary filter can in principle be determined with observer-independent protocols. Thus they are objective properties of the source, the measurement device, or the filter, both before and after measurement."
Yes, verifiable objectivity is tied (even in classical physics) to approximate repeatability. This either requires a sufficiently stationary source, or a nonstationary source that can be taken to be a stationary source of identically distributed short-time nonstationary processes. Such a nonstationary source must exhibit some form of ergodicity (discussed on p.77f), to be proved or taken as empirically given.
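Schematically, one standard statement of ergodicity (a textbook formulation, not specific to the paper) is that the time average along a single realization equals the ensemble expectation,

$$\lim_{T\to\infty}\frac{1}{T}\int_0^T f(x(t))\,dt = \langle f\rangle,$$

for almost every initial condition; this is what licenses treating one long run as a stand-in for many identically prepared runs.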
 
  • Like
Likes Fra and gentzen
  • #174
A. Neumaier said:
Finding the right framework in which to solve tough mathematical problems that have been unsolved for years in spite of many attempts usually involves much philosophical pondering about the "interpretation" of the problem!

The philosophical part goes away only after the problems have been solved.
But there are many solutions (for particularly simple cases, though), deriving (semi-)classical transport equations from quantum many-body theory, or the entire field of "open quantum systems", using Markovian approximations in terms of quantum master equations (Lindblad). I think this vast work gives enough glimpses of more comprehensive descriptions to exorcize any philosophical speculations ;-)).
 
  • #175
vanhees71 said:
But there are many solutions (for particularly simple cases, though), deriving (semi-)classical transport equations from quantum many-body theory, or the entire field of "open quantum systems", using Markovian approximations in terms of quantum master equations (Lindblad). I think this vast work gives enough glimpses of more comprehensive descriptions to exorcize any philosophical speculations ;-)).
I know all this. But unless one adopts the thermal interpretation, it doesn't answer questions about observations on single systems. For example, Lindblad equations and all decoherence arguments always average over a whole ensemble of identically prepared systems.
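For reference, the standard Lindblad form of a quantum master equation reads

$$\dot\rho = -\frac{i}{\hbar}[H,\rho] + \sum_k \left( L_k \rho L_k^\dagger - \tfrac{1}{2}\left\{ L_k^\dagger L_k, \rho \right\} \right),$$

and the reduced density operator ##\rho## evolving here is, by construction, the state of the ensemble; nothing in the equation singles out the outcome of one individual run.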
 
  • #176
A. Neumaier said:
Finding the right framework in which to solve tough mathematical problems that have been unsolved for years in spite of many attempts usually involves much philosophical pondering about the "interpretation" of the problem!

The philosophical part goes away only after the problems have been solved.
vanhees71 said:
I think this vast work gives enough glimpses of more comprehensive descriptions to exorcize any philosophical speculations ;-)).
I think that the "philosophical pondering" and the "philosophical part" here should not be confused with "philosophical speculations". More likely, the "philosophical pondering about the interpretation of the problem" will turn out to be mostly metamathematics, with a small amount of linguistics and semantics. We know that you are not a huge fan of semantics either, but words do have meaning, and mathematical formalisms can have meaning too.

Note also that the word "interpretation" can have two slightly different meanings. One of the meanings is to give a mathematical model of a theory. The other meaning is to explain how a mathematical theory is used in its applications. Dismissing anything which requires careful use of words and their meaning as philosophy ensures that "problems ... unsolved for years" will continue to remain unsolved.
gentzen said:
My impression is that linguistics and metamathematics are a huge part of analytic philosophy, and perhaps most of the stuff called "philosophy" in this forum would also better be called metamathematics.
gentzen said:
And if analytic philosophy had never happened, this would be totally unproblematic. They tried to "save" philosophy from metaphysics and postmodern nonsense. But because of them, substantial parts of most structural sciences and linguistics are now part of philosophy.
 
  • Like
Likes Lynch101, lodbrok and vanhees71
  • #177
A. Neumaier said:
I know all this. But unless one adopts the thermal interpretation, it doesn't answer questions about observations on single systems. For example, Lindblad equations and all decoherence arguments always average over a whole ensemble of identically prepared systems.
Of course, they do that formally, but as discussed many times, you can also interpret it as averaging over parts of a system over microscopically large, macroscopically small, space-time volumes. Of course, this assumes a separation of scales in this sense, i.e., that "quantum fluctuations" are on small space-time scales, while the "relevant" macroscopic observables, referring to local but microscopically large numbers of "microscopic degrees of freedom", are varying on large macroscopic space-time scales. This is behind the idea of the gradient expansion to go from the full microscopic Kadanoff-Baym equations (or many-body Dyson-Schwinger equations) to a semiclassical Boltzmann-like transport equation in the Wigner representation.

In this sense you get an effective macroscopic description of single macroscopic systems from the underlying (probabilistic) quantum dynamics of their microscopic constituents.
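For orientation, the Wigner representation mentioned above is the standard phase-space transform of the density operator (written here in one dimension with ##\hbar = 1##; normalization conventions vary),

$$W(x,p) = \frac{1}{2\pi}\int ds\, e^{-ips}\, \left\langle x + \tfrac{s}{2} \right| \rho \left| x - \tfrac{s}{2} \right\rangle,$$

and the gradient expansion truncates its exact equation of motion at low orders in phase-space gradients, which is what yields Boltzmann-like transport equations.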
 
  • #178
vanhees71 said:
Of course, they do that formally, but as discussed many times, you can also interpret it as averaging over parts of a system over microscopically large, macroscopically small, space-time volumes. Of course, this assumes a separation of scales
But you cannot do this to analyze a single measurement of a single particle, say. At least the analysis is highly nontrivial, and nobody has succeeded in giving a precise analysis of this without smuggling in Born's rule, which needs many measurements to be meaningful. This is precisely the step that is still missing in the solution of the measurement problem through the thermal interpretation.
 
  • #179
Macroscopic observables do not refer to single particles but are rather collective variables. E.g., for a solid body, approximated as a classical point particle, you consider the center-of-mass position vector. For a gas close to equilibrium you use hydrodynamics, which describes the flow of "fluid cells" which are macroscopically small but microscopically large, i.e., they still contain many particles. The quantum fluctuations are overwhelmed by the thermal fluctuations, which in turn are also pretty small on the macroscopic scale. That's why you get effectively classical behavior of macroscopic systems. What do you think is still missing in the understanding of classical behavior from the underlying microscopic (quantum) dynamics? I thought this is indeed much in the spirit of your "thermal interpretation", although you deny the standard use of probabilities as defined by the general Born rule, for a reason I still do not understand.
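The textbook estimate behind this, stated schematically: for a collective variable ##A## that is a sum of ##N## roughly independent microscopic contributions, the relative fluctuation scales as

$$\frac{\Delta A}{\langle A\rangle} \sim \frac{1}{\sqrt{N}},$$

so for a fluid cell with ##N \sim 10^{20}## particles the fluctuations are utterly negligible on the macroscopic scale.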
 
  • #180
vanhees71 said:
What do you think is still missing in the understanding of classical behavior from the underlying microscopic (quantum) dynamics? I thought this is indeed much in the spirit of your "thermal interpretation", although you deny the standard use of probabilities as defined by the general Born rule, for a reason I still do not understand.
Did you ever hear of Pasch's axiom? It was THE axiom missing from Euclid's axioms. You may think: why is it missing; isn't it SO OBVIOUS that we don't even need to write down that axiom? Well, if you just look at the theory defined by Euclid's axioms, then that theory would also allow other models for which many constructions from Euclid would not work, and many theorems from Euclid would not be true. Of course, we all know that those models were not intended by Euclid, and that is precisely why we can say that Pasch's axiom was missing.
gentzen said:
However, the state might not be the only reason why there is a measurement problem. For example, A. Neumaier's thermal interpretation uses q-expectations and q-correlations instead of the state. But even here, you don't "automatically" solve the problem of unique results. The thermal interpretation needs an additional assumption for that (the assumption is stated as: there is only a single world).
You may wonder what the alternative to there being only a single world is. Well, there could be two worlds, or three worlds, or 42 worlds, or infinitely many worlds. And if you have, for example, three worlds, then there are several ways those three worlds could be related to our experiences.
But as long as you cannot accept that Pasch's axiom was really missing from Euclid's axioms, you will have a very tough time trying to make sense of that.
 
  • #181
vanhees71 said:
Macroscopic observables do not refer to single particles but are rather collective variables.
But in a measurement they refer to a property of the single system measured.
vanhees71 said:
What do you think is still missing in the understanding of classical behavior from the underlying microscopic (quantum) dynamics? I thought this is indeed much in the spirit of your "thermal interpretation",
It is. The unsolved question is how precisely a single interaction with a single particle is reflected mathematically in the corresponding collective variable read from the detector.
vanhees71 said:
although you deny the standard use of probabilities as defined by the general Born rule, for a reason I still do not understand.
I never denied this; it is included as a special case. But the thermal interpretation goes beyond it in claiming approximate but objective properties for single systems, where Born's rule (which is about properties of identically prepared ensembles) is silent. This is needed to give the term measurement a formal mathematical meaning.
 
  • #182
A. Neumaier said:
But in a measurement they refer to a property of the single system measured.

It is. The unsolved question is how precisely a single interaction with a single particle is reflected mathematically in the corresponding collective variable read from the detector.

I never denied this; it is included as a special case. But the thermal interpretation goes beyond it in claiming approximate but objective properties for single systems, where Born's rule (which is about properties of identically prepared ensembles) is silent. This is needed to give the term measurement a formal mathematical meaning.
That's what I never understood. For me you have "objective properties for single systems" in the case of macroscopic systems, where the macroscopic coarse-grained description is sufficient for the description of these properties, because the fluctuations (standard deviations) of the "relevant macroscopic observables" are small compared to the relevant scale of these observables' values. Then it's "almost certain" to find a specific value given by the macroscopic properties of the system (the extreme case is thermal equilibrium, where temperature and chemical potential(s) determine these values).
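"Almost certain" here can be quantified by Chebyshev's inequality (standard probability theory, stated schematically): for an observable ##A## with mean ##\langle A\rangle## and standard deviation ##\Delta A##,

$$P\left( |A - \langle A\rangle| \geq k\,\Delta A \right) \leq \frac{1}{k^2},$$

so when ##\Delta A## is tiny on the scale set by ##\langle A\rangle##, macroscopically visible deviations are overwhelmingly improbable.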

What is the concrete generalization of this standard statistical argument of (quantum) statistical physics, and why do you need it?
 
  • #183
vanhees71 said:
That's what I never understood. For me you have "objective properties for single systems" in the case of macroscopic systems, where the macroscopic coarse-grained description is sufficient for the description of these properties, because the fluctuations (standard deviations) of the "relevant macroscopic observables" are small compared to the relevant scale of these observables' values. Then it's "almost certain" to find a specific value given by the macroscopic properties of the system (the extreme case is thermal equilibrium, where temperature and chemical potential(s) determine these values).
This is only an informal argument that must be made mathematically cogent. Herein lies the problem.
vanhees71 said:
What is the concrete generalization of this standard statistical argument of (quantum) statistical physics, and why do you need it?
What needs to be proved is that the unitary dynamics for a macroscopic system, coupled to a single particle in such a way that the macroscopic system acts as a detector, almost always produces to high accuracy a measurement outcome (coarse-grained expectation value) that equals one of the eigenvalues of the quantum observable measured.

If one defines the macroscopic system by a mixed state corresponding to a grand canonical ensemble with time-dependent intensive variables (which would be the naive attempt implied by your description), then the unitary dynamics produces instead a superposition of macroscopic systems, each one corresponding to one of the possible eigenvalues.
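Schematically, this is the standard von Neumann measurement chain (written here in simplified pure-state notation):

$$\left( \alpha|{+}\rangle + \beta|{-}\rangle \right) \otimes |\text{ready}\rangle \;\longrightarrow\; \alpha\,|{+}\rangle \otimes |\text{pointer}_+\rangle + \beta\,|{-}\rangle \otimes |\text{pointer}_-\rangle,$$

i.e., unitarity carries the superposition of the measured system into a superposition of macroscopically distinct pointer states, not into one definite outcome.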

Thus the naive approach does not give the physically observed answer, and one needs something more sophisticated, something unknown so far. My informal analysis of what is needed points to chaotic motion that (due to the environment) settles quickly to an equilibrium state. But the standard decoherence arguments always take an average somewhere, hence produce only an average answer, but not the required answer in almost every single case. Thus one needs better mathematical tools that apply to the single case. I am working on these, but progress is slow.
 
Last edited:
  • Like
Likes Fra, mattt, physika and 3 others
  • #184
A. Neumaier said:
But the standard decoherence arguments always take an average somewhere, hence produce only an average answer, but not the required answer in almost every single case. Thus one needs better mathematical tools that apply to the single case. I am working on these, but progress is slow.
Mathematically it is the difference between convergence in the mean and almost everywhere convergence. The latter is much harder to achieve than the former. All arguments I have seen in the statistical mechanics of nonequilibrium processes (the measurement process clearly is such a process) are about convergence in the mean only.
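For readers unfamiliar with the distinction, the standard measure-theoretic definitions are: a sequence ##f_n## converges to ##f## in the mean if

$$\lim_{n\to\infty} \int |f_n - f| \, d\mu = 0,$$

and almost everywhere if ##f_n(x) \to f(x)## for every ##x## outside a set of measure zero. Convergence in the mean does not imply almost everywhere convergence of the full sequence: it controls only averages, while almost everywhere convergence controls (almost) every individual case.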
 
  • Like
Likes mattt and vanhees71
  • #185
Ok, but why do you think that's not sufficient? If the "fluctuations" (standard deviations) are small on the scale of the relevant observables, the result is with high probability the mean (expectation value).
 
  • #186
A. Neumaier said:
Thus the naive approach does not give the physically observed answer, and one needs something more sophisticated, something unknown so far. My informal analysis of what is needed points to chaotic motion that (due to the environment) settles quickly to an equilibrium state. But the standard decoherence arguments always take an average somewhere, hence produce only an average answer, but not the required answer in almost every single case. Thus one needs better mathematical tools that apply to the single case. I am working on these, but progress is slow.
But in such a case there is no predetermined single outcome, because your measured system is not well-described by a coarse-grained state. If you do a Stern-Gerlach experiment with silver atoms from an oven, you don't expect to find each silver atom at one spot corresponding to the average value 0 of the magnetic moment but randomly (with equal probability) at the one spot for a magnetic moment of +1 magneton or the one for -1 magneton. Indeed, the single Ag atom in this state has no predetermined direction of its magnetization but a random one, and that's what you want to get with your calculation.
 
  • #187
vanhees71 said:
Ok, but why do you think that's not sufficient? If the "fluctuations" (standard deviations) are small on the scale of the relevant observables, the result is with high probability the mean (expectation value).
No. If the Born probability is 50% for two possible results, then the result is with high probability one of two values, while the naively predicted expectation is the mean of the two values.
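A concrete instance, using standard spin-1/2 arithmetic: for the state

$$|\psi\rangle = \tfrac{1}{\sqrt{2}}\left(|{+}\rangle + |{-}\rangle\right), \qquad \langle\psi| \sigma_z |\psi\rangle = \tfrac{1}{2}(+1) + \tfrac{1}{2}(-1) = 0,$$

yet every single run yields ##\sigma_z = +1## or ##\sigma_z = -1##, never the mean value 0.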
vanhees71 said:
If you do a Stern-Gerlach experiment with silver atoms from an oven, you don't expect to find each silver atom at one spot corresponding to the average value 0 of the magnetic moment but randomly (with equal probability) at the one spot for a magnetic moment of +1 magneton or the one for -1 magneton.
Precisely. This means that in each single case you must get a macroscopic state describing exactly one of the two spots, but what one gets in each single case from a naive argument is instead a superposition of two macroscopic states, one for each spot!
 
  • Like
Likes mattt, physika and PeterDonis
  • #188
I'd say you rather get a mixed state due to decoherence, but that's of course irrelevant for the argument.

Still, I think this is a goal that never can be achieved, because what QT tells you is that indeed the Ag atom doesn't have a determined value of the measured component of the magnetic moment before it went through the magnet. It's completely random with probability 1/2 for either of the possible outcomes. That's why for a single experiment there's no way to know beforehand what will come out, and all that's provided by QT are these probabilities. A good measurement device delivers these correct probabilities for the outcomes in the limit of large statistical samples. It cannot deliver more, because the measured observable's values are true random variables, i.e., this randomness is not due to our ignorance about the state of the Ag atom but because it's truly random. Isn't this "non-realism" the almost inevitable conclusion of all the Bell tests?
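For the oven-prepared atom this is just standard Born-rule arithmetic: the spin state is the unpolarized mixture ##\rho = I/2##, so

$$P(\pm) = \operatorname{Tr}\left( \rho\, |{\pm}\rangle\langle{\pm}| \right) = \tfrac{1}{2}$$

for either outcome, independently of the orientation of the magnet.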
 
  • Like
Likes Lord Jestocost
  • #189
vanhees71 said:
I'd say you rather get a mixed state due to decoherence, but that's of course irrelevant for the argument.
Well, actually one gets a mixed state that is something like a superposition of the grand canonical states corresponding to the two measurement results; talking about superpositions when starting with a mixed state is loose talk only.
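The distinction at issue, in standard notation: a proper mixture of the two macroscopic end states,

$$\rho_{\text{mix}} = \tfrac{1}{2}\,\rho_+ + \tfrac{1}{2}\,\rho_-,$$

versus (for pure states) a superposition ##|\psi\rangle = (|\Psi_+\rangle + |\Psi_-\rangle)/\sqrt{2}##, whose density matrix ##|\psi\rangle\langle\psi|## contains interference terms ##|\Psi_+\rangle\langle\Psi_-|## that the mixture lacks.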
vanhees71 said:
Still, I think this is a goal that never can be achieved, because what QT tells you
... what the minimal statistical interpretation of QT tells you ...

But the thermal interpretation is more than the minimal statistical interpretation;
the latter appears only as a special case.
vanhees71 said:
is that indeed the Ag atom doesn't have a determined value of the measured component of the magnetic moment before it went through the magnet.
Like the detector, the atom, properly prepared, has a definite pure or mixed state, hence has according to the thermal interpretation definite properties. Thus there is a well-defined mathematical problem to be solved, and whether it is solvable is an open question.
vanhees71 said:
It's completely random with probability 1/2 for either of the possible outcomes. That's why for a single experiment there's no way to know beforehand what will come out, and all that's provided by QT are these probabilities.
This is what happens in practice, i.e., when there is much uncertainty about most details.

But according to the thermal interpretation, a complete knowledge of the joint state of the detector, the atom, and the environment at the start of the experiment determines the joint state at all later times deterministically and unitarily. Thus one can in principle find out the properties of the detector at the end of the measurement. That one cannot do it in practice doesn't matter; one cannot calculate in practice the Newtonian dynamics of an N-particle system; nevertheless one can answer many qualitative questions.
vanhees71 said:
this randomness is not due to our ignorance about the state of the Ag atom but because it's truly random. Isn't this "non-realism" the almost inevitable conclusion of all the Bell tests?
No. The violation of Bell's inequalities in Bell tests says nothing at all about true randomness, since Bell's inequalities are based on a classical model of physics, hence are completely silent about quantum mechanics. (Except that they prove that Bell's assumptions are not valid in quantum mechanics.)
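For reference, the CHSH form of Bell's inequality (standard textbook material): any local hidden-variable model satisfies

$$\left| \langle A_1 B_1\rangle + \langle A_1 B_2\rangle + \langle A_2 B_1\rangle - \langle A_2 B_2\rangle \right| \leq 2,$$

where the ##A_i, B_j## are ##\pm 1##-valued measurement settings on the two wings, while quantum mechanics allows values up to ##2\sqrt{2}## (Tsirelson's bound). The experiments therefore decide against the classical assumptions behind the inequality, not for or against any particular account of randomness.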
 
  • #190
vanhees71 said:
A good measurement device delivers these correct probabilities for the outcomes in the limit of large statistical samples. It cannot deliver more, because the measured observable's values are true random variables, i.e., this randomness is not due to our ignorance about the state of the Ag atom but because it's truly random. Isn't this "non-realism" the almost inevitable conclusion of all the Bell tests?
[Bold by LJ]

The question regarding this should be:

Are there any experimentally verifiable hints which might call this inevitable conclusion into question?
 
  • #191
No! If there were, we'd need a new theory, different from QT.
 
  • Like
Likes Lord Jestocost
  • #192
vanhees71 said:
If you do a Stern-Gerlach experiment with silver atoms from an oven, you don't expect to find each silver atom at one spot corresponding to the average value 0 of the magnetic moment but randomly (with equal probability) at the one spot for a magnetic moment of +1 magneton or the one for -1 magneton. Indeed, the single Ag atom in this state has no predetermined direction of its magnetization but a random one, and that's what you want to get with your calculation.
A. Neumaier said:
Precisely. This means that in each single case you must get a macroscopic state describing exactly one of the two spots, but what one gets in each single case from a naive argument is instead a superposition of two macroscopic states, one for each spot!
The naivety is in the construction of the sample space of experimental outcomes. If we construct a sample space that includes the outcomes +1 and -1, then we would not expect, for any single run, a superposition of the two.

In QM, unlike classical mechanics, there is no unique, maximally fine-grained sample space of outcomes, and so the experimenter must always select one appropriate for the experiment they are interested in. It's something I tried to explore in this thread
 
Last edited:
  • #193
Morbert said:
The naivety is in the construction of the sample space of experimental outcomes. If we construct a sample space that includes the outcomes +1 and -1, then we would not expect, for any single run, a superposition of the two.

In QM, unlike classical mechanics, there is no unique, maximally fine-grained sample space of outcomes, and so the experimenter must always select one appropriate for the experiment they are interested in. It's something I tried to explore in this thread
But since the experimenter is part of the environment, its activities ("must always select") should be explainable in terms of the physical laws - at least if the experimenter is just a machine doing the recordings. The unique outcome must come from somewhere...

The natural - and the only natural - source for the unique outcome is symmetry breaking due to chaoticity. It is of the same kind as the choice made by a straight Newtonian rod subject to an increasing longitudinal force that at some point makes the rod bend in a random direction. (In 2D physics, this would result in a binary choice.)
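The rod example is classical Euler buckling; near the critical load it reduces to a pitchfork bifurcation. Schematically (a normal-form sketch; ##x## is the transverse deflection, and ##a, b > 0## are placeholder coefficients),

$$V(x) = \tfrac{a}{2}\,(F_c - F)\,x^2 + \tfrac{b}{4}\,x^4, \qquad F_c = \frac{\pi^2 E I}{L^2}$$

(the critical load for a pinned-pinned column). For ##F < F_c## the straight state ##x = 0## is the unique minimum; for ##F > F_c## it becomes unstable, and arbitrarily small fluctuations select one of the two degenerate buckled minima.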

The unsolved problem is how to make this principle work mathematically in the quantum case in such a way that, in sufficient generality, the correct Born probabilities appear.
 
  • #194
A. Neumaier said:
The natural - and the only natural - source for the unique outcome is symmetry breaking due to chaoticity.
No. Roland Omnès, a proponent of the consistent histories interpretation, was clear that one cannot disprove MWI. One cannot prove unique outcomes, at least not without going beyond non-relativistic QM. In non-relativistic QM (i.e., where Bohmian Mechanics works), one cannot even disprove that there are three, or 42, outcomes. So for somebody like me, who is not very good at QFT, the only reasonable way forward is to assume unique outcomes as an additional axiom, and only try to show that the resulting theory is still consistent.

A. Neumaier said:
The unsolved problem is how to make this principle work mathematically in the quantum case in such a way that, in sufficient generality, the correct Born probabilities appear.
And if you don't want to use an additional axiom, then you risk needing a huge amount of QFT knowledge. The drawback of this is that the number of people able to follow your mathematical proof (if you should be able to find one) will be very small.
 
  • #195
gentzen said:
No. Roland Omnès, a proponent of the consistent histories interpretation, was clear that one cannot disprove MWI.
I am not trying to disprove MWI. I think MWI is completely nonpredictive since everything happens that can possibly happen. Thus it has no scientific content at all.
gentzen said:
One cannot prove unique outcomes, at least not without going beyond non-relativistic QM.
Not in the minimal interpretation, by its assumptions. But the thermal interpretation is not minimal but maximal, hence has a broader basis from which to proceed.
gentzen said:
And if you don't want to use an additional axiom, then you risk needing a huge amount of QFT knowledge. The drawback of this is that the number of people able to follow your mathematical proof (if you should be able to find one) will be very small.
Whatever will be needed will be used. Who can follow the arguments is a matter of time. In the beginnings of relativity theory there were only a handful of experts who understood it, but now even lay people believe they understand the essentials. The same happens whenever some new approach settles something that had been a long-term puzzle.
 
  • #196
A. Neumaier said:
But since the experimenter is part of the environment, its activities ("must always select") should be explainable in terms of the physical laws - at least if the experimenter is just a machine doing the recordings. The unique outcome must come from somewhere...
Not according to standard QT. We indeed also don't need an "experimenter" (not to run into the even stranger idea that the "final collapse" would need a "conscious observer" à la von Neumann/Wigner ;-)), just a measurement device, which stores the result somehow (that's how modern experiments in particle physics work: you have detectors, which store the results of measurements electronically, and these data can then be read out and evaluated later).

Taking QT in its minimal statistical interpretation seriously, and for me that's the most straightforward conclusion of all the experiments testing QT (particularly "Bell tests"), there is no cause for the outcome of the measurement on a single system. The measured observable does not have a determined value before the measurement, and that's why the outcome is unpredictable, and only with a sufficiently "large statistical sample" of equally performed experiments (equally prepared systems) can you test the predicted probabilities of QT. There's no way to know the unique outcome of a measurement, given the preparation of the system, because the measured observable takes random values with probabilities predicted by QT.

The Bell tests, demonstrating the violation of Bell's inequalities, at least tell us that if you assume "locality" in the usual sense of relativistic theories, including standard relativistic, microcausal QFT, you must accept that "realism" has to be given up, where "realism" means that there is some hidden cause behind the outcome of a measurement on an individual system, i.e., the randomness of the measurement outcomes is "only due to our ignorance of this cause" (usually described as the existence of some additional hidden variables, which we can't observe or simply don't know for whatever reasons).
A. Neumaier said:
The natural - and the only natural - source for the unique outcome is symmetry breaking due to chaoticity. It is of the same kind as the choice made by a straight Newtonian rod subject to an increasing longitudinal force that at some point makes the rod bend in a random direction. (In 2D physics, this would result in a binary choice.)
I don't understand what this has to do with the unique-outcome quibble. In this example you can always argue within classical physics, and the direction the rod bends is simply due to some asymmetry of the imposed force, which we are not able to determine because of limitations of our control over the direction of this force.
A. Neumaier said:
The unsolved problem is how to make this principle work mathematically in the quantum case in such a way that, in sufficient generality, the correct Born probabilities appear.
I don't understand where the motivation for this task comes from, given that all tests confirm QT, which tells us that there are in fact no causes that determine the individual measurement outcome.

I think to solve this task you necessarily must find a theory different from standard QT (e.g., something like GRW, where they assume some additional stochastic dynamics which causes the collapse of the quantum state as a real dynamical process). It may well be that you can construct such a theory, but there's no hint yet that this really is necessary to describe what we observe in Nature, i.e., "irreducibly random" outcomes of measurements on single quantum systems.
 
  • #197
vanhees71 said:
Not according to standard QT.
Not??? Only according to the minimal interpretation! Standard QT is silent about this. The unique outcome is a very reliably observed fact that in my opinion needs to be explained!
vanhees71 said:
Taking QT in its minimal statistical interpretation seriously,
I don't take it seriously since it is too minimal. The thermal interpretation is a more comprehensive maximal interpretation of QT.
vanhees71 said:
The Bell tests, demonstrating the violation of Bell's inequalities, at least tell us that if you assume "locality" in the usual sense of relativistic theories, including standard relativistic, microcausal QFT, you must accept that "realism" has to be given up, where "realism" means that there is some hidden cause behind the outcome of a measurement on an individual system,
No. Instead of arguing against interpretation issues you should read the literature analysing the interpretations - so that you know what can be asserted.

One can conclude from the Bell tests only that there is no classical local hidden variable interpretation of quantum mechanics.
vanhees71 said:
In this example you can always argue within classical physics, and the direction the rod bends is simply due to some asymmetry of the imposed force, which we are not able to determine because of limitations of our control over the direction of this force.
This example is indeed classical physics, since it was intended to serve as an analogy for what needs to be shown in the quantum case.

In my example, the force is exactly longitudinal, so the situation is exactly symmetric, and deterministic elasticity theory predicts no bend. The observed bend is due to random fluctuations (or imperfections) in the dynamics.

The same is likely to hold in quantum dynamics. I expect that noise and imperfections in preparation and experimental setup disturb the theoretical quantum dynamics and (together with dissipation in the environment) produce by symmetry breaking a unique outcome rather than the symmetric superposition.

No additional stochastic dynamics as in GRW should be needed, since coarse-graining produces enough chaoticity.
vanhees71 said:
given that all tests confirm QT, which tells us that there are in fact no causes that determine the individual measurement outcome.
The tests confirm QT. But they do not tell me that there are in fact no causes that determine the individual measurement outcome.
 
Last edited:
  • Like
Likes Lynch101, weirdoguy, Fra and 3 others
  • #198
A. Neumaier said:
Not??? Only according to the minimal interpretation! Standard QT is silent about this. The unique outcome is a very reliably observed fact that in my opinion needs to be explained!

I don't take it seriously since it is too minimal. The thermal interpretation is a more comprehensive maximal interpretation of QT.
But obviously it also can't explain the unique measurement outcome in the sense you describe it!
A. Neumaier said:
No. Instead of arguing against interpretation issues you should read the literature analysing the interpretations - so that you know what can be asserted.

One can conclude from the Bell tests only that there is no classical local hidden variable interpretation of quantum mechanics.
But isn't this precisely what you want? I.e., you want a theory where there's a cause for the single-measurement outcome, i.e., where there is some "hidden cause" behind this outcome, or the world is in some "hidden way" deterministic.
A. Neumaier said:
This example is indeed classical physics, since it was intended to serve as an analogy for what needs to be shown in the quantum case.

In my example, the force is exactly longitudinal, so the situation is exactly symmetric, and deterministic elasticity theory predicts no bend. The observed bend is due to random fluctuations (or imperfections) in the dynamics.
Exactly, and the randomness of these fluctuations is only due to our ignorance. In classical physics, as a deterministic theory, "in reality" the imperfections are there and fully determined.
A. Neumaier said:
The same is likely to hold in quantum dynamics. I expect that noise and imperfections in preparation and experimental setup disturb the theoretical quantum dynamics and (together with dissipation in the environment) produce by symmetry breaking a unique outcome rather than the symmetric superposition.
But the result of the quantum dynamics of an ideal closed quantum system is always only probabilities, i.e., QT is "only" probabilistic.
A. Neumaier said:
No additional stochastic dynamics as in GRW should be needed, since coarse-graining produces enough chaoticity.
But you did not accept this argument so far. For me that's indeed all that's needed (at least FAPP), i.e., the classical behavior of macroscopic systems (including measurement devices) is sufficiently explained by "coarse-graining" to the "relevant collective macroscopic observables". E.g., a grain of silver salt in a photographic plate is blackened due to the interaction with a single photon, although you cannot know, given the single-photon state, which point on the plate will be blackened.
A. Neumaier said:
The tests confirm QT. But they do not tell me that there are in fact no causes that determine the individual measurement outcome.
Of course not, because QT claims there are no causes. You'd need a new deterministic theory to get this. It will be very difficult to find one in accordance with relativistic causality, because it should be a non-local theory, and the only relativistic deterministic theories are local (!!!) classical field theories, which obviously cannot explain the results of the corresponding (local) QFTs. So you'd need some non-local classical field theory, in accordance with relativistic causality. Obviously that's a very difficult task!
 
  • #199
vanhees71 said:
It cannot deliver more, because the measured observable's values are true random variables, i.e., this randomness is not due to our ignorance about the state of the Ag atom but because it's truly random.

vanhees71 said:
Taking QT in its minimal statistical interpretation seriously, and for me that's the most straightforward conclusion of all the experiments testing QT (particularly "Bell tests"), there is no cause for the outcome of the measurement on a single system. The measured observable does not have a determined value before the measurement, and that's why the outcome is unpredictable, and only with a sufficiently "large statistical sample" of equally performed experiments (equally prepared systems) can you test the predicted probabilities of QT. There's no way to know the unique outcome of a measurement, given the preparation of the system, because the measured observable takes random values with probabilities predicted by QT.
You complain a lot about philosophy but you practice the worst kind yourself very often here.
[Bold emphasis is mine]
 
  • Like
Likes weirdoguy and gentzen
  • #200
vanhees71 said:
But obviously it also can't explain the unique measurement outcome in the sense you describe it!
Not yet. But this will change in due time. Difficult problems are not solved overnight.
vanhees71 said:
you want a theory where there's a cause for the single-measurement outcome,
Yes, and the thermal interpretation provides a framework for doing that.
vanhees71 said:
i.e., where there is some "hidden cause" behind this outcome, or the world is in some "hidden way" deterministic.
It is neither local nor hidden, hence Bell's assumptions don't apply.

It is not hidden since my observables are the N-point correlation functions of QFT, as they were always used, but without the ensemble interpretation - which does not make sense for quantum fields in spacetime, since a spacetime field cannot be prepared repeatedly.

And it is not local in Bell's sense since correlation functions are not local but multilocal observables. Nevertheless, causality is guaranteed by the standard QFT approach.
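For reference, the N-point (Wightman) correlation functions referred to here are, schematically,

$$W(x_1,\dots,x_N) = \langle \Omega |\, \phi(x_1)\cdots\phi(x_N) \,| \Omega \rangle,$$

with ##\Omega## the vacuum state and ##\phi## a quantum field; they depend jointly on ##N## spacetime points, hence "multilocal".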
vanhees71 said:
Exactly, and the randomness of these fluctuations is only due to our ignorance. In classical physics, as a deterministic theory, "in reality" the imperfections are there and fully determined.
And one may presume that the same holds in quantum physics, without having to assume irreducible randomness.
vanhees71 said:
But the result of the quantum dynamics of an ideal closed quantum system is always only probabilities, i.e., QT is "only" probabilistic.
Only when you take the minimal interpretation stance. But this restricts attention to only a small part of the possibilities that are open in the thermal interpretation.
vanhees71 said:
Of course not, because QT claims there are no causes.
No. You claim that, without giving proof.

QT does not talk about causes. It has no concept that specifies what a cause should mean.
vanhees71 said:
You'd need a new deterministic theory to get this.
No. The old deterministic unitary dynamics represented by the Wightman axioms for N-point functions suffices.

vanhees71 said:
It will be very difficult to find one in accordance with relativistic causality,
No. It is manifest in relativistic QFT.
vanhees71 said:
because it should be a non-local theory,
It is Bell-nonlocal but causal, hence local in your terminology.
 