Evaluate this paper on the derivation of the Born rule

Summary
The discussion revolves around the evaluation of the paper "Curie Weiss model of the quantum measurement process" and its implications for understanding the Born rule in quantum mechanics. Participants express interest in the authors' careful analysis of measurement processes, though some raise concerns about potential circular reasoning in deriving the Born rule from the ensemble interpretation of quantum mechanics. The conversation highlights the relationship between state vectors, probability, and the scalar product in Hilbert space, emphasizing the need for a clear understanding of measurement interactions. There is also skepticism regarding the applicability of the model to real experimental setups, with calls for more precise definitions and clarifications of the concepts involved. Overall, the discourse reflects a deep engagement with the complexities of quantum measurement theory.
  • #91
vanhees71 said:
Physical observables are defined by concrete measurement devices in the lab, not by a theoretical formalism.
These concrete devices are calibrated using quantum mechanical theory to check that they actually do what they are supposed to do. Without quantum mechanics already working, one cannot validate any of these checks. One doesn't know the state a laser produces without knowing the theory of the laser, etc. Thus one cannot check the definition of a physical observable (such as spin up) that goes into the theory with which something is computed without already having the theory. This is standard circularity.
 
  • #92
vanhees71 said:
Now you switch to partition sums, i.e., thermodynamical systems.
And you choose an exactly solvable system, which, as I said, is a very special case: one of the only cases where one can use the sum over probabilities to calculate the partition function. Yes, in particular cases Born's rule applies and probabilities are used to do the calculations. But these are very special cases.

And even in your partition sum there is not a single measurement but only computations; hence Born's rule (which, according to Wikipedia, is "a law of quantum mechanics giving the probability that a measurement on a quantum system will yield a given result") doesn't apply. You pay lip service to Born's rule, but you don't use it in your computations.
 
  • #93
vanhees71 said:
In homodyne detection what's measured are intensities
What I meant was using homodyne detection to measure simultaneously the quadratures (which are noncommuting optical analogues of position and momentum) by splitting the photon beam 50:50 and then making homodyne measurements on each beam. Of course the raw measurements are measurements of intensities, but in terms of the input, what is measured (inaccurately, within the validity of the uncertainty relation) are noncommuting quadratures.
 
  • #94
vanhees71 said:
Physical observables are defined by concrete measurement devices in the lab, not by a theoretical formalism.
Take as a very concrete example the Hamiltonian, which is the observable that goes into the computation of the partition function of a canonical ensemble. How do you define this observable by a concrete measurement device in the lab that would, according to Born's rule, give as measurement result the ##k##th energy level ##E_k## with probability proportional to ##e^{-\beta E_k}##?

The impossibility of giving such a device proves that defining the meaning of observables and of (accurate) measurements is a thoroughly theoretical process, not just one of experimentation!
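The Boltzmann weights at issue are easy to make concrete numerically; the three-level spectrum below is an arbitrary illustration, not taken from the paper under discussion:

```python
import math

def boltzmann_probabilities(energies, beta):
    """Canonical-ensemble probabilities p_k = exp(-beta * E_k) / Z."""
    weights = [math.exp(-beta * e) for e in energies]
    z = sum(weights)  # partition function Z = sum_k exp(-beta * E_k)
    return [w / z for w in weights]

# Toy three-level spectrum in units with k_B = 1 (purely illustrative).
probs = boltzmann_probabilities([0.0, 1.0, 2.0], beta=1.0)
print(probs)       # decreasing with energy; the ground state dominates
print(sum(probs))  # normalized to 1
```

Born's rule would attach these probabilities to the outcomes of an idealized energy measurement on the ensemble; the point of the post is that no concrete lab device realizes such a measurement directly.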
 
  • #95
A. Neumaier said:
These concrete devices are calibrated using quantum mechanical theory to check that they actually do what they are supposed to do. Without quantum mechanics already working, one cannot validate any of these checks. One doesn't know the state a laser produces without knowing the theory of the laser, etc. Thus one cannot check the definition of a physical observable (such as spin up) that goes into the theory with which something is computed without already having the theory. This is standard circularity.
Sure, it's well known that physics is "circular" in this way. You need theory to construct measurement devices, and at the same time these devices are used to check the very theory on which their construction is based. In a sense, testing the theories is just a test of the consistency of the theory with the observations made.

Spin is a good example. The Stern-Gerlach experiment was undertaken before quantum theory existed in its modern form and before the modern notion of "spin" had been discovered. The theory used was classical mechanics with some ideas from the Bohr-Sommerfeld model, and what was tested were hypotheses based on it. The main trouble in this context was the "anomalous Zeeman effect", which could not be well explained by the Bohr-Sommerfeld model. For a very amusing account of the history (including the fact that without bad cigars the SG experiment would most likely have failed ;-)), see

https://faculty.chemistry.harvard.edu/files/dudley-herschbach/files/how_a_bad_cigar_0.pdf
 
  • #96
A. Neumaier said:
What I meant was using homodyne detection to measure simultaneously the quadratures (which are noncommuting optical analogues of position and momentum) by splitting the photon beam 50:50 and then making homodyne measurements on each beam. Of course the raw measurements are measurements of intensities, but in terms of the input, what is measured (inaccurately, within the validity of the uncertainty relation) are noncommuting quadratures.
I'm not familiar with all applications of homodyne measurements. Before I can comment on this, please define precisely what experiment you have in mind. What's measured in quantum optics, technically speaking, are usually correlation functions of field operators. Such correlation functions sometimes refer to "incomplete measurements" of incompatible observables. How does this, in your opinion, invalidate the standard postulates of Q(F)T? I'm not aware that quantum optics is based on any theory other than standard QT.
 
  • #97
A. Neumaier said:
Take as a very concrete example the Hamiltonian, which is the observable that goes into the computation of the partition function of a canonical ensemble. How do you define this observable by a concrete measurement device in the lab that would, according to Born's rule, give as measurement result the ##k##th energy level ##E_k## with probability proportional to ##e^{-\beta E_k}##?

The impossibility of giving such a device proves that defining the meaning of observables and of (accurate) measurements is a thoroughly theoretical process, not just one of experimentation!
Hm, that's not so easy. In principle you can measure it by looking at the emission spectrum of the gas (of course the temperature should be high enough that states above the ground state are populated). The relative strengths of the different lines are governed by the Boltzmann distribution.
 
  • #98
vanhees71 said:
it's well known that physics is "circular" in this way.
But theoretical physics does not need to be circular; one can have a good theory with a noncircular interpretation in terms of experiments.

While one is still learning about the phenomena in a new theory, circularity is unavoidable. But once things are known for some time (and quantum physics is known in this sense for a very long time) the theory becomes the foundation and physical equipment and experiments are tested for quality by how well they match the theory. Even the definitions of units have been adapted repeatedly to better match theory!

vanhees71 said:
Hm, that's not so easy. In principle you can measure it by looking at the emission spectrum of the gas
But this gives you energy differences, not energy levels. It does not even remotely resemble Born's rule!
Moreover, it is a highly nontrivial problem in spectroscopy to deduce the energy levels from a collection of measured spectral lines! And it cannot be done for large molecules over an extended energy range, let alone for a brick of iron.

vanhees71 said:
The relative strengths of the different lines are governed by the Boltzmann distribution.
No. It also depends on selection rules and on how much they are violated in a particular case. It is quite complicated.

vanhees71 said:
I'm not familiar with all applications of homodyne measurements. Before I can comment on this, please define precisely what experiment you have in mind.
I mentioned everything necessary. To approximately measure the two quadratures of photons in a beam, one passes them through a symmetric beam splitter and then measures the resulting superposition of photons in the two beams by homodyne detection on each beam. Details are, for example, in a nice little book by Ulf Leonhardt, Measuring the Quantum State of Light. This is used in quantum tomography; the link contains context and shows how homodyne detection enters.
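A minimal Gaussian sketch of the effect described (an Arthurs-Kelly-style toy model, not the actual optical setup from Leonhardt's book): splitting the beam 50:50 lets both quadratures be sampled at once, at the price of the vacuum entering the empty port adding an extra half unit of variance to each inferred quadrature, in units with ##\hbar = 1## where a coherent state has quadrature variance 1/2.

```python
import random
import statistics

def joint_quadrature_samples(q0, p0, n, seed=0):
    """Toy joint measurement of both quadratures of a coherent state.

    Each inferred quadrature carries the intrinsic variance 1/2 plus an
    extra 1/2 of vacuum noise from the beam splitter's empty port, so the
    outcome variances are ~1 each (twice the single-quadrature limit).
    """
    rng = random.Random(seed)
    sigma = 1.0  # sqrt(1/2 intrinsic + 1/2 added)
    qs = [q0 + rng.gauss(0.0, sigma) for _ in range(n)]
    ps = [p0 + rng.gauss(0.0, sigma) for _ in range(n)]
    return qs, ps

qs, ps = joint_quadrature_samples(2.0, -1.0, 100_000)
print(statistics.mean(qs), statistics.variance(qs))  # means recovered, variance ~1
```

The doubled variance is the quantitative sense in which the joint measurement is "inaccurate, within the validity of the uncertainty relation".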
 
  • #99
vanhees71 said:
Well, in physics you don't want a self-contradictory description. The collapse hypothesis is incompatible with the very foundations of physics, i.e., the causality structure of relativistic spacetime. So why should you assume such a thing? I don't know since I don't know a single example of a real-world experiment, where this assumption is really needed.

I won't use the word collapse from now on; it has meanings that I don't intend, and it is also very bad terminology. Let's use the following language instead. We prepare a system in a state described as ##|\psi_{in}\rangle##. The system is measured to be in a state described as ##|\psi_{out}\rangle## with probability ##|\langle \psi_{out}|\psi_{in}\rangle|^2##. When we use an apparatus that destroys the particle, the appropriate clarification must be made. The wave function is our description of the system. What ##|\psi_{in}\rangle## and ##|\psi_{out}\rangle## are depends on the details of the experimental apparatus.

This must be uncontroversial to both of us. (Right?)
 
  • #100
Prathyush said:
I won't use the word collapse from now on; it has meanings that I don't intend, and it is also very bad terminology. Let's use the following language instead. We prepare a system in a state described as ##|\psi_{in}\rangle##. The system is measured to be in a state described as ##|\psi_{out}\rangle## with probability ##|\langle \psi_{out}|\psi_{in}\rangle|^2##. When we use an apparatus that destroys the particle, the appropriate clarification must be made. The wave function is our description of the system. What ##|\psi_{in}\rangle## and ##|\psi_{out}\rangle## are depends on the details of the experimental apparatus.
That's not how I would describe things. First off, I would not use the term "measured"; I would rather refer to "state preparation" and "state detection". In the case of detection, it is an eigenvalue ##a## in the representation ##A## chosen by the apparatus that is detected. But we must also take into account the role of the detection apparatus, since the detection process is one of interaction.

The "scattering amplitude" for the interaction is then ##\langle \psi_i,\phi_i|\psi_f,\phi_f\rangle##, where ##\psi_i, \psi_f## are the initial (prepared) and final states of the system that is detected and ##\phi_i, \phi_f## are the initial and final states of the detection apparatus. The detected value ##a## is then inferred from the change to the apparatus as a function of ##\phi_i## and ##\phi_f##, with probability given by the square modulus of the scattering amplitude. In the case that the change in the apparatus is sufficiently small (##\phi_f \approx \phi_i##) and ##\psi_f = \psi_a## is the eigenstate of ##A## with eigenvalue ##a##, we would then have that ##|\langle \psi_a|\psi_i\rangle|^2## is an approximation to the probability of finding the result ##a##.
 
  • #101
A. Neumaier said:
The quantum formalism is independent of knowledge. Subjective issues have no place in physics, except for judging the adequacy of the assumptions and approximations made.

A measurement of a microscopic system is a reading from a macroscopic device that contains information about the state of the microscopic system. The nature of the coupling and the dynamical analysis must tell which information is encoded in the measurement result, to which accuracy, and with which probabilities.

This definition of a measurement is operationally checkable, since one can prepare the states and read the measurement results, and can thus compare the measurements with the calculations without any ambiguity of concepts.

The only interpretation needed is how the reading from the macroscopic device is related to its macroscopic properties. In the thermal interpretation, this poses no problem at all. The consequences for the microscopic theory are then a matter of deduction, not one of postulation.

Whereas Born's rule is very incomplete in that it doesn't say the slightest thing about what constitutes a measurement, so it is an uncheckable piece of philosophy, not of science, unless you already know what measurement means. But this requires knowing a lot of the quantum physics that goes into building high-quality measurement devices for quantum objects. Thus foundations based on Born's rule are highly circular, unlike foundations based on a properly understood statistical mechanics approach.
AFAICS, what you call "a properly understood statistical mechanics approach" doesn't seem to say much more about what constitutes a measurement (at least not anything different from the classical measurements with commuting observables that classical statistical mechanics addresses) than Born's postulate does. Furthermore, you blur any additional hint by declaring the ambiguity between classical and quantum uncertainty exploited in a statistical mechanics interpretation to be something subjective and outside the formalism, so I honestly can't see how this approach improves on Born's rule for elucidating the nature of quantum uncertainty and measurements.
 
  • #102
RockyMarciano said:
AFAICS, what you call "a properly understood statistical mechanics approach" doesn't seem to say much more about what constitutes a measurement (at least not anything different from the classical measurements with commuting observables that classical statistical mechanics addresses) than Born's postulate does. Furthermore, you blur any additional hint by declaring the ambiguity between classical and quantum uncertainty exploited in a statistical mechanics interpretation to be something subjective and outside the formalism, so I honestly can't see how this approach improves on Born's rule for elucidating the nature of quantum uncertainty and measurements.
A measurement of a system is a reading of a macroscopic variable from a macroscopic device that (due to the unitary quantum dynamics of the universe) provides information about the state of the system. This is a very clear and natural definition of measurement, valid in both the classical and the quantum regime. If the quantum dynamics is sufficiently well analyzed, one can infer, from a precise protocol for how the reading is done (which might even involve some computation) and from a theoretical model of system and device, what is observed and how accurate it is.

For a macroscopic variable, the measured value is (to several decimal digits of relative accuracy) the expectation value of the corresponding observable (in quantum mechanics, Hermitian operator). This is the "properly understood statistical mechanics approach", and is one of the interpretative principles stated by the authors of the papers under discussion. Actually, together with the above definition of a measurement, this is the only piece of interpretation needed and defines everything. (A slightly more precise version of this statement is the content of my thermal interpretation of quantum mechanics.)

Given the above, everything can be analyzed in principle, without any ambiguity or circularity. Indeed, this is the very reason why ''shut up and calculate'' works!


Careless reading of a measurement value that could give rise to subjective uncertainty is not part of physics; it reflects a lack of the ability required to qualify as an observer.

In the above scheme, nothing at all needs to be assumed about any commuting properties, any eigenvalues, or any probabilities; Born's rule doesn't enter the picture. All this doesn't matter, except for getting closed-form results in some exactly solvable toy problems.

In contrast, if you start with Born's rule, it doesn't give you the slightest idea of what a measurement is, how the measurement result would appear in a pointer position to be read, say, or which part of making a measurement is objective and which subjective. Everything is left completely vague.
 
  • #103
Prathyush said:
it is also possible that you have not made a sufficiently clear and compelling argument. I too find what you are saying to be in need of clarification.
Is the clarification given in posts #85, #87, and #102 sufficient for you? Or what else needs to be clarified?
 
  • #104
A. Neumaier said:
A measurement of a system is a reading of a macroscopic variable from a macroscopic device that (due to the unitary quantum dynamics of the universe) provides information about the state of the system. This is a very clear and natural definition of measurement, valid in both the classical and the quantum regime. If the quantum dynamics is sufficiently well analyzed, one can infer, from a precise protocol for how the reading is done (which might even involve some computation) and from a theoretical model of system and device, what is observed and how accurate it is.

For a macroscopic variable, the measured value is (to several decimal digits of relative accuracy) the expectation value of the corresponding observable (in quantum mechanics, Hermitian operator). This is the "properly understood statistical mechanics approach", and is one of the interpretative principles stated by the authors of the papers under discussion. Actually, together with the above definition of a measurement, this is the only piece of interpretation needed and defines everything. (A slightly more precise version of this statement is the content of my thermal interpretation of quantum mechanics.)

Given the above, everything can be analyzed in principle, without any ambiguity or circularity. Indeed, this is the very reason why ''shut up and calculate'' works!


Careless reading of a measurement value that could give rise to subjective uncertainty is not part of physics; it reflects a lack of the ability required to qualify as an observer.

In the above scheme, nothing at all needs to be assumed about any commuting properties, any eigenvalues, or any probabilities; Born's rule doesn't enter the picture. All this doesn't matter, except for getting closed-form results in some exactly solvable toy problems.

In contrast, if you start with Born's rule, it doesn't give you the slightest idea of what a measurement is, how the measurement result would appear in a pointer position to be read, say, or which part of making a measurement is objective and which subjective. Everything is left completely vague.
If the same measurement definition as in the classical case is valid, then an explanation should be included of why the predictions of measurements are no longer deterministic in principle, and of why the probabilities are computed differently from those of classical measurements. How is this addressed?
 
  • #105
RockyMarciano said:
If the same measurement definition as in the classical case is valid, then an explanation should be included of why the predictions of measurements are no longer deterministic in principle, and of why the probabilities are computed differently from those of classical measurements. How is this addressed?
The reason is simply that the same definition does not imply the same results if the dynamical rules to which the definition applies are different.

Moreover, even classically, measurements are often not predictable over a significant time scale. Classical Brownian motion (a dust particle in a fluid) is intrinsically undetermined, classically, since the initial state cannot be known to the accuracy required.
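The classical unpredictability invoked here can be illustrated with any chaotic system; below is a minimal sketch using the logistic map (a stand-in for the Brownian case, chosen only because it is trivial to simulate): two initial conditions differing by ##10^{-12}## become macroscopically distinguishable within a few dozen steps.

```python
def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the chaotic logistic map x -> r * x * (1 - x)."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2 + 1e-12)  # imperceptibly different start
gap = max(abs(x - y) for x, y in zip(a[40:], b[40:]))
print(gap)  # order-one separation: long-term prediction has failed
```

Since no preparation fixes the initial state to twelve decimal places, the long-time behavior is in practice undetermined even though the dynamics is strictly deterministic.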
 
  • #106
A. Neumaier said:
The reason is simply that the same definition does not imply the same results if the dynamical rules to which the definition applies are different.
So I guess you mean by this that the quantum dynamical rules answer my questions. How?

A. Neumaier said:
Moreover, even classically, measurements are often not predictable over a significant time scale. Classical Brownian motion (a dust particle in a fluid) is intrinsically undetermined, classically, since the initial state cannot be known to the accuracy required.
Yes, that's why I wrote "in principle": if that initial state were known, it would be predictable. This is not the case in quantum theory.
 
  • #107
RockyMarciano said:
that the quantum dynamical rules answer my questions. How?
The relation, inferred from the dynamics, between the pointer position of a macroscopic apparatus and the position of the particle it measures is never completely deterministic, not even in the classical case, where the dynamics of the combined system is fully deterministic. See https://www.physicsforums.com/posts/5668841/

Similarly, the quantum rules predict that a reading of a quantum measurement tells nothing deterministic about the state of a single spin, unless the measurement is set up to measure the spin in exactly the same direction as the one in which the spin was prepared.
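This direction dependence is easy to make quantitative; the sketch below is just the standard two-level Born probability ##\cos^2(\theta/2)## for a spin-1/2 prepared at angle ##\theta## to the measurement axis, stated here purely for illustration:

```python
import math

def p_up(theta):
    """Born probability of outcome 'up' for a z-axis spin measurement on a
    spin-1/2 prepared along a direction at angle theta to the z-axis:
    |<up|psi>|^2 = cos^2(theta / 2)."""
    return math.cos(theta / 2.0) ** 2

print(p_up(0.0))          # 1.0: same axis as preparation, deterministic
print(p_up(math.pi))      # ~0.0: opposite axis, deterministic
print(p_up(math.pi / 2))  # ~0.5: orthogonal axis, maximally random
```

Only at ##\theta = 0## or ##\theta = \pi## does the reading determine the prepared state; for all other angles the outcome is irreducibly probabilistic.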
 
  • #108
A. Neumaier said:
The relation, inferred from the dynamics, between the pointer position of a macroscopic apparatus and the position of the particle it measures is never completely deterministic, not even in the classical case, where the dynamics of the combined system is fully deterministic.
I already addressed this. In the classical case it is not completely deterministic due to the practical impossibility of knowing the initial conditions completely; once more, this is not the case in quantum mechanics. The question was how the dynamics justifies this indeterminacy in the absence of the classical initial-conditions excuse, and also the different probabilities, without recurring to a postulate like the Born rule (however mysterious or vague it might be). I'm afraid just invoking the explanatory power of statistical mechanics for classical measurements is not enough.
 
  • #109
RockyMarciano said:
In the classical case it is not completely deterministic due to the practical impossibility of knowing the initial conditions completely
Even when the initial state is fixed and the dynamics is deterministic, the information in the position of the particle at time t=0 is at no later time completely transmitted to the pointer of the detector. Thus the pointer cannot accurately reproduce the particle position.

How the Born rule appears in the quantum case is addressed by the papers we have been discussing here since post #1.
 
  • #110
A. Neumaier said:
Is the clarification given in posts #85, #87, and #102 sufficient for you? Or what else needs to be clarified?

It is now clear enough for me to evaluate.
 
  • #111
A. Neumaier said:
Even when the initial state is fixed and the dynamics is deterministic, the information in the position of the particle at time t=0 is at no later time completely transmitted to the pointer of the detector. Thus the pointer cannot accurately reproduce the particle position.
I'm asking you to explain why you claim this; it is not claimed nor explained in classical theory (the classical pointer measures operationally by congruence between the measured object's position and the pointer of the measuring instrument or detector, with all the uncertainty in the measurement attributable to the initial-state uncertainty), nor in statistical mechanics that I'm aware of.
 
  • #112
A. Neumaier said:
Ok, I see there that you don't have an answer to my question, since you are asking it yourself. But then the only advantage I see for the statistical approach is that it makes explicit that the quantum formalism of the Born rule doesn't solve it either.
 
  • #113
RockyMarciano said:
I see there that you don't have an answer to my question, since you are asking it yourself.
I had answered the question you posed. It is quite obvious that there is no natural way to make the pointer position give the exact value of the particle position, so it will always be somewhat inaccurate, which answers your question even without having the details in terms of a particular model.

RockyMarciano said:
it is not claimed nor explained in the classical theory(the classical pointer measures operationally by congruence between the measured object position and the measuring instrument or detector pointer,
There is no congruence on the microscopic level, as point particles do not form straight borders.

The question in the link given is different, since it asks for details how the pointer position gets its value, given the interaction.
 
  • #114
A. Neumaier said:
I had answered the question you posed. It is quite obvious that there is no natural way to make the pointer position give the exact value of the particle position, so it will always be somewhat inaccurate, which answers your question even without having the details in terms of a particular model.
No, you hadn't. You conveniently left out of the quote the part where I say that classically that inaccuracy is attributed to the lack of accurate knowledge of the initial conditions. In the quantum case that is not an acceptable reason (the inherent HUP from noncommuting observables is, somewhat operationally, encoded in the Born rule), but you seem unwilling to recur to it in your interpretation using statistical mechanics reasoning.
So rather than stating the obvious again, namely that there will always be inaccuracy, I'll ask you for the last time to explain the origin of the uncertainty according to your interpretation.

A. Neumaier said:
There is no congruence on the microscopic level, as point particles do not form straight borders.
Straight borders?? How is congruence achieved in geometry? Do points form straight borders?
A. Neumaier said:
The question in the link given is different
You should know; you linked it as an answer to my question.
 
  • #115
RockyMarciano said:
You conveniently left out of the quote the part where I say that classically that inaccuracy is attributed to the lack of accurate knowledge of the initial conditions.
I had addressed this: My statement holds even when there is exact knowledge of the initial condition and exact deterministic dynamics. The problem is that the pointer is a complex system that cannot be made to exactly represent the position solely through the physical interactions.
RockyMarciano said:
how is congruence achieved in geometry? Do points form straight borders?
In geometry you have a small set of points, and congruence refers to matching these points by a rigid motion. But this is abstract mathematics.

In classical microscopic physics, there are no rigid objects, hence there is no way of performing a rigid motion. To measure the distance between two points one has to match the marked lines on a macroscopic ruler (or a more sophisticated device) so that they approximate this distance, and this is never more exact than the width of the marked lines. Even if you do this under an electron microscope or another sophisticated device, you incur uncertainty, and it cannot be arbitrarily reduced, even if one assumed that the classical laws were valid down to arbitrarily small distances, since classical point atoms behave chaotically at small distances.

RockyMarciano said:
You should know, you linked it as an answer to my question.
I had referred to the last posting in this thread, where I had mentioned that
A. Neumaier said:
In fact, perfect information cannot be obtained. Whatever is obtained experimentally needs a justification why it deserves being called a particle position or momentum and how uncertain it is.
 
  • #116
A. Neumaier said:
I had addressed this: My statement holds even when there is exact knowledge of the initial condition and exact deterministic dynamics. The problem is that the pointer is a complex system that cannot be made to exactly represent the position solely through the physical interactions.
Yes, OK, but I was trying to find out whether you were claiming that the thermal interpretation explains this complexity rather than just stating it. Now I know it doesn't; this is all I wanted to know.

In geometry you have a small set of points, and congruence refers to matching these points by a rigid motion. But this is abstract mathematics.

In classical microscopic physics, there are no rigid objects, hence there is no way of performing a rigid motion. To measure the distance between two points one has to match the marked lines on a macroscopic ruler (or a more sophisticated device) so that they approximate this distance, and this is never more exact than the width of the marked lines. Even if you do this under an electron microscope or another sophisticated device, you incur uncertainty, and it cannot be arbitrarily reduced, even if one assumed that the classical laws were valid down to arbitrarily small distances, since classical point atoms behave chaotically at small distances.
I agree with this. Again, this complexity is not addressed by statistical mechanics in any way substantially better than by Born's rule; it is just a different rationalization of the difficulty. Maybe it makes clearer that this difficulty (which you address in the other thread linked) exists as much in the classical case as in the quantum one, but it is much more visible and problematic (there is an infinite literature on the "measurement problem") in the latter, which deals more specifically with the microscopic scale.
 
  • #117
RockyMarciano said:
if you were claiming that the thermal interpretation gives an explanation to this complexity
The point of the thermal interpretation is to give better foundations to quantum mechanics than what Born's rule offers, more in line with what happens in actual measurements.

It was never intended to explain complexity. Complexity is a given and needs no explanation.
 
  • #118
A. Neumaier said:
The point of the thermal interpretation is to give better foundations to quantum mechanics than what Born's rule offers, more in line with what happens in actual measurements.
That's a laudable intention.

It was never intended to explain complexity. Complexity is a given and needs no explanation.
This is highly debatable but it is off-topic here, maybe in the other thread.
 
  • #119
Prathyush said:
I won't use the word collapse from now on; it has meanings that I don't intend, and it is also very bad terminology. Let's use the following language instead. We prepare a system in a state described as ##|\psi_{in}\rangle##. The system is measured to be in a state described as ##|\psi_{out}\rangle## with probability ##|\langle \psi_{out}|\psi_{in}\rangle|^2##. When we use an apparatus that destroys the particle, the appropriate clarification must be made. The wave function is our description of the system. What ##|\psi_{in}\rangle## and ##|\psi_{out}\rangle## are depends on the details of the experimental apparatus.

This must be uncontroversial to both of us. (Right?)
Although you read this terminology very often, it's misleading. What you measure are observables, not states. The probability to find an outcome ##a## when measuring the observable ##A## if the system is prepared in a pure state represented by a normalized vector ##|\psi \rangle## is given by
$$P(a)=\sum_{\beta} |\langle a,\beta|\psi \rangle|^2,$$
where ##|a,\beta \rangle## denote a complete set of eigenvectors of ##\hat{A}## (the self-adjoint operator that represents the observable ##A##).

Note that in general the time evolution of state vectors and of eigenvectors due to the dynamics is different, depending on the chosen picture of time evolution. E.g., in the Schrödinger picture the states evolve according to the full Hamiltonian while the operators representing observables are time-independent. In the Heisenberg picture it's the other way around, but you can split the time dependence arbitrarily between states and observable-operators. What's observable are the probabilities and related quantities like expectation values, and indeed the "wave function",
$$\psi(a,\beta)=\langle a,\beta|\psi \rangle$$
is picture independent. That only works if you properly formulate the probabilities for finding a certain result when measuring the observable, given the state ##|\psi \rangle## of the system!
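The formula above for ##P(a)## can be checked with a small numerical sketch. Working in an orthonormal eigenbasis of ##\hat{A}##, the components of ##|\psi\rangle## are the amplitudes ##\langle a,\beta|\psi\rangle##; the spectrum and state below are arbitrary illustrations, not from the thread:

```python
from collections import defaultdict

def born_probabilities(eigenvalues, psi):
    """P(a) = sum_beta |<a, beta|psi>|^2, with psi given by its components
    in an orthonormal eigenbasis of the observable; repeated eigenvalues
    are summed over their degeneracy label beta."""
    probs = defaultdict(float)
    for a, c in zip(eigenvalues, psi):
        probs[a] += abs(c) ** 2
    return dict(probs)

# A = diag(1, 1, 2): the eigenvalue 1 is twofold degenerate.
s = 3 ** -0.5
psi = [s, 1j * s, s]  # normalized state, equal weight on each basis vector
print(born_probabilities([1, 1, 2], psi))  # P(1) = 2/3, P(2) = 1/3
```

Note that the probabilities attach to the eigenvalues (the outcomes), not to the individual eigenvectors, which is exactly the point being made about measuring observables rather than states.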
 
  • #120
A. Neumaier said:
The point of the thermal interpretation is to give better foundations to quantum mechanics than what Born's rule offers, more in line with what happens in actual measurements.

It was never intended to explain complexity. Complexity is a given and needs no explanation.
Well, I'm still lacking an understanding of the physics content of the thermal interpretation. Basically what you say is that what's measured are "expectation values", but I'm not allowed to define them via the usual probability interpretation (Born's rule, or rather Born's postulate if you wish). So how do I understand the meaning of your "expectation values"? And what's "thermal" here? Are you always taking the expectation values with equilibrium distribution functions (equilibrium statistical operators)? I'm using the standard terminology here for lack of a better language. How do you call the "statistical operator", if you deny its statistical/probabilistic meaning?
 
