The thermal interpretation of quantum physics

  • #571
ftr said:
I just saw this; does it have any relation to your interpretation, since it also talks about a thermal interpretation?
I don't have access to this book, but from the tiny Google snippets in your link, there seems to be no close connection.
 
  • #572
A. Neumaier said:
I don't have Schlosshauer's book at hand, but I believe he only works on this level.
I disagree, but I would invite you to check it by yourself.

And by the way, do B&P claim anywhere that they solve the measurement problem? I don't think so. In fact, I think they don't even address the measurement problem.
 
  • #573
Demystifier said:
do B&P claim anywhere that they solve the measurement problem? I don't think so. In fact, I think they don't even address the measurement problem.
Correct; they cannot, because they rely on a traditional interpretation.

But I can solve it, with their help: Their calculations are independent of any interpretation and hence apply also in the context of the thermal interpretation.
 
  • #574
A. Neumaier said:
Correct; they cannot, because they rely on a traditional interpretation.

But I do, with their help; their calculations are independent of any interpretation and hence apply also in the context of the thermal interpretation.
Fine. But if some approach could explain evolution towards a single fixed point of the density matrix, that would be a solution of the measurement problem. Hence their approach cannot explain evolution towards a single fixed point of the density matrix. So how exactly can the thermal interpretation do that?
 
  • #575
Demystifier said:
Fine. But if some approach could explain evolution towards a single fixed point of the density matrix, that would be a solution of the measurement problem. Hence their approach cannot explain evolution towards a single fixed point of the density matrix. So how exactly can the thermal interpretation do that?
Without saying what the beables are, there cannot be a solution of the measurement problem. That's the difference. The thermal interpretation has unique outcomes from the start and must only explain which one occurs. Note that this does not involve convergence of the density matrix; only the pointer reading matters, and in the thermal interpretation that is a q-expectation (not an eigenvalue)!
 
  • #576
I would like to see a worked-out "toy" example of how metastability leads to the selection of an eigenstate of the observable being measured. To me, it's very counter-intuitive. I actually feel that there should be a proof that it is impossible without assuming something beyond the minimal interpretation of quantum mechanics (which Bohmian mechanics does, as do the "objective" collapse models).
 
  • Like
Likes Demystifier
  • #577
A. Neumaier said:
Without saying what the beables are, there cannot be a solution of the measurement problem. That's the difference. The thermal interpretation has unique outcomes from the start and must only explain which one occurs. Note that this does not involve convergence of the density matrix; only the pointer reading matters, and in the thermal interpretation that is a q-expectation (not an eigenvalue)!
But if the density matrix does not converge, then how can the expected value, uniquely determined by the density matrix, converge? If the density matrix is
$$\rho=\frac{1}{2}\rho_1 + \frac{1}{2}\rho_2$$
then the expected value of the observable ##O## is
$$\langle O\rangle ={\rm Tr}O\rho=\frac{ \langle O\rangle_1 + \langle O\rangle_2 }{2}$$
where ##\langle O\rangle_k={\rm Tr}O\rho_k##. I don't see how ##\langle O\rangle## can converge to ##\langle O\rangle_1## or ##\langle O\rangle_2##.
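To make the linearity explicit, here is a minimal numpy sketch (my own illustration, with random placeholder density matrices and observable): for the mixture ##\rho=\frac12\rho_1+\frac12\rho_2##, ##{\rm Tr}\,O\rho## is exactly the average of the two individual q-expectations.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_density_matrix(n):
    """Random positive, trace-one matrix (a valid density matrix)."""
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    rho = a @ a.conj().T
    return rho / np.trace(rho)

n = 4
rho1, rho2 = random_density_matrix(n), random_density_matrix(n)
O = rng.normal(size=(n, n))
O = O + O.T                          # real symmetric, serves as the observable

rho = 0.5 * rho1 + 0.5 * rho2        # the mixture considered above
lhs = np.trace(O @ rho).real
rhs = 0.5 * np.trace(O @ rho1).real + 0.5 * np.trace(O @ rho2).real
print(lhs, rhs)                      # identical: the trace is linear in rho
```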
 
  • #578
Demystifier said:
But if the density matrix does not converge, then how can the expected value, uniquely determined by the density matrix, converge?
Some function of a matrix can converge even if the matrix itself does not converge, just as ##x_k=(k^{-1}-1)^k## does not converge but its square does.
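A quick numerical check of this toy sequence (just an illustration, not part of the argument):

```python
import math

# x_k = (1/k - 1)^k keeps alternating in sign, but its square approaches e^(-2)
for k in (10, 11, 1000, 1001, 100000, 100001):
    x = (1.0 / k - 1.0) ** k
    print(k, x, x * x)
print("e^-2 =", math.exp(-2))
```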
Demystifier said:
If the density matrix is
$$\rho=\frac{1}{2}\rho_1 + \frac{1}{2}\rho_2$$
then the expected value of the observable ##O## is
$$\langle O\rangle ={\rm Tr}O\rho=\frac{ \langle O\rangle_1 + \langle O\rangle_2 }{2}$$
where ##\langle O\rangle_k={\rm Tr}O\rho_k##. I don't see how ##\langle O\rangle## can converge to ##\langle O\rangle_1## or ##\langle O\rangle_2##.
This state is only an average state. The true reduced state satisfies a nonlinear stochastic dynamics under which it is unstable and decays after tiny random displacements. Averaging never preserves a nonlinear dynamics.
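Here is a tiny sketch of that last point (my own toy illustration, with an arbitrary cubic nonlinearity, not the actual reduced dynamics): for a nonlinear flow, evolving the average of two states is not the same as averaging the two evolved states.

```python
import numpy as np

def evolve(x0, t):
    """Exact solution of the nonlinear flow dx/dt = -x**3 with x(0) = x0."""
    return x0 / np.sqrt(1.0 + 2.0 * x0**2 * t)

t = 5.0
x_a, x_b = 0.0, 2.0
average_then_evolve = evolve(0.5 * (x_a + x_b), t)             # evolve the averaged state
evolve_then_average = 0.5 * (evolve(x_a, t) + evolve(x_b, t))  # average the evolved states
print(average_then_evolve, evolve_then_average)                # about 0.30 vs 0.16: they differ
```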
 
  • Like
Likes Auto-Didact
  • #579
stevendaryl said:
I would like to see a worked-out "toy" example of how metastability leads to the selection of an eigenstate of the observable being measured. To me, it's very counter-intuitive.
I'd like to see such an example, too. But this stuff is quite technical and not easy to simplify. In the thermal interpretation, no eigenstate is selected, only one of two values for the q-expectation of the pointer variable. Such two-valuedness is what generically happens when perturbing a metastable state in a double-well potential. Instability in more complex systems is similar, though the details are more involved. But detectors are quite special systems, created to produce outcomes of a certain kind.

stevendaryl said:
I actually feel that there should be a proof that it is impossible without assuming something beyond the minimal interpretation of quantum mechanics (which Bohmian mechanics does, as do the "objective" collapse models).
I also think that one needs to assume beables of some sort to get definite results. Both Bohmian mechanics and the thermal interpretation introduce such beables, but in quite different ways.
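For what it's worth, here is a rough "toy" sketch in the spirit of stevendaryl's request (my own construction, not taken from the thermal-interpretation papers): an overdamped pointer variable in a double-well potential, started at the unstable midpoint and kicked by tiny noise, ends each run near one of two discrete values.

```python
import numpy as np

def run(seed, steps=20000, dt=1e-3, noise=1e-3):
    """One noisy run of an overdamped pointer in V(x) = (x**2 - 1)**2 / 4."""
    rng = np.random.default_rng(seed)
    x = 0.0                               # start at the unstable midpoint
    for _ in range(steps):
        drift = -x * (x**2 - 1.0)         # -dV/dx
        x += drift * dt + noise * np.sqrt(dt) * rng.normal()
    return x

print([round(run(seed), 3) for seed in range(8)])
# each run ends near +1.0 or -1.0; the "reading" is discrete although the dynamics is continuous
```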
 
  • #580
A. Neumaier said:
The true reduced state satisfies a nonlinear stochastic dynamics under which it is unstable and decays after tiny random displacements.
If that's true, then why can't it solve the measurement problem by itself?
 
  • Like
Likes Auto-Didact
  • #581
A. Neumaier said:
Some function of a matrix can converge even if the matrix itself does not converge, just as ##x_k=(k^{-1}-1)^k## does not converge but its square does.
But expected values (that is, the beables in the thermal interpretation) are linear in the density matrix.
 
  • #582
Demystifier said:
If that's true, then why can't it solve the measurement problem by itself?
Because without beables there is no solution of the measurement problem. The thermal interpretation provides intuitive beables.
 
  • #583
Demystifier said:
But expected values (that is, the beables in the thermal interpretation) are linear in the density matrix.
Yes, but the reduced dynamics is nonlinear in the density operator. Thus there is no reason to consider your particular mixture; it is an artifact of ignoring the stochasticity in ##\rho##.
 
  • #584
A. Neumaier said:
Yes, but the reduced dynamics is nonlinear in the density operator. Thus there is no reason to consider your particular mixture; it is an artifact of ignoring the stochasticity in ##\rho##.
I have a proof that you are wrong, which I will present in a separate thread.
 
  • #585
Arnold, in III.4.2 you say:

"These other variables therefore become hidden variables that would determine the stochastic elements in the reduced stochastic description, or the prediction errors in the reduced deterministic description. The hidden variables describe the unmodeled environment associated with the reduced description.6 Note that the same situation in the reduced description corresponds to a multitude of situations of the detailed description, hence each of its realizations belongs to different values of the hidden variables (the q-expectations in the environment), slightly causing the realizations to differ. Thus any coarse-graining results in small prediction errors, which usually consist of neglecting experimentally inaccessible high frequency effects. These uncontrollable errors are induced by the variables hidden in the environment and introduce a stochastic element in the relation to experiment even when the coarse-grained description is deterministic. The thermal interpretation claims that this influences the results enough to cause all randomness in quantum physics, so that there is no need for intrinsic probability as in traditional interpretations of quantum mechanics."

Bell's theorem is understood as constraining these deterministic hidden variables to exhibit either parameter dependence or source dependence. The former type are non-local HVs in the Bohmian fashion, whereas the latter are local but superdeterministic or "conspiratorial" HVs. Which type of hidden variables are you contemplating here? Or do you propose a way out of this choice?
 
  • #586
charters said:
Which type of hidden variables are you contemplating here?
I specified precisely what my hidden variables are. I haven't tried to classify them in terms of the notions you mention. Probably any deterministic interpretation with a wholistic dynamics for the universe looks conspiratorial, but maybe the technical meaning of this term is different.
 
  • #587
A. Neumaier said:
I specified precisely what my hidden variables are. I haven't tried to classify them in terms of the notions you mention. Probably any deterministic interpretation with a wholistic dynamics for the universe looks conspiratorial, but maybe the technical meaning of this term is different.

I think you do need to think about this issue more closely.

If you take the route of parameter dependence, you will need to address 1) the preferred-foliation problem that is familiar to the Bohmians but arises for any interpretation with this HV approach, and 2) how you supply the necessary non-local corrections to local subsystems, which the Bohmians do through their ontic pilot wave but which I don't see how you achieve.

If you take the source-dependent, superdeterminism/conspiracy route, you need to address the fine-tuning concerns. In these interpretations, the validity of standard quantum theory is an accidental coincidence of having exactly the right initial conditions for the HVs, so that the diachronic probabilities of normal quantum theory are produced thanks to this luck. I can see you had a long thread here about superdeterminism and fine tuning a couple of years ago, and though it doesn't look like it was entirely well focused, I understand if you don't want to rehash that.

Regardless, I do think you would benefit from speaking more directly on where you stand on this in the papers, as it is one of the basic frameworks for how folks mentally categorize interpretations, and doing so will make your ideas easier for readers to understand and place in the constellation of pre-existing approaches.
 
  • #588
charters said:
about superdeterminism
If superdeterminism means that everything is determined by the state of the universe in the Heisenberg picture, then the TI is superdeterministic. I don't see fine-tuning as a problem: the universe is what it is; we need to describe it, not explain why it is the way it is. Moreover, most of what happens in the solar system is fairly independent of the details of the state of the universe; fine-tuning matters only for the analysis of systems fine-tuned by human preparation, such as long-distance entanglement experiments.

I leave it to others to classify the TI. @DarMM recently gave a classification into 5 categories, and he placed the TI in the first one, together with Bohmian mechanics.
 
  • #589
charters said:
I do think you would benefit from speaking more directly on where you stand on this in the papers, as it is one of the basic frameworks for how folks mentally categorize interpretations
A. Neumaier said:
I leave it to others to classify the TI. DarMM recently gave a classification into 5 categories, and he placed the TI in the first one, together with Bohmian mechanics.
Actually into 6, here.
DarMM said:
Category 1. Though I should rephrase it possibly.
How to classify is clearly researcher-dependent...
 
  • #590
vanhees71 said:
it's still not clarified what the interpretation of the "thermal interpretation" really is (you only told us what it is not ;-)).
You don't hear any of the positive statements about the TI.

If you compare cross sections to theory, you compare q-expectations of the S-matrix, not single statistical events. If you compare spectra with experiments, you compare q-expectations of the spectral density functions, not single statistical events. If you compare quantum thermodynamic predictions with experiments, you compare q-expectations of internal energy, mass, etc., not single statistical events. The thermal interpretation talks about what is actually compared, and thus gives primary (beable) status to these q-expectations, since these are directly related to reproducible (and publishable) experimental results.

You (consistent with the tradition) only talk differently about this, giving primary status instead to the eigenvalues (which have only a statistical meaning), thus creating the illusion of the need for a statistical interpretation.

What do you expect of different interpretations if not that they talk differently about the same theory and the same experiments? If the talk is the same, the interpretation is the same. If the interpretation is different, the talk is different.

There is no interpretation of the TI in terms of your statistical interpretation, and you seem to be blind to any alternative interpretation.
 
  • Like
Likes Auto-Didact and dextercioby
  • #591
vanhees71 said:
How can ##\rho## (assuming it's what's called the statistical operator in the standard interpretation) be a "beable", if it depends on the picture of time evolution chosen? The same holds for operators representing observables.
I meant to say that, in any fixed picture, ##\rho## is a beable. In the thread where you posted this, we were silently using the Schrödinger picture. Picture changes are like coordinate changes.

The situation is the same as when considering a position vector as a beable of a classical system, by fixing the coordinate system.

vanhees71 said:
What's a physical quantity (...) are
$$P(t,a|\rho)=\sum_{\beta} \langle t, a,\beta|\rho(t)|t,a,\beta \rangle,$$
where the ##|t,a,\beta\rangle## are the eigenvectors of ##\hat{A}## and ##\rho(t)## is the statistical operator, evolving in time according to the chosen picture of time evolution. In the standard minimal interpretation, ##P(t,a|\rho)## is the probability for obtaining the value ##a## when measuring the observable ##A## precisely at time ##t##.
Yes, this physical quantity is the q-observable ##P(t,a|\rho)=\langle B\rangle##, where
$$B=\sum_{\beta} |t,a,\beta \rangle\langle t, a,\beta|$$
Thus you agree that at least certain q-observables are physical quantities, and we are getting closer.
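To make the identification concrete, here is a small numerical illustration (my own, with an arbitrary 3-dimensional example and a doubly degenerate eigenvalue): the Born probability ##\sum_\beta\langle a,\beta|\rho|a,\beta\rangle## is exactly the q-expectation ##{\rm Tr}\,\rho B## of the projector onto that eigenspace.

```python
import numpy as np

rng = np.random.default_rng(1)

# arbitrary density matrix on a 3-dimensional Hilbert space
m = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
rho = m @ m.conj().T
rho /= np.trace(rho)

# observable A = diag(1, 1, 2) with a doubly degenerate eigenvalue a = 1
vecs = np.eye(3)[:, :2]                  # eigenvectors |a,beta> for a = 1
B = vecs @ vecs.conj().T                 # projector onto that eigenspace

born = sum(v.conj() @ rho @ v for v in vecs.T).real   # sum_beta <a,beta|rho|a,beta>
q_expectation = np.trace(rho @ B).real                 # Tr(rho B) = <B>
print(born, q_expectation)               # identical up to rounding
```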

vanhees71 said:
Now, before one can discuss or even prove anything concerning an interpretation, one must define what this expression means in the interpretation. I still didn't get how this quantity is interpreted in the thermal interpretation, because you forbid it to be interpreted as probabilities.
I do not forbid it; I only remove it from the foundations, and allow q-expectations (rather than eigenvalues) to be interpreted as the true properties (beables). See the previous post #590.

In cases where someone actually performs many microscopic measurements of ##A## in the textbook sense, this q-expectation has indeed the statistical meaning you describe.

But there are other q-observables associated with macroscopic objects (all properties considered in thermodynamics; e.g., the mass of a particular brick of iron) which can be measured by actually performing only a single measurement, and measurement statistics over single events (or over unperformed measurements) is meaningless. The thermal interpretation still applies, since it is independent of statistics.

vanhees71 said:
On the other hand, I think it's pretty safe to say the universe, on a large space-time scale, is close to local thermal equilibrium, as defined in standard coordinates of the FLRW metric, where the CMBR is in local thermal equilibrium up to tiny fluctuations of the relative order of ##10^{-5}##.
I agree. This implies that the exact state of the universe is ##\rho=e^{-S/k_B}##, where the entropy operator ##S## of the universe is approximately given by an integral over the energy density operator and particle density operators, with suitable weights (intensive fields). The coarse-graining inherent in the neglect of field products in an expansion of ##S## into fields makes ##S## exactly equal to such an expression and defines exact local equilibrium as an approximate state of the universe.

Thus the state of the universe is fairly well, but not in all details, specified by our current knowledge.
 
  • #592
Again, my first objection is that an element of the formalism that is description-dependent (as the statistical operator and the observable operators in QT depend on the arbitrary choice of the picture of time evolution, the electromagnetic potentials on the choice of gauge, coordinates in classical mechanics, etc.) cannot describe something in the real world.

My second question is also not answered, because you simply say again that q-expectations are "physical quantities"; however, without an interpretation, i.e., a way to measure them, that's an empty phrase. Born's description is clear: he says it's the probability of finding the possible value of the observable ##A## when measuring it precisely, and that implies that you can give a measurement procedure for the observable and that you can repeat the experiment as often as you like to test the prediction for the probabilities. This leads inevitably to the ensemble interpretation. Of course, it's formally the expectation value of the observable described by the projector onto the eigenspace of eigenvalue ##a## of the operator ##\hat{A}##. Yet you forbid me to use this usual probabilistic meaning of "expectation value" and rename it to "q-expectation value", without explaining what this means in the lab if not an expectation value in the sense of probability theory.

Last but not least, entropy is not an observable, and there's no operator for it. It's just defined (based on information theory, which you don't allow in your thermal interpretation either, because this information-theoretical definition of entropy is also based on the probabilistic meaning of the quantum state) as ##S=-k_{\text{B}} \mathrm{Tr} \hat{\rho} \ln \hat{\rho}##.

On the other hand, you argue within this information-theoretical paradigm. No matter how else you redefine the meaning of entropy while denying the probability-theoretical foundations, this clearly shows that you never refer to the "state of the entire universe" but at most to the "state of the observable part of the universe", plus the assumption of the cosmological principle, i.e., the assumption that our neighborhood is not special in any sense and thus reflects how the part of the universe causally connected with our observable neighborhood should look under this assumption. Well, and now we are completely lost in metaphysics and philosophy ;-))).
 
  • #593
vanhees71 said:
Last but not least, entropy is not an observable, and there's no operator for it. It's just defined (based on information theory, which you don't allow in your thermal interpretation either, because this information-theoretical definition of entropy is also based on the probabilistic meaning of the quantum state) as ##S=-k_{\text{B}} \mathrm{Tr} \hat{\rho} \ln \hat{\rho}##.
For a density operator ##\rho## with positive spectrum, ##S:=-k_B\log\rho## is a well-defined operator, and I can give it any name I like. I call it the entropy operator, since its q-expectation is your entropy. In this way, the entropy operator is well-defined, and its q-expectation agrees in equilibrium with the observable thermodynamic entropy, just as the q-expectation of the Hamiltonian agrees in equilibrium with the observable thermodynamic internal energy. Thus everything is fully consistent.

Nothing information theoretic is involved, unless you read it into the formulas.
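A quick numerical sanity check of this (my own sketch, with ##k_B=1## and a random density matrix): the q-expectation of the entropy operator ##S=-\log\rho## reproduces the usual value ##-{\rm Tr}\,\rho\ln\rho##.

```python
import numpy as np

rng = np.random.default_rng(2)
m = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
rho = m @ m.conj().T
rho /= np.trace(rho)                        # positive spectrum, trace one

w, V = np.linalg.eigh(rho)                  # spectral decomposition of rho
S = -(V * np.log(w)) @ V.conj().T           # entropy operator S = -log(rho), k_B = 1
q_expectation_of_S = np.trace(rho @ S).real

von_neumann = -np.sum(w * np.log(w))        # the standard -Tr(rho log rho)
print(q_expectation_of_S, von_neumann)      # agree up to numerical error
```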
 
  • Like
Likes Auto-Didact and Demystifier
  • #594
ftr said:
Arnold, I just saw this; does it have any relation to your interpretation, since it also talks about a thermal interpretation?
I checked it; there is no relation at all. Your reference is instead about the interpretation of certain states as thermal states.
 
  • #595
vanhees71 said:
On the other hand, you argue within this information-theoretical paradigm.
No. The relevant state is an objective state of the full universe, independent of anyone's knowledge or even its knowability. Only its approximation by something explicit is subjective. But the same holds for the state of a Laplacian universe. Any bounded subsystem of it can know only a very limited part of this state.
 
  • #596
vanhees71 said:
without explaining what this means in the lab if not an expectation value in the sense of probability theory.
I granted your interpretation in the important case where you actually make a large number of experiments, since in this case the statistical interpretation follows from the thermal interpretation, as I explained in detail in Section 3 of Part II. However, there are many cases where one never actually measures more than once, and these have a different interpretation, since statistics is mute about such instances.
 
  • #597
A. Neumaier said:
For a density operator ##\rho## with positive spectrum, ##S:=-k_B\log\rho## is a well-defined operator, and I can give it any name I like. I call it the entropy operator, since its q-expectation is your entropy. In this way, the entropy operator is well-defined, and its q-expectation agrees in equilibrium with the observable thermodynamic entropy, just as the q-expectation of the Hamiltonian agrees in equilibrium with the observable thermodynamic internal energy. Thus everything is fully consistent.

Nothing information theoretic is involved, unless you read it into the formulas.
This is not an operator representing an observable, because its time evolution in a general picture is not given by the time-evolution operator for an observable but by that for a state.
 
  • Like
Likes Demystifier
  • #598
A. Neumaier said:
I granted your interpretation in the important case where you actually make a large number of experiments, since in this case the statistical interpretation follows from the thermal interpretation, as I explained in detail in Section 3 of Part II. However, there are many cases where one never actually measures more than once, and these have a different interpretation, since statistics is mute about such instances.
We argue in circles :-(. If you never actually measure more than once, the expectation value is provided by the measurement device. That's certainly the case for any measurement concerning a system where classical (i.e., non-quantum) physics is a good approximation. E.g., measuring the length of the edge of my desk with an ordinary meter stick provides such a coarse-grained observable, namely the "length of my table".

I also don't understand why you refuse to define your interpretation but insist on providing a new interpretation. It's a contradictio in adjecto!
 
  • #599
vanhees71 said:
This is not an operator representing an observable, because its time evolution in a general picture is not given by the time-evolution operator for an observable but by that for a state.
But this is the operator ##S## I was talking about in post #591, and hence it gives meaning to my comments about the state of the universe. Its transformation behavior is the same as that of the density operator, which is adequate for this purpose.
vanhees71 said:
If you never actually measure more than once, the expectation value is provided by the measurement device.
Well, this is why the q-expectation is measurable. But it is not an expectation value in the sense of Born's rule, which is about actual measurements and not about imagined ones.
vanhees71 said:
measuring the length of the edge of my desk with an ordinary meter stick provides such a coarse-grained observable, namely the "length of my table".
And why is this an expectation value?? Of which operator?
vanhees71 said:
Interpretation is about the connection of the formal entities of the theory (for QT the Hilbert space, the statistical operators, and the operators representing observables) with physics.
vanhees71 said:
I also don't understand why you refuse to define your interpretation
Whereas I don't understand why you don't accept my definition as a definition; it satisfies your quoted requirement, and others had no problem with it. You didn't say why my post #479 is not a sufficient interpretation: it refers to plenty of connections between the formal entities of quantum theory and experiment (which surely is physics).

It only interprets it differently from how you want to have it interpreted. This is why it is a different interpretation.
 
  • #600
vanhees71 said:
This is not an operator representing an observable, because its time evolution in a general picture is not given by the time-evolution operator for an observable but by that for a state.
Yes, physicists (including myself) often forget that the von Neumann equation
$$\dot{\rho}=-i[H,\rho]$$
and the Heisenberg equation
$$\dot{O}=i[H,O]$$
have opposite signs and are not simultaneously valid. One holds in the Schrödinger picture and the other in the Heisenberg picture.
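A small numerical check of the opposite signs (my own illustration, using a finite-difference derivative and random Hermitian matrices): in the Schrödinger picture ##\rho(t)=U\rho U^\dagger## obeys ##\dot\rho=-i[H,\rho]##, while in the Heisenberg picture ##O(t)=U^\dagger O U## obeys ##\dot O=+i[H,O]##.

```python
import numpy as np

rng = np.random.default_rng(3)

def hermitian(n):
    m = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    return (m + m.conj().T) / 2

H, O = hermitian(3), hermitian(3)
m = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
rho = m @ m.conj().T
rho /= np.trace(rho)

dt = 1e-6
w, V = np.linalg.eigh(H)
U = (V * np.exp(-1j * w * dt)) @ V.conj().T            # U = exp(-i H dt)

drho_dt = (U @ rho @ U.conj().T - rho) / dt            # Schroedinger picture: state evolves
dO_dt = (U.conj().T @ O @ U - O) / dt                  # Heisenberg picture: observable evolves

print(np.allclose(drho_dt, -1j * (H @ rho - rho @ H), atol=1e-4))  # d(rho)/dt = -i[H, rho]
print(np.allclose(dO_dt,   1j * (H @ O - O @ H),     atol=1e-4))   # dO/dt     = +i[H, O]
```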
 
  • Like
Likes julcab12
