The thermal interpretation of quantum physics

  • #301
vanhees71 said:
here the pointer variable is in the very particle itself, namely its position.
No. It is the center of mass of the positions of a huge number of silver atoms.
vanhees71 said:
It's of course a challenge, but I don't think that there are any principle problems in understanding macroscopic measurement devices (as in this example the screen used to collect the Ag atoms) quantum theoretical since measurement devices are just macroscopic physical systems as anything around us, and these are quite well understood in terms of quantum many-body theory.
There is a problem of principle, namely precisely the measurement problem.

You sweep this problem under the carpet by treating the detectors in a roundabout way, without restricting your interpretation to what is recorded in your postulates.
 
Last edited:
  • #302
A. Neumaier said:
The only slightly mysterious thing is why Alice can predict Bob's measurement. Here I don't have a full explanation, but only arguments that show that nothing goes wrong with relativistic causality.

I would say that that is the central mystery of the EPR experiment.
 
  • #303
stevendaryl said:
I would say that that is the central mystery of the EPR experiment.
Yes, and the thermal interpretation does not solve it, though it makes it (at least for me) less of a mystery.

But the thermal interpretation removes the contradictions that appear when the standard interpretations are applied to the problem of measurement itself. This has become possible by decoupling measurement issues from ontological issues.
 
  • Like
Likes Jimster41 and stevendaryl
  • #304
A. Neumaier said:
But why do you demand more for my calculational tools, the q-expectations? Justify them exactly in the same way as you justify your definition of Green's function, and you'll understand me. I only claim that some of the q-expectations give predictions for actual observations, and how these are to be interpreted is fully specified in my posts #263, #266, and #278 (and elsewhere).
I'm obviously still not able to make my point. Perhaps I misunderstand the entire motivation of what you are doing with this project, but you call it "Thermal Interpretation". This I understand as another attempt to provide a different interpretation of the formalism. I've no problem with your math at all (although I don't claim that I understand all the subtle details, since I'm a physicist, not a mathematician). I still do not see what the interpretation is, i.e., what the relation of your q-expectation values to experimental observables in the lab is, and that's what "interpretation" means in this context. Despite your claim, you don't specify precisely this relation. You simply forbid the use of the usual statistical interpretation but don't say what to use instead.

Maybe it's also difficult for me to understand your arguments in this direction, because every physicist is trained from day 1 to use statistical methods to evaluate data. Even more, there is no other way to treat empirical data objectively than statistically to begin with. Of course, there's more to error analysis and error estimates, namely what's called systematic errors, but that's not the issue here, because that's anyway a very specific task for any individual experiment. Interpretation, as the physics part of a mathematical theory or model, doesn't deal with this, but with the relation between the mathematical formalism and the observable values.

As I said repeatedly, this doesn't forbid you to use "auxiliary quantities" like a set of n-point functions of various kinds in QFT or the gauge-dependent potentials of electromagnetic or non-Abelian gauge fields, etc. These have nothing to do with interpretation anyway (it's also historically interesting that the Maxwellians struggled with interpretations using the potentials, but that was before the full meaning of gauge invariance had been understood through the works of Noether, Weyl, Kaluza, et al.). All you need to provide, if you want to give an interpretation, is the relation of what's meant to be the observable quantities in the formalism.

If I understand it right, you claim it's the q-expectation values, but this is a common misconception of some not too well thought-out introductory textbooks, leading to the common wrong interpretation of the Heisenberg uncertainty relation(s) as restrictions on the accuracy of measurements and the inevitable disturbance of the system by the measurements. Heisenberg himself fell into this misconception, but was corrected by Bohr immediately. It's also obvious, because the uncertainty relation refers (in the standard statistical interpretation) to statistical properties of observables of a system without even considering a specific measurement process at all.
 
  • #305
A. Neumaier said:
Yes, and the thermal interpretation does not solve it, though it makes it (at least for me) less of a mystery.

But the thermal interpretation removes the contradictions that appear when the standard interpretations are applied to the problem of measurement itself. This has become possible by decoupling measurement issues from ontological issues.
The minimal interpretation very simply solves it: It's about correlations, and these are imprinted by the preparation procedure (e.g., two polarization-entangled photons from the decay of a neutral pion, just to avoid the problem of the collective behavior of a crystal in parametric downconversion, although this doesn't make any difference).

Of course, the minimal interpretation also resolves these problems by decoupling them from any "ontological issues", since it interprets the quantum state as epistemic to begin with. That's not very surprising, since everything physics cares about is epistemic. Natural science is about what can be objectively observed and not about some metaphysical ontology!
 
  • #306
vanhees71 said:
I still do not see what the interpretation is, i.e., what the relation of your q-expectation values to experimental observables in the lab is, and that's what "interpretation" means in this context. Despite your claim, you don't specify precisely this relation
The q-expectation is what you're measuring. Let's make this much simpler. Take spin in the z-direction and let's say we have a state ##\rho## with (units with ##\hbar = 1##):
$$\langle S_z \rangle_\rho = -\frac{1}{4}$$

The Thermal Interpretation is saying that this is not an expectation, but an actual value. It is a property of the state with value ##-\frac{1}{4}##, just like charge for example.

However, when we measure it, due to the bistability of the measuring device we only get the outcomes ##-\frac{1}{2}## and ##\frac{1}{2}##, not the true value of ##-\frac{1}{4}##.

However, the true value will show up in the fact that ##-\frac{1}{2}## occurs more often, and eventually, by gathering enough measurement samples, you can reconstruct the ##-\frac{1}{4}## value, since it will show up as the sample mean.
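To see the arithmetic concretely: with outcomes ##\pm\frac{1}{2}## and Born weights ##p_{+}=\frac{1}{4}##, ##p_{-}=\frac{3}{4}##, the weighted average is ##-\frac{1}{4}##. A minimal Python sketch of this reconstruction (the weights follow from the state above; the code itself is only an illustration, not taken from any referenced post):

```python
import numpy as np

rng = np.random.default_rng(0)

# Detector outcomes +1/2 and -1/2 with Born weights 1/4 and 3/4,
# chosen so that the q-expectation <S_z>_rho equals -1/4.
samples = rng.choice([0.5, -0.5], size=100_000, p=[0.25, 0.75])

# Each single reading is +-1/2; the sample mean reconstructs the
# thermal interpretation's "true value" -1/4.
print(samples.mean())  # ~ -0.25
```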
 
  • #307
A. Neumaier said:
I completely agree. But this doesn't invalidate what I said:

That there is a dot on the left, say, is a property of the plate that is computable from the state of the plate at the time of looking at it to record the event. For the state of the plate with a dot on the right is quite different from the state of the plate with a dot on the left or that without a dot. Hence the state differentiates between these possibilities, and hence allows one to determine which possibility happened.

The only slightly mysterious thing is why Alice can predict Bob's measurement. Here I don't have a full explanation, but only arguments that show that nothing goes wrong with relativistic causality.

Note also that the prediction comes out correct only when the entangled state is undisturbed and the detector is not switched off at the time the photon hits - things that Alice cannot check until having compared notes with Bob. Thus her prediction is not a certainty but a conditional prediction only.
But it should be stressed that the plate just fixes the observable "position" of the measured electron. If you consider the SGE, as usual, as a measurement of the corresponding spin component, the point is the (nearly) 100% entanglement between position (pointer variable) and spin component (measured observable). There's nothing mysterious in this; it can be predicted using quantum dynamics alone. There's no mystery about the SG experiment and the involved measurement devices (a magnet with the appropriately tuned magnetic fields and a screen to collect the particles).
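Written out schematically (my own notation, added only as an illustration of the preceding paragraph): the magnet's unitary evolution correlates the spin component with the position of the wave packet,

$$\bigl(\alpha\,|{+}\tfrac{1}{2}\rangle+\beta\,|{-}\tfrac{1}{2}\rangle\bigr)\otimes|\psi_0\rangle \;\longrightarrow\; \alpha\,|{+}\tfrac{1}{2}\rangle\otimes|\psi_{\uparrow}\rangle+\beta\,|{-}\tfrac{1}{2}\rangle\otimes|\psi_{\downarrow}\rangle,$$

where ##|\psi_{\uparrow}\rangle## and ##|\psi_{\downarrow}\rangle## are the upward- and downward-deflected packets; their (nearly) vanishing overlap is what is meant by the (nearly) 100% entanglement between position and spin component.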
 
  • #308
DarMM said:
The q-expectation is what you're measuring. Let's make this much simpler. Take spin in the z-direction and let's say we have a state ##\rho## with (units with ##\hbar = 1##):
$$\langle S_z \rangle_\rho = -\frac{1}{4}$$

The Thermal Interpretation is saying that this is not an expectation, but an actual value. It is a property of the state with value ##-\frac{1}{4}##, just like charge for example.

However, when we measure it, due to the bistability of the measuring device we only get the outcomes ##-\frac{1}{2}## and ##\frac{1}{2}##, not the true value of ##-\frac{1}{4}##.

However, the true value will show up in the fact that ##-\frac{1}{2}## occurs more often, and eventually, by gathering enough measurement samples, you can reconstruct the ##-\frac{1}{4}## value, since it will show up as the sample mean.
Well, this is a contradictio in adjecto. It's a very clear example of the misconception of taking the expectation values as the "true observables". What's measured are the values ##\pm 1/2##, not the expectation value, which may well be ##-1/4##.
 
  • #309
vanhees71 said:
Well, this is a contradictio in adjecto. The values of observables are what a (sufficiently precise) measurement device measures. It's a very clear example of the misconception of taking the expectation values as the "true observables". What's measured are the values ##\pm 1/2##, not the expectation value, which may well be ##-1/4##.
 
Last edited:
  • #310
Something strange happened with the formatting there!

vanhees71 said:
Well, this is a contradictio in adjecto. It's a very clear example of the misconception of taking the expectation values as the "true observables". What's measured are the values ##\pm 1/2##, not the expectation value, which may well be ##-1/4##.
The Thermal Interpretation though explains why ##\pm 1/2## are measured despite ##-1/4## being the true value.
 
  • Like
Likes A. Neumaier
  • #311
vanhees71 said:
every physicist is trained from day 1 to use statistical methods to evaluate data. Even more, there is no other way to treat empirical data objectively than statistically to begin with.
This is the usual classical statistics, used to improve the accuracy of raw measurements. This use of statistics is necessary and useful. Indeed, I even claim that all statistics ever needed in physics (quantum or not) is precisely of this kind.

But the use of statistics to justify theoretical calculations is not necessary, and the thermal interpretation removes the need for it in quantum mechanics, in the same way as modern physics removed the need for a mechanical interpretation of the Maxwell equations, deemed necessary in Maxwell's time.
vanhees71 said:
I still do not see what the interpretation is, i.e., what the relation of your q-expectation values to experimental observables in the lab is, and that's what "interpretation" means in this context.
Let me repeat from post #298 that some of the q-expectations give predictions for actual observations, and how these are to be interpreted is fully specified in my posts #263, #266, and #278 (and elsewhere). This is the relation you always asked for and never accepted.
vanhees71 said:
As I said repeatedly, this doesn't forbid you to use "auxiliary quantities" like a set of n-point functions of various kinds in QFT or the gauge-dependent potentials of electromagnetic or non-Abelian gauge fields, etc.
Then please take note that all q-expectations are auxiliary quantities of this kind, except for the ones for which I gave an experimental interpretation in my posts #263, #266, and #278. Please reply line by line to these posts, explaining why you think the statements there are not an answer to your request - for this is what I cannot understand!
vanhees71 said:
If I understand it right, you claim it's the q-expectation values
It is some q-expectations, primarily those discussed in the above three posts.
vanhees71 said:
but this is a common misconception of some not too well thought-out introductory textbooks, leading to the common wrong interpretation of the Heisenberg uncertainty relation(s) as restrictions on the accuracy of measurements and the inevitable disturbance of the system by the measurements.
In Subsection 2.5 of Part II, I discuss Heisenberg's uncertainty relation and what it means in the thermal interpretation. It is completely independent of measurement (not a restriction on the accuracy of measurements) and limits the accuracy to which a numerical concept is meaningful.

The position of a car - unlike that of its center of mass - is not determined to mm accuracy, and if you specify its position to that accuracy, the trailing digits are spurious. Similarly for position or spin in quantum mechanics.
 
Last edited:
  • #312
DarMM said:
Something strange happened with the formatting there!
This happens whenever you quote or reply to part of an answer, and this part contains formulas; these then appear as a mess, as in post #308 by @vanhees71. It has been a bug for a long time... (@Greg Bernhardt - can something be done about it?)

To get the formulas quoted correctly, always reply to the whole answer, and cut out the unwanted part (and/or insert additional QUOTE pairs).
 
Last edited:
  • Like
Likes DarMM
  • #313
vanhees71 said:
Well, this is a contradictio in adjecto. It's a very clear example for the misconception of taking the expectation values as the "true observables". What's measured are the values ##\pm 1/2## not the expectation value which well may be ##-1/4##.
Note that the uncertainty of the spin in such a state is quite large, which explains why the measured values ##\pm 1/2## can be so far off from the (according to the thermal interpretation) true value ##-1/4## of the spin.
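Concretely (my own arithmetic, added for reference): since ##S_z^2=\tfrac{1}{4}## identically for a spin-##\tfrac{1}{2}## system, the uncertainty in this state is

$$\sigma_{S_z}=\sqrt{\langle S_z^2\rangle-\langle S_z\rangle^2}=\sqrt{\tfrac{1}{4}-\tfrac{1}{16}}=\tfrac{\sqrt{3}}{4}\approx 0.43,$$

so the deviations ##\tfrac{1}{4}## and ##\tfrac{3}{4}## of the individual outcomes ##\mp\tfrac{1}{2}## from ##-\tfrac{1}{4}## are of the same order as this intrinsic uncertainty.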
 
  • #314
vanhees71 said:
There's no mystery about the SG experiment and the involved measurement devices
Yes; see post #135 for how this is viewed by the thermal interpretation.

But there is some mystery about how Nature manages to get the right correlations in experimental tests of long distance entanglement (the case under discussion here). Quantum mechanics predicts the right correlations, but does not explain so far how it is possible that these are actually obtained!
 
  • #315
vanhees71 said:
Natural science is about what can be objectively observed and not about some metaphysical ontology!
But isn't interpretation about some clarity, which only comes with some level of ontology that our brains can wrap around?
 
  • #316
A. Neumaier said:
This happens whenever you quote or reply to part of an answer, and this part contains formulas; these then appear as a mess, as in post #308 by @vanhees71. It has been a bug for a long time... (@Greg Bernhardt - can something be done about it?)
It's not a bug but a limitation. Equations are not simple text, but equation code that is formatted by CSS. When you highlight to quote or reply, it's a simple javascript browser function that just copies the highlighted text. If there is an equation, there is no function for the browser to also copy the underlying equation code. When there is an equation, always use the regular reply link.
 
  • #317
vanhees71 said:
But it should be stressed that the plate just fixes the observable "position" of the measured electron. If you consider the SGE, as usual, as a measurement of the corresponding spin component, the point is the (nearly) 100% entanglement between position (pointer variable) and spin component (measured observable). There's nothing mysterious in this; it can be predicted using quantum dynamics alone. There's no mystery about the SG experiment and the involved measurement devices (a magnet with the appropriately tuned magnetic fields and a screen to collect the particles).

The mystery, as I have said, is that quantum mechanics predicts that if the device interacting with a spin-up electron leads to a state in which there is a black spot on the left photographic plate, and if the device interacting with a spin-down electron leads to a state in which there is a spot on the right plate, then the device interacting with an electron that is in a superposition of spin-up and spin-down would lead to a superposition of the two macroscopic states. The additional claim that only one of the two outcomes obtains means that your "minimal interpretation" is either inconsistent (which I think it actually is) or that there is a mysterious macro/micro distinction such that macroscopic variables are treated differently than microscopic variables.
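Schematically (my own shorthand for the argument, not a quotation): if the measurement interaction is unitary and acts as

$$|{\uparrow}\rangle\,|D_0\rangle\to|{\uparrow}\rangle\,|D_{\rm left}\rangle,\qquad |{\downarrow}\rangle\,|D_0\rangle\to|{\downarrow}\rangle\,|D_{\rm right}\rangle,$$

then linearity forces

$$\bigl(\alpha|{\uparrow}\rangle+\beta|{\downarrow}\rangle\bigr)\,|D_0\rangle\to\alpha\,|{\uparrow}\rangle\,|D_{\rm left}\rangle+\beta\,|{\downarrow}\rangle\,|D_{\rm right}\rangle,$$

a superposition of two macroscopically distinct device states rather than a single definite outcome; reconciling this with the observed single outcome is the point at issue.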
 
  • Like
Likes eloheim
  • #318
There is no mystery. It's all completely explained by quantum dynamics. After the particle has run through the magnet providing an appropriate magnetic field, the position of the particle and the measured spin component are (nearly, i.e., with high precision but never really exactly) 100% entangled. There is no mystery in having two partial beams after the magnet, nor in the fact that each partial beam has (almost) determined values of the spin component in the expected direction. It's one of the very few examples which can be calculated exactly, even analytically.
 
  • #319
vanhees71 said:
There is no mystery. It's all completely explained by quantum dynamics. After the particle has run through the magnet providing an appropriate magnetic field, the position of the particle and the measured spin component are (nearly, i.e., with high precision but never really exactly) 100% entangled. There is no mystery in having two partial beams after the magnet, nor in the fact that each partial beam has (almost) determined values of the spin component in the expected direction. It's one of the very few examples which can be calculated exactly, even analytically.
By what mechanism do the spin directions agree no matter which direction is chosen? How do two randomly generated, spacelike-separated outcomes always agree?
 
  • #320
A. Neumaier said:
Yes; see post #135 for how this is viewed by the thermal interpretation.

But there is some mystery about how Nature manages to get the right correlations in experimental tests of long distance entanglement (the case under discussion here). Quantum mechanics predicts the right correlations, but does not explain so far how it is possible that these are actually obtained!
What is the mystery? There is none. QT predicts these correlations, and they are verified at an astonishing level of significance. I do not understand where the mystery should be. It's just the preparation of entangled states which leads to these correlations.

This is also a very nice example of what I'm still missing in your interpretation. For the statistical interpretation it's easy to understand how entangled states can be prepared. Let's take the simplest example. Just produce a neutral pion and wait until it decays. It's most easily described in its rest frame, i.e., you have a state of zero momentum (with some small uncertainty of course, because momentum eigenstates, i.e., plane waves, are not proper states but only generalized ones) and zero angular momentum. It decays with almost 100% probability into two photons, obeying the conservation laws, i.e., the photons have (within the momentum uncertainty) back-to-back momenta and total angular momentum 0. The latter property makes the polarizations in any direction 100% correlated but also maximally uncertain, i.e., the single-photon polarization state is that of completely unpolarized photons, but still there are the 100% correlations: measuring the polarization of each of the photons in the same direction, you always get opposite results, i.e., if one photon is horizontally polarized, the other one is necessarily vertically polarized and vice versa. It's completely undetermined which measurement outcome you'll get before having done the measurement, and you can do the measurements at places as far apart as you want (provided there's nothing charged around with which the photons may interact, which could change the state before it is measured), but the 100% correlation is imprinted by the preparation of these photons.
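To put the perfect anticorrelation just described into a small calculation (a sketch under my own conventions: I take the commonly quoted biphoton polarization state ##(|HV\rangle+|VH\rangle)/\sqrt{2}## and the H/V measurement basis; none of this is quoted from the post):

```python
import numpy as np

# Single-photon linear polarization basis
H = np.array([1.0, 0.0])
V = np.array([0.0, 1.0])

# Biphoton state (|HV> + |VH>)/sqrt(2): perpendicular linear polarizations
psi = (np.kron(H, V) + np.kron(V, H)) / np.sqrt(2)
rho = np.outer(psi, psi)

# Joint probabilities when both photons are measured in the H/V basis
for a, la in [(H, "H"), (V, "V")]:
    for b, lb in [(H, "H"), (V, "V")]:
        amp = np.kron(a, b) @ psi      # amplitude <ab|psi>
        print(la + lb, amp**2)
# HH 0.0, HV 0.5, VH 0.5, VV 0.0  -> 100% anticorrelated outcomes

# Reduced single-photon state: the maximally mixed I/2, i.e. unpolarized light
rho_1 = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
print(rho_1)
```

The joint probabilities come out as 0, 1/2, 1/2, 0, while each single photon by itself is completely unpolarized, exactly as described above.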

As I said, there's no mystery within the statistical minimal interpretation. Now two questions:

(a) How do you interpret this situation, i.e., the preparation of the biphoton state and subsequent measurement of the single-photon polarizations, within your thermal interpretation, if it's not allowed to interpret the formal mathematical manipulations statistically?

(b) What's the mystery due to the thermal interpretation, which is obviously absent in the minimal statistical interpretation?
 
  • #321
vanhees71 said:
What is the mystery? There is none. QT predicts these correlations, and they are verified at an astonishing level of significance. I do not understand where the mystery should be. It's just the preparation of entangled states which leads to these correlations.
Yes, but it doesn't explain the mechanism by which these correlations are satisfied. What you're saying is akin to saying that situation A:
"Stick of dynamite, a lit match, a wall"
is correlated with the following situation B at later times:
"Collapsed wall"
And that there is no mystery here, simply the correlation.

People want the mechanism, i.e. "heated nitroglycerine explodes and the explosion collapses the wall".
 
  • #322
vanhees71 said:
What is the mystery? There is none. QT predicts these correlations, and they are verified at an astonishing level of significance. I do not understand where the mystery should be. [...] the 100% correlation is imprinted by the preparation of these photons.

As I said, there's no mystery within the statistical minimal interpretation.

This has nothing at all to do with the statistical interpretation. Blind trust in the formalism of quantum mechanics removes the mystery no matter which interpretation.

You simply lost the sense of mystery because you trust the quantum mechanical calculations as confirmed by experiments and do not look for any understanding beyond that. Very well; then there is nothing more to discuss.
 
  • #323
vanhees71 said:
how do you interpret this situation, i.e., the preparation of the biphoton state and subsequent measurement of the single-photon polarizations
Since you never replied to my interpretation of the simpler Stern-Gerlach experiment in post #135 (given upon your request) I don't see any point in interpreting for you this more complex situation. You just ignore my interpretations and only rant against my dismissal of unobserved statistics.
 
  • #324
DarMM said:
Yes, but it doesn't explain the mechanism by which these correlations are satisfied. What you're saying is akin to saying that situation A:
"Stick of dynamite, a lit match, a wall"
is correlated with the following situation B at later times:
"Collapsed wall"
And that there is no mystery here, simply the correlation.

People want the mechanism, i.e. "heated nitroglycerine explodes and the explosion collapses the wall".
What's "the mechanism"? It's just a reproducible experience that "heated nitroglycerine explodes and the explosion collapses the wall". You can of course dig for "deeper explanations", i.e., try to derive this observation from more fundamental knowledge about the molecules making up nitroglycerine, but you'll always end at one point of the most fundamental knowledge, where there is no more deeper explanation of "a mechanism" simply because it's the most fundamental knowledge we currently have, and that's always based on the reproducibility of some observations. The natural sciences are all empirical sciences. That there are amazingly simple mathematically describable "fundamental laws" is one of these empirical findings. As Einstein famously said, the most incomprehensible fact about nature is that it is comprehensible in the sense that there are these amazingly simple laws which can be precisely formulated mathematically.
 
  • #325
vanhees71 said:
What's "the mechanism"? It's just a reproducible experience that "heated nitroglycerine explodes and the explosion collapses the wall".
Yes, but the point is that for many people the current account in QM is missing the "heated nitroglycerine explodes and the explosion collapses the wall" part.

This is even formally the case: the QM correlations violate Reichenbach's principle of common cause. There's no event you can condition on that removes the correlations, which is what is typically taken as an "explanation" in statistics.
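A small numerical illustration of that formal point (my own choice of state and angles, the standard CHSH setup, offered only as a sketch):

```python
import numpy as np

H = np.array([1.0, 0.0]); V = np.array([0.0, 1.0])
phi_plus = (np.kron(H, H) + np.kron(V, V)) / np.sqrt(2)  # Bell state

def analyzer(theta):
    """Polarization observable at angle theta: +1 (pass) / -1 (absorb)."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return 2.0 * np.outer(v, v) - np.eye(2)

def E(a, b):
    """Correlation <A(a) x B(b)> in the Bell state."""
    return phi_plus @ np.kron(analyzer(a), analyzer(b)) @ phi_plus

a1, a2 = 0.0, np.pi / 4              # Alice's two settings
b1, b2 = np.pi / 8, 3 * np.pi / 8    # Bob's two settings

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(S)  # ~2.83 > 2
```

Any model in which the outcomes factorize after conditioning on a common cause obeys ##|S|\le 2##, so the value ##2\sqrt{2}## is precisely the formal statement that no such conditioning event exists.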
 
  • #326
vanhees71 said:
As Einstein famously said, the most incomprehensible fact about nature is that it is comprehensible in the sense that there are these amazingly simple laws which can be precisely formulated mathematically.
I have to say I definitely don't belong to whatever elite club considers Quantum Yang-Mills theories "simple"!:wink:
 
  • #327
A. Neumaier said:
I would introduce quantum mechanics with the qubit, which is just 19th century optics. This produces the density operator, the Hilbert space, the special case of pure states, Born's rule (aka Malus' law), the Schrödinger equation, and the thermal interpretation - all in a very natural way.
That's precisely how I started my QM lecture for teacher students last semester. Of course, I used the statistical minimal interpretation as soon as it came to the case of strongly dimmed light, where single-photon events start to become visible (with the caveat that what is prepared are not single photons but low-intensity coherent states, and without being able, at this very introductory stage, to give a precise definition of coherent states).
A. Neumaier said:
To deepen the understanding, one can discuss classical mechanics in terms of the Lie algebra of phase space functions given by the negative Poisson bracket, and then restrict to a rigid rotor, described by an so(3) spanned by the generators of angular momentum. This example is the one given in the last two paragraphs of post #63, and also provides the Lie algebra for the qubit.

Next one shows that this Lie algebra is given by a scaled commutator. This generalizes and defines the Lie algebras that describe quantum mechanics. Working out the dynamics in terms of the q-expectations leads to the Ehrenfest equations. Then one can introduce the Heisenberg, Schrödinger, and interaction pictures and their dynamics.
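For reference, the formulas being alluded to, in standard notation (my own summary, not a quotation): with the Lie product written classically as the negative Poisson bracket and quantum mechanically as the scaled commutator,

$$A\,\angle\,B:=-\{A,B\}\quad\text{(classical)},\qquad A\,\angle\,B:=\frac{i}{\hbar}\,[A,B]\quad\text{(quantum)},$$

the angular momentum components satisfy ##J_i\,\angle\,J_j=-\epsilon_{ijk}J_k## in both cases, and the dynamics of q-expectations takes the Ehrenfest form

$$\frac{d}{dt}\langle A\rangle=\Bigl\langle\frac{i}{\hbar}[H,A]\Bigr\rangle=\langle H\,\angle\,A\rangle.$$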
That's one way to look at the formalism. I've no problems with that, but it does not provide an interpretation, as you promise by calling the whole endeavor a "thermal interpretation".
A. Neumaier said:
Then one has everything, without any difficult concepts beyond the Hilbert space and the trace, which appeared naturally. There is no need yet to mention eigenvalues and eigenvectors (these come when discussing stationary states), the subtle problems with self-adjointness (needed when discussing boundary conditions), and the spectral theorem (needed when defining the exponentials ##U(t)=e^{\pm itH}##). The latter two issues are completely absent as long as one works within finite-dimensional Hilbert spaces; so perhaps doing some quantum information theory initially makes sense.
Well, here I'm lost already. The very purpose of the above-given starting point with polarization measurements was to get to the eigenvectors and eigenvalues and Born's rule. I don't see how you can avoid eigenvectors and eigenvalues in the foundations. That's how everything, including expectation values and the quantum dynamics, is defined, providing the necessary minimal interpretation, giving the connection between what's observed in nature and the formalism.

The very fact that you can formulate the mathematical description of time evolution in different pictures shows that you need both the states and the eigenvectors to finally derive what's observable (in the standard interpretation, the probabilities for the outcomes of measurements, given a state in terms of a preparation). Neither the state (statistical operator) alone nor the eigenvectors (of operators representing observables) alone refer to any observable quantity within the standard statistical interpretation.
A. Neumaier said:
The calculations are of course identical, since calculations are not part of the interpretation.

But the interpretation of the calculation is different: In the thermal interpretation, the Ag field is concentrated along the beam emanating from the source, with a directional mass current. The beam is split by the magnetic field into two beams, and the amount of silver on the screen at the end measures the integrated beam intensity, the total transported mass. This is in complete analogy to the qubit treated in the above link. Particles need not be invoked.
You just use other words to describe what the minimal statistical interpretation also describes. The state describes a beam of Ag atoms. I don't know precisely how it's interpreted in your thermal interpretation, but in the minimal statistical interpretation it's clear: There is a cylinder-like region in space where you have a high probability of finding a silver atom with some momentum distributed around the cylinder axis, and these distributions are probability distributions within the statistical interpretation. What else are they in your thermal interpretation? Do you just ignore the atomistic nature of the Ag atoms and just interpret it as a classical density and velocity distribution? Wouldn't this be like the early interpretation by Schrödinger, which however is not consistent with the observation that single Ag atoms just make a single spot on a screen (as in Stern's and Gerlach's experiment) but do not give a smeared distribution, which only occurs after you accumulate very many Ag atoms? As I already stated before, interpreting the expectation values (also those of local quantities like charge, current, or energy densities within QFT) as the observables leads to contradictions in the very cases where QT really becomes important, namely whenever the atomistic nature of matter (as well as of radiation!) is resolved. I think the title + subtitle of Schwinger's QM book brings it to the point: "Quantum Mechanics - Symbolism of Atomic Measurements"!

Sorry for having overlooked this nice posting for so long.
 
  • #328
vanhees71 said:
There is no mystery. It's all completely explained by quantum dynamics. After the particle has run through the magnet providing an appropriate magnetic field, the position of the particle and the measured spin component are (nearly, i.e., with high precision but never really exactly) 100% entangled. There is no mystery in having two partial beams after the magnet, nor in the fact that each partial beam has (almost) determined values of the spin component in the expected direction. It's one of the very few examples which can be calculated exactly, even analytically.

I don't think you're addressing what I said. Quantum mechanics predicts that if you treat a measuring device as a quantum system (which you should), then it will not make a nondeterministic transition into one or the other pointer state. The nondeterminism implied by Born's rule only applies when you treat the measuring device as a classical system that can only be in one macroscopically distinguishable state.
 
  • Like
Likes dextercioby
  • #329
vanhees71 said:
The very purpose of the above-given starting point with polarization measurements was to get to the eigenvectors and eigenvalues and Born's rule. I don't see how you can avoid eigenvectors and eigenvalues in the foundations.
Stokes didn't need them, and neither did I in my account of the work of Malus and Stokes. Nevertheless, the whole phenomenology of a qubit was there.
vanhees71 said:
Neither the state (statistical operator) alone nor the eigenvectors (of operators representing observables) alone refer to any observable quantity within the standard statistical interpretation.
Only because you think again in terms of the statistical interpretation, which you want to teach your students. In contrast, I want to introduce the students to the thermal interpretation, where the true, approximately observable values are the Stokes parameters (and not any eigenvalues!), of which the erratic events on the screen give very poor but slightly significant approximations only, which become reproducible (and hence deserve to be called measurements) only after averaging over many events. In this case, one indeed gets a good approximation of some component of the Stokes vector, proving that the Stokes vector can be observed.
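To spell out the qubit-Stokes dictionary being invoked (standard formulas; the assignment of Pauli matrices to Stokes components is convention-dependent): writing ##\sigma_0=\mathbf 1## and ##\sigma_1,\sigma_2,\sigma_3## for the Pauli matrices, the ##2\times 2## coherence matrix of a polarization qubit is

$$\rho=\frac{1}{2}\sum_{k=0}^{3}S_k\,\sigma_k,\qquad S_k=\mathrm{tr}(\rho\,\sigma_k),$$

so the Stokes parameters ##S_k## are exactly q-expectations: ##S_0=\mathrm{tr}\,\rho## is the total intensity, and the intensity transmitted by a polarizer along ##|\phi\rangle##, namely ##\mathrm{tr}(\rho\,|\phi\rangle\langle\phi|)## (Malus' law), is a linear expression in the ##S_k## - no eigenvalues or eigenvectors enter.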
vanhees71 said:
I don't see how you can avoid eigenvectors and eigenvalues in the foundations.
But I did avoid them! Nowhere did any eigenvalue or eigenvector appear!
vanhees71 said:
The state describes a beam of Ag atoms. I don't know precisely how it's interpreted in your thermal interpretation, but in the minimal statistical interpretation it's clear: There is a cylinder-like region in space where you have a high probability of finding a silver atom with some momentum distributed around the cylinder axis, and these distributions are probability distributions within the statistical interpretation. What else are they in your thermal interpretation?
Something completely different, based on quantum fields rather than a particle picture; this makes the probabilistic interpretation irrelevant. The thermal interpretation dismisses the view that single events imply single particles. That's the whole purpose of the discussion in Section 3.4 of Part III, which shows that there are no convincing grounds (only historical ones) to do so. The thermal interpretation replaces this view by the intuition of fields probed by quantum buckets - see Post #272. The quantum buckets (aka bistable systems leading to single detection events) measure the rate of flow of the silver field, but at low rates only very coarsely.
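A toy version of this "quantum bucket" picture, as I read it (the Poisson model and all numbers are my own simplification, not taken from Part III): a detector that fires in discrete events at a rate set by the local beam intensity gives a very coarse reading per event, but recovers the intensity after many events.

```python
import numpy as np

rng = np.random.default_rng(1)

rate = 3.7   # flow rate of the field into the "bucket" (events per time window)

# One observation window yields a coarse integer count, rarely close to 3.7 ...
print(rng.poisson(rate))

# ... but averaging many windows reconstructs the rate, i.e. the beam intensity.
counts = rng.poisson(rate, size=10_000)
print(counts.mean())   # ~3.7
```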
vanhees71 said:
Do you just ignore the atomistic nature of the Ag atoms and just interpret it as a classical density and velocity distribution? Wouldn't this be like the early interpretation by Schrödinger, which however is not consistent with the observation that single Ag atoms just make a single spot on a screen
The resulting interpretation indeed resembles that of Schrödinger; see post #273, except that it takes a quantum field point of view and hence has access to beables describing correlations, which Schrödinger didn't consider - he wanted a description in fully classical terms.
vanhees71 said:
interpreting the expectation values (also those of local quantities like charge, current, or energy densities within QFT) as the observables leads to contradictions in the very cases where QT really becomes important, namely whenever the atomistic nature of matter (as well as of radiation!) is resolved.
With the thermal interpretation in place of the statistical interpretation, there is no longer a contradiction. The experiments that need statistics can all be explained in terms of the quantum bucket intuition, as in this example.
 
Last edited:
  • #330
stevendaryl said:
I don't think you're addressing what I said. Quantum mechanics predicts that if you treat a measuring device as a quantum system (which you should), then it will not make a nondeterministic transition into one or the other pointer state. The nondeterminism implied by Born's rule only applies when you treat the measuring device as a classical system that can only be in one macroscopically distinguishable state.
There is no nondeterministic transition in what I said. On the contrary, all is determined by unitary time evolution. The preparation of the spin component with an (almost exactly) determined value (##1/2## or ##-1/2##, nothing in between) by splitting the beam, leading to (almost) 100% position-spin-component entanglement, works through the unitary time evolution. Nowhere does one make any approximate classical description of the motion of the atom (which, by the way, would be a pretty good description in this case, but that's not the point here).
 
