Are there signs that any Quantum Interpretation can be proved or disproved?

  • #301
vanhees71 said:
What we observe are intensities at some time at some place (quantified by the energy density ##1/2(\vec{E}^2+\vec{B}^2)##).
This is not necessarily true, and not just because ##1/2(\vec{E}\cdot\vec{D}+\vec{B}\cdot\vec{H})## is a more appropriate expression for the energy density. If you put a CCD detector in the path of the light, the component of the Poynting vector perpendicular to the detector surface might be a more appropriate description for what you will observe. Or if you use a photoresist of a certain thickness with a given refractive index and absorption coefficient, then just multiplying the energy density with the absorption coefficient and integrating over the volume might not give you the actually absorbed energy. (But I would have to do the detailed computation again. I don't remember the exact details anymore. It was a complicated computation with a simple result. I think it was proportional to the energy density, just the constant of proportionality was slightly surprising.)
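As a toy check of how these quantities relate (my own sketch, not part of the original discussion): for a plane wave in vacuum, in Heaviside-Lorentz units with ##c=1##, the energy density ##1/2(\vec{E}^2+\vec{B}^2)## and the component of the Poynting vector through a surface perpendicular to the propagation direction coincide, so the two prescriptions only differ for oblique incidence or inside media.

```python
import numpy as np

# Plane wave in vacuum, Heaviside-Lorentz units, c = 1:
# E along x, B along y, propagation along z, sampled over one wavelength.
E0 = 2.0                                   # field amplitude (arbitrary)
z = np.linspace(0.0, 1.0, 1000, endpoint=False)
k = 2.0 * np.pi                            # wavelength = 1

Ex = E0 * np.cos(k * z)
By = E0 * np.cos(k * z)                    # |B| = |E| for a vacuum plane wave

u = 0.5 * (Ex**2 + By**2)                  # energy density (E^2 + B^2)/2
Sz = Ex * By                               # z-component of Poynting vector E x B

# Energy flux through a perpendicular surface equals c*u (c = 1 here) ...
assert np.allclose(Sz, u)
# ... and the cycle average recovers the familiar E0^2/2
assert np.isclose(u.mean(), E0**2 / 2)
```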
 
  • #302
In principle I share what also seems to be one of Neumaier's issues: how to physically motivate the ensemble that forms the basis for the probabilistic framework (whether quantum or classical). Even from my perspective, which prefers an agent perspective, this is central. But the imagine solution in qbism mutation are still different from neumaiers idea.

But when the premises of repeatability, information processing etc., needed to actually construct proper statistics, hold - which they do for many situations in a particle lab - then all is fine; but when they do not hold, the soundness of the quantum framework as it stands IMO fails. WHEN this fails seems to be, for example, when you consider quantum cosmology, but also POSSIBLY when one views macroscopic systems within the quantum framework.

I personally think, however, that we need a modification of the theory and not just a reinterpretation.

/Fredrik
 
  • #303
vanhees71 said:
But in this experiment there are many identically prepared systems using one and the same molecules in a trap. I don't see why an ensemble shouldn't be realized with one and the same system.
The latter gives an ensemble of ''many identically prepared systems'' only when you can prepare them identically! But the single ion in a trap is at each time in a different state - determined by the Schrödinger equation for trap and measurement device. Thus its time snapshots are ''many nonidentically prepared systems'', for which your postulates say nothing at all!
vanhees71 said:
I also don't understand why you say the ion is not prepared.
I only said that the ion at different times is not identically prepared! Of course it is prepared, but at different times it is prepared in different states!
vanhees71 said:
it's still the same probabilistic meaning as in standard minimally interpreted QT, but then I don't see where's the difference between his and the standard QT interpretation.
It's the same in those cases where it can be derived, namely when you actually have many measurements on identically prepared systems.

It is not the same otherwise, since it also allows one to derive testable statements for non-identically prepared systems and for single systems, where your interpretation is too minimal to be applicable!
vanhees71 said:
Neumaier seems to believe these cannot be described within the standard minimal interpretation, but that's not right, because many people in this community of physicists work well with the standard QT
They work with standard QT - but not in the minimal interpretation but in the irrefutable handwaving interpretation, where any intuitive argument is sufficient if it leads to the desired result. Your minimal interpretation is a religion like the other interpretations you are so zealously fighting! In a paper by David Wallace,
the most prevalent handwaving interpretation and its relation to the measurement problem is described as follows:
David Wallace said:
Orthodox QM, I am suggesting, consists of shifting between two different ways of understanding the quantum state according to context: interpreting quantum mechanics realistically in contexts where interference matters, and probabilistically in contexts where it does not. Obviously this is conceptually unsatisfactory (at least on any remotely realist construal of QM) – it is more a description of a practice than it is a stable interpretation. […] The ad hoc, opportunistic approach that physics takes to the interpretation of the quantum state, and the lack, in physical practice, of a clear and unequivocal understanding of the state – this is the quantum measurement problem.
WernerQH said:
As Willard Gibbs has shown, it is sufficient for our calculations that we can imagine it.
The strange thing is only that nature behaves according to our calculations though these are only about imagined things! This requires an explanation!
vanhees71 said:
According to quantum theory the properties of an ensemble are described by the quantum state, represented by the statistical operator. There is nothing conflicting here. It uniquely tells you the probabilities to find one of the possible values for any observable when you measure them.
There are two approaches to the same mathematical calculus:

  1. Expectation via probability: This is the common tradition since 1933 when Kolmogorov showed how to base probability rigorously on measure theory. But Kolmogorov's approach does not work for quantum probabilities, which creates foundational problems.
  2. Probability via expectation: This was the approach of the founders of probability theory, who wanted to know the expected value of games and introduced probabilities as a way of computing these expectations. It fell out of favor only with Kolmogorov's successful axiomatization of probability. However, in 1970, Peter Whittle wrote a book called ''Probability via expectation'' (the third edition from 2012 is still in print), an axiomatization of expectation in which probabilities are a derived concept and Kolmogorov's axioms can be deduced for them.
From the preface of the first edition:
the principal novelty of the present treatment is that the theory is based on an axiomatization of the concept of expectation, rather than that of a probability measure.
Thus it is now a choice of preference where to start. Probability via expectation is free of measure theory and therefore much more accessible, and as the last chapter in the 2012 edition of Whittle's book shows, it naturally accommodates quantum physics - quite unlike Kolmogorov's approach.

My thermal interpretation views quantum mechanics strictly from the probability via expectation point of view and therefore recovers all traditional probabilistic aspects of quantum mechanics, while removing any trace of measurement dependence from the foundations.
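As a concrete illustration of the "probability via expectation" viewpoint (a minimal sketch of the general idea, not of Whittle's axioms or of the thermal interpretation's specific formalism): take the expectation functional ##\langle A\rangle = \mathrm{Tr}(\rho A)## as primitive; probabilities then arise as expectations of projectors, and Kolmogorov-style properties come out as consequences.

```python
import numpy as np

# Expectation as the primitive notion: E(A) = Tr(rho A)
def expectation(rho, A):
    return np.trace(rho @ A).real

# A qubit density matrix (any rho with rho >= 0 and Tr rho = 1 would do)
rho = np.array([[0.7, 0.2],
                [0.2, 0.3]])

# Projectors onto the eigenbasis of sigma_z
P0 = np.array([[1.0, 0.0], [0.0, 0.0]])
P1 = np.array([[0.0, 0.0], [0.0, 1.0]])

# Probabilities are *derived* quantities: p(k) = E(P_k)
p0 = expectation(rho, P0)   # 0.7
p1 = expectation(rho, P1)   # 0.3

# Kolmogorov-style properties emerge from the expectation calculus:
assert p0 >= 0 and p1 >= 0           # nonnegativity
assert np.isclose(p0 + p1, 1.0)      # normalization, since P0 + P1 = identity

# The expectation of an observable is recovered as sum of value * probability
sigma_z = P0 - P1
assert np.isclose(expectation(rho, sigma_z), (+1) * p0 + (-1) * p1)
```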

vanhees71 said:
One just has to accept that on a fundamental level the values of observables are indetermined
You 'just' accept it and stop asking further. But many physicists, including great men like 't Hooft and Weinberg, find this 'just' glossing over unexplained territory.

vanhees71 said:
vague is not clear to me.
What is vague in the statistical interpretation is why the measurement of a pointer (a macroscopic quantum system) should give information about the value of a microscopic variable entangled with it. This must be posited as an irreducible postulate in addition to your minimal postulates!
 
  • #304
Fra said:
But the imagine solution in qbism mutation are still different from neumaiers idea.
What do you mean by "imagine solution in qbism mutation"? Do you mean my short analogy was a misrepresentation of QBism? I was actually more worried that Neumaier would find it a misrepresentation of his views. All I wanted to highlight is that there are models (with all their associated structure) in his interpretation, but no agents. In QBism, on the other hand, agents play a prime role, and models are not mentioned explicitly, even though I admit that between the lines you could find that they are also part of the picture, as part of the tools an agent can use. So I guess you protest that my "In QBism, the agent uses QM as a cookbook to update his state." was a mutation of "According to QBism, quantum mechanics is a tool anyone can use to evaluate, on the basis of one’s past experience, one’s probabilistic expectations for one’s subsequent experience." Did I guess correctly?
 
  • #305
A. Neumaier said:
The strange thing is only that nature behaves according to our calculations though these are only about imagined things! This requires an explanation!
The explanation is quite simple: when Nature behaves differently we revise our theories. We celebrate the discovery of a new effect when Nature does not conform to our expectation.
 
  • #306
WernerQH said:
The explanation is quite simple: when Nature behaves differently we revise our theories. We celebrate the discovery of a new effect when Nature does not conform to our expectation.
This is an empty explanation, since it explains everything and nothing.
 
  • #307
A. Neumaier said:
This is an empty explanation, since it explains everything and nothing.
Think about it.
 
  • #308
WernerQH said:
The explanation is quite simple: when Nature behaves differently we revise our theories. We celebrate the discovery of a new effect when Nature does not conform to our expectation.
A. Neumaier said:
This is an empty explanation, since it explains everything and nothing.
WernerQH said:
Think about it.
It is a truism that can be applied to everything, no matter what it is, and hence is nothing more than an empty phrase.
 
  • #309
gentzen said:
What do you mean by "imagine solution in qbism mutation"? Do you mean my short analogy was a misrepresentation of qbism?
No, actually I have been offline for some weeks prior to this and didn't follow all the new posts; I just commented on a response to my old post.

By qbism mutation I simply mean that my own interpretation (which colours my comments, and goes hand in hand with thinking that we need a revision of the theory, not just a reinterpretation) is partly in the QBism direction, but a mutation/variant of common QBism. If I refrain from thinking about modifications, my other interpretation is close to the minimalist one. But the two interpretations have different purposes; the first one is more of a guiding principle as well.

/Fredrik
 
  • #310
gentzen said:
What do you mean by "imagine solution in qbism mutation"? Do you mean my short analogy was a misrepresentation of QBism? I was actually more worried that Neumaier would find it a misrepresentation of his views. All I wanted to highlight is that there are models (with all their associated structure) in his interpretation, but no agents. In QBism, on the other hand, agents play a prime role, and models are not mentioned explicitly, even though I admit that between the lines you could find that they are also part of the picture, as part of the tools an agent can use. So I guess you protest that my "In QBism, the agent uses QM as a cookbook to update his state." was a mutation of "According to QBism, quantum mechanics is a tool anyone can use to evaluate, on the basis of one’s past experience, one’s probabilistic expectations for one’s subsequent experience." Did I guess correctly?
In my own view, the microstructure of information processing going on in the agent supposedly REPLACES the ensemble fiction. So the inference machinery rests on an agent-subjective basis. And the challenge for me is rather to explain that the agents likely will interact in a way that they evolve into agreement, which can approximate observer equivalence.

This is why I keep thinking that the current formulation of QM corresponds to a dominant non-limiting "agent" that is essentially the whole environment, so that we could almost think of agents collecting scattering data from the black box, and NOTHING escapes its processing. Or that the whole environment of classical agents reaches an agreement. Then I think that picture can also be isomorphic to an ensemble view. But this link must be broken when the asymmetry does not hold, and the question is: how can we then understand this, and whatever replaces the ensemble and encodes the information about the system?

/Fredrik
 
  • #311
A. Neumaier said:
The latter gives an ensemble of ''many identically prepared systems'' only when you can prepare them identically! But the single ion in a trap is at each time in a different state - determined by the Schrödinger equation for trap and measurement device. Thus its time snapshots are ''many nonidentically prepared systems'', for which your postulates say nothing at all!

The Kolmogorov axioms apply to quantum-mechanical probabilities, of course, only for one given, really feasible experiment, not for all thinkable experiments. It's indeed very important to keep this in mind.

Further, I don't mind whether you derive the Kolmogorov axioms from some other axioms of probability theory. I don't argue about the mathematical foundations at all; that I leave to the mathematicians. What I want to understand is the physical interpretation of your new foundation (I'm not even sure whether it's really a new foundation or just a reformulation of standard QT).

Concerning the preparation of the single atom in the trap, I don't know what you mean. I'm not an expert in this physics, but what I understand from the quoted Nobel citation is that they have one (or a few) ions in a trap and irradiate it with some laser light (i.e., basically a classical em. wave). That's a clear preparation, and it's clearly described by standard quantum theory. What's measured is the distribution of many emitted photons collected over some time of irradiation, and the corresponding photon distribution can be compared to what standard quantum theory predicts. Obviously the agreement is very good.

On the other hand, how would you describe this example in your approach to QT? How do you interpret the observed photon distribution if not as a probability distribution for the detection of each single photon? Finally, why do you think this is a description of the situation superior to that of standard QT?
 
  • #312
gentzen said:
This is not necessarily true, and not just because ##1/2(\vec{E}\cdot\vec{D}+\vec{B}\cdot\vec{H})## is a more appropriate expression for the energy density. If you put a CCD detector in the path of the light, the component of the Poynting vector perpendicular to the detector surface might be a more appropriate description for what you will observe. Or if you use a photoresist of a certain thickness with a given refractive index and absorption coefficient, then just multiplying the energy density with the absorption coefficient and integrating over the volume might not give you the actually absorbed energy. (But I would have to do the detailed computation again. I don't remember the exact details anymore. It was a complicated computation with a simple result. I think it was proportional to the energy density, just the constant of proportionality was slightly surprising.)
Well, I'm working in Heaviside-Lorentz units, where in a vacuum (and I assume that we measure free photons, because there photons have a clear meaning) ##\vec{E}=\vec{D}## and ##\vec{B}=\vec{H}##. Also, it's easy to show that the photon-detection probability when using the photoelectric effect (e.g., with a photomultiplier or a CCD cam) is proportional to the energy density of the em. field. See, e.g., Garrison, Chiao, Quantum Optics.
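A minimal numerical illustration of that proportionality (my own toy model, not the derivation in Garrison and Chiao): if single-photon detection events are drawn with probability proportional to the local energy density of an interference pattern, the accumulated counts reproduce that pattern, which is how a measured photon distribution tests the predicted probability distribution.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-beam interference pattern on a screen: u(x) proportional to 1 + cos(x)
x = np.linspace(-np.pi, np.pi, 200)
u = 1.0 + np.cos(x)                  # energy density (arbitrary units)
p = u / u.sum()                      # detection probability per screen bin

# Draw single-photon detection events from that distribution
hits = rng.choice(len(x), size=200_000, p=p)
counts = np.bincount(hits, minlength=len(x))

# The empirical frequencies track the energy density profile
freq = counts / counts.sum()
assert np.max(np.abs(freq - p)) < 5e-3
```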
 
  • #313
Fra said:
By qbism mutation I simply mean that my own interpretation (which colours my comments, and goes hand in hand with thinking that we need a revision of the theory, not just a reinterpretation) is partly in the QBism direction, but a mutation/variant of common QBism. If I refrain from thinking about modifications, my other interpretation is close to the minimalist one. But the two interpretations have different purposes; the first one is more of a guiding principle as well.
So if I understand you correctly, you are quite happy with your other interpretation close to the minimalist one. No need for a revision of the theory from that perspective.

Your variant of the common QBism interpretation, on the other hand, seems to be the way to go for you in the long run. But you are currently not completely happy with it, and it will require more than just a reinterpretation. But this is where you expect to find the real solution to the mysteries. And those mysteries include that the world around us seems to contain lots of agents; other humans definitely count as such agents, animals probably count too, it is just unclear where to stop. But the real mystery is "to explain that the agents likely will interact in a way that they evolve into agreement, which can approximate observer equivalence."
 
  • #314
vanhees71 said:
Also it's easy to show that the photon-detection probability when using the photoelectric effect (e.g., with a photomultiplier or a CCD cam) is proportional to the energy density of the em. field. See, e.g., Garrison, Chiao, Quantum optics.
And still you could be quite misled if you evaluated the energy density in vacuum at a surface (where a CCD cam would be placed) as a way to get a first rough prediction of what a CCD cam would measure. It is a different story if you include a model of your CCD cam, with the actual geometry and optical material parameters at the relevant frequency, in your simulation, and then take the energy density inside the relevant part of the semiconductor. The energy density in vacuum is not a good predictor for that, at least not at frequencies where the dielectric constant of the semiconductor is still significantly different from 1.
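This caveat can be made quantitative in the simplest case (a hedged sketch: normal incidence, a lossless dielectric, and a silicon-like index n = 3.5 chosen by me as a rough stand-in for a real CCD stack): using the Fresnel amplitude transmission coefficient t = 2/(1+n), the time-averaged energy density ##1/2(\vec{E}\cdot\vec{D}+\vec{B}\cdot\vec{H})## just inside the medium differs from that of the incident vacuum wave by a factor 4n²/(1+n)².

```python
import numpy as np

def energy_density_ratio(n):
    """Time-averaged energy density just inside a lossless dielectric
    (refractive index n, mu = 1) divided by that of the incident vacuum
    wave, at normal incidence, in Heaviside-Lorentz units."""
    t = 2.0 / (1.0 + n)                    # Fresnel amplitude transmission
    # inside: u = (1/4)(eps*E^2 + B^2) with eps = n^2 and |B| = n|E|
    u_inside = 0.25 * (n**2 + n**2) * t**2
    # incident vacuum wave with unit E amplitude: u = (1/4)(E^2 + B^2) = 1/2
    u_vacuum = 0.5
    return u_inside / u_vacuum             # simplifies to 4 n^2 / (1+n)^2

# n = 1: no interface, so the ratio must be 1
assert np.isclose(energy_density_ratio(1.0), 1.0)

# A silicon-like index (rough value, and absorption and the standing wave
# on the vacuum side are neglected here): the ratio is about 2.4,
# so the vacuum energy density is indeed a poor predictor.
print(energy_density_ratio(3.5))
```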
 
  • #315
gentzen said:
So if I understand you correctly, you are quite happy with your other interpretation close to the minimalist one. No need for a revision of the theory from that perspective.
Actually, my interpretation, when taken seriously, more or less suggests that the current theory cannot be the final answer because of its "form"; it simply is not constructible in terms of an intrinsic inference. So a revision of the theory is required. I am driven by some of the open problems with unification and fine tuning etc. I.e., my interpretation suggests/requires a reconstruction. This is why it's more than a "plain interpretation".

gentzen said:
Your variant of the common QBism interpretation, on the other hand, seems to be the way to go for you in the long run. But you are currently not completely happy with it, and it will require more than just a reinterpretation. But this is where you expect to find the real solution to the mysteries. And those mysteries include that the world around us seems to contain lots of agents; other humans definitely count as such agents, animals probably count too, it is just unclear where to stop. But the real mystery is "to explain that the agents likely will interact in a way that they evolve into agreement, which can approximate observer equivalence."
Yes, this is an open question. I do not have the answers, but I see plenty of clues and hints. My approach is not to start at the complex end, but at the minimally complex end (which means the highest-energy end); there I expect that the options are finite. These - like string theory - are not directly observable, but the idea is that logical construction principles of a sound inference should guide us. If this works out, some low-energy parameters should follow from a self-organisation once the agents form higher complex systems. In principle, the simplest possible agent would correspond to, say, ultimate elementary particles or the ultimate quanta of energy.

/Fredrik
 