Decoherent Histories and Measurement

Morbert
TL;DR Summary
Split from another thread
Morbert said:
Decoherent histories has been around for a good few decades at this stage, with one motivation for its development being the description of closed systems, and measurements as processes therein.
https://www.webofstories.com/play/murray.gell-mann/163
https://iopscience.iop.org/article/10.1088/1742-6596/2533/1/012011/pdf
https://arxiv.org/abs/1704.08725

It gives a clear account of what it means for a measurement to occur in a closed system. It might even recover the ensemble interpretation insofar as we could conceptualize an infinite ensemble of histories of the universe and associate measurement with correlations over the ensemble.
A. Neumaier said:
No. It only gives an account of events ''that we can talk about at the breakfast table'' (according to the above paper) - not of dynamical processes that would qualify as measurement processes.

In particular, their discussion assumes measurement results that fall from heaven, given by a POM or POVM in addition to the unitary dynamics of the system, rather than taking the state of the universe and deriving from it the distribution of the values read from a macroscopic detector that is part of the dynamics.

Thus everything is empty talk embellishing Born's rule.

Given some quantum theory of an isolated system, as well as a quantum state of that isolated system, decoherent/consistent histories will ascribe meaning to event probabilities of the isolated system, without needing to couple the system to some ancilla. This is more or less the ambition of decoherent histories as an interpretation. It does not privilege any measure as more correct than any other, and we choose whatever measure is suitable for computing the probabilities for events we are interested in when we carry out a measurement.
 
Last edited:
Decoherent histories claims to say something about observations in quantum cosmology, where the quantum system is the whole universe and the observers are inside it.
Morbert said:
Given some quantum theory of an isolated system, as well as a quantum state of that isolated system, decoherent/consistent histories will ascribe meaning to event probabilities of the isolated system,
But it doesn't specify how to recognize from inside the isolated system what was read off from the detector inside this isolated system. The latter is encoded in the macrostate of the detector, which is a coarse-grained version of the state of the isolated system. Thus it should be determined by the unitary dynamics of the isolated system.

But this connection is neither made nor even hinted at. Instead, a discussion is given how to assign probabilities of events given the quantum state of the isolated system together with an unrelated external POVM, said to describe measurement.

Thus it posits more than the state of the system to impose meaning on it. But in the universe, those discussing meaning (and hence meaning itself) must be derived from inside, not posited externally. Without that it is just empty talk.
 
vanhees71 said:
I thought the POVMs are constructed to describe, as best as one can, the properties of the measurement device, or rather the other way around: experimenters construct measurement devices which realize, as well as possible, a measurement described by a POVM.
In real life, the physicists choose the POVM based on objective properties of their equipment. In a quantum theory of the solar system - or any system big enough to contain the physicists making the choices - these objective properties should be encoded in the macrostate of the equipment, hence be determined by the state of the solar system. How to determine these properties from the state is the unsolved part of the measurement problem.
 
The first thing a physicist learns is that the scale of a system under consideration is important, and that already roughly defines the "relevant observables" to describe it. The solar system cannot be described from first principles as a quantum system. It wouldn't even make sense if it could be done, because all the minute details of the quarks, leptons, and gauge bosons making up the matter in the solar system cannot be observed.

I also have no clue about the meaning of the "decoherent-history interpretation" and what it should have to do with the description of the solar system.
 
vanhees71 said:
The solar system cannot be described from first principles as a quantum system.
Then quantum mechanics is incomplete, even ignoring the current lack of knowledge about quantum gravity.

But if QM is complete then the solar system must be describable by some state, though we may never know that state to the last detail.

We also don't know the state of a bucket full of water to the last detail. But we can still argue mathematically with this state to derive the coarse-grained macroscopic description from it.

Similarly we can work with the state of the solar system and coarse-grain it, with more details where we know more - such as the equipment in a particular lab used to perform experiments.
vanhees71 said:
I also have no clue about the meaning of the "decoherent-history interpretation".
Then you should refrain from contributing to this thread, which is specifically about this interpretation. Or read the papers from post #1.
 
A. Neumaier said:
Decoherent histories claims to say something about observations in quantum cosmology, where the quantum system is the whole universe and the observers are inside it.

But it doesn't specify how to recognize from inside the isolated system what was read off from the detector inside this isolated system. The latter is encoded in the macrostate of the detector, which is a coarse-grained version of the state of the isolated system. Thus it should be determined by the unitary dynamics of the isolated system.

But this connection is neither made nor even hinted at.
A. Neumaier said:
Then quantum mechanics is incomplete, even ignoring the current lack of knowledge about quantum gravity.

But if QM is complete then the solar system must be describable by some state, though we may never know that state to the last detail.
While I share part of your reasons for developing another interpretation, I do not share the sort of answer you seek. I think it's interesting to try to put the finger on the conceptual difference.

I think we conceptually agree(?) that...

Current theory with some of the major interpretations such as CI or statistical interpretations, RELY on a fiction(*), which is deeply unsatisfactory.

(*) Namely either an infinite "ensemble", or the macroscopic state of the "observer", which possibly includes the whole classical world. And the probabilistic predictions then stand on this ground. The typical decoherence approach to the requirement that the measurement device also be part of physics is to just consider a bigger system which includes the measurement device. But eventually this tower hits the roof, which is the whole universe; then there is no place to further push this imaginary observer. The problem is not to mathematically imagine such extrapolations; the problem is that the whole approach seems conceptually deeply confused, and it adds no explanatory value, just avoidance of the problems.

We seek a construction based on something that is not a fiction, i.e. based on something that is real, and inside our one universe. I.e. not fictive ensembles, nor fictive macrostates that are forced to be outside the universe.

But we disagree(?) on...

You seek a construction in terms of something that is available and exists for real, inside the universe, but without requiring that everything known be funneled through a single observer channel. You want to ensure objectivity from the start, and thus avoid the notion of "observers" that "process information", for the reason that it is indeed problematic.

While I seek a construction in terms of something that is informationally and computationally _at hand_ to a real inside observer/agent (in a QBist sense). Meaning that things that get truncated when funneled through observation channels are not used.

Your open challenge seems to be how the assumed omnipresent reality allows defining and explaining what a real, truncated inside observer actually experiences.

(My problem is different: being of the QBist kind, it starts from the agent's experiences, and tries to explain WHY and HOW objectivity emerges - as a result of the problematic, but not yet understood, information processing - and why the correct objectivity that we already know of is the one singled out; also a bit similar to a problem in string theory.)

So do we agree on part of the problem, but seek answers of different kinds? Understanding each other's views helps a lot to avoid misunderstanding in almost every thread.

/Fredrik
 
Last edited:
Fra said:
So do we agree on part of the problem, but seek answers of different kinds?
Yes.

The problem with the answer you seek is that the agent (with its information) is part of the universe, hence has no real aspects that it wouldn't inherit from whatever is real in the universe. Thus your attempted approach is based on sand!
 
I hope we at least agree on this starting point: An isolated system like a universe has an enormous number of possible coarse grainings. Some will describe the approximately deterministic everyday experiences of a physicist in a lab with detectors, others won't. No coarse graining is singled out as more correct than any other.
A. Neumaier said:
But it doesn't specify how to recognize from inside the isolated system what was read off from the detector inside this isolated system. The latter is encoded in the macrostate of the detector, which is a coarse-grained version of the state of the isolated system. Thus it should be determined by the unitary dynamics of the isolated system.

But this connection is neither made nor even hinted at. Instead, a discussion is given how to assign probabilities of events given the quantum state of the isolated system together with an unrelated external POVM, said to describe measurement.

Thus it posits more than the state of the system to impose meaning on it. But in the universe, those discussing meaning (and hence meaning itself) must be derived from inside, not posited externally. Without that it is just empty talk.
As systems inside the universe, capable of observing our surroundings, we would construct a coarse graining that describes our surroundings. More specifically, we would choose a quasiclassical coarse graining that describes thermodynamics, hydrodynamics, classical equations of motion, biology, evolution etc. These coarse grainings will describe detector behaviour, as well as the behaviour of the biological systems reading the detector and using quantum theory to understand it. Other exotic, complementary coarse grainings might describe exotic creatures with exotic experiences. Those creatures would use those coarse grainings. Or maybe the gathering of information and utilising it is only possible with quasiclassical processes.

https://arxiv.org/abs/quant-ph/0609190

Gell-Mann and Hartle said:
Quantum mechanics supplies probabilities for the members of various decoherent sets of coarse-grained alternative histories — different ‘realms’ for short. [...] Quantum mechanics by itself does not favor one realm over another. However, we are interested for most purposes in the family of quasiclassical realms underlying everyday experience — a very small subset of the set of all realms. [...] Such a coarse graining, necessary for having both decoherence and approximate classical predictability, is connected with the coarse graining that defines the familiar entropy of thermodynamics. [...] Specific systems like the planet Mars, Western culture, and asparagus exhibit various kinds of particular exploitable regularities. [...] We refer to the family of decoherent sets of coarse-grained histories that describe these regularities as the family of quasiclassical realms.
So decoherent histories by itself is empty insofar as it does not explain the usefulness of the measures we use to describe our experiences with detectors. But it i) tells us we can use them even if we are part of a larger quantum system and ii) provides a framing that lets us pursue explanations with recourse to thermodynamics, hydrodynamics etc.
 
Last edited:
  • #10
Morbert said:
I hope we at least agree on this starting point: An isolated system like a universe has an enormous number of possible coarse grainings.
Yes. In addition, the universe is the smallest isolated system containing us, and the only one about which we can ever have any knowledge.
Morbert said:
Some will describe the approximately deterministic everyday experiences of a physicist in a lab with detectors, others won't.
All reasonable coarse-grainings in use describe experiences of a physicist in a lab with detectors.
Morbert said:
No coarse graining is singled out as more correct than any other.
This is far from correct, in the standard usage of the term. Coarse-graining always refers to dropping highly oscillating or far away degrees of freedom, and the only question is which degrees of freedom are dropped. Properly done, the result is always correct to the extent that the dropped degrees of freedom have only an averaged influence on the variables kept.
Morbert said:
As systems inside the universe, capable of observing our surroundings, we would construct a coarse graining that describes our surroundings. More specifically, we would choose a quasiclassical coarse graining that describes thermodynamics, hydrodynamics, classical equations of motion, biology, evolution etc.
Yes, except when we observe tiny systems, which are then modeled with more than thermodynamic or hydrodynamic detail.
Morbert said:
These coarse grainings will describe detector behaviour, as well as the behaviour of the biological systems reading the detector and using quantum theory to understand it.
Yes. The quasiclassical coarse graining is quasideterministic and leads to unique outcomes, which therefore must be explained from the unitary dynamics of the state of the universe, for any theory that claims to be able to interpret the latter.
Morbert said:
Other exotic, complementary coarse grainings might describe exotic creatures with exotic experiences.
This is completely irrelevant, as we haven't observed these.
Morbert said:
So decoherent histories by itself is empty insofar as it does not explain the usefulness of the measures we use to describe our experiences with detectors.
But to explain the latter is the key in a fundamental interpretation of quantum mechanics.
Morbert said:
But it i) tells us we can, even if we are part of a larger quantum system
though it does not explain why we can. That we can is obvious, since we did it long before decoherent histories were invented.
Morbert said:
and ii) provides a framing that lets us pursue explanations with recourse to thermodynamics, hydrodynamics etc.
In that respect it is no different from the earlier treatments, since it just spells out what follows from a sequence of Born measurements.

Thus decoherent histories are an empty embellishment of Born's rule.
 
Last edited:
  • #11
A. Neumaier said:
The problem with the answer you seek is that the agent (with its information) is part of the universe
Agreed, it does make it more complicated, but I see no other way than to face that.
A. Neumaier said:
, hence has no real aspects that it wouldn't inherit from whatever is real in the universe. Thus your attempted approach is based on sand!
We can't discuss or explain this in detail here, but I of course disagree, even if I understand your critique. But what I have in mind makes sense only in an evolutionary context. The agents are of course normal matter. The difference lies only in the modelling perspective. So understanding matter and understanding the observer are thus not two quests but one.

/Fredrik
 
  • #12
A. Neumaier said:
Yes. The quasiclassical coarse graining is quasideterministic and leads to unique outcomes, which therefore must be explained from the unitary dynamics of the state of the universe, for any theory that claims to be able to interpret the latter.
To be clear before I answer: You are saying that what demands explanation is why, in our universe, a quasiclassical coarse graining obtains? [edit] - Or that, given a quasiclassical coarse graining, the unique outcomes must be explained?
 
  • #13
Morbert said:
we would choose a quasiclassical coarse graining that describes thermodynamics, hydrodynamics, classical equations of motion, biology, evolution etc. These coarse grainings will describe detector behaviour, as well as the behaviour of the biological systems reading the detector and using quantum theory to understand it.
"we would choose" is not a satisfactory answer for me, as it relies on fine tuning as involves human choices.

I want to understand the self-organising processes in nature, and understand WHY some options are apparently preferred by nature over others that are just as "logically possible".

/Fredrik
 
  • #14
Morbert said:
To be clear before I answer: You are saying that what demands explanation is why, in our universe, a quasiclassical coarse graining obtains? [edit] - Or that, given a quasiclassical coarse graining, the unique outcomes must be explained?
What demands explanation is
1. why, in our universe, the joint reduced dynamical description of a particle to be measured and a detector measuring it, with the detector modelled by quasiclassical coarse graining, results in a unique outcome reading in the description of the detector - which is what we always experience, but 'we' is outside the reduced description; and
2. why this unique outcome satisfies Born's rule when the experiment is repeatedly performed under identical conditions with nonidentical readings.
 
  • #15
Fra said:
But what I have in mind makes sense only in an evolutionary context.
Fra said:
I want to understand the self-organising processes in nature
This is quite different from what I want. I want to understand why and how Nature can be objectively described. This is the fundamental problem.

How a self-organising process (an 'agent') in Nature forms reasonably reliable (and approximately valid) concepts of Nature is a completely different problem on a much higher level, and depends a lot on the capabilities of the agent. Moreover, it assumes reality already (as otherwise there is no basis on which to start).
 
Last edited:
  • #16
A. Neumaier said:
What demands explanation is
1. why, in our universe, the joint reduced dynamical description of a particle to be measured and a detector measuring it, with the detector modelled by quasiclassical coarse graining, results in a unique outcome reading in the description of the detector - which is what we always experience, but 'we' is outside the reduced description; and
What is unsatisfactory about the standard description of, e.g., a photodetector as given in standard quantum optics textbooks, e.g.,

J. Garrison and R. Chiao, Quantum Optics, Oxford University Press, New York (2008), https://doi.org/10.1093/acprof:oso/9780198508861.001.0001

Of course "we" are completely irrelevant in this, because you can well suppose that no human being is present when the photons hit the photodetector and some technical device (like a photoplate of the old days or some digital storage device in our modern times) stores this outcomes of measurements, and then "we" look at these data way after all the experimental equipment is demounted. Then I think we can agree that indeed "we" don't have any significant influence on the measurment outcomes.
A. Neumaier said:
2. why this unique outcome satisfies Born's rule when the experiment is repeatedly performed under identical conditions with nonidentical readings.
It's because Born's rule, together with the rest of QT, is based on experience. QT was discovered because empirical evidence (mostly about atomic physics, like spectra under various settings, e.g., the Stark and Zeeman effects, and all that) made it necessary to develop a "new theory" going beyond classical mechanics and electrodynamics. The latter were also discovered by looking for theories about direct empirical evidence.

The classical behavior of macroscopic objects from QT is also quite well understood, at least in its basic principles, by quantum-statistical many-body physics. It's basically because for sufficiently coarse-grained collective observables the quantum (and also the usually much more significant "thermal") fluctuations around the mean values of these observables are small.

You may ask, as Wigner famously did, why it is possible at all to describe Nature with such accuracy using mathematics, but that's indeed a question that's way outside of the realm of the natural sciences.
 
  • #17
vanhees71 said:
What is unsatisfactory about the standard description of, e.g., a photodetector as given in standard quantum optics textbooks, e.g.,

J. Garrison and R. Chiao, Quantum Optics, Oxford University Press, New York (2008), https://doi.org/10.1093/acprof:oso/9780198508861.001.0001
It only gives a statistical argument involving an infinite ensemble of identically prepared particle+detector systems. Thus it accounts for 2. but not for 1., since it is not shown why (but assumed that) a single outcome arises in each particular case.
 
  • #18
Fra said:
"we would choose" is not a satisfactory answer for me, as it relies on fine tuning as involves human choices.

I want to understand the self-organising processes in nature, and understand WHY some options are apparently preferred by nature over others that are just as "logically possible".
The choice is a choice of description, not a choice of what obtains in reality. You can choose a coarse graining that describes the unique detector outputs we experience, or you can choose a coarse graining that contains macroscopic superpositions of detector outputs. Both are consistent descriptions of the same universe, neither of which is more correct, or more preferred by nature. The reason the former description is more useful to us/preferred by us is because that is the description that contains biological systems like ourselves reading detector outputs.
 
  • #19
A. Neumaier said:
I want to understand why and how Nature can be objectively described. This is the fundamental problem.
That is of course implied in my question as well, but I see objectivity as something emergent, not a constraint. The why question amounts to answering WHY it is emergent. And the de facto objectivity that we humans see corresponds to the fully emerged thing. Most of this emergence process, I think, happened during the early phase of the big bang.

A. Neumaier said:
How a self-organising process (an 'agent') in Nature forms reasonably reliable (and approximately valid) concepts of Nature is a completely different problem on a much higher level, and depends a lot on the capabilities of the agent. Moreover, it assumes reality already (as otherwise there is no basis on which to start).
Is this "problem" really different than, that we need already a macroscopic measurement devices in place before we can form QM? What happens to your view during the presumed big bang? Ie. before stable matter was formed? The matter that "populates" the universe, is different today, than during the early big bang. How is this explained?

/Fredrik
 
  • #20
Morbert said:
The choice is a choice of description, not a choice of what obtains in reality. You can choose a coarse graining that describes the unique detector outputs we experience, or you can choose a coarse graining that contains macroscopic superpositions of detector outputs. Both are consistent descriptions of the same universe, neither of which is more correct, or more preferred by nature. The reason the former description is more useful to us/preferred by us is because that is the description that contains biological systems like ourselves reading detector outputs.
I see what you say; as descriptions from the fictive perspective, then yes.

But I was referring to when observers inside the universe are the ones "operating" these choices. Then these choices are not just in the theorist's mind. They correspond to real physical states, don't they?

/Fredrik
 
  • #21
Fra said:
I see objectivity as something emergent
If the agent does not exist objectively, nothing exists objectively. But the agent is made of atoms, hence physics must be assumed to know about their objective existence. Your approach is heavily circular!
Fra said:
Is this "problem" really different than, that we need already a macroscopic measurement devices in place before we can form QM?
We had macroscopic measurement devices in place long before QM became known to us.
Fra said:
What happens to your view during the presumed big bang? I.e. before stable matter was formed? The matter that "populates" the universe is different today than during the early big bang. How is this explained?
The state evolves according to a deterministic physical law, and carries objective information in the form of N-point functions. These exist at all stages of the evolution of the universe.

Only our knowledge of them is recent.
 
  • #22
A. Neumaier said:
This is far from correct, in the standard usage of the term. Coarse-graining always refers to dropping highly oscillating or far away degrees of freedom, and the only question is which degrees of freedom are dropped. Properly done, the result is always correct to the extent that the dropped degrees of freedom have only an averaged influence on the variables kept.
This choice of what degrees of freedom are dropped is not imposed by the theory, nor is the choice of representation. Consider a universe with a detector responding to a microscopic system. Decoherent histories interprets a representation and coarse graining that yields probabilities for detector pointer outcomes as just as objectively correct and reliable as ones that yield probabilities for macroscopic superpositions of pointer outcomes. More orthodox interpretations might involve the heavens selecting the former as more objectively correct, but not Decoherent Histories. Even the trivial representation of a unitarily evolving property ##|\Psi\rangle\langle\Psi|## (where ##\Psi## is the wavefunction of the universe) is objectively correct.
A. Neumaier said:
What demands explanation is
1. why, in our universe, the joint reduced dynamical description of a particle to be measured and a detector measuring it, with the detector modelled by quasiclassical coarse graining, results in a unique outcome reading in the description of the detector - which is what we always experience, but 'we' is outside the reduced description; and
2. why this unique outcome satisfies Born's rule when the experiment is repeatedly performed under identical conditions with nonidentical readings.
A. Neumaier said:
In that respect it is no different from the earlier treatments, since it just spells out what follows from a sequence of Born measurements.
Decoherent Histories interprets a quantum theory of the isolated system of the universe as asserting probabilities for what events occur in the universe. From these probabilities, we can work out the implicitly conditional measurement probabilities we use in a lab (like what is the probability a physicist observes a particular outcome conditioned on the earth and mankind and the physicist and the lab existing).

These probabilities for possible histories of the universe are not measurement probabilities, but they are fundamental in the interpretation. Given a state of the universe ##\Psi## and quantum theory of the universe, is it the case under the thermal interpretation that measurement probabilities can be derived in principle if not in practice from uncertain values of the universe ##\{\langle A\rangle_\Psi\}##?
 
  • #23
Morbert said:
This choice of what degrees of freedom are dropped is not imposed by the theory, nor is the choice of representation. Consider a universe with a detector responding to a microscopic system. Decoherent histories interprets a representation and coarse graining that yields probabilities for detector pointer outcomes as just as objectively correct and reliable as ones that yield probabilities for macroscopic superpositions of pointer outcomes.
This means that nothing at all is predicted, since everything is objectively correct.
This perverts the meaning of 'objective'.
Morbert said:
Decoherent Histories interprets a quantum theory of the isolated system of the universe as asserting probabilities for what events occur in the universe.
And what is the meaning of these probabilities? There is only one universe, in which some things actually happen and everything else does not happen. No ensemble of possibilities, unless the ensemble is imposed in addition to the state of the universe. But the imposer is part of the universe, hence should be completely described by its state - so the setting becomes circular.
Morbert said:
From these probabilities, we can work out the implicitly conditional measurement probabilities we use in a lab (like what is the probability a physicist observes a particular outcome conditioned on the earth and mankind and the physicist and the lab existing).
This again involves choosing additional things beyond the coordinates of the lab, from which the state of the universe should be able to deduce what is in the lab - the arrangement, the chooser of the model and the coarse-graining, and everything else. But instead, decoherent histories has to use all this as additional input.
Morbert said:
These probabilities for possible histories of the universe are not measurement probabilities, but they are fundamental in the interpretation.
... and have themselves no interpretation at all, since there is no ensemble. Even subjective probabilities require a subject, which is inside the universe, hence should be described by the state of the universe.

You didn't convince me that there is more than empty talk in the interpretation.
 
  • #24
A. Neumaier said:
It only gives a statistical argument involving an infinite ensemble of identically prepared particle+detector systems. Thus it accounts for 2. but not for 1., since it is not shown why (but assumed that) a single outcome arises in each particular case.
But that's all there is! The randomness you refer to is observed in the lab, and nobody has found any causal explanation for a single outcome. You might be dissatisfied by this, but it seems as if Nature doesn't care and chooses to stay random as she likes.
 
  • #25
A. Neumaier said:
This means that nothing at all is predicted, since everything is objectively correct.
This perverts the meaning of 'objective'.
The consistent histories interpretation is incomplete. That by itself is not so much different than the empty theory in predicate logic being incomplete. The only thing which is different is the expectation of people not familiar with that interpretation.
A. Neumaier said:
No ensemble of possibilities, unless the ensemble is imposed in addition to the state of the universe. But the imposer is part of the universe, hence should be completely described by its state - so the setting becomes circular.
Not circular, just incomplete.
A. Neumaier said:
This again involves choosing additional things beyond the coordinates of the lab, from which the state of the universe should be able to deduce what is in the lab - the arrangement, the chooser of the model and the coarse-graining, and everything else. But instead, decoherent histories has to use all this as additional input.
Yes, and that it needs all this as additional input again just shows that it is incomplete.

A. Neumaier said:
You didn't convince me that there is more than empty talk in the interpretation.
There is indeed a certain risk of empty talk, but the interpretation itself also has non-trivial mathematical content that makes it valuable for understanding certain aspects of QT. For me, MWI has a greater risk of empty talk, and less valuable mathematical content.
 
  • #26
Morbert said:
These probabilities for possible histories of the universe are not measurement probabilities, but they are fundamental in the interpretation. Given a state of the universe ##\Psi## and quantum theory of the universe, is it the case under the thermal interpretation that measurement probabilities can be derived in principle if not in practice from uncertain values of the universe ##\{\langle A\rangle_\Psi\}##?
I would argue that the "expectation" that you can declare an arbitrary state ##\Psi## as the state of the universe in the thermal interpretation fails. Prescribing an arbitrary state may be fine for a sufficiently small subsystem. The probabilities and unique outcomes don't need to be explained based on the state of the subsystem alone, but can make use of the state of the universe, which is restricted to be less arbitrary.
 
  • #27
vanhees71 said:
But that's all there is!
Only in your nonminimal interpretation (which you call minimal, though it adds the metaphysical claim ''and nothing else'' that cannot be checked by experiment). You always forget that this extra assumption is not shared by those investigating the foundations of quantum theory!

In the generally accepted minimal statistical interpretation, it is the minimum of what there is, and nothing else is asserted beyond that.

In particular, in a setting where the quantum state of the universe makes sense
(and this thread is only about this), one cannot prepare the universe in multiple states, hence any talk about probabilities is vacuous.

One therefore needs to derive from the state of the universe (which is all there is) that the small pieces of the universe describing the standard quantum experiments behave in the probabilistic way described by Born's rule.
vanhees71 said:
The randomness you refer to is observed in the lab,
Yes, and it needs an explanation in terms of the state of the universe.
vanhees71 said:
and nobody has found any causal explanation for a single outcome.
This does not mean that there is none.

Before a discovery is made, nobody has found this discovery, but this does not prove that there cannot be any discoveries!
vanhees71 said:
You might be dissatisfied by this, but it seems as if Nature doesn't care and chooses to stay random as she likes.
It seems to you. You might be satisfied with this, but it seems to me (and to all those interested in the foundations of quantum mechanics) that Nature cares and an explanation is still ahead of us.
 
Last edited:
  • #28
gentzen said:
Yes, and that it needs all this as additional input again just shows that it is incomplete.
It is silent about just the things essential for better foundations.

It is exactly as incomplete as the minimal statistical interpretation, hence just empty talk.
gentzen said:
For me, MWI has a greater risk of empty talk, and less valuable mathematical content.
It is better than the many words (oh, sorry, many worlds) interpretation only in that it does not postulate fancy additional stuff.

The mathematical content is precisely that of conditional probability theory applied to Born's rule with collapse, well known before the decoherent histories approach was launched.
 
  • #29
Morbert said:
Given a state of the universe ##\Psi## and quantum theory of the universe,
We have access only to a single realization of the universe, so the state of the universe must describe this realization, if it is to make any sense.

In particular, we know a lot about the expectation values of a huge number of quantum observables in this state, namely of all accessible 1-point and 2-point functions of fields at points on the Earth and what is reachable from there by means of telescopes and spacecraft. This drastically restricts the state of the universe in its for us most important aspects.
Morbert said:
is it the case under the thermal interpretation that measurement probabilities can be derived in principle if not in practice
I did this in my book, not in an impeccable way but in enough detail to convince me that a derivation is possible, not only in principle (for this, chaoticity of the BBGKY hierarchy, or rather its quantum equivalent, is sufficient).
Morbert said:
from uncertain values of the universe ##\{\langle A\rangle_\Psi\}##?
These values are not uncertain. Expectations are completely predictable from the state.
 
  • #30
A. Neumaier said:
The mathematical content is precisely that of conditional probability theory applied to Born's rule with collapse, well known before the decoherent histories approach was launched.
No, this is just not true. Even though you can argue that this might have been the motivation for it (maybe it was to a certain extent, but I guess you don't really care), it has its own distinct mathematical structure. Some (for example S. Goldstein) even argue that this structure is slightly ugly and not sufficiently unique to make consistent histories worthy of study, compared to other interpretations like Bohmian mechanics.
 
  • #31
A. Neumaier said:
We have access only to a single realization of the universe, so the state of the universe must describe this realization, if it is to make any sense.

In particular, we know a lot about the expectation values of a huge number of quantum observables in this state, namely of all accessible 1-point and 2-point functions of fields at points on the Earth and what is reachable from there by means of telescopes and spacecraft. This drastically restricts the state of the universe in its for us most important aspects.

I did this in my book, not in an impeccable way but in enough detail to convince me that a derivation is possible, not only in principle (for this, chaoticity of the BBGKY hierarchy, or rather its quantum equivalent, is sufficient).

These values are not uncertain. Expectations are completely predictable from the state.
Ok, I will read that link. I thought "uncertain value" was the preferred nomenclature of the thermal interpretation.
 
  • #32
gentzen said:
No, this is just not true.
It is just what one gets for a sequence of Born measurements, using an ancilla to make a history. Then the ancilla is discarded (without harm, since it is only a mathematical trick).

What else does it have?
 
  • #33
A. Neumaier said:
If the agent does not exist objectively, nothing exists objectively. But the agent is made of atoms, hence physics must be assumed to know about their objective existence. Your approach is heavily circular!
It's "circular" to the same extent the scientific method or learning is circular, but it's not circular as in "circular reasoning", the loop is supposed to be a learning/tuning loop as in the context of evolution, not as some entropic self-organisation on a fixed state space.

I use an operational meaning of objectivity: agents learn about each other, and eventually arrive at an agreement in only one way - by communicating/interacting. Operationally, objectivity is thus defined as an equivalence relation (which is emergent). So to "classify" the agent interactions is just another "lingo" for classifying the physical interactions.

So instead of the traditional method of relying on "fictions", I embrace the somewhat "circular nature" of things and try to make sense of it.

In normal QFT we have plenty of "gauge choices" or choices of "observer frames", and sometimes interactions are even explained by their transformation terms. So even if we consider the invariants, how do we sensibly _describe_ these objective invariants without using the arbitrary choices? As I see it, this is a little bit the same with agents... except agents are not fictional choices, they are real.

A. Neumaier said:
We had macroscopic measurement devices in place long before QM became known to us.
Yes, but I had in mind times even before that, say the GUT or Planck era, before we have a clear meaning of spacetime.

A. Neumaier said:
The state evolves according to a deterministic physical law, and carries objective information in the form of N-point functions. These exist at all stages of the evolution of the universe.
Ok I see, this is how you handle this, and it's part of your view. I just have problems with this, as from my perspective you are just assuming that a deterministic physical law is ruling things, but identifying WHICH law is subject to fine tuning in a huge theory space.

To sum up: I choose evolving loops (what you call circularity) over fine tuning non-observables.

/Fredrik
 
  • #34
Morbert said:
Ok, I will read that link.
The hierarchy is usually heavily truncated, but even in the approximations, the chaoticity is very frequent. I have given more detailed references in Sections 11.6-8 of my book Coherent Quantum Physics. See also the references at the end of Section 3 of my quantum tomography paper at arXiv:2110.05294.
Morbert said:
I thought "uncertain value" was the preferred nomenclature of the thermal interpretation.
Well, it is the value, or the q-expectation. It is the 'uncertain value' only when used as an approximation to observed measurement results, which, at least for macroscopic ##A##, are matched up to the uncertainty of ##A## in the given state.
 
  • #35
A. Neumaier said:
What else does it have?
The main ingredient is certainly the consistency condition. A secondary ingredient is the (tensor product) structure of the Hilbert space for independent systems. You end up with a mathematical structure that you can instantiate for different base fields, like the real numbers, the complex numbers, or the quaternions. You get a different structure in each case. There is also some structure of the coarsening, the time instances, and the Hilbert space, which typically people try to get rid of when trying to generalize it to quantum field theory and "sum over histories" path integrals.

But even the more canonical part of the mathematical structure allows for interesting mathematical results:
gentzen said:
The connection to "thinking about cosmology" is probably rather that CH wants to recover normal logic and probabilistic reasoning. But how does CH recover normal logic? In your terms, by ensuring that there can be an agent which knows (remembers) all those things which have happened in the past. I guess you don't like that, but that is why I investigated it, and now write it down. The consistency conditions simply force the Hilbert space to be big enough, at least if the initial state is pure (##\rho = \ket{\phi}\bra{\phi}##). The consistency conditions ##\operatorname{Tr}(C_\alpha\rho C_\beta^\dagger)=0## (for ##\alpha\neq\beta##) then reduce to ##(C_\beta\ket{\phi},C_\alpha\ket{\phi})=0##, i.e. the vectors ##C_\alpha\ket{\phi}## are mutually orthogonal. So if there are ##m## histories ##\alpha## with non-zero probability, then the dimension ##N## of the Hilbert space satisfies ##N \geq m##.

If the initial state ##\rho## has rank ##r## instead of being pure, then we only get ##r N \geq m##. (One proves this by replacing ##\rho## with a pure state in a suitably enlarged Hilbert space; see Diósi for details.) This bound can really be achieved, for example take ...

gentzen said:
Then I wanted to bound the dimension of the Hilbert space of CH, ... While searching ..., I found a paper with a disappointingly weak bound, but an insightful remark following that bound:
Fay Dowker and Adrian Kent said:
In other words, if the Hilbert space of the universe is finite-dimensional there is a strict bound on the number of probabilistic physical events. Once this number has occurred, the evolution of the universe continues completely deterministically. This is mathematically an unsurprising feature of the formalism but, as far as we are aware, physically quite new: no previous interpretation of quantum theory has suggested that quantum stochasticity is exhaustible in this way.
Morbert said:
@gentzen They presumably infer this from this lemma
[Attached image: lemma12.png]

This limit is a bound on the fine-graining of ##\mathcal{S}##. But there is also a complementary set ##\mathcal{S}'## that can return probabilities for events ##\mathcal{S}## can't address. I.e. this is less a bound on probabilistic events that can occur in the universe, and more a bound on the universe's ability to have some observable ##O = \sum_{i=1}^k \lambda_i \Pi_i## capable of recording a history.
I now tried to understand this issue better, both the insightful remark by Dowker and Kent, and how I can think about the sharp bound itself. (I am still focused on small closed systems.) ...

CH does avoid wavefunction collapse (and its apparent nonlocality), but the remark raises the suspicion that it might not succeed in treating quantum time development as an inherently stochastic process. ...

For thinking about the sharp bound itself, the time-symmetric formulation of CH with two hermitian positive semidefinite matrices ##\rho_i## and ##\rho_f## satisfying ##\operatorname{Tr}(\rho_i \rho_f)=1## seems well suited to me. The decoherence functional then reads ##D(\alpha,\beta)=\operatorname{Tr}(C_\alpha\rho_i C_\beta^\dagger\rho_f)## and the bound on the number ##m## of histories ##\alpha## with non-zero probability becomes ##\operatorname{rank}(\rho_i)\operatorname{rank}(\rho_f)\geq m##. Interpreting ##\rho_i## as corresponding to pre-selection ("preparation") and ##\rho_f## as post-selection ("measurement") gives at least some intuition why there is that unexpected product in the bound.

I had planned to look into different formulations of CH for some time, for a completely different reason: vanhees71 believes that the minimal statistical interpretation can be applied to the time evolution of a single quantum system over an indeterminate time. I would like to understand whether this is indeed possible. Trying to analyse this scenario with the standard formulation of CH doesn't work well, but I knew that there were different formulations of CH, some of which seemed to incorporate features relevant for that scenario. The bound itself rather reduces my confidence that CH will convince me that this is possible. ...
 
  • #36
Fra said:
how do we sensibly _describe_ these objective invariants without using the arbitrary choices?
By showing how the results from the different choices are related to each other,
providing in this way an observer-independent objectivity.
Fra said:
I had in mind times even before that, say the GUT or Planck era, before we have a clear meaning of spacetime.
In my view, spacetime exists prior to everything. Whether something else has to be posited in the GUT or Planck era is pure speculation.
Fra said:
you are just assuming that a deterministic physical law is ruling things, but identifying WHICH law is subject to fine tuning in a huge theory space.
No. I use the well-established standard model of quantum field theory, allowing for small unknown modifications to accommodate gravity. Unless I consider cosmology, where I can use any of the standard cosmological models. The foundations do not depend on the details.
 
  • #37
gentzen said:
A secondary ingredient is the (tensor product) structure of the Hilbert space for independent systems.
This was known since before 1930. It belongs to the standard toolkit of QM, not to an interpretation.

gentzen said:
You end up with a mathematical structure that you can instantiate for different base fields, like the real numbers, the complex numbers, or the quaternions.
But physicists only use the complex version; the others are ruled out by experiment.
gentzen said:
There is also some structure of the coarsening,
Which structure? It cannot be significantly different from the earlier established methods: averaged second-order perturbation theory, the projection formalism, or the Schwinger-Keldysh formalism. They are also of the shut-up-and-calculate type, hence not part of the interpretation.
gentzen said:
But even the more canonical part of the mathematical structure
which is just shut-up-and-calculate applied to the Born rule for histories.
 
  • #38
A. Neumaier said:
By showing how the results from the different choices are related to each other,
providing in this way an observer-independent objectivity.
Yes, agreed.

But then objectivity must be demonstrably shown, and not just assumed. The results from the choices that are made must be collected and processed; and unless we work with fictions, I consider that to be a physical process, essentially meaning: let the actual population of these "agents" interact for real, and see what the steady state is like. Once in place, objectivity is physically manifest, not sooner. This is how I see the "emergence".

/Fredrik
 
  • #39
Fra said:
Yes, agreed.

But then objectivity must be demonstrably shown, and not just assumed. The results from the choices that are made must be collected and processed; and unless we work with fictions, I consider that to be a physical process, essentially meaning: let the actual population of these "agents" interact for real, and see what the steady state is like. Once in place, objectivity is physically manifest, not sooner. This is how I see the "emergence".
All this is very high level, not theoretical physics.

Theoretical physics consists of making assumptions and deducing from these interesting things that are found (no matter how) in agreement with observations, explaining this way objectivity, rather than assuming it.

The processing needed to show the agreement is not part of theoretical physics but a side issue. If you want to study it, better observe those who bring theory and experiment into agreement than muse about abstract information-theoretical issues!
 
  • #40
A. Neumaier said:
This was known since before 1930. It belongs to the standard toolkit of QM, not to an interpretation.
That doesn't make it right. Schrödinger was careful in his "cat paper" to remain unsure whether it would remain correct for the "unification" of QM with special relativity. My understanding is that in QFT independence is modeled by commutation of measurement operators for spacelike separated spacetime regions, and the recent MIP*=RE result disproved the earlier expectation that the two notions of independence (tensor product vs. commutation) would be equivalent.

A. Neumaier said:
But physicists only use the complex version; the others are ruled out by experiment.
Can you even make this distinction and rule out the non-complex versions without a mathematical structure where this distinction exists?

A. Neumaier said:
Which structure? It cannot be significantly different from the earlier established methods: averaged second-order perturbation theory, the projection formalism, or the Schwinger-Keldysh formalism. They are also of the shut-up-and-calculate type, hence not part of the interpretation.
It is not a nice structure, it is just there. It is a part of the interpretation that even its proponents try to get rid of, but it is there.

A. Neumaier said:
which is just shut-up-and-calculate applied to the Born rule for histories.
In the end, you are simply not interested in the consistent histories interpretation. I cannot blame you for that. But what is dangerous is to make blanket statements about interpretations that you don't really understand, nor care about. You didn't do yourself any favor with your old (2000) paper claiming Bohmian mechanics contradicts quantum mechanics. Similarly, S. Goldstein did not do himself a favor with section 2 Decoherent Histories in his Quantum Theory without Observers—Part One.
 
  • #41
gentzen said:
That doesn't make it right. Schrödinger was careful in his "cat paper" to remain unsure whether it would remain correct for the "unification" of QM with special relativity.
But decoherent histories don't even address relativity issues!
gentzen said:
My understanding is that in QFT independence is modeled by commutation of measurement operators for spacelike separated spacetime regions, and the recent MIP*=RE result disproved the earlier expectation that the two notions of independence (tensor product vs. commutation) would be equivalent.
I don't see how a result in quantum complexity has any bearing on this equivalence.
gentzen said:
Can you even make this distinction and rule out the non-complex versions without a mathematical structure where this distinction exists?
No, but when you make it, you cannot reproduce standard QM, hence it is disproved and can be ignored.
gentzen said:
In the end, you are simply not interested in the consistent histories interpretation.
Only because it adds nothing for the interpretation of the physically relevant setting of QM.
gentzen said:
I cannot blame you for that. But what is dangerous is to make blanket statements about interpretations that you don't really understand
I think I understand the interpretation. It was not the first occasion I spent time on it.
gentzen said:
You didn't do yourself any favor with your old (2000) paper claiming Bohmian mechanics contradicts quantum mechanics.
I still think that paper is valid; the replies from the Bohmian camp didn't convince me of the contrary.

Bohmian mechanics (on which I unfortunately spent far too many days) is a very superficial enhancement of quantum mechanics adding irrelevant extra structure. In exchange for it they lose important structure existing in QM, namely phase space symmetries and relativistic symmetries that are fundamental for a proper understanding of quantum mechanics (semiclassical approximations) and quantum field theory (conservation laws).
 
Last edited:
  • #42
A. Neumaier said:
I still think that paper is valid; the replies from the Bohmian camp didn't convince me of the contrary.
Did they even bother to respond? Your paper simply repeats an old misunderstanding, which already existed in the 50s, and which is well explained in many accounts of BM (even in the discussion of BM in Franck Laloë's "Do we really understand quantum mechanics").

A. Neumaier said:
Bohmian mechanics (which I spent many days on)
without noticing that you are not the first to raise that objection?
A. Neumaier said:
is a very superficial enhancement of quantum mechanics adding irrelevant extra structure.
The extra structure provides a source of randomness, even for finite dimensional systems. No need for a wavefunction of the universe like in MWI, even though you can consider one if you really want.
 
  • #43
A. Neumaier said:
This means that nothing at all is predicted, since everything is objectively correct.
This perverts the meaning of 'objective'.
There is an analogous process in classical physics. If you have a classical theory of a 6-sided die, you can choose a coarse graining and compute probabilities for events [1 or 2], [3 or 4], [5 or 6]. Alternatively, you could choose a different coarse graining and compute probabilities for [1 or 6], [2 or 4], [3 or 5]. Both coarse grainings return objectively correct probabilities. This is uncontroversial, because the two coarse grainings aren't mutually exclusive alternatives for reality. They are addressing different claims about the same reality. The same is true for different coarse grainings in quantum theory, only now we have different maximally fine-grained representations as well as different coarse grainings. E.g. for a detector, one might define alternative pointer outcomes {D_0, D_1}, another might define {U+, U-} which are macroscopic superpositions. These coarse grainings aren't mutually exclusive alternatives, so no inconsistency arises.
A. Neumaier said:
This again involves choosing additional things beyond the coordinates of the lab, from which the state of the universe should be able to deduce what is in the lab - the arrangement, the chooser of the model and the coarse-graining, and everything else. But instead, decoherent histories has to use all this as additional input.
Given any physical theory, in order to compute probabilities for mutually exclusive alternatives, those mutually exclusive alternatives must be constructed. This does not seem like a great offense to me.
A. Neumaier said:
And what is the meaning of these probabilities? There is only one universe, in which some things actually happen and everything else does not happen.
Testing a cosmological theory would presumably involve experimentally testing possibilities for which the theory assigns a probability very close to 1. Even if a probability for some statement is not exactly 1, a negative result would still imply a loss of confidence in the theory.
 
  • #44
gentzen said:
Did they even bother to respond?
Yes.

I am proud of the citation of my paper in Streater's Lost Causes in and beyond Physics from 2007. Streater is one of the authors of the well-known book PCT, spin and statistics, and all that.
gentzen said:
The extra structure provides a source of randomness
But at the cost of losing important structure. There are covariant ways of introducing noise into a system.
 
  • #45
Re/ Decoherent histories and relativity. For posterity: https://arxiv.org/abs/gr-qc/9304006

The interpretation can be applied to a quantum theory of mechanics, of fields, of spacetime geometries, etc., so long as the theory exists. Basically, the interpretation can be applied to any theory that permits the construction of Boolean lattices from event algebras.
 
  • #46
Morbert said:
another might define {U+, U-} which are macroscopic superpositions.
How can one read a value from a macroscopic superposition? (I am not really interested in the answer, or in a further continuation of this, for me stale, discussion.)
 
  • #47
Morbert said:
Re/ Decoherent histories and relativity. For posterity: https://arxiv.org/abs/gr-qc/9304006
Can you please point to a page of this thick book where the discussion of relativity starts? This is new to me, so I will look at it more closely.
 
  • #48
A. Neumaier said:
How can one read a value from a macroscopic superposition?
Roland Omnès, in his book "The Interpretation of Quantum Mechanics" (section 7.8), argues you would need an apparatus larger than the observable universe.
A. Neumaier said:
Can you please point to a page of this thick book where the discussion of relativity starts? This is new to me, so I will look at it more closely.
In section VII E (page 94), they discuss quantum mechanics of a relativistic world line. In section VIII (page 106) they discuss quantum gravity. These lecture notes make a more ambitious claim and suggest that decoherent histories might be a generalisation of quantum mechanics (a point also made here in an interview with Gell-Mann). But the notes are from 1992 and I do not know whether it panned out.
 
  • #49
A. Neumaier said:
All this is very high level, not theoretical physics.

Theoretical physics consists of making assumptions and deducing from these interesting things that are found (no matter how) in agreement with observations, explaining this way objectivity, rather than assuming it.

The processing needed to show the agreement is not part of theoretical physics but a side issue. If you want to study it, better observe those who bring theory and experiment into agreement than muse about abstract information-theoretical issues!
We disagree here in our views about what are practical matters or side issues, and what has potentially deeper implications for the foundations that will allow for unification. But this, I think, is due to the kind of answer we seek, and I don't think we will resolve it. Agreeing on the disagreement is good enough, though. I think I've understood your interpretation a bit better; that was interesting enough, as we share the issues with current interpretations.

I get the difference between theoretical processing and ponderings by theorists, which is indeed a side issue; but it's the "relational processing" that one part of nature does with another part that I consider, and I tend to think that the description we have hints at this. The comparison of scientific evolution vs circularity was just to illustrate that apparent circularity may be iteration loops.

/Fredrik
 
  • #50
Morbert said:
Roland Omnès, in his book "The Interpretation of Quantum Mechanics" (section 7.8), argues you would need an apparatus larger than the observable universe.
Thus this is impossible.
Morbert said:
In section VII E (page 94), they discuss quantum mechanics of a relativistic world line. In section VIII (page 106) they discuss quantum gravity.
Thanks. But they do this for coarse-graining only. It is well known that relativistic coarse-graining dates back to Schwinger and Keldysh (1965), much earlier than decoherent histories. Thus this is not specific to decoherent histories but part of the general shut-up-and-calculate machinery.

Measurement is discussed in the lecture notes only in passing, on p.136, in the context of coarse-graining through a semiclassical approximation. Thus the lecture notes are silent about whether the observed fact of unique measurement results is a consequence of the unitary dynamics of the universal state.

I conclude that decoherent histories gives on the interpretation level nothing new compared to what was known earlier, namely Born's rule plus conditional probability. The only new thing is a new name.
 
