A Decoherent Histories and Measurement

  • #31
A. Neumaier said:
We have access only to a single realization of the universe, so the state of the universe must describe this realization, if it is to make any sense.

In particular, we know a lot about the expectation values of a huge number of quantum observables in this state, namely of all accessible 1-point and 2-point functions of fields at points on the Earth and what is reachable from there by means of telescopes and spacecraft. This drastically restricts the state of the universe in its for us most important aspects.

I did this in my book, not in an impeccable way but in enough detail to convince me that a derivation is possible, not only in principle. (For this, chaoticity of the BBGKY hierarchy, or rather its quantum equivalent, is sufficient.)

These values are not uncertain. Expectations are completely predictable from the state.
Ok, I will read that link. I thought "uncertain value" was the preferred nomenclature of the thermal interpretation.
 
  • #32
gentzen said:
No, this is just not true.
It is just what one gets for a sequence of Born measurements, using an ancilla to make a history. Then the ancilla is discarded (without harm, since it is only a mathematical trick).

What else does it have?
 
  • #33
A. Neumaier said:
If the agent does not exist objectively, nothing exists objectively. But the agent is made of atoms, hence physics must be assumed to know about their objective existence. Your approach is heavily circular!
It's "circular" to the same extent that the scientific method or learning is circular, but it's not circular as in "circular reasoning"; the loop is supposed to be a learning/tuning loop, as in the context of evolution, not some entropic self-organisation on a fixed state space.

I use an operational meaning of objectivity: agents learn about each other, and eventually arrive at an agreement in only one way: by communicating/interacting. Operationally, objectivity is thus defined as an (emergent) equivalence relation. So to "classify" the agent interactions is just another lingo for classifying the physical interactions.

So instead of the traditional method of relying on "fictions", I embrace the somewhat "circular" nature of things and try to make sense of it.

In normal QFT we have plenty of "gauge choices" or choices of "observer frames", and sometimes interactions are even explained by their transformation terms. So even if we consider the invariants, how do we sensibly _describe_ these objective invariants without using the arbitrary choices? As I see it, this is a little bit the same with agents... except agents are not fictional choices; they are real.

A. Neumaier said:
We had macroscopic measurement devices in place long before QM became known to us.
Yes, but I had in mind times even before that, say the GUT or Planck era, before we have a clear meaning of spacetime.

A. Neumaier said:
The state evolves according to a deterministic physical law, and carries objective information in the form of N-point functions. These exist at all stages of the evolution of the universe.
Ok, I see; this is how you handle it, and it's part of your view. I just have problems with it: from my perspective you are just assuming that deterministic physical law is ruling things, but identifying WHICH law is a matter of fine-tuning in a huge theory space.

To sum up: I choose evolving loops (what you call circularity) over fine tuning non-observables.

/Fredrik
 
  • #34
Morbert said:
Ok, I will read that link.
The hierarchy is usually heavily truncated, but even in these approximations, chaoticity is very common. I have given more detailed references in Sections 11.6-8 of my book Coherent Quantum Physics. See also the references at the end of Section 3 of my quantum tomography paper at arXiv:2110.05294.
Morbert said:
I thought "uncertain value" was the preferred nomenclature of the thermal interpretation.
Well, it is the value, or the q-expectation. It is the 'uncertain value' only when used as an approximation to observed measurement results, which, at least for macroscopic ##A##, are matched up to the uncertainty of ##A## in the given state.
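As a side note, the distinction between the q-expectation ##\langle A\rangle = \operatorname{Tr}(\rho A)## and the uncertainty ##\sigma(A) = \sqrt{\langle A^2\rangle - \langle A\rangle^2}## can be sketched numerically. A minimal illustration (the state and observable are arbitrary toy choices, not taken from the thermal interpretation literature):

```python
import numpy as np

def q_expectation(rho, A):
    """q-expectation value <A> = Tr(rho A)."""
    return np.trace(rho @ A).real

def uncertainty(rho, A):
    """Uncertainty of A in state rho: sqrt(<A^2> - <A>^2)."""
    mean = q_expectation(rho, A)
    return np.sqrt(q_expectation(rho, A @ A) - mean**2)

sigma_z = np.diag([1.0, -1.0])   # Pauli z as the observable A
rho = np.diag([0.75, 0.25])      # a mixed qubit state

print(q_expectation(rho, sigma_z))  # 0.5
print(uncertainty(rho, sigma_z))    # sqrt(0.75) ≈ 0.866
```

The q-expectation is a completely determined function of the state; the uncertainty quantifies how well it approximates individual observed results.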
 
  • #35
A. Neumaier said:
What else does it have?
The main ingredient is certainly the consistency condition. A secondary ingredient is the (tensor product) structure of the Hilbert space for independent systems. You end up with a mathematical structure that you can instantiate for different base fields, like the real numbers, the complex numbers, or the quaternions. You get a different structure in each case. There is also some structure of the coarsening, the time instances, and the Hilbert space, which typically people try to get rid of when trying to generalize it to quantum field theory and "sum over histories" path integrals.

But even the more canonical part of the mathematical structure allows for interesting mathematical results:
gentzen said:
The connection to "thinking about cosmology" is probably rather that CH wants to recover normal logic and probabilistic reasoning. But how does CH recover normal logic? In your terms, by ensuring that there can be an agent which knows (remembers) all those things which have happened in the past. I guess you don't like that, but that is why I investigated it, and now write it down. The consistency conditions simply force the Hilbert space to be big enough, at least if the initial state is pure (##\rho = \ket{\phi}\bra{\phi}##). The consistency condition ##\operatorname{Tr}(C_\alpha\rho C_\beta^\dagger)=0## for ##\alpha\neq\beta## then reduces to ##(C_\beta\ket{\phi},C_\alpha\ket{\phi})=0##, i.e., the vectors ##C_\alpha\ket{\phi}## are mutually orthogonal. So if there are ##m## histories ##\alpha## with non-zero probability, then the dimension ##N## of the Hilbert space satisfies ##N \geq m##.

If the initial state ##\rho## has rank ##r## instead of being pure, then we only get ##r N \geq m##. (One proves this by replacing ##\rho## with a pure state in a suitably enlarged Hilbert space, see Diósi for details.) This bound can really be achieved, for example take ...
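The pure-state argument can be checked numerically. A hedged sketch (the Hadamard step, projectors, and initial state are toy choices, not from any specific CH paper): for a pure initial state, off-diagonal entries of the decoherence functional are inner products of branch vectors ##C_\alpha\ket{\phi}##, so consistency forces the branches to be orthogonal and bounds ##m \leq N##. The two-time family below fails the consistency test, while coarse-graining over the intermediate time restores it:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)    # Hadamard as the unitary step
P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]  # projectors |0><0|, |1><1|
phi = np.array([1.0, 0.0])                      # pure initial state |0>

# Two-time history family: branch vectors C_(a1,a2)|phi> = P_a2 H P_a1 H |phi>
branches = {(a1, a2): P[a2] @ H @ P[a1] @ H @ phi
            for a1 in (0, 1) for a2 in (0, 1)}

def D(ca, cb):
    """Decoherence functional entry <C_beta phi | C_alpha phi>."""
    return np.vdot(cb, ca)

# Consistency check: all off-diagonal D(alpha, beta) must vanish.
keys = list(branches)
off_diag = [D(branches[a], branches[b]) for a in keys for b in keys if a != b]
consistent = np.allclose(off_diag, 0)
print("consistent family?", consistent)  # False: these histories interfere

# Coarse-graining away the intermediate time (summing over a1) gives a
# consistent family, and the bound N >= m holds (here N = 2, m = 1).
coarse = {a2: sum(branches[(a1, a2)] for a1 in (0, 1)) for a2 in (0, 1)}
m = sum(np.vdot(v, v).real > 1e-12 for v in coarse.values())
print("nonzero-probability coarse histories: m =", m, "<= N = 2")
```

The fine-grained family interfering while its coarse-graining is consistent is exactly the situation the consistency condition is designed to sort out.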

gentzen said:
Then I wanted to bound the dimension of the Hilbert space of CH, ... While searching ..., I found a paper with a disappointingly weak bound, but an insightful remark following that bound:
Fay Dowker and Adrian Kent said:
In other words, if the Hilbert space of the universe is finite-dimensional there is a strict bound on the number of probabilistic physical events. Once this number has occurred, the evolution of the universe continues completely deterministically. This is mathematically an unsurprising feature of the formalism but, as far as we are aware, physically quite new: no previous interpretation of quantum theory has suggested that quantum stochasticity is exhaustible in this way.
Morbert said:
@gentzen They presumably infer this from this lemma
[attached image: lemma12-png.png]

This limit is a bound on the fine-graining of ##\mathcal{S}##. But there is also a complementary set ##\mathcal{S}'## that can return probabilities for events ##\mathcal{S}## can't address. That is, this is less a bound on the probabilistic events that can occur in the universe, and more a bound on the universe's ability to have some observable ##O = \sum_i^k \lambda_i \Pi_i## capable of recording a history.
I now tried to understand this issue better, both the insightful remark by Dowker and Kent, and how I can think about the sharp bound itself. (I am still focused on small closed systems.) ...

CH does avoid wavefunction collapse (and its apparent nonlocality), but the remark raises the suspicion that it might not succeed to treat quantum time development as an inherently stochastic process. ...

For thinking about the sharp bound itself, the time-symmetric formulation of CH with two hermitian positive semidefinite matrices ##\rho_i## and ##\rho_f## satisfying ##\operatorname{Tr}(\rho_i \rho_f)=1## seems well suited to me. The decoherence functional then reads ##D(\alpha,\beta)=\operatorname{Tr}(C_\alpha\rho_i C_\beta^\dagger\rho_f)## and the bound on the number ##m## of histories ##\alpha## with non-zero probability becomes ##\operatorname{rank}(\rho_i)\operatorname{rank}(\rho_f)\geq m##. Interpreting ##\rho_i## as corresponding to pre-selection ("preparation") and ##\rho_f## as post-selection ("measurement") gives at least some intuition why there is that unexpected product in the bound.
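The product structure of the bound can be checked in a toy model. A hedged sketch (all matrices are illustrative choices): with ##D(\alpha,\beta)=\operatorname{Tr}(C_\alpha\rho_i C_\beta^\dagger\rho_f)## and ##\operatorname{Tr}(\rho_i\rho_f)=1##, a rank-1 pre-selection and a trivial (rank-2) post-selection allow ##m = 2## histories, saturating ##\operatorname{rank}(\rho_i)\operatorname{rank}(\rho_f)\geq m##:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard
rho_i = np.diag([1.0, 0.0])                   # pure pre-selection, rank 1
rho_f = np.eye(2)                             # trivial post-selection, rank 2
assert np.isclose(np.trace(rho_i @ rho_f), 1.0)  # normalization Tr(rho_i rho_f) = 1

P = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]
C = [P[a] @ H for a in (0, 1)]                # class operators: project after H

def D(a, b):
    """Time-symmetric decoherence functional Tr(C_a rho_i C_b^dagger rho_f)."""
    return np.trace(C[a] @ rho_i @ C[b].conj().T @ rho_f)

# Consistency: off-diagonal entries vanish for this family.
assert np.isclose(D(0, 1), 0) and np.isclose(D(1, 0), 0)

probs = [D(a, a).real for a in (0, 1)]
m = sum(p > 1e-12 for p in probs)
bound = np.linalg.matrix_rank(rho_i) * np.linalg.matrix_rank(rho_f)
print(probs, "m =", m, "bound =", bound)  # [0.5, 0.5] m = 2 bound = 2
```

With a pure ##\rho_f## instead, the post-selection would cut the allowed number of histories back down, which matches the preparation/measurement intuition above.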

I had planned to look into different formulations of CH for some time, for a completely different reason: vanhees71 believes that the minimal statistical interpretation can be applied to the time evolution of a single quantum system over an indeterminate time. I would like to understand whether this is indeed possible. Trying to analyse this scenario with the standard formulation of CH doesn't work well, but I knew that there were different formulations of CH, some of which seemed to incorporate features relevant for that scenario. The bound itself rather reduces my confidence that CH will convince me that this is possible. ...
 
  • #36
Fra said:
how do we sensibly _describe_ these objective invariants, without using the arbitrary choices?
By showing how the results from the different choices are related to each other,
providing in this way an observer-independent objectivity.
Fra said:
I had in mind times even before that, say GUT or Planck era, before we have a clear meaning of spacetime.
In my view, spacetime exists prior to everything. Whether something else has to be posited in the GUT or Planck era is pure speculation.
Fra said:
you are just assuming that deterministic physical law is ruling things, but identifying WHICH law is a matter of fine-tuning in a huge theory space.
No. I use the well-established standard model of quantum field theory, allowing for small unknown modifications to accommodate gravity. Unless I consider cosmology, where I can use any of the standard cosmological models. The foundations do not depend on the details.
 
  • #37
gentzen said:
A secondary ingredient is the (tensor product) structure of the Hilbert space for independent systems.
This was known since before 1930. It belongs to the standard toolkit of QM, not to an interpretation.

gentzen said:
You end up with a mathematical structure that you can instantiate for different base fields, like the real numbers, the complex numbers, or the quaternions.
But physicists only use the complex version; the others are ruled out by experiment.
gentzen said:
There is also some structure of the coarsening,
which structure? It cannot be significantly different from the earlier established methods, averaged second-order perturbation theory, projection formalism, or Schwinger-Keldysh formalism. They are also of the shut-up-and-calculate type, hence not part of the interpretation.
gentzen said:
But even the more canonical part of the mathematical structure
which is just shut-up-and-calculate applied to the Born rule for histories.
 
  • #38
A. Neumaier said:
By showing how the results from the different choices are related to each other,
providing in this way an observer-independent objectivity.
Yes, agreed.

But then objectivity must be demonstrably shown, not just assumed. The results from the choices that are made must be collected and processed; and unless we work with fictions, I consider that to be a physical process. Essentially: let the actual population of these "agents" interact for real, and see what the steady state is like. Once that is in place, objectivity is physically manifest, not sooner. This is how I see the "emergence".

/Fredrik
 
  • #39
Fra said:
Yes, agreed.

But then objectivity must be demonstrably shown, not just assumed. The results from the choices that are made must be collected and processed; and unless we work with fictions, I consider that to be a physical process. Essentially: let the actual population of these "agents" interact for real, and see what the steady state is like. Once that is in place, objectivity is physically manifest, not sooner. This is how I see the "emergence".
All this is very high level, not theoretical physics.

Theoretical physics consists of making assumptions and deducing from these interesting things that are found (no matter how) in agreement with observations, explaining this way objectivity, rather than assuming it.

The processing needed to show the agreement is not part of theoretical physics but a side issue. If you want to study it, better observe those who bring theory and experiment into agreement than muse about abstract information-theoretical issues!
 
  • #40
A. Neumaier said:
This was known since before 1930. It belongs to the standard toolkit of QM, not to an interpretation.
That doesn't make it right. Schrödinger was careful in his "cat paper" to remain unsure whether it would remain correct for the "unification" of QM with special relativity. My understanding is that in QFT independence is modeled by commutation of measurement operators for spacelike separated spacetime regions, and the recent MIP*=RE result disproved the earlier expectations that both would be equivalent.

A. Neumaier said:
But physicists only use the complex version; the others are ruled out by experiment.
Can you even make this distinction and rule out the non-complex versions without a mathematical structure where this distinction exists?

A. Neumaier said:
which structure? It cannot be significantly different from the earlier established methods, averaged second-order perturbation theory, projection formalism, or Schwinger-Keldysh formalism. They are also of the shut-up-and-calculate type, hence not part of the interpretation.
It is not a nice structure, it is just there. It is a part of the interpretation that even its proponents try to get rid of, but it is there.

A. Neumaier said:
which is just shut-up-and-calculate applied to the Born rule for histories.
In the end, you are simply not interested in the consistent histories interpretation. I cannot blame you for that. But what is dangerous is to make blanket statements about interpretations that you don't really understand, nor care about. You didn't do yourself any favor with your old (2000) paper claiming Bohmian mechanics contradicts quantum mechanics. Similarly, S. Goldstein did not do himself a favor with section 2 Decoherent Histories in his Quantum Theory without Observers—Part One.
 
  • #41
gentzen said:
That doesn't make it right. Schrödinger was careful in his "cat paper" to remain unsure whether it would remain correct for the "unification" of QM with special relativity.
But decoherent histories don't even address relativity issues!
gentzen said:
My understanding is that in QFT independence is modeled by commutation of measurement operators for spacelike separated spacetime regions, and the recent MIP*=RE result disproved the earlier expectations that both would be equivalent.
I don't see how a result in quantum complexity has any bearing on this equivalence.
gentzen said:
Can you even make this distinction and rule out the non-complex versions without a mathematical structure where this distinction exists?
No, but when you make it, you cannot reproduce standard QM, hence it is disproved and can be ignored.
gentzen said:
In the end, you are simply not interested in the consistent histories interpretation.
Only because it adds nothing for the interpretation of the physically relevant setting of QM.
gentzen said:
I cannot blame you for that. But what is dangerous is to make blanket statements about interpretations that you don't really understand
I think I understand the interpretation. This was not the first occasion on which I spent time on it.
gentzen said:
You didn't do yourself any favor with your old (2000) paper claiming Bohmian mechanics contradicts quantum mechanics.
I still think that paper is valid; the replies from the Bohmian camp didn't convince me of the contrary.

Bohmian mechanics (on which I unfortunately spent far too many days) is a very superficial enhancement of quantum mechanics adding irrelevant extra structure. In exchange for it they lose important structure existing in QM, namely phase space symmetries and relativistic symmetries that are fundamental for a proper understanding of quantum mechanics (semiclassical approximations) and quantum field theory (conservation laws).
 
Last edited:
  • #42
A. Neumaier said:
I still think that paper is valid; the replies from the Bohmian camp didn't convince me of the contrary.
Did they even bother to respond? Your paper simply repeats an old misunderstanding, which already existed in the 50s, and which is well explained in many accounts of BM (even in the discussion of BM in Franck Laloë's "Do we really understand quantum mechanics").

A. Neumaier said:
Bohmian mechanics (which I spent many days on)
without noticing that you are not the first to raise that objection?
A. Neumaier said:
is a very superficial enhancement of quantum mechanics adding irrelevant extra structure.
The extra structure provides a source of randomness, even for finite-dimensional systems. No need for a wavefunction of the universe as in MWI, even though you can consider one if you really want.
 
  • #43
A. Neumaier said:
This means that nothing at all is predicted, since everything is objectively correct.
This perverts the meaning of 'objective'.
There is an analogous process in classical physics. If you have a classical theory of a 6-sided die, you can choose a coarse graining and compute probabilities for events [1 or 2], [3 or 4], [5 or 6]. Alternatively, you could choose a different coarse graining and compute probabilities for [1 or 6], [2 or 4], [3 or 5]. Both coarse grainings return objectively correct probabilities. This is uncontroversial, because the two coarse grainings aren't mutually exclusive alternatives for reality. They are addressing different claims about the same reality. The same is true for different coarse grainings in quantum theory, only now we have different maximally fine-grained representations as well as different coarse grainings. For example, for a detector, one might define alternative pointer outcomes {D_0, D_1}; another might define {U+, U-}, which are macroscopic superpositions. These coarse grainings aren't mutually exclusive alternatives, so no inconsistency arises.
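The classical die analogy can be made concrete in a few lines (a minimal sketch of the two coarse grainings named above; nothing here is specific to CH):

```python
from fractions import Fraction

p = {face: Fraction(1, 6) for face in range(1, 7)}  # fair six-sided die

coarse_A = {"1 or 2": [1, 2], "3 or 4": [3, 4], "5 or 6": [5, 6]}
coarse_B = {"1 or 6": [1, 6], "2 or 4": [2, 4], "3 or 5": [3, 5]}

for coarse in (coarse_A, coarse_B):
    # Each coarse graining partitions the sample space, so its event
    # probabilities are exact and sum to 1; the two answer different
    # questions about the same underlying die.
    probs = {event: sum(p[f] for f in faces) for event, faces in coarse.items()}
    assert sum(probs.values()) == 1
    print(probs)  # every event has probability 1/3
```

Classically both partitions refine to the same sample space; the quantum novelty is that different maximally fine-grained families need not.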
This again involves choosing additional things beyund the coordinaters of the lab, from which the state of the universe should be able to deduce what is in the lab - the arrangement, the chooser of the model and the coarse-graining, and everything else. But instead, decoherent histories has to use all this as additional input.
Given any physical theory, in order to compute probabilities for mutually exclusive alternatives, those mutually exclusive alternatives must be constructed. This does not seem like a great offense to me.
And what is the meaning of these probabilities? There is only one universe, in which some things actually happen and everything else does not happen.
Testing a cosmological theory would presumably involve experimentally testing possibilities for which the theory assigns a probability very close to 1. Even if a probability for some statement is not exactly 1, a negative result would still imply a loss of confidence in the theory.
 
  • #44
gentzen said:
Did they even bother to respond?
Yes.

I am proud of the citation of my paper in Streater's Lost Causes in and beyond Physics from 2007. Streater is one of the authors of the well-known book PCT, spin and statistics, and all that.
gentzen said:
The extra structure provides a source of randomness
But at the cost of losing important structure. There are covariant ways of introducing noise into a system.
 
  • #45
Re/ Decoherent histories and relativity. For posterity: https://arxiv.org/abs/gr-qc/9304006

The interpretation can be applied to a quantum theory of mechanics, of fields, of spacetime geometries, etc., so long as the theory exists. Basically, the interpretation can be applied to any theory that permits the construction of Boolean lattices from event algebras.
 
  • #46
Morbert said:
another might define {U+, U-} which are macroscopic superpositions.
How can one read a value from a macroscopic superposition? (I am not really interested in the answer, or in a further continuation of this, for me, stale discussion.)
 
  • #47
Morbert said:
Re/ Decoherent histories and relativity. For posterity: https://arxiv.org/abs/gr-qc/9304006
Can you please point to a page of this thick book where the discussion of relativity starts? This is new to me, so I will look at it more closely.
 
  • #48
A. Neumaier said:
How can one read a value from macroscopic superposition?
Roland Omnes, in his book "The Interpretations of Quantum Mechanics" (section 7.8) argues you would need an apparatus larger than the observable universe.
A. Neumaier said:
Can you please point to a page of this thick book where the discussion of relativity starts? This is new to me, so I will look at it more closely.
In section VII E (page 94), they discuss the quantum mechanics of a relativistic world line. In section VIII (page 106), they discuss quantum gravity. These lecture notes make a more ambitious claim and suggest that decoherent histories might be a generalisation of quantum mechanics (a point also made here in an interview with Gell-Mann). But the notes are from 1992, and I do not know whether it panned out.
 
  • #49
A. Neumaier said:
All this is very high level, not theoretical physics.

Theoretical physics consists of making assumptions and deducing from these interesting things that are found (no matter how) in agreement with observations, explaining this way objectivity, rather than assuming it.

The processing needed to show the agreement is not part of theoretical physics but a side issue. If you want to study it, better observe those who bring theory and experiment into agrement than to muse about abstract information theoretial issues!
We disagree here in our views about what are practical matters or side issues, and what has potentially deeper implications for the foundations that will allow for unification. I think this is due to the kind of answers we seek, and I don't think we will resolve it. But agreeing on the disagreement is good enough; I think I've understood your interpretation a bit better, and that was interesting enough, as we shared the issues with current interpretations.

I get the difference between theoretical processing and ponderings by theorists, which is indeed a side issue. But it's the "relational processing" one part of nature does with another part that I consider, and I tend to think that the description we have hints at this. The comparison with scientific evolution vs. circularity was just to illustrate that apparent circularity may be iteration loops.

/Fredrik
 
  • #50
Morbert said:
Roland Omnes, in his book "The Interpretations of Quantum Mechanics" (section 7.8) argues you would need an apparatus larger than the observable universe.
Thus this is impossible.
Morbert said:
In section VII E (page 94), they discuss quantum mechanics of a relativistic world line. In section VIII (page 106) they discuss quantum gravity.
Thanks. But they do this for coarse-graining only. It is well known that relativistic coarse-graining dates back to Schwinger and Keldysh (1965), much earlier than decoherent histories. Thus this is not specific to decoherent histories but part of the general shut-up-and-calculate machinery.

Measurement is discussed in the lecture notes only in passing, on p.136, in the context of coarse-graining through a semiclassical approximation. Thus the lecture notes are silent about whether the observed fact of unique measurement results is a consequence of the unitary dynamics of the universal state.

I conclude that decoherent histories gives on the interpretation level nothing new compared to what was known earlier, namely Born's rule plus conditional probability. The only new thing is a new name.
 
