Dyson's View Of Wavefunction Collapse

  • Thread starter: bhobba
  • Tags: Quantum mechanics
  • #51
A. Neumaier said:
How do you get Born's rule for two consecutive measurements (which is traditionally used for comparison with experiments on consecutive measurements) from these ##n##-point functions?
Why not just use the Born rule? You may have a desire to derive it from some other special principle describing "measurements". But it doesn't clarify what constitutes a measurement, and as John Bell has argued (Against Measurement), there's really no place for such a concept in a microscopic theory.

The Heisenberg picture differs from the Schrödinger picture in that "measurement" is not thought to happen in an instant. The entire experiment is considered as a whole, lasting over an interval of time that includes "state preparation" and "measurement". A probability can be calculated for any particular history, but you are not restricted to the use of projection operators (as in Consistent Histories). You can, for example, evaluate correlation functions for the positions of a harmonic oscillator at different times. People have applied these to study the quantum behaviour of the LIGO mirrors.
 
  • #52
martinbn said:
And this article, from it
…. The duration and severity of the second stage are decreasing as the years go by. Each new generation of students learns quantum mechanics more easily than their teachers learned it. The students are growing more detached from prequantum pictures. There is less resistance to be broken down before they feel at home with quantum ideas. Ultimately, the second stage will disappear entirely. Quantum mechanics will be accepted by students from the beginning as a simple and natural way of thinking, because we shall all have grown used to it.
“Ultimately” is a long time, but I am not so optimistic. Early childhood makes everyone an Aristotelian, and (as witnessed by the B-level threads in Classical and Relativity) there is a substantial conceptual shift to get from there to Newtonian physics and Galilean relativity. Those who have made it through that shift tend to be heavily invested in the classical model that they’ve worked so hard to internalize.
 
  • #53
martinbn said:
Not sure what you mean by this and how it relates to my post! Are you saying that Dyson takes observations for lambda?
I am saying that Dyson takes certain things for real. These must be described by real physics, and play the role of what Demystifier calls lambda.

WernerQH said:
Why not just use the Born rule?
Because Born's rule is formulated in the Schrödinger picture, and is for single measurements only. For a sequence of measurements one needs state reduction, and hence must find an alternative expression for it when one claims that n-point functions are everything.
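For reference, the standard two-time expression (the textbook Wigner/Lüders formula, sketched here in the Heisenberg picture; not quoted from this thread): for projective measurements of ##A## at time ##t_1## and of ##B## at time ##t_2## on a system with initial state ##\rho##,
$$P(a_1, a_2) = \mathrm{Tr}\left[P_{a_2}(t_2)\,P_{a_1}(t_1)\,\rho\,P_{a_1}(t_1)\,P_{a_2}(t_2)\right],$$
with Heisenberg-picture projectors ##P_a(t) = U(t)^\dagger P_a U(t)##. The sandwiching of ##\rho## by ##P_{a_1}(t_1)## is precisely the state-reduction step; any account based purely on ##n##-point functions has to reproduce it.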
WernerQH said:
as John Bell has argued (Against Measurement), there's really no place for such a concept in a microscopic theory.
But then there must be a replacement doing the same job that Born's rule does. Bell is silent on this.
 
  • #54
A. Neumaier said:
I am saying that Dyson takes certain things for real. These must be described by real physics, and play the role of what Demystifier calls lambda.
I don't think so. Demystifier calls lambda something more restrictive than just anything real. For him lambda is the real state of the quantum system if the wave function represents our knowledge.
 
  • #55
martinbn said:
For him lambda is the real state of the quantum system if the wave function represents our knowledge.
Not quite. The PBR theorem assumes that the wave function represents some kind of probability distribution over an underlying set of states; the latter are ##\lambda##. But that concept of ##\lambda## is not limited to the context of the PBR theorem.

For there not to be any ##\lambda##, it would have to be the case that it is impossible to describe the allowed real states of the system using any set of states. The only QM interpretation I'm aware of that makes any claim sort of like that is the version of Copenhagen that says it's impossible to have a more complete description than the probabilistic description QM provides. But that version doesn't seem to be very popular; certainly it's not the interpretation @Demystifier favors. And I'm not sure it's the interpretation Dyson would have favored either.
 
  • #56
martinbn said:
I don't think so. Demystifier calls lambda something more restrictive than just anything real. For him lambda is the real state of the quantum system if the wave function represents our knowledge.
PeterDonis said:
Not quite. The PBR theorem assumes that the wave function represents some kind of probability distribution over an underlying set of states; the latter are ##\lambda##. But that concept of ##\lambda## is not limited to the context of the PBR theorem.

For there not to be any ##\lambda##, it would have to be the case that it is impossible to describe the allowed real states of the system using any set of states. The only QM interpretation I'm aware of that makes any claim sort of like that is the version of Copenhagen that says it's impossible to have a more complete description than the probabilistic description QM provides. But that version doesn't seem to be very popular; certainly it's not the interpretation @Demystifier favors. And I'm not sure it's the interpretation Dyson would have favored either.
PeterDonis is right; in general lambda can be anything "real", including the wave function. In the special case when the wave function is interpreted as a purely epistemological entity, lambda does not include the wave function, but this is just a special case. The PBR theorem rules this case out, at least if the assumptions of the theorem are satisfied.
 
  • #57
Demystifier said:
PeterDonis is right; in general lambda can be anything "real", including the wave function. In the special case when the wave function is interpreted as a purely epistemological entity, lambda does not include the wave function, but this is just a special case. The PBR theorem rules this case out, at least if the assumptions of the theorem are satisfied.
Yes, I agree with this, but Neumaier said the observations are real. I don't see how the observations can be the lambda. My understanding is that the lambda stands for some parameters that describe some real state, not just anything that is real. You cannot say electrons are real so they are the lambda, right? The lambda needs to be some characteristic of the electrons.
 
  • #58
martinbn said:
Yes, I agree with this, but Neumaier said the observations are real. I don't see how the observations can be the lambda. My understanding is that the lambda stands for some parameters that describe some real state, not just anything that is real. You cannot say electrons are real so they are the lambda, right? The lambda needs to be some characteristic of the electrons.
Would you agree that observations can be described by lambda?
 
  • #59
Demystifier said:
Would you agree that observations can be described by lambda?
How? Say I make a measurement and observe the particle at some place. What lambda describes that observation?
 
  • #60
martinbn said:
How? Say I make a measurement and observe the particle at some place. What lambda describes that observation?
The measurement involves a pointer of a macroscopic apparatus, which is supposed to be described by some lambda too. For instance, in Bohmian mechanics these are the positions of the particles which constitute the measuring apparatus.
 
  • #61
Demystifier said:
The measurement involves a pointer of a macroscopic apparatus, which is supposed to be described by some lambda too. For instance, in Bohmian mechanics these are the positions of the particles which constitute the measuring apparatus.
Yes, but that is different. You have position variables for all the particles; that's your lambda. But those are not the observations. The observations being real doesn't mean that there is some lambda, at least the way I understand it.
 
  • #62
I have seen ##\lambda## invoked as a state, in some space ##\Lambda##, fully specifying the real properties of a system in some ontological model of an operational theory (e.g. Spekkens). A physicist could presumably accept there are real things, but be satisfied with an operational theory.

I haven't been able to find any deep dive by Dyson on the matter, so I don't know what his ultimate position is.
 
  • #63
A. Neumaier said:
Because Born's rule is formulated in the Schrödinger picture, and is for single measurements only.
That's the traditional way of presenting Born's rule. I think it's too narrow a view to tie it to some "measurement" process. For me, it is completely equivalent to the use of a statistical operator. We can express all observable quantities using density matrices. What is missing if we have a theory that produces the numbers that we can check in experiment?

Incidentally, Born applied this rule in complete analogy with the scattering of light in Maxwell's theory. No mention of measurement. I enjoyed reading Dyson's essay mentioned in @martinbn's post #31.

A. Neumaier said:
But then there must be a replacement doing the same job that Born's rule does.
Do you think so? Aren't we trying to add something (metaphysical baggage) to quantum theory that would better be left out? Trying to solve the "measurement problem" looks very much like the attempts at constructing a mechanical model of the ether. We already have a very successful theory, but our view is obstructed by obsolete metaphysical baggage.
 
  • #64
WernerQH said:
Do you think so? Aren't we trying to add something (metaphysical baggage) to quantum theory that would better be left out?
One thing that MWI risks leaving out is space (NRQM) or spacetime. Bohmian mechanics, on the other hand, is committed to the existence of space (it can be more flexible with respect to the existence of, for example, photons as particles), and already has a difficult time replacing its commitment to space with a commitment to spacetime instead.

But is the existence of space metaphysical baggage, which would be better left out? Space does not occur explicitly in Consistent Histories, but instead you get the tensor product there, which serves a similar purpose. Should it be left out, as metaphysical baggage?
 
  • #65
WernerQH said:
That's the traditional way of presenting Born's rule. I think it's too narrow a view to tie it to some "measurement" process. For me, it is completely equivalent to the use of a statistical operator.
It is not. The statistical operator is a purely mathematical concept. In the standard foundations, the only relation between the math of quantum mechanics and the physical world is the Born rule, which, in its modern forms, always explicitly refers to measurement. Without that, Born's rule is empty, since it is not clear probabilities of what are meant.
WernerQH said:
Incidentally, Born applied this rule in complete analogy with the scattering of light in Maxwell's theory. No mention of measurement.
Yes, he did. But it didn't survive the critical historical process that followed, and for good reasons! See Chapter 14.2 of my book 'Coherent Quantum Physics', or Section 3.2 of the preprint https://arxiv.org/abs/1902.10778
WernerQH said:
Aren't we trying to add something (metaphysical baggage) to quantum theory that would better be left out?
We are only adding a reasonably clear (and very successful) statement about the connection of the formalism to physical reality, in a formulation due to von Neumann (1927) that has survived for nearly 100 years, as evidenced by every modern textbook. The exception is my new book on algebraic quantum physics, where Born's rule is replaced by a more elementary, immediate physical principle.
 
  • #66
WernerQH said:
A probability can be calculated for any particular history,
For sufficiently long histories of consecutive measurements of continuous quantities (such as the particle paths measured at CERN), these probabilities become extremely tiny. Almost like the probability that a random text generator produces a sonnet by Shakespeare...
 
  • #67
bhobba said:
... then what is the concern of collapse?
To my mind, there is no concern of collapse when one understands the “wave (or state) function” ##\psi## without adding metaphysical baggage. Carl Friedrich von Weizsäcker has put it in a nutshell:

“We summarize: ##\psi## is knowledge, and knowledge depends on the information collected by the knowing subject. Knowledge is of course not dreaming, not "merely subjective." It is knowledge of objective facts of the past which will turn out to be identical for anybody who has the necessary information; and it is a probability function for the future that holds for everybody who has the same information, and which can be checked empirically through measurement of relative frequencies in the manner described in the third chapter. All paradoxes occur only if one interprets ##\psi## itself in some other sense as an "objective fact," a fact going beyond that at a certain time a certain observer has a certain knowledge. Facts are past events which we in principle can know today.”

In “The Structure of Physics” (the book is a newly arranged and revised English version of "Aufbau der Physik" by Carl Friedrich von Weizsäcker), Chapter 9 “The problem of the interpretation of quantum theory”, Section 9.2.2 ”Gaining information by means of measurement”
 
  • #68
gentzen said:
But is the existence of space metaphysical baggage, which would be better left out?
No, I can't think of physics without space (or spacetime). What I think of as metaphysical baggage is the idea of "objects" moving through spacetime. Sure, we perceive the classical world as composed of objects, and those in turn composed of smaller objects (elementary particles). But they aren't objects in the usual sense: they have properties that are uncertain or undefined, depending on the circumstances.

Does a photon have polarisation? In the Aspect et al. experiments the photons leave the source completely "unpolarized". Do they acquire polarization only on detection? What we know is that the source loses a certain amount of energy, and the detectors gain a certain amount of energy a few nanoseconds later. And current fluctuations in the detectors display some parallelism with those in the source.

Of course it is tempting to "explain" the correlations as due to photons carrying information from the source to the detectors. But it is an explanation that produces a host of new problems. (See the numerous threads on entanglement here on PF!) Quantum field theory does not say where the energy is localized between the emission and absorption events, or which path a photon takes in the double-slit experiment. For many people it is obvious that photons must exist. But "photon" could be a concept like the ether. For Maxwell it was obvious that light waves could not travel without a medium carrying them.
 
  • #69
Lord Jestocost said:
To my mind, there is no concern of collapse when one understands the “wave (or state) function” ##\psi## without adding metaphysical baggage. Carl Friedrich von Weizsäcker has put it in a nutshell:

“We summarize: ##\psi## is knowledge, and knowledge depends on the information collected by the knowing subject. Knowledge is of course not dreaming, not "merely subjective." It is knowledge of objective facts of the past which will turn out to be identical for anybody who has the necessary information; and it is a probability function for the future that holds for everybody who has the same information,
In other words, von Weizsäcker postulates primary reality, given by the objective facts (the ##\lambda##), and a kind of summary of these facts, secondary objective knowledge, encoded in the wave function ##\psi##.

Thus ##\psi## must be a (poorly defined) function of ##\lambda##. The measurement problem is the quest to make this poorly defined relation more precise. It has nothing to do with metaphysical baggage.
 
  • #70
A. Neumaier said:
For sufficiently long histories of consecutive measurements of continuous quantities (such as the particle paths measured at CERN), these probabilities become extremely tiny. Almost like the probability that a random text generator produces a sonnet by Shakespeare...
Why is this a problem?
 
  • #71
WernerQH said:
Quantum field theory does not say where the energy is localized between the emission and absorption events
It does not even have a concept of emission and absorption events (as events in space and time)! So it seems that QFT says nothing about absorption and emission without being interpreted by what you consider to be metaphysical baggage.
 
  • #72
Morbert said:
Why is this a problem?
Because we see extremely improbable things happen all the time. Tiny probabilities are not a good guide to living in the real world.
 
  • #73
A. Neumaier said:
For sufficiently long histories of consecutive measurements of continuous quantities (such as the particle paths measured at CERN), these probabilities become extremely tiny. Almost like the probability that a random text generator produces a sonnet by Shakespeare...
People at CERN are experts at producing very unlikely events. :smile:
 
  • #74
A. Neumaier said:
Because we see extremely improbable things happen all the time. Tiny probabilities are not a good guide to living in the real world.
I recently read large parts of “Scientific Reasoning : The Bayesian Approach” by Colin Howson and Peter Urbach (2006). It made a similar comment regarding tiny probabilities, as an argument against Cournot's principle:
gentzen said:
One other interesting discussion point in that book was that Cournot’s principle is inconsistent (or at least wrong), because in some situations every event that can happen has a very small probability. Glenn Shafer proposes to fix this by replacing “practical certainty” with “prediction”. He may be right. After all, I mostly learned about Cournot’s principle from his Why did Cournot’s principle disappear? and “That’s what all the old guys said.” The many faces of Cournot’s principle. Another possible fix could be to evaluate the smallness of probabilities relative to the entropy of the given situation. That solution came up during discussions with kered rettop (Derek Potter?) on robustness issues:
If an amplitude of 10^-1000 leads to totally different conclusions than an amplitude which is exactly zero, then the corresponding interpretation has robustness issues.
and was later used as an argument against counting arguments in MWI:
For me, one reason to be suspicious of that counting of equally likely scenarios is that this runs into robustness issues again with very small probabilities like 10^-1000. You would have to construct a correspondingly huge amount of equally likely scenarios. But the very existence of such scenarios would imply an entropy much larger than physically reasonable. In fact, that entropy could be forced to be arbitrarily large.
 
  • #75
A. Neumaier said:
Because we see extremely improbable things happen all the time. Tiny probabilities are not a good guide to living in the real world.
I would think coarse graining is what helps us make sense of these things. If I shuffle a deck and lay out the cards, I will observe an order that had a probability of about 1.24e-68. But the more coarse-grained outcome "first card is red" has a probability of 0.5. It's these coarse-grained predictions that we care about.
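The two numbers quoted here can be checked in a few lines; a minimal sketch (variable names are my own), assuming a uniformly random shuffle:

```python
import math

# Probability of one specific ordering of a fairly shuffled 52-card deck:
# there are 52! equally likely orderings.
p_exact = 1 / math.factorial(52)
print(f"{p_exact:.3e}")  # ~1.240e-68

# Coarse-grained outcome: "first card is red" (26 red cards out of 52).
p_red = 26 / 52
print(p_red)  # 0.5
```

The exact ordering has an astronomically small probability, yet one such ordering is observed on every shuffle; only the coarse-grained probability is a useful prediction.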
 
  • #76
Morbert said:
If I shuffle a deck and lay out the cards, I will observe an order that had a probability of about 1.24e-68.
But it is fully determined by the way you shuffled it. So it is clear that the probability is an approximation.

But what should fundamental quantum probabilities of 1.24e-68 mean?
 
  • #77
A. Neumaier said:
But what should fundamental quantum probabilities of 1.24e-68 mean?
Even worse: The probability that a position measurement yields an irrational number is 1, but all actual position measurements produce rational numbers.
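The measure-theoretic fact behind this, spelled out: the rationals ##\mathbb{Q}## are countable and hence have Lebesgue measure zero, so for any normalized wave function ##\psi##,
$$P(x \in \mathbb{Q}) = \int_{\mathbb{Q}} |\psi(x)|^2\,dx = 0, \qquad P(x \notin \mathbb{Q}) = 1.$$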
 
  • #78
A. Neumaier said:
Even worse: The probability that a position measurement yields an irrational number is 1, but all actual position measurements produce rational numbers.
Do they produce a number or an interval?
 
  • #79
martinbn said:
Do they produce a number or an interval?
That depends on the definition of what a measurement result is.

According to Born's rule in every formulation I know, measurement results are real numbers, not intervals.

But in my thermal interpretation, measurement results are uncertain numbers, so they have an intrinsic uncertainty.

I have never seen a definition of a measurement result that would define it as an (open or closed?) interval. Their boundaries would have to be uncertain, too.
 
  • #80
A. Neumaier said:
but all actual position measurements produce rational numbers.
Wouldn't this require perfect resolution to be true? And perfect resolution would not be possible even in principle due to the Wigner-Araki-Yanase theorem.

Instead actual position measurements would be modeled with some POVM and yield a highly localized distribution.
 
  • #81
Morbert said:
Wouldn't this require perfect resolution to be true? And perfect resolution would not be possible even in principle due to the Wigner-Araki-Yanase theorem.

Instead actual position measurements would be modeled with some POVM and yield a highly localized distribution.
Doesn't that also violate the uncertainty principle?
 
  • #82
jbergman said:
Doesn't that also violate the uncertainty principle?
No. A very narrow position distribution will be a very wide distribution in momentum space.
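A minimal numerical sketch of this trade-off (not from the thread; ##\hbar = 1##, a Gaussian wave packet, my own variable names): the momentum distribution obtained by Fourier transform widens as the position distribution narrows, and a Gaussian saturates ##\sigma_x \sigma_p = 1/2##.

```python
import numpy as np

# Gaussian wave packet psi(x) ~ exp(-x^2 / (4 sigma^2)), with hbar = 1.
# Its position std is sigma; its momentum std should come out as 1/(2 sigma).
N, L = 2048, 40.0
sigma = 1.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

psi = np.exp(-x**2 / (4 * sigma**2))
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)  # normalize to unit probability

# Momentum-space wavefunction via discrete Fourier transform (p = k for hbar = 1).
phi = np.fft.fftshift(np.fft.fft(psi)) * dx / np.sqrt(2 * np.pi)
k = np.fft.fftshift(np.fft.fftfreq(N, d=dx)) * 2 * np.pi
dk = k[1] - k[0]

def spread(grid, amp, d):
    """Standard deviation of the probability density |amp|^2 on the grid."""
    prob = np.abs(amp)**2
    prob /= np.sum(prob) * d
    mean = np.sum(grid * prob) * d
    return np.sqrt(np.sum((grid - mean)**2 * prob) * d)

sx = spread(x, psi, dx)
sp = spread(k, phi, dk)
print(sx, sp, sx * sp)  # ~1.0, ~0.5, product ~0.5
```

Shrinking `sigma` makes `sx` smaller and `sp` correspondingly larger, while the product stays at 1/2, consistent with the uncertainty principle.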
 
  • #83
PeterDonis said:
No. A very narrow position distribution will be a very wide distribution in momentum space.
Right, but we were talking about perfect resolution. My point was that we can only measure position up to some range because of the uncertainty principle. A delta function at a single position is an unphysical state that violates the uncertainty principle.
 
  • #84
jbergman said:
My point was that we can only measure position up to some range because of the uncertainty principle.

No, that's a quite popular misconception. The uncertainty principle has nothing to do with this; it tells you about the statistical spread of your measurements, not how precise each measurement can be. The standard deviation for a single measurement is 0.
 
  • #85
jbergman said:
we were talking about perfect resolution.
Perfect resolution means infinite precision in position space and infinite spread in momentum space. The Fourier transform of a delta function is a complex exponential with equal amplitude at every value of momentum. All perfectly consistent with the uncertainty principle, as long as you're okay with things like delta functions. (There are other formulations for those who are squeamish about such things, but they end up at basically the same place.)
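Spelled out (with ##\hbar = 1## and the symmetric Fourier convention; a standard computation, not quoted from the thread): a position eigenstate ##\psi(x) = \delta(x - x_0)## has momentum-space wavefunction
$$\tilde\psi(p) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} \delta(x - x_0)\, e^{-ipx}\, dx = \frac{e^{-ipx_0}}{\sqrt{2\pi}},$$
so ##|\tilde\psi(p)|^2 = 1/(2\pi)## for every ##p##: equal amplitude at every momentum, as stated.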

jbergman said:
My point was that we can only measure position up to some range because of the uncertainty principle.
No, as @weirdoguy has pointed out, that's not correct.

Physically, the reason real measurements can only have finite precision has to do with the finite size of the measuring tools, but that's not something that's driven by the uncertainty principle. It's driven by the fact that measuring tools have to be made of something, and every possible "something" has a finite size.
 
  • #86
PeterDonis said:
Perfect resolution means infinite precision in position space and infinite spread in momentum space. The Fourier transform of a delta function is a complex exponential with equal amplitude at every value of momentum. All perfectly consistent with the uncertainty principle, as long as you're okay with things like delta functions. (There are other formulations for those who are squeamish about such things, but they end up at basically the same place.)
I don't take that as a real state, as that would imply infinite energy. This is essentially a mathematical formalism for computation. Just because the delta functions are the eigenbasis for position space doesn't imply that any real particle is in such a state.
PeterDonis said:
No, as @weirdoguy has pointed out, that's not correct.

Physically, the reason real measurements can only have finite precision has to do with the finite size of the measuring tools, but that's not something that's driven by the uncertainty principle. It's driven by the fact that measuring tools have to be made of something, and every possible "something" has a finite size.
I broadly agree with this, which is why I stated my original post as a question. I will have to read @A. Neumaier's work and others, though, to convince myself that this isn't related to the original question.
 
  • #87
jbergman said:
I don't take that as a real state
It's not. A delta function state is not physically realizable. But that doesn't prevent it from having the necessary mathematical properties to satisfy the uncertainty principle.

jbergman said:
as that would imply infinite energy.
No, it doesn't. An infinite spread in momentum is not the same as infinite energy. The particle has some finite momentum, and therefore some finite energy; we just have complete uncertainty as to which finite momentum it is.
 
  • #88
weirdoguy said:
No, that's a quite popular misconception. The uncertainty principle has nothing to do with this; it tells you about the statistical spread of your measurements, not how precise each measurement can be. The standard deviation for a single measurement is 0.

Yes. That is one reason I like Ballentine so much.

He explains this clearly.

Thanks
Bill
 
  • #89
Demystifier said:
But QM as it is has collapse.

I read somewhere that Schwinger's version of QFT had no collapse. I will see if I can dig up the source.

Added later:
Only Rodney Brooks' paper. While Rodney is a qualified physicist, I look at him more as a populariser:
https://arxiv.org/vc/arxiv/papers/1710/1710.10291v4.pdf

'Quantum collapse was not included in Schwinger’s formulation of QFT'


Thanks
Bill
 
  • #90
bhobba said:
'Quantum collapse was not included in Schwinger’s formulation of QFT'
The full sentence (at the bottom of page 4) is:
Quantum collapse was not included in Schwinger’s formulation of QFT, but it became an important part of the source theory that he developed later.
 
  • #91
Morbert said:
Wouldn't this require perfect resolution to be true? And perfect resolution would not be possible even in principle due to the Wigner-Araki-Yanase theorem.

Instead actual position measurements would be modeled with some POVM and yield a highly localized distribution.
Born's rule in all its textbook forms claims that measurements produce eigenvalues, and says nothing about resolution. This shows that Born's rule is an idealization, but people talk as if it were a universal basic law. Real measurement is something quite complicated.
 