Is the concept of "wave function collapse" obsolete?

In summary: the concept of "wave function collapse" is still widely accepted, but is seen as secondary to more modern concepts.
  • #1
Sophrosyne
TL;DR Summary
In the past, physicists talked of the phenomenon of "wave function collapse" very freely, whereas now there seems to be some reservation about it. Why?

In reading older popular physics literature, physicists used to talk about "wave function collapse" freely and more often. Intuitively, for the interested layperson, talking about whether a particle is behaving as a wave or a particle makes a lot of sense. But I have noticed that on these threads, the concept of "wave function collapse" tends to get noses upturned a little bit. People seem to be suggesting that with QFT, it really doesn't make much sense anymore to keep talking about the phenomenon. But it's such a useful and helpful way to think about it, at least for me as an interested layperson.

Is this true or am I misunderstanding? And if so, is there a better way to intuitively understand what is happening?

(Some math is OK, but please keep it at the advanced HS/early undergrad stage!)
 
  • #2
The wave function collapse, properly understood, is a central part of quantum theory. It refers to how a measurement changes the quantum state. The quantum state does not necessarily represent something in reality, and is a tool to calculate the probabilities of measurement results. The measurement results are considered to be a part of the reality we observe.
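For concreteness, the standard Born-rule expression for those probabilities is
$$P(a) = |\langle a|\psi\rangle|^2 ,$$
where ##|\psi\rangle## is the state assigned before the measurement and ##|a\rangle## is the eigenstate belonging to the outcome ##a##.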
 
  • #3
The wave function used for particles before passing a barrier with two slits and for particles after passing the barrier is manifestly different. The fact of this change (and many other, similar observations in the context of experimental arrangements) is called collapse (or state reduction).

It is captured in an idealized (but often not applicable) form by stating that the state turns into an eigenstate of A upon the observation of A.
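In formulas, for an observable ##A## with eigenvalue ##a## and corresponding projector ##P_a##, this idealized rule (the von Neumann/Lüders projection) reads
$$|\psi\rangle \;\longrightarrow\; \frac{P_a|\psi\rangle}{\lVert P_a|\psi\rangle\rVert}$$
upon observing the outcome ##a##.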

The controversy about the collapse is not whether it is present in situations like that described but whether it is an irreducible effect that must be postulated, or whether it is derivable from the other postulates, e.g., by saying it is a rational change in modeling when an observer updates the state based on improved knowledge. The latter position is sometimes brought across as saying ''there is no collapse''.
 
  • #4
Sophrosyne said:
Summary: In the past, physicists talked of the phenomenon of "wave function collapse" very freely, whereas now there seems to be some reservation about it. Why?

In reading older popular physics literature, physicists used to talk about "wave function collapse" freely and more often. Intuitively, for the interested layperson, talking about whether a particle is behaving as a wave or a particle makes a lot of sense. But I have noticed that on these threads, the concept of "wave function collapse" tends to get noses upturned a little bit. People seem to be suggesting that with QFT, it really doesn't make much sense anymore to keep talking about the phenomenon. But it's such a useful and helpful way to think about it, at least for me as an interested layperson.

Is this true or am I misunderstanding? And if so, is there a better way to intuitively understand what is happening?

(Some math is OK, but please keep it at the advanced HS/early undergrad stage!)
The concept is not obsolete, but is nowadays regarded as secondary, being a consequence of decoherence.
 
  • #5
I see. So it's sounding to me like it's not WRONG to keep talking about wave function collapse to describe what looks to me to be a very real phenomenon. Thank you.
 
  • #6
Michael Price said:
The concept is not obsolete, but is nowadays regarded as secondary, being a consequence of decoherence.
Collapse isn't a consequence of decoherence. Decoherence leaves you with a mixture for macroscopic observables which isn't the same as collapse proper.
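To spell that out: tracing out the environment after decoherence leaves the reduced state
$$\rho_S = \mathrm{Tr}_E\,|\Psi\rangle\langle\Psi| \;\approx\; \sum_i |c_i|^2\,|i\rangle\langle i| ,$$
an (improper) mixture diagonal in the pointer basis, whereas collapse proper would replace the state by a single term ##|i\rangle\langle i|##.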
 
  • #7
Perhaps the people the OP mentions were reacting to the dated concept of "wave-particle duality" prevalent in old literature and still current in popular science.
 
  • #9
epotratz said:
As far as I know, Angelo Bassi's research team is performing the most up-to-date research on wave function collapse:
http://www.qmts.it:8080/?q=publications&biblio_year=

Bassi is not investigating wave function collapse in quantum mechanics. Bassi is investigating theories which go beyond quantum mechanics, and to which quantum mechanics is an approximation.
 
  • #10
OP, you're confusing two ideas. Wave-particle duality comes from the concepts emerging from the double-slit experiment and the photoelectric effect. What we learned from the photoelectric effect is that light can be thought of as discrete particles, while the double-slit experiment showed us that it can also produce an interference pattern like that of a wave. So, light can be "thought of" as particles/waves depending on the circumstance you're in. Emphasis on "thought of".

The collapse of a wavefunction is still an issue today, as we have yet to see if it needs to be an axiom. The best way to wrangle this concept is to look at the Stern-Gerlach experiment.
 
  • #11
DarMM said:
Collapse isn't a consequence of decoherence. Decoherence leaves you with a mixture for macroscopic observables which isn't the same as collapse proper.
Each element of the "mixture" sees the collapse as having happened.
https://en.m.wikipedia.org/wiki/Quantum_decoherence
 
  • #12
Michael Price said:
Each element of the "mixture" sees the collapse as having happened.
https://en.m.wikipedia.org/wiki/Quantum_decoherence
That requires interpretational elements beyond the formalism. In applying QM, decoherence just tells one that the statistics of macroscopic observables will follow classical probability.
 
  • #14
DarMM said:
That requires interpretational elements beyond the formalism. In applying QM, decoherence just tells one that the statistics of macroscopic observables will follow classical probability.
No, the appearance of collapse does not require additions to the formalism - it follows from linearity. It is debated whether the Born statistics require extra formalism, but not the collapse itself.
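For concreteness, the linearity argument is this: if the measurement interaction takes each eigenstate ##|i\rangle|\text{ready}\rangle## to ##|i\rangle|\text{pointer}_i\rangle##, then by linearity it must also take
$$\Big(\sum_i c_i\,|i\rangle\Big)\otimes|\text{ready}\rangle \;\longrightarrow\; \sum_i c_i\,|i\rangle\otimes|\text{pointer}_i\rangle ,$$
so each branch contains a definite pointer reading correlated with the corresponding system state.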
 
  • #15
Michael Price said:
No, the appearance of collapse does not require additions to the formalism - it follows from linearity. It is debated whether the Born statistics require extra formalism, but not the collapse itself.
It does, because with decoherence you are just left with a mixture; the actual state updating does not occur. Even if the mixture includes device states, in the usual formalism this refers to the fact that an (unrealistic) second device would find the first device and system correlated, and the macroscopic degrees of freedom of the first device obeying classical statistics. There is no collapse.
 
  • #16
DarMM said:
It does, because with decoherence you are just left with a mixture; the actual state updating does not occur. Even if the mixture includes device states, in the usual formalism this refers to the fact that an (unrealistic) second device would find the first device and system correlated, and the macroscopic degrees of freedom of the first device obeying classical statistics. There is no collapse.
The subsequent object-subject correlation is the collapse.
 
  • #17
Michael Price said:
The subsequent object-subject correlation is the collapse.
It's not, though, as it only enters the description based around the second device, which itself has not applied collapse. The correlation does not enter the description based around the first device.
 
  • #18
DarMM said:
It's not, though, as it only enters the description based around the second device, which itself has not applied collapse. The correlation does not enter the description based around the first device.
No idea what you're trying to say here. The fact that subsequent immediate measurements (of the same variable) all agree with each other (correlated) is what makes the wavefunction seem to collapse. You don't have to "apply" collapse.
 
  • #19
Michael Price said:
No idea what you're trying to say here. The fact that subsequent immediate measurements (of the same variable) all agree with each other (correlated) is what makes the wavefunction seem to collapse. You don't have to "apply" collapse.
What I'm saying is fairly standard. Decoherence shows the consistency between collapse as a kinematic effect at one level and the unitary dynamics at a higher level, but decoherence does not give you collapse.
 
  • #20
DarMM said:
What I'm saying is fairly standard. Decoherence shows the consistency between collapse as a kinematic effect at one level and the unitary dynamics at a higher level, but decoherence does not give you collapse.
Are we only disagreeing about whether the collapse is apparent or not?
 
  • #21
A. Neumaier said:
The wave function used for particles before passing a barrier with two slits and for particles after passing the barrier is manifestly different. The fact of this change (and many other, similar observations in the context of experimental arrangements) is called collapse (or state reduction).
This is misleading. According to standard quantum mechanics, as in classical electrodynamics, there is not a wave function used for particles before passing a barrier and one after passing the barrier. There's one wave function ##\psi(t,\vec{x})## as a function of space and time. One uses the appropriate initial condition, i.e., a wave packet that is sharply peaked enough to describe a particle "before the barrier" moving towards the barrier. The Schrödinger equation describes how the wave function behaves as a function of time. Nothing collapses.
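For reference, the equation in question (for a single non-relativistic particle in a potential ##V##) is
$$\mathrm{i}\hbar\,\partial_t\psi(t,\vec{x}) = -\frac{\hbar^2}{2m}\nabla^2\psi(t,\vec{x}) + V(\vec{x})\,\psi(t,\vec{x}) ,$$
which evolves the wave packet smoothly and deterministically, barrier region included.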

What is referred to as collapse in some interpretations of QT is an attempt to describe what happens to the wave function when the particle is somehow measured. Suppose you measure position: the collapse assumption then says that when the particle's position is measured through the interaction with the measurement device, the wave function all of a sudden becomes sharply peaked around the measured position.

This is of course somewhat misleading, because you cannot say anything precise about what happens with the particle, when it is not clear which measurement device is used and how it interacts. E.g., if you use a screen to measure the particle's position it's usually absorbed by the screen, and then a description by a single-particle wave function doesn't even make any sense anymore.

The collapse is at best an effective, pragmatic description of something called a "von Neumann filter measurement", i.e., you use something to filter out particles with certain properties. E.g., you can use crossed electric and magnetic fields as a "velocity filter" for charged particles. Then you simply let all particles with a different velocity than the wanted ones hit a wall and get them absorbed. What's left are particles with a pretty well defined velocity, which you can describe by the corresponding wave function, and the choice of this wave function, given the preparation procedure without resolving in all details what dynamically happened, is called "collapse" by some physicists.

The collapse idea, however, is problematic when it is overinterpreted beyond this pragmatic approach and claimed to be part of the physical interpretation of QM. Then you run into serious issues with well-established facts about the causality structure of relativistic physics.
 
  • #22
vanhees71 said:
According to standard quantum mechanics, as in classical electrodynamics, there is not a wave function used for particles before passing a barrier and one after passing the barrier. There's one wave function ##\psi(t,\vec{x})## as a function of space and time.
No. The reason is that not all particles pass the barrier, and this cannot be described by standard quantum mechanics without the collapse.

If you consider a Stern-Gerlach experiment and you block one of the two beams created by the magnet, you lose half the particles, and those that remain can be experimentally verified (by quantum state tomography) to have a different state from what you get when you apply the Schrödinger equation to the input. There is no other way to model in quantum mechanics the absorption at the blocking barrier.
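Schematically, in idealized notation: after the magnet the unitarily evolved state is
$$|\psi\rangle = \alpha\,|{\uparrow}\rangle\otimes|\text{upper beam}\rangle + \beta\,|{\downarrow}\rangle\otimes|\text{lower beam}\rangle ,$$
and blocking the lower beam leaves the surviving sub-ensemble (a fraction ##|\alpha|^2## of the particles) in the state ##|{\uparrow}\rangle\otimes|\text{upper beam}\rangle##, which is not what the Schrödinger equation applied to the input alone gives.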
vanhees71 said:
Then you simply let all particles with a different velocity than the wanted ones hit a wall and get them absorbed. What's left are particles with a pretty well defined velocity, which you can describe by the corresponding wave function, and the choice of this wave function, given the preparation procedure without resolving in all details what dynamically happened, is called "collapse" by some physicists.
By almost all physicists with the exception of a minority that fights the term like you do.
 
  • #23
If you solve the Schrödinger equation for a barrier (an exercise often done in the QM 1 lecture to teach the mathematical techniques on simple examples), there's of course one part of the wave moving through, one part being reflected. That's not different in principle from solving the analogous problem for the em. field. Of course, if you abandon the reflected part and further work only with particles going through the barrier, many get lost and are not considered anymore, but that's it. There's no problem with that, nor is there a collapse. You just choose to work not with all particles but with those running across the barrier. The same holds for the SGE: You choose to work with just one beam; since position and spin are entangled, you choose only those particles with a definite value of the spin component in the direction of the magnetic field the particles have run through. That's how all von Neumann filter measurements work (I'd call it preparation).

I fight the naive non-local collapse assumption, because it's schizophrenic. On the one hand you talk a great deal about why one should use relativistic QFT to describe quantum phenomena in the relativistic realm, namely to avoid contradictions with causality, and on the other hand you apply unnecessary collapse ideas, destroying the very foundations you started with...
 
  • #24
A. Neumaier said:
The wave function used for particles before passing a barrier with two slits and for particles after passing the barrier is manifestly different. The fact of this change (and many other, similar observations in the context of experimental arrangements) is called collapse (or state reduction).

It is captured in an idealized (but often not applicable) form by stating that the state turns into an eigenstate of A upon the observation of A.

The controversy about the collapse is not whether it is present in situations like that described but whether it is an irreducible effect that must be postulated, or whether it is derivable from the other postulates, e.g., by saying it is a rational change in modeling when an observer updates the state based on improved knowledge. The latter position is sometimes brought across as saying ''there is no collapse''.

What I am still trying to wrap my head around is how
Klystron said:
Perhaps the people the OP mentions were reacting to the dated concept of "wave-particle duality" prevalent in old literature and still current in popular science.

No, they were saying that in certain forms of QFT, the distinction becomes superficial and goes away. I have studied a little bit of QFT (emphasis on little) and I haven’t really seen that. Everything is still in terms of superposition and probability waves until the moment of “measurement” or “observation” (concepts which I know are a big can of worms in themselves as far as what they mean).

But they are still two very different things (waves of probability vs the definitiveness of a measured/observed state) as far as I can see.
 
  • #25
vanhees71 said:
You just choose to work not with all particles but with those running across the barrier.
and then you have the collapse, namely a different wave function after the barrier, no matter whether or not you call it collapse. You cannot choose to work with the particles absorbed by the barrier. Nothing is reflected - the barrier problem in QM1 is quite different.
vanhees71 said:
you choose only those particles with a definite value of the spin component in the direction of the magnetic field the particles have run through.
And how do you know that you choose just these by picking one of the partial beams? You already need the collapse to conclude that!
vanhees71 said:
at the same time you apply unnecessary collapse ideas destroying the very foundations you started with...
I never was against collapse as a useful effective description. It is necessary to do quantum physics in practice, even when it is applied only unconsciously (as in your preparations). And it can be derived under the right circumstances; see Ballentine's book.
 
  • #26
Sophrosyne said:
No, they were saying that in certain forms of QFT, the distinction becomes superficial and goes away. I have studied a little bit of QFT (emphasis on little) and I haven’t really seen that. Everything is still in terms of superposition and probability waves until the moment of “measurement” or “observation” (concepts which I know are a big can of worms in themselves as far as what they mean).
In QFT you have fields, hence waves, and particles emerge as an asymptotic approximate concept. The duality appears only for free theories, where one has a number operator and hence can diagonalize this operator to get an exact particle picture.
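(For a free field this number operator is ##N=\int\mathrm{d}^3k\;a^\dagger(\vec{k})\,a(\vec{k})##, and its eigenspaces are the exact n-particle subspaces.)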

Calculations in relativistic QFT produce n-point functions and scattering cross sections; time-dependent states are virtually never used in QFT.
 
  • #27
The problem is that the English language was created before quantum mechanics, and there isn't any really neat choice of phrase that explains the process of making a quantum measurement.

When a quantum system is left undisturbed, it's not possible to know exactly what the state of the system is because the wave function oscillates over time.

When a measurement is taken, the quantum system is disturbed by the measurement, so the very act of measurement will change the quantum state.

One very simple-minded way of looking at it is that to make a measurement you need to interact with the state with something that is about as "big" in terms of mass or energy as the system. Hitting an atom with a photon is like smacking a basketball with a baseball in a dark room, to figure out where it is by watching where the baseball bounces. The basketball is going to move a bit, so you can't pin down exactly where it is, only an approximate location.

You don't know what the state was before the measurement, and you also don't know the state after the measurement. All you can do is make a measurement and know what the state *was*, not what it currently happens to be.

This notion is called "collapsing the wave function" but it's really just disturbing the present state to find out what it is, and that then makes it impossible to be sure what it will be afterwards.

There are some schemes to take a bunch of tiny quantum measurements and use statistics to try to average things out to find a better approximation, but that's like tossing ping pong balls at that metaphorical basketball and guessing that the ping pong balls will on average tend to move the basketball back and forth to a place close to where it was in the first place.

https://phys.org/news/2011-06-canadian-method-quantum-wavefunction.html
 
  • #28
Why do you have a different wave function? You mean the wave function (or more generally a state) for the partial ensemble you filter out? Of course, but that is just as with any probabilistic treatment: you choose the probabilities for the ensemble you are investigating. What else should you use?

I know how to choose the particles: by taking only those which I want, namely, in your example, those which are at a sufficient distance on the other side of the barrier.

This doesn't make any collapse assumption necessary. The selection of the particles on the other side of the barrier involves only local manipulations on these particles. I don't need to assume that instantaneously something happens with the reflected particles far away. I simply don't do anything with them (or rather let them run into a "beam dump" to get rid of them safely ;-))).
 
  • #29
ensign_nemo said:
The problem is that the English language was created before quantum mechanics, and there isn't any really neat choice of phrase that explains the process of making a quantum measurement.

When a quantum system is left undisturbed, it's not possible to know exactly what the state of the system is because the wave function oscillates over time.

When a measurement is taken, the quantum system is disturbed by the measurement, so the very act of measurement will change the quantum state.

One very simple-minded way of looking at it is that to make a measurement you need to interact with the state with something that is about as "big" in terms of mass or energy as the system. Hitting an atom with a photon is like smacking a basketball with a baseball in a dark room, to figure out where it is by watching where the baseball bounces. The basketball is going to move a bit, so you can't pin down exactly where it is, only an approximate location.

You don't know what the state was before the measurement, and you also don't know the state after the measurement. All you can do is make a measurement and know what the state *was*, not what it currently happens to be.

This notion is called "collapsing the wave function" but it's really just disturbing the present state to find out what it is, and that then makes it impossible to be sure what it will be afterwards.

There are some schemes to take a bunch of tiny quantum measurements and use statistics to try to average things out to find a better approximation, but that's like tossing ping pong balls at that metaphorical basketball and guessing that the ping pong balls will on average tend to move the basketball back and forth to a place close to where it was in the first place.

https://phys.org/news/2011-06-canadian-method-quantum-wavefunction.html

I am not sure this is quite right. I think the problem is not really with the English language, but rather with our current state of understanding: we have no idea WHY quantum mechanics behaves the bizarre way that it does. I suspect that when (or even if) we ever understand, we are going to smack our foreheads and it is going to make more sense (although it is invariably going to create a whole bunch of new questions in turn).
 
  • #30
ensign_nemo said:
When a measurement is taken, the quantum system is disturbed by the measurement, so the very act of measurement will change the quantum state...
...You don't know what the state was before the measurement, and you also don't know the state after the measurement. All you can do is make a measurement and know what the state *was*, not what it currently happens to be.
You can make subsequent measurements on the same axis, and you will get the same result. That is, take a Stern-Gerlach setup, and take the spin-up outputs. If you then make subsequent measurements on the same axis, you will get spin up no matter how many times you measure. It's only when you change the axis that the collapse occurs again (the spin components along different axes don't commute).

Even then, it's possible to take the output from a spin-up measurement on the x-axis, send it through an analyzer on the y-axis, take both of its outputs and recombine them coherently, and send them back to an analyzer on the x-axis: you will get spin up again (only if you take both outputs).
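A minimal numerical sketch of this, using the standard spin-1/2 states (Python, purely for illustration):

import numpy as np

# Spin-1/2 eigenstates written in the standard z-basis
up_x = np.array([1, 1], dtype=complex) / np.sqrt(2)    # |+x>
up_y = np.array([1, 1j], dtype=complex) / np.sqrt(2)   # |+y>
dn_y = np.array([1, -1j], dtype=complex) / np.sqrt(2)  # |-y>

psi = up_x  # beam prepared spin-up along x

# Case 1: y-analyzer with BOTH output beams kept and recombined coherently
# (no measurement, no collapse): the state is unchanged, so a second
# x-analyzer gives +x with probability 1.
print(abs(np.vdot(up_x, psi))**2)   # -> 1.0

# Case 2: an actual y-measurement (collapse to |+y> or |-y> with Born weights),
# followed by an x-analyzer: the +x probability drops to 1/2.
p = (abs(np.vdot(up_y, psi))**2 * abs(np.vdot(up_x, up_y))**2
     + abs(np.vdot(dn_y, psi))**2 * abs(np.vdot(up_x, dn_y))**2)
print(p)                            # -> 0.5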

You're trying to say that the collapsing of a wavefunction is ignorance. It's not; these objects are truly in a superposition until measured.
 
  • #31
Michael Price said:
The subsequent object-subject correlation is the collapse.
No, the correlation cannot be the collapse. The collapse is associated with a single observation outcome, while correlation is associated with a large ensemble of observation outcomes.
 
  • #32
Demystifier said:
No, the correlation cannot be the collapse. The collapse is associated with a single observation outcome, while correlation is associated with a large ensemble of observation outcomes.
I should have been more explicit. Each element in the superposition, after the induced correlation, stops 'seeing' the other elements. So the collapse seems to have occurred.
 
  • #33
What does "each element stops seeing the other elements" mean in a non-MWI reading?
 
  • #34
DarMM said:
What does "each element stops seeing the other elements" mean in a non-MWI reading?
It means, once correlated, they time-evolve independently of each other.
 
  • #35
DarMM said:
What does "each element stops seeing the other elements" mean in a non-MWI reading?

Good question. I think it’s about an outside observer and how they would calculate the evolution of psi. Even in non-MW readings, that calculation looks like a MW one until they’ve made their measurement.
 
