Is the wave function real or abstract statistics?

  • #51
Jilang,

The underlying reality is the states of the spinning coin (heads or tails). You couldn't have probabilities of where the coin would land if heads or tails weren't objective, real states. The states of the quantum system have to be objectively real if there's a probability that the states can be measured. How can you measure a state that doesn't exist?

The ensemble interpretation and most Copenhagen interpretations say you can't know the state of the system prior to measurement, so just shut up and calculate. I believe they say this because if you accept that the states are real states, then you have to accept the weirdness as being real also. So the objection to the states being real isn't scientific but semantic. It's based on rejecting the idea that quantum weirdness is objectively real. So far, most experiments have suggested otherwise, and there's no hidden-variable theory that will turn Heisenberg into Newton, so to speak.
 
  • #52
Here is a paper that claims to show that the wave function can be interpreted as representing a state of knowledge - i.e. the underlying true state may correspond to more than one wave function.

http://arxiv.org/abs/1303.2834
Psi-Epistemic Theories: The Role of Symmetry
Scott Aaronson, Adam Bouland, Lynn Chua, George Lowther
 
  • #53
matrixrising said:
Jilang,

The underlying reality is the states of the spinning coin (heads or tails). You couldn't have probabilities of where the coin would land if heads or tails weren't objective, real states. The states of the quantum system have to be objectively real if there's a probability that the states can be measured. How can you measure a state that doesn't exist?

The ensemble interpretation and most Copenhagen interpretations say you can't know the state of the system prior to measurement, so just shut up and calculate. I believe they say this because if you accept that the states are real states, then you have to accept the weirdness as being real also. So the objection to the states being real isn't scientific but semantic. It's based on rejecting the idea that quantum weirdness is objectively real. So far, most experiments have suggested otherwise, and there's no hidden-variable theory that will turn Heisenberg into Newton, so to speak.

Sure you can. Heads or tails are real states, but until you bring your hand down you cannot say which one it is in. Until that point it is best described by a superposition of the two states. There are theories that can reconcile the weirdness with what we understand so far about the universe. See previous posts. If you can believe in zero-point energy, then the motion of particles can cause disturbances in it that can reproduce interference effects, etc., in quite a classical way. If you don't believe in it, then you are going to struggle, as I did for quite a long time.
 
  • #54
Demystifier,

You make some good points and this is why I think Lundeen was a huge success. It showed a one to one correspondence between the spatial wave function of a single photon and an ensemble of photons with identical spatial wave functions. So the state of a single photon was reconstructed even as the ensemble grew. Here's more about weak measurements from Wiki:

The weak value of the observable becomes large when the post-selected state, |φ₂⟩, approaches being orthogonal to the pre-selected state, |φ₁⟩.[1][4][5] In this way, by properly choosing the two states, the weak value of the operator can be made arbitrarily large, and otherwise small effects can be amplified.[6][7]

Related to this, the research group of Aephraim Steinberg at the University of Toronto confirmed Hardy's paradox experimentally using joint weak measurement of the locations of entangled pairs of photons.[8][9] Independently, a team of physicists from Japan reported in December 2008, and published in March 2009, that they were able to use joint weak measurement to observe a photonic version of Hardy's paradox. In this version, two photons were used instead of a positron and an electron, and the result relied not on non-annihilation but on measured polarization degrees of freedom.[10]

Building on weak measurements, Howard M. Wiseman proposed a weak value measurement of the velocity of a quantum particle at a precise position, which he termed its "naïvely observable velocity". In 2010, a first experimental observation of trajectories of a photon in a double-slit interferometer was reported, which displayed the qualitative features predicted in 2001 by Partha Ghose[11] for photons in the de Broglie-Bohm interpretation.[12][13]

In 2011, weak measurements of many photons prepared in the same pure state, followed by strong measurements of a complementary variable, were used to reconstruct the state in which the photons were prepared.[14]
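For reference, the weak value quoted in the first paragraph above is, in standard notation,

$$A_w = \frac{\langle\phi_2|\hat A|\phi_1\rangle}{\langle\phi_2|\phi_1\rangle},$$

so it diverges as the post-selected state |φ₂⟩ approaches orthogonality with the pre-selected state |φ₁⟩ - which is exactly the amplification effect the quote describes.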

I think that last part is the ball game and, like John Gribbin said, the last nail in the coffin of ensemble interpretations.

However, hopes for turning quantum mechanics back into a classical theory were dashed. Gribbin continues:

"There are many difficulties with the idea, but the killer blow was struck when individual quantum entities such as photons were observed behaving in experiments in line with the quantum wave function description. The Ensemble interpretation is now only of historical interest."[9]

I think it's even worse with the recent Lundeen result. The identical spatial wave functions of individual photons were reconstructed over an ensemble of photons. This is a one-to-one correspondence between the spatial wave function of an individual photon and that of an ensemble of photons.
 
  • #55
Jilang, you said:

Heads or tails are real states, but until you bring your hand down you cannot say which one it is in.

You're just describing the uncertainty of the observer as to which state will be measured. The states have to be objectively real in order for them to be probable states. The only reason you have a probability of measuring heads or tails is because the states heads and tails are an underlying reality.

How can the observer measure a probable state that's not an underlying reality? The states of the quantum system have to objectively exist in order for them to be probable states that can be measured by the observer.
 
  • #56
matrixrising said:
Jilang, you said:

Heads or tails are real states, but until you bring your hand down you cannot say which one it is in.

You're just describing the uncertainty of the observer as to which state will be measured. The states have to be objectively real in order for them to be probable states. The only reason you have a probability of measuring heads or tails is because the states heads and tails are an underlying reality.

How can the observer measure a probable state that's not an underlying reality? The states of the quantum system have to objectively exist in order for them to be probable states that can be measured by the observer.

Yes, you are correct that the state heads and the state tails are an underlying reality, but the state half heads and half tails isn't. Quantum states are generally of the second kind until they are measured.
 
  • #57
Jilang,

Why isn't the state half heads and half tails an underlying reality for the quantum system? This is the fallacy of Schrödinger's cat. People say it can't be an underlying reality for a quantum system as described by the wave function because it doesn't make classical sense. Why should it? Experiment after experiment has shown a quantum system just doesn't make classical sense unless you say the classical world emerged from these states of the quantum system. This way, there's no need to make the underlying reality of the quantum system conform to your classical experience.
 
  • #58
@matrixrising: yes, it is possible that the wave function is the full and true state of single systems. However, take a look at the paper I linked to in post #52, which constructs theories that reproduce quantum mechanics in which an underlying true state can correspond to more than one wave function - i.e. the wave function is at least in part a state of ignorance about the true underlying state.
 
  • #59
atyy,

Thanks, I missed that and I will look at the paper.
 
  • #60
matrixrising said:
You're just describing the uncertainty of the observer as to which state will be measured. The states have to be objectively real in order for them to be probable states. The only reason you have a probability of measuring heads or tails is because the states heads and tails are an underlying reality. How can the observer measure a probable state that's not an underlying reality? The states of the quantum system have to objectively exist in order for them to be probable states that can be measured by the observer.
Maybe I'm misunderstanding, but isn't this just the whole non-locality versus realism issue? Norsen, in a previous post in this forum, provided a local and non-realist (in some sense) model:
Here's a model that's non-realistic but perfectly Bell local: each particle has no definite, pre-existing, pre-scripted value for how the measurements will come out. Think of each particle as carrying a coin, which, upon encountering an SG device, it flips -- heads it goes "up", tails it goes "down". That is certainly not "realistic" (in the sense that people are using that term here) since there is no fact of the matter, prior to the measurement, about how a given particle will respond to the measurement; the outcome is "created on the fly", so to speak. And it's also perfectly local in the sense that what particle 1 ends up doing is in no way influenced by anything going on near particle 2, or vice versa. Of course, the model doesn't make the QM/empirical predictions. But it's non-realist and local. And hence a counter-example to any claim that being Bell local requires/implies being "realist".

This is actually the type of model that some like Khrennikov advocate (from my understanding), but he also says that underneath there's a subquantum reality, and his model does make different predictions. Actually he argues that this was also Einstein's view. Consider:
...The main distinguishing feature of the present Vaxjo interpretation is the combination of realism on the subquantum level with nonobjectivity of quantum observables (i.e., impossibility to assign their values before measurements). Hence, realism is destroyed by detectors transforming continuous subquantum reality into discrete events, clicks of detectors. The Vaxjo interpretation-2012 is fundamentally contextual in the sense that the value of an observable depends on measurement context. This is contextuality in Bohr’s sense. It is more general than Bell’s contextuality based on joint measurements of compatible observables.
https://www.physicsforums.com/showthread.php?t=721995

But I'm not mathematically competent enough to judge whether his argument against Bell's assumptions for the Bell inequality is valid. I was hoping someone could shed some light. Then again, I might be messing this up.
 
  • #61
matrixrising said:
Yes, I'm claiming that the state must be real or there's no probability of the state occurring if it isn't coupled with an underlying reality.

matrixrising said:
Why isn't the state half heads and half tails an underlying reality for the quantum system?

You do understand what probabilities are, don't you? They are not real. They are, depending on your view, either something very abstract defined by the Kolmogorov axioms, or simply a confidence level you have in something being true as defined by the so-called Cox axioms, i.e. an extension of logic.
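For reference, the Kolmogorov axioms in their usual form, for events E in a sample space Ω:

$$P(E) \ge 0, \qquad P(\Omega) = 1, \qquad P\Big(\bigcup_i E_i\Big) = \sum_i P(E_i) \ \text{ for mutually exclusive } E_i.$$

Note that nothing in them says what a probability physically is - they only fix how probabilities must combine.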

Neither is real in any sense most would call real - although you would probably find philosophers that argue the point. But physics is not philosophy and the most reasonable view is the usual one.

Many think the state is like that - but for some reason you simply do not get it.

Thanks
Bill
 
  • #62
matrixrising said:
You're just describing the uncertainty of the observer as to which state will be measured.

You are confused. The state is the uncertainty. The state tells us the expected outcome of an observation - it does not tell us what outcome will be measured. This is the exact analogue of probabilities and the spinning coin. While it is spinning we have no idea which side it will land on; all we have is a certain confidence level in each possible outcome. That is the state - i.e. it tells us this likelihood.

The measurements are real, the state tells us their expected outcomes. The state isn't real - or to be exact doesn't have to be.

Thanks
Bill
 
  • #63
matrixrising said:
People say it can't be an underlying reality for a quantum system as described by the wave function because it doesn't make classical sense.

I don't think anyone says the state can't be real and depend on an underlying reality - interpretations like DBB more or less say that by introducing things like a pilot wave.

That's not the issue here - the issue is MUST it be like that.

Thanks
Bill
 
  • #64
matrixrising said:
I think that last part is the ball game and, like John Gribbin said, the last nail in the coffin of ensemble interpretations.
[...]

We had that already, and I told you already that you are quoting things out of context. Again, if you actually bother to look at what is meant by ensemble interpretation in this context, reference 9 from Wikipedia says:
"An interpretation of quantum mechanics originally developed by Albert Einstein in the hope of removing some (or all!) of the mystery from quantum theory. The basic idea is that each quantum entity (such as an electron or a photon) has precise quantum properties (such as position and momentum), and the quantum wave function is related to the probability of getting a particular experimental result when one member (or many members) of the ensemble is somehow selected by experiment."

This is an old classical idea and NOT what is today known as Ballentine's ensemble interpretation. The ensemble interpretation does not assume precise underlying properties. Please stop repeating things you know are wrong and misinforming people.

matrixrising said:
I think it's even worse with the recent Lundeen result. The identical spatial wave functions of individual photons were reconstructed over an ensemble of photons. This is a one-to-one correspondence between the spatial wave function of an individual photon and that of an ensemble of photons.

What part of "Lundeen has not measured the wavefunction of an individual particle" is so hard to understand? I have given you an excerpt of a peer reviewed article from Steinberg's group beforehand clearly stating that discussing such properties of individual particles is meaningless.

Do you get the difference between an individual photon and a single photon state?
 
  • #65
bhobba, You said:

You are confused. The state is the uncertainty. The state tells us the expected outcome of an observation - it does not tell us what outcome will be measured. This is the exact analogue of probabilities and the spinning coin. While it is spinning we have no idea which side it will land on; all we have is a certain confidence level in each possible outcome. That is the state - i.e. it tells us this likelihood.

Again, this can't be the case. You can't separate probable states from the underlying reality. The probable state gives you an expected outcome of an observation THAT'S AN UNDERLYING REALITY.

Go back to my example of the 4 runners in the race. I can give you an expected outcome of who we might observe coming in 1st, 2nd, 3rd and 4th. I can't give you an expected value for something that's not an underlying reality. I can't give you an expected value for who will come in 10th place. This is because 10th place isn't an underlying reality of a 4-man race.

If you have some evidence that you can divorce probable states from an underlying reality, let's see it.

In the case with the coin. You do have an idea of what state it will land on while it's spinning. It can only land on the underlying reality of heads or tails. Probabilities tell us about an underlying reality that can occur.

If I'm on the mound pitching to a batter, my probable states are limited to my underlying reality. So I can strike out the batter, walk him, he can hit a home run, or another probable state will occur that's an expected outcome of baseball. What I can't do is give you an expected value for something that's not an underlying reality. I can't give you the probability that I will throw a touchdown to a WR while I'm pitching in the World Series. One, I'm not a professional baseball player, and two, the probabilities are restricted to the underlying reality of a baseball game.

So the state must be real before there can be a probable state.
 
  • #66
matrixrising said:
Again, this can't be the case. You can't separate probable states from the underlying reality. The probable state gives you an expected outcome of an observation THAT'S AN UNDERLYING REALITY.

Please stop making up terminology which does not exist. Realism is well defined and "underlying reality" is not an existing term in that respect. States with probability 0 are trivially excluded. This is not what realism is about. It is not about underlying reality, it is about ACTUALLY being realized every single time.

matrixrising said:
In the case with the coin. You do have an idea of what state it will land on while it's spinning. It can only land on the underlying reality of heads or tails. Probabilities tell us about an underlying reality that can occur.

No! If it is real it "can" not only occur, it MUST occur.

matrixrising said:
If I'm on the mound pitching to a batter, my probable states are limited to my underlying reality. So I can strike out the batter, walk him, he can hit a home run, or another probable state will occur that's an expected outcome of baseball.

If you talk about an underlying reality, this automatically means that you consider the wave function as NOT realistic. In a non-realistic setting you can strike out the batter, walk him, or he can hit a home run. In a realistic setting you DO strike out the batter, walk him and he scores a home run simultaneously every single time. If you consider it as realistic, the wave function is literally all there is. There is nothing deeper, nothing underlying. Nobody denies that these underlying states are possible or realistic, but this is not what realism is about. It is not at all about the nature of the states. Realism is well defined and all about taking the wave function absolutely literally. So please stop twisting the meaning of existing terminology. This is not how these forums work.
 
  • #67
matrixrising said:
Again, this can't be the case. You can't separate probable states from the underlying reality. The probable state gives you an expected outcome of an observation THAT'S AN UNDERLYING REALITY.

You are being silly, very confused, or something - I really don't know what.

Again - the outcomes of observations are very real, and when measured to have that value they have it - no question, that is not at issue - but we can only predict probabilities, like with the spinning coin.

However if you are thinking they have those properties prior to observation and are real in that sense, then you run into the Kochen-Specker theorem:
http://en.wikipedia.org/wiki/Kochen–Specker_theorem
'The Kochen–Specker proof demonstrates the impossibility of a version of Einstein's assumption, made in the famous Einstein–Podolsky–Rosen paper, that quantum mechanical observables represent 'elements of physical reality'.'

Actually, even though it's usually not presented this way, it's a simple corollary of the much more powerful Gleason's theorem I mentioned early on in the thread - but that's just by the by.

Added Later:
In relation to Cthugha's comments: what is real is what has probability 1, i.e. a dead cert - it must be, it has it for sure. The Kochen-Specker theorem proves - and it's a proof, so there is nothing interpretive about it - that you can't assign probability 1 to everything you can observe. Quantum systems cannot have definite values independent of observing them - i.e. they are not real in the usual classical sense. The only out is if they get those values from the measurement context - which is itself pretty weird.
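For anyone following along, the precise statement is: in a Hilbert space of dimension 3 or more, there is no assignment v(P) ∈ {0, 1} to all projection operators P such that $\sum_i v(P_i) = 1$ for every complete set of mutually orthogonal projectors {P_i}. In other words, you cannot pretend that every yes/no observable secretly has a definite, context-independent answer.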

Thanks
Bill
 
  • #68
matrixrising said:
If you have some evidence that you can divorce probable states from an underlying reality, let's see it.

Probable states? There is no such thing. The state is very definite - it's not probable. But like probabilities it allows us to predict long-term averages.

The state is what allows us to determine those probabilities. Its use is in the Born rule:
http://en.wikipedia.org/wiki/Born_rule
'The Born rule (also called the Born law, Born's rule, or Born's law) is a law of quantum mechanics which gives the probability that a measurement on a quantum system will yield a given result. It is named after its originator, the physicist Max Born. The Born rule is one of the key principles of quantum mechanics. There have been many attempts to derive the Born rule from the other assumptions of quantum mechanics, with inconclusive results; the Many Worlds Interpretation for example cannot derive the Born rule. However, within the Quantum Bayesianism interpretation of quantum theory, it has been shown to be an extension of the standard Law of Total Probability, which takes into account the Hilbert space dimension of the physical system involved.'
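In symbols, for a system prepared in a pure state $|\psi\rangle$ and an observable with non-degenerate eigenstates $|a_i\rangle$, the Born rule reads

$$P(a_i) = |\langle a_i|\psi\rangle|^2,$$

or more generally $\mathrm{Pr}(i) = \mathrm{Tr}(\rho P_i)$ for a state operator $\rho$ and projector $P_i$.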

Aside from the Born rule a state tells us nothing at all. States are not probable - they are used to predict probabilities, but are themselves not probable.

You seem very confused about very basic terminology.

Thanks
Bill
 
  • #69
Cthugha said:
No! If it is real it "can" not only occur, it MUST occur.

Maybe my link to the Kochen-Specker theorem will help.

He may not realize you can't assign definite values to all observables - i.e. probability 1 - in the quantum formalism - at least some must be unknowable.

Thanks
Bill
 
  • #70
Cthugha,

You said:

What part of "Lundeen has not measured the wavefunction of an individual particle" is so hard to understand? I have given you an excerpt of a peer reviewed article from Steinberg's group beforehand clearly stating that discussing such properties of individual particles is meaningless.

This is just false.

When you look at Lundeen, he showed a one-to-one correspondence between the spatial wave function of a SINGLE PHOTON and the spatial wave function of an ensemble of photons. The spatial wave function of a single photon was reconstructed over an ensemble of photons. It just doesn't get much clearer than that.

Like I said, Ballentine shows zero evidence that the wave function isn't real. All I see is a bunch of conjecture that's born out of the desire to remove the mysteries of QM whatever that means. It's just shut up and calculate. Here's David Mermin:

"For the notion that probabilistic theories must be about ensembles implicitly assumes that probability is about ignorance. (The “hidden variables” are whatever it is that we are ignorant of.) But in a non-determinstic world probability has nothing to do with incomplete knowledge, and ought not to require an ensemble of systems for its interpretation".

A minimalist interpretation of QM is another form of shut up and calculate which is lacking. Where's the evidence that the quantum system isn't in multiple "real" states prior to measurement? This is what gives rise to the quantum properties that we see in experiment after experiment.

In fact, how can we do calculations on probable states if these probable states are not real when it comes to quantum computing?

More on Quantum Computing and Schrodinger's cat:

The Ensemble Interpretation states that superpositions are nothing but subensembles of a larger statistical ensemble. That being the case, the state vector would not apply to individual cat experiments, but only to the statistics of many similar prepared cat experiments. Proponents of this interpretation state that this makes the Schrödinger's cat paradox a trivial non issue. However, the application of state vectors to individual systems, rather than ensembles, has explanatory benefits, in areas like single-particle twin-slit experiments and quantum computing. As an avowedly minimalist approach, the Ensemble Interpretation does not offer any specific alternative explanation for these phenomena.

The single particle has to be in two real states in order for a calculation to occur. The single particle can be in two real states or a qubit prior to measurement.

This is from a paper titled A single-atom electron spin qubit in silicon.

Here we demonstrate the coherent manipulation of an individual electron spin qubit bound to a phosphorus donor atom in natural silicon, measured electrically via single-shot read-out7, 8, 9. We use electron spin resonance to drive Rabi oscillations, and a Hahn echo pulse sequence reveals a spin coherence time exceeding 200 µs. This time should be even longer in isotopically enriched 28Si samples10, 11. Combined with a device architecture12 that is compatible with modern integrated circuit technology, the electron spin of a single phosphorus atom in silicon should be an excellent platform on which to build a scalable quantum computer.

http://www.nature.com/nature/journal/v489/n7417/full/nature11449.html

When it comes to underlying states:

Of course all of these states are real and that's the point. All of these states are coherent and real prior to measurement and this is why we can show single particles in a state of superposition. So the underlying reality of the system(particle) is real. This underlying reality is the wave function in a pure coherent state where pure states simultaneously exist prior to decoherence.

Ballentine's blunder on the Quantum Zeno Effect.

Leslie Ballentine promoted the Ensemble Interpretation in his book "Quantum Mechanics, A Modern Development". In it [6], he described what he called the "Watched Pot Experiment". His argument was that, under certain circumstances, a repeatedly measured system, such as an unstable nucleus, would be prevented from decaying by the act of measurement itself. He initially presented this as a kind of reductio ad absurdum of wave function collapse.

Of course he was wrong when he said:

"Like the old saying "A watched pot never boils", we have been led to the conclusion that a continuously observed system never changes its state! This conclusion is, of course false.

Wrong.

One last thing: there was a poll taken by Anton Zeilinger at the Quantum Physics and Nature of Reality conference in Austria in 2011. Here's what they thought about ensemble interpretations.

Right interpretation of state vectors:

27%: epistemic/informational
24%: ontic
33%: a mix of epistemic and ontic
3%: purely statistical as in ensemble interpretation
12%: other


As you see, the ensemble interpretation got 3%.

I chose not to label the "ensemble interpretation" as correct because the ensemble interpretation makes the claim that only the statistics of the huge repetition of the very same experiment may be predicted by quantum mechanics. This is a very "restricted" or "modest" claim about the powers of quantum mechanics and this modesty is actually wrong. Even if I make 1 million completely different experiments, quantum physics may predict things with a great accuracy.

Imagine that you have 1 million different unstable nuclei (OK, I know that there are not this many isotopes: think about molecules if it's a problem for you) with a lifetime of 10 seconds (for each of them). You observe them for 1 second. Quantum mechanics predicts that 905,000 plus or minus 1,000 or so nuclei will remain undecayed (it's not exactly 900,000 because the decrease is exponential, not linear). The relatively small error margin is possible despite the fact that no pair of the nuclei consisted of the same species!

So it's just wrong to say that you need to repeat exactly the same experiment many times. If you want to construct a "nearly certain" proposition – e.g. the proposition that the number of undecayed nuclei in the experiment above is between 900,000 and 910,000 – you may combine the probabilistically known propositions in many creative ways. That's why one shouldn't reduce the probabilistic knowledge just to some particular non-probabilistic one. You could think it's a "safe thing to do". However, you implicitly make statements that quantum mechanics can't achieve certain things – even though it can.
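As a rough numerical check of the figures above (my own sketch, treating each nucleus as an independent survival trial with probability e^(-t/τ), not anything from the quoted text):

Code:
import math

N = 1_000_000   # number of independent unstable systems
tau = 10.0      # mean lifetime of each, in seconds
t = 1.0         # observation time, in seconds

p = math.exp(-t / tau)               # survival probability, ~0.9048
expected = N * p                     # expected survivors, ~904,837
spread = math.sqrt(N * p * (1 - p))  # binomial standard deviation, ~293

print(f"expected survivors: {expected:.0f} +/- {spread:.0f}")

So the expected count is about 905,000 with a statistical spread of a few hundred - a sharp prediction relative to N, which is the point being made: the systems need not be identical for quantum mechanics to say something nearly certain about the aggregate.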

Here's more about the conference:

http://www.technologyreview.com/view/509691/poll-reveals-quantum-physicists-disagreement-about-the-nature-of-reality/

So again, the ensemble interpretation flies in the face of experiment after experiment. It's a way of saying quantum weirdness can't be objectively real, but the truth is, it's an underlying reality for the quantum system, not the classical experience.
 
  • #71
bhobba,

You said:

Aside from the Born rule a state tells us nothing at all. States are not probable - they are used to predict probabilities, but are themselves not probable.

Tell me, how can these states predict probabilities that are not an underlying reality?

These states have to describe the reality of the quantum system in order to predict probable states of the system. These states tell you how the system behaves because they are describing the underlying reality of the system. How can you say these states can predict probabilities of the system if the states don't describe an underlying reality of the system?

In order to predict probabilities the wave function has to contain all measurable states of the system. Guess what? It does.
 
  • #72
matrixrising said:
cthugha,
When you look at Lundeen, he showed a one-to-one correspondence between the spatial wave function of a SINGLE PHOTON and the spatial wave function of an ensemble of photons. The spatial wave function of a single photon was reconstructed over an ensemble of photons. It just doesn't get much clearer than that.

A single particle does not have a wave function. A single photon state does. This is what Lundeen analyzed. It is that simple - this is basic QM. I have already given you a paper, in post #31, explicitly stating that you cannot meaningfully discuss these properties for single particles. There is a recent Nature Photonics paper by Boyd and his group (Nature Photonics 7, 316–321 (2013)) applying Lundeen's technique to measure the polarization state of light directly. They also make clear that it does not work for a single particle:

"For a single photon, the weak measurement has very large uncertainty, so the above procedure must be repeated on many photons, or equivalently on a classical light beam, to establish the weak value with a high degree of confidence."

matrixrising said:
Like I said, Ballentine shows zero evidence that the wave function isn't real. All I see is a bunch of conjecture that's born out of the desire to remove the mysteries of QM whatever that means. It's just shut up and calculate.

Of course it does not show evidence that the wave function is not real. There is no evidence that the wave function isn't real. There is also no evidence that it is real. This is why these are interpretations. None of them has better evidence. None is more valid than the others. You can interpret the wave function as realistic, but you do not have to.

matrixrising said:
"For the notion that probabilistic theories must be about ensembles implicitly assumes that probability is about ignorance. (The “hidden variables” are whatever it is that we are ignorant of.) But in a non-determinstic world probability has nothing to do with incomplete knowledge, and ought not to require an ensemble of systems for its interpretation".

Ought not...well, one can have this opinion, yes. One does not have to. Personally, I like Mermin's Ithaca interpretation, although it is not too consistent.

matrixrising said:
A minimalist interpretation of QM is another form of shut up and calculate which is lacking. Where's the evidence that the quantum system isn't in multiple "real" states prior to measurement? This is what gives rise to the quantum properties that we see in experiment after experiment.

There is no evidence. Nobody in this thread claimed there is. You just claimed the wave function has to be interpreted as real. Everybody else says, it can, but you do not have to.

matrixrising said:
In fact, how can we do calculations on probable states if these probable states are not real when it comes to quantum computing?

Yes.
matrixrising said:
The Ensemble Interpretation states that superpositions are nothing but subensembles of a larger statistical ensemble. That being the case, the state vector would not apply to individual cat experiments, but only to the statistics of many similar prepared cat experiments. Proponents of this interpretation state that this makes the Schrödinger's cat paradox a trivial non issue. However, the application of state vectors to individual systems, rather than ensembles, has explanatory benefits, in areas like single-particle twin-slit experiments and quantum computing. As an avowedly minimalist approach, the Ensemble Interpretation does not offer any specific alternative explanation for these phenomena.

Ehm, I do not see the point. Of course there can be explanatory benefits for special experiments in certain interpretations. This is why there are so many of them. In the ensemble interpretation, quantum computers just work because qm says so. Yes, I agree that this might not be great from a didactics point of view. The disadvantage of minimal interpretations for some people is that it says that things work because the math says so. The advantage of minimal interpretations for some people is that it says that things work because the math says so.

matrixrising said:
The single particle has to be in two real states in order for a calculation to occur. The single particle can be in two real states or a qubit prior to measurement.

No.

matrixrising said:
This is from a paper titled A single-atom electron spin qubit in silicon.

I do not get it. Electron spins make good qubits. I worked on some of them myself. What is this going to tell us?

matrixrising said:
Of course all of these states are real and that's the point. All of these states are coherent and real prior to measurement and this is why we can show single particles in a state of superposition. So the underlying reality of the system(particle) is real. This underlying reality is the wave function in a pure coherent state where pure states simultaneously exist prior to decoherence.

Maybe. Maybe not. Can you show that it must be this way? Do you have more than one account by the way? The number of people repeating almost the same sentences increased significantly during the last week.

matrixrising said:
Of course he was wrong when he said:

"Like the old saying "A watched pot never boils", we have been led to the conclusion that a continuously observed system never changes its state! This conclusion is, of course false.

Wrong.

Yes. Of course he was. It is the same common fallacy. Trying to think one interpretation is better than the others. This idea is always doomed.

matrixrising said:
One last thing: there was a poll taken by Anton Zeilinger at the Quantum Physics and Nature of Reality conference in Austria in 2011. Here's what they thought about ensemble interpretations.

Right interpretation of state vectors:

27%: epistemic/informational
24%: ontic
33%: a mix of epistemic and ontic
3%: purely statistical as in ensemble interpretation
12%: other


As you see, the ensemble interpretation got 3%.

Yes, indeed physicists interested in interpretations are usually seeking something fundamental from an interpretation. Maybe a good ontology like Bohmians. Or something else. These naturally find ensemble interpretations lacking. That is fine. The ensemble interpretation is a minimalist interpretation preferred usually by working physicists who want to stay clear of exactly the kind of discussion we have here.

matrixrising said:
So again, the ensemble interpretation flies in the face of experiment after experiment. It's a way of saying quantum weirdness can't be objectively real, but the truth is, it's an underlying reality for the quantum system, not the classical experience.

There is no experimental evidence against (or for) the ensemble interpretation - or any other standard interpretation.
 
  • #73
matrixrising said:
Tell me, how can these states predict probabilities that are not an underlying reality?

Well you keep using 'underlying reality'.

How about explaining what you mean by 'underlying reality'.

Generally in QM reality means something has a value with a dead cert. If that's what you mean, Kochen-Specker says that's impossible. How is it possible? Nature is like that. Einstein didn't like it but was forced to accept it - no escaping - you can't argue with a theorem.

But that's perhaps not what you mean by reality.

So let's see your definition of it. Einstein gave a definition in his EPR paper but was proven wrong.

Thanks
Bill
 
  • #74
matrixrising said:
Like I said, Ballentine shows zero evidence that the wave function isn't real.

Exactly what books by Ballentine have you read? I have carefully studied his standard text, including chapter 9, where he explains in detail why it can't be real. Precisely what part of his argument is wrong? IMHO it has a few outs - but let's hear your reasons.

And if you have in fact studied his book, you should be able to state the proper form of the Born rule - it's axiom 2 in Ballentine's book. Care to tell us what it is? And if you can't, exactly why are you arguing about something you do not know the details of?
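For reference, the postulate being alluded to is usually written (up to wording and normalization), for a state operator $\rho$ and dynamical variable $R$, as $\langle R\rangle = \mathrm{Tr}(\rho R)$ - i.e. it is stated as an average over an ensemble of similarly prepared systems, not as a property of any single system, which is precisely why Ballentine reads the state statistically.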

Thanks
Bill
 
  • #75
bhobba said:
Well you keep using 'underlying reality'.

How about explaining what you mean by 'underlying reality'.

His naive interpretation of the experiments and papers (Lundeen, etc.) indicates that he is making the common novice mistake of identifying the wave function with real individual particles (the classical ones).
He then goes on to confuse this with the obvious abstract notion that the probabilities obtained in QM refer to the "underlying reality" - they would be of no use otherwise - but that is just saying QM works, in different words.
Threads like this might seem not very useful judging by what the OP seems to be taking from the replies, but they are useful for what people approaching QM for the first time may take away, in terms of not falling for these mistakes.
 
  • #76
Cthugha,

Again, the reason why ensemble interpretations got 3% at the conference is because it's an interpretation that says just look away and do the math and oh by the way, QM can't say this or that even though it does. It's like the example given of the unstable nuclei. He made a great point when he said:

However, you implicitly make statements that quantum mechanics can't achieve certain things – even though it can.

Lundeen directly measured the wave function of a single particle. There was a one-to-one correspondence between the spatial wave function of a single photon and that of an ensemble of photons.

There's a difference between direct measurement of the wave function of a single photon and knowing the position and momentum of a single photon. Let Lundeen explain:

The wavefunction is the complex distribution used to completely describe a quantum system, and is central to quantum theory. But despite its fundamental role, it is typically introduced as an abstract element of the theory with no explicit definition1,2. Rather, physicists come to a working understanding of the wavefunction through its use to calculate measurement outcome probabilities by way of the Born rule3. At present, the wavefunction is determined through tomographic methods4–8, which estimate the wavefunction most consistent with a diverse collection of measurements. The indirectness of these methods compounds the problem of defining the wavefunction. Here we show that the wavefunction can be measured directly by the sequential measurement of two complementary variables of the system. The crux of our method is that the first measurement is performed in a gentle way through weak measurement9–18, so as not to invalidate the second. The result is that the real and imaginary components of the wavefunction appear directly on our measurement apparatus. We give an experimental example by directly measuring the transverse spatial wavefunction of a single photon, a task not previously realized by any method. We show that the concept is universal, being applicable to other degrees of freedom of the photon, such as polarization or frequency, and to other quantum systems—for example, electron spins, SQUIDs (superconducting quantum interference devices) and trapped ions. Consequently, this method gives the wavefunction a straightforward and general definition in terms of a specific set of experimental operations19. We expect it to expand the range of quantum systems that can be characterized and to initiate new avenues in fundamental quantum theory.

This is a letter from Lundeen to Nature about the experiment. Here's the kicker:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


Again, there's a difference between the direct measurement of a single photon's wave function and knowing the position and momentum of a single photon. The first Lundeen achieved; the second can't be known. He ends the letter with this:

In our direct measurement method, the wavefunction manifests itself as shifts of the pointer of the measurement apparatus. In this sense, the method provides a simple and unambiguous operational definition19 of the quantum state: it is the average result of a weak measurement of a variable followed by a strong measurement of the complementary variable. We anticipate that the simplicity of the method will make feasible the measurement of quantum systems (for example, atomic orbitals, molecular wavefunctions29, ultrafast quantum wavepackets30) that previously could not be fully characterized. The method can also be viewed as a transcription of the quantum state of the system to that of the pointer, a potentially useful protocol for quantum information.

Again, the direct measurement of a single photon's wave function that corresponds to an ensemble of photons with the same spatial wave function.

Again, ensemble interpretations basically say shut up and calculate, and if experiments say X, it's meaningless because it's just the math. It just doesn't make much sense, like Ballentine and the quantum Zeno effect.

bhobba,

When I say underlying reality, I'm talking about the wave function. The wave function is the underlying reality of the quantum system. It contains all the measurable information about the system. It's the pool table analogy. The pool table contains all the measurable states that the pool balls can be in. The pool balls themselves don't have to be in a measured state (8 ball in the corner pocket) in order for the pool table to contain all the measurable states that the pool ball can be in.

These are real states that we can perform quantum calculations on. So the pool table contains all measurable information about the system (pool balls). In this case, the pool balls would represent the measurable information of the pool table in a decohered state.
 
  • #77
matrixrising said:
Again, the reason why ensemble interpretations got 3% at the conference is because it's an interpretation that says just look away and do the math and oh by the way, QM can't say this or that even though it does.

Ehm, does it? QM does not say anything testable about single realizations. Also, it is quite misleading to distinguish between ensemble and epistemic interpretations. The border is not that well defined. Actually the informational/subjectivist interpretations are even further away from your point of view than the ensemble one is and are led by the opinion that there is no reality without measurement.

matrixrising said:
Lundeen directly measured the wave function of a single particle. There was a one-to-one correspondence between the spatial wave function of a single photon and that of an ensemble of photons.

I already explained why that is wrong and what directly means in this case. What is so difficult about that?

matrixrising said:
There's a difference between direct measurement of the wave function of a single photon and knowing the position and momentum of a single photon. Let Lundeen explain:
[...]This is a letter from Lundeen to Nature about the experiment. Here's the kicker:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

Yes, I know that paper well and the wording Lundeen used. The paper got criticized quite heavily at conferences for using that simplified wording (admittedly it helps selling it of course). The correct formulation would have been that they WEAKLY measured the wave function, which has a very different meaning. As I already cited: "For a single photon, the weak measurement has very large uncertainty, so the above procedure must be repeated on many photons, or equivalently on a classical light beam, to establish the weak value with a high degree of confidence." The single weak measurement does not give any useful information about the single particle. It is just the ensemble average that does. This is naturally so. If it gave more information, it would be a strong measurement. The averaged value you get comes from an ensemble measurement and is not done on a single particle. Actually you get different weak values when measuring several different particles. If they had measured the identical true wave function on a single particle, they would have got exactly the same result every time.
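To see why the averaging is essential, here is a toy numerical sketch - my own illustration, not taken from any of the papers - modelling each weak readout as the value the weak measurement converges to plus Gaussian pointer noise whose spread is far larger than that value:

Code:
import random

true_value = 0.3       # the value the weak measurement converges to (arbitrary here)
pointer_spread = 50.0  # pointer uncertainty per shot, >> true_value (weak coupling)
N = 1_000_000          # size of the ensemble of identically prepared systems

readouts = [random.gauss(true_value, pointer_spread) for _ in range(N)]

single_shot = readouts[0]    # one weak readout: dominated by pointer noise
average = sum(readouts) / N  # converges to true_value like pointer_spread / sqrt(N)

print(f"one weak readout:  {single_shot:+.1f}")
print(f"ensemble average:  {average:+.3f}")

A single readout can land essentially anywhere, including at "impossible" values far outside any eigenvalue range, which is the sense in which it carries next to no information about the individual system; only the ensemble average pins down the weak value, with an uncertainty shrinking like pointer_spread/√N.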

matrixrising said:
Again, the direct measurement of a single photon's wave function that corresponds to an ensemble of photons with the same spatial wave function.

There are interpretations which allow you to interpret a weak measurement as a measurement on a single photon, yes. But in these interpretations that result on its own is meaningless. The wave function values shown do not come from a single measurement on a single particle. They come from many measurements on single particles and averaging over the weak values. One can claim that the single measurement of the wave function should indeed be interpreted as a measurement of the wave function of a single particle. However, this automatically means that a result of say 5 +/- 739 is a reasonable result. Yes, you can consider this as a measurement, but is the result on its own meaningful? In my opinion, it is misleading to say that the result has a correspondence with the wave function because this single result does not tell you anything.

Seeing that is difficult, though as weak measurements are a complicated topic. It helps following the topic from the initial paper on weak measurements (Phys. Rev. Lett. 60, 1351–1354 (1988), http://prl.aps.org/abstract/PRL/v60/i14/p1351_1) to the more modern viewpoint of quantum ergodicity (http://arxiv.org/abs/1306.2993).

In a nutshell, if you consider a measurement result of 100 as a good result when measuring the spin of a spin 1/2 particle, Lundeen has performed a measurement on a single particle. But that is pretty trivial and does not mean much. It is like measuring the opinion of all Americans by just asking Steve Miller from Arkansas and nobody else. Yes, you can consider that a measurement. No, it is not meaningful on its own. Therefore, calling a single weak measurement a measurement on a single particle is just semantics to me.
 
  • #78
Cthugha,

I think you're misreading Lundeen because you're looking at it through the eyes of an ensemble interpretation. This is why I don't like ensemble interpretations. Results are never results even though they're results. It seems the goal of ensemble interpretations, or the small percentage who follow them, is to label as meaningless every result that makes ensemble interpretations meaningless.

This is what Lundeen said:

The average result of the weak measurement of π_x is proportional to the wavefunction of the particle at x. Scanning the weak measurement through x gives the complete wavefunction. At each x, the observed position and momentum shifts of the measurement pointer are proportional to ReΨ(x) and ImΨ(x), respectively. In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

That's a pretty straightforward and simple measurement of a single photon's wave function.

What happened?

Lundeen first did a weak measurement and then a strong measurement was performed. By reducing the disturbance by performing a weak measurement first and then a strong measurement, he measured the wave function of a single photon.

At each x, (wave function of the individual photon) the observed position and momentum shifts of the measurement pointer were proportional to the real and imaginary parts of the wave function.

It's like the wave function is the blueprint to build a Lexus. It contains all the measurable information you need to build a Lexus. Each individual state (car door, trunk, hood) has to be real and proportional to a Lexus in order to build a Lexus.

So in this experiment, every direct measurement of a single photon's wave function is proportional to a Lexus (wave function). You don't get a car door from a 1983 Buick or the trunk of a Cadillac. The wave function of a single photon is proportional to the real and imaginary parts of the wave function.

This is why Lundeen said:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


So a single photon's wave function is proportional to that of an ensemble of photons, as it should be.
 
  • #79
matrixrising said:
I think you're misreading Lundeen because you're looking at it through the eyes of an ensemble interpretation. This is why I don't like ensemble interpretations. Results are never results even though they're results. It seems the goal of ensemble interpretations, or the small percentage who follow them, is to label as meaningless every result that makes ensemble interpretations meaningless.

No, I am not misreading it. I know that paper pretty well.

matrixrising said:
This is what Lundeen said:
The average result of the weak measurement of π_x is proportional to the wavefunction of the particle at x. Scanning the weak measurement through x gives the complete wavefunction. At each x, the observed position and momentum shifts of the measurement pointer are proportional to ReΨ(x) and ImΨ(x), respectively. In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

That's a pretty straightforward and simple measurement of a single photon's wave function.

You still need an average result. This is more than just a minor nuisance. See my comment below.

matrixrising said:
Lundeen first did a weak measurement and then a strong measurement was performed. By reducing the disturbance by performing a weak measurement first and then a strong measurement, he measured the wave function of a single photon.

At each x, (wave function of the individual photon) the observed position and momentum shifts of the measurement pointer were proportional to the real and imaginary parts of the wave function.

This is of course misleading at best. It is not the observed shift, but the observed AVERAGE shift which is proportional to the wave function. This is a huge difference.

In particular it is even against your position. Results of a weak measurement do not follow what you call the underlying reality. If you measure the spin of a spin 1/2 particle, you can get a weak value of 100, which is known to be impossible. If you measure your position weakly, the weak measurement can tell you that you are on Lexaar. If you measure weakly what the pitcher on the baseball field will do, you may find that he skates around the goalie, raises his hockey stick and scores. If you perform a weak measurement of who will win the Super Bowl this year, you may get the Giants or Tampa Bay as the result.

One needs to follow the whole literature about weak wave function measurements to understand what is going on. The Nature paper has limited space and necessarily explains little, which is a standard problem when trying to publish in Nature or Science. You need to write a condensed manuscript. Additional explanations on the meaning of their wave function and the meaning of "direct" have been given by Lundeen and Bamber in PRL 108, 070402 (2012), explaining that they have shown a "general operational definition of the wave function based on a method for its direct measurement: ‘‘it is the average result of a weak measurement of a variable followed by a strong measurement of the complementary variable [1,2].’’ By ‘‘direct’’ it is meant that a value proportional to the wave function appears straight on the measurement apparatus itself without further complicated calculations or fitting." and most importantly
"While a weak measurement on a single system provides little information, by repeating it on an arbitrarily large ensemble of identical systems one can determine the average measurement result with arbitrary precision."
and also "Surprisingly, the weak value can be outside the range of the eigenvalues of A and can even be complex".

It has also been investigated in Phys. Rev. A 84, 052107 (2011), which gives a more rigorous mathematical treatment showing "that the weak values can be exhaustively explained within the quantum theory of sequential measurements", and also that one cannot measure arbitrary states using the technique.

The great thing about Lundeen's paper is the directness of the measurement - the fact that you do not have to run any tomography program afterwards. If you ever get to spend some time waiting for quantum state tomography to do its job, you will know what I mean. It can take weeks for large chunks of data.
 
  • #80
Cthugha,

This is the point that was made about ensemble interpretations. There's no evidence, even when there is evidence. Everything is just meaningless even though it has been shown to have meaning. You said:

"While a weak measurement on a single system provides little information, by repeating it on an arbitrarily large ensemble of identical systems one can determine the average measurement result with arbitrary precision."

The key here:

PROVIDES LITTLE INFORMATION, NOT MEANINGLESS INFORMATION.

It's like a small signal at a collider. When the signal is repeated it can give you a better understanding of the signal. What Lundeen is saying is that you can't look at the weak measurement of a single system in isolation in order to see the big picture. It doesn't mean that the direct measurement of the photon's wave function is meaningless. It's just like my example of the Lexus. You need the Lexus doors to make the car. The doors in isolation will not give you the car, but that doesn't make them meaningless. I quote Lundeen's letter to Nature one more time:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


How do you leap to the conclusion that this is meaningless?

Results of weak measurement do follow what I call underlying reality.

This can be explained by the Aharonov–Albert–Vaidman effect. This is from Wiki:

The weak value of the observable becomes large when the post-selected state, |φ₂⟩, approaches being orthogonal to the pre-selected state.

Here's more from a PDF by Lev Vaidman on weak values and weak measurement:

The real part of the weak value is the outcome of the standard measurement procedure at the limit of weak coupling. Unusually large outcomes, such as spin 100 for a spin-1/2 particle [2], appear from a peculiar interference effect (called the Aharonov–Albert–Vaidman (AAV) effect) according to which the superposition of the pointer wave functions shifted by small amounts yields a similar wave function shifted by a large amount. The coefficients of the superposition are universal for a large class of functions for which the Fourier transform is well localized around zero.

In the usual cases, the shift is much smaller than the spread Δ of the initial state of the measurement pointer. But for some variables, e.g., averages of variables of a large ensemble, for very rare events in which all members of the ensemble happened to be in the appropriate post-selected states, the shift is of the order of, and might be even larger than, the spread of the quantum state of the pointer [5]. In such cases the weak value is obtained in a single measurement which is not really "weak".

There have been numerous experiments showing weak values [7–11], mostly of photon polarization, and the AAV effect has been well confirmed. Unusual weak values were used for the explanation of peculiar quantum phenomena, e.g., superluminal velocity of tunneling particles [12,13]. (Superluminal communication; tunneling.)

When the AAV effect was discovered, it was suggested that the type of amplification effect which takes place for unusually large weak values might lead to practical applications. Twenty years later, the first useful application has been made: Hosten and Kwiat [14] applied the weak measurement procedure for measuring the spin Hall effect in light. This effect is so tiny that it cannot be observed without the amplification.

So again, saying things are meaningless doesn't make them meaningless.
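For concreteness, here is a minimal numerical sketch of the AAV amplification described in the quote (my own toy calculation with made-up angles, not taken from any of the cited papers): the weak value A_w = <phi|A|psi>/<phi|psi> of S_z for a spin-1/2 system grows without bound as the post-selected state approaches orthogonality with the pre-selected state, which is how "spin 100"-sized numbers arise.

```python
# Toy AAV weak-value calculation for S_z on a spin-1/2 system (eigenvalues +/-1/2).
# Illustrative numbers only; the weak value is not confined to the eigenvalue range.
import numpy as np

Sz = np.array([[0.5, 0.0], [0.0, -0.5]])      # observable A = S_z
psi = np.array([1.0, 1.0]) / np.sqrt(2)       # pre-selected state: spin along +x

def weak_value(phi):
    """A_w = <phi|A|psi> / <phi|psi> for post-selected state |phi>."""
    return (phi.conj() @ Sz @ psi) / (phi.conj() @ psi)

for beta_deg in (0.0, -40.0, -44.0, -44.7):   # rotate post-selection toward -x
    b = np.deg2rad(beta_deg)
    phi = np.array([np.cos(b), np.sin(b)])
    print(f"beta = {beta_deg:7.2f} deg, overlap = {abs(phi @ psi):.4f}, "
          f"A_w = {weak_value(phi).real:+9.2f}")
```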
 
  • #81
Are you kidding me?

I gave you the reference to the original and complete Vaidman paper earlier and you cite from a summary about it?

So as you refuse to read the original paper, let me state explicitly what Vaidman himself states about the very measurement you are talking about in the real paper:

"In the opposite limit, where Delta pi is much bigger than all a_i, the final probability distribution will be again close to a Gaussian with the spread Delta pi. The center of the Gaussian will be at the mean value of A: <A> =sum_i |a_i|^2 a_i. One measurement like this will give no information because Delta pi>><A>; but we can make this same measurement on each member of an ensemble of N particles prepared in the same state, and that will reduce the relevant uncertainty by the factor 1/sqrt(N), while the mean value of the average will remain <A>. By enlarging the number N particles in the ensemble, we can make the measurement of <A> with any desired precision."

Let me emphasize that he literally says "no information", not "little information". Essentially, this is how weak measurements work. If and only if the single measurement is so weak that it does not give you any information on its own, it can be performed without disturbing the system.

Your statement "It's like a small signal at a Collider" and "Results of weak measurement do follow what I call underlying reality" are exactly what is not the case. A signal at a collider is still recorded via a strong measurement and thus governed by the eigenvalues of the measurement operator. A single weak measurement can give pretty much any result, mostly nonsensic ones like a spin of 100 for a spin 1/2 particle. It is a feature of weak measurements that the single results explicitly do NOT follow what you call underlying reality. A spin value of 100 is not possible for a spin 1/2 particle. Still it is a possible (and not even rare) result of a weak measurement. If you think that this is the same as a collider signal, you really need to understand weak measurements first.
 
  • #82
cthugha,

Let me quote it again.

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


A weak measurement followed by a strong measurement gives you enough information to directly measure the wave function of a single photon.
 
  • #83
It is enough to give you a single weak measurement value which - as Vaidman correctly states - contains no information.

By the way: is it the case that you do not have access to those papers? Usually I assume that posting links to papers is enough. However, it does not help the discussion if I just post them and they go unread.
 
  • #84
cthugha,

Again, I quote you from earlier:

"While a weak measurement on a single system provides little information, by repeating it on an arbitrarily large ensemble of identical systems one can determine the average measurement result with arbitrary precision."

Again, in Lundeen, a weak measurement followed by a strong measurement allowed him to directly measure the wave function of a photon.

You seem to be avoiding Lundeen, who says:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.
 
  • #85
matrixrising said:
You seem to be avoiding Lundeen, who says:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

Where am I avoiding him? He performs a single weak measurement of the wave function. You seem to have the impression that a single measurement of the wave function gives you the wave function. This is not the case. As Vaidman's statement above shows, it is even the case that a single weak measurement does not give you any information about the actual value of the quantity you just measured. I do not disagree with Lundeen. He performed a measurement of the wave function of the single particle. This single measurement just does not contain any information - that is the 'weak' part. Only a huge number of repeated experiments does. Lundeen does a huge number of repeated experiments and gets the wave function.

If you disagree with that, please tell me, where exactly Vaidman is wrong, if possible using some peer reviewed evidence.
 
  • #86
Wrong again,

I never said a single weak measurement gives you the wave function. I said a weak measurement followed by a strong measurement reduces the disturbance and gives you the direct measurement of a single particle's wave function. From Lundeen.

How the experiment works: Apparatus for measuring the wavefunction

1. Produce a collection of photons possessing identical spatial wavefunctions by passing photons through an optical fiber.
2. Weakly measure the transverse position by inducing a small polarization rotation at a particular position, x.
3. Strongly measure the transverse momentum by using a Fourier Transform lens and selecting only those photons with momentum p=0.
4. Measure the average polarization rotation of these selected photons. This is proportional to the real part of the wavefunction at x.
5. Measure the average rotation of the polarization in the circular basis. (i.e. difference in the number of photons that have left-hand circular polarization and right-hand circular polarization). This is proportional to the imaginary part of the wavefunction at x.


You seem to be debating something that nobody has claimed. Again:

Weakly measuring the projector |x><x| followed by a strong measurement with result p=0 results in a weak value proportional to the wavefunction.
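A minimal numerical sketch of that statement (my own toy discretisation with a made-up wavefunction, not Lundeen's optical apparatus): because <p=0|x> is the same constant for every x, the weak value of the projector |x><x| post-selected on p = 0 reduces to psi(x) divided by a single complex number, so its real and imaginary parts both track the wavefunction.

```python
# Weak value of |x><x| post-selected on p = 0, on a toy discretised wavefunction.
import numpy as np

x = np.linspace(-5.0, 5.0, 401)
dx = x[1] - x[0]

# Hypothetical transverse wavefunction: a Gaussian with a linear phase ramp,
# so it has non-trivial real AND imaginary parts.
psi = np.exp(-x**2 / 2.0) * np.exp(1j * 0.7 * x)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)           # normalise

# (pi_x)_w = <p=0|x><x|psi> / <p=0|psi> = psi(x) / integral psi(x') dx',
# since the constant <p=0|x> cancels between numerator and denominator.
pi_w = psi / (np.sum(psi) * dx)

ratio = pi_w / psi                                     # same complex constant at every x
print("spread of (pi_x)_w / psi(x):", np.ptp(np.abs(ratio)))   # ~0 up to rounding
print("constant of proportionality:", ratio[len(x) // 2])
```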
 
  • #87
matrixrising said:
Wrong again,

I never said a single weak measurement gives you the wave function. I said a weak measurement followed by a strong measurement reduces the disturbance and gives you the direct measurement of a single particle's wave function.

Ehm, it reduces the disturbance compared to two strong measurements.

And yes, it is a direct weak measurement and it gives you one weak measurement result. I do not disagree with that. My question is simple: do you think that this SINGLE weak measurement actually gives you some information about the wave function? If so, please tell me where Vaidman is wrong.

If you think it does not, then yes, we agree. They do a single measurement, but the single weak result does not correspond to any physical quantity. Lundeen himself is careful enough to acknowledge this point. He says "The average result of the weak measurement of πx is proportional to the wavefunction of the particle at x.". The average is. The single result is not.
 
  • #88
Neither Vaidman nor Lundeen said a single weak measurement gives you a direct measurement of a single particle's wave function. Like I said earlier, I'm not sure what you're debating.

Lundeen said a weak measurement followed by a strong measurement reduces disturbance, which results in a weak value proportional to the wave function. One more time:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


This is the direct measurement of a single particle's wave function.
 
  • #89
matrixrising said:
Neither Vaidman nor Lundeen said a single weak measurement gives you a direct measurement of a single particle's wave function. Like I said earlier, I'm not sure what you're debating.

I am debating your claim
"There was a one to one correspondence with the spatial wave function of a single photon and an ensemble of photons." and your claim that this proves something about underlying realities which go beyond statistical information.

In order to show that, Lundeen would have needed to actually get the full wave function of single photons by measurements on a single photon only. He never even intended to do that. He just wants to do a direct (nontomographic) measurement which actually gives you the wave function in the ensemble average over many weak measurements, which is indeed an important achievement.

matrixrising said:
This is the direct measurement of a single particle's wave function.

But how is this connected to your above claim? Where is the correspondence with an underlying reality - whatever that may be? The very point of weak measurements is that this correspondence does not exist on the single measurement level. The wave functions measured are inherently ensemble averaged quantities. This experiment does not contain any well hidden information about every single photon. Lundeen explicitly acknowledges this at the end of the paper when he says: " In this sense, the method provides a simple and unambiguous operational definition of the quantum state: it is the average result of a weak measurement of a variable followed by a strong measurement of the complementary variable."
To Lundeen the wave function is related to the average, not to the single result.
 
  • #90
Of course it supports what I'm saying. You said:

To Lundeen the wave function is related to the average, not to the single result.

Of course it's related to a single result. How can you have an average if the single results are not proportional to the average?

If I give you the average PPG for Lebron James, his individual results will be proportional to the average. This is why Lundeen says:

In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wave-
function of the single particle.


Bohm 2 gave a good example:

To understand what weak measurement is, the following analogy from everyday life is useful. Assume that you want to measure the weight of a sheet of paper. But the problem is that your measurement apparatus (weighing scale) is not precise enough to measure the weight of such a light object as a sheet of paper. In this sense, the measurement of a single sheet of paper is - weak.

Now you do a trick. Instead of weighing one sheet of paper, you weigh a thousand of them, which is heavy enough to see the result of weighing. Then you divide this result by 1000, and get a number which you call - weak value. Clearly, this "weak value" is nothing but the average weight of your set of thousand sheets of papers.

But still, you want to know the weight of a SINGLE sheet of paper. So does that average value help? Well, it depends:

1) If all sheets of papers have the same weight, then the average weight is equal to weight of the single sheet, in which case you have also measured the true weight of the sheet.

2) If the sheets have only approximately equal weights, then you can say that you have at least approximately measured the weight of a single sheet.

3) But if the weights of different sheets are not even approximately equal, then you have not done anything - you still don't have a clue what is the weight of a single sheet.

All of the sheets (particle wave functions) were identical. The weak value of a single photon corresponds to the average. In other words, I can look at Lebron James' average PPG and then go back and compare that average to individual games throughout the year and they should correspond to one another.
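A quick numeric version of the sheets-of-paper analogy (made-up weights and scale noise): when all the sheets are identical the average per sheet really is the weight of one sheet, and when they are not, the average says little about any individual sheet.

```python
# Sheets-of-paper analogy in numbers (illustrative values only).
import numpy as np

rng = np.random.default_rng(1)
scale_noise = 5.0                                  # the scale cannot resolve one sheet

identical = np.full(1000, 4.5)                     # case 1: every sheet weighs 4.5 g
varied = rng.uniform(1.0, 8.0, 1000)               # case 3: wildly different sheets

for name, sheets in (("identical", identical), ("varied", varied)):
    total = sheets.sum() + rng.normal(0.0, scale_noise)   # one weighing of the whole stack
    avg = total / len(sheets)
    print(f"{name:9s}: average per sheet = {avg:.3f} g, "
          f"but the first sheet actually weighs {sheets[0]:.3f} g")
```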
 
  • #91
matrixrising said:
To Lundeen the wave function is related to the average, not to the single result.

Of course it's related to a single result. How can you have an average if the single results are not proportional to the average?

Do you know what a variance is? A huge variance compared to the mean does exactly that: the individual results need not lie anywhere near the average. This is basic first-semester stuff.

matrixrising said:
If I give you the average PPG for Lebron James, his individual results will be proportional to the average.

This is a strong measurement. Not a weak one. In a weak measurement you would (to construct the analogy) also get results like -27 points in a game which clearly cannot have any sensible meaning.

matrixrising said:
All of the sheets (particle wave functions) were identical. The weak value of a single photon corresponds to the average. In other words, I can look at Lebron James' average PPG and then go back and compare that average to individual games throughout the year and they should correspond to one another.

No! The important thing about weak values is that this is exactly not the case. That is a common fallacy. You consider a measurement with small variance, while weak measurements have huge variance. The thing you look for is called an element of reality. Vaidman himself said: "In such a case, a measurement performed on a single system does not yield the value of the shift (the element of reality), but such measurements performed on large enough ensemble of identical systems yield the shift with any desirable precision." (Foundations of Physics 26, 895 (1996)).

Consider Vaidman's case of a spin 1/2 particle (which can have spin values of +1/2 and -1/2) which can yield a weak measurement spin value of 100 in a single measurement. How is that related to the average?

edit: To clarify further, let me cite Vaidman again:
"The weak value is obtained from statistical analysis of the readings of the measuring devices of the measurements on an ensemble of identical quantum systems. But it is different conceptually from the standard definition of expectation value which is a mathematical concept defined from the statistical analysis of the ideal measurements of the variable A all of which yield one of the eigenvalues ai."
 
  • #92
Again, Apples&Oranges. You quotes:

"In such a case, a measurement performed on a single system does not yield the value of the shift (the element of reality), but such measurements performed on large enough ensemble of identical systems yield the shift with any desirable precision." (Foundations of Physics 26, 895 (1996)).

This isn't Lundeen from 2011. In this case the value of the shift is determined by a stream of photons with identical wave functions. The weak values in the case of Lundeen correspond to the average. Here's more:

At the centre of the direct measurement method is a reduction of the disturbance induced by the first measurement. Consider the measurement of an arbitrary variable A. In general, measurement can be seen as the coupling between an apparatus and a physical system that results in the translation of a pointer. The pointer position indicates the result of a measurement. In a technique known as 'weak measurement', the coupling strength is reduced and this correspondingly reduces the disturbance created by the measurement [9–18]. This strategy also compromises measurement precision, but this can be regained by averaging. The average of the weak measurement is simply the expectation value \langle\Psi|A|\Psi\rangle, indicated by an average position shift of the pointer proportional to this amount.

This is the ball game. A little more:

The average result of the weak measurement of πx is proportional to the wavefunction of the particle at x. Scanning the weak measurement through x gives the complete wavefunction. At each x, the observed position and momentum shifts of the measurement pointer are proportional to ReΨ(x) and ImΨ(x), respectively. In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

Finally the kicker:

The benefit of this reduction in precision is a commensurate reduction in the disturbance to the wavefunction of the single photon.


Again, measurements of a single photon correspond to the average. This way you get a direct measurement of a single particle's wave function.

You reduce the disturbance and you get an average of the weak value that's proportional to the wave function of a single particle.
 
  • #93
matrixrising said:
Again, Apples&Oranges. You quotes:

"In such a case, a measurement performed on a single system does not yield the value of the shift (the element of reality), but such measurements performed on large enough ensemble of identical systems yield the shift with any desirable precision." (Foundations of Physics 26, 895 (1996)).

This isn't Lundeen from 2011. In this case the value of the shift is determined by a stream of photons with identical wave functions.

Apples & oranges? Weak values and weak values. The physics of weak values does not change over night.

matrixrising said:
The weak values in the case of Lundeen correspond to the average.

At the centre of the direct measurement method is a reduction of the disturbance induced by the first measurement. Consider the measurement of an arbitrary variable A. In general, measurement can be seen as the coupling between an apparatus and a physical system that results in the translation of a pointer. The pointer position indicates the result of a measurement. In a technique known as 'weak measurement', the coupling strength is reduced and this correspondingly reduces the disturbance created by the measurement [9–18]. This strategy also compromises measurement precision, but this can be regained by averaging. The average of the weak measurement is simply the expectation value \langle\Psi|A|\Psi\rangle, indicated by an average position shift of the pointer proportional to this amount.

Says who? Lundeen does not. The average weak values correspond trivially to the average. The single weak value of a single measurement clearly does not correspond to any element of reality.

matrixrising said:
This is the ball game. A little more:
The average result of the weak measurement of πx is proportional to the wavefunction of the particle at x. Scanning the weak measurement through x gives the complete wavefunction. At each x, the observed position and momentum shifts of the measurement pointer are proportional to ReΨ(x) and ImΨ(x), respectively. In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.

Still, all about averaged values. Nothing about single weak measurement results.

matrixrising said:
Finally the kicker:

The benefit of this reduction in precision is a commensurate reduction in the disturbance to the wavefunction of the single photon.

Wait, do you have the impression that a single photon wave function is the wave function of a single realization? It is the wave function of identically prepared states containing one photon each.

matrixrising said:
Again, measurements of a single photon correspond to the average. This way you get a direct measurement of a single particle's wave function.

Ehm, as the single photon wave function is an ensemble average, this is trivial, no? The single measured weak value on a single realization, however, does not correspond to the average. It is usually even far away from any reasonable value.

matrixrising said:
You reduce the disturbance and you get an average of the weak value that's proportional to the wave function of a single particle.

Yes, the average. Sure.

For the last time: tell me where Vaidman is wrong. His results are not just valid on Mondays or for years up to 2010. They are pretty general. It is pretty well known that identifying single, unaveraged weak measurement results with elements of reality is a fallacy.

We can discuss further if you have a valid objection to Vaidman's position, but I will not waste any further time explaining the basics if you do not even have the intention to understand them.
 
  • #94
What? You said:

For the last time: tell me where Vaidman is wrong. His results are not just valid on Mondays or for years up to 2010. They are pretty general. It is pretty well known that identifying single, unaveraged weak measurement results with elements of reality is a fallacy.

Wrong about what? What are you talking about?

What does Vaidman have to do with the experiment carried out by Lundeen?

Here's more from Lundeen:

The wavefunction is the complex distribution used to completely describe a quantum system, and is central to quantum theory. But despite its fundamental role, it is typically introduced as an abstract element of the theory with no explicit definition [1,2]. Rather, physicists come to a working understanding of the wavefunction through its use to calculate measurement outcome probabilities by way of the Born rule [3]. At present, the wavefunction is determined through tomographic methods [4–8], which estimate the wavefunction most consistent with a diverse collection of measurements. The indirectness of these methods compounds the problem of defining the wavefunction. Here we show that the wavefunction can be measured directly by the sequential measurement of two complementary variables of the system. The crux of our method is that the first measurement is performed in a gentle way through weak measurement [9–18], so as not to invalidate the second. The result is that the real and imaginary components of the wavefunction appear directly on our measurement apparatus. We give an experimental example by directly measuring the transverse spatial wavefunction of a single photon, a task not previously realized by any method.


Again:

We give an experimental example by directly measuring the transverse spatial wavefunction of a single photon, a task not previously realized by any method.


You seem to dodge Lundeen like Superman dodges bullets. It's almost like you just stick your head in the sand and deny, deny, deny regardless of the facts. It's like the guy said at the conference where ensemble interpretations were accepted by 3% of the attendees.

However, you implicitly make statements that quantum mechanics can't achieve certain things – even though it can.

How many times does Lundeen have to say he measured the wave function of a single particle? Lundeen:

The benefit of this reduction in precision is a commensurate reduction in the disturbance to the wavefunction of the single photon.


In short, by reducing the disturbance induced by measuring X and then measuring P normally, we measure the wavefunction of the single particle.


You keep talking about everything but Lundeen. Show me where Lundeen said he didn't directly measure the wave function of a single photon.
 
  • #95
Lundeen may be wrong. He is contradicted by Lundeen. :)

http://arxiv.org/abs/1112.3575

"Indeed, it is impossible to determine a completely unknown wavefunction of single system [20]."

"In contrast, we introduce a method to measure ψ of an ensemble directly."
 
  • #96
atyy said:
"Indeed, it is impossible to determine a completely unknown wavefunction of single system [20]."

And obviously so:

bhobba said:
If we observe a state with an apparatus that gives 0 if it's not in that state and 1 if it is, then the quantum formalism tells us that, since states can be a superposition of those two outcomes, it may be in a state that sometimes gives 0 and sometimes 1. To determine it is in that state you need to carry out the observation a sufficiently large number of times for the null result to be below your level of confidence - you can never be sure - all you can do is make the chances of being wrong arbitrarily small, i.e. zero for all practical purposes.

There is no 'argument' about it - if QM is correct YOU CAN'T DO IT - it's a simple, almost trivial, result from its basic axioms.

Thanks
Bill
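A tiny numeric illustration of that point (hypothetical probability): if the state gives the "1" outcome with probability p < 1, the chance that N identically prepared runs all give "1" is p^N, so a single run certifies nothing and only a large N pushes the chance of being fooled below a chosen confidence level.

```python
# Chance of never seeing the null result in N runs (illustrative value of p).
p = 0.99          # hypothetical Born probability of the "1" outcome
for n in (1, 10, 100, 1000):
    print(f"N = {n:5d}: probability that all runs give '1' = {p**n:.3e}")
```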
 
  • #97
matrixrising said:
Wrong about what? What are you talking about?

What does Vaidman have to do with the experiment carried out by Lundeen?

You claim that a single weak measurement result is meaningful in Lundeen's experiment. Vaidman says that a single weak measurement result is never meaningful. It is not hard to see the problem here.

matrixrising said:
You keep talking about everything but Lundeen. Show me where Lundeen said he didn't directly measure the wave function of a single photon.

Oh, I do not deny that Lundeen did that. I just say that your former claim shows that you do not know what these terms mean. There are 3 or 4 terms here that need to be treated with caution:

1) measured (not that much of a deal): means weakly measured.
2) wave function: has been defined by Lundeen in his paper: "the average result of a weak measurement of a variable followed by a strong measurement of the complementary variable"
3) direct: means 'not by tomography'/'not by max likelihood reconstruction'. It does not mean something like a single shot measurement.
4) single photon: Can be interpreted in two correct ways here: First, as an ensemble of identically prepared single particle realizations. In this case, measurement means getting the full and accurate description of the ensemble. Second, Lundeen performs a weak measurement of the wave function on every single particle realization. This single result is meaningless on its own (see Vaidman) as it does not correspond to an element of reality, but measurements with meaningless results are of course still measurements. Version 1 is the more probable definition (see atyy's last post).

Can you tell me where Lundeen writes something that supports your position? So far I have not seen anything.

edit: You also seem to be implying that the ensemble interpretation says that QM cannot be applied to a single system or particle. This is of course also wrong. It may well be applied to a single system or particle, and predicts the probability that that single system will yield a given value of one of its properties on repeated measurements. See the Wikipedia entry on the ensemble interpretation or any modern article on it for more details.
 
  • #98
What experiments have been done to determine the speed of collapse in interpretations with collapse? For example, in the double slit, when the detector at one slit detects the particle, the entire wave function collapses. Does the collapse travel at the speed of light, or instantaneously, between the detectors at the two slits? What experiments akin to this have been done to determine whether it is instantaneous or travels at the speed of light? Or can no experiment be done to determine it, and why?
 
  • #99
kye said:
What experiments have been done to determine the speed of collapse in interpretations with collapse? For example, in the double slit, when the detector at one slit detects the particle, the entire wave function collapses. Does the collapse travel at the speed of light, or instantaneously, between the detectors at the two slits? What experiments akin to this have been done to determine whether it is instantaneous or travels at the speed of light? Or can no experiment be done to determine it, and why?

These days collapse is often associated with decoherence - it explains APPARENT collapse. I believe it happens VERY VERY quickly but can't recall the exact time scales off the top of my head. I do know it has been measured, but they had to arrange for it to be slower than usual to do it.

Undoubtedly a google search would yield more concrete figures.

Thanks
Bill
 
  • #100
bhobba said:
These days collapse is often associated with decoherence - it explains APPARENT collapse. I believe it happens VERY VERY quickly but can't recall the exact time scales off the top of my head. I do know it has been measured, but they had to arrange for it to be slower than usual to do it.

Undoubtedly a google search would yield more concrete figures.

Thanks
Bill

Decoherence is not the same as collapse because it just gives a mixed state and the Born rule isn't invoked in decoherence; collapse is additional to decoherence... (don't you agree?) I think true collapse is related to the term "dynamical collapse".
 
