
Wave function collapse and the statistical nature of quantum states

  1. Nov 15, 2009 #1
    Sorry for a (maybe) dumb question, but... I understand that according to QM, a particle or system is described by a linear superposition of the wave functions of all the possible states (eigenstates) of the system. When a measurement is made, the wave function, according to garden-variety QM, "collapses" to the state measured. The probability of getting a particular result is proportional to the squared modulus (in the complex sense) of the coefficient of the eigenstate measured. All that being said, it seems to me that there are reasonable scenarios which require neither this composite wave function nor its collapse. Let me lay out my thoughts for someone to attack...

    If the system is simply in a single state, then the measurement identifies (measures) that state. If I make multiple measurements over a large sample of equivalent systems, I get a distribution of results. QM would say that is because of the nature of the composite wave function. An equally good, and much simpler explanation is that I am simply drawing results from systems already in different single eigenstates, and the distribution I get simply reflects the distribution of eigenstates in the sample population. For the situation I am describing, there is no way to tell the difference between these explanations. Maybe here, the composite wave function is simply an artifact of the QM formalism???
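    As an illustration of the ambiguity being described (the states and probabilities below are my own choices, not from the post): as long as you measure only in one fixed basis, a superposition and a classical mixture of eigenstates really are statistically indistinguishable.

```python
import numpy as np

# Hypothetical two-state system, measured only in one basis {|0>, |1>}.
rng = np.random.default_rng(0)
N = 100_000

# Interpretation A: every system is in the superposition
# |psi> = sqrt(0.3)|0> + sqrt(0.7)|1>, which "collapses" on measurement.
outcomes_A = rng.choice([0, 1], size=N, p=[0.3, 0.7])

# Interpretation B: each system was already in a definite eigenstate,
# with 30% of the ensemble in |0> and 70% in |1>.
outcomes_B = rng.choice([0, 1], size=N, p=[0.3, 0.7])

# Restricted to this single measurement basis, the two ensembles give
# the same statistics -- exactly the ambiguity raised above.
print(outcomes_A.mean(), outcomes_B.mean())  # both ~0.7
```

    The replies below explain why this ambiguity disappears once you allow measurements in more than one basis on correlated pairs.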
     
  3. Nov 15, 2009 #2

    Fredrik

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    There actually is a way to tell the difference. The possibility you're describing is ruled out by Bell inequality violations, so you might want to look into that.

    There are many threads about it here, so you can try a search. There are some crackpot posts in some of them, so you should focus on the posts made by science advisors. There's a good derivation of a Bell inequality in Isham's book: 215, 216
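    A minimal numerical sketch of the point (the CHSH form of the inequality; the angles and notation are mine, not from Isham's derivation): any local deterministic assignment of pre-set answers satisfies |S| <= 2, while quantum mechanics for the singlet state reaches 2*sqrt(2).

```python
import itertools
import math

# Local hidden variables: each particle carries pre-set answers +/-1 for
# both of its detector settings: (a, a2) for particle 1, (b, b2) for particle 2.
best = 0.0
for a, a2, b, b2 in itertools.product([-1, 1], repeat=4):
    S = a*b - a*b2 + a2*b + a2*b2   # CHSH combination
    best = max(best, abs(S))
print(best)  # every local deterministic model gives |S| <= 2

# Quantum mechanics for the spin singlet: E(t1, t2) = -cos(t1 - t2)
# between analyzer directions.  Angles 0, 90 for side 1 and 45, 135 for side 2:
def E(t1, t2):
    return -math.cos(math.radians(t1 - t2))

S_qm = E(0, 45) - E(0, 135) + E(90, 45) + E(90, 135)
print(abs(S_qm))  # 2*sqrt(2) ~ 2.83, violating the classical bound
```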
     
    Last edited: Nov 15, 2009
  4. Nov 15, 2009 #3

    DrChinese

    User Avatar
    Science Advisor
    Gold Member

    As Fredrik says, that is ruled out by Bell. There are also a host of newer theorems (GHZ, Hardy, Leggett, Cabello, etc.) that also rule out the so-called "realistic" or "hidden variable" interpretations.

    Now, there are some that are still on the table that have explicit non-local components (the Bohmian types). However, they too must demonstrate what is called contextuality - which is to say a form of observer dependence. The upshot is that particles do not have classical attributes that are simply waiting to be revealed. What you see is a function of what you look for.
     
  5. Nov 15, 2009 #4
    To LAncienne: How would you interpret the Stern Gerlach experiment? Would you say that electrons enter the magnetic field already in either the spin-up or spin-down state, and the apparatus merely separates them accordingly? Because remember you can also prepare a spin-sideways beam of electrons. Is this a problem for your interpretation?
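    The Stern-Gerlach point can be made quantitative with a short sketch (the vector notation is mine): a spin-sideways beam looks exactly like a 50/50 up/down mixture when measured along z, but the two pictures disagree as soon as you measure along x.

```python
import numpy as np

# Basis: |up> = (1, 0), |down> = (0, 1); "spin-sideways" is |+x>.
up = np.array([1.0, 0.0])
down = np.array([0.0, 1.0])
plus_x = (up + down) / np.sqrt(2)

def prob(state, outcome):
    """Born rule: probability that `state` is found in `outcome`."""
    return abs(np.dot(outcome, state)) ** 2

# Measured along z, |+x> is indistinguishable from a 50/50 up/down mixture:
print(prob(plus_x, up), prob(plus_x, down))               # 0.5 0.5

# Measured along x, the pictures disagree: |+x> always gives +x ...
print(prob(plus_x, plus_x))                               # 1.0
# ... while a pre-existing 50/50 mixture of up and down gives only 50/50.
print(0.5 * prob(up, plus_x) + 0.5 * prob(down, plus_x))  # 0.5
```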
     
  6. Nov 15, 2009 #5
    If you want to have a local deterministic model, you need to invoke superdeterminism, like 't Hooft does here:


    http://arxiv.org/abs/0908.3408
     
  7. Nov 15, 2009 #6
    You should have a look at the lectures at
    http://www.princeton.edu/WebMedia/lectures/
    of Mermin and another by Conway (Kochen Specker theorem).

    Basically the problem is that in these hidden variable theories, the particles need to know in advance what is going to be measured. Otherwise they couldn't keep a list of replies to every possible measurement once they are apart.

    But I'm also working on understanding this.

    @Count Iblis:
    I'm actually interested in this, but after a first reading I wasn't sure what he meant. Why isn't he using a specific cellular automaton that has local rules, yet visibly violates Bell's inequality?
     
  8. Nov 16, 2009 #7

    zonde

    User Avatar
    Gold Member

    There is a problem here with definite states. The experiment can be arranged with a continuous change of measurement basis: for example, you can rotate a polarizer by an arbitrary angle, but your system is in a single eigenstate only for certain polarizer angles.
    So you have to define what your system will do when your single state does not perfectly match the measurement.
     
  9. Nov 16, 2009 #8
    He hasn't demonstrated precisely how Bell's inequalities will be violated. His point (also made in other papers) is that the derivation of Bell's inequalities is invalid, because it assumes that the observer could have chosen a counterfactual experimental setting while leaving the measured system unchanged. Precisely because the model is deterministic, this is impossible.

    The observer is also deterministic. If you give the observer the freedom to choose the experimental setting, then you have to implement that by specifying a set of physically acceptable initial states of the universe, not by assuming that any local change in the current state is a physically acceptable state. It most likely is not, as evolving such a state back in time would most likely lead to the entropy increasing again.
     
  10. Nov 17, 2009 #9
    I think I need to go through all of the Bell arguments myself.
    Maybe the CHSH paper itself. Or which approach do you recommend, so that I can convince myself of the weaknesses?

    Are you saying the theory could be local, but the types of measurement would be predetermined and have to be found self-consistently with the universe?

    Also, the experimenter cannot change the experiment in hindsight, but the state of the particle still has to be prepared for all of the experimenter's possible questions? Otherwise at some point it will fail the laws of QM.

    That is possible, but it's equivalent to failing the intended purpose of HV theories. They wanted to make QM more intuitive through a familiar local picture. A predetermined-measurement theory is like MWI: it might relieve us from philosophical difficulties, but it doesn't help the understanding.

    Maybe I can outline the Bell argument and you tell me at which point I should search for a flaw:
    1. I will perform experiments labeled with index i on two particles x and y flying apart
    2. the particles will have states [itex]x_i[/itex] and [itex]y_i[/itex] in these experiments
    3. the detectors will be set to states [itex]r_i[/itex] and [itex]s_i[/itex]
    4. the outcomes of the experiment are given by a physical law, i.e. a function; therefore the outcomes are [itex]f(x_i,r_i)[/itex] and [itex]f(y_i,s_i)[/itex]
    5. if the detectors are perfectly set to one of 3 states, then we could characterize a particle pair by its outcomes in all possible experiments, [itex](f(x_i,1),f(x_i,2),f(x_i,3))\times(f(y_i,1),f(y_i,2),f(y_i,3))[/itex], which is all we need to know about the state.
    6. writing out a big table [itex](f(x_i,1),f(x_i,2),f(x_i,3))\times(f(y_i,1),f(y_i,2),f(y_i,3))[/itex] where [itex]f(\cdot,\cdot)=\pm 1[/itex], one can use set theory to write down a Bell inequality, since the probabilities in this table add. Each experiment will fall into one of these 64 cells.
    7. if we assume that some states have zero probability (like [itex]P([+++,-++])=0[/itex]), then experiments seem to violate the inequality (if the given example weren't zero, then whenever we measure in detector setting 2 or 3 we'd notice an inconsistency with the laws of QM)
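    The table argument in steps 5-6 can be checked mechanically. Below is a sketch in Wigner's form of the inequality (the perfect-anticorrelation assumption and the specific angles are my additions, not from the post): the inequality holds in every cell of the table, so it holds for any probability weights on the cells, yet the quantum prediction breaks it.

```python
import itertools
import math

# Assume perfect anticorrelation, f(y_i, k) = -f(x_i, k), so each particle
# pair is fully described by x's three pre-set answers (a1, a2, a3).
# "P(i+, j+)" means: setting i on x gives +1 AND setting j on y gives +1,
# i.e. a_i = +1 and a_j = -1.
for a1, a2, a3 in itertools.product([-1, 1], repeat=3):
    lhs = int(a1 == 1 and a2 == -1)                        # cell in P(1+, 2+)
    rhs = int(a1 == 1 and a3 == -1) + int(a3 == 1 and a2 == -1)
    assert lhs <= rhs  # holds in every single cell ...

# ... and since cell probabilities add, ANY distribution over the cells
# obeys  P(1+, 2+) <= P(1+, 3+) + P(3+, 2+).
# Quantum mechanics for the spin singlet gives P(i+, j+) = sin^2(d/2) / 2
# for analyzers d degrees apart.  With settings 1, 3, 2 at 0, 60, 120 degrees:
p = lambda d: 0.5 * math.sin(math.radians(d) / 2) ** 2
print(p(120), "<=", p(60) + p(60))   # 0.375 <= 0.25 is FALSE: violation
```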
     
    Last edited: Nov 17, 2009
  11. Nov 17, 2009 #10
  12. Nov 17, 2009 #11
  13. Nov 17, 2009 #12
    It is the assumption that you can choose r_i and s_i without affecting x_i and y_i.
     
  14. Nov 17, 2009 #13
    Are you saying that I will not be able to make a measurement inconsistent with QM or Bell, because the universe dictates what I'm going to measure anyway?

    That may be true, but in such a case that also means that there is no intuitive model of QM?!

    The state of the particles is chosen at time t, and the state of the detector also. Of course the actual result of the measurement depends on how the particle arrives at the detector, but that's all included in f(x,r).
     
  15. Nov 17, 2009 #14
    Well, folks, this has wandered interestingly (sometimes) far from the original question! I was not particularly shilling for any interpretation, just wondering if there was a situation where I could not conclusively tell from the results what the wave function was (had to have been? might have been?).

    Let me lay out a really simple gedanken example. I have an ensemble of equivalent systems, all of which have three (the same three ;-{) ) eigenstates. I have a detecting apparatus that can determine which state each system is in. Using this apparatus, I measure a large number of the systems and get a distribution of results, i.e., the number of measurements that yield each eigenstate. Seem reasonable? I do not worry about any composite states for the apparatus and ensemble, or, as Heisenberg once put it, where the "Schnitt" (the cut dividing the quantum and macroscopic domains) lies. My point is that, in this situation, I can't tell the difference between the two interpretations. I am sure that there are valid reasons why the QM formalism would indicate that I had interpreted my results incorrectly, BUT, in this instance, there is ambiguity. What's wrong with this? (Remember, the ambiguity, NOT the interpretation!!)
     
  16. Nov 17, 2009 #15
    What is the cellular automata bit, coming from left field?
     
  17. Nov 17, 2009 #16

    Fredrik

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    All of this has already been answered. It isn't possible to determine which state the system is in. The observables of a theory that says that this is possible satisfy Bell inequalities, and Bell inequalities do not hold in the real world, so the possibility you're describing has been ruled out by experiments.
     
  18. Nov 17, 2009 #17

    Well, you can interpret the violation of Bell's inequality as ruling out the assumption that you could have made a counterfactual measurement without affecting the exact state of the measured system. That the choice of the setting of polarizer 2 will not affect the outcome measured at polarizer 1 is not necessarily true in a local deterministic theory.
     
  19. Nov 17, 2009 #18
    All of this has already been answered. It isn't possible to determine which state the system is in. The observables of a theory that says that this is possible satisfy Bell inequalities, and Bell inequalities do not hold in the real world, so the possibility you're describing has been ruled out by experiments.

    This is either getting weird or we are not talking about the same thing, somehow. When I make a concrete observation about a thing in the quantum realm (system, particle or ?), I have determined at least one aspect of its state as a result of the observation (else what is all this wave function collapse stuff about? collapses to what?... a measured, observed state...). Think single photons from a very weak source, and three detectors in a row that absorb horizontal, then vertical, then circular polarization. If the first one absorbs the photon, I have actually determined that the polarization of the photon was horizontal, and so on. Whether it was actually a mixture and the first absorber "forced" it, in some sense, to be so detected, or whether it actually was horizontally polarized, I have detected at least this aspect of its state, haven't I?
     
  20. Nov 18, 2009 #19

    Fredrik

    User Avatar
    Staff Emeritus
    Science Advisor
    Gold Member

    I think we are talking about the same thing, and yes, it's pretty weird.

    You have only determined what the state is after the measurement. There's no measurement that can tell you what the state was before the measurement, if you didn't know it already as a result of a previous measurement. The only way to determine an unknown state is to do measurements on each member of a very large ensemble of identically prepared systems.

    The "collapse" is the change that the wavefunction goes through when you measure an observable. The wavefunction is always changed by the measurement, unless it was already in an eigenstate of the observable you're measuring.

    If the source prepares the photons in the same polarization state every time, then you can find out what state that is by doing polarization measurements on lots of photons. But a single measurement can't tell you what state the photon was in before the measurement. When "the first one absorbs the photon", the only conclusion you can make about the state before the absorption is that it wasn't such that the probability of absorption was zero. So all you know is that it wasn't a state of linear polarization in the vertical direction. It could still be any other state. (And if your detector isn't perfect, it could be that one too).
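    A quick sketch of that last point (the example states are my own): for a detector that absorbs horizontal polarization, every linear polarization except vertical has a nonzero chance of producing exactly the click you saw, so a single click cannot tell you which state the photon was in.

```python
import math

# Born-rule absorption probability for a horizontal-absorbing detector,
# given linear polarization at angle theta (0 = horizontal, 90 = vertical).
def p_absorb(theta_deg):
    return math.cos(math.radians(theta_deg)) ** 2

for theta in [0, 30, 45, 60, 89, 90]:
    print(theta, round(p_absorb(theta), 3))
# Only theta = 90 gives probability zero; every other state is consistent
# with the observed click, which is why the click only RULES OUT vertical.
```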

    Isn't quantum mechanics fun? :smile:
     
  21. Nov 18, 2009 #20

    Fra

    User Avatar

    I think several people have tried to make this point already, but...

    The QM state vector/wave function of a system defines probabilistically the possible results of a measurement. A system with a well-defined state is said to be pure.

    Of course this formalism breaks down when the QM state is not pure (a perfectly known state is of course an idealisation; you never have perfect information, so it's simply not realistic), because if the state vector itself is uncertain, then you do not even know the possible outputs probabilistically.

    The question is then how to save the QM formalism given this obvious issue?

    Probably the simplest possible resolution of this, which is the standard formalism for dealing with impure states, is the density matrix formalism,
    http://en.wikipedia.org/wiki/Density_matrix

    where you simply consider the idea that the quantum state is smeared by a classical probability distribution. So the quantum state of the system is only known probabilistically, and each possibility (each quantum state) in turn defines an outcome probabilistically.
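    A small sketch of that construction (the particular states and weights are my own, not from the linked article): build the density matrix of a classical mixture, get probabilities from the trace rule, and check the standard purity test.

```python
import numpy as np

# Classical 50/50 mixture of two pure polarization states:
H = np.array([1.0, 0.0])                # horizontal
D = np.array([1.0, 1.0]) / np.sqrt(2)   # diagonal (45 degrees)

rho = 0.5 * np.outer(H, H) + 0.5 * np.outer(D, D)

# Born rule via the trace: P(outcome) = Tr(rho @ projector)
P_H = np.outer(H, H)
print(np.trace(rho @ P_H))   # 0.5*1 + 0.5*0.5 = 0.75

# Purity test: a pure state has Tr(rho^2) = 1, a proper mixture has < 1.
print(np.trace(rho @ rho))   # 0.75 here, so this rho is genuinely mixed
```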

    One may have objections to this resolution on philosophical grounds, but the main point, that you can't know the state vector with certainty from only a finite number of measurements, should be intuitively clear, I think?

    /Fredrik
     