Mathematically what causes wavefunction collapse?

  • #51
There are several contenders that "explain" wave function collapse, but the one I lean towards is the Many Worlds Interpretation. That said, the usual version leaves a lot to be desired in that it requires infinite dimensions (in state space), and I am far happier with the introduction of another time dimension. The way to think about this is that there are many universes, all at a different angle to each other. See if this makes any sense to you. Not my idea, but it's a good one!
http://arxiv.org/pdf/quant-ph/9902037v3.pdf
 
  • #52
stevendaryl said:
With a very small number of flips, it's clearer that nobody would believe the frequentist prediction; just because a coin produced heads-up twice in a row doesn't mean it'll produce heads-up three times in a row. When the number of flips gets very large, the frequentist predictions get more sensible, but also, the difference between frequentist and Bayesian predictions diminishes.

This encapsulates the reason that I posted it pretty well.

Until we can be clear about whether a probability represents a property of an object or a subject's knowledge of a system, and until we have an explanation for what the hypothetical, or even real, infinite population that we're sampling actually is, we can't hope to avoid other interpretational issues when applying the formalism to the real world.
 
  • #53
Sugdub said:
Whether the state vector established through running the experiment in an iterative way can be projected as a property of each iteration taken individually is a dogma, not an experimental fact.

I would point out the same could be said about flipping a coin and assigning probabilities to it. In modern times probability is defined by the Kolmogorov axioms, which treat it as an abstract property assigned to an event (in your terminology, an iteration).

One then shows, via the law of large numbers (which is mathematically provable as a theorem from those axioms, plus a few reasonableness assumptions of the sort used in applied math all the time, but no need to go into that here), that for all practical purposes, if the experiment is done enough times, the proportion of an event will equal its probability. This is the view taken by the Ensemble interpretation and it is what the state applies to: a conceptualization of a large number of iterations, events, etc. such that the proportion is the probability predicted by the Born rule. When one makes an observation, in that interpretation, it is selecting an element from that ensemble, and wave-function collapse, since it applies only to this conceptual ensemble and to nothing in any sense real, is of no concern at all.

It is a fundamental assumption of the theory that such a thing is possible, but, like heaps of stuff in physics, it is usually not explicitly stated - merely mentioning probabilities in the Born rule is taken to imply it. It's like when one defines acceleration as the derivative of velocity: you are implicitly assuming the second derivative of position exists.

There is another view of probability that associates this abstract thing, probability, as defined in the Kolmogorov axioms, with a subjective confidence in something. This is the Bayesian view and is usually expressed via the so-called Cox axioms - which are equivalent to the Kolmogorov axioms. This view leads to an interpretation along the lines of Copenhagen, which takes the state as applying to an individual system, but as expressing a subjective confidence.

But we also have a very interesting theorem called Gleason's theorem. What this theorem shows is that if you want to associate a number between 0 and 1 with elements of a Hilbert space, and do it in a mathematically consistent way that respects the basis independence of those elements, then the only way to do it is via the Born rule. The reason this theorem is not usually used to justify the Born rule is that the physical significance of that mathematical assumption is an issue - it's tied up with what's called contextuality - but no need to go into that here. The point is there is quite a strong reason to believe the only reasonable way to assign probabilities to quantum events is via the Born rule. Oh, and I forgot to mention it can be shown the Born rule obeys the Kolmogorov axioms - that proof is not usually given, because when an axiom says something 'gives the probability' it is assumed that it does, but Ballentine, for example, is careful enough to show it.
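As a rough numerical illustration of that last remark (my own sketch, with an arbitrarily chosen 4-dimensional space and a randomly generated orthonormal basis), the Born-rule numbers do behave like Kolmogorov probabilities - each lies in [0, 1] and they sum to 1:

```python
import numpy as np

rng = np.random.default_rng(0)

# A random normalized state vector in a 4-dimensional Hilbert space.
psi = rng.normal(size=4) + 1j * rng.normal(size=4)
psi /= np.linalg.norm(psi)

# A random orthonormal basis: the columns of Q from a QR decomposition.
Q, _ = np.linalg.qr(rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4)))

# Born-rule numbers p_i = |<e_i|psi>|^2.
probs = np.abs(Q.conj().T @ psi) ** 2

print(probs)        # every entry lies in [0, 1]
print(probs.sum())  # and they sum to 1, as the Kolmogorov axioms require
```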

The bottom line here is that physicists didn't pull this stuff out of a hat - it's more or less forced on them by the Hilbert space formalism.

Thanks
Bill
 
  • #54
stevendaryl said:
With a very small number of flips, it's clearer that nobody would believe the frequentist prediction; just because a coin produced heads-up twice in a row doesn't mean it'll produce heads-up three times in a row. When the number of flips gets very large, the frequentist predictions get more sensible, but also, the difference between frequentist and Bayesian predictions diminishes.

I think if you go even further back to the Kolmogorov axioms you would not fall into any of this in the first place.

The frequentist view requires a very large number of trials for the law of large numbers to apply - the exact number depends on what value in the convergence in probability you are willing to accept as, for all practical purposes, zero; e.g. you could use the Chebyshev inequality to figure out a number of trials that gives a sufficiently low probability. Still, it is a very bad view for carrying out experiments to estimate probabilities. The Bayesian view is much better for that, because you update your confidence as you go - you simply keep going until you have a confidence you are happy with. However, for other things the frequentist view is better - you choose whatever view suits the circumstances, knowing they both derive from their real justification - the Kolmogorov axioms.
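For what it's worth, here is a little sketch of the kind of Chebyshev estimate meant here (illustrative numbers only, assuming independent Bernoulli trials):

```python
# How many independent trials does Chebyshev's inequality say are enough so
# that the relative frequency is within eps of the true probability, except
# with probability at most delta?  (Illustrative numbers only.)

def chebyshev_trials(eps: float, delta: float) -> int:
    # For a Bernoulli(p) trial, Var(p_hat) = p(1 - p)/n <= 1/(4n), so
    # P(|p_hat - p| >= eps) <= 1/(4 n eps^2) <= delta  once  n >= 1/(4 eps^2 delta).
    return int(1 / (4 * eps ** 2 * delta)) + 1

print(chebyshev_trials(eps=0.01, delta=0.01))  # roughly 250,000 trials
```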

I think it's Ross, in his book on probability models, who points out that regardless of which view you subscribe to, it's very important to learn how to think probabilistically, and that usually entails thinking in terms of what applies best to a particular situation.

But it's good to know the real basis for both is the Kolmogorov axioms, and that the Bayesian and frequentist views are really just different realizations of those axioms.

Thanks
Bill
 
  • #55
craigi said:
Until we can be clear about whether a probability represents a property of an object or a subject's knowledge of a system, and until we have an explanation for what the hypothetical, or even real, infinite population that we're sampling actually is, we can't hope to avoid other interpretational issues when applying the formalism to the real world.

I say it represents neither - it represents a number that obeys the Kolmogorov axioms. Both the Bayesian and frequentist approaches are simply different realizations of those axioms. You choose the view that suits the circumstances.

If you want to use the frequentist view in QM then you are led to something like the Ensemble interpretation.

If you want the Bayesian view you are led to Copenhagen.

In the MWI the Bayesian view seems to work best, because the 'probability' represents a confidence that you will find yourself in a particular world - viewing it in a random way, like throwing a die, doesn't sit well with a deterministic theory.

I think Consistent Histories views it the Bayesian way as well.

Thanks
Bill
 
  • #56
stevendaryl said:
In contrast, the Bayesian probability is more complicated to compute. It's something like, letting p be the unknown probability of "heads":

If I remember correctly, and it's ages since I studied Bayesian statistics, what you usually do is assign it some reasonable starting probability, such as, for a coin, 1/2 for heads and 1/2 for tails, and then carry out experiments to update this probability until you get it at a confidence level you are happy with.

There is something in the back of my mind, from mathematical statistics classes attended 30 years ago now, that this converges quicker than using things like the Chebyshev inequality to estimate the number of trials needed for a reasonable confidence level - but don't hold me to it.

But in QM we have this wonderful theorem, Gleason's theorem: if you want a probability that respects the vector-space formalism and whose properties do not depend on a particular basis, then the Born rule is the only way to do it.

Of course that assumption may not be true - but you really have to ask yourself why use a Hilbert space formalism in the first place if it isn't.

Thanks
Bill
 
  • #57
Sugdub said:
Whether the state vector established through running the experiment in an iterative way can be projected as a property of each iteration taken individually is a dogma, not an experimental fact.

bhobba said:
This is the view taken by the Ensemble interpretation and it is what the state applies to: a conceptualization of a large number of iterations, events, etc. such that the proportion is the probability predicted by the Born rule. When one makes an observation, in that interpretation, it is selecting an element from that ensemble, and wave-function collapse, since it applies only to this conceptual ensemble and to nothing in any sense real, is of no concern at all.

Hmm, it seems there is more than one Ensemble interpretation out there:
Einstein said: "The attempt to conceive the quantum-theoretical description as the complete description of the individual systems leads to unnatural theoretical interpretations, which become immediately unnecessary if one accepts the interpretation that the description refers to ensembles of systems and not to individual systems."


bhobba said:
But we also have a very interesting theorem called Gleason's theorem. What this theorem shows is that if you want to associate a number between 0 and 1 with elements of a Hilbert space, and do it in a mathematically consistent way that respects the basis independence of those elements, then the only way to do it is via the Born rule.
Gleason's theorem does not say what these numbers mean physically, right? But the Born rule says that these numbers are probabilities.
 
  • #58
zonde said:
Hmm, it seems there is more than one Ensemble interpretation out there:
Einstein said: "The attempt to conceive the quantum-theoretical description as the complete description of the individual systems leads to unnatural theoretical interpretations, which become immediately unnecessary if one accepts the interpretation that the description refers to ensembles of systems and not to individual systems."

Like most interpretations it has a number of variants. The one Einstein adhered to is the one presented by Ballentine in his book, and the usual one people mean when they talk about it. And indeed it refers to an ensemble of systems, exactly as I have been saying in this thread about the state referring to an ensemble of similarly prepared systems - it's the one more or less implied if you want to look at probability the frequentist way.

I hold to a slight variant, however - called the ignorance ensemble interpretation - that incorporates decoherence; check out the following for the details:
http://philsci-archive.pitt.edu/5439/1/Decoherence_Essay_arXiv_version.pdf

zonde said:
Gleason's theorem does not say what these numbers mean physically, right? But Born rule says that these numbers are probabilities.

No, it doesn't. But if you want to define a probability on the vector space and you want it not to depend on your choice of basis (this is the assumption of non-contextuality, which in the Hilbert space formalism seems almost trivial - it actually took physicists like Bell to sort out exactly what was going on), it proves there is only one way to do it.

The assumption you make if you accept Gleason's theorem would go something like this - I don't know what outcome will occur, but it seems reasonable that I can associate some kind of probability with each of them. And if you do that, then what the theorem shows is there is only one way to do it, namely via the Born rule, and moreover that way obeys the Kolmogorov axioms. That is in fact a very innocuous assumption, because all you are really doing is saying some kind of reasonable confidence level can be associated with each outcome, as in the Cox axioms. Or you believe that if you do the observation enough times it will tend to a steady limit. But strictly speaking - yes, it's an assumption - however it's so innocuous most would probably not grant it that status - I personally wouldn't.

Thanks
Bill
 
  • #59
bhobba said:
If I remember correctly, and it's ages since I studied Bayesian statistics, what you usually do is assign it some reasonable starting probability, such as, for a coin, 1/2 for heads and 1/2 for tails, and then carry out experiments to update this probability until you get it at a confidence level you are happy with.

The way I have used Bayesian probability in the past (and I'm uncertain about the relationship between Bayesian probability and Bayesian statistics), what you are trying to do is to describe your situation in terms of parameters, and then use whatever data is available (including none!) to estimate the likelihood of the various possible values of those parameters.

So relative frequency only very indirectly comes into play. The probabilities are degrees of belief in the values of something, and that something may not be a "random variable" at all--it might be a constant such as the mass of some new particle. Actually, that's usually the case: the parameters that you are dealing with are usually one-of-a-kind things, not repeatable events. As for confidence intervals, I don't think those are as important in Bayesian probability as in frequentist statistics. A probability is your confidence in the truth of some claim.

In general, you have some parametrized theory, and you're trying to figure out the values for the parameters.

The way that I would handle the problem of coin tosses would be to parameterize by a parameter p (the probability of heads) that ranges from 0 to 1. This parameter, like any other unknown parameter, has a probability distribution for its possible values. Then you use the available data to refine that probability distribution.

So initially, you guess a flat distribution:

##P(p) = 1## for the range ##0 \leq p \leq 1##

According to this flat distribution for p, you can compute your prior estimate of the likelihood of heads:

##P(H) = \int_0^1 dp\, P(p) \cdot P(H \mid p) = \int_0^1 dp\; 1 \cdot p = 1/2##

So I come to the same conclusion, that the likelihood of getting "heads" based on no data at all, is 1/2. But it's not that I guessed that--that's computed based on the guess that the parameter p has a flat distribution in the range [0,1].
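
Pushing the same calculation one step further (my own sketch, still assuming the flat prior above): after observing ##k## heads in ##n## flips, Bayes' theorem updates the distribution for ##p##, and the predictive probability of heads becomes

##P(p \mid \text{data}) \propto P(\text{data} \mid p)\, P(p) \propto p^k (1 - p)^{n - k}##

##P(H \mid \text{data}) = \int_0^1 dp \; p \, P(p \mid \text{data}) = \frac{k + 1}{n + 2}##

With no data (##k = n = 0##) this reduces to the 1/2 above; after two heads in two flips it gives 3/4 rather than the frequentist estimate of 1, which is exactly the contrast discussed earlier in the thread.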
 
  • #60
bhobba said:
I would point out the same could be said about flipping a coin and assigning probabilities to it. In modern times probability is defined by the Kolmogorov axioms, which treat it as an abstract property assigned to an event (in your terminology, an iteration).

There are two aspects which require some attention.
First, one must clarify the rationale for assigning a probability (which is a form of property) to a discrete occurrence of an event-type, rather than assigning this probability to the event-type representing one category of events that may be observed when running the experiment. In the first case the probability is a property of the unique iteration of the experiment which produced the discrete information, but in the second case it is a property of the iterative implementation of the experiment. What I said in my previous input is that the second case formalises what is experimentally true, whereas the first one stems from a dogma which can be accepted or rejected. I do think that the second approach, which is minimal because it endeavours to rely exclusively on experimental truth and what can logically be derived from it, should be used as a reference whenever other approaches based on non-verifiable hypotheses lead to paradoxes.

Second, assuming the minimal approach is followed, there might be no compelling need to refer to “probabilities” at all. The “state vector”, more exactly the orientation of a unit vector, represents an objective property of a quantum experiment run in an iterative way (i.e. the distribution of discrete events over a set of event-types). The quantum formalism transforms the orientation of a unit vector into another orientation of the same unit vector. The new orientation computed by the quantum formalism relates to the objective property of a modified experiment (the distribution pattern remaining over the same set of event-types), or a combination of such experiments, still assuming an iterative run of that set-up.
It should be noted that in a manifold the orientation of a unit vector (i.e. a list of cosines) is the canonical representation for a distribution. Hence the choice of a vectorial representation for the quantum theory implies that the formalism will manipulate/transform a set of cosines (the so-called "amplitudes of probability") instead of their squared values, which account for relative frequencies. (I'm not aware of any alternative, simple explanation for this peculiar feature of the quantum formalism, often presented as a mystery, but I'd be keen to learn of one.) Eventually, references to the “probability” concept, and more significantly to the "amplitude of probability" mis-concept, can be dropped, since the former only stands for "relative frequency observed in the iterative mode" whereas the latter has lost any physical significance according to the proposed approach.

bhobba said:
This is the view taken by the Ensemble interpretation and it is what the state applies to: a conceptualization of a large number of iterations, events, etc. such that the proportion is the probability predicted by the Born rule. When one makes an observation, in that interpretation, it is selecting an element from that ensemble, and wave-function collapse, since it applies only to this conceptual ensemble and to nothing in any sense real, is of no concern at all.

I'm sorry, I don't understand this last sentence, in particular what you say about the link between the occurrence of an event and the collapse of the wave function. What I said is that a non-continuous modification of the experimental device is likely to translate into a non-continuous evolution of the observed distribution for the new device as compared to the initial distribution. There is no such thing as a collapse of the wave function triggered or induced by the occurrence of a discrete event. The so-called wave function is a property of an experiment, not a property of a “system”, and neither a state of our knowledge nor a belief.

bhobba said:
There is another view of probability that associates this abstract thing, probability, as defined in the Kolmogorov axioms, with a subjective confidence in something. This is the Bayesian view and is usually expressed via the so-called Cox axioms - which are equivalent to the Kolmogorov axioms. This view leads to an interpretation along the lines of Copenhagen, which takes the state as applying to an individual system, but as expressing a subjective confidence.

I don't think the formalism (Kolmogorov, Bayes, ...) determines whether the probability should be interpreted as a belief, as some knowledge about what may happen, or as an objective state. Only the correspondence you explicitly establish between what you are dealing with and the mathematical objects involved in the probability formalism defines what the probability you compute refers to.
In the minimal approach I recommend following, the “probability” refers to an objective property of a quantum experiment, and it actually means “relative frequency observed in the iterative mode”.
Thanks.
 
  • #61
Sugdub said:
There are two aspects which require some attention.
First, one must clarify the rationale for assigning a probability (which is a form of property) to a discrete occurrence of an event-type, rather than assigning this probability to the event-type representing one category of events that may be observed when running the experiment.

I have zero idea what you are trying to say. Being able to assign probabilities to events is pretty basic, and if it were in any way not valid, great swaths of applied mathematics, from actuarial science to statistical mechanics, would be in trouble - but they obviously aren't.

Sugdub said:
I'm sorry I don't understand this last sentence, in particular what you say about the link between the occurrence of an event and the collapse of the wave function.

It's very simple:
http://en.wikipedia.org/wiki/Ensemble_interpretation

Thanks
Bill
 
  • #62
bhobba said:
Like most interpretations it has a number of variants. The one Einstein adhered to is the one presented by Ballentine in his book, and the usual one people mean when they talk about it. And indeed it refers to an ensemble of systems, exactly as I have been saying in this thread about the state referring to an ensemble of similarly prepared systems - it's the one more or less implied if you want to look at probability the frequentist way.
If you say that a measurement outcome is described by a probability, you say that the rule applies to an individual event (relative frequencies emerge from a statistical ensemble of independent events). So you contradict what Einstein was saying.
You have to allow the possibility that the relative frequencies appear, with certainty, from a deterministic physical process. And then it's the Ensemble interpretation.

bhobba said:
The assumption you make if you accept Gleason's theorem would go something like this - I don't know what outcome will occur but it seems reasonable I can associate some kind of probability to them.
I assume that assigning a probability to an outcome might lead to false predictions.
 
  • #63
Superposed_Cat said:
Hi all, I was wondering, mathematically, what causes wave function collapse? And why does it exist in all its eigenstates before measurement? Thanks for any help, and please correct my question if I have anything wrong.


Math is just description.


 
  • #64
zonde said:
If you say that a measurement outcome is described by a probability, you say that the rule applies to an individual event (relative frequencies emerge from a statistical ensemble of independent events). So you contradict what Einstein was saying.
You have to allow the possibility that the relative frequencies appear, with certainty, from a deterministic physical process. And then it's the Ensemble interpretation.

Einstein wasn't saying that an ensemble is required, only that if we interpret QM as a description of ensembles rather than individual events we avoid "unnatural" interpretations.

In my opinion, the term "unnatural" seems to have been used in order to make the statement correct, but it also makes it completely subjective. For it to be objective he would've actually had to define what he meant by unnatural, and if I recall correctly this was effectively an expression of his frustration with indeterminism. He was asserting his own prejudices about nature. It would've been written from a position of faith in local realist hidden variable theories, which we now know to be invalid if we require counterfactual definiteness.
 
  • #65
zonde said:
You have to allow the possibility that the relative frequencies appear, with certainty, from a deterministic physical process.

I disagree with this. If everything were determined by a physical process, how would you explain something like the decay rate of an atom or particle? These events have a probability but are inherently random, or appear to be so.
 
  • #66
bhobba said:
I have zero idea what you are trying to say. Being able to assign probabilities to events is pretty basic, and if it were in any way not valid, great swaths of applied mathematics, from actuarial science to statistical mechanics, would be in trouble - but they obviously aren't.

I had a look at the Ensemble interpretation article you referred to, and I must admit I found it anything but clear. The first section displays a quote by Einstein (reproduced in this thread in #57 by zonde). I would be extremely surprised if, in the original context, Einstein used the word “system” with a different meaning than a “microscopic object”, I mean something less precise but in the same range as a “particle”. Maybe somebody could clarify this point.

In the second section of the same article, the “system” is defined as a single run of a quantum experiment, whereas an ensemble-system is defined as an iterative run of that experiment. That looks pretty similar to what I described in my previous inputs, although the use that is made of the word “system” makes the text quite hard to digest. But then the key sentence, from which one should understand if and why the ensemble interpretation assumes that the wave function is a property of one single iteration, reads as follows:
“The ensemble interpretation may well be applied to a single system or particle, and predict what is the probability that that single system will have for a value of one of its properties, on repeated measurements”.
If “system” stands for “a single iteration of the experiment”, then the sentence actually assigns the “property” to the “repeated measurements” pattern, the ensemble-system, and not to a single run. If “system” stands for a “microscopic system” (if not, the wording “system or particle” is irrational), then the sentence does not tell whether the property is assigned to a single run or not. The sentence does not include any justification anyway.
Further on an example is presented where a pair of dice, i.e. a physical object involved in the experimental device, plays the role of the so-called “system”. The ambiguity is maximal.

Let's make things simple. If one admits that the probabilistic property assigned to the iterative experiment reflects an underlying probabilistic property assigned at a more elementary level (the single iteration), then there is no reason why this second probabilistic property should not in turn reflect a third probabilistic property standing another level below, whatever form it takes. This leads to an infinite regress which can only stop when one specifies a level to which a deterministic property can be assigned. So the only realistic and credible alternative to stating that the property at the level of a single run is deterministic (which all physicists assume in the case of classical probabilities) is to accept that there is no property at all at this elementary level, so that the distribution pattern observed at the iterative level is a fundamental property which cannot be reduced to the appearance or synthesis of a more fundamental property.
I've explained in my previous input why and how the quantum formalism actually deals with transforming a distribution of relative frequencies into another distribution of the same nature, thanks to an appropriate mathematical representation using the orientation of a unit vector which makes the “amplitude of probability” an empty physical concept. The quantum formalism deals with a probabilistic property defined at the iterative level, reflecting the experimental truth.
Should there be a more fundamental property at a lower level, whichever level that means, then the quantum formalism would no longer be considered as the most fundamental theory dealing with quantum experiments. It would have to be replaced with a theory explicitly dealing with the lowest level property, and that property would necessarily be deterministic.
 
  • #67
zonde said:
If you say that a measurement outcome is described by a probability, you say that the rule applies to an individual event (relative frequencies emerge from a statistical ensemble of independent events). So you contradict what Einstein was saying.

That's simply not true.

It purely depends on your interpretation of probability. In the ensemble interpretation an observation selects an outcome from the conceptual ensemble and what that outcome is can only be described probabilistically.

In most versions of Copenhagen the state applies to an individual system, but is purely a representation of subjective knowledge about the outcome of observations.

Ballentine, correctly, points out in his book, as Einstein did, the difficulty that arises if you consider that the state applies to something more definite than an ensemble (the collapse issue is the problem), but for some reason didn't consider the case where it is simply subjective knowledge, which is what most versions of Copenhagen think of the state as.

zonde said:
I assume that assigning a probability to an outcome might lead to false predictions.

But it doesn't.

Thanks
Bill
 
  • #68
craigi said:
Einstein wasn't saying that an ensemble is required, only that if we interpret QM as a description of ensembles rather than individual events we avoid "unnatural" interpretations.

Exactly what Einstein was getting at is explained in Ballentine's book.

But basically it's the collapse issue. The ensemble interpretation is one way out; considering it purely as a state of knowledge is another.

Also note, and it bears mentioning, Einstein did NOT disagree with QM, as you will sometimes read - he considered it incomplete, not incorrect.

Thanks
Bill
 
  • #69
Jilang said:
I disagree with this. If everything were determined by a physical process, how would you explain something like the decay rate of an atom or particle? These events have a probability but are inherently random, or appear to be so.

This would actually be pretty easy to construct a viable deterministic hidden variable theory for. Where such theories have problems is when we consider separated entangled particles and contextuality.

Classical systems that are considered fundamentally deterministic exhibit apparent randomness. In fact, a system that is fundamentally indeterministic can appear deterministic, and vice versa.

Einstein believed that apparent indeterminism was fundamentally deterministic. I think that perhaps a better way to look at it is to ask how determinism emerges so convincingly from indeterminism, in our experience, that the human mind considers it to be fundamental. There are indeterministic processes taking place all around us, on all scales, all the time, but we are much more attuned to the deterministic processes.
 
  • #70
Sugdub said:
I had a look at the Ensemble interpretation article you referred to, and I must admit I found it anything but clear. The first section displays a quote by Einstein (reproduced in this thread in #57 by zonde). I would be extremely surprised if, in the original context, Einstein used the word “system” with a different meaning than a “microscopic object”, I mean something less precise but in the same range as a “particle”. Maybe somebody could clarify this point.

In discussions about QM one often encounters an analysis of a typical measurement situation consisting of preparation, transformation, then measurement.

See figure 1 in the following for a discussion:
http://arxiv.org/pdf/quant-ph/0101012.pdf

Thanks
Bill
 
  • #71
bhobba said:
zonde said:
If you say that a measurement outcome is described by a probability, you say that the rule applies to an individual event (relative frequencies emerge from a statistical ensemble of independent events). So you contradict what Einstein was saying.

That's simply not true.

It purely depends on your interpretation of probability.
Interpretation does not change prediction, right? But if events are not independent we can get results that are quite different from predictions that are made using probabilities.

Do you agree?

As an example, say we can have event + or - with equal probability (0.5). Now if we take a series of events in a large sample, we would expect that there will be series like ++++++++++ or ----------. And we can calculate how big a sample should be to expect a series like that with, say, 99.99% probability.
But if events are not independent, it is possible that series like ++++++++++ or ---------- can never appear (probability 0%) while the relative frequencies for + and - are still 0.5 and 0.5.
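
For what it's worth, here is a toy Monte Carlo sketch of this (purely illustrative; the "correlated" rule below is just one arbitrary way to forbid runs of 10 while keeping the frequencies at 0.5):

```python
import random

# Toy Monte Carlo: independent fair trials versus a correlated sequence that
# forbids runs of 10 while keeping the +/- frequencies at 0.5 each.

def sequence(n: int, independent: bool) -> str:
    out, run, prev = [], 0, None
    for _ in range(n):
        if not independent and run == 9:
            x = "-" if prev == "+" else "+"   # deliberately break the run
        else:
            x = random.choice("+-")
        run = run + 1 if x == prev else 1
        prev = x
        out.append(x)
    return "".join(out)

for independent in (True, False):
    s = sequence(100_000, independent)
    print(independent,
          round(s.count("+") / len(s), 3),     # relative frequency of +
          "+" * 10 in s or "-" * 10 in s)      # does a run of 10 ever appear?
```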
 
  • #72
zonde said:
But if events are not independent we can get results that are quite different from predictions that are made using probabilities.

Do you agree?

No.

Probability theory deals with correlated events perfectly well.

However, if you naively compute probabilities based upon an incorrect assumption of independence then your prediction will indeed be incorrect.

In fact, it's commonplace in physics to account for correlations to get the correct confidence interval for measurements.

See http://en.wikipedia.org/wiki/Covariance

It's also worth noting that correlated probabilities in quantum mechanics are not just relevant to random errors in experiments; they're actually fundamental to the theory. If there were a problem with the predictions of quantum mechanics with respect to correlated events, someone would've definitely noticed by now!
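
A small numerical sketch of that point (illustrative numbers, not from any real experiment): the variance of the mean of N measurements is the sum of the full covariance matrix divided by N^2, which the naive independent-errors formula underestimates when the covariances are positive.

```python
import numpy as np

# Variance of the mean of N correlated measurements: (1/N^2) * sum of the full
# covariance matrix.  The naive independent-errors formula keeps only the
# diagonal and underestimates the error when the covariances are positive.
N = 5
sigma2, rho = 1.0, 0.6  # per-measurement variance and pairwise correlation
C = sigma2 * (rho * np.ones((N, N)) + (1 - rho) * np.eye(N))

err_naive = np.sqrt(np.trace(C)) / N   # assumes independence
err_full = np.sqrt(C.sum()) / N        # accounts for the covariances

print(err_naive, err_full)             # ~0.45 versus ~0.82 here
```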
 
  • #73
zonde said:
Interpretation does not change prediction, right?

Of course it doesn't.

But what it does do is change how you view it.

And indeed there is an assumption made in the Ensemble interpretation, and even in the frequentist interpretation of probability, that each trial is independent.

It's from the law of large numbers:
http://en.wikipedia.org/wiki/Law_of_large_numbers
'the expected value is the theoretical probability of success, and the average of n such variables (assuming they are independent and identically distributed (i.i.d.)) is precisely the relative frequency'

In modern times, as I have mentioned previously, the frequentist interpretation of probability is justified by the Kolmogorov axioms, to remove any kind of circularity. As a byproduct this also justifies the Bayesian view, showing they are really different realizations of basically the same thing.
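
A quick numerical check of that quoted statement (a toy sketch with a fair coin, nothing more):

```python
import random

# The relative frequency of heads in n independent fair flips approaches 1/2.
random.seed(1)
for n in (10, 1_000, 100_000):
    heads = sum(random.random() < 0.5 for _ in range(n))
    print(n, heads / n)
```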

Thanks
Bill
 
  • #74
craigi said:
This would actually be pretty easy to construct a viable deterministic hidden variable theory for.
... Random particle decay. OK then, what sort of hidden variable (short of an inbuilt random number generator! ) do you think could achieve that?:devil:
 
  • #75
Random radioactive decay always has troubled me. I can handle the probabilistic nature of the wavefunction but decay has always bothered me.
 
  • #76
Superposed_Cat said:
Random radioactive decay always has troubled me. I can handle the probabilistic nature of the wavefunction but decay has always bothered me.
There are much more troublesome issues to be resolved, especially with regard to the foundations, and spontaneous decay isn't one of them.
 
  • #77
Jilang said:
... Random particle decay. OK then, what sort of hidden variable (short of an inbuilt random number generator! ) do you think could achieve that?:devil:

I'm not sure what's a plausible mechanism for particle decay, but there is no difficulty conceptually with assuming that it's deterministic. A sophisticated enough pseudo-random number generator, for example, is indistinguishable from a nondeterministic process.
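
For instance (a toy sketch only, not a physical model of decay; the half-life value is just taken from carbon-14 for concreteness), a seeded and hence fully deterministic generator produces "decay times" that look exponentially random yet come out identical on every run:

```python
import math
import random

half_life = 5730.0                   # carbon-14, in years (just for concreteness)
tau = half_life / math.log(2)        # mean lifetime

rng = random.Random(42)              # fully deterministic once the seed is fixed
decay_times = [rng.expovariate(1 / tau) for _ in range(5)]
print(decay_times)                   # looks random, but is identical on every run
```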

What's difficult to accomplish with hidden variables is, as someone already pointed out, entanglement between distant subsystems.
 
  • #78
stevendaryl said:
What's difficult to accomplish with hidden variables is, as someone already pointed out, entanglement between distant subsystems.

Well, it's certainly troubling me and the Cat!
 
  • #79
Jilang said:
... Random particle decay. OK then, what sort of hidden variable (short of an inbuilt random number generator! ) do you think could achieve that?:devil:

A pseudo random number generator.
http://en.wikipedia.org/wiki/Pseudorandom_number_generator

To be clear, I'm not arguing for a hidden variable theory, only that the decay of a particle is far from the greatest challenge for such a theory.
 
  • #80
Superposed_Cat said:
Random radioactive decay always has troubled me. I can handle the probabilistic nature of the wavefunction but decay has always bothered me.
They are not really 'particles' as you seem to imagine. The particle concept is a handy approximation. That's why spontaneous decay should be the last thing that bothers you. If this world were made of particles, atoms would have collapsed less than a second after they were formed - some thousands of years after the BB.
 
  • #81
Just thought I'd add here the clearest argument I've seen for "there is no problem in quantum mechanics". It will, of course, satisfy no one, but it is the clearest I've seen:

http://arxiv.org/abs/1308.5290
 
  • #82
I get that there is not really a problem per se with anything, I just have a minor problem with everything being based on probability. It used to be soothing to me last year, but now it bothers me, and that decay is literally based on randomness (well, exponential decay).
 
  • #83
Superposed_Cat said:
I get that there is not really a problem per se with anything, I just have a minor problem with everything being based on probability. It used to be soothing to me last year, but now it bothers me, and that decay is literally based on randomness (well, exponential decay).

I think once you get your head around the fact that determinism can emerge from indeterminism and vice versa, it doesn't seem that weird anymore. It happens in gases, weather systems and even economics, to name but a few.

At the moment, I'm not even sure that I see the concepts of determinism and indeterminism as all that distinct anymore. Perhaps all we really have is a continuous scale, with things that seem indeterministic at one end and things that seem deterministic at the other.
 
  • #84
I understand that, hence my previously being okay with it.
It's just that my friend and I were talking about the weirdness of things like the wavefunction, Euler's theorem (we don't like complex numbers), t = 0 of the big bang, etc. It just bothers me that there are certain things we can't know as a result of physics.

Before discovering physics I accepted that you couldn't know everything in practice, but I don't like that we can never know certain things regardless.
 
  • #85
Superposed_Cat said:
I understand that, hence my previously being okay with it.
It's just that my friend and I were talking about the weirdness of things like the wavefunction, Euler's theorem (we don't like complex numbers), t = 0 of the big bang, etc. It just bothers me that there are certain things we can't know as a result of physics.

Before discovering physics I accepted that you couldn't know everything in practice, but I don't like that we can never know certain things regardless.

Sometimes a question seems rational but may, in fact, be a meaningless question. That is not to say that it's wrong to ask it, only that the question happens to have a logical inconsistency already within it that may not be immediately apparent.

The simplest example that I can think of to illustrate this is the question:

"what's north of the North Pole?"

Initially you may think that "nothing" is the correct answer, but when you think about it, the question is presuming there can exist more north than the maximum amount of north.

Another example might be:

"A man is standing somewhere in a room. What's in his lap?"
[If you're not a native English speaker, then "lap" may not translate too well.]

Again, if you're to answer "nothing", you're complicit in validating the question. The correct response is "a standing man doesn't have a lap".

In neither of these cases is nature conspiring to prevent us from knowing something. There is nothing to know. It is simply that we're asking a meaningless question. The same is true in physics. Often we are so bound by our experiences of the everyday world that we struggle to accept that the concepts that we use in it are not universally applicable.
 
  • #86
PAllen said:
Just thought I'd add here the clearest argument I've seen for "there is no problem in quantum mechanics". It will, of course, satisfy no one, but it is the clearest I've seen:

http://arxiv.org/abs/1308.5290

Hmmm.

Interesting paper.

Have to say I agree with the following:
'Fifth, since neither decoherence nor any other mechanism select one particular outcome the whole “measurement problem” reduces to the question Why is there one specific outcome? which is asking Why are there randomly realized events? in the particular context considered. This harkens back to Sec. 1, where we noted that quantum theory cannot give an answer. In summary, then, the alleged “measurement problem” does not exist as a problem of quantum theory. Those who want to pursue the question Why are there events? must seek the answer elsewhere.'

Schlosshauer correctly identifies that as the key issue. Decoherence seems likely to answer all the other issues with the measurement problem - but that one it leaves untouched.

Is that a problem? Personally I don't know - I don't find it a worry - but I know others do.

What I do know is we have interpretations like DBB where it is not an issue at all and MWI where it has been replaced by something else. For me this suggests we have future surprises in store.

The following might be the beginnings of those surprises:
https://www.simonsfoundation.org/quanta/20130917-a-jewel-at-the-heart-of-quantum-physics/

Only time will tell.

Thanks
Bill
 
  • #88
PAllen said:
I have been interested in that from popular presentations like the one you link. Unfortunately (for me) there is a bunch I need to learn to try to understand this work in a meaningful way.

Indeed.

But, if what it reports is true - that they are replacing unitary evolution with something else - it could have big consequences for the measurement problem; but of course only time will tell.

Thanks
Bill
 
  • #89
bhobba said:
Of course it doesn't.

But what it does do is change how you view it.
But it does not change the assumption that each trial is independent, right?

bhobba said:
And indeed there is an assumption made in the Ensemble interpretation, and even the frequentest interpretation of probability, each trial is independent.
That contradicts the Einstein quote about the ensemble interpretation and QM not being applicable to individual systems (trials).
 
  • #90
zonde said:
But it does not change the assumption that each trial is independent, right?

It's the assumption of the law of large numbers.

zonde said:
That contradicts the Einstein quote about the ensemble interpretation and QM not being applicable to individual systems (trials).

I have zero idea why you say that. It's simply not true.

The logic is dead simple. By the law of large numbers we can find an ensemble associated with an observation where the proportion of outcomes is the probability. This follows from simply assuming the outcome can be described probabilistically. The state is not even introduced at this point. The Ensemble interpretation associates the state not with individual systems but with the ensemble. It's that easy. If you still don't get it I will have to leave it to someone else, because I simply can't explain it any better.

Thanks
Bill
 
  • #91
bhobba said:
I have zero idea why you say that. It's simply not true.

The logic is dead simple. By the law of large numbers we can find an ensemble associated with an observation where the proportion of outcomes is the probability. This follows from simply assuming the outcome can be described probabilistically. The state is not even introduced at this point. The Ensemble interpretation associates the state not with individual systems but with the ensemble. It's that easy. If you still don't get it I will have to leave it to someone else, because I simply can't explain it any better.
I understand that part perfectly well. The part I don't understand is what changes in that (Ballentine's) interpretation if you associate the state with an individual system. And as I see it, nothing changes if you say it's applicable to individual systems.
 
  • #92
zonde said:
I understand that part perfectly well. The part I don't understand is what changes in that (Ballentine's) interpretation if you associate the state with an individual system. And as I see it, nothing changes if you say it's applicable to individual systems.

Got it now.

You face the discontinuous collapse issue if you think the state applies to an individual system and is in some sense real - that's the key point both Einstein and Ballentine didn't make clear in their objection. If it's simply a level of confidence, as in the Bayesian view of probability, it doesn't matter one whit.

Thanks
Bill
 
  • #93
bhobba said:
Got it now.

You face the discontinuous collapse issue if you think the state applies to an individual system and is in some sense real - that's the key point both Einstein and Ballentine didn't make clear in their objection. If it's simply a level of confidence, as in the Bayesian view of probability, it doesn't matter one whit.

Thanks
Bill

I don't understand how the ensemble approach avoids the discontinuous collapse issue. I'm not trying to be argumentative, but I just don't see it.
 
  • #94
stevendaryl said:
I don't understand how the ensemble approach avoids the discontinuous collapse issue. I'm not trying to be argumentative, but I just don't see it.

It's dead simple.

The interpretation assumes an observation selects an element from the conceptual ensemble. This is the sole purpose of the state in that interpretation. Nothing physical changed - the state simply refers to a conceptualization that, with the observable, determines the proportion of the outcomes in the conceptual ensemble.

To spell it out in excruciating detail: given an observable and a state, you can calculate the probabilities of the possible outcomes of the observation. This determines an ensemble of outcomes where the proportion of each outcome is the probability of that outcome. The interpretation assumes the observation simply picks a random element of the ensemble, and that's the result. Since it all refers to just a conceptualization, nothing physical changed.

To be even clearer, apply it to throwing a coin. Its state is the vector (1/2, 1/2). Throw the coin and it picks a random entry from the ensemble that is half heads and half tails. The new state is now (0, 1) or (1, 0), depending on whether a head or a tail came up. The state discontinuously changed - but so what - it's just a conceptualization, an aid to figuring out the likelihood of an observation outcome.
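
In code, that bookkeeping is nothing more than this (a toy sketch of the coin example above, not a model of anything physical):

```python
import random

# The "state" is just a list of outcome probabilities.  An observation picks
# an element of the conceptual ensemble; the state is then updated
# discontinuously to record the known outcome.  Nothing physical is modelled.
state = [0.5, 0.5]                                        # P(heads), P(tails)
outcome = random.choices(["heads", "tails"], weights=state)[0]
state = [1.0, 0.0] if outcome == "heads" else [0.0, 1.0]
print(outcome, state)
```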

Thanks
Bill
 
  • #95
bhobba said:
It's dead simple.

The interpretation assumes an observation selects an element from the conceptual ensemble.

That makes perfect sense for classical ensembles. You have a collection of systems that agree on the macroscopic variables (say, number of particles, or total energy, or something), but the details of how particles are moving differ from system to system. When you measure some quantity that varies from one system to another, nothing changes; you're just discovering which system (or sub-ensemble) is the "real" world.

You could try the same tactic with quantum nondeterminism: The quantity that you are measuring--angular momentum, for example--doesn't have a definite value before the measurement, simply because all you know is that the real world is one system out of an ensemble, and different members of the ensemble have different values for that observable. After the measurement, you haven't done anything other than identify which system (or sub-ensemble) is the real world.

But to assume that the system had a definite value for angular momentum before you measured it is a hidden-variables assumption, isn't it? Why don't Bell-type inequalities rule that out?
 
  • #96
stevendaryl said:
But to assume that the system had a definite value for angular momentum before you measured it is a hidden-variables assumption, isn't it? Why don't Bell-type inequalities rule that out?

One could assume that a quantum system has definite values for all variables at all times, and the only reason for nondeterminism is classical ignorance. One way to frame the results of the various mathematical no-go theorems (Bell's theorem, the Kochen-Specker theorem, etc.) is that if observables have definite values, then our ignorance about those values cannot be described using measurable sets.
 
  • #97
vanhees71 said:
The question, why Born's rule holds true and why the description of nature on a fundamental level is indeterministic is not asked in the realm of physics. You may wonder about it and try to find a simpler or more intuitive set of postulates defining quantum theory (e.g., Weinberg discusses at length, whether Born's postulate can be derived from the other postulates, i.e., the usual kinematical and dynamical postulates in terms of the Hilbert-space formulation with observable operators and state operators, coming to the conclusion that it cannot be derived), but as long as there is no empirical evidence against quantum theory, you better keep this theory.
This got me thinking... If such a question is not asked in the realm of physics, in what realm should it be asked? I would not have thought that the philosophers would have the maths, and the mathematicians probably not the inclination...

I didn't realize there was any mystery about Born's postulate. Isn't it just the joint probability of something coming one way meeting something coming the other way in imaginary time?
 
  • #98
bhobba said:
Got it now.

You face the discontinuous collapse issue if you think the state applies to an individual system and is in some sense real - that's the key point both Einstein and Ballentine didn't make clear in their objection.

I would say that this quote clarifies Einstein's point:
"For if the statistical quantum theory does not pretend to describe the individual system (and its development in time) completely, it appears unavoidable to look elsewhere for a complete description of the individual system; in doing so it would be clear from the very beginning that the elements of such a description are not contained within the conceptual scheme of the statistical quantum theory." - http://www.marxists.org/reference/archive/einstein/works/1940s/reply.htm

I would say that basically the point is that the details (or interpretation) of collapse are outside the scope of QM, and in the statistical interpretation we speak only about relative frequencies without going into details.

Well, apart from that, it looks very much like a non-contextual (or intrinsic-to-the-particle) LHV approach, as he speaks about a complete description of the individual system as a "complete" version of quantum theory.
 
  • #99
stevendaryl said:
But to assume that the system had a definite value for angular momentum before you measured it is a hidden-variables assumption, isn't it? Why don't Bell-type inequalities rule that out?

This is the Achilles heel of the ensemble interpretation - it's an ensemble of system and observational apparatus combined. Nothing is assumed about the value of any observable prior to observation.

Ballentine, in his 1970 paper on it, more or less stated he was assuming some kind of hidden variable, so it was an ensemble of outcomes - but his book moved away from that.

This is the reason I hold to the ignorance ensemble interpretation with decoherence - you don't need this unnatural assumption.

Thanks
Bill
 
  • #100
stevendaryl said:
One could assume that a quantum system has definite values for all variables at all times,

You run into problems with Kochen-Specker. The only way to do it is hidden variables.

You can also assume it after decoherence - which is the essence of the ignorance ensemble interpretation with decoherence.

Thanks
Bill
 