bhobba said:
I have zero idea what you are trying to say. Being able to assign probabilities to events is pretty basic and if it was in any way not valid great swaths of applied mathematics from actuarial science to statistical mechanics would be in trouble - but they obviously aren't.
I had a look at the Ensemble interpretation article you referred to, and I must admit I found it anything but clear. The first section displays a quote by Einstein (reproduced in this thread in #57 by Zonde). I would be extremely surprised if, in the original context, Einstein used the word “system” with any meaning other than a “microscopic object”, by which I mean something less precise than, but in the same range as, a “particle”. Maybe somebody could clarify this point.
In the second section of the same article, the “system” is defined as a single run of a quantum experiment, whereas an ensemble-system is defined as an iterated run of that experiment. That looks pretty similar to what I described in my previous inputs, although the use made of the word “system” makes the text quite hard to digest. But then the key sentence, from which one should be able to understand whether and why the ensemble interpretation takes the wave-function to be a property of one single iteration, reads as follows:
“The ensemble interpretation may well be applied to a single system or particle, and predict what is the probability that that single system will have for a value of one of its properties, on repeated measurements”.
If “system” stands for “a single iteration of the experiment”, then the sentence actually assigns the “property” to the “repeated measurements” pattern, i.e. to the ensemble-system, and not to a single run. If “system” stands for a “microscopic system” (otherwise the wording “system or particle” makes no sense), then the sentence does not say whether the property is assigned to a single run or not. In either case the sentence offers no justification.
Further on, an example is presented in which a pair of dice, i.e. a physical object belonging to the experimental device, plays the role of the so-called “system”. The ambiguity is maximal.
Let's make things simple. If one admits that the probabilistic property assigned to the iterated experiment reflects an underlying probabilistic property assigned at a more elementary level (the single iteration), then there is no reason why this second probabilistic property should not in turn reflect a third probabilistic property standing one level below, whatever form it takes. This leads to an infinite regress, which can only stop once one specifies a level to which a deterministic property can be assigned. So the only realistic and credible alternative to stating that the property at the level of a single run is deterministic (which is what all physicists assume in the case of classical probabilities: each throw of a die has an outcome fully determined by its initial conditions, and the 1/6 frequencies merely encode our ignorance of them) is to accept that there is no property at all at this elementary level, so that the distribution pattern observed at the iterative level is a fundamental property which cannot be reduced to the appearance or synthesis of a more fundamental one.
I've explained in my previous input why and how the quantum formalism actually deals with transforming one distribution of relative frequencies into another distribution of the same nature, thanks to an appropriate mathematical representation based on the orientation of a unit vector, which makes the “probability amplitude” a physically empty concept. The quantum formalism deals with a probabilistic property defined at the iterative level, which is exactly what the experiments provide.
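To make that concrete with a purely illustrative, textbook example (mine, not anything claimed in the article): take a spin-1/2 particle prepared along a direction making an angle θ with the measurement axis. The state is just a unit vector,

$$|\psi\rangle = \cos(\theta/2)\,|{+}\rangle + \sin(\theta/2)\,|{-}\rangle, \qquad \langle\psi|\psi\rangle = 1,$$

and the relative frequencies accumulated over repeated runs are nothing but the squared components of that unit vector:

$$P(+) = \cos^{2}(\theta/2), \qquad P(-) = \sin^{2}(\theta/2), \qquad P(+) + P(-) = 1.$$

On this reading the “amplitudes” cos(θ/2) and sin(θ/2) are merely the components of the unit vector; rotating it (the unitary evolution) transforms one distribution of relative frequencies into another, which is all the formalism needs to do.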
Should there be a more fundamental property at a lower level, whatever that level might be, then the quantum formalism would no longer be considered the most fundamental theory dealing with quantum experiments. It would have to be replaced with a theory explicitly dealing with that lowest-level property, and that property would necessarily be deterministic.