bhobba said:
I would point out that the same could be said about flipping a coin and assigning probabilities to it. In modern times probability is defined by the Kolmogorov axioms, under which it is an abstract property assigned to an event (in your terminology, an iteration).
There are two aspects which require some attention.
First, one must clarify the rationale for assigning a probability (which is a form of property) to a discrete occurrence of an event-type, rather than assigning that probability to the event-type itself, i.e. to one category of events that may be observed when running the experiment. In the first case the probability is a property of the unique iteration of the experiment which produced the discrete outcome; in the second case it is a property of the iterative implementation of the experiment. What I said in my previous post is that the second case formalises what is experimentally true, whereas the first stems from a dogma which can be accepted or rejected. I do think that the second approach, which is minimal because it relies exclusively on experimental truth and on what can logically be derived from it, should be used as a reference whenever other approaches based on non-verifiable hypotheses lead to paradoxes.
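As a toy illustration of that distinction (a hypothetical coin-flip set-up; the function name relative_frequencies and the iteration counts are my own choices, not part of either formalism), the "probability" attached to an event-type emerges only as a relative frequency over an iterative run, and cannot be read off any single iteration:

```python
import random

def relative_frequencies(n_iterations, seed=0):
    """Run a two-outcome experiment n_iterations times and return,
    for each event-type, the fraction of iterations that produced it."""
    rng = random.Random(seed)
    counts = {"heads": 0, "tails": 0}
    for _ in range(n_iterations):
        outcome = rng.choice(["heads", "tails"])
        counts[outcome] += 1
    # The "probability" of an event-type is a property of the whole
    # iterative run, not of any single iteration.
    return {event: n / n_iterations for event, n in counts.items()}

print(relative_frequencies(10))      # unstable for a short run
print(relative_frequencies(100000))  # stabilises near 0.5 / 0.5
```

For a single iteration the counts are just 1 and 0, and no stable property can be extracted; the 0.5/0.5 pattern is a property of the run as a whole.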
Second, assuming the minimal approach is followed, there may be no compelling need to refer to "probabilities" at all. The "state vector", more exactly the orientation of a unit vector, represents an objective property of a quantum experiment run in an iterative way (i.e. the distribution of discrete events over a set of event-types). The quantum formalism transforms the orientation of that unit vector into another orientation of the same unit vector. The new orientation computed by the formalism relates to the objective property of a modified experiment (the distribution pattern remaining over the same set of event-types), or of a combination of such experiments, still assuming an iterative run of that set-up. It should be noted that, in a manifold, the orientation of a unit vector (i.e. a list of direction cosines) is the canonical representation for a distribution, since the squares of the cosines sum to 1. Hence the choice of a vectorial representation for quantum theory implies that the formalism manipulates/transforms a set of cosines (the so-called "probability amplitudes") instead of their squared values, which account for the relative frequencies. (I'm not aware of any alternative, simple explanation for this peculiar feature of the quantum formalism, often presented as a mystery, but I'd be keen to learn of one.) Ultimately, references to the "probability" concept, and more significantly to the "probability amplitude" misconception, can be dropped: the former only stands for "relative frequency observed in the iterative mode", whereas the latter has lost any physical significance in the proposed approach.
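To make the "list of cosines" reading concrete, here is a minimal numerical sketch (the three event-types, the angles, and the plain rotation standing in for a unitary map are all my own illustrative choices): the components of a unit vector are direction cosines, their squares sum to 1 and can therefore play the role of relative frequencies, and a norm-preserving transformation yields a new orientation whose squared components again form a distribution over the same event-types:

```python
import math

# Orientation of a unit vector over three event-types: its components
# are direction cosines, so their squares automatically sum to 1.
state = [math.cos(0.3),
         math.sin(0.3) * math.cos(1.1),
         math.sin(0.3) * math.sin(1.1)]

def distribution(v):
    """Squared components = relative frequencies over the event-types."""
    return [c ** 2 for c in v]

def rotate_xy(v, theta):
    """A rotation (a toy stand-in for a unitary transformation) maps one
    orientation of the unit vector to another; the norm is preserved."""
    x, y, z = v
    return [x * math.cos(theta) - y * math.sin(theta),
            x * math.sin(theta) + y * math.cos(theta),
            z]

print(distribution(state), sum(distribution(state)))          # sums to 1.0
new_state = rotate_xy(state, 0.7)
print(distribution(new_state), sum(distribution(new_state)))  # still sums to 1.0
```

The sketch only illustrates the geometric point: the formalism transforms the cosines themselves, while what is observed in the iterative mode are their squares.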
bhobba said:
This is the view taken by the Ensemble interpretation, and it is what the state applies to: a conceptualization of a large number of iterations, events, etc., such that the proportion is the probability predicted by the Born rule. When one makes an observation, in that interpretation, it is selecting an element from that ensemble; and wave-function collapse, applying only to this conceptual ensemble and to nothing in any sense real, is of no concern at all.
I'm sorry, I don't understand this last sentence, in particular what you say about the link between the occurrence of an event and the collapse of the wave function. What I said is that a non-continuous modification of the experimental device is likely to translate into a non-continuous evolution of the observed distribution for the new device, as compared to the initial distribution. There is no such thing as a collapse of the wave function triggered or induced by the occurrence of a discrete event. The so-called wave function is a property of an experiment, not a property of a "system", and it is neither a state of our knowledge nor a state of our belief.
bhobba said:
There is another view of probability that associates this abstract thing, probability, as defined in the Kolmogorov axioms, with a subjective confidence in something. This is the Bayesian view and is usually expressed via the so-called Cox axioms, which are equivalent to the Kolmogorov axioms. This view leads to an interpretation along the lines of Copenhagen, which takes the state as a fundamental property of an individual system, but gives it a subjective-confidence reading instead.
I don't think the formalism (Kolmogorov, Bayes, ...) determines whether the probability should be interpreted as a belief, as some knowledge about what may happen, or as an objective property. Only the correspondence you explicitly establish between the things you are dealing with and the mathematical objects of the probability formalism defines what the probability you compute refers to.
In the minimal approach I recommend following, "probability" refers to an objective property of a quantum experiment, and it actually means "relative frequency observed in the iterative mode".
Thanks.