The Bayesian interpretation is straightforward. It just means that I am not certain that it is going to rain on Thursday, but I think it is likely. More operationally, if I had to bet a dollar either that it would rain on Thursday or that I would get heads on a single flip of a fair coin, then I would rather take the bet on the rain.
Not necessarily. We are certainly uncertain about random things, but we are also uncertain about some non-random things. Both can be represented as a distribution from which we can draw samples. So the mere act of drawing from a distribution does not imply randomness.

A good example is a pseudorandom number generator. There is nothing actually random about it. But we are uncertain of its next value, so we can describe it using a distribution and draw samples from it.
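To make that concrete, here is a minimal sketch of such a generator (the constants are the classic Numerical Recipes LCG parameters, used purely for illustration):

```python
# A linear congruential generator (LCG): a fixed deterministic recurrence,
# yet its outputs look like draws from a uniform distribution on [0, 1).
def lcg(seed, n, m=2**32, a=1664525, c=1013904223):
    """Return n pseudorandom values in [0, 1) from a deterministic recurrence."""
    x = seed
    out = []
    for _ in range(n):
        x = (a * x + c) % m
        out.append(x / m)
    return out

# Same seed -> exactly the same "random" sequence: nothing random here.
assert lcg(42, 5) == lcg(42, 5)

# Yet the empirical frequencies are close to uniform, so it is natural to
# describe our uncertainty about the next value with a distribution.
vals = lcg(42, 100_000)
frac_below_half = sum(v < 0.5 for v in vals) / len(vals)
print(frac_below_half)  # close to 0.5
```

The recurrence is fully determined by the seed, yet until we compute the next value we are uncertain of it, and the uniform distribution describes that uncertainty well.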

Isn't that the same in frequentist thinking?

Isn’t what the same?

Isn't it the same in frequentist thinking that randomness can arise from determinism, i.e. from our ignorance of the details of a deterministic process?

"Frequentist thinking" is as vague a category of thinking as "liberal thinking" or "conservative thinking". R.A. Fisher is regarded as one of the most famous frequentists. In the article https://www.cmu.edu/dietrich/philos...shers Fiducial Argument and Bayes Theorem.pdf we find this quotation from Fisher:

This fundamental requirement for the applicability to individual cases of the concept of classical probability shows clearly the role both of well-specified ignorance and of specific knowledge in a typical probability statement. . . . The knowledge required for such a statement refers to a well-defined aggregate, or population of possibilities within which the limiting frequency ratio must be exactly known. The necessary ignorance is specified by our inability to discriminate any of the different sub-aggregates having different frequency ratios, such as must always exist.

So we see a Frequentist discussing ignorance and knowledge in connection with the concept of probability. That view may not be statistically typical of the population of Frequentists, but it is a view that would allow probabilities to be assigned to the population of numbers generated by a deterministic random number generator, provided that when we take samples we don't know how to distinguish sub-populations whose statistical characteristics differ from those of the parent population.

Thanks @Stephen Tashi, that is a good quote.
Isn't it the same in frequentist thinking that randomness can arise from determinism, i.e. from our ignorance of the details of a deterministic process?
So Fisher clearly thinks that it is not necessary to establish “randomness” but merely to have a sample population with a well-defined frequency. That fits well with the frequentist definition of probability as a population frequency. One thing that Fisher doesn’t address there is sampling individual values from the population: can you still use frequentist probability if the sampling is non-random (e.g. a random number generator with a specified seed)? I suspect that Fisher would say yes, but I am not sure that all prominent frequentists would agree.
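A small sketch of that question (illustrative only): with a specified seed the "sampling" is fully deterministic and exactly reproducible, yet the long-run frequency ratio of an event is still perfectly well defined, which seems to be all Fisher's criterion requires.

```python
import random

# Deterministic "sampling": a fixed seed makes every run identical.
rng = random.Random(2024)  # specified seed -> non-random sampling
draws = [rng.random() for _ in range(200_000)]

# Yet the long-run frequency ratio of the event "draw < 0.3" is
# well defined and stabilizes near 0.3.
event_freq = sum(d < 0.3 for d in draws) / len(draws)
print(event_freq)
```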

So potentially, depending on the individual, there is not much difference between the frequentist and Bayesian interpretation in a deterministic population where we have ignorance.

Where you get a difference is in situations where there is simply no sample population. For example, ##G## or ##\alpha##. Those quantities are not a population, there is only one value but we are uncertain about it. With a frequentist approach ##P(\alpha=1/137)## is somewhere between weird and impossible, whereas a Bayesian would have no qualms about such an expression.
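For instance, a Bayesian can simply encode uncertainty about a one-off constant as a belief distribution and compute interval probabilities from it. A minimal sketch, where the mean and spread are illustrative stand-ins rather than measured values:

```python
from statistics import NormalDist

# A belief distribution for 1/alpha. There is only one true value; the
# distribution represents our uncertainty about it, not a population.
# mu and sigma here are illustrative, not actual measurement results.
belief = NormalDist(mu=137.036, sigma=0.005)

# P(137.03 < 1/alpha < 137.04) under this belief -- an ordinary Bayesian
# statement, with no sample population anywhere in sight.
p = belief.cdf(137.04) - belief.cdf(137.03)
print(round(p, 3))
```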

Would one accept, as another piece of evidence that many frequentists consider randomness to arise from ignorance, the terminology in quantum mechanics in which the density operator is sometimes "ignorance interpretable" and at other times "not ignorance interpretable"? In other words, it shows that standard quantum mechanics does use the idea that probability arises from ignorance, i.e. some cases in classical and quantum mechanics are "ignorance interpretable". Here I'm assuming that most physics has used the frequentist interpretation of probability.

Here are two examples from Schlosshauer's review https://arxiv.org/abs/quant-ph/0312059.

"It is a well-known and important property of quantum mechanics that a superposition of states is fundamentally different from a classical ensemble of states, where the system actually is in only one of the states but we simply do not know in which (this is often referred to as an “ignorance-interpretable,” or “proper” ensemble)."

"Most prominently, the orthodox interpretation postulates a collapse mechanism that transforms a pure-state density matrix into an ignorance-interpretable ensemble of individual states (a “proper mixture”)."
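That distinction shows up directly in the density matrices. A rough sketch (mine, not from the paper): the superposition ##|+\rangle = (|0\rangle + |1\rangle)/\sqrt{2}## and the ignorance-interpretable 50/50 mixture of ##|0\rangle## and ##|1\rangle## have identical diagonals, but different off-diagonal coherences and different purity ##\mathrm{Tr}(\rho^2)##.

```python
# 2x2 density matrices as plain nested lists (no external libraries).
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def trace(A):
    return A[0][0] + A[1][1]

rho_pure  = [[0.5, 0.5], [0.5, 0.5]]   # |+><+|: a superposition (pure state)
rho_mixed = [[0.5, 0.0], [0.0, 0.5]]   # 0.5|0><0| + 0.5|1><1|: proper mixture

# Purity Tr(rho^2): 1 for a pure state, strictly less for a proper mixture.
print(trace(matmul(rho_pure, rho_pure)))    # 1.0
print(trace(matmul(rho_mixed, rho_mixed)))  # 0.5
```

Both matrices give the same statistics for a computational-basis measurement, but only the mixed one admits the "the system really is in one of the states, we just don't know which" reading.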

How does a frequentist rationalise irrational probabilities? ;)

B. Efron, "Why Isn't Everyone a Bayesian?", The American Statistician, Vol. 40, No. 1 (Feb. 1986), pp. 1–5.
http://www.cs.ru.nl/P.Lucas/teaching/CI/efron.pdf

Just a note that "incoherent" is nowadays the more usual technical term in English.

How does a frequentist rationalise irrational probabilities? ;)
Maybe I'm dense, but that seems easy. :)

Touché