fbs7 said:
I was tossing and turning in bed, unable to sleep, and now I know why: the concept of "random" is finally falling into place in my head. What a trip this was! What makes sense to me is that there are really three concepts of "random", and people use the same word for completely different things:
(a) One is the well-structured, axiomatic, abstract mathematical structure that defines and studies "probability". This is not based on any actual dice rolling or some ghost taking cards out of a deck; it is instead a logical system built around abstract concepts like a probability space and a measurable space. As such, it's rather beautiful. Here "random" doesn't really mean anything, since it's axiomatic; we could just as well call it "bananility" instead of "probability" and the mathematical structure would be exactly the same.
(b) Another is the mysterious realm of quantum mechanics, where for some crazy reason real objects do seem to exactly follow laws derived from the abstractions above. Here "random" really means random; there's no other way to describe it. Why quantum objects behave that way is a mind-blowing question, and I suspect it's one of the greatest mysteries of physics, but thankfully everyday dudes like me don't have to worry about it and have no use for it; we can just have faith in physicists to get stuff to work by using those rules, and hopefully not blow the planet to pieces while doing so.
(c) Another is the macroscopic realm we all handle every day. Here "random" really means unknown. There's nothing truly random in the macroscopic world. We think of the cards in the deck as random just because they are turned face down, and if we could calculate exactly all the forces acting on the dice, we could deterministically predict which number would be rolled. One could despair at the unknown, but by making assumptions (like the deck not missing any cards and every card being equally probable) and by applying that mathematical framework, we can make guesses and estimates of outcomes which, if we assumed right, will in large numbers be close to the mathematical predictions.
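The "in large numbers" claim in (c) is easy to see in a quick simulation (my own illustration, not from this discussion): roll a fair six-sided die many times and compare the observed frequencies to the theoretical 1/6. Fittingly, the "random" numbers here are themselves generated by a deterministic algorithm, exactly the situation (c) describes.

```python
import random
from collections import Counter

random.seed(0)  # a deterministic PRNG: "random" here really means "unknown", as in (c)

n = 600_000
counts = Counter(random.randint(1, 6) for _ in range(n))

for face in range(1, 7):
    freq = counts[face] / n
    print(f"face {face}: observed {freq:.4f} vs predicted {1/6:.4f}")
```

With this many rolls, every observed frequency lands within a fraction of a percent of 1/6, which is all the "mathematical prediction" promises.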
What's awesome is that (c) is routinely used by billions of people. It's actually very amazing if one thinks about it: regular joes use it every day, for example saying "wow, what a hail-mary pass -- he'll never be able to repeat that!" to express how tiny P(x) is, without really knowing why that's correct. We joes don't care about, and do not use, anything about tensors or Hilbert spaces or the 350-millionth digit of pi, but we use probability as commonly as we use algebra to check the bills.
For that reason I now have renewed respect for probability theory; given its widespread use, I now think that field is one of the Titans of mathematics, with the same practical utility as algebra and geometry! Once again, thanks all for this very inspiring discussion!
The late physicist E. T. Jaynes wrote a provocative book, "Probability Theory: The Logic of Science" (Cambridge University Press, 2003), in which he essentially rejects the very idea of "randomness". That's right: a large probability book by somebody who does not believe in randomness! For Jaynes (and several others---maybe mostly physicists), probability is associated with a "degree of plausibility". He shows that, using some reasonable axioms about how plausibilities combine, you end up with multiplication laws like ##P(A \cap B) = P(A) \, P(B|A)##, etc. His book essentially stays away from the whole "Kolmogorov" measure-theoretic way of doing probability, and so can only treat problems that do not involve things like ##P(\lim_{n \to \infty} A_n)## (but can certainly deal with things like ##\lim_{n \to \infty} P(A_n)##).
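That multiplication law is easy to check against frequencies. A small sketch (the dice events are my own example, not Jaynes's): with empirical counts the product rule is literally an arithmetic identity, since (ab/n) = (a/n) · (ab/a), and both sides also land near the theoretical value.

```python
import random

random.seed(1)

# Roll two fair dice; A = "first die is even", B = "the sum is 7".
# Theory: P(A) = 1/2, P(B|A) = 1/6, so P(A and B) = 1/2 * 1/6 = 1/12.
n = 200_000
a_count = ab_count = 0
for _ in range(n):
    d1, d2 = random.randint(1, 6), random.randint(1, 6)
    if d1 % 2 == 0:
        a_count += 1
        if d1 + d2 == 7:
            ab_count += 1

p_a = a_count / n
p_b_given_a = ab_count / a_count   # conditional frequency within A
p_ab = ab_count / n

print(f"P(A&B) = {p_ab:.4f}  vs  P(A)P(B|A) = {p_a * p_b_given_a:.4f}  (theory: {1/12:.4f})")
```

For counts the two sides agree by construction; Jaynes's point is that the same rule can be derived for degrees of plausibility without ever mentioning frequencies.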
In his third chapter entitled "Elementary Sampling Theory", he says on pp. 73-74 (after developing the basic probability distributions):
"In the case of sampling with replacement, we apply this strategy as follows.
(1) Suppose that, after tossing the ball in, we shake up the urn. However complicated the problem was initially, it now becomes many orders of magnitude more complicated, because the solution now depends on every detail of the precise way we shake it, in addition to all the factors mentioned above.
(2) We now assert that the shaking has somehow made all these details irrelevant, so that the problem reverts back to the simple one where the Bernoulli urn rule applies.
(3) We invent the dignified-sounding word 'randomization' to describe what we have done. This term is, evidently, a euphemism whose real meaning is: deliberately throwing away relevant information when it becomes too complicated for us to handle."
"We have described this procedure in laconic terms, because an antidote is needed for the impression created by some writers on probability theory, who attach a kind of mystical significance to it. For some, declaring a problem to be "randomized" is an incantation with the same purpose and effect as those uttered by an exorcist to drive out evil spirits; i.e., it cleanses the subsequent calculations and renders them immune to criticism. We Agnostics often envy the True Believer, who thus acquires so easily that sense of security which is forever denied to us."
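Jaynes's point in step (2) can be made concrete with a small simulation (my own sketch, not from the book): start the urn in the same fully known order every trial, then "shake" it with a shuffle. The shuffle throws away the information about the initial arrangement, and the simple Bernoulli urn answer reappears.

```python
import random
from collections import Counter

random.seed(2)

# Urn with 9 white balls and 1 red ball, in a completely known starting order.
urn = ["white"] * 9 + ["red"]

n = 100_000
draws = Counter()
for _ in range(n):
    balls = urn[:]          # identical, fully known arrangement every trial
    random.shuffle(balls)   # the "shaking" step Jaynes describes
    draws[balls[0]] += 1    # draw the top ball

print(f"P(red) observed: {draws['red'] / n:.4f}  (urn rule: 0.1000)")
```

Of course, this only works because `random.shuffle` is designed to make every ordering equally likely; Jaynes's complaint is precisely that calling a physical shake "randomization" asserts this property rather than deriving it.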
Jaynes goes on some more about this issue, often revisiting it in subsequent chapters. Lest you think that his book is just "hand-waving", be assured that it is satisfyingly technical, presenting most of the usual equations you will find in other books at the senior-undergraduate and perhaps beginning-graduate level (at least in "applied" courses). The man is highly opinionated, and I do not subscribe to everything he posits, but I find the approach interesting and refreshing, even though it is one I, personally, would not embrace. He does end the book with a long appendix outlining other approaches to probability, including the usual measure-theoretic edifice.