Here is a thought/question.
There are simple algorithms that generate infinite sequences of rational numbers converging to π. By going far enough out in such a sequence, one obtains a rational number whose first n decimal places agree with those of π. So it would seem that the decimal expansion of π is not random in some sense, because it is computable.
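To illustrate what I mean (this is my own sketch, not a canonical construction), here is one such algorithm using Machin's formula π/4 = 4·arctan(1/5) − arctan(1/239), with exact rational arithmetic so that every partial sum is itself a rational number:

```python
from fractions import Fraction

def arctan_series(x, terms):
    """Partial sum of arctan(x) = x - x^3/3 + x^5/5 - ...,
    computed with exact rational arithmetic."""
    total = Fraction(0)
    for k in range(terms):
        total += Fraction((-1) ** k) * x ** (2 * k + 1) / (2 * k + 1)
    return total

def pi_approx(terms):
    """Rational approximation of pi via Machin's formula:
    pi/4 = 4*arctan(1/5) - arctan(1/239)."""
    return 4 * (4 * arctan_series(Fraction(1, 5), terms)
                - arctan_series(Fraction(1, 239), terms))

# Each additional term contributes more than one further
# correct decimal digit, since the arctan(1/5) terms shrink
# by a factor of 25 each step.
approx = pi_approx(15)
```

Here `pi_approx(terms)` returns a `Fraction`, i.e. an exact rational, and taking `terms` larger makes its decimal expansion agree with that of π to more and more places.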
It would also seem that for most numbers there is no such algorithm - proof? presumably because there are only countably many algorithms but uncountably many reals - so most numbers are "truly random" in this sense, but π is not.
If the decimal expansion of π were statistically indistinguishable from a random sequence, I would be reminded of chaotic mechanical systems, which appear statistically random but are in fact completely determined.
- I imagine that the concept of an algorithm needs defining; perhaps an inductive rule that requires only a finite amount of initial data. But I am out of my element here.
- I just found this Wikipedia link on what are called "computable numbers". The definition involves Turing machines. Perhaps someone can explain what it means, but it seems to say something like: there exists a computer program that generates a sequence of rationals converging to the number.
http://en.wikipedia.org/wiki/Computable_number
In any case, by this definition π is "computable", as is e, but in total there are only countably many computable numbers. Interestingly, the computable numbers form a subfield of the real numbers.