Are the digits of pi truly random?

  1. Aug 31, 2005 #1
    My son's computer science teacher claims that there is no way to devise a computer algorithm that can generate a truly random sequence of numbers (only a pseudo-random sequence that ultimately repeats). Yet there are algorithms with a finite number of steps that generate the decimal digits of irrational numbers, such as pi. Statistical tests would seem to show that these digits are truly random (e.g., any digit is statistically independent of all the previous digits), but can this be rigorously proved? If so, this would seem to contradict the teacher's statement. Is he wrong?
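
    For concreteness, here is a minimal sketch of the kind of generator the teacher presumably has in mind: a linear congruential generator (the constants below are glibc-style values, chosen purely for illustration). Its state space is finite, so its output must eventually cycle.

```python
# A tiny linear congruential generator: x_{n+1} = (a*x_n + c) mod m, with
# glibc-style constants chosen purely for illustration.  The state space is
# finite (at most m states), so the output sequence must eventually repeat.
def lcg(seed, a=1103515245, c=12345, m=2 ** 31):
    x = seed
    while True:
        x = (a * x + c) % m
        yield x

gen = lcg(seed=1)
print([next(gen) % 10 for _ in range(20)])   # looks random, but is fully determined
```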
  3. Aug 31, 2005 #2


    Science Advisor
    Gold Member

    Part of the problem may be practicality. Random number generators, as used in practice, all rely on algorithms that generate very long sequences which eventually repeat. Although you are probably right about pi (I am not sure its randomness for statistical purposes has ever been proven), it is not practical to use in simulations: after a while, computing further digits becomes very time consuming.
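
    To make that cost concrete, here is a rough sketch (not from the original post) of computing decimal digits of pi from Machin's formula, pi = 16 arctan(1/5) - 4 arctan(1/239), with plain integer arithmetic; the working precision, and hence the work per term, grows with the number of digits requested.

```python
# Rough sketch: decimal digits of pi from Machin's formula,
#   pi = 16*arctan(1/5) - 4*arctan(1/239),
# using only integer arithmetic.  The working precision, and therefore the
# cost of each term, grows with the number of digits requested.
def arctan_inv(x, scale):
    """Approximate arctan(1/x) * scale as an integer (alternating series)."""
    power = scale // x              # (1/x)^1, scaled
    total = power
    n, sign = 3, -1
    while power:
        power //= x * x             # next odd power of 1/x
        total += sign * (power // n)
        n, sign = n + 2, -sign
    return total

def pi_digits(n):
    guard = 10                      # extra digits to absorb truncation error
    scale = 10 ** (n + guard)
    pi = 16 * arctan_inv(5, scale) - 4 * arctan_inv(239, scale)
    return str(pi)[:n + 1]          # "3" followed by the first n decimal digits

print(pi_digits(50))
```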
  4. Aug 31, 2005 #3

    matt grime

    Science Advisor
    Homework Helper

    The digits of pi are not "random" ("random" being a slightly ambiguous term, by the way); they are *probably* normal, which is to say, roughly, that any given string of n digits occurs in the expansion with the expected long-run frequency, namely about 10^-n. This has not been proven yet. But in general, no, there is no way to generate truly random numbers. For a start, computers tend to operate only with rational numbers (some programs deal symbolically with irrational ones), and, as we know, if we restrict to the interval [0,1], then almost all numbers are irrational, i.e. a number picked at random from that interval is rational with probability zero, yet on a computer it will be rational with probability 1.
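
    As a rough empirical check of that frequency idea (a sketch using the third-party mpmath library only as a convenient source of digits; matching frequencies are evidence consistent with normality, not a proof of it):

```python
# Count single-digit frequencies in the first 10000 decimal digits of pi.
# mpmath is used here only as a convenient source of digits; roughly equal
# frequencies are consistent with normality, but prove nothing.
from collections import Counter
from mpmath import mp

mp.dps = 10010                                # working precision in digits
digits = str(mp.pi).replace(".", "")[:10000]  # "3141592653..." as a string

counts = Counter(digits)
for d in "0123456789":
    print(d, counts[d] / len(digits))         # each should hover near 0.1
```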

    However, it is possible that your son's teacher was speaking of the practical side here, and indeed there is no way to generate truly random numbers even from a finite set, though we are able to make things that are "good enough". Often this is some output based upon the time between key strokes, measured on a 10^{-something bigger than we can think of} second time scale on your computer.

    And, whilst we could in theory pick a random point in the decimal expansion of some normal number (I guess some are known), how would you decide which portion to take, randomly? And what length? And how long would it take, even if we hadn't just killed the idea stone dead by asking how one picks the point we look at?
  5. Sep 3, 2005 #4


    Staff Emeritus
    Science Advisor

    As Matt Grime pointed out, the digits of pi are NOT "random" in the strict sense: they are completely determined by the definition (and the various formulas that follow from it). The concept of a "random number" is an odd one anyway. No individual number is random! It would be better to say "randomly generated" number (and I confess I'm not exactly certain what that means!). But your son's teacher is right. Computers are deterministic machines. Whatever program you put in, if run again with exactly the same input, will produce exactly the same output. They cannot produce truly "randomly generated" numbers. With a little work, they can produce very long strings of numbers with such complex patterns that they look random, but they aren't random in any strict sense.
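
    For instance, re-seeding Python's built-in generator with the same seed reproduces its output exactly:

```python
# Same program, same seed, same output: Python's standard-library generator
# (a Mersenne Twister) is completely deterministic once the seed is fixed.
import random

for run in range(2):
    random.seed(12345)
    print(run, [random.randint(0, 9) for _ in range(10)])
# Both runs print exactly the same list of "random-looking" digits.
```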
  6. Sep 3, 2005 #5
    I think "random" these days means that the "best formula" giving the nth digit in the sequence is not significantly shorter than the list of the first n numbers in the sequence itself. For example, if polynomial interpolation is used on the sequence {3,1,4,1,5,9,...}, in the form {(1,3),(2,1),(3,4),(4,1),(5,5),(6,9),...}, then the interpolating polynomial you get (which in any case does not give the nth digit of pi beyond the points you fed it) is *longer* than the sequence used to encode it. There are, however, formulas that can generate the nth digit of pi such that, for large enough n, the formula (best or not) is much *shorter* than the list of the first n digits. Compare that to the sequence {1,1,1,1,...}, which can be encoded very briefly by the formula f(n)=1, much shorter than a list of n ones for large n.
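
    As a very crude illustration of that "shortest description" idea (a sketch only: compressed length is just a stand-in for the length of the best formula, and the mpmath library is assumed merely as a source of digits), compare how well a constant sequence and the leading digits of pi compress:

```python
# A very crude stand-in for "length of the best formula": the compressed size
# of the sequence.  The constant sequence collapses to a few dozen bytes; the
# digits of pi shrink only about as much as their ten-symbol alphabet allows,
# even though a genuinely short *program* for them exists.  Compressed length
# is an illustration only, not Kolmogorov complexity.
import zlib
from mpmath import mp     # assumed here only as a source of digits of pi

mp.dps = 5010
pi_digits = str(mp.pi).replace(".", "")[:5000].encode()
ones = b"1" * 5000

for name, data in [("pi digits", pi_digits), ("all ones", ones)]:
    print(name, len(data), "->", len(zlib.compress(data, 9)))
```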

    That's all well and good, but how do you prove that a given sequence is random?

    How do you prove that the best formula that generates the sequence is no shorter than the list of the first n numbers in the sequence, for large n?

    AFAIK, you can prove that random sequences exist, but proving that a given sequence is random is *hard*. In fact, the set of random sequences has the same cardinality as the set of nonrandom ones.
  7. Sep 3, 2005 #6


    Science Advisor
    Homework Helper

    Is the primary reason for not using quantum states as perfect random number generators theoretical or practical?
  8. Sep 4, 2005 #7


    Staff Emeritus
    Science Advisor
    Gold Member

  9. Sep 4, 2005 #8
    I think there is a true random number generator that is based on quantum mechanics:

    You let a single photon be incident on a 50-50 beamsplitter (a mirror that reflects or transmits the photon, each with 50% probability). When the photon is reflected you record a 1; when it is transmitted you record a 0.
    Repeating this many times gives a sequence, e.g. 1001010110101 and so on.
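
    Just to show how the bit stream would be assembled, here is a purely classical mock-up (the photon_is_reflected stand-in below is faked with a pseudo-random generator, so unlike the real device its output is not truly random):

```python
# A purely classical mock-up of the beamsplitter scheme.  photon_is_reflected
# is faked with Python's pseudo-random generator, so this output is NOT truly
# random; a real device would read an actual single-photon detector instead.
import random

def photon_is_reflected():
    return random.random() < 0.5          # stand-in for the 50-50 beamsplitter

bits = "".join("1" if photon_is_reflected() else "0" for _ in range(32))
print(bits)                               # e.g. 10010101101010...
```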

    There are already devices in a box:

    And here commercially available:
    Last edited: Sep 4, 2005
  10. Sep 4, 2005 #9

    I'd like to know more about this. I'm sure it has often been raised that maybe we're just too stupid to find a pattern. I'd like to see the proof that the binary sequence generated by these quantum events (like the one mentioned above) is random. Are they random in the exact same sense that flipping a coin is random or in the sense that I listed above: that the "best formula" to fit the binary sequence is not much smaller than the sequence itself?

    I ask because if I made up a function, I bet I could make it so that the numbers were unpredictable, but that doesn't make the numbers part of a random sequence.

    I guess what I'm trying to wrap my head around is the math version of the statement, "these processes are completely random."
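
    For what it's worth, here is a sketch of about the simplest statistical check there is, the monobit frequency test; note that an obviously non-random periodic string passes it, which is exactly the problem with trying to "prove" randomness by testing:

```python
# About the simplest statistical check: the "monobit" frequency test.  Under
# the hypothesis of independent fair bits, the count of ones is roughly normal
# with mean n/2 and variance n/4, so a large |z| is evidence against
# randomness.  Passing the test proves nothing: the periodic string below
# passes with z = 0.
import math

def monobit_z(bits):
    """z-score for the count of ones in a string of '0'/'1' characters."""
    n = len(bits)
    ones = bits.count("1")
    return (ones - n / 2) / math.sqrt(n / 4)

print(monobit_z("10010110" * 128))   # 0.0 -- clearly non-random, yet it passes
```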

    edit: Just for drill, I looked up "random" on Wikipedia, and the first sentence is:
    Last edited: Sep 4, 2005