A proof that a computer cannot generate a truly random number?

  1. May 11, 2013 #1
    I'm thinking about how to do a proof that a computer cannot generate a truly random number.

    Attempt. Let Ω = {ω1, ω2, ..., ωn}, a subset of ℝ, be all the numbers represented on a certain machine. A random number generator rand(), because its output is dependent on how many times it has been called, is analogous to a function f:N→Ω (where I'm letting N denote the natural numbers). We need f to have the following properties:

    (1) The long-run frequency of any ωi appearing is 1/n. That is, for any i ∈ {1, ..., n}, if we let gi(f(k)) = 1 if f(k) = ωi, and gi(f(k)) = 0 if f(k) ≠ ωi, then

    [itex]\lim_{N\to\infty} \frac{1}{N}\sum_{k=1}^{N} g_i(f(k)) = \frac{1}{n}.[/itex]

    (2) We need independence .... I'm trying to think how to formalize this statement.
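
    For property (1), here is a minimal empirical sketch in Python (the toy linear congruential generator, its constants, the seed, and the sample size are illustrative assumptions, not part of the argument):

[code]
# Sketch: empirical check of property (1) for a toy generator.
# The LCG constants, seed, n, and sample size are illustrative assumptions.

n = 8                # size of Omega = {omega_1, ..., omega_n}, here {0, ..., 7}
state = 12345        # arbitrary seed

def rand():
    """Toy linear congruential generator; the k-th call plays the role of f(k)."""
    global state
    state = (1103515245 * state + 12345) % 2**31
    return state >> 28          # top bits give a value in {0, ..., 7}

N = 1_000_000
counts = [0] * n
for _ in range(N):
    counts[rand()] += 1

# The long-run frequency of each omega_i should approach 1/n = 0.125.
print([round(c / N, 4) for c in counts])
[/code]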
     
  3. May 12, 2013 #2

    chiro

    Science Advisor

    Hey Jamin2112.

    The first thing you have to do is define random.

    The best way that I can think of is to assume that all the possible distributions (i.e. over the subsets of the power set that correspond to the events) have maximum entropy. This is pure randomness.

    If you can prove that there exists a situation that does not have this property, then you have proved that a computer may not generate a purely random process.

    If you are looking at a particular algorithm or class of generators, you need to take that process and prove that the entropy characteristics of the process are lower than the maximum bounds.
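
    If it helps, here is a rough numerical sketch of that last step in Python (the generator, alphabet size, and sample size are arbitrary assumptions): estimate the empirical entropy of the output and compare it with the maximum log2(n). This only probes the one-symbol marginal distribution, so it is a first check rather than the full process-level entropy argument.

[code]
# Sketch: empirical single-symbol entropy of a generator vs. the maximum log2(n).
# The generator (Python's Mersenne Twister), n, and sample size are assumptions.
import math
import random
from collections import Counter

n = 16
samples = [random.randrange(n) for _ in range(100_000)]

counts = Counter(samples)
H = -sum((c / len(samples)) * math.log2(c / len(samples)) for c in counts.values())

print(f"empirical entropy: {H:.4f} bits")
print(f"maximum entropy:   {math.log2(n):.4f} bits")
[/code]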
     
  4. May 12, 2013 #3
    It perhaps doesn't need saying, but in addition to chiro's suggestion, you would also need to define what you mean by "computer" -- a deterministic Turing machine (without an oracle)?
     
  5. May 12, 2013 #4

    Stephen Tashi

    Science Advisor

    It isn't really the machine that determines what numbers it can represent. For example, mathematical software that does symbolic manipulations can represent numbers like [itex] \sqrt{2} [/itex] and [itex] \pi [/itex] using strings for their symbols. The machine's processors have a system for representing particular subsets of the real numbers (e.g. floating point), but programs aren't limited to using that system. You can refer to the set of numbers that can be generated on a particular machine by a particular program.

    Perhaps you should begin by trying to prove that a computer cannot randomly select a number from the set of two numbers {0,1}, or select a random symbol from the set of two symbols {H,T}.

    That limits the scope of your proof to particular types of random number generators. It's true that recursive random number generators are the ones most commonly used.

    Should you define what you mean by "random"? It's amusing that we often don't make this demand of people who post about problems in probability theory. For example, we don't cross-examine a poster who makes a statement like "Let X be a random number chosen uniformly in the interval [0,1]". For questions in probability theory, all we demand is that an event and its associated probability distribution be clearly stated.

    However, in probability theory, what we say about "probability" is essentially "how it behaves" and not "what it is". That is, there are certain definitions for events (like "independence") that are stated in terms of how the probability of other events (like their joint occurrence) behaves as a function of the individual probabilities of the events. There is no definition of probability that lets us look at a situation where no probabilities are defined and conclude that certain probabilities exist and have certain values. All problems in the conventional theory of mathematical probability must begin by stating that certain probabilities are "given".

    So if you want to deal with a situation where no probabilities are given, and either conclude that some probabilities exist or prove that none do, you need to look for other definitions of "random" that would give you some hope of doing this.
     
  6. May 12, 2013 #5
    I think that's what I'll do.

    I think that one component of the definition is what I said in the original post: the limiting value of (1/N)∑gi, as defined there, is 1/n. But I need to make that more general, because in fact every ordered arrangement of elements from Ω needs to have the same limiting frequency if I use a function g that equals 1 when that arrangement appears and 0 otherwise.
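
    A sketch of what that generalization could look like in Python (the block length k, the generator, and the sample size are arbitrary choices): count how often every ordered k-tuple appears in the output stream and check that each of the n^k arrangements tends to the same limiting frequency 1/n^k.

[code]
# Sketch: frequencies of ordered k-tuples in the output stream.
# Every one of the n**k arrangements should tend to the same limit 1/n**k.
from collections import Counter
import random

n, k, N = 4, 2, 200_000
stream = [random.randrange(n) for _ in range(N)]

tuples = Counter(tuple(stream[i:i + k]) for i in range(N - k + 1))
for arrangement, count in sorted(tuples.items()):
    print(arrangement, round(count / (N - k + 1), 4))   # target: 1/n**k = 0.0625
[/code]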
     
  7. May 12, 2013 #6
    Also, I'm not convinced that it is relevant whether the random number generator uses a recursive algorithm. Either way it's analogous to a function f(k), because you have a sequence of trials.
     
  8. May 14, 2013 #7

    Jano L.

    Gold Member

    Generating random (read: unpredictable) numbers on a conventional computer is not possible, because it behaves deterministically, i.e. the numbers can be predicted if you read the state of the computer (what the program is, what is in the memory).

    This reminds me that von Neumann once said "tell me what a computer cannot do and I will build one which will do it". In fact you can enhance your computer with a hardware device which produces numbers not from some clever algorithm, but from the transformation of a signal obtained by a sensitive measurement of, say, the electric current across a resistor. The small fluctuations of the current do not follow any obvious deterministic algorithm, so they can be used to produce random numbers.
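
    For what it's worth, most operating systems already expose such an entropy pool. A minimal sketch of reading it from Python (os.urandom draws on noise sources collected by the OS; exactly which sources depends on the platform):

[code]
# Sketch: draw bytes from the operating system's entropy pool instead of
# computing them with an algorithm.  On most platforms os.urandom is fed,
# at least in part, by hardware noise sources.
import os

raw = os.urandom(16)                    # 16 bytes from the kernel's pool
print(int.from_bytes(raw, "big"))       # one 128-bit "random" number
[/code]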
     
  9. May 15, 2013 #8
    Random, as I understand it, means 'all possibilities with equal probability'.

    So, flipping a coin is considered a random event, because in the scope of our problem we recognize two possibilities, heads and tails, and the probabilities of these possible events are equal. I am essentially subjectivist/Bayesian in my attitude toward statistics. Probabilities are not objective features of the world; they are features of our expectations. A pseudo-random algorithm that humans don't/can't predict IS random.

    It's kind of a weird idea to prove a computer can't do something. Any proof you do will be about mathematical notions, not physical objects. There would have to be an independent understanding that those notions have something to do with computers. You can't incorporate a physical object into your proof.
     
    Last edited: May 15, 2013
  10. May 15, 2013 #9
    Meh, first there is no reason a computer can't generate random sequences. There is no requirement that a computer be fully deterministic. If its operation contains a random component then it can extract that random component and produce a random sequence.

    If we are restricted to deterministic computers, then the future states of the computer are fully determined by the current state. Therefore the future billion gigabytes of random numbers that will be generated can be specified simply by providing the current state of the computer. This amounts to a massive compression of the sequence. Since a random sequence cannot be compressed, we know that the sequence could not have been random.
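
    A sketch of that point in Python (the seed and output length are arbitrary): the whole future output of a deterministic generator is reproducible from its current state, so the state acts as a drastic compression of the sequence.

[code]
# Sketch: the whole future output of a deterministic generator is determined
# by (and therefore "compressed" into) its current state.
import random

seed = 42                       # the "current state": a handful of bytes
gen_a = random.Random(seed)
gen_b = random.Random(seed)

out_a = [gen_a.random() for _ in range(1_000_000)]   # megabytes of output...
out_b = [gen_b.random() for _ in range(1_000_000)]   # ...reproduced from the same seed

print(out_a == out_b)           # True: the seed specifies the entire sequence
[/code]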
     
  11. May 16, 2013 #10

    Jano L.

    Gold Member

    That is not true. If you produce a sequence of 100,000 coin flips (which by the common definition is always random), write it down into a binary file, and run a compression program, sometimes you can compress it to a smaller size.

    True, a conventional program will achieve compression only for a certain small subset of the possible sequences. But which subset that is depends on the method of compression; there is no sequence that can be called "absolutely incompressible" (unless we require that the program be shorter than some maximum number of bytes).

    The root of the problem is the fact that the "randomness of a sequence" is not a property deducible from its numerical content; the numbers themselves cannot be "random".

    Randomness means rather that we are not able to predict the next values of the sequence from the preceding ones. It thus refers to the human's knowledge about how the sequence is produced, not to the sequence itself.
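
    Here is a sketch of the experiment I have in mind, in Python (zlib stands in for "a conventional compression program"; the number of flips is arbitrary):

[code]
# Sketch: pack 100,000 genuinely unpredictable coin flips into bytes and see
# whether a general-purpose compressor (zlib here) makes them smaller.
import secrets
import zlib

n_flips = 100_000
flips = [secrets.randbits(1) for _ in range(n_flips)]        # 1 = head, 0 = tail

bits = "".join(map(str, flips))
packed = int(bits, 2).to_bytes((n_flips + 7) // 8, "big")
compressed = zlib.compress(packed, 9)

print(len(packed), "->", len(compressed), "bytes")
# Almost always the output is no smaller (zlib even adds a small header), but
# nothing forbids a lucky run such as all heads from compressing dramatically.
[/code]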
     
  12. May 16, 2013 #11

    D H

    Staff Emeritus
    Science Advisor

    Here are some things that a computer (a Turing machine) cannot be programmed to do:
    • Compute a non-computable number. Note well: Almost all of the real numbers are not computable.
    • Given a random computer program and a finite set of input data, prove whether or not that program will halt.
    • Given a random Diophantine problem, prove whether or not that problem has a solution.
     
  13. May 16, 2013 #12
    I believe you when you tell me a computer cannot be programmed to do those things, but it's another thing entirely to say that a mathematical proof alone tells us what a computer can or cannot do.
     
  14. May 16, 2013 #13

    D H

    Staff Emeritus
    Science Advisor

    All of those are mathematical proofs.

    The first is a consequence of Cantor's proof that the reals are uncountable. The number of possible computer programs is only countably large. Almost all of the reals are not computable. Google "computable number" for more.

    The second is the "halting problem." Google that term, along with the "Church-Turing thesis." That the halting problem is unsolvable was an important discovery regarding the limits of computation.

    The third problem is "Hilbert's tenth problem." That this was proven to be unsolvable in general was an important discovery regarding the limits of mathematics.
     
  15. May 16, 2013 #14
    Yeah, I don't disagree, but I am getting at something else entirely. There is nothing about those proofs that tells you they say anything about real physical objects like computers. I don't disagree that we can meaningfully demonstrate things about computers with math, but that's because I am the one who can see the connection between a computer and a proof. There are proofs which don't have any such connection, and we wouldn't say they meant anything about the world.
     
  16. May 16, 2013 #15

    pwsnafu

    Science Advisor

    Unless Jamin2112 comes back with a different definition, the word "computer" means "Turing machine". This is a math forum; if you want to talk about physical machines, we have a forum for that.

    Which is completely irrelevant to this thread.
     
  17. May 17, 2013 #16
    Sorry, I didn't realize there was a formal definition of a computer. I shouldn't have derailed the thread.
     
  18. May 18, 2013 #17
    No.

    Choose a compression algorithm. Then you can spend your entire life flipping coins and never find a 100,000-bit sequence that will compress appreciably. The number of compressible sequences is so small that you will never see them by accident. And every bit you add to the sequence you want to compress makes it exponentially worse.

    It is true that given any sequence, even a random one, you could design a compression algorithm that will compress it. But that is cheating. Besides, the size of your program will exceed the size of the data being compressed, so in reality there is no compression anyway. All you are doing is front-loading the information from the file into your algorithm. A cheat with no practical use.

    There is no free lunch. A compression algorithm that achieves substantial compression does so by restricting itself to a very small subset of possible inputs. You will never see one of those compressible inputs of any size by accident.
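
    The counting behind that claim fits in a few lines (a sketch only; the demanded saving of 100 bits is an arbitrary example):

[code]
# Sketch: upper bound on the fraction of L-bit sequences that any fixed
# scheme can shorten by at least k bits.  There are fewer than 2**(L - k)
# possible outputs shorter than L - k bits, so at most that many inputs can
# be mapped to them without collisions.
L = 100_000        # sequence length in bits
k = 100            # demanded saving, an arbitrary example

fraction = 2 ** -k                 # (at most 2**(L-k) lucky inputs) / 2**L inputs
print(fraction)                    # about 7.9e-31: never seen by accident
[/code]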
     
  19. May 18, 2013 #18

    Jano L.

    Gold Member

    ppnl, I think you misunderstood me.

    I do not say that a given program can compress a significant percentage of all possible sequences. Nor do I suggest that transferring the data from the sequence into the program is the way to compress it.

    What I say is this: whether a sequence can be compressed or not is not an absolute property of the sequence; it also depends on the algorithm we use to compress it. Do you agree with this?

    If we can agree on this, then we cannot base a universal definition of randomness on whether a sequence is compressible or not. Such a definition refers to a particular program.

    Let me give a concrete example of what I mean.

    Let us flip a coin 100,000 times and write 1 for heads and 0 for tails. We may obtain any sequence of 1's and 0's, and they are all random, because the process that led to them is unpredictable.

    We may obtain a sequence

    1111...

    of 100,000 heads. It does not matter that it is highly unlikely; it is possible and as likely as any other sequence.

    This sequence is random, since it has been obtained in an unpredictable process, but it can be compressed by the simplest current programs.

    I think what you mean is that we cannot compress all, or even a great percentage, of such random sequences, and I agree; but that does not allow us to say that a particular random sequence cannot be compressed.
     
  20. May 18, 2013 #19

    D H

    Staff Emeritus
    Science Advisor

    It's the other way around. You misunderstood ppnl. More below.

    It matters very much that this all-heads sequence is highly unlikely. Yes, there is a chance of a sequence of 100,000 heads. The probability is smaller than small. Suppose you had a supercomputer capable of generating a billion of these 100,000-element random coin toss sequences every second. This supercomputer would have to run for roughly 10^30,000 centuries before the expected number of all-heads sequences seen rises to over 1/2.

    ppnl covered this all-heads scenario, along with every other random sequence that is compressible, in the opening lines of his last post.

    In particular, the number of compressible sequences is small in comparison to the 2^100,000 possible sequences. We're getting close to "almost surely" territory with 100,000 coin tosses. All but a tiny, tiny, tiny fraction of those 2^100,000 possible sequences are not compressible.
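
    The arithmetic behind those figures, as a rough sketch in Python (the billion-sequences-per-second rate is the assumption from above):

[code]
# Sketch of the arithmetic: time until a machine producing a billion
# 100,000-flip sequences per second has even odds of having hit all heads.
from math import log10

log10_p = -100_000 * log10(2)           # log10 of 2**-100000, about -30103
rate = 1e9                              # sequences per second (assumed above)

# Expected count reaches 1/2 after 0.5 / p draws, i.e. 0.5 / (p * rate) seconds.
log10_seconds = log10(0.5) - log10_p - log10(rate)
log10_centuries = log10_seconds - log10(3600 * 24 * 365.25 * 100)

print(round(log10_centuries))           # about 30084: on the order of 10^30,000 centuries
[/code]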
     
  21. May 18, 2013 #20

    Jano L.

    Gold Member

    D H,

    I do not understand how your post shows that I misunderstood ppnl. I agree with him and with you on his point: most sequences cannot be compressed by a given program.

    Let's agree we agree on this.

    What I am saying is something different. I object to adopting the statement "a random sequence cannot be compressed" as true, and I gave a counter-example: the sequence 1111... obtained by coin flipping can be compressed. Its low probability is immaterial, since any other particular sequence would have exactly the same probability, and surely one counter-example is enough to show that the statement is not true.


    If we change the statement into

    "most sequences long 100 000 digits cannot be compressed by a given program"

    then it is true. However, then it also has nothing to do with randomness.


    For a given compression program:

    There are random sequences that are compressible, and other random sequences that are incompressible.

    There are non-random sequences that are compressible, and other non-random sequences that are incompressible.

    EDIT:
    Clearly, compressibility of a sequence by a given program has no direct relation to its randomness.
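
    A small sketch of two of those cases in Python (zlib as the fixed program; the seed is arbitrary): a constant sequence compresses enormously whether or not it came from coin flips, while the output of a perfectly predictable seeded generator does not get any smaller.

[code]
# Sketch: compressibility under a fixed program (zlib) vs. predictability.
import random
import zlib

# A constant sequence: fully predictable, highly compressible, yet it could
# equally well have come from 100,000 fair coin flips.
all_ones = b"\xff" * 12_500                       # 100,000 one-bits
print(len(all_ones), "->", len(zlib.compress(all_ones, 9)), "bytes")

# Output of a seeded PRNG: fully predictable from the seed, yet zlib cannot
# make it any smaller.
rng = random.Random(0)
prng_bytes = bytes(rng.getrandbits(8) for _ in range(12_500))
print(len(prng_bytes), "->", len(zlib.compress(prng_bytes, 9)), "bytes")
[/code]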
     