
Is an infinite series of random numbers possible?

  1. Mar 14, 2012 #1
    Is an infinite series of [nonrepeating] random numbers possible?

    That is, can the term "random" apply to a [nonrepeating] infinite series?

    It seems to me that Cantor's logic might not allow the operation of [nonrepeating] randomization on a number line approaching infinity.
  3. Mar 14, 2012 #2
    I'm not sure of an answer, but allow me to put two thought-experiments on the table to discuss.

    Experiment A: it will produce a rational approximation to a real in the interval (0,1]. Expressed in base 2, the number is constructed as follows: throw a fair coin, and choose the first binary digit (at position 2^-1) accordingly (maybe heads=1, tails=0). Each next coin throw will give you another digit. When you get tired, or consider that your approximation is good enough, stop throwing the coin and use the fraction so far. If you have two of these fractions and are worried that they may be equal, calculate extra digits for each number (with more coin throws) until they differ.

    Experiment B: produces a random number, not uniformly distributed and possibly repeated (the latter could be solved by throwing away a repeated number and producing a new one), in the interval [0,pi). Proceed as in experiment A, but definitely terminate the experiment at the first 0 digit (tails); then return pi times your fraction.
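Both experiments can be sketched in Python. This is only an illustration of the two procedures described above, using `random.random() < 0.5` as a stand-in for a fair coin (how genuinely random throws would be sourced is a separate question):

```python
import math
import random

def experiment_a(num_digits):
    """Experiment A: build a binary fraction digit by digit.

    random.random() < 0.5 stands in for a fair coin (heads = 1).
    We stop after num_digits throws instead of 'when you get tired'.
    """
    fraction = 0.0
    for position in range(1, num_digits + 1):
        heads = random.random() < 0.5
        if heads:
            fraction += 2.0 ** -position
    return fraction

def experiment_b():
    """Experiment B: throw the coin until the first tails (digit 0),
    then return pi times the fraction built so far; result is in [0, pi).
    """
    fraction = 0.0
    position = 1
    while random.random() < 0.5:   # heads: record a 1 digit and continue
        fraction += 2.0 ** -position
        position += 1
    return math.pi * fraction
```

Note that Experiment B's output is not uniform: shorter runs of heads are far more likely, so small results dominate, exactly as the post says.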
  4. Mar 24, 2012 #3
    I would think the proof would be very similar to "Are there infinite prime numbers?"
  5. Mar 24, 2012 #4
    Define random.
    Define infinite series. (I'm pretty sure you meant sequence here)
  6. Mar 25, 2012 #5
    Thanks, I did mean sequence, and by "infinite" I intend for the count of random numbers in that sequence to approach infinity.

    By "random" I mean eventually exhausting all relations among numbers.


    Is the cardinality of random numbers the same as that for reals?

    Might random numbers by their nature tend toward infinity?
  7. Mar 26, 2012 #6
    Not entirely ^^

    You see, Prime Numbers are NOT random. They are connected by the fact they can only be divided by themselves and one.

If a proof of either statement were found, it is very unlikely that it would resemble a proof of the other.
  8. Mar 26, 2012 #7



If it is a sequence, by definition it will be countable. The numbers themselves can be confined to a given interval, since there are uncountably many numbers available there.
  9. Mar 26, 2012 #8
    Please let me amend my original questions:

Is an infinite set of exclusively non-repeating random numbers possible?

    Is "random" analogous to "exhausting all relations among numbers"?


    Is the cardinality of random numbers the same as that for reals?

    Might random numbers by their nature tend toward infinity more rapidly than reals?
  10. Mar 26, 2012 #9
    You have to define random. By the definition of an infinite sequence, there will always be some generating function that represents your sequence (even if it can only be expressed as a mapping). I suppose you could say that it is random if the generating function can't be expressed in terms of certain types of other functions, like elementary functions.

    But until you specifically define random, by the reasoning that my choice of generating function is random, I can say that {2, 4, 6, ...} is random; because I thought of it randomly.
  11. Mar 26, 2012 #10
    What do you mean by a random number? Is 6 a random number? Is pi a random number?

    Do you perhaps mean non-computable or non-definable number?
  12. Mar 27, 2012 #11
    The set of random numbers exhausts integral order and corresponding numerical magnitude. Random numbers may be generated by exchanging orders with magnitudes.
  13. Mar 27, 2012 #12



I know it sounds philosophical, but how would you generate a random sequence of numbers?

I mean, if you want to 'generate' something, you'll have to use some apparatus (your mind, a computer, a cesium atom), and you don't know what makes the apparatus pick its numbers.

    I don't believe there's something like random numbers.
  14. Mar 27, 2012 #13
Perhaps your question is: are there permutations of infinite length? (Just trying to guess your meaning, though.)
  15. Mar 27, 2012 #14
    Can random numbers be defined but not generated?

    I'm not sure. How about the permutation between order and number which I mentioned previously?
  16. Mar 27, 2012 #15
    You're using some words in ways that are unfamiliar to me.

    * You referred to "the set of random numbers," but I don't know what a random number is. Can you give some examples of random numbers? Is 6 a random number? Pi?

    * What do you mean "exhausts integral order?"

* What do you mean by generating random numbers by "exchanging orders with magnitudes?" Can't imagine what that means. Order is the relation by which 5 < 6, for example. Magnitude is the relation that ignores the difference between -5 and 5. How do you exchange these properties? And how do you use these ideas of order and magnitude to generate random numbers? Do you mean particular random numbers or the entire set of random numbers? And what set is that?

    I know you have something in mind but it's difficult for me to understand what you mean.

    Are you referring to numbers that are generated unpredictably? Flip a bit if a cosmic ray passes through a particular square millimeter in the next millisecond?

    Or by "random" do you mean algorithmic randomness? A number is random it's incompressible via a finitely describable algorithm? So any number we can name is not random, but we know that there must be uncountably algorithmically random numbers. Those are the non-constructable or non-definable numbers. [There's a subtle difference between those two concepts?

    Do any of those questions make sense in terms of what you're trying to do?
  17. Mar 28, 2012 #16
    I appreciate your patience, SteveL27. Upon consideration, your arguments make a lot of sense.

    In this regard, allow me to modify my previous speculations to concentrate on a specific random number generator:

    1. Start with an exclusive irrational number.

    2. Limit its fractional part in digits by its integer value.

    3. The resulting string is a random number.
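Read literally, those three steps could be sketched in Python as follows. This is a hypothetical helper, not the poster's own method; it assumes the irrational is handed in as a decimal string with enough correct digits (e.g. from a symbolic package):

```python
def truncate_by_integer_part(decimal_string):
    """Steps 2-3 as stated: keep as many fractional digits of the
    irrational as its integer part indicates.

    decimal_string is assumed to hold enough correct digits,
    e.g. "2.718281828" for e or "3.162277660" for sqrt(10).
    """
    integer_part, fractional_part = decimal_string.split(".")
    n = int(integer_part)               # step 2: the digit budget
    return "." + fractional_part[:n]    # step 3: the resulting string
```

For example, e (integer part 2) would yield a two-digit string ".71", and sqrt(10) (integer part 3) a three-digit string ".162".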
  18. Mar 28, 2012 #17
    What is an "exclusive" irrational number. What does it mean to limit its part in digits by its integer value? Do you mean just take the part to the right of the decimal point?

    Can you give an example?

Say I start with my favorite irrational, e, the base of the natural log. e = 2.7182...

Are you saying that .7182... is a random number? But it isn't, in either sense of the word.

* It's not unpredictable. In fact you can write down the well-known algorithm e = 1 + 1 + 1/2 + 1/6 + 1/24 + 1/120 + ... + 1/n! + ... and crank out as many digits as you like. It's completely deterministic, the opposite of random.

* It's not algorithmically random; in fact the finite string (sum as n goes from 0 to infinity of 1/n!) is a small number of symbols that precisely defines e.
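As a concrete illustration (not from the post itself), a few lines of Python suffice to crank e out of that series, underscoring how deterministic the digits are:

```python
def e_from_series(terms):
    """Sum 1/n! for n = 0 .. terms-1: the deterministic
    algorithm behind the digits of e."""
    total = 0.0
    factorial = 1
    for n in range(terms):
        if n > 0:
            factorial *= n      # factorial now equals n!
        total += 1.0 / factorial
    return total
```

Twenty terms already agree with e to machine precision.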

    Is e an "exclusive" irrational by your definition? Or do you mean something else?

Do you perhaps mean that the digits of e are normal, in the sense that any block of n digits occurs as often as any other block of the same length? It's not known whether e is normal, but is normality the characteristic you're interested in?
  19. Mar 29, 2012 #18

    I am beginning to believe that "random number" is an oxymoron.

    Numbers are defined by value, order and relation overall, while randomness violates those quantities together.

    Random number sequences manifest near infinite uncertainty and zero probability.

    There exist tests for non-random numbers, but not for random numbers.

    No physical measurement has confirmed a random number.

    The set of random numbers and the set of non-random numbers seem mutually exclusive. Is the set of reals their union?


    [Strike "exclusive," please.]

    Some example outputs of my pseudorandom generator (I am assuming normality for irrationals):


    becomes .162


    becomes .71

A true random number generator is an impossibility, since it requires both non-determinism (randomness) and determinism (predictability).
  20. Mar 29, 2012 #19



    It might help if you think about randomness in different orders of entropy. Entropy is probably the best and most useful way to quantify randomness: maximum entropy on all orders means that there is no advantage you have for having any past information at all since a maximum value on all orders (independent, first order conditional, second order conditional, etc) would mean that the analysis of all past values with respect to each other would not give you an advantage.

    The best way to describe this is to have all of these distributions be uniform because this distribution is the one that maximizes entropy. If you do this for all possible conditional probabilities, then you will get a distribution that is purely random.

    From this distribution you will get hints about the kinds of processes that you could construct.

    If you want something random, but not purely random then you don't have to do anywhere near as much work, but if you want a process that is random the best way possible, you need to construct the above system and from there decide what kind of process would really emulate this distribution.
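As a concrete illustration of the point above (my example, not the poster's), the Shannon entropy of a distribution can be computed directly, and the uniform distribution over six outcomes attains the maximum of log2(6) ≈ 2.585 bits:

```python
from math import log2

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum p*log2(p), skipping zero terms."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

uniform = [1 / 6] * 6                           # a fair six-sided die
biased = [0.5, 0.3, 0.1, 0.05, 0.03, 0.02]      # same outcomes, skewed
```

Any non-uniform distribution over the same six outcomes comes out strictly below the uniform value, which is why maximizing entropy forces uniformity.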
  21. Apr 1, 2012 #20


S = κ ln Ω, where:

κ = Boltzmann's constant

ln = natural logarithm

Ω = number of states


    1. Does true randomness accompany a transfinite number of states?

    2. Is information about states restricted by a finite speed of light?

    3. Can multiple states interfere, e.g. achieve minimum entropy?
  22. Apr 1, 2012 #21



Could you please explain what you mean by transfinite? I get the feeling it's a set-theoretic term but I don't want to make an erroneous judgment.

For number 2, I have to reiterate that the above talks about randomness with respect to the distributions of the states that are related to each other, independently of the actual process itself.

    If you want to answer your question you need to consider further constraints that relate to a specific process. Again the above considers a general process with a particular property.

    I am not a physicist, but what you need to do is specifically outline how a law or a relationship between variables modifies the properties of the distributional information of the various joint distributions which then modifies the entropy, and from that you can get an idea of the real measure of the random nature of the process.

You should note that physical systems are not purely random in the sense I have described above, because there are a lot of known deterministic features that we utilize and exploit every day for various purposes. If the world were absolutely, truly random and completely unpredictable, then the order that we observe and make use of every day would not be present.

    For 3, I wish to say that minimization of entropy gives an indication of order while maximization of entropy gives us an indication of disorder. Physicists and natural scientists usually have a goal of finding order and this directly relates to entropy.

The other thing is that you don't just want to consider the entropy characteristics of the original distributions, but also those of possible transformations of the data, and hence the possible transformations of the distributions that 'make sense'. What 'makes sense' depends on both mathematical ideas and domain ideas, but for mathematical ones you want to consider at the very minimum convergence, and probably topology and differentiability as well.

    If you want an example of entropy minimization, think of entanglement. Instead of having two objects that would be classified as purely random, instead what we see is a reduction of entropy from that case since one has a direct effect on the other and this result shows a form of order that would otherwise not be seen in a purely random system.

    In fact it is this property of minimum entropy in a variety of circumstances that has allowed us to obtain formulas like the ones you find in your science textbooks: it is this ability to quantify this order accurately enough that allows us to even understand this system we call reality or the 'universe', and I imagine that as time goes by we will be able to find transformations of our distributional representations that allow us to see even more order and subsequently a way to quantify it, just as Newton quantified gravity.
  23. Apr 2, 2012 #22
    For number 1, by "transfinite" I attempt to attain the absolute limit of states where the system is purely random.

    For number 2, I should have asked whether pure randomness can occur in a finite universe.

    The relations within the equation for statistical entropy I gave are standard, and only simplistic logarithmic information derives from them.

    States approaching infinity on a microscopic level are just as likely on a macroscopic level.

    For number 3, quantum entropy would either involve destructive interference with change to the negative, or would involve constructive interference with change to the positive.

    I believe statistical mechanics gives classical transformations of our distributional representations.

    For entropy to have a true "law," I guess that entanglements must transfer their statistics instantaneously (rather than at a finite speed of c) and thermally.
  24. Apr 2, 2012 #23



    This is a very good question.

    I think the answer is going to be yes because the state-space does not have to necessarily define the nature of the process.

    The thing you have to remember is that a process can take in a finite-state space and map it to a finite-state space like say relating the history of die rolls to the next one in terms of probability.

The state-space itself is fixed, but the process could go on forever, and although the state space is finite, that does not mean the underlying process cannot be purely random.

Think of the problem of whether you could define a coin-toss process so that every new toss has no chance of being predicted or biased in any way given the entire history of the process. If you think you can define a process that is unpredictable in this way, your answer is yes. If you think you can't, your answer is no.

I haven't shown a proof, but if I were to give one I would try to show that there always exists a process such that, given N observations, the conditional distribution of the (N+1)th observation given all the prior observations has maximal entropy (i.e. all values are equally likely). If this is shown, then you have proven that your answer is a resounding yes.

    The thing about entropy though is that you need to consider not just non-conditional entropy but conditional entropy as well.

    To give you an example, imagine that you have a process corresponding to an infinite periodic sequence where one period consists of {0,1,2,3,4,5} in that order and repeats forever.

Now if you try to calculate P(X = a) for a in {0,1,2,3,4,5}, you will always get 1/6, which implies maximal entropy.

But for this process we know that P(X_(n+1) = a | X_(n) = (a-1) mod 6) = 1 because of the periodicity. The entropy of this conditional distribution is zero, which implies absolute determinism.

So although the non-conditional entropy is maximal, the conditional entropies are completely the opposite, and through this we have found complete order, which is exhausted at this level; thus the process is deterministic.

    The order of the process if it exists, will be hidden somewhere in the conditional joint distributions, not in the non-conditional probability distribution.
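This marginal-versus-conditional distinction is easy to check numerically. Here is a sketch (my own, with hypothetical variable names) computing empirical entropies of the repeating staircase sequence:

```python
from collections import Counter
from math import log2

def entropy(counter):
    """Empirical Shannon entropy (bits) from a Counter of outcomes."""
    total = sum(counter.values())
    return -sum((c / total) * log2(c / total) for c in counter.values())

# The periodic 'staircase' process 0,1,2,3,4,5,0,1,2,... repeated many times
sequence = list(range(6)) * 1000

marginal = entropy(Counter(sequence))             # H(X_n): maximal, ~log2(6)
pairs = Counter(zip(sequence, sequence[1:]))      # counts of (X_n, X_{n+1})
firsts = Counter(x for x, _ in pairs.elements())  # counts of X_n over those pairs
conditional = entropy(pairs) - entropy(firsts)    # H(X_{n+1} | X_n): ~0
```

The marginal entropy comes out at essentially log2(6) bits while the conditional entropy comes out at essentially zero, which is exactly the hidden order described above.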

You need to be cautious about this: it may turn out, depending on what you define as macroscopic, that there is a different kind of entropy than what you would find on a microscopic level, and that you may get some kind of dimensionality reduction when you consider a particular macroscopic space versus a microscopic space. If this occurs (especially dimensionality reduction), you are going to get some kind of difference in the classification of states between the two spaces.

    Again, it needs to be defined what the macroscopic space is and if possible, the mapping between states in different spaces. I think you'll find that because of the way things are seen macroscopically in comparison to microscopically, there will be some kind of significant reduction in the state space for the macroscopic classifications in contrast to the microscopic ones which will look more like a projection than a bijection.

    I'm not sure what you mean by this statement.

    The thing I am more interested in is not just the non-joint distribution but the joint distribution in a general case. As I said above, order is not found in non-conditional statistical measures or distributions: it's like taking data sorting it out and putting it into separate buckets without considering how the relativity between data impacts the order found in the process.

    This is a very interesting point that I am going to have think about. Thanks.
  25. Apr 3, 2012 #24

    Please explain the general meaning of the symbols and variables used in P(X_n+1 = a| X_n = (a-1) MOD 6) = 1.

    Thanks for your dedication.
  26. Apr 3, 2012 #25



    This is just a way of describing 1st order conditional properties for the staircase process {0,1,2,3,4,5} that keeps repeating in a periodic way.

    I'll expand it out for all states and then you should see how I used the mod function: Here we go:

    P(X_(n+1) = 0 | X_(n) = 5) = 1
    P(X_(n+1) = 1 | X_(n) = 0) = 1
    P(X_(n+1) = 2 | X_(n) = 1) = 1
    P(X_(n+1) = 3 | X_(n) = 2) = 1
    P(X_(n+1) = 4 | X_(n) = 3) = 1
    P(X_(n+1) = 5 | X_(n) = 4) = 1

    All other probabilities for all other first order conditional combinations are zero and you can show this by various probability identities and exhaustion of the probability space.

    The X_(n+1) refers to the "n+1"th observation for the process and the X_(n) refers to the "n"th observation for the process. You could for this example associate n as a time parameter of a one-dimensional process.

    The above process can only take on the values {0,1,2,3,4,5} which means we only have to consider going from one value in this list to another value in this list.
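Those six conditional probabilities can also be recovered mechanically by tallying transitions in the sequence itself; a sketch (my illustration, not part of the original post):

```python
# Tally first-order transitions of the staircase process and normalise
# each row into P(X_(n+1) = b | X_n = a).
sequence = list(range(6)) * 100

counts = {}
for current, nxt in zip(sequence, sequence[1:]):
    row = counts.setdefault(current, {})
    row[nxt] = row.get(nxt, 0) + 1

transition = {
    a: {b: c / sum(row.values()) for b, c in row.items()}
    for a, row in counts.items()
}
```

Every row comes out one-hot: from state a the process moves to (a+1) mod 6 with probability 1, matching the six equations listed above.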