Mathematical definition of randomness

phoenixthoth

i want a mathematical definition of randomness. since i don't think a -ness word is commonly (or perhaps ever) defined in math, maybe it'd be better to define random. since random is more like an adjective, how about a random blah, where blah could be at least these three things: sequence, variable, or number.

so let's just ask these questions:
what is a random sequence?
what is a random variable?
what is a random number?
and if you want
what is a random event?

if you don't mind, state your opinion (w or w/o proof) that:
1. random sequences exist (or not), how many there are, and how to decide if a given sequence is random.
2. random variables exist (or not), and how to decide if a given variable is random.
3. random numbers exist (or not).
4. random events exist.

what i'm ultimately wondering is whether no, any, or all events in the universe are random, though if you thought i meant event in the statistical sense, that's ok. i don't really expect my questions to address what i'm ultimately wondering without undertaking what may be a logical leap of extrapolating math to philosophy/science (eg if it can't be decided whether a given sequence is random, then it can't be decided whether a given event is random).

incidentally, if you were to pick a random real number, what is the probability that it is less than x (where x is in R)? how about between -x and x? i think the answer to this question ought to be 0 even if x = googolplex^googolplex^googolplex


Organic

A = random

B = a definition of A.

What association(s) can be found between A and B?

C=Complement

P=Prevent

For example: do they P each other, C each other, or both?


MathematicalPhysicist

Gold Member
Re: randomness

Originally posted by phoenixthoth
i want a mathematical definition of randomness. [...]
what is a random sequence?
i think that a random sequence by definition doesn't exist, or at least that's what i think: a sequence represents a pattern, and randomness doesn't show a pattern; therefore a random sequence doesn't exist.

jcsd

Gold Member
According to my maths dictionary a sequence must be formed by a rule or law, though my discrete maths book suggests that isn't the case; perhaps 'list' would be a better word? In QM the collapse of the wavefunction is an event which produces truly random results, so you could well use this to generate a truly random list of numbers, or a truly random single number.

Another interesting question: if I picked natural numbers at random, what would the expected value of the mean be?

NateTG

Homework Helper
Like many other words, random means different things to different people and at different times.

When people talk about random events, they are referring to things that occur spontaneously and unpredictably, for example earthquakes.

For mathematicians, a random sequence of bits is a sequence that has one bit of information per bit in the sequence. (A bit is a digit that is either 0 or 1.)

If you accept, for the moment, that the condition above is well-defined, then I can generate conditions for a random sequence of real numbers from a bounded interval, or a weighted sequence of random real numbers from an interval based on it.
(Since ∫_{-∞}^{∞} p(x) dx = 1, p(x) cannot be constant over unbounded intervals.)

When you start talking about random variables, typically, people are referring to a variable that is assigned consecutive values from a random sequence.

An individual random number is simply the only number in a single-number random sequence. Alternatively, it may be usefull to think of it as a single-number subsequence of an infinite sequence of random numbers.

Now, to the other questions:

As long as you are limited to real probability distributions (i.e. p(x) is always a real number), the chance that a given real number x occurs in a finite random sequence of real numbers is 0, and the chance that an individual random real number lies in the interval (a, b) is
∫_a^b p(x) dx
where p(x) is the probability density function.
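The interval probability above can be checked numerically. Here is a minimal sketch (my own illustration, not from the thread), assuming the standard normal as the density p(x): it estimates P(a < X < b) by Monte Carlo sampling, and the empirical fraction for (-1, 1) should land near the known value of about 0.6827.

```python
import random

random.seed(0)  # reproducible illustration

def empirical_interval_prob(a, b, n=200_000):
    """Estimate P(a < X < b) for a standard normal X by sampling."""
    hits = sum(1 for _ in range(n) if a < random.gauss(0.0, 1.0) < b)
    return hits / n

# The integral of the standard normal density over (-1, 1) is about 0.6827;
# the empirical fraction should land close to that.
print(round(empirical_interval_prob(-1.0, 1.0), 3))
```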

HallsofIvy

Homework Helper
A random sequence is a sequence in which there is no rule by which one can deduce the next number in the sequence.

A random variable is a variable whose value is not determined by any rule or formula (although there may be a formula for determining the probability that the value is a specific number or within a specific range).

A "random number" is a possible value of a random variable (though, of course, after you have that number there is no difference between it and any other number).

A random event is an event (you can define that as you please) whose occurrence is random (i.e. not determined by any rule or formula).

"incidentally, if you were to pick a random real number, what is the probability that it is less than x (where x is in R)? how about between -x and x? "

What probability density are you assuming? If you were to use a finite universal set or a bounded interval, then you might assume the uniform distribution, where every number is "equally likely". For
R, the set of all real numbers, that's not possible, since the integral of a constant density function cannot equal 1. I think the most reasonable distribution to use for all real numbers is the normal distribution. To answer your question, look it up in a table of the normal distribution!
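Instead of a printed table, the normal cumulative probability P(X < x) can be computed directly via the error function. A small sketch (the name normal_cdf is my own, not a standard library function):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """P(X < x) for a normal random variable, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

print(normal_cdf(0.0))                     # 0.5, by symmetry about the mean
print(normal_cdf(1.0) - normal_cdf(-1.0))  # about 0.6827, within one sigma
```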

phoenixthoth

i really hope that textbooks aren't defining a sequence to mean something given by a formula...

my general info on random sequences is this. let {a_n} be a sequence (i.e. a function from N to some set). suppose you write a formula that gives a_n perfectly for n = 1, 2, ..., M. call this formula f_M, so that f_M(n) = a_n for n = 1, 2, ..., M. {a_n} is random iff no such f_M is of significantly less complexity than the list {a_1, ..., a_M}, for arbitrarily large M. when i read this, i didn't catch what "of significantly less complexity" meant.

example: {0,0,0,...} is not random because f_M(n)=0 for all n is an f_M that works for all M and gives a formula of significantly less complexity than the list {0,...,0} (where there are M 0's): f_M(n)=0 takes a few symbols for any M, but {0,...,0} takes roughly M symbols, and M gets arbitrarily large.

another way to say it is that {a_n} is not random iff the data in {a_n} is compressible, in the sense that the compressed version is "much smaller" than almost all finite initial segments of {a_n}. this is kinda like saying the n-th term is "easily" predictable.
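This compressibility intuition can be played with concretely. As an illustration only (zlib is a crude practical stand-in for the "significantly less complexity" idea, not a real Kolmogorov-complexity test; assumes Python 3.9+ for random.randbytes), compare a patterned byte string against pseudo-random bytes:

```python
import random
import zlib

def compression_ratio(data: bytes) -> float:
    """Compressed length over raw length; a low ratio exposes a pattern."""
    return len(zlib.compress(data, 9)) / len(data)

random.seed(0)
patterned = bytes([0b10101010]) * 10_000  # "10101010..." packed into bytes
noisy = random.randbytes(10_000)          # pseudo-random bytes, no visible pattern

print(compression_ratio(patterned))  # tiny: a short rule regenerates the data
print(compression_ratio(noisy))      # about 1.0: nothing for zlib to exploit
```

The patterned data plays the role of {0,0,0,...}: its compressed form is "much smaller" than the list itself, so it is not random under this definition.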

with this definition, i think that the following statements are correct:
1. you can prove that random sequences exist.
2. there are as many random sequences as non-random ones.
3. given a specific sequence, either you stumble on a way to prove it's not random or you can't tell; i don't think you can provably show it's undecidable, and i don't think you can ever prove that it is random.

if i define a sequence recursively by a_(n+1) = c a_n (1 - a_n), where c is a nonzero real number and a_0 is a real number, is {a_n} random (for most a_0) by either your definition of random or by mine? in my definition, one would have to find a formula for a_n that is basically less complex than a list of n numbers, which is something i can only do for c = -2, 2, or 4. in fact, the other formulas for a_n are much more complex than a list of n numbers.
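For c = 4 there is indeed a well-known closed form, a_n = sin^2(2^n arcsin(sqrt(a_0))). The sketch below (my own check, not from the thread) verifies that it tracks the recursion for the first few n; for larger n, floating point diverges quickly, because the factor 2^n amplifies any rounding error (sensitive dependence on initial conditions).

```python
import math

def logistic_orbit(a0, c, n):
    """Iterate a_(k+1) = c * a_k * (1 - a_k) for n steps."""
    a = a0
    for _ in range(n):
        a = c * a * (1 - a)
    return a

def closed_form_c4(a0, n):
    """Known closed form for c = 4: a_n = sin^2(2^n * arcsin(sqrt(a0)))."""
    theta = math.asin(math.sqrt(a0))
    return math.sin(2 ** n * theta) ** 2

# The closed form matches the recursion for small n.
for n in range(6):
    print(n, logistic_orbit(0.3, 4.0, n), closed_form_c4(0.3, n))
```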

do i have it right if i say a random variable is a function whose domain is a random sequence, or is it the range that is a random sequence?

is it the case that one can't have a uniform way of picking a random real number, or just that such a distribution doesn't have a density function that fits the definition of a density function? i would guess that for a uniform method of picking, the odds that the random real number is less than x are 1/2 for all x. if the method of picking more resembles the normal distribution, then it's much more likely to pick a real number near the mean... i know that P(X=x)=0, but i was wondering what P(X<x) was...


NateTG

Homework Helper
Originally posted by phoenixthoth

with this definition, i think that the following statements are correct:
1. you can prove that random sequences exist.
2. there are as many random sequences as non-random ones.
3. given a specific sequence, either you stumble on a way to prove it's not random or you can't tell; i don't think you can provably show it's undecidable, and i don't think you can ever prove that it is random.
I'm not so sure about 1. I expect that if you can prove 1, then 2 will follow as a corollary, though with many more random than non-random sequences.

3 is pretty obvious.

Hurkyl

Staff Emeritus
Gold Member
When in doubt, abstract!

Probabilities, random variables, etc., are rigorously defined through the concept of a probability measure.

Let X be a set. A measure μ on X is a function that maps subsets of X into R, satisfying the properties that

If A and B are disjoint, μ(A ∪ B) = μ(A) + μ(B)
If ∅ is the empty set, μ(∅) = 0
For any A, μ(A) ≥ 0

(Technical note: this doesn't really need to be defined for all subsets of X; also, the additive property may be required to hold for infinite unions/sums.)

When we are interested in randomness, probabilities, or whatever, we call the set X the probability space, and we call μ the probability measure. We also impose the restriction on μ that μ(X) = 1.

Some examples of probability spaces would be:

{1, 2, 3, ..., n} with μ(A) = |A| / n

R with μ([a, b]) = (1/√(2π)) ∫_a^b e^(−t²/2) dt

R with μ(A) = {1 if 0 is in A, 0 otherwise}

"Random variables" are fictitious things. When we say something like:

P(X = a) is the probability that the random variable X is equal to a

we really mean that P(X = a) is the probability measure of the set {a} in the probability space associated with this problem.

For example:

Let S = {1, 2, 3, 4, 5, 6}, and let μ(A) = |A| / 6

Now, when I say "If X is the random number generated by the roll of a fair die, then P(X is even) = 1/2", I really mean "μ({2, 4, 6}) = 1/2".

Suppose I want to roll two dice and ask the odds that their sum is 7. Well, I use the probability space S², with σ(A) = |A| / 36. (You can check that σ is the product of μ with itself.) So the question I'm really asking is "What is σ({ (x, y) | x + y = 7 })?"
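The two-dice measure can be spelled out by brute enumeration. In this sketch (the names S, pairs, and sigma are my own), counting the outcomes in the event confirms σ({(x, y) | x + y = 7}) = 6/36 = 1/6:

```python
from itertools import product

S = range(1, 7)               # sample space for one fair die
pairs = list(product(S, S))   # S^2, the space for two dice (36 outcomes)

def sigma(event):
    """The product measure: sigma(A) = |A| / 36."""
    return sum(1 for p in pairs if event(p)) / len(pairs)

print(sigma(lambda p: p[0] + p[1] == 7))  # 6/36 = 1/6
```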

You want random sequences? Then pick your probability space to be a set of sequences. No problem!

It's a little strange to deal with random things in this highly nonrandom fashion; I know I'm certainly not completely used to it yet!

phoenixthoth

are nonmeasurable sets considered events or not?

NateTG

Homework Helper
I understand what you're saying, but what I'm referring to is somewhat different. We can talk about random sequences as sequences that are non-compressible, and then ask whether there are any non-compressible sequences.

It's sufficient to deal with streams of bits.

For example:
10101010...
is obviously a compressible sequence.

Similarly, any sequence of bits where the probability of getting a 1 is not 0.5 is obviously compressible since there is less than one bit of information per bit of sequence.
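The "less than one bit of information per bit" claim is exactly the Shannon entropy H(p) = −p log₂ p − (1 − p) log₂(1 − p) of a biased bit source. A minimal sketch (my own helper name):

```python
import math

def bit_entropy(p: float) -> float:
    """Shannon entropy, in bits per symbol, of a 0/1 source with P(1) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1.0 - p) * math.log2(1.0 - p))

print(bit_entropy(0.5))  # 1.0 bit per bit: incompressible
print(bit_entropy(0.9))  # about 0.469 bits per bit: compressible roughly 2:1
```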

It's easy to show that there are non-compressible finite sequences, the question is whether there are non-compressible infinite sequences.

phoenixthoth

this seems to be a survey of some notions of the phrase "random sequence."

http://www.maa.org/news/monthly046-063.pdf

under one definition of an incompressible sequence, no incompressible sequences exist. darn! or maybe not darn. my arbitrary goal is to arbitrarily define randomness and decide, arbitrarily, if anything in the real world is truly random. the most interesting point of contention seems to involve quantum mechanics. what definition of randomness do they adopt to decide that the behavior of certain quanta is random?

jeffceth

The general consensus here on random appears to be that something is random if it has no rule or method for determining it. Perhaps this belongs in a new thread, but is it really possible to describe a number/set/whatever with such a property? It seems to me that if you are capable of describing it to the point of uniqueness, you have destroyed its random property. Does this mean it is impossible to prove the mathematical existence of such a number/set/whatever, and if not, is its nonexistence also provable?

Now obviously this all comes down to one's definition of random. The information-based definition of a non-compressible data set for a random list, while it holds up against these issues, does not seem to me to genuinely reflect common usage of the word random. It seems to me, rather, that randomness must be expressed as a property a number can have only prior to being referenced, in which case such an existence is only theoretical. However, in the usage "Given a random number n", that is, where specificity is required, I would have to say that all numbers/sets/events are random, because they are all possible outcomes of random origins. To conclude: this discussion has really drawn attention to how the word 'random' has come into common usage without a careful, exact definition. Maybe this is because the concept of 'random' is itself a random concept, and thus would no longer exist if we were to nail it down to specificity. Now that belongs in the philosophy section!

sincerely,
jeffceth

HallsofIvy

Homework Helper
The problem is that we commonly use the term "random number" or "random set of numbers" when we mean "randomly chosen number" or "randomly chosen set of numbers".

It is the process of producing the numbers that is random, not the numbers themselves.

Once you have a number or set of numbers, it is no longer "random".

kai0ty

well, as far as your last question, you have a 50% chance of it being a real number. any real number you pick, there will always be one higher; however, if you picked an imaginary number, such as sqrt(-4), then it will no longer be in the reals. that's all i know, i'm not all that great at math like some of these guys are.

B_Domstedt

Re: randomness

Originally posted by jeffceth
The general consensus here on random appears to be that something is random if it has no rule or method for determining it.

This is due to the wrong scope. You wish to assign "random" to numbers or to a process. It won't work. Try this: assign "random" as a property of the observer. It is a property of the observer, not of the generating process or of the numbers themselves. This also explains why the decimals of pi may appear "random", when they are in fact completely deterministic. True randomness then follows when every possible (computable) observer cannot see any pattern (we can show this is the case).


chiro

Re: randomness

One way of defining randomness is to use various measures of entropy.

If, in addition to ordinary entropy, you measure all the various conditional entropies, that will give you an actual quantifiable measure of how random something is.

If you don't include the various conditional entropies then you will miss any information that is related to patterns that exist between different elements of your data set.

As an example, consider a data set with 6N elements following the progression 1,2,3,4,5,6,1,2,3,4,5,6 and so on. The standard entropy calculation is at a maximum, but the conditional entropies are going to be 0; and when you have an entropy figure of 0, it means that some property of the data is deterministic in a probabilistic sense.
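The 6N example can be computed directly. In the sketch below (an illustration with my own helper names), the marginal entropy of the repeating progression is maximal at log₂ 6 ≈ 2.585 bits, while the conditional entropy H(next | current), obtained via the chain rule, is 0 because each value has exactly one successor:

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy in bits of the distribution given by a Counter."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

data = [1, 2, 3, 4, 5, 6] * 1000   # the repeating 6N progression

# Marginal entropy: all six values are equally frequent, so it is maximal.
h_marginal = entropy(Counter(data))

# Chain rule: H(next | current) = H(current, next) - H(current).
h_joint = entropy(Counter(zip(data, data[1:])))
h_current = entropy(Counter(data[:-1]))
h_conditional = h_joint - h_current

print(h_marginal)     # log2(6), about 2.585 bits
print(h_conditional)  # ~0: the next value is determined by the current one
```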
