# Is this system random or not?

1. May 3, 2012

My friend and I were having a discussion, and we both can't seem to see the other side's point of view. The question was whether a certain "operation" is random or not. This is what it is:

Suppose you have an input, it doesn't matter what it is. The first "operation" is just a completely random number generator - and yes, assume for the sake of the discussion it is completely random. Even if you plug in the same number twice, you could get different numbers out. Random. Next, you add 1 to that number.

I said, since the final output depended on a random input partly through the process, it was random. He replied that since you added one to it, the output was relying on a non-random operation, therefore it is not random.

I'm confused about how that would make it un-random. Since the odds for every number coming out are still the same, shouldn't it be random? Or am I missing something, like the definition of random? Thanks for your help!

2. May 3, 2012

### Office_Shredder

Staff Emeritus
Random generally means non-deterministic, i.e. if I tell you what the input is, you cannot tell me what the output is before I calculate it. This process is clearly not deterministic, so it is random.

In this case specifically, the add-one operation is completely superfluous. What is the difference between picking a number between zero and one at random and adding one, versus picking a number between one and two at random?
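To make this concrete, here is a quick sketch (Python, purely illustrative; the seed is fixed only so the demo is reproducible) showing that "uniform on [0, 1) plus 1" and "uniform on [1, 2)" are statistically indistinguishable:

```python
import random

random.seed(0)  # fixed seed so the demo is reproducible

# Method A: pick uniformly in [0, 1), then add 1.
a = [random.random() + 1 for _ in range(100_000)]

# Method B: pick uniformly in [1, 2) directly.
b = [random.uniform(1, 2) for _ in range(100_000)]

# Both samples live in [1, 2) and have (approximately) the same mean.
print(min(a) >= 1 and max(a) < 2)  # True
print(min(b) >= 1 and max(b) < 2)  # True
print(abs(sum(a) / len(a) - sum(b) / len(b)) < 0.01)  # True: both means near 1.5
```

Adding 1 just relabels the outcomes; it changes nothing about how unpredictable they are.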

3. May 3, 2012

That's what I was telling him, but he said that since the output then "depended on" the non-random step, it was not random. Thanks for your support, I'll show this to him :).

4. May 3, 2012

### dcpo

This is the kind of situation where mathematics really shines, as it allows rather vague concepts to be defined and discussed objectively. Clearly the issue here is that neither you nor your friend has a clear idea of what randomness should mean, so both of you are making arguments that appeal to your own intuition, which will be difficult to convey to the other person. If we define randomness along the lines of non-determinism, as suggested by Office_Shredder, then I'd agree with you, as the resulting process is clearly non-deterministic, though I think your argument as stated is flawed. For example, you could define a process that first picked a random number, then mapped every number to zero. That process would involve a 'random' stage, but the output would clearly be deterministic.

As an aside, the odds of each number coming out may not be well defined, depending on what the range of possible outputs is. For example, if you range over the whole set of natural numbers, and you want each number to be 'equally likely', how do you define the probability of a given number being produced?

5. May 3, 2012

### chiro

Hey Freespader and welcome to the forums.

The easiest way to think about this is to define the random variable Y = X + 1, where X is the random variable produced by the generator.

Given that X is random, Y is also random, even though you add 1. In the end we are measuring Y, which is random by the reasoning above.

The confusion that many people have is that once a value is 'realized' it is no longer random but known. A realization is just one of many possible outcomes, and if we repeated the process over and over, the empirical results would converge to the distribution of Y (which is just the distribution of X shifted by 1).
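The realization point can be seen in a small simulation (Python, illustrative only; X is assumed uniform on [0, 1) just as an example). Each individual draw of Y is a fixed, known number once realized, but the collection of draws traces out the distribution of Y = X + 1:

```python
import random

random.seed(1)  # fixed seed only so the demo is reproducible

# Many realizations of Y = X + 1, with X uniform on [0, 1).
ys = [random.random() + 1 for _ in range(200_000)]

# Each realization is just a known number, but the empirical distribution
# converges to that of Y: uniform on [1, 2), with mean E[Y] = E[X] + 1 = 1.5.
mean_y = sum(ys) / len(ys)
print(abs(mean_y - 1.5) < 0.01)  # True

# About half the realizations fall in the lower half [1, 1.5).
frac_lower = sum(y < 1.5 for y in ys) / len(ys)
print(abs(frac_lower - 0.5) < 0.01)  # True
```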

6. May 4, 2012

I take AP stats and my father did university work on functions, so I guess I can give you a decent explanation: in short, I think your friend has the more correct idea of what's happening.
In order for something to be random, it cannot be defined by a function. I will take the function you used: f(x) = x + 1. The domain of x could be ANYTHING, but the fact that f(x) is definable makes it not random. So even if the domain of x is a set of random numbers, the range is defined by the +1 operation. For something to be random, it cannot be determined by ANYTHING. f(x) can be determined by adding one to any of the original random set elements, so it is not random.
That is why, if you go on random.org (a common random # generator), they make such a big deal about how the #'s are taken from a situation of complete entropy. There is NO way one can ever trace the output to its input.
In chaos theory, 'random' phenomena really are not random if one can describe their behavior by writing a function. If you can define something, it is not random. See Chaos (Gleick) for more info.
This is also why 'random' # generators on calculators are really pseudorandom. The fact that the calculator is using an algorithm makes it not random, because the calculator is taking something (who knows what, but it is taking something), which gives its operation a domain. This is why many stats teachers will smile when talking about using technology as a substitute for a random # table. It is possible for the whole class to get the same 'random' # set because the seed on all the calculators may be set to the same value. This would never occur in an instance of true randomness.
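The seed point is easy to demonstrate (a Python sketch, with the `random` module standing in for the calculator's algorithm): a pseudorandom generator is deterministic, so two generators started from the same seed produce identical "random" sequences.

```python
import random

# Two "calculators" (independent generators) seeded identically.
gen1 = random.Random(12345)
gen2 = random.Random(12345)

seq1 = [gen1.randint(1, 100) for _ in range(5)]
seq2 = [gen2.randint(1, 100) for _ in range(5)]

# The whole "class" gets the same numbers: pseudorandom, not random.
print(seq1 == seq2)  # True

# A different seed gives a different (but still fully deterministic) sequence.
gen3 = random.Random(54321)
seq3 = [gen3.randint(1, 100) for _ in range(5)]
print(seq1 == seq3)  # almost certainly False
```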
I hope this helps address the issue.

7. May 4, 2012

### Diffy

I am not sure about this. I appreciate how far along you are in math, with AP Stats under your belt and everything, but let me try to convince you otherwise.

According to the OP, the first stage is a completely random number generator. I read this as: there is some process that takes an input and produces a random number.

We can call the input a, and the process will be R(a).

X = R(a)

So if you give me an input a, I can give you a random number X.

Now Y = X + 1. Or Y = R(a) + 1

Since we all agree that X is random, if you give me a, you won't be able to guess what X will be, will you?

If you give me a, will you be able to guess what Y will be?

Here is another way to view it.

Say we produce a whole bunch of Y values. Y1, Y2, Y3...

And put them in a sequence in the order we produced them

{Y1, Y2, Y3, Y4, ...}

Is this not a random sequence? If you saw the first four Y values, would you be able to predict what the next Y is going to be?

8. May 4, 2012

Hey there Diffy:
The thing is that R(a) does not produce a random number. The function R(a) assigns some value to any a. Since there is an assignment, whatever 'a' is, something certain will come out of it; that's because we have a function. Something random can never be considered certain, because then it would be arising from a function, and that function has some domain. It may take me 100,000 years with the most powerful supercomputer, but I would be able to produce an inverse function which could find the source of R(a): 'a'. And if we know what 'a' is, there is no way R(a) is random. And since R(a) is not random, neither is X.
Substituting y = x + 1 with y = R(a) + 1 basically makes y a composition of functions. Really, y is then f(R(a)). And by the above argument, y is also non-random.

As for the sequences, I am sure one could find a function which describes the first four terms. There may even be more than one function which describes those four terms, but once I have a function, I can proceed with algebra to figure out what x could be. I could find multiple x values, even more so since there could be more than one applicable function, but since I have some clue as to what x could be, the sequence is not random. This is because I have found a link to Y's domain; if Y were random, I would never be able to do so.

But here is something which may truly be random, again concerning your sequences. Suppose that you did not 'produce' the Ys, as you said. Then I would never know whether they were random, because NO test exists that can prove a set could not have come from any domain. There would be no way to find out whether the set was random if one cannot figure out where it came from.

9. May 4, 2012

### dcpo

@MadViolinist: I'm not sure you're looking at this in a particularly helpful way. We are not constrained to functions, and a truly random process is explicitly assumed. The only solid definition of randomness so far offered is the notion of non-determinism, and the composition of a non-deterministic process with the function described in the OP will still result in a non-deterministic process. You can introduce a competing definition of randomness if you want, but you'll have to define it clearly and motivate it for the discussion in question.

10. May 4, 2012

I think I can clarify the discussion a little: Madviolinist is my friend.

Nice try, by the way.

To be honest, I think you're missing the point of my argument entirely, which would explain why we're still arguing. The point of R(a) is that it IS random. The only real "function" for it would be something like mapping electron orbits: something genuinely random that no supercomputer, given 100 million years, could turn into a function.

Also, I think dcpo has the right idea. You seem to be using a different definition. Mine is simply: for a given domain (be it restricted, or all reals, or what have you), the odds of each output are the same as any other. When you add one, then, you might need to shift the domain, but shouldn't the probability of each output stay the same? Since each one would still be as likely as the next, according to my definition it would be random, correct?

@Diffy, that was exactly what I was thinking :)

11. May 4, 2012

### dcpo

Just a point about notation: the 'domain' of a function is the set of values you can feed into it, and the set of possible results of applying the function is its 'range'. Formally we would write something like f:X->Y to say that f is a function mapping elements of X to elements of Y. In this case we don't really have a function in the first stage (unless we take an unusual definition of a function that includes non-determinism); we just have a random process selecting a member of some set X, which we compose with the function f:X->X defined by f(x)=x+1. The random process doesn't need a domain, because it's not a function.

12. May 4, 2012

Back to the original post, the question was "Since the odds for every number coming out are still the same, shouldn't it be random?".

I am thinking along the lines of the Principle of Insufficient Reason-
"that if we are ignorant of the ways an event can occur (and therefore have no reason to believe that one way will occur preferentially compared to another), the event will occur equally likely in any way."
Weisstein, Eric W. "Principle of Insufficient Reason." From MathWorld--A Wolfram Web Resource. mathworld.wolfram.com/PrincipleofInsufficientReason.html

My rationale is that since we are adding 1 to a random number, we know the way the event will occur. Thus, the event is not equally likely to occur in any way.

13. May 4, 2012

So if a random process is not a function, then is f(x) = x + 1 a random process? Please tell me more :)

14. May 4, 2012

What I mean to say, in response to dcpo, is that if f(x) is not a random process, then isn't f(any random x) not random? Because aren't we choosing what we are feeding into the function, albeit via a random process, which makes the range determined? I do not understand how randomness and non-determinism aren't one and the same.

15. May 4, 2012

### dcpo

As I said earlier, close to the root of the problem here is an unclear definition of what it means to be random. The function defined by f(x)=x+1 is certainly not random by any reasonable definition; it is a simple and well-defined function (subject to a suitable choice of domain). But the whole process of non-deterministically producing a number, then deterministically adding one to it, is still non-deterministic.

16. May 4, 2012

Hey dcpo:

So, to see if I understand, if you have a random number and add one to it, even though we are applying an operation to that random number, the resulting number is not determined by the first one?

Because then could we not subtract one from the resulting number and arrive at the random number, thus defining our original random number and contradicting ourselves?

17. May 4, 2012

### dcpo

Having produced a number, the result of adding 1 to it is deterministic, since once you have a number, it is clearly defined what that number plus 1 will be. But the process of generating a random number and then adding 1 to it is non-deterministic, because you cannot predict what you will end up with.
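The two-stage structure can be phrased in code (a Python sketch; the `draw` function is a stand-in for the truly random stage, which no real program has — Python's pseudorandom generator is used purely as a placeholder):

```python
import random

def draw():
    # Placeholder for the genuinely random stage of the process.
    # (In reality Python's generator is pseudorandom.)
    return random.random()

def add_one(x):
    # Deterministic stage: once x is realized, x + 1 is fully determined.
    return x + 1

# The WHOLE process (draw, then add one) is unpredictable before it runs...
y = add_one(draw())
print(1 <= y < 2)  # True, but WHICH value in [1, 2) was unpredictable

# ...while the second stage alone is deterministic and even invertible:
# given the realized x, add_one always returns the same thing.
x = 0.25
assert add_one(x) == 1.25
assert add_one(x) - 1 == x  # subtracting 1 recovers x, but only AFTER realization
```

Being able to invert the second stage after the fact does not let you predict the composite output beforehand, which is the sense of "random" being used here.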

18. May 4, 2012

Hey dcpo:

Ohh. So if the process of generating a random number includes adding 1 to it, it is non-deterministic? So there really is no function involved, like the Y = X + 1 my friend was talking about?
Also, just curious, what is your definition of randomness?
Also, just curious, what is your definition of randomness?

Thanks again.

19. May 4, 2012

### dcpo

I am using non-determinism as my definition of randomness. A function is involved in the construction defined in the OP. We are generating a random number, then we are adding one to it. Remember that this is not a real situation that we are describing, it's a vague idea that we are trying to make precise. I am thinking in terms of a process generating a random number, composed with the function defined by f(x)=x+1, because to me this concept is clear, and it appears to capture the essence of the situation in the OP.

20. May 4, 2012

### Office_Shredder

Staff Emeritus