Could someone point me to a book that has a proof of the above statement?
Thanks in advance!
A random variable is just a measurable function, and the sum of measurable functions is always measurable.
That statement seems obvious to me. I don't need a proof.
If you have two variables whose outcomes are uncertain, i.e. random, then the outcome of their sum will also be uncertain, i.e. random. So the sum of two random variables is also a random variable. Don't you think? I don't even know if there is a written proof of that.
Yes, but there is a formal definition for what it means to be random.
Most texts seem to state it without proof. Shiryaev sets an exercise to show that if f: R^n -> R is Borel and X1(w), ..., Xn(w) are random variables, then f(X1(w), ..., Xn(w)) is a random variable. This would also be useful for showing that X1 - X2, X1*X2, and X1/X2 are random variables.
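For the sum itself there is a standard one-line argument (a sketch, not necessarily the route Shiryaev intends): write the event {X + Y < c} as a countable union over the rationals,

```latex
\{\omega : X(\omega) + Y(\omega) < c\}
  \;=\; \bigcup_{q \in \mathbb{Q}}
  \Bigl( \{\omega : X(\omega) < q\} \cap \{\omega : Y(\omega) < c - q\} \Bigr).
```

Each set on the right lies in the sigma-algebra because X and Y are measurable, and a countable union of measurable sets is measurable, so X + Y is a random variable.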
It seems to me you've got to define what 'adding' means when you're talking about random variables. In fact, I'm pretty sure the proof is in this definition, because when you say X + Y you really mean Z such that (for discrete variables) P(Z = z) = sum over all pairs with x + y = z of P(X = x, Y = y).
EDIT: This does seem to be a rather sneaky method
Actually, that statement is false unless you define the nature of the randomness and the nature of the addition. For example, suppose I have a source of random values ranging over the interval (0, 1). OK?
Now, for large n, I select n pairs of values and add them algebraically. My output is now no longer confined to that interval, and its pattern of randomness over the new interval is no longer the same. It now has a bias away from the extremes of the new interval, 0 and 2.
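A quick simulation makes the bias visible (a sketch using Python's standard `random` module; the bin widths 0.2 are an arbitrary choice):

```python
import random

random.seed(0)
n = 100_000

# Pairs of uniform (0, 1) values, added algebraically.
sums = [random.random() + random.random() for _ in range(n)]

# Compare equal-width bins near the extremes (0 and 2) with one near the middle (1).
near_edges = sum(1 for s in sums if s < 0.2 or s > 1.8)
near_middle = sum(1 for s in sums if 0.9 <= s < 1.1)

# The sum has a triangular density peaking at 1, so the middle bin
# dominates even though all bins have the same total width.
print(near_edges, near_middle)
```

The middle bin collects roughly five times as many samples as the two edge bins combined, so the sum is clearly not uniform on (0, 2).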
However, in contrast, suppose we add the numbers mod 1; now things improve, don't they?
OK, now the next step is informal (fancy for "hand-waving"). Before the selection and addition, the probability of every number in the interval being selected was equal (the definition of random here, yes?). Now, after the addition mod 1, the probability still is equal.
You might find it useful, instead of speaking as though the values were real, to think of the numbers in the interval as integers or simple fractions: 0.0, 0.1, 0.2, ..., 0.9. Try it and see what happens.
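The mod-1 version of the same simulation bears this out (again a sketch with Python's standard `random` module; ten bins is an arbitrary choice matching the 0.0, 0.1, ..., 0.9 picture above):

```python
import random

random.seed(1)
n = 100_000

# Add pairs of uniform (0, 1) values modulo 1.
vals = [(random.random() + random.random()) % 1 for _ in range(n)]

# Bin into ten equal-width cells; a uniform distribution should put
# roughly n / 10 = 10_000 samples in each.
counts = [0] * 10
for v in vals:
    counts[int(v * 10)] += 1
print(counts)
```

Every bin comes out close to 10,000, consistent with the claim that addition mod 1 preserves the uniform distribution.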
@ Jon Richfield
Respectfully, I disagree.
The probability of each outcome does not need to be the same; it only needs (to quote Wikipedia) to be a function of a statistical experiment with equiprobable outcomes. In this case your function is a sum.
Furthermore, if this weren't the case, you couldn't have normal, exponential, etc. distributions, because everything would be uniform.
Thanks for the reply and no special respect needed!
Firstly, I suspect that I have caused confusion by saying "Actually, that statement is false..." without specifying which statement! Did you think I was denying what *you* had said? I was denying that "adding" random values would give a random result without:
a) defining the nature of the "addition" operation in context
b) defining the nature of the "randomness" in question.
From what you have written, I suspect that you have no quarrel with this point of view. For my part, I have no quarrel with the Wikipedia statement you quoted, which, I am sure you will agree, is consistent with the view that, say, the exponential and normal distributions (and many, many others) are grossly *non*-random in a context where one expects and requires a rectangular distribution, such as one should get from the standard, vanilla random-number utilities. (In principle one can of course convert any of these distributions into any other by suitable manipulation, but that is another matter.)
But I am sure you agree (feel welcome to puncture my assurance if I presume too far) that if, without appropriately defining "addition" as well as the "fair random" distribution, you "add" numbers drawn from most of these (generally bounded) distributions, then, to the extent that you get anything meaningful at all, you get a shift in distribution that in most contexts amounts to gross bias instead of equiprobability. In general it is possible to overcome this by redefining "addition", as I did in my example by specifying mod 1 addition.
Incidentally, if one's random selection is from an unbounded set, it is very hard to speak meaningfully of a random number at all. What would you get if you selected a million random integers? In particular, would you expect any of your million to be finite?
Nasty one that! Not only food for thought, but bubble gum! :-)
An important class of such "additions" (more properly speaking, "operations") consists of suitable translation tables such as XOR or XNOR, which permit one to apply random numbers (to a suitable radix) in encryption. They have the important property that even a single application of appropriately random numbers will produce equally random numbers as output, *no matter how **non**-random the other numbers might be*.
So strictly speaking, the original question was too modest. It could have been to prove that r + s -> r', where "+" is a suitable operation, r and r' are random in a suitable idiom, and s is *any* value in that idiom.
Are we now on the same wavelength?
If so, I apologise for the confusion; if not, please elaborate if you have the time.