## Proof of the statement: sum of two random variables is also a random variable

Could someone point me to a book that has a proof of the above statement?

 A random variable is just a measurable function, and the sum of two measurable functions is always measurable.
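A sketch of the standard argument behind that one-liner, assuming X and Y are defined on the same probability space:

```latex
% For any c \in \mathbb{R}, decompose the event over the rationals:
\{\omega : X(\omega) + Y(\omega) < c\}
  = \bigcup_{q \in \mathbb{Q}} \{\omega : X(\omega) < q\} \cap \{\omega : Y(\omega) < c - q\}
% Each set on the right is measurable (X and Y are random variables),
% and the union is countable, so the left-hand side is measurable.
% Sets of the form (-\infty, c) generate the Borel sigma-algebra,
% hence X + Y is a random variable.
```

The rationals are what make this work: they are countable yet dense, so the countable union still captures every ω with X(ω) + Y(ω) < c.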
 Hi. That statement seems obvious to me; I don't need a proof. If you have two variables whose outcomes are uncertain, i.e. random, then the outcome of their sum will also be uncertain, i.e. random. So the sum of two random variables is also a random variable. Don't you think? I don't even know if there is a written proof of that.

 Quote by artbio: Hi. That statement seems obvious to me; I don't need a proof. If you have two variables whose outcomes are uncertain, i.e. random, then the outcome of their sum will also be uncertain, i.e. random. So the sum of two random variables is also a random variable. Don't you think? I don't even know if there is a written proof of that.
Yes, but there is a formal definition for what it means to be random.

http://en.wikipedia.org/wiki/Random_...mal_definition
 Most texts seem to state it without proof. Shiryaev sets an exercise to show that if f: R^n → R is Borel and X1(ω), ..., Xn(ω) are random variables, then f(X1(ω), ..., Xn(ω)) is a random variable. This would be useful for showing that X1 − X2, X1·X2, and X1/X2 are also random variables.
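Shiryaev's exercise essentially reduces to one preimage computation; a sketch:

```latex
% Write X(\omega) = (X_1(\omega), \dots, X_n(\omega)), a map \Omega \to \mathbb{R}^n.
% X is measurable because each coordinate X_i is a random variable.
% For any Borel set B \subseteq \mathbb{R},
(f \circ X)^{-1}(B) = X^{-1}\bigl(f^{-1}(B)\bigr)
% f^{-1}(B) is Borel since f is Borel, so the right-hand side is measurable.
% Taking f(x_1, x_2) = x_1 + x_2 (continuous, hence Borel) recovers the sum;
% f(x_1, x_2) = x_1 x_2 gives the product, and so on.
```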
 It seems to me you've got to define what "adding" means when you're talking about random variables. In fact, I'm pretty sure the proof is in this definition: pointwise, Z = X + Y means Z(ω) = X(ω) + Y(ω), so in the discrete case P(Z = z) = Σ_x P(X = x, Y = z − x). EDIT: This does seem to be a rather sneaky method.
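That sum over x can be computed directly. A minimal sketch, assuming X and Y are independent and discrete (the function name `sum_distribution` is mine, not from any library):

```python
from collections import defaultdict
from fractions import Fraction

def sum_distribution(px, py):
    """Distribution of Z = X + Y for independent discrete X and Y,
    each given as a dict mapping value -> probability."""
    pz = defaultdict(Fraction)
    for x, p in px.items():
        for y, q in py.items():
            pz[x + y] += p * q  # independence: P(X=x, Y=y) = P(X=x) P(Y=y)
    return dict(pz)

# Example: two fair dice, P(X = k) = 1/6 for k = 1..6
die = {k: Fraction(1, 6) for k in range(1, 7)}
total = sum_distribution(die, die)
# P(X + Y = 7) = 6/36 = 1/6, the most likely total
```

Note the independence assumption: without it, the joint probabilities P(X = x, Y = y) are needed and the product in the loop is not valid.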
 Actually, that statement is false unless you define the nature of the randomness and the nature of the addition.

For example, suppose I have a source of random values ranging over the interval (0, 1), OK? Now, for large n, I select n pairs of values and add them algebraically. My output now is no longer over that interval, and its pattern of randomness over the new interval is no longer the same: it now has a bias away from the extremes of 0 and 2, piling up near the middle.

However, in contrast, suppose we add the numbers mod 1; now things improve, don't they? OK, the next step is informal (fancy for "handwaving"). Before the selection and addition, the probability of every number in the interval being selected was equal (definition of random, yes?). After the addition mod 1, the probability still is equal.

You might find it useful, instead of speaking as though the values were real, to think of the numbers in the interval as integers or simple fractions: 0.0, 0.1, 0.2, ..., 0.9. Try it and see what happens.

Cheers, Jon
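Jon's two claims are easy to check empirically. A quick simulation sketch (the threshold values assume the sum of two independent uniforms has the triangular density on (0, 2)):

```python
import random

random.seed(0)
N = 100_000

# Pairs of uniform(0, 1) values, added algebraically
sums = [random.random() + random.random() for _ in range(N)]
# The same sums taken mod 1
mod_sums = [s % 1.0 for s in sums]

# Plain sums live on (0, 2) with a triangular density peaked at 1:
# the middle half-unit holds ~43.75% of the mass, the two extreme
# quarter-units together only ~6.25%.
mid = sum(1 for s in sums if 0.75 < s < 1.25) / N
tail = sum(1 for s in sums if s < 0.25 or s > 1.75) / N

# After reduction mod 1 the values are uniform on (0, 1) again:
# any quarter-unit window holds ~25% of the mass.
mod_mid = sum(1 for s in mod_sums if 0.375 < s < 0.625) / N
```

So both of Jon's observations hold: the algebraic sum is biased toward the middle of its new interval, while the sum mod 1 is uniform again.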
 @Jon Richfield: Respectfully, I disagree. The probability of each outcome does not need to be the same; a random variable only needs (to quote Wikipedia) to be a function of a statistical experiment with equiprobable outcomes. In this case your function is a sum. Furthermore, if this weren't the case, you couldn't have normal, exponential, etc. distributions, because everything would be uniform.