Proof of the statement: sum of two random variables is also a random variable

In summary, the conversation discusses how to prove that the sum of two random variables is itself a random variable, and what "random" means formally. Some argue that the proof follows directly from the definition, while others point out that the nature of the addition and of the randomness must be clearly specified before the statement can be proved. The conversation also touches on the use of translation tables such as XOR in encryption, and on the idea that random selection from an unbounded set is problematic.
  • #1
seeker101
Could someone point me to a book that has a proof of the above statement?

Thanks in advance!
 
  • #2
A random variable is just a measurable function, and the sum of two measurable functions is always measurable.
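A minimal sketch of that argument, assuming the usual setup of real-valued random variables on a probability space $(\Omega, \mathcal{F}, P)$: for any $c \in \mathbb{R}$,

$$\{\omega : X(\omega) + Y(\omega) < c\} = \bigcup_{q \in \mathbb{Q}} \Big( \{\omega : X(\omega) < q\} \cap \{\omega : Y(\omega) < c - q\} \Big),$$

which is a countable union of intersections of sets in $\mathcal{F}$, hence itself in $\mathcal{F}$. Since events of the form $\{X + Y < c\}$ generate the Borel $\sigma$-algebra on $\mathbb{R}$, the function $X + Y$ is measurable, i.e. a random variable.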
 
  • #3
Hi.
That statement seems obvious to me. I don't need a proof.

If you have two variables whose outcomes are uncertain, i.e. random, then the outcome of their sum will also be uncertain, i.e. random. So the sum of two random variables is also a random variable. Don't you think? I don't even know if there is a written proof for that.
 
  • #4
artbio said:
Hi.
That statement seems obvious to me. I don't need a proof.

If you have two variables whose outcomes are uncertain, i.e. random, then the outcome of their sum will also be uncertain, i.e. random. So the sum of two random variables is also a random variable. Don't you think? I don't even know if there is a written proof for that.

Yes, but there is a formal definition for what it means to be random.

http://en.wikipedia.org/wiki/Random_variable#Formal_definition
 
  • #5
Most texts seem to state it without proof. Shiryaev sets it as an exercise: show that if f : R^n -> R is Borel and X1(w), ..., Xn(w) are random variables, then f(X1(w), ..., Xn(w)) is a random variable. This would also be useful for showing that X1 - X2, X1*X2, X1/X2 are random variables.
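A sketch of that exercise, assuming the standard definitions: the map $\omega \mapsto (X_1(\omega), \dots, X_n(\omega))$ is measurable from $(\Omega, \mathcal{F})$ into $(\mathbb{R}^n, \mathcal{B}(\mathbb{R}^n))$ because each coordinate is, so for any Borel set $B \subseteq \mathbb{R}$,

$$\{\omega : f(X_1(\omega), \dots, X_n(\omega)) \in B\} = \{\omega : (X_1(\omega), \dots, X_n(\omega)) \in f^{-1}(B)\} \in \mathcal{F},$$

since $f^{-1}(B)$ is Borel whenever $f$ is a Borel function. Taking $f(x_1, x_2) = x_1 + x_2$, which is continuous and hence Borel, gives the sum; $x_1 - x_2$, $x_1 x_2$ and so on work the same way.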
 
  • #6
It seems to me you've got to define what 'adding' means when you're talking about random variables. In fact, I'm pretty sure the proof is in this definition, because when you say X + Y you really mean the variable Z whose distribution (in the discrete case) is P(Z = z) = sum over all x, y with x + y = z of P(X = x, Y = y).

EDIT: This does seem to be a rather sneaky method
 
  • #7
Actually, that statement is false unless you define the nature of the randomness and the nature of the addition. For example, suppose I have a source of random values ranging over the interval (0, 1), OK?
Now, for large n, I select n pairs of values and add them algebraically. My output is no longer confined to that interval, and its pattern of randomness over the new interval (0, 2) is no longer the same: it now has a bias away from the extremes and toward the middle.

However, in contrast, suppose we add the numbers mod 1; now things improve, don't they?

OK, now the next step is informal (fancy for "handwaving"). Before the selection and addition, the probability of every number in the interval being selected was equal (definition of random, yes?). After the addition mod 1, the probability is still equal.

You might find it useful, instead of speaking as though the values were real, to think of the numbers in the interval as integers or simple fractions 0.0, 0.1, 0.2, ..., 0.9. Try it and see what happens.
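A quick numerical check of this (a rough Python sketch, not a proof; for the mod-1 histogram a flat distribution means every bin gets about 10%):

Code:
import random

N = 1_000_000
BINS = 10

plain = [0] * (2 * BINS)   # histogram of X + Y over (0, 2), bin width 0.1
mod1 = [0] * BINS          # histogram of (X + Y) mod 1 over (0, 1), bin width 0.1

for _ in range(N):
    x, y = random.random(), random.random()
    s = x + y
    plain[int(s * BINS)] += 1
    mod1[int((s % 1.0) * BINS)] += 1

# The plain sum piles up around 1 (triangular shape); the mod-1 sum stays flat.
print("X + Y      :", [round(c / N, 3) for c in plain])
print("(X+Y) mod 1:", [round(c / N, 3) for c in mod1])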

Cheers,

Jon
 
  • #8
@ Jon Richfield
Respectfully, I disagree.
The probability of each outcome does not need to be the same; it only needs (to quote Wikipedia) to be a function of a statistical experiment with equiprobable outcomes. In this case your function is a sum.

Furthermore, if this weren't the case, you couldn't have normal, exponential, etc. distributions, because everything would be uniform.
 
  • #9
Hi Pseudophonist,
Thanks for the reply and no special respect needed!
Firstly, I suspect that I have caused confusion by saying "Actually, that statement is false..." without specifying which statement! Did you think I was denying what *you* had said? I was denying that "adding" random values would give a random result without:
a) defining the nature of the "addition" operation in context
and also
b) defining the nature of the "randomness" in question.

From what you have written, I suspect that you have no quarrel with this point of view. I, for my part, have no quarrel with the wiki statement that you quoted, which I am sure you will agree is consistent with the view that, say, the exponential and normal distributions (and many, many others) are grossly *non*-random in a context where one expects and requires a rectangular distribution, such as one should get from the standard, vanilla random-number utilities. (In principle one can of course convert any of these distributions into any other by suitable manipulation, but that is another matter.)

But I am sure you will agree (feel welcome to puncture my assurance if I presume too far) that if, without appropriately defining "addition" as well as the "fair random" distribution, you "add" numbers drawn from most of these (generally bounded) distributions, you get, to the extent that you get anything meaningful at all, a shift in distribution that in most contexts amounts to gross bias instead of equiprobability. In general it is possible to overcome this by redefining "addition", as I did in my example by specifying mod 1 addition.

Incidentally, if one's random selection is from an unbounded set, it is very hard to speak meaningfully of a random number at all. What would you get if you selected a million random integers? In particular, would you expect any of your million to be finite?

Nasty one that! Not only food for thought, but bubble gum! :-)

An important class of such "additions" (more properly, "operations") would be suitable translation tables such as XOR or XNOR, which permit one to apply random numbers (to a suitable radix) in encryption. They have the important property that even a single application of appropriately random numbers will produce equally random output, *no matter how **non**-random the other numbers might be*.
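A small Python sketch of that XOR property (the deliberately biased byte stream and its length are arbitrary choices for illustration):

Code:
import collections
import secrets

# Heavily biased "plaintext" bytes: mostly zeros, with a 1 every tenth byte.
plaintext = bytes(1 if i % 10 == 0 else 0 for i in range(100_000))

# Uniformly random key bytes, one per plaintext byte (one-time-pad style).
key = secrets.token_bytes(len(plaintext))

# XOR the two streams byte by byte.
cipher = bytes(p ^ k for p, k in zip(plaintext, key))

# The ciphertext byte frequencies come out essentially flat (about 1/256 each),
# no matter how biased the plaintext was.
counts = collections.Counter(cipher)
print("min frequency:", min(counts.values()) / len(cipher))
print("max frequency:", max(counts.values()) / len(cipher))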

So strictly speaking, the original question was too modest. It could have been to prove that r+s -> r' where "+" is a suitable operation, r and r' are random in a suitable idiom, and s is *any* value in that idiom.

Are we now on the same wavelength?

If so I apologise for the confusion; if not, please elaborate if you have the time,

Go well,

Jon
 

1. What is a random variable?

A random variable is a numerical quantity that takes on different values based on the outcome of a random event. It is a mathematical representation of the uncertain nature of events in a probabilistic system.

2. How is a random variable different from a regular variable?

A regular variable takes on a specific value, while a random variable can take on a range of values based on the probability of certain outcomes. Random variables are also used to model uncertain or random phenomena, while regular variables are used in mathematical equations and formulas.

3. What is the sum of two random variables?

The sum of two random variables is a new random variable that represents the combined outcome of the two original variables. It takes on values based on the sum of the values of the two individual variables.

4. How do you prove that the sum of two random variables is also a random variable?

To prove that the sum of two random variables is also a random variable, we go back to the formal definition: a random variable is a measurable function on a probability space. It then suffices to show that the sum of two measurable functions is again measurable, which guarantees that the sum has a well-defined probability distribution and can be manipulated like any other random variable.

5. What are some real-world examples of the sum of two random variables?

Real-world examples of the sum of two random variables include adding the outcomes of two dice rolls, combining the results of two medical tests, or calculating the total cost of two items with different prices. Essentially, anytime two uncertain quantities are combined, the result can be represented as the sum of two random variables.
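As a worked instance of the dice example, here is a short Python sketch that enumerates all 36 equally likely outcomes and builds the exact distribution of the sum:

Code:
from fractions import Fraction
from itertools import product

# Z = X + Y for two independent fair six-sided dice:
# P(Z = z) is the sum of P(X = x) * P(Y = y) over all pairs with x + y = z.
dist = {}
for x, y in product(range(1, 7), repeat=2):
    dist[x + y] = dist.get(x + y, 0) + Fraction(1, 36)

for z in sorted(dist):
    print(f"P(X + Y = {z}) = {dist[z]}")   # peaks at z = 7 with probability 6/36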
