The mean of a sum of variables.

  • Context: Undergrad 
  • Thread starter: Mppl
  • Tags: Mean, Sum, Variables

Discussion Overview

The discussion centers on proving that the mean of a random variable Z, defined as the sum of two other random variables X and Y, equals the sum of the means of X and Y. The scope includes theoretical aspects of probability and integration related to random variables.

Discussion Character

  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant asks how to prove that the mean of Z = X + Y equals the sum of the means of X and Y.
  • Another participant suggests that this theorem is analogous to the property that the integral of a sum is the sum of the integrals.
  • A participant expresses difficulty in relating the integral property to the mean of the sum of random variables and mentions encountering a convolution integral.
  • It is proposed that a proof can be established for independent random variables using a change of variables and reversing the order of integration.
  • For dependent variables, a suggestion is made to consider the joint and marginal distributions of X and Y.
  • One participant provides a mathematical expression for the expected value of the sum of two random variables, indicating how to separate the integrals for X and Y.

Areas of Agreement / Disagreement

Participants express varying degrees of understanding and ability to prove the theorem, with some uncertainty regarding the application to dependent versus independent variables. No consensus is reached on a definitive proof.

Contextual Notes

Some participants mention challenges with convolution integrals and the need for specific proofs, indicating potential limitations in their current understanding or approach.

Mppl
How do I prove that the mean of a random variable Z, which is the sum of two other random variables X and Y, is the sum of the mean of X and the mean of Y?
 
Essentially the theorem is equivalent to the theorem that the integral of a sum is the sum of the integrals.
 
Well, I obviously know that the integral of a sum is the sum of the integrals, but I don't know how I can relate that to the situation I mentioned. Can you please be more specific?
I'm trying to prove it and I'm getting a convolution integral so far...
thank you.
 
Mppl said:
Well, I obviously know that the integral of a sum is the sum of the integrals, but I don't know how I can relate that to the situation I mentioned. Can you please be more specific?
I'm trying to prove it and I'm getting a convolution integral so far...
thank you.

Yes that'll eventually give you a proof for the special case where the rv's are independent and have densities (involves reversing the order of integration and a change of variables).
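That route can be written out explicitly. A sketch of the independent case, assuming X and Y have densities f_X and f_Y (so Z = X + Y has the convolution density f_Z(z) = ∫ f_X(x) f_Y(z − x) dx):

```latex
\begin{aligned}
E[Z] &= \int_{-\infty}^{\infty} z \, f_Z(z)\, dz
      = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} z \, f_X(x)\, f_Y(z-x)\, dx\, dz \\
     &= \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} (x+y)\, f_X(x)\, f_Y(y)\, dx\, dy
        \qquad (\text{substituting } y = z - x) \\
     &= \int x\, f_X(x)\, dx \int f_Y(y)\, dy \;+\; \int f_X(x)\, dx \int y\, f_Y(y)\, dy
      = E[X] + E[Y].
\end{aligned}
```

The swap in the order of integration is justified by Fubini's theorem whenever E|X| and E|Y| are finite.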

Another approach that would work for the non-independent case is to consider separately the joint distribution and marginal distributions of X and Y.
 
Well, I'm not able to prove it for either dependent or independent variables. Can you please show me the proof, or tell me where I can find it? Thank you.
 
Two random variables.
E(X+Y) = ∫∫ (x+y) dF(x,y) = ∫∫ x dF(x,y) + ∫∫ y dF(x,y).
Integrate with respect to y in the first integral and integrate with respect to x in the second integral. You will be left with E(X) + E(Y).

In the above F(x,y) is the joint distribution function.
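Not part of the thread, but a quick Monte Carlo sanity check of that identity. The particular dependence used here (Y = X² plus noise) is an arbitrary choice purely for illustration; any joint distribution would do, since linearity of expectation does not require independence:

```python
import random

random.seed(0)
N = 200_000

# X ~ Normal(1, 1); Y = X^2 + uniform noise, so X and Y are strongly dependent.
xs = [random.gauss(1.0, 1.0) for _ in range(N)]
ys = [x * x + random.uniform(-0.5, 0.5) for x in xs]

mean_x = sum(xs) / N
mean_y = sum(ys) / N
mean_z = sum(x + y for x, y in zip(xs, ys)) / N

# E(X+Y) = E(X) + E(Y) holds exactly; the sample means agree
# up to floating-point rounding, despite the dependence.
print(abs(mean_z - (mean_x + mean_y)) < 1e-6)
```
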
 
