The mean of a sum of variables.

  • Context: Undergrad 
  • Thread starter: Mppl
  • Tags: Mean, Sum, Variables
SUMMARY

The mean of a sum of random variables, specifically Z = X + Y, is proven to be the sum of their means, E(Z) = E(X) + E(Y). This theorem holds true for both independent and dependent random variables. For independent variables, the proof involves reversing the order of integration and applying a change of variables. In the case of dependent variables, one must analyze the joint distribution and marginal distributions of X and Y using the joint distribution function F(x,y).

PREREQUISITES
  • Understanding of random variables and their means
  • Familiarity with joint distribution functions
  • Knowledge of integration techniques in probability theory
  • Concept of marginal distributions
NEXT STEPS
  • Study the properties of joint distribution functions in probability theory
  • Learn about marginal distributions and their applications
  • Explore integration techniques relevant to probability, such as Fubini's theorem
  • Investigate proofs of the linearity of expectation for random variables
USEFUL FOR

Students and professionals in statistics, data science, and mathematics who are looking to deepen their understanding of the properties of random variables and their means.

Mppl
How do I prove that the mean of a random variable Z, which is the sum of two other random variables X and Y, is the sum of the mean of X and the mean of Y?
 
Essentially the theorem is equivalent to the theorem that the integral of a sum is the sum of the integrals.
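
For instance, in the discrete case the equivalence looks like this (a minimal sketch, assuming X and Y have a joint pmf p(x,y)):

E(X+Y) = Σ_x Σ_y (x+y) p(x,y) = Σ_x Σ_y x p(x,y) + Σ_x Σ_y y p(x,y) = E(X) + E(Y),

where summing out y in the first double sum leaves Σ_x x p_X(x) over the marginal of X, and similarly for the second. The continuous case works the same way with integrals in place of sums.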
 
Well, I obviously know that the integral of a sum is the sum of the integrals, but I don't know how to relate that to the situation I mentioned. Can you please be more specific?
I'm trying to prove it and I'm getting a convolution integral so far...
Thank you.
 
Mppl said:
Well, I obviously know that the integral of a sum is the sum of the integrals, but I don't know how to relate that to the situation I mentioned. Can you please be more specific?
I'm trying to prove it and I'm getting a convolution integral so far...
Thank you.

Yes, that'll eventually give you a proof for the special case where the rv's are independent and have densities (it involves reversing the order of integration and a change of variables).
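
To fill in that sketch (my notation; assuming X and Y are independent with densities f_X and f_Y, so Z = X + Y has the convolution density):

f_Z(z) = ∫ f_X(x) f_Y(z−x) dx, hence
E(Z) = ∫ z f_Z(z) dz = ∫∫ z f_X(x) f_Y(z−x) dx dz.

Reverse the order of integration and substitute y = z − x (dz = dy for fixed x):

E(Z) = ∫∫ (x+y) f_X(x) f_Y(y) dy dx = ∫ x f_X(x) dx · ∫ f_Y(y) dy + ∫ f_X(x) dx · ∫ y f_Y(y) dy = E(X) + E(Y),

since each density integrates to 1.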

Another approach that would work for the non-independent case is to consider separately the joint distribution and marginal distributions of X and Y.
 
Well, I'm not able to prove it for either dependent or independent variables. Can you please show me the proof or tell me where I can find it? Thank you.
 
For two random variables,
E(X+Y) = ∫∫(x+y) dF(x,y) = ∫∫x dF(x,y) + ∫∫y dF(x,y).
Integrate with respect to y in the first integral and with respect to x in the second: integrating out y leaves the marginal distribution of X, so the first term reduces to ∫x dF_X(x) = E(X), and similarly the second term gives E(Y). You are left with E(X) + E(Y).

In the above F(x,y) is the joint distribution function.
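
As a quick numerical sanity check of the dependent case, here is a minimal Python sketch. The particular distributions (Normal for X, and Y = X² + noise so that Y depends strongly on X) are my own illustrative choices, not from the thread:

```python
import numpy as np

# Monte Carlo illustration that E(X + Y) = E(X) + E(Y) holds even for
# dependent variables.  Here Y = X**2 + noise, so X and Y are strongly
# dependent.  (These specific distributions are illustrative assumptions.)
rng = np.random.default_rng(seed=0)
n = 1_000_000

x = rng.normal(loc=1.0, scale=2.0, size=n)  # X ~ Normal(mean=1, sd=2), so E(X) = 1
y = x**2 + rng.normal(size=n)               # E(Y) = E(X^2) = Var(X) + E(X)^2 = 5

print(np.mean(x + y))  # ~ 6.0, matching E(X) + E(Y) = 1 + 5 despite the dependence
```

Note that independence is never used: the sample average of x + y lands on E(X) + E(Y) purely because of the linearity argument above.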
 
