The mean of a sum of variables.

AI Thread Summary
To prove that the mean of a sum of random variables Z = X + Y equals the sum of their means, E(Z) = E(X) + E(Y), one can use the joint and marginal distributions. The proof integrates x + y against the joint distribution function F(x,y), splits the integral into two pieces, and reduces each piece to a marginal expectation, giving E(X + Y) = E(X) + E(Y). For independent variables with densities, the same result follows from a convolution argument (reversing the order of integration and changing variables). For dependent variables, working directly with the joint distribution still suffices. Either way, the mean of a sum is the sum of the means.
Mppl
How do I prove that the mean of a random variable Z, which is the sum of two other random variables X and Y, is the sum of the mean of X and the mean of Y?
 
Essentially the theorem is equivalent to the theorem that the integral of a sum is the sum of the integrals.
 
Well, I obviously know that the integral of a sum is the sum of the integrals, but I don't see how to relate that to the situation I mentioned. Can you please be more specific?
I'm trying to prove it and I'm getting a convolution integral so far...
Thank you.
 
Mppl said:
Well, I obviously know that the integral of a sum is the sum of the integrals, but I don't see how to relate that to the situation I mentioned. Can you please be more specific?
I'm trying to prove it and I'm getting a convolution integral so far...
Thank you.

Yes, that will eventually give you a proof for the special case where the random variables are independent and have densities (it involves reversing the order of integration and a change of variables).
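As a concrete sanity check of that independent case (an assumed example, not part of the thread's argument): if X and Y are independent and Uniform(0,1), the density of Z = X + Y is the convolution of the two densities, and integrating z against it should recover E(X) + E(Y) = 0.5 + 0.5 = 1. A minimal numerical sketch:

```python
import numpy as np

# Assumed example: X, Y independent Uniform(0,1), so E(X) = E(Y) = 0.5.
dx = 0.001
x = (np.arange(0, 1000) + 0.5) * dx  # midpoints of the grid on [0, 1]
f = np.ones_like(x)                  # density of X on [0, 1]
g = np.ones_like(x)                  # density of Y on [0, 1]

# Density of Z = X + Y via numerical convolution of the two densities.
h = np.convolve(f, g) * dx
z = (np.arange(len(h)) + 1) * dx     # sum of two midpoints at offset k is (k + 1) * dx

E_Z = np.sum(z * h) * dx
print(E_Z)  # ≈ 1.0 = E(X) + E(Y)
```

This only verifies the theorem for one pair of distributions, of course; the convolution route becomes a proof once you reverse the order of integration and change variables, as described above.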

Another approach, which also works in the non-independent case, is to consider the joint distribution together with the marginal distributions of X and Y.
 
Well, I haven't been able to prove it for either dependent or independent variables. Can you please show me the proof or tell me where I can find it? Thank you.
 
Take two random variables with joint distribution function F(x,y). Then
$$E(X+Y)=\int\!\!\int (x+y)\,dF(x,y)=\int\!\!\int x\,dF(x,y)+\int\!\!\int y\,dF(x,y).$$
Integrate with respect to y in the first integral and with respect to x in the second; integrating out one variable leaves the marginal distribution of the other, so
$$\int\!\!\int x\,dF(x,y)=\int x\,dF_X(x)=E(X),\qquad \int\!\!\int y\,dF(x,y)=\int y\,dF_Y(y)=E(Y).$$
You are left with E(X) + E(Y). Note that nothing here requires X and Y to be independent.
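Since the proof above does not require independence, it can be illustrated numerically even for strongly dependent variables. A minimal Monte Carlo sketch (an assumed example: Y is constructed to depend on X, with E(X) = 1 and E(Y) = 2):

```python
import numpy as np

# Assumed example: Y depends strongly on X, yet E(X+Y) = E(X) + E(Y) still holds.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.normal(loc=1.0, scale=1.0, size=n)      # E(X) = 1
y = 2.0 * x + rng.normal(loc=0.0, scale=1.0, size=n)  # E(Y) = 2, Cov(X, Y) != 0

mean_of_sum = np.mean(x + y)            # ≈ 3
sum_of_means = np.mean(x) + np.mean(y)  # ≈ 3, equal up to floating-point rounding
print(mean_of_sum, sum_of_means)
```

The simulation is not a proof, but it shows the identity holding despite the correlation between X and Y, which is exactly what the joint-distribution argument predicts.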
 
