The mean of a sum of variables.

In summary, to prove that the mean of a random variable Z, defined as the sum of two other random variables X and Y, equals the mean of X plus the mean of Y, you can use the theorem that the integral of a sum is the sum of the integrals. This can be done by considering the joint distribution and the marginal distributions of X and Y. If the variables are independent and have densities, you can also use a change of variables and reverse the order of integration.
  • #1
How do I prove that the mean of a random variable Z, which is the sum of two other random variables X and Y, equals the mean of X plus the mean of Y?
 
  • #2
Essentially the theorem is equivalent to the theorem that the integral of a sum is the sum of the integrals.
 
  • #3
Well, I obviously know that the integral of a sum is the sum of the integrals, but I don't know how to relate that to the situation I mentioned. Can you please be more specific?
I'm trying to prove it and I'm getting a convolution integral so far...
Thank you.
 
  • #4
Mppl said:
Well, I obviously know that the integral of a sum is the sum of the integrals, but I don't know how to relate that to the situation I mentioned. Can you please be more specific?
I'm trying to prove it and I'm getting a convolution integral so far...
Thank you.

Yes, that'll eventually give you a proof for the special case where the random variables are independent and have densities (it involves reversing the order of integration and a change of variables).
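Spelling that convolution route out (assuming X and Y are independent with densities f_X and f_Y, so Z = X + Y has density f_Z(z) = ∫f_X(x)f_Y(z−x)dx):

E(Z) = ∫z f_Z(z)dz = ∫∫z f_X(x)f_Y(z−x)dx dz.

Reversing the order of integration and substituting y = z − x (so z = x + y) in the inner integral:

E(Z) = ∫∫(x+y) f_X(x)f_Y(y)dy dx = ∫x f_X(x)dx ∫f_Y(y)dy + ∫f_X(x)dx ∫y f_Y(y)dy = E(X)·1 + 1·E(Y) = E(X) + E(Y).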

Another approach that would work for the non-independent case is to consider separately the joint distribution and marginal distributions of X and Y.
 
  • #5
Well, I'm not able to prove it for either dependent or independent variables. Can you please show me the proof, or tell me where I can find it? Thank you.
 
  • #6
Two random variables.
E(X+Y)=∫∫(x+y)dF(x,y)=∫∫xdF(x,y) + ∫∫ydF(x,y).
In the first integral, integrate out y so that dF(x,y) reduces to the marginal distribution of X; in the second, integrate out x to get the marginal of Y. You will be left with E(X) + E(Y).

In the above F(x,y) is the joint distribution function.
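As a quick numerical sanity check of the result (a sketch, not a substitute for the proof), one can sample deliberately *dependent* X and Y and compare the two sides, since linearity of expectation does not require independence:

```python
import numpy as np

# Sanity check of E(X + Y) = E(X) + E(Y).
# X and Y are dependent here: Y is a function of X plus noise.
rng = np.random.default_rng(0)
x = rng.normal(loc=1.0, scale=2.0, size=100_000)
y = x**2 + rng.normal(size=x.size)

lhs = np.mean(x + y)
rhs = np.mean(x) + np.mean(y)

# The two sample means agree up to floating-point error.
print(lhs, rhs)
```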
 

What is the mean of a sum of variables?

The mean of a sum of variables is a statistical measure that represents the average value of a set of variables added together. It is calculated by dividing the sum of the variables by the total number of variables in the set.

How is the mean of a sum of variables calculated?

To calculate the mean of a sum of variables, you first add all of the variables together. Then, you divide the sum by the total number of variables in the set. This will give you the mean of the sum of variables.
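As a minimal illustration of that recipe (with made-up numbers):

```python
# Mean of a set of values: sum them, then divide by the count.
values = [2.0, 4.0, 6.0, 8.0]
mean = sum(values) / len(values)
print(mean)  # 5.0
```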

Why is the mean of a sum of variables important?

The mean of a sum of variables is important because it provides a single value that represents the central tendency of a set of data. It can help to summarize and understand the overall pattern or trend in the data.

Can the mean of a sum of variables be influenced by outliers?

Yes, the mean of a sum of variables can be influenced by outliers. Outliers are extreme values that are significantly different from the rest of the data and can skew the results of the mean. It is important to identify and handle outliers appropriately in order to accurately interpret the mean of a sum of variables.
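A small sketch of how a single outlier pulls the mean, using hypothetical data:

```python
# Five typical readings near 10, then the same set with one outlier added.
data = [10, 11, 9, 10, 10]
with_outlier = data + [100]

mean_clean = sum(data) / len(data)                          # 10.0
mean_with_outlier = sum(with_outlier) / len(with_outlier)   # 25.0
print(mean_clean, mean_with_outlier)
```

One extreme value shifts the mean from 10 to 25, even though five of the six readings are unchanged.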

Is the mean of a sum of variables affected by the order of the variables?

No, the mean of a sum of variables is not affected by the order of the variables. This is because the mean is calculated by adding all of the variables together and then dividing by the total number of variables, regardless of their order. Therefore, the result will be the same regardless of the order in which the variables are added.
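This order invariance is easy to check directly (integer values are used so the sums are exact):

```python
import random

# The mean does not depend on the order in which the values are summed.
values = [3, 1, 7, 4]
shuffled = values[:]
random.shuffle(shuffled)

assert sum(values) / len(values) == sum(shuffled) / len(shuffled)
print(sum(values) / len(values))  # 3.75
```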
