Mean of a sum of random variables

albega

Homework Statement


If ##Y = X_1 + X_2 + \cdots + X_N##, prove that ##\langle Y \rangle = \langle X_1 \rangle + \langle X_2 \rangle + \cdots + \langle X_N \rangle##.

Homework Equations


##\langle Y \rangle = \int Y P(Y)\, dY## over all ##Y##.

The Attempt at a Solution


I only seem to be able to show this if the ##X_i## are independent, and I also think my proof may be very wrong. I basically have said that we can write the probability that the variables lie in the intervals ##[X_1, X_1+dX_1], [X_2, X_2+dX_2], \ldots, [X_N, X_N+dX_N]## as
##\prod_{j=1}^{N} P_{X_j}(X_j)\, dX_j## (I really doubt this is right).
Then
$$\langle Y \rangle = \int \Big( \sum_{i=1}^{N} X_i \Big) \prod_{j=1}^{N} P_{X_j}(X_j)\, dX_j = \sum_{i=1}^{N} \int X_i \prod_{j=1}^{N} P_{X_j}(X_j)\, dX_j,$$
then all the integrals apart from the ##i##th one go to one because the various probability functions are normalised, so this becomes
$$\sum_{i=1}^{N} \int X_i P_{X_i}(X_i)\, dX_i = \sum_{i=1}^{N} \langle X_i \rangle.$$
However, in saying all the integrals go to one, I have assumed I could separate all the integrals, i.e. that the variables were independent.

Also, is there not a really easy way to prove this? I can't seem to find any books/websites proving it, which makes me think it's just trivial...
 
Your proof seems sound at first glance. It should be easily fixed for all distributions using P(A|B) P(B) = P(AB).
 
For example, if the joint density of ##x_1, x_2## is ##f(x_1,x_2)##, then

$$\int\!\!\int (x_1 + x_2)\, f(x_1,x_2)\, dx_1\, dx_2 = \int\!\!\int x_1 f(x_1,x_2)\, dx_1\, dx_2 + \int\!\!\int x_2 f(x_1,x_2)\, dx_1\, dx_2.$$

Then an individual integral like ##\int\!\!\int x_2 f(x_1,x_2)\, dx_1\, dx_2## has the general pattern (expressed in different variables) of

$$\int\!\!\int h(r)\, f(r,s)\, ds\, dr = \int h(r) \left( \int f(r,s)\, ds \right) dr.$$

The integration ##\int f(r,s)\, ds## produces the density function for ##r##. (It's integration of a joint density to produce a marginal density.)
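To check numerically that this argument does not need independence, here is a quick Monte Carlo sanity check (a minimal sketch; the way x2 is built from x1, and all the names, are illustrative choices rather than part of the thread's problem):

import numpy as np

rng = np.random.default_rng(0)

# Two strongly dependent variables: x2 is built directly from x1,
# so they are certainly not independent.
x1 = rng.normal(loc=1.0, scale=2.0, size=1_000_000)
x2 = 3.0 * x1 + rng.normal(loc=-0.5, scale=1.0, size=x1.size)

y = x1 + x2

# The sample mean of Y agrees with <X1> + <X2> up to Monte Carlo error,
# even though X1 and X2 are dependent.
print(y.mean(), x1.mean() + x2.mean())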
 
albega said:
If ##Y = X_1 + X_2 + \cdots + X_N##, prove that ##\langle Y \rangle = \langle X_1 \rangle + \langle X_2 \rangle + \cdots + \langle X_N \rangle##. [...] However, in saying all the integrals go to one, I have assumed I could separate all the integrals, i.e. that the variables were independent.

The result is true in general, even if the variables ##X_1, X_2, \ldots, X_n## are dependent. Look at the case ##n=2##, and take sums instead of integrals (because the result is also true for discrete random variables). Using the standard notation ##E## for expectation (instead of your physics-oriented notation ##\langle \cdot \rangle##), we have, for a joint probability mass function ##P\{X_1 = k_1, X_2 = k_2\} = p_{12}(k_1,k_2)##:
$$\begin{aligned} E(X_1 + X_2) &= \sum_{k_1,k_2} p_{12}(k_1,k_2)\,(k_1 + k_2) = \sum_{k_1,k_2} p_{12}(k_1,k_2)\, k_1 + \sum_{k_1,k_2} p_{12}(k_1,k_2)\, k_2 \\ &= \sum_{k_1} k_1 \underbrace{\left( \sum_{k_2} p_{12}(k_1,k_2) \right)}_{=\,p_1(k_1)} + \sum_{k_2} k_2 \underbrace{\left( \sum_{k_1} p_{12}(k_1,k_2) \right)}_{=\,p_2(k_2)} \\ &= E X_1 + E X_2. \end{aligned}$$
Here, ##p_1(k_1) = P\{ X_1 = k_1 \}## and ##p_2(k_2) = P \{ X_2 = k_2 \}## are the marginal probability mass functions of ##X_1## and ##X_2## separately.
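As a concrete instance of this calculation (the particular joint pmf below is my own illustrative choice, not taken from the problem), let ##X_1## and ##X_2## each take the values ##0## and ##1## with
$$p_{12}(0,0) = 0.4, \qquad p_{12}(0,1) = p_{12}(1,0) = 0.1, \qquad p_{12}(1,1) = 0.4.$$
The marginals give ##p_1(1) = p_2(1) = 0.5##, yet ##p_{12}(1,1) = 0.4 \neq 0.25 = p_1(1)\,p_2(1)##, so the variables are dependent. Nevertheless
$$E(X_1 + X_2) = 1\cdot 0.1 + 1\cdot 0.1 + 2\cdot 0.4 = 1.0 = 0.5 + 0.5 = E X_1 + E X_2.$$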
 

Thanks for the replies. I have another issue related to the same setup... If we have ##n## independent random variables ##X_i##, ##i = 1## to ##n##, each with the same mean ##\langle X \rangle## and the same variance, how do we know ##\langle X_i^2 \rangle = \langle X^2 \rangle##? I can't see this, although I'm guessing it's obvious...
 
Start from the definition of variance V(X) = <X^2> - <X>^2 (or, equivalently, V(X) = <(X - <X>)^2>).
 
Orodruin said:
Start from the definition of variance V(X) = <X^2> - <X>^2 (or, equivalently, V(X) = <(X - <X>)^2>).
##\langle X_i^2 \rangle = \langle X_i \rangle^2 + V(X) = \langle X \rangle^2 + V(X) = \langle X^2 \rangle##?
 
What is your definition of X?
 
Orodruin said:
What is your definition of X?

Some random variable that has a probability distribution with mean ##\langle X \rangle## and variance ##V(X) = \langle X^2 \rangle - \langle X \rangle^2##?
 
I am just saying, because it is not clear if it is one of the Xi or not.
 
Orodruin said:
I am just saying, because it is not clear if it is one of the Xi or not.
No it isn't...
 
Does it have the same variance as the Xi? If it does, you can just as well include it among them and otherwise the statement is not really correct.
 
Orodruin said:
Does it have the same variance as the Xi? If it does, you can just as well include it among them and otherwise the statement is not really correct.

Well no I guess I have just assumed that and I shouldn't have, which means that proof doesn't work...
 
What I am curious about is if this is the actual problem statement:
albega said:
each with the same mean <X> and the same variance
Which would mean that <Xi> = <X>, but not necessarily V(Xi) = V(X), unless you also add a "V(X)" after "same variance".

If V(Xi) = V(X), then the problem is trivial as you noticed. If it is not, then the statement is false.
 
Orodruin said:
What I am curious about is if this is the actual problem statement:

Which would mean that <Xi> = <X>, but not necessarily V(Xi) = V(X), unless you also add a "V(X)" after "same variance".

If V(Xi) = V(X), then the problem is trivial as you noticed. If it is not, then the statement is false.

Oh I see... The actual statement is
'each with the same mean ##\langle X \rangle## and the same variance ##\sigma_x^2##',
but I didn't think defining what it actually was mattered; it clearly does. So I guess that makes it fine then.
Thank you.
 

albega said:
If we have ##n## independent random variables ##X_i##, ##i = 1## to ##n##, each with the same mean ##\langle X \rangle## and the same variance, how do we know ##\langle X_i^2 \rangle = \langle X^2 \rangle##?

It isn't clear what the notation ##\langle X^2 \rangle## signifies. If each of the ##X_i## has mean ##\mu##, then it is not true that the mean value of ##X_i^2## must equal ##\mu^2##. If ##R## and ##W## are independent random variables that have the same mean value, then it is not true that ##R^2## and ##W^2## must have the same mean value.
If ##R## and ##W## are independent, identically distributed random variables, you could get that result.
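For a concrete instance of this point (the particular variables here are my own illustrative choice): let ##R## take the values ##\pm 1## with probability ##1/2## each, and let ##W = 0## with probability ##1##. Then
$$\langle R \rangle = \langle W \rangle = 0, \qquad \text{but} \qquad \langle R^2 \rangle = 1 \neq 0 = \langle W^2 \rangle,$$
so equal means alone do not force equal mean squares; it is the extra condition of equal variances that rules this out.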
 
Stephen Tashi said:
It isn't clear what the notation ##\langle X^2 \rangle## signifies. If each of the ##X_i## has mean ##\mu##, then it is not true that the mean value of ##X_i^2## must equal ##\mu^2##. If ##R## and ##W## are independent random variables that have the same mean value, then it is not true that ##R^2## and ##W^2## must have the same mean value.
If ##R## and ##W## are independent, identically distributed random variables, you could get that result.

I disagree; ##\langle X^2 \rangle## is standard notation for the expectation value of ##X^2##, not for the square of the expectation value, which is normally written ##\langle X \rangle^2##. If the mean and variance of all of the stochastic variables are the same, then so are the expectation values of their squares.
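Spelling this out under the problem's actual statement that every ##X_i## has mean ##\langle X \rangle## and variance ##\sigma_x^2## (and writing ##\langle X^2 \rangle## for the common value ##\sigma_x^2 + \langle X \rangle^2##):
$$\langle X_i^2 \rangle = V(X_i) + \langle X_i \rangle^2 = \sigma_x^2 + \langle X \rangle^2 = \langle X^2 \rangle \quad \text{for every } i.$$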
 