
Variance of a sum

  1. Jan 8, 2008 #1
    If I have a set of independent and identically distributed random variables X1, ..., Xn, then [tex]Var(\sum_{i=1}^{n}X_i) = \sum_{i=1}^{n}Var(X_i)[/tex].

    Now I want to know what happens to this identity when n is a random variable.
    I'm guessing the above statement still holds, but when I work out both sides of it, I get two different answers.

    For example, [tex]Var(\sum_{i=1}^{n}X_i)[/tex] will now be the variance of a random sum of random variables, which can be worked out using the law of total variance and comes out as E(n)*Var(X1) + Var(n)*E(X1)^2.
    But evaluating the other side of the expression, [tex]\sum_{i=1}^{n}Var(X_i)[/tex], when n is a random variable gives E(n)*Var(X1).

    So I don't understand why I'm getting two different answers here. Which one is correct? I think they should be the same.
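
    A quick simulation makes the discrepancy concrete. The sketch below is illustrative only: it assumes n ~ Poisson(4) and X ~ Normal(2, 1.5), but any choice with E(X) ≠ 0 shows the gap.

[code]
import numpy as np

# Sketch: when n is random, compare the empirical Var(sum) with the two
# candidate formulas above. Distributions/parameters are illustrative choices:
# n ~ Poisson(lam), so E(n) = Var(n) = lam; X ~ Normal(mu, sigma), i.i.d.
rng = np.random.default_rng(0)
mu, sigma, lam, trials = 2.0, 1.5, 4.0, 100_000

n = rng.poisson(lam, size=trials)
sums = np.array([rng.normal(mu, sigma, size=k).sum() for k in n])

print("empirical Var(sum):          ", sums.var())                    # ~ 25
print("E(n)*Var(X):                 ", lam * sigma**2)                # = 9
print("E(n)*Var(X) + Var(n)*E(X)^2: ", lam * sigma**2 + lam * mu**2)  # = 25
[/code]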
  3. Jan 8, 2008 #2


    Science Advisor
    Gold Member

    The first equation (var(sum)=sum(var)) does not hold if n is a random variable.
  4. Jan 8, 2008 #3


    Science Advisor
    Homework Helper

    By Law of Total Variance, and letting E[X] = [itex]\mu[/itex] and Var[X] = [itex]\sigma^2[/itex],

    [tex]Var\left[\sum_1^n X\right] = E\left[Var\left[\sum_1^n X \,\Big|\, n\right]\right] + Var\left[E\left[\sum_1^n X \,\Big|\, n\right]\right] = E\left[Var\left[\sum_1^n X\right]\right] + Var[n\mu|n] = E[n\sigma^2] + 0 = \sigma^2 E[n].[/tex]
    Last edited: Jan 8, 2008
  5. Jan 8, 2008 #4
    Are you sure the Var[n[itex]\mu[/itex]|n] and 0 terms are correct?

    Shouldn't it read:

    [tex]Var\left[E\left[\sum_1^n X \,\Big|\, n\right]\right] = Var[n\mu] = Var[n]\,\mu^2\,?[/tex]

    And therefore, as mathman has stated (var(sum)=sum(var)) does not hold when n is a random variable?
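
    Putting that correction into the chain above, the law of total variance then gives

    [tex]Var\left[\sum_1^n X\right] = E\left[Var\left[\sum_1^n X \,\Big|\, n\right]\right] + Var\left[E\left[\sum_1^n X \,\Big|\, n\right]\right] = E[n\sigma^2] + Var[n\mu] = \sigma^2 E[n] + \mu^2 Var[n],[/tex]

    which is exactly the E(n)*Var(X1) + Var(n)*E(X1)^2 expression from the original post.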
  6. Jan 8, 2008 #5
    I agree with Jimmy.
  7. Jan 9, 2008 #6


    Science Advisor
    Homework Helper

    I agree with the general proposition; in my previous post I made an error: I wrote Var[E[sum|n]] = Var[n[itex]\mu[/itex]|n], which should have been Var[n[itex]\mu[/itex]].

    As I thought about the problem, I noticed the following two special cases.

    First, if [itex]\mu[/itex] = 0 then the "complicated" formula (with the Var[n[itex]\mu[/itex]] term) reduces to the simple formula. For example, if the X's are distributed normally with mean 0, then there is no difference between the two formulas.

    Second, the linear relationship E[[itex]\sum_1^n[/itex] X|n] = a + bn, where a = 0 and b = [itex]\mu[/itex], implies

    Explained variance/Total variance = Var[E[[itex]\sum_1^n[/itex] X|n]]/Var[[itex]\sum_1^n[/itex] X] = Corr[[itex]\sum_1^n[/itex] X, n][itex]^2[/itex], or equivalently

    [tex]Var\left[E\left[\sum_1^n X \,\Big|\, n\right]\right] = Corr\left[\sum_1^n X,\, n\right]^2 Var\left[\sum_1^n X\right] \qquad \textrm{[Eq. 1]}[/tex]

    which implies that the degree to which the simple formula differs from the complicated formula is an empirical question. If it so happens that the correlation between the sum of X's and n is not significantly different from zero, then the two formulas will produce practically identical results.

    Here is a neat point, though: one can look at the equation [itex]\sum_1^n[/itex] X = [itex]\alpha[/itex] + [itex]\beta[/itex]n + [itex]\epsilon[/itex] as a least squares regression, where E[itex]\alpha[/itex] = 0, E[itex]\beta[/itex] = [itex]\mu[/itex] and E[itex]\epsilon[/itex] = 0. Remember that the least squares estimator [itex]\beta[/itex] of b in Y = a + bZ is [itex]\beta[/itex] = Cov(Y,Z)/Var[Z]. By letting Y = [itex]\sum_1^n[/itex] X and Z = n, one has [itex]\beta[/itex] = Cov([itex]\sum_1^n[/itex] X, n)/Var[n]. But E[itex]\beta[/itex] = [itex]\mu[/itex], so when Var[n] is given, there is a direct relationship between the Cov term and [itex]\mu[/itex], the mean of each X. And since Corr(Y,Z) = Cov(Y,Z)/([itex]\sigma_Y\sigma_Z[/itex]), there is a direct relationship between Corr[[itex]\sum_1^n[/itex] X, n] and [itex]\mu[/itex].
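
    To spell out that direct relationship, a supporting computation (conditioning on n) gives the covariance in closed form:

    [tex]Cov\left(\sum_1^n X,\, n\right) = E\left[n\, E\left[\sum_1^n X \,\Big|\, n\right]\right] - E[n]\, E\left[\sum_1^n X\right] = \mu E[n^2] - \mu (E[n])^2 = \mu\, Var[n],[/tex]

    so at the population level Cov([itex]\sum_1^n[/itex] X, n)/Var[n] = [itex]\mu[/itex] exactly, not just in expectation.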

    With a little computer programming, one can verify that when the X's are i.i.d. Normal[0,1], Corr[[itex]\sum_1^n[/itex] X, n] [itex]\to[/itex] 0, as expected (because [itex]\mu[/itex] = 0, the linear relationship implies zero correlation: b = [itex]\mu[/itex] = 0, the [unbiased] least squares estimator [itex]\beta[/itex] of b converges to 0, and therefore the correlation has to be zero).
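
    A minimal sketch of that check is below; n ~ Poisson is an illustrative choice only (the thread fixes no distribution for n), and it also prints Cov(sum, n)/Var(n), which should track [itex]\mu[/itex] per the covariance computation above.

[code]
import numpy as np

# Sketch: Corr[sum, n] should be near 0 when mu = 0 and grow as mu moves
# away from 0, while Cov(sum, n)/Var(n) should track mu.
rng = np.random.default_rng(1)
lam, trials = 4.0, 100_000

for mu in (0.0, 0.5, 2.0):
    n = rng.poisson(lam, size=trials)
    s = np.array([rng.normal(mu, 1.0, size=k).sum() for k in n])
    c = np.cov(s, n)                    # 2x2 sample covariance matrix
    corr = np.corrcoef(s, n)[0, 1]
    beta = c[0, 1] / c[1, 1]            # least squares slope estimate
    print(f"mu = {mu}: Corr[sum, n] = {corr:.3f}, Cov/Var[n] = {beta:.3f}")
[/code]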

    The intuition is that if the X's are sampled equally on both sides of the origin, then the number of X's being summed does not change the expected value of the sum (= zero), so the correlation between the sum and n is zero. Even neater, if [itex]\mu\approx[/itex] 0, then Cov [itex]\approx[/itex] 0; i.e., the simple formula can be a good approximation even when [itex]\mu[/itex] isn't identically zero. The approximation worsens as [itex]\mu[/itex] gets farther away from zero, which is a roundabout way of saying that Var[n[itex]\mu[/itex]] [itex]\approx[/itex] 0 if [itex]\mu\approx[/itex] 0.
    Last edited: Jan 9, 2008
  8. Apr 9, 2008 #7
    Hello people,

    this is Sofie. I was wondering how jimmy1 got the result

    [tex]Var\left(\sum_{i=1}^{N}X_i\right) = E(N)\,Var(X) + Var(N)\,(E(X))^2[/tex]

    using the law of total variance.

    Hope someone can tell me.
  9. Apr 9, 2008 #8
    I'm confused because you are all using the notation n for both the random variable N and for a value of N.
    I get E(n Var(X1)) + (E(X1))^2 Var(n), and I don't see how that is the same as the expression with n replaced by N.
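
    (The two expressions do agree: since the X's are i.i.d., Var(X1) is a constant, so it factors out of the expectation,

    [tex]E\left[N\,Var(X_1)\right] = E(N)\,Var(X_1),[/tex]

    which is the first term above; the capital/lowercase n is only a notational slip, not a different quantity.)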