
Sum of random number of random variables

  1. Jul 31, 2008 #1
    Hi, guys,
    I'm new to this forum and don't have a strong background in probability theory, so please bear with me if the question is too naive.

    Here's the question:

    In the problem I'm trying to model, I have a random variable (say, R) which is a sum of a random number (say, N) of random variables (say, H_i), where the H_i are i.i.d.

    I have the distributions of both N and the H_i, and I am interested in the expected value and variance of R.

    Any suggestions on how I can get them? My initial thought is E(R) = E(N)*E(H_i), but I'm not sure it's quite right, and the variance of R seems even harder.

    I did some googling and found ways to handle sums of a fixed number of random variables, but not much on random sums.

    Any suggestions, or hints about where I can find related information?

    Last edited: Jul 31, 2008
  3. Aug 1, 2008 #2
    Use the tower property, which says that [tex] E(E(X|Y))=E(X) [/tex]. In your case, [tex] E\left(\sum_{i=1}^N H_i \right)=E\left(E\left(\sum_{i=1}^N H_i | N\right)\right)=E\left(\sum_{i=1}^N E(H_i)\right)=E(N E(H_1)) [/tex], where the middle step uses the assumption that N is independent of the H_i. Since E(H_1) is just a number, this gives [tex] E(R)=E(N)E(H_1) [/tex]. Hope this helps.
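    If it helps to see it numerically, here is a quick Monte Carlo sanity check in Python (just a sketch; the Poisson and exponential choices below are arbitrary placeholders, not anything from your model):

    [code]
    # Sanity check of E(R) = E(N) * E(H_1) for a random sum R = H_1 + ... + H_N.
    # Assumed example distributions: N ~ Poisson(4), H_i ~ Exponential with mean 2.
    import numpy as np

    rng = np.random.default_rng(0)
    trials = 100_000

    n = rng.poisson(4, size=trials)                                 # random number of terms per trial
    r = np.array([rng.exponential(2.0, size=k).sum() for k in n])   # R for each trial

    print(r.mean())   # sample mean of R, should be close to 8
    print(4 * 2.0)    # E(N) * E(H_1) = 8
    [/code]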
  4. Aug 1, 2008 #3
    To get the variance, you can apply the same approach (as Focus) to get the second moment and then use the usual relationship between second moment and variance.
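    Spelling that out a little (a sketch, assuming N is independent of the H_i): conditioning on N,

    [tex]E[R^2|N] = Var(R|N) + (E[R|N])^2 = N\,Var(H_1) + N^2\,(E[H_1])^2[/tex]

    so taking expectations gives [itex]E[R^2] = E[N]\,Var(H_1) + E[N^2]\,(E[H_1])^2[/itex], and subtracting [itex](E[R])^2 = (E[N]\,E[H_1])^2[/itex] leaves

    [tex]Var(R) = E[N]\,Var(H_1) + Var(N)\,(E[H_1])^2.[/tex]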
  5. Aug 1, 2008 #4
    Thanks very much for all your replies, guys~~

    I'll look into the suggested approach, thanks a bunch~~
  6. Aug 3, 2008 #5
    This question is related to another, so if I may, I'd like to add it to this thread.

    In my copy of Sheldon Ross's A First Course in Probability, there is a derivation that has stumped me. The author wants to show how to use the conditional variance formula

    [tex]Var(X) = E[Var(X|Y)] + Var(E[X|Y])[/tex]

    to derive the following identity:

    [tex]Var(\sum_{i = 1}^{N}X_i) = E[N]Var(X) + (E[X])^2Var(N)[/tex]

    but he skips some steps and succeeds in losing me. :-) All he says, by way of derivation, is that the following two statements hold:

    [tex]E[\sum_{i = 1}^{N}X_i|N] = NE[X][/tex]
    [tex]Var(\sum_{i = 1}^{N}X_i|N) = NVar(X)[/tex]

    But if I substitute these into the conditional variance formula I get:

    [tex]Var(\sum_{i = 1}^{N}X_i) = E[Var(\sum_{i = 1}^{N}X_i|N)] + Var(E[\sum_{i = 1}^{N}X_i|N])[/tex]
    [tex] = E[NVar(X)] + Var(NE[X])[/tex]
    [tex] = E[N]E[Var(X)] + Var(NE[X])[/tex]

    In the last step, I can separate out E[N] because N and X are independent, but I can think of no further simplifications. I've been looking around for a handy identity for the variance of a product, but cannot find anything.

  7. Aug 3, 2008 #6
    How about using [itex]E[Var(X)]=Var(X)[/itex] and [itex]Var(NE[X])=(E[X])^2Var(N)[/itex]?
  8. Aug 3, 2008 #7
    E[X] is just a number, so you have to work out the variance of a constant times N. That is standard: Var(kY) = k^2 Var(Y) for k constant and Y an r.v.

    Don't forget that Var(Y) = E(Y^2) - E(Y)^2 as well, when you're doing things like this. So if U and V are independent,

    Var(UV) = E(U^2 V^2) - E(UV)^2 = E(U^2)E(V^2) - E(U)^2 E(V)^2

    which can be related, albeit messily, to Var(U) and Var(V).
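    A quick numerical check of that product formula, in case it's useful (the specific distributions here are arbitrary; U and V just need to be independent):

    [code]
    # Check Var(UV) = E(U^2)E(V^2) - E(U)^2 E(V)^2 for independent U, V.
    import numpy as np

    rng = np.random.default_rng(1)
    u = rng.uniform(0.0, 3.0, size=500_000)
    v = rng.exponential(2.0, size=500_000)

    lhs = np.var(u * v)
    rhs = np.mean(u**2) * np.mean(v**2) - np.mean(u)**2 * np.mean(v)**2
    print(lhs, rhs)   # the two estimates should agree closely
    [/code]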
  9. Aug 3, 2008 #8
    "E[X] is just a number" - Yes! Seems obvious now, but that is what I was overlooking.

    I suppose we could also say that Var(X) is just a number, which explains the other identity that I overlooked: E[Var(X)] = Var(X).
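    So, putting it all together with [itex]S = \sum_{i = 1}^{N}X_i[/itex]: since Var(X) and E[X] are just numbers, [itex]E[N\,Var(X)] = E[N]Var(X)[/itex] and [itex]Var(N\,E[X]) = (E[X])^2 Var(N)[/itex], which gives

    [tex]Var(S) = E[N]Var(X) + (E[X])^2 Var(N)[/tex]

    exactly the identity in Ross.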

    Thanks very much for these helpful replies!
  10. Feb 25, 2010 #9
    Please consider this one as well.

    I have a set of (say N) random variables X_i, whose pmf I know. I want to find the probability that [itex]\sum_{i=1}^{N} X_i = K[/itex], where K is a constant.
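    In case a concrete sketch helps (assuming here that N is a fixed count, the X_i are i.i.d., and they take values on a finite set of nonnegative integers; your setup may differ): the pmf of the sum is the N-fold convolution of the individual pmf, and P(sum = K) is one entry of that convolution.

    [code]
    # Sketch: P(X_1 + ... + X_N = K) for i.i.d. integer-valued X_i with a known pmf.
    # pmf[j] = P(X_i = j) on the support {0, 1, ..., len(pmf)-1}; N is a fixed integer.
    import numpy as np

    def prob_sum_equals(pmf, N, K):
        dist = np.array([1.0])              # pmf of the empty sum: point mass at 0
        for _ in range(N):
            dist = np.convolve(dist, pmf)   # adding one more X_i convolves the pmfs
        return dist[K] if 0 <= K < len(dist) else 0.0

    # Example: three fair dice, P(total = 10) = 27/216 = 0.125
    die = np.array([0, 1, 1, 1, 1, 1, 1]) / 6.0   # faces 1..6 on support 0..6
    print(prob_sum_equals(die, 3, 10))
    [/code]

    If N itself is random, you would additionally average these probabilities over the distribution of N.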