Sum of random number of random variables


Discussion Overview

The discussion revolves around the expected value and variance of a random variable that is defined as the sum of a random number of independent and identically distributed (i.i.d.) random variables. Participants explore various approaches to derive these statistical properties, including the use of conditional expectations and variance formulas.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning

Main Points Raised

  • One participant asks how to compute the expected value and variance of a random variable defined as the sum of a random number of i.i.d. random variables, offering E(R) = E(N)·E(Hi) as an initial guess while doubting its correctness.
  • Another participant proposes using the tower property of expectation to derive E(R) = E(N)E(H1), assuming independence between N and Hi.
  • A different participant suggests applying a similar approach to derive the variance, indicating the need to calculate the second moment first.
  • One participant raises a related question about deriving a specific variance identity from a textbook, expressing confusion over the steps involved in using the conditional variance formula.
  • Another participant discusses the variance of a product of a constant and a random variable, noting the standard relationship Var(kY) = k^2Var(Y) for a constant k.
  • One participant acknowledges a realization about treating E[X] and Var(X) as constants in the context of variance calculations.
  • A later post introduces a new question regarding finding the probability of the sum of a set of random variables equaling a constant, indicating a shift in focus.

Areas of Agreement / Disagreement

The main questions are largely resolved: participants converge on the tower property for the expectation and the conditional variance formula for the variance. The initial confusion stemmed from not treating E[X] and Var(X) as constants, a point cleared up in later replies.

Contextual Notes

Limitations include potential missing assumptions regarding the independence of random variables and the specific distributions involved. The discussion also reflects varying levels of understanding of probability theory among participants.

Who May Find This Useful

This discussion may be useful for individuals interested in probability theory, particularly those exploring the properties of sums of random variables and their statistical characteristics.

fredliu
Hi, Guys,
I'm new to this forum and don't have a strong background in probability theory, so please bear with me if the question is too naive.

Here's the question,

In a problem I'm trying to model, I have a random variable (say, R), which is a sum of a random number (say, N) of random variables (say, Hi), where all Hi are i.i.d.

I have the distributions of both N and Hi, and I am interested in the expected value and variance of R.

Any suggestions on how I can get them? My initial thought is E(R) = E(N)·E(Hi), but I feel it's not quite right, and getting the variance of R seems even harder.

I did some googling and found ways to sum a fixed number of random variables, but not much on random sums.

Any suggestions, or hints on where I can find related information?

Thanks
 
Use the tower property, which says that E(E(X|Y)) = E(X). In your case, E\left(\sum_{i=1}^N H_i \right)=E\left(E\left(\sum_{i=1}^N H_i \mid N\right)\right)=E\left(\sum_{i=1}^N E(H_i)\right)=E(N\,E(H_1)). Furthermore, if N and the H_i are independent, then you can say that E(R)=E(N)E(H_1). Hope this helps.
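As a quick numerical sanity check of the tower-property result, here is a small Monte Carlo sketch. The distributions for N and the H_i are illustrative choices of mine (Poisson and exponential), not anything specified in the thread:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed, illustrative distributions: N ~ Poisson(lam), H_i ~ Exp(mean mu),
# with N independent of the H_i. The thread leaves them unspecified.
lam, mu = 4.0, 2.5
trials = 200_000

# Simulate R = H_1 + ... + H_N for each trial (an empty sum gives 0).
N = rng.poisson(lam, size=trials)
R = np.array([rng.exponential(mu, size=n).sum() for n in N])

print(R.mean())       # simulated E(R)
print(lam * mu)       # E(N) * E(H_1)
```

The two printed values should agree up to Monte Carlo error.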
 
To get the variance, you can apply the same approach (as Focus) to get the second moment and then use the usual relationship between second moment and variance.
 
Thanks very much for all your replies, guys~~

I'll look into the suggested approach, thanks a bunch~~
 
This question is related to another, so if I may, I'd like to add it to this thread.

In my Sheldon Ross, First Course in Probability, there is a derivation that has stumped me. The author wants to show how to use the conditional variance formula

Var(X) = E[Var(X|Y)] + Var(E[X|Y])

to derive the following identity:

Var(\sum_{i = 1}^{N}X_i) = E[N]Var(X) + (E[X])^2Var(N)

but he skips some steps and succeeds in losing me. :-) All he says, by way of derivation, is that the following two statements hold:

E[\sum_{i = 1}^{N}X_i|N] = NE[X]
Var(\sum_{i = 1}^{N}X_i|N) = NVar(X)

But if I substitute these into the conditional variance formula I get:

Var(X) = E[Var(X|N)] + Var(E[X|N])
= E[NVar(X)] + Var(NE[X])
= E[N]E[Var(X)] + Var(NE[X])

In the last step, I can separate E[N] because N and X are independent, but I can think of no further simplifications. I've been looking around for a handy identity for a variance of a product, but cannot find anything.

Suggestions?
 
pluviosilla said:
Var(X) = E[Var(X|N)] + Var(E[X|N])
= E[NVar(X)] + Var(NE[X])
= E[N]E[Var(X)] + Var(NE[X])

...

Suggestions?

how about using E[Var(X)] = Var(X) and Var(N E[X]) = Var(N)(E[X])^2?
 
E[X] is just a number, so you have to work out the variance of a constant times N. That is standard: Var(kY) = k^2 Var(Y) for k constant and Y a random variable.

Don't forget that Var (Y)= E(Y^2) - E(Y)^2 as well, when you're doing things like this. So if U and V are independent

Var(UV) = E(U^2V^2) - E(UV)^2 = E(U^2)E(V^2) - E(U)^2E(V)^2

which can be related, albeit messily, to Var(U) and Var(V).
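That product-variance relation is easy to verify numerically for independent U and V. A minimal sketch, with distributions chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500_000

# Arbitrary independent variables, chosen only for illustration.
U = rng.uniform(1.0, 3.0, size=n)   # E(U) = 2,  E(U^2) = 1/3 + 4 = 13/3
V = rng.normal(5.0, 1.0, size=n)    # E(V) = 5,  E(V^2) = 1 + 25 = 26

# Direct estimate of Var(UV).
lhs = (U * V).var()

# E(U^2)E(V^2) - E(U)^2 E(V)^2, using the exact moments noted above.
rhs = (13.0 / 3.0) * 26.0 - (2.0 ** 2) * (5.0 ** 2)

print(lhs, rhs)
```

The simulated and exact values should agree up to sampling error.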
 
"E[X] is just a number" - Yes! Seems obvious now, but that is what I was overlooking.

I suppose we could also say that Var(X) is just a number, which explains the other identity that I overlooked: E[Var(X)] = Var(X).

Thanks very much for these helpful replies!
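For completeness, the skipped steps in Ross's derivation assemble as follows, writing S = \sum_{i=1}^{N}X_i and using that E[X] and Var(X) are constants:

\begin{aligned}
Var(S) &= E[Var(S \mid N)] + Var(E[S \mid N])\\
&= E[N\,Var(X)] + Var(N\,E[X])\\
&= Var(X)\,E[N] + (E[X])^2\,Var(N).
\end{aligned}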
 
Please consider this one as well.

I have a set of (say N) random variables X_i, for which I know the pmf. I want to find the probability that \sum_{i=1}^{N} X_i = K, where K is a constant.
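For this new question, when N is fixed and the X_i are i.i.d. with a known pmf on the non-negative integers, the pmf of the sum is the N-fold convolution of the single-variable pmf, and P(\sum X_i = K) can be read off from it. A minimal sketch with a made-up pmf (the thread does not specify the actual distribution):

```python
import numpy as np

# Hypothetical pmf on support {0, 1, 2}; purely illustrative.
pmf = np.array([0.2, 0.5, 0.3])   # P(X=0), P(X=1), P(X=2)
N, K = 4, 5

# pmf of X_1 + ... + X_N via repeated discrete convolution.
total = np.array([1.0])           # point mass at 0 (the empty sum)
for _ in range(N):
    total = np.convolve(total, pmf)

print(total[K])                   # P(X_1 + ... + X_N = K)
```

The resulting array `total` sums to 1 and has length N·(support size − 1) + 1, covering every achievable value of the sum.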
 
