Sum of a random number of random variables

In summary, the thread asks how to find the expected value and variance of a random variable that is a sum of a random number of random variables. The replies suggest using the tower property and the conditional variance formula to derive the desired identities, and point out that E[X] is just a constant. A further question asks for the probability that such a sum equals a specific value.
  • #1
fredliu
Hi, Guys,
I'm new to this forum and don't have a strong background in probability theory, so please bear with me if the question is too naive.

Here's the question,

In a problem I'm trying to model, I have a random variable (say, R) which is a sum of a random number (say, N) of random variables (say, H_i), where the H_i are i.i.d.

I have the distributions of both N and the H_i, and I am interested in the expected value and variance of R.

Any suggestions how I can get them? My initial thought is E(R) = E(N)*E(H_i), but I'm not sure that's right, and the variance of R seems even harder.

I did some googling and found ways to sum a fixed number of random variables, but not much about random sums.

Any suggestions, or hints about where I can find related information?

Thanks
 
  • #2
Use the tower property, which says that [tex] E(E(X|Y))=E(X) [/tex]. In your case, assuming the [itex]H_i[/itex] are independent of N, [tex] E\left(\sum_{i=1}^N H_i \right)=E\left(E\left(\sum_{i=1}^N H_i \,\Big|\, N\right)\right)=E\left(\sum_{i=1}^N E(H_i)\right)=E(N E(H_1)), [/tex] and since [itex]E(H_1)[/itex] is just a constant, [tex] E(R)=E(N)E(H_1) [/tex]. Hope this helps.
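
If you want a quick numerical sanity check of that identity, here is a small simulation sketch. The distributions are made up purely for illustration (N Poisson, H_i exponential); substitute whatever distributions you actually have.

[code]
# Sanity check of E[R] = E[N] * E[H_1] by simulation.
# Assumed (illustrative) distributions: N ~ Poisson(4), H_i ~ Exponential(mean 2).
import numpy as np

rng = np.random.default_rng(0)
lam, mean_h = 4.0, 2.0
trials = 200_000

# One draw of R = H_1 + ... + H_N per trial (an empty sum counts as 0).
r = np.array([rng.exponential(mean_h, size=rng.poisson(lam)).sum() for _ in range(trials)])

print("simulated E[R]       :", r.mean())      # close to 8
print("formula   E[N]*E[H_1]:", lam * mean_h)  # exactly 8
[/code]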
 
  • #3
To get the variance, you can apply the same approach (as Focus) to get the second moment and then use the usual relationship between second moment and variance.
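(Spelled out, the relationship referred to is [tex] Var(R) = E[R^2] - (E[R])^2 [/tex].)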
 
  • #4
Thanks very much for all your replies, guys~~

I'll look into the suggested approach, thanks a bunch~~
 
  • #5
This question is related to another, so if I may, I'd like to add it to this thread.

In my Sheldon Ross, First Course in Probability, there is a derivation that has stumped me. The author wants to show how to use the conditional variance formula

[tex]Var(X) = E[Var(X|Y)] + Var(E[X|Y])[/tex]

to derive the following identity:

[tex]Var(\sum_{i = 1}^{N}X_i) = E[N]Var(X) + (E[X])^2Var(N)[/tex]

but he skips some steps and succeeds in losing me. :-) All he says, by way of derivation, is that the following two statements hold:

[tex]E[\sum_{i = 1}^{N}X_i|N] = NE[X][/tex]
[tex]Var(\sum_{i = 1}^{N}X_i|N) = NVar(X)[/tex]

But if I substitute these into the conditional variance formula I get:

[tex]Var\left(\sum_{i = 1}^{N}X_i\right) = E\left[Var\left(\sum_{i = 1}^{N}X_i \,\Big|\, N\right)\right] + Var\left(E\left[\sum_{i = 1}^{N}X_i \,\Big|\, N\right]\right)[/tex]
[tex] = E[NVar(X)] + Var(NE[X])[/tex]
[tex] = E[N]E[Var(X)] + Var(NE[X])[/tex]

In the last step, I can separate E[N] because N and X are independent, but I can think of no further simplifications. I've been looking around for a handy identity for the variance of a product, but cannot find anything.

Suggestions?
 
  • #6
pluviosilla said:
[tex]Var\left(\sum_{i = 1}^{N}X_i\right) = E\left[Var\left(\sum_{i = 1}^{N}X_i \,\Big|\, N\right)\right] + Var\left(E\left[\sum_{i = 1}^{N}X_i \,\Big|\, N\right]\right)[/tex]
[tex] = E[NVar(X)] + Var(NE[X])[/tex]
[tex] = E[N]E[Var(X)] + Var(NE[X])[/tex]

...

Suggestions?

how about using [itex]E[Var(X)]=Var(X)[/itex] and [itex]Var(NE[X])=(E[X])^2 Var(N)[/itex]?
 
  • #7
E[X] is just a number, so you have to work out the variance of a constant times N. That is standard: Var(kY) = k^2 Var(Y) for k constant and Y an r.v.

Don't forget that Var (Y)= E(Y^2) - E(Y)^2 as well, when you're doing things like this. So if U and V are independent

Var(UV) = E(U^2V^2) - E(UV)^2 = E(U^2)E(V^2) - E(U)^2E(V)^2

which can be related, albeit messily, to Var(U) and Var(V).
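
Putting those pieces into your chain above (Var(X) and E[X] are just constants):

[tex]Var\left(\sum_{i=1}^{N}X_i\right) = E[N\,Var(X)] + Var(N\,E[X]) = E[N]Var(X) + (E[X])^2 Var(N)[/tex]

which is exactly Ross's identity.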
 
  • #8
"E[X] is just a number" - Yes! Seems obvious now, but that is what I was overlooking.

I suppose we could also say that Var(X) is just a number, which explains the other identity that I overlooked: E[Var(X)] = Var(X).

Thanks very much for these helpful replies!
 
  • #9
Please consider this one as well.

I have a set of (say N) random variables X_i, of which I know the pmf. I want to find the probability that [tex]\sum_{i = 1}^{N} X_i = K[/tex], where K is a constant.
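
In case it helps, here is a minimal sketch of one way to compute that, assuming the X_i are i.i.d. and take nonnegative integer values with a known pmf (the particular pmf, N, and K below are made up for illustration): the pmf of the sum is the N-fold convolution of the individual pmf.

[code]
# pmf of X_1 + ... + X_N for i.i.d. integer-valued X_i, via repeated convolution.
# Assumed example values: pmf over {0, 1, 2}, N = 4 variables, target K = 5.
import numpy as np

pmf_x = np.array([0.2, 0.5, 0.3])    # P(X=0), P(X=1), P(X=2)
N, K = 4, 5

pmf_sum = np.array([1.0])            # pmf of the empty sum: point mass at 0
for _ in range(N):
    pmf_sum = np.convolve(pmf_sum, pmf_x)

prob = pmf_sum[K] if K < len(pmf_sum) else 0.0
print("P(X_1 + ... + X_N = K) =", prob)
[/code]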
 

1. What is the definition of the "sum of a random number of random variables"?

It is the total value obtained by adding together a number of random variables, where that number can vary and is itself a random variable, making the total a random sum.

2. How is the sum of a random number of random variables calculated?

The sum is formed by adding the values of the individual random variables. Its expected value and variance can be computed from the distributions of N and of the individual variables, for example by conditioning on N as in the formulas above, or the sum can be studied by hand or by simulation, depending on the specific variables and their distributions.

3. What is the significance of studying the sum of a random number of random variables?

Studying such sums is important in probability and statistics because it allows us to analyze and predict outcomes of events that involve a varying number of random contributions, which is common in many real-world scenarios.

4. Can the sum of a random number of random variables be negative?

Yes, it can be negative, depending on the values and distributions of the individual random variables. If some of the variables can take negative values, the sum can be negative as well.

5. How does the Central Limit Theorem apply to the sum of a random number of random variables?

The Central Limit Theorem says that the sum of a large number of independent random variables tends toward a normal distribution. Under suitable conditions on N, the same intuition applies to random sums: when N is typically large, the distribution of the random sum is approximately normal.
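
For a rough illustration of this, here is a simulation sketch with made-up distributions (N Poisson, H_i exponential), comparing the simulated distribution of the random sum with a normal distribution whose mean and variance come from the identities derived in the thread.

[code]
# Normal approximation to a random sum R = H_1 + ... + H_N.
# Assumed (illustrative) distributions: N ~ Poisson(50), H_i ~ Exponential(mean 2).
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(1)
lam, mean_h = 50.0, 2.0
var_h = mean_h ** 2                      # for an exponential, Var(H) = (E[H])^2

mean_r = lam * mean_h                    # E[N] E[H_1]
var_r = lam * var_h + mean_h ** 2 * lam  # E[N] Var(H) + (E[H])^2 Var(N); Var(N) = lam for Poisson

r = np.array([rng.exponential(mean_h, size=rng.poisson(lam)).sum() for _ in range(50_000)])

t = mean_r + sqrt(var_r)                 # evaluate the CDF one standard deviation above the mean
print("simulated  P(R <= t):", (r <= t).mean())
print("normal approximation:", 0.5 * (1 + erf((t - mean_r) / sqrt(2 * var_r))))
[/code]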
