On the expected value of a sum of a random number of r.v.s.

AI Thread Summary
The discussion centers on a theorem about the expected value of a sum of a random number of independent and identically distributed (i.i.d.) random variables: if the expected number of terms, EN, and the expected value of each term, E|X|, are both finite, then the expected value of the sum satisfies ES_N = EN·EX. The original poster asks why both conditions are needed for ES_N to exist via generating functions, and proposes a bound intended to show that E|S_N| is finite whenever EN and E|X_1| are. The conversation also addresses which variables S_N depends on, confirming that S_N is a function of both N and the X_i. Overall, the discussion emphasizes the role of these conditions in deriving the expected value of S_N.
psie
TL;DR Summary
I am confused about a proof concerning the expectation of a sum of a random number of random variables.
There's a theorem in An Intermediate Course in Probability by Gut that says that if ##E|X|<\infty##, then ##EX=g_X'(1)##, where ##g_X## is the probability generating function. Now consider the r.v. ##S_N##, the sum of a random number ##N## of terms of the i.i.d. r.v.s ##X_1,X_2,\ldots## (everything is nonnegative integer-valued, and ##N## is independent of ##X_1,X_2,\ldots##). One can derive the probability generating function of ##S_N##, namely ##g_{S_N}(t)=g_N(g_X(t))##. I am now reading a theorem that states:

Theorem. If ##EN<\infty## and ##E|X|<\infty##, then ##ES_N=EN\cdot EX##.

The author proves this using the theorem I stated in the beginning, namely that ##E|X|<\infty\implies EX=g_X'(1)##. What I don't understand is why we require both ##EN<\infty## and ##E|X|<\infty##. For ##ES_N## to exist via generating functions, we need ##E|S_N|<\infty##, but I don't see why this forces ##EN<\infty## and ##E|X|<\infty##.
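For reference, here is the computation as I read it (only a sketch; the derivatives at ##1## mean left derivatives). The composition formula comes from conditioning on ##N## and using independence: $$g_{S_N}(t)=E\,t^{S_N}=\sum_{k=0}^\infty P(N=k)\,E\,t^{X_1+\ldots+X_k}=\sum_{k=0}^\infty P(N=k)\,g_X(t)^k=g_N(g_X(t)).$$ Differentiating and using ##g_X(1)=1##, $$ES_N=g_{S_N}'(1)=g_N'(g_X(1))\,g_X'(1)=g_N'(1)\,g_X'(1)=EN\cdot EX,$$ where ##g_N'(1)=EN## uses ##EN<\infty## and ##g_X'(1)=EX## uses ##E|X|<\infty##.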

One idea that comes to mind is the following, but I'm not sure if it's correct: $$E|S_N|=E(|X_1+\ldots +X_N|)\leq E(|X_1|+\ldots +|X_N|)=E (N|X_1|)=EN\, E|X_1|,$$ and so ##E|S_N|## is finite if ##EN## and ##E|X_1|## are, as required by the theorem. But I'm doubting whether the step ##E(|X_1|+\ldots +|X_N|)=E (N|X_1|)## is correct. Grateful for any confirmation or help.
 
Office_Shredder
You can start with
##E|S_N|=\sum_{k=1}^\infty P(N=k)\, E|X_1+\ldots+X_k|##

and now apply the triangle inequality to a fixed number of terms.
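Spelled out (a sketch, using the triangle inequality and ##E|X_i|=E|X_1|## for each ##i##): $$E|S_N|=\sum_{k=1}^\infty P(N=k)\,E|X_1+\ldots+X_k|\leq\sum_{k=1}^\infty P(N=k)\big(E|X_1|+\ldots+E|X_k|\big)=\sum_{k=1}^\infty P(N=k)\,k\,E|X_1|=EN\,E|X_1|,$$ so the two hypotheses together give ##E|S_N|<\infty##.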
 
Office_Shredder said:
You can start with
##E|S_N|=\sum_{k=1}^\infty P(N=k)\, E|X_1+\ldots+X_k|##

and now apply the triangle inequality to a fixed number of terms.
Silly question maybe, but which variables is ##S_N##, and consequently ##|S_N|##, a function of? Certainly ##N##, but is it correct to say it is also a function of ##X_1,\ldots,X_N##?
 
I think we should be able to write $$S_N = \sum_{j = 1}^{\infty}X_j \mathbf1_{j \leq N},$$ so ##S_N## is ##\sigma((Y_n)_{n\in\mathbb N})##-measurable, where ##Y_1=N, Y_2=X_1, Y_3=X_2, \ldots##.
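As a sanity check, this representation also recovers the theorem directly (a sketch; the interchange of sum and expectation is justified by Tonelli since everything is nonnegative, and ##\{j\le N\}=\{N\ge j\}## depends only on ##N##, which is independent of ##X_j##): $$ES_N=\sum_{j=1}^\infty E\big(X_j\mathbf 1_{j\le N}\big)=\sum_{j=1}^\infty EX_j\,P(N\ge j)=EX\sum_{j=1}^\infty P(N\ge j)=EX\cdot EN.$$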
 
