On the expected value of a sum of a random number of r.v.s.

psie
TL;DR Summary
I am confused about a proof concerning the expectation of a sum of a random number of random variables
There's a theorem in An Intermediate Course in Probability by Gut that says that if ##E|X|<\infty##, then ##EX=g_X'(1)##, where ##g_X## is the probability generating function of ##X##. Now, consider the r.v. ##S_N##, the sum of a random number ##N## of terms of the i.i.d. r.v.s. ##X_1,X_2,\ldots## (everything is nonnegative integer-valued, and ##N## is independent of ##X_1,X_2,\ldots##). One can derive the probability generating function of ##S_N##, namely ##g_{S_N}(t)=g_N(g_X(t))##. I am now reading a theorem that states:

Theorem If ##EN<\infty## and ##E|X|<\infty##, then ##ES_N=EN\cdot EX##.

The author proves this using the theorem I stated at the beginning, namely that ##E|X|<\infty\implies EX=g_X'(1)##. What I don't understand is why we require ##EN<\infty## and ##E|X|<\infty##. For ##ES_N## to exist via generating functions, we require ##E|S_N|<\infty##, but I don't see why that should translate into the two requirements ##EN<\infty## and ##E|X|<\infty##.

One idea that comes to mind is the following, but I'm not sure if it's correct: $$E|S_N|=E(|X_1+\ldots +X_N|)\leq E(|X_1|+\ldots +|X_N|)=E (N|X_1|)=EN\, E|X_1|,$$ and so we see that ##E|S_N|## is finite if ##EN## and ##E|X_1|## are finite, as the theorem requires. But I'm doubting whether ##E(|X_1|+\ldots +|X_N|)=E(N|X_1|)## is correct. Grateful for any confirmation or help.
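As a quick numerical sanity check of ##ES_N=EN\cdot EX## (a sketch; the distributions of ##N## and ##X## below are arbitrary illustrative choices, not from Gut's book):

```python
import random

random.seed(42)

def sample_N():
    # N ~ Binomial(10, 1/2): nonnegative integer-valued, E[N] = 5
    return sum(random.random() < 0.5 for _ in range(10))

def sample_X():
    # X uniform on {0, 1, 2, 3}, so E[X] = 1.5
    return random.randrange(4)

trials = 200_000
total = 0
for _ in range(trials):
    n = sample_N()  # random number of terms, independent of the X's
    total += sum(sample_X() for _ in range(n))  # S_N = X_1 + ... + X_n

empirical = total / trials
print(f"empirical E[S_N] = {empirical:.3f}  vs  EN*EX = {5 * 1.5}")
# The empirical mean should be close to 7.5.
```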
 
You can start with
##E|S_N|=\sum_{k=1}^\infty P(N=k)\,E|X_1+\cdots+X_k|##

and now you apply the triangle inequality to a fixed number of terms.
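In more detail (a sketch; each expectation on the right is over a fixed number ##k## of terms, so the triangle inequality applies, and we use the independence of ##N## from the ##X_i##):
$$E|S_N|=\sum_{k=1}^\infty P(N=k)\,E|X_1+\cdots+X_k|\leq\sum_{k=1}^\infty P(N=k)\,k\,E|X_1|=E|X_1|\sum_{k=1}^\infty k\,P(N=k)=EN\cdot E|X_1|.$$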
 
Office_Shredder said:
You can start with
##E|S_N|=\sum_{k=1}^\infty P(N=k)\,E|X_1+\cdots+X_k|##

and now you apply the triangle inequality to a fixed number of terms.
Maybe a silly question, but of which variables is ##S_N##, and consequently ##|S_N|##, a function? Certainly ##N##, but is it correct to say it is also a function of ##X_1,\ldots,X_N##?
 
I think we should be able to write $$S_N = \sum_{j = 1}^{\infty}X_j \mathbf1_{j \leq N},$$ so ##S_N## is ##\sigma((Y_n)_{n\in\mathbb N})##-measurable, where ##Y_1=N, Y_2=X_1, Y_3=X_2, \ldots##.
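From this representation the identity can also be obtained directly (a sketch): since all the terms ##X_j\mathbf 1_{j\leq N}## are nonnegative, Tonelli's theorem and the independence of ##N## from the ##X_j## give
$$ES_N=\sum_{j=1}^{\infty}E\big(X_j\mathbf 1_{j\leq N}\big)=\sum_{j=1}^{\infty}EX_j\cdot P(N\geq j)=EX\sum_{j=1}^{\infty}P(N\geq j)=EX\cdot EN,$$
using the standard identity ##\sum_{j\geq 1}P(N\geq j)=EN## for nonnegative integer-valued ##N##.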
 