On the expected value of a sum of a random number of r.v.s.

  • Context: Undergrad
  • Thread starter: psie
  • Tags: probability theory
SUMMARY

The discussion centers on the theorem from "An Intermediate Course in Probability" by Gut, which states that if the expected value of a random variable (r.v.) is finite, then the expected value can be derived from its probability generating function. Specifically, for the sum of a random number of independent and identically distributed (i.i.d.) nonnegative integer-valued r.v.s, denoted as ##S_N##, the theorem asserts that if both the expected number of terms ##EN## and the expected value of the r.v. ##E|X|## are finite, then the expected value of the sum ##ES_N## equals the product of these two expectations. The requirement for both ##EN<\infty## and ##E|X|<\infty## is essential to ensure the finiteness of ##E|S_N|##, as demonstrated through the application of triangle inequalities.

PREREQUISITES
  • Understanding of probability generating functions (PGFs)
  • Familiarity with the concept of expected value in probability theory
  • Knowledge of independent and identically distributed (i.i.d.) random variables
  • Basic proficiency in applying the triangle inequality in mathematical proofs
NEXT STEPS
  • Study the properties and applications of probability generating functions (PGFs)
  • Explore the implications of the law of total expectation in probability theory
  • Learn about the conditions for the existence of expected values for sums of random variables
  • Investigate the concept of measurability in the context of random variables and their sums
USEFUL FOR

Mathematicians, statisticians, and students of probability theory who are looking to deepen their understanding of expected values and probability generating functions, particularly in the context of sums of random variables.

psie
TL;DR: I am confused about a proof concerning the expectation of a sum of a random number of random variables.
There's a theorem in An Intermediate Course in Probability by Gut that says that if ##E|X|<\infty##, then ##EX=g_X'(1)##, where ##g_X## is the probability generating function. Now consider the r.v. ##S_N##, the sum of a random number ##N## of terms of i.i.d. r.v.s ##X_1,X_2,\ldots## (everything is nonnegative integer-valued, and ##N## is independent of ##X_1,X_2,\ldots##). One can derive the probability generating function of ##S_N##, namely ##g_{S_N}(t)=g_N(g_X(t))##. I am now reading a theorem that states:

Theorem If ##EN<\infty## and ##E|X|<\infty##, then ##ES_N=EN\cdot EX##.

The author proves this using the theorem I stated in the beginning, namely that ##E|X|<\infty\implies EX=g_X'(1)##. What I don't understand is why we require ##EN<\infty## and ##E|X|<\infty##. For ##ES_N## to exist via generating functions, we require ##E|S_N|<\infty##, but I don't see how this means that we should require ##EN<\infty## and ##E|X|<\infty##.

One idea that comes to mind is the following, but I'm not sure if it is correct: $$E|S_N|=E(|X_1+\ldots +X_N|)\leq E(|X_1|+\ldots +|X_N|)=E(N|X_1|)=EN\, E|X_1|,$$ and so ##E|S_N|## is finite if ##EN## and ##E|X_1|## are finite, as required by the theorem. But I'm doubting whether the step ##E(|X_1|+\ldots +|X_N|)=E(N|X_1|)## is correct. Grateful for any confirmation or help.
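As a numerical sanity check on the identity ##ES_N=EN\cdot EX##, here is a small Monte Carlo sketch. The distributions chosen (##N\sim\mathrm{Uniform}\{0,\ldots,10\}##, ##X_i\sim\mathrm{Bernoulli}(0.3)##) are hypothetical illustrations, not taken from the thread:

```python
import random

random.seed(42)

# Hypothetical illustration (not from the thread): N ~ Uniform{0,...,10},
# X_i ~ Bernoulli(0.3) i.i.d., with N independent of the X_i.
EN = 5.0    # mean of Uniform{0,...,10}
EX = 0.3    # mean of Bernoulli(0.3)

trials = 200_000
total = 0
for _ in range(trials):
    n = random.randint(0, 10)  # draw N
    # S_N = X_1 + ... + X_N for this realisation of N
    total += sum(1 for _ in range(n) if random.random() < 0.3)

estimate = total / trials
print(estimate, EN * EX)  # the estimate should be close to EN * EX = 1.5
```

With 200,000 trials the Monte Carlo error is on the order of a few thousandths, so the estimate lands close to ##EN\cdot EX=1.5##.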
 
You can start with
##E|S_N|=\sum_{k=1}^\infty P(N=k)\,E|X_1+\cdots+X_k|##

And now you are doing triangle inequalities on a fixed number of terms.
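Spelling out the suggested computation (a sketch, assuming ##N## is independent of the ##X_i## and using that the ##X_i## are identically distributed): $$E|S_N|=\sum_{k=1}^\infty P(N=k)\,E|X_1+\cdots+X_k|\leq\sum_{k=1}^\infty P(N=k)\,k\,E|X_1|=E|X_1|\sum_{k=1}^\infty k\,P(N=k)=EN\cdot E|X_1|,$$ where the inequality is the triangle inequality applied to the fixed number ##k## of terms in each summand.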
 
Office_Shredder said:
You can start with
##E|S_N|=\sum_{k=1}^\infty P(N=k)\,E|X_1+\cdots+X_k|##

And now you are doing triangle inequalities on a fixed number of terms.
Silly question maybe, but of which variables is ##S_N##, and consequently ##|S_N|##, a function? Certainly of ##N##, but is it correct to say it is also a function of ##X_1,\ldots,X_N##?
 
I think we should be able to write $$S_N = \sum_{j = 1}^{\infty}X_j \mathbf1_{j \leq N},$$ so ##S_N## is ##\sigma((Y_n)_{n\in\mathbb N})##-measurable, where ##Y_1=N, Y_2=X_1, Y_3=X_2, \ldots##.
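The representation above can be checked numerically. This is an illustrative sketch (distributions chosen arbitrarily, not from the thread) verifying that ##\sum_{j\geq 1} X_j\mathbf 1_{j\leq N}## agrees with the direct partial sum ##X_1+\cdots+X_N##; since ##N\leq 10## here, the infinite sum can be truncated at ##j=10##:

```python
import random

random.seed(0)

# Check that S_N = sum_{j>=1} X_j * 1{j <= N} matches the direct partial
# sum X_1 + ... + X_N on simulated data (illustrative choices only).
for _ in range(1_000):
    n = random.randint(0, 10)                       # N
    xs = [random.randint(0, 4) for _ in range(10)]  # X_1, ..., X_10
    direct = sum(xs[:n])
    via_indicators = sum(x for j, x in enumerate(xs, start=1) if j <= n)
    assert direct == via_indicators
print("indicator representation agrees with the direct partial sum")
```

The two expressions agree by construction for every realisation, which is exactly why the indicator form is a convenient way to see the measurability of ##S_N##.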
 
