MHB Moment generating function question

oyth94
Let $X_{1}, X_{2}, \dots, X_{n}$ be independent random variables that all have the same distribution, let $N$ be an independent non-negative integer valued random variable, and let $S_{N} := X_{1} + X_{2} + \dots + X_{N}$. Find an expression for the moment generating function of $S_{N}$.

So all I know is that the random variables are i.i.d., but I am not sure what distribution they have, which I would need in order to find the moment generating function. How do I solve this question?
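In other words, the quantity to find is the function

$\displaystyle m_{S_{N}} (t) = E \{e^{t\ S_{N}}\},$

and, as the replies below work out, no specific distribution for the $X_{i}$ has to be assumed: the answer can be written in terms of their common moment generating function and of the distribution of $N$.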
 
Re: moment generating function question

oyth94 said:
Let $X_{1}, X_{2}, \dots, X_{n}$ be independent random variables that all have the same distribution, let $N$ be an independent non-negative integer valued random variable, and let $S_{N} := X_{1} + X_{2} + \dots + X_{N}$. Find an expression for the moment generating function of $S_{N}$.

So all I know is that the random variables are i.i.d., but I am not sure what distribution they have, which I would need in order to find the moment generating function. How do I solve this question?

I'm not sure I have understood correctly... is N, the number of random variables, itself a random variable?...

Kind regards

$\chi$ $\sigma$
 
Re: moment generating function question

oyth94 said:
Let $X_{1}, X_{2}, \dots, X_{n}$ be independent random variables that all have the same distribution, let $N$ be an independent non-negative integer valued random variable, and let $S_{N} := X_{1} + X_{2} + \dots + X_{N}$. Find an expression for the moment generating function of $S_{N}$.

So all I know is that the random variables are i.i.d., but I am not sure what distribution they have, which I would need in order to find the moment generating function. How do I solve this question?

Let us suppose that we know the probabilities...

$\displaystyle p_{n}= P \{N=n\}\ (1)$

... and that each continuous r.v. is $\displaystyle \mathcal {N} (0,\sigma)$; then...

$\displaystyle m_{S_{N}} (t) = \sum_{n=0}^{\infty} p_{n}\ e^{\frac{n}{2}\ \sigma^{2}\ t^{2}}\ (2)$
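To see where the exponent in (2) comes from: the MGF of a single $\displaystyle \mathcal {N} (0,\sigma)$ variable, and hence of a sum of $n$ independent copies, is

$\displaystyle E \{e^{t\ X_{1}}\} = e^{\frac{1}{2}\ \sigma^{2}\ t^{2}}, \qquad E \{e^{t\ (X_{1} + \dots + X_{n})}\} = \left( e^{\frac{1}{2}\ \sigma^{2}\ t^{2}} \right)^{n} = e^{\frac{n}{2}\ \sigma^{2}\ t^{2}},$

so that conditioning on $\{N = n\}$ and summing over $n$ with weights $p_{n}$ gives (2).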

Kind regards

$\chi$ $\sigma$
 
Re: moment generating function question

chisigma said:
Let us suppose that we know the probabilities...

$\displaystyle p_{n}= P \{N=n\}\ (1)$

... and that each continuous r.v. is $\displaystyle \mathcal {N} (0,\sigma)$; then...

$\displaystyle m_{S_{N}} (t) = \sum_{n=0}^{\infty} p_{n}\ e^{\frac{n}{2}\ \sigma^{2}\ t^{2}}\ (2)$

Kind regards

$\chi$ $\sigma$

I'm not sure how you arrived at this answer... can you please explain?
 
If I understood correctly, the $X_{i},\ i=1,2,\dots,N$ are continuous r.v.'s with the same p.d.f. $f(x)$ [which is not specified...] and $N$ is a discrete r.v. with probability function $p_{n} = P \{N=n\},\ n=0,1,2,\dots$. Setting $S = X_{1} + X_{2} + \dots + X_{N}$, conditionally on $N = n \ge 1$ the r.v. $S$ has p.d.f. ...

$\displaystyle f_{n} (x) = f(x) * f(x) * \dots * f(x),\ n\ \text{times}\ (1)$

... and the moment generating function is...

$\displaystyle m_{S} (t) = E \{e^{S\ t}\} = p_{0} + \sum_{n=1}^{\infty} p_{n}\ \int_{- \infty}^{+ \infty} e^{x\ t}\ f_{n} (x)\ dx\ (2)$

... where the term $p_{0}$ accounts for the event $\{N=0\}$, on which $S=0$.

Kind regards

$\chi$ $\sigma$
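Written in terms of the common MGF $\displaystyle m_{X}(t) = E \{e^{t\ X_{1}}\}$, the same conditioning argument gives an answer that does not depend on the particular form of $f(x)$:

$\displaystyle m_{S_{N}} (t) = E \{e^{t\ S_{N}}\} = \sum_{n=0}^{\infty} p_{n}\ E \{e^{t\ (X_{1} + \dots + X_{n})}\} = \sum_{n=0}^{\infty} p_{n}\ [m_{X}(t)]^{n} = E \{[m_{X}(t)]^{N}\} = G_{N}\big(m_{X}(t)\big),$

where $\displaystyle G_{N}(s) = E \{s^{N}\}$ is the probability generating function of $N$ and the $n=0$ term equals $p_{0}$, since $S_{N}=0$ on $\{N=0\}$. Taking $m_{X}(t) = e^{\frac{1}{2}\sigma^{2} t^{2}}$ recovers the Gaussian formula (2) of the earlier post.

A quick Monte Carlo sanity check of this identity is sketched below; the distributions, parameter values and variable names are chosen only for illustration, assuming $N \sim$ Poisson$(\lambda)$ (so that $G_{N}(s) = e^{\lambda(s-1)}$) and $X_{i} \sim \mathcal{N}(0,\sigma)$:

```python
# Monte Carlo check of m_{S_N}(t) = G_N(m_X(t)) for an illustrative choice:
# N ~ Poisson(lam), X_i ~ Normal(0, sigma), so that
#   m_X(t)  = exp(sigma^2 t^2 / 2)
#   G_N(s)  = exp(lam * (s - 1))
#   m_SN(t) = exp(lam * (exp(sigma^2 t^2 / 2) - 1))
import numpy as np

rng = np.random.default_rng(0)
lam, sigma, t = 2.0, 1.0, 0.3
n_samples = 200_000

# Draw N, then S_N directly: given N, a sum of N i.i.d. N(0, sigma) variables
# is N(0, sigma * sqrt(N)), and it equals 0 when N = 0.
N = rng.poisson(lam, size=n_samples)
S = rng.normal(0.0, sigma * np.sqrt(N))

empirical = np.exp(t * S).mean()
theoretical = np.exp(lam * (np.exp(0.5 * sigma**2 * t**2) - 1.0))
print(empirical, theoretical)  # the two values should agree to about 2 decimals
```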
 