MHB Calculating MGF: Solutions for Undefined Limit Issue

  • Thread starter: Usagi
Usagi
http://img253.imageshack.us/img253/7306/moments.jpg

This is a pretty weird question... because:

$\displaystyle E(e^{tX}) = M(t) = \int_0^{\infty} e^{xt} e^{-x}\ dx = \int_0^{\infty} e^{-x(1-t)}\ dx = \lim_{k \to \infty} \left[\frac{e^{x(t-1)}}{t-1}\right]_0^k$

But the limit $\displaystyle \lim_{k \to \infty} \frac{e^{k(t-1)}}{t-1}$ is undefined?

How am I meant to compute the MGF then?

Thanks
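One way to see what is happening with that limit is to look at the boundary term $\displaystyle \frac{e^{k(t-1)}}{t-1}$ numerically. A minimal sketch, assuming NumPy is available: the term dies off as $k$ grows when $t<1$ and blows up when $t>1$, so the integral is only finite for $t<1$.

```python
# Sketch: evaluate the boundary term e^{k(t-1)}/(t-1) for growing k.
# For t < 1 it tends to 0, so only the lower limit x = 0 contributes;
# for t > 1 it grows without bound and the integral diverges.
import numpy as np

def boundary_term(t, k):
    return np.exp(k * (t - 1.0)) / (t - 1.0)

for t in (0.5, 1.5):
    values = [boundary_term(t, k) for k in (1, 10, 50)]
    print(f"t = {t}: {values}")

# t = 0.5: terms shrink toward 0, leaving M(t) = 0 - 1/(t-1) = 1/(1-t) = 2.
# t = 1.5: terms explode, so E[e^{tX}] does not exist there.
```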
 
Usagi said:
http://img253.imageshack.us/img253/7306/moments.jpg

This is a pretty weird question... because:

$\displaystyle E(e^{tX}) = M(t) = \int_0^{\infty} e^{xt} e^{-x}\ dx = \int_0^{\infty} e^{-x(1-t)}\ dx = \lim_{k \to \infty} \left[\frac{e^{x(t-1)}}{t-1}\right]_0^k$

But the limit $\displaystyle \lim_{k \to \infty} \frac{e^{k(t-1)}}{t-1}$ is undefined?

How am I meant to compute the MGF then?

Thanks

If $\displaystyle 1-t>0$, i.e. $t<1$, then...

$\displaystyle E\{e^{t\ X}\}= \int_{0}^{\infty} e^{-x\ (1-t)}\ dx = \left[-\frac{e^{-x\ (1-t)}}{1-t}\right]_{0}^{\infty} = \frac{1}{1-t}$ (1)

The condition $t<1$ is no real limitation because in practice we are interested in the function $M(t)$ and its derivatives at $t=0$...

Kind regards

$\chi$ $\sigma$
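As a quick sanity check of (1), here is a minimal SymPy sketch (assuming SymPy is available; the substitution $s = 1-t$ with $s>0$ simply encodes the condition $t<1$):

```python
# Sketch: verify \int_0^\infty e^{-x(1-t)} dx = 1/(1-t) under the condition 1 - t > 0.
import sympy as sp

x = sp.symbols('x', positive=True)
s = sp.symbols('s', positive=True)   # s plays the role of 1 - t, so s > 0 means t < 1
t = sp.symbols('t', real=True)

M = sp.integrate(sp.exp(-s * x), (x, 0, sp.oo))   # evaluates to 1/s since s > 0
print(M.subs(s, 1 - t))                           # 1/(1 - t), valid only for t < 1
```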
 
Thanks chisigma,

However, how did you know to set $1-t>0$? I thought the restriction on $t$ was that there exists a positive $b$ such that $t \in (-b,b)$.

How does that relate to setting $1-t>0$, though?

Thanks again
 
Usagi said:
Thanks chisigma,

However, how did you know to set $1-t>0$? I thought the restriction on $t$ was that there exists a positive $b$ such that $t \in (-b,b)$.

How does that relate to setting $1-t>0$, though?

Thanks again

The integral defining the moment generating function...

$\displaystyle M(t)= E\{e^{t\ X}\}= \int_{0}^{\infty} e^{-x\ (1-t)}\ dx$ (1)

... converges for $\displaystyle t<1$ to $\displaystyle M(t)= \frac{1}{1-t}$. The series expansion...

$\displaystyle M(t)= \frac{1}{1-t}= \sum_{n=0}^{\infty} t^{n}$ (2)

... converges for $\displaystyle -1<t<1$, and (2) allows an easy computation of the moments...

$\displaystyle E\{X^{n}\}= M^{(n)}(0)= n!$ (3)

Kind regards

$\chi$ $\sigma$
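A small sketch checking (3), assuming SymPy is available: differentiating $\displaystyle M(t)=\frac{1}{1-t}$ $n$ times and evaluating at $t=0$ does give $n!$.

```python
# Sketch: the n-th derivative of M(t) = 1/(1-t) at t = 0 equals n!,
# i.e. E[X^n] = n! for the standard exponential distribution.
import sympy as sp

t = sp.symbols('t')
M = 1 / (1 - t)

for n in range(6):
    moment = sp.diff(M, t, n).subs(t, 0)
    print(f"E[X^{n}] = M^({n})(0) = {moment}, n! = {sp.factorial(n)}")
```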
 