MHB What Common Mistakes Occur When Calculating Moment Generating Functions (MGFs)?

  • Thread starter: nacho-man
  • Tags: Moments, Mystery
nacho-man
Please refer to the attached image. The concept of the MGF still plagues me.

I got an invalid answer when I tried this.

What I did was:

$ \int e^{tx}f_{X}(x)dx $
= $ \int_{-\infty}^{+\infty} e^{tx}(p \lambda e^{-\lambda x} + (1-p)\mu e^{-x\mu})dx$

I was a bit wary at this point, because the p and (1-p) reminded me of the Bernoulli distribution, but I could not find any relation to it.

I separated the two integrals and ended up with
$ p \lambda \int_{-\infty}^{+\infty}e^{tx-x\lambda}dx + ... $, which I knew was immediately wrong because that integral does not converge.
What did I do wrong?

What does the MGF even tell us? First, second, nth moment: what does this mean to me?
 

Attachments

  • Untitled.jpg
nacho said:
I separated the two integrals and ended up with $ p \lambda \int_{-\infty}^{+\infty}e^{tx-x\lambda}dx + ... $, which I knew was immediately wrong because that integral does not converge. What did I do wrong?

What does the MGF even tell us? First, second, nth moment: what does this mean to me?

By definition it is...

$\displaystyle M(t) = E \{ e^{t X} \} = \int_{- \infty}^{+ \infty} f(x)\, e^{t x}\, dx = \int_{0}^{\infty} \{p \lambda e^{- \lambda x} + (1-p) \mu e^{- \mu x} \}\, e^{t x}\, dx = \frac{p}{1 - \frac{t}{\lambda}} + \frac{1-p}{1-\frac{t}{\mu}}\ (1)$

... where the integral effectively runs from $0$ to $+\infty$ because $f(x) = 0$ for $x < 0$; integrating over the whole real line is what made the attempt in the original post diverge...

The knowledge of M(t) permits us to find the mean and variance of X with the formula...

$\displaystyle E \{X^{n}\} = M^{(n)} (0)\ (2)$

... so that...

$\displaystyle E \{X\} = \frac{p}{\lambda} + \frac{1-p}{\mu}\ (3)$

$\displaystyle E \{X^{2}\} = \frac{2 p}{\lambda^{2}} + \frac{2 (1-p)}{\mu^{2}}\ (4)$

$\displaystyle \sigma^{2} = E \{X^{2} \} - E^{2} \{ X \} = \frac{2 p - p^{2}}{\lambda^{2}} + \frac{2 (1-p) - (1-p)^{2}}{\mu^{2}} - 2\, \frac{p (1-p)}{\lambda \mu}\ (5)$

Kind regards

$\chi$ $\sigma$
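
For anyone who wants to check the algebra, here is a minimal SymPy sketch that re-derives (1) and the moments (3)-(5). It is only an illustration: it assumes SymPy is installed, and the symbol names p, lam, mu simply mirror the notation in the thread.

```python
import sympy as sp

x, t = sp.symbols('x t', real=True)
p, lam, mu = sp.symbols('p lambda mu', positive=True)

# Mixture of two exponential densities, supported on x >= 0
f = p * lam * sp.exp(-lam * x) + (1 - p) * mu * sp.exp(-mu * x)

# MGF: E[e^{tX}] = int_0^oo e^{tx} f(x) dx, valid for t < min(lam, mu).
# conds='none' keeps the convergent branch and drops the condition on t.
M = sp.integrate(sp.exp(t * x) * f, (x, 0, sp.oo), conds='none')

closed_form = p / (1 - t / lam) + (1 - p) / (1 - t / mu)
print(sp.simplify(M - closed_form))     # 0 -> matches equation (1)

# Moments via E[X^n] = M^(n)(0), equation (2)
EX  = sp.diff(M, t, 1).subs(t, 0)       # p/lam + (1-p)/mu,          eq. (3)
EX2 = sp.diff(M, t, 2).subs(t, 0)       # 2p/lam**2 + 2(1-p)/mu**2,  eq. (4)
var = sp.simplify(EX2 - EX**2)          # variance, eq. (5)
print(sp.simplify(EX), sp.simplify(EX2), var)
```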
 
chisigma said:
$\displaystyle M(t) = \int_{0}^{\infty} \{p \lambda e^{- \lambda x} + (1-p) \mu e^{- \mu x} \}\, e^{t x}\, dx = \frac{p}{1 - \frac{t}{\lambda}} + \frac{1-p}{1-\frac{t}{\mu}}\ (1)$
I don't see how this integral converges. How did you get that answer?
 
nacho said:
I don't see how this integral converges. How did you get that answer?

It is...

$\displaystyle \lambda \int_{0}^{\infty} e^{- (\lambda-t) x}\, dx = \frac{\lambda}{t - \lambda}\, \Big|e^{- (\lambda-t) x}\Big|_{0}^{\infty} = \frac{1}{1-\frac{t}{\lambda}}\ (1)$

... and [of course...] the integral in (1) converges if $\displaystyle t< \lambda$. That is not a disadvantage because, from the practical point of view, what matters is the behaviour of M(t) around t=0...

Kind regards

$\chi$ $\sigma$
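
To see the convergence point concretely, here is a second small sketch under the same assumptions (SymPy, illustrative only): over the support $[0, \infty)$ the integral is finite for $t < \lambda$, while extending it to the whole real line, as in the original attempt, diverges.

```python
import sympy as sp

x = sp.symbols('x', real=True)
t, lam = sp.symbols('t lambda', positive=True)

integrand = lam * sp.exp((t - lam) * x)   # lambda * e^{tx} * e^{-lambda x}

# Over the support of the density, [0, oo): finite whenever t < lam.
# conds='none' keeps the convergent branch and drops the condition.
I_support = sp.integrate(integrand, (x, 0, sp.oo), conds='none')
print(sp.simplify(I_support - 1 / (1 - t / lam)))   # 0, i.e. lambda/(lambda - t)

# Over the whole real line with concrete numbers (t = 1, lam = 2) the
# exponent (t - lam)*x = -x blows up as x -> -oo, so the integral diverges.
print(sp.integrate(2 * sp.exp(-x), (x, -sp.oo, sp.oo)))  # oo
```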
 
