# Finding probability using moment-generating functions

## Main Question or Discussion Point

I'm working through Schaum's Outline of Probability, Random Variables, and Random Processes, and am stuck on a question about moment-generating functions. If anyone has the 2nd edition, it is question 4.60, part (b).

The question gives the following initial information: $E[X^k]=0.8$ for $k = 1, 2, \dots$, and the moment-generating function is $M_X(t)=0.2+0.8\sum_{k=0}^{\infty}\frac{t^k}{k!}=0.2+0.8e^t$.

The question asks for $P(X=0)$ and $P(X=1)$. I'm starting with $P(X=0)$. By the definition of the moment-generating function of a discrete random variable, I can write:

$\sum_{i}e^{tx_i}p_X(x_i)=0.2+0.8e^t$

For $P(X=0)$, I reasoned that the above equation becomes $e^{t(0)}p_X(0)=0.2+0.8e^t$. The LHS simplifies to $p_X(0)$, which would mean $P(X=0)=0.2+0.8e^t$. But I know that is not the right answer; the book gives $P(X=0)=0.2$. Where am I going wrong?

chiro
Hey brogrammer and welcome to the forums.

If you expand the sum and plug in the values of $x_i$, you get

$e^{0t}P(X=0) + e^{1t}P(X=1) = 0.2 + 0.8e^t,$

i.e. $P(X=0) + e^{t}P(X=1) = 0.2 + 0.8e^t.$

Now can you equate the coefficients of like terms?
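The coefficient-matching step can be sketched symbolically. This is just an illustration (the symbol `u` standing in for $e^t$, and the names `p0`, `p1` for $P(X=0)$, $P(X=1)$, are my own choices): since $1$ and $e^t$ are linearly independent functions of $t$, the coefficients on each side must agree separately.

```python
import sympy as sp

# u stands in for e^t; p0 = P(X=0), p1 = P(X=1)
u, p0, p1 = sp.symbols('u p0 p1')

lhs = p0 + p1 * u                                    # P(X=0) + e^t * P(X=1)
rhs = sp.Rational(1, 5) + sp.Rational(4, 5) * u      # 0.2 + 0.8 * e^t

# The identity must hold for all t, so match the coefficients
# of 1 and of e^t separately.
diff = sp.expand(lhs - rhs)
sol = sp.solve([diff.coeff(u, 0), diff.coeff(u, 1)], [p0, p1])

print(sol)  # {p0: 1/5, p1: 4/5}
```
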

Chiro -

Thanks for the reply. That makes sense. My thick brain didn't realize that the question wants me to see that the sample space of the r.v. $X$ is just $\{0, 1\}$, i.e. a Bernoulli trial with $p = 0.8$.

Thanks man.
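As a quick numeric sanity check of the conclusion (a sketch, with the recovered values $P(X=0)=0.2$, $P(X=1)=0.8$ hard-coded), the MGF of this Bernoulli r.v. matches $0.2+0.8e^t$ at every $t$, and $E[X^k]=0.8$ for all $k \ge 1$ since $X$ only takes the values 0 and 1:

```python
import math

# Recovered Bernoulli probabilities
p0, p1 = 0.2, 0.8

def mgf(t):
    # E[e^{tX}] for X supported on {0, 1}
    return p0 * math.exp(0 * t) + p1 * math.exp(1 * t)

# Agrees with the closed form 0.2 + 0.8*e^t at several points
for t in [-1.0, 0.0, 0.5, 2.0]:
    assert abs(mgf(t) - (0.2 + 0.8 * math.exp(t))) < 1e-12

# Moments: since 0^k = 0 and 1^k = 1, E[X^k] = p1 = 0.8 for k >= 1
for k in range(1, 6):
    assert abs(0**k * p0 + 1**k * p1 - 0.8) < 1e-12
```
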