Finding probability using moment-generating functions

  • #1
I'm working through Schaum's Outline of Probability, Random Variables, and Random Processes, and am stuck on a question about moment-generating functions. If anyone has the 2nd edition, it is question 4.60, part (b).

The question gives the following initial information: [itex]E[X^k]=0.8[/itex] for k = 1, 2, ..., so (using [itex]E[X^0]=1[/itex]) the moment-generating function is [itex]M_X(t)=\sum_{k=0}^{\infty}E[X^k]\frac{t^k}{k!}=1+0.8\sum_{k=1}^{\infty}\frac{t^k}{k!}=0.2+0.8e^t[/itex].
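As a quick sanity check (my own sketch, not part of the textbook problem), here's a short Python snippet that sums the truncated moment series [itex]\sum_k E[X^k]t^k/k![/itex] with [itex]E[X^0]=1[/itex] and [itex]E[X^k]=0.8[/itex] for k ≥ 1, and confirms it matches the closed form [itex]0.2+0.8e^t[/itex]:

```python
import math

def mgf_from_moments(t, terms=60):
    """Truncated series sum of E[X^k] * t^k / k!.
    k = 0 term is E[X^0] = 1; all higher moments equal 0.8."""
    total = 1.0
    for k in range(1, terms):
        total += 0.8 * t**k / math.factorial(k)
    return total

def mgf_closed(t):
    """Closed form given in the problem: 0.2 + 0.8 e^t."""
    return 0.2 + 0.8 * math.exp(t)

# The two agree to floating-point precision at several test points.
for t in (-1.0, 0.0, 0.5, 2.0):
    assert abs(mgf_from_moments(t) - mgf_closed(t)) < 1e-9
```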

The question is asking to find [itex]P(X=0)[/itex] and [itex]P(X=1)[/itex]. I'm trying to do the first part and solve [itex]P(X=0)[/itex]. By the definition of a moment-generating function for discrete random variables, I know I can use the following equation:

[itex]\sum_{i}e^{tx_i}p_X(x_i)=0.2+0.8e^t[/itex]

For [itex]P(X=0)[/itex], the above equation becomes: [itex]e^{t(0)}p_X(0)=0.2+0.8e^t[/itex]. The LHS simplifies to [itex]p_X(0)[/itex] which means [itex]P(X=0)=0.2+0.8e^t[/itex]. But I know that is not the right answer. The right answer is [itex]P(X=0)=0.2[/itex].

Can someone please show me where I'm going wrong? Thanks in advance for your help.
 
  • #2
Hey brogrammer and welcome to the forums.

If you expand the sum and plug in the values of [itex]x_i[/itex], you get

[itex]e^{0t}P(X=0) + e^{1t}P(X=1) = P(X=0) + e^{t}P(X=1) = 0.2 + 0.8e^t[/itex]

Now can you equate the coefficients of like terms?
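To make the coefficient-matching concrete, here's a small numerical sketch (my own, just to illustrate the idea): matching the constant term gives P(X=0) = 0.2 and matching the coefficient of [itex]e^t[/itex] gives P(X=1) = 0.8, and the resulting two-point pmf reproduces the stated MGF:

```python
import math

# pmf read off by matching coefficients in
# P(X=0) + e^t * P(X=1) = 0.2 + 0.8 e^t
p0, p1 = 0.2, 0.8

def mgf_from_pmf(t):
    """E[e^{tX}] for a random variable taking values 0 and 1."""
    return p0 * math.exp(t * 0) + p1 * math.exp(t * 1)

# Check against the closed form at several values of t.
for t in (-2.0, 0.0, 1.0, 3.0):
    assert abs(mgf_from_pmf(t) - (0.2 + 0.8 * math.exp(t))) < 1e-12
```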
 
  • #3
Chiro -

Thanks for the reply. That makes sense. My thick brain didn't realize that the question wants me to see that the sample space of the r.v. X is just {0, 1}, i.e. a Bernoulli random variable. Now it makes sense.

Thanks man.
 
