Finding probability using moment-generating functions

  • Context: Graduate
  • Thread starter: brogrammer
  • Tags: Functions, Probability
SUMMARY

The discussion focuses on solving a problem from Schaum's Outline of Probability, Random Variables, and Random Processes, specifically question 4.60, part (b). The moment-generating function provided is 0.2 + 0.8e^t, and the task is to find P(X=0) and P(X=1). The correct solution is P(X=0) = 0.2 and P(X=1) = 0.8: the random variable X follows a Bernoulli distribution with outcomes 0 and 1. Participants emphasize the importance of equating coefficients in the moment-generating function to derive the probabilities accurately.

PREREQUISITES
  • Understanding of moment-generating functions for discrete random variables
  • Familiarity with Bernoulli trials and their probability distributions
  • Basic knowledge of series expansions and their applications in probability
  • Experience with Schaum's Outline of Probability, Random Variables, and Random Processes
NEXT STEPS
  • Study the properties and applications of moment-generating functions in probability theory
  • Learn how to derive probabilities from moment-generating functions using coefficient comparison
  • Explore Bernoulli distributions and their characteristics in depth
  • Review additional problems from Schaum's Outline to reinforce understanding of random variables
USEFUL FOR

Students of probability theory, educators teaching random variables, and anyone seeking to deepen their understanding of moment-generating functions and their applications in statistical analysis.

brogrammer
I'm working through Schaum's Outline of Probability, Random Variables, and Random Processes, and am stuck on a question about moment-generating functions. If anyone has the 2nd edition, it is question 4.60, part (b).

The question gives the following initial information: E[X^k] = 0.8 for k = 1, 2, \ldots, and the moment-generating function is M_X(t) = 0.2+0.8\sum_{k=0}^{\infty}\frac{t^k}{k!}=0.2+0.8e^t.

The question asks for P(X=0) and P(X=1). I'm trying the first part, P(X=0). By the definition of a moment-generating function for a discrete random variable, I know I can use the following equation:

\sum_{i}e^{tx_i}p_X(x_i)=0.2+0.8e^t

For P(X=0), the above equation becomes: e^{t(0)}p_X(0)=0.2+0.8e^t. The LHS simplifies to p_X(0), which would mean P(X=0)=0.2+0.8e^t. But I know that is not the right answer; the answer given is P(X=0)=0.2.

Can someone please show me where I'm going wrong? Thanks in advance for your help.
 
Hey brogrammer and welcome to the forums.

If you expand the sum over the support {0, 1} and plug in the values of x_i, you get

e^{0t}P(X=0) + e^{1t}P(X=1) = P(X=0) + e^t P(X=1) = 0.2 + 0.8e^t

Now can you equate the coefficients of like terms?
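The coefficient-matching step above can also be checked symbolically. A minimal sketch using sympy, with a symbol u standing in for e^t so the identity becomes a polynomial equation in u:

```python
import sympy as sp

u = sp.Symbol('u')          # stands in for e^t
p0, p1 = sp.symbols('p0 p1')  # unknowns P(X=0) and P(X=1)

# MGF built from the support {0, 1} minus the MGF given in the problem
diff = (p0 + p1*u) - (sp.Rational(1, 5) + sp.Rational(4, 5)*u)

# The identity must hold for all t, so each coefficient of u must vanish
sol = sp.solve([diff.coeff(u, 0), diff.coeff(u, 1)], [p0, p1])
print(sol)  # {p0: 1/5, p1: 4/5}
```

This recovers P(X=0) = 0.2 and P(X=1) = 0.8, matching the book's answer.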
 
Chiro -

Thanks for the reply. That makes sense. My thick brain didn't realize that the question wants me to see that the sample space of the r.v. X is just {0, 1}, i.e. a Bernoulli trial. Now it makes sense.

Thanks man.
 