Probability Mass Function/Moment Generating Function

  • #1
tangodirt

Homework Statement



The pmf of a random variable X is given by f(x) = π(1 − π)^x for x = 0, 1, 2, ..., and 0 ≤ π ≤ 1.
a) Show that this function actually is a pmf.
b) Find E(X).
c) Find the moment generating function of X, M_X(t) = E(e^{tX}).

2. The attempt at a solution
My solution was done numerically in MATLAB, but I suppose that there is probably an analytical solution as well. My biggest issue is the interpretation of π.

[Broken image (poster's proof attempt for part (a)): http://img401.imageshack.us/img401/2015/proofox.png]

For the second part, I also did this numerically, by solving the series:

[tex]\sum_{x}x \cdot p(x) = \sum_{x}x \cdot P(X=x)[/tex]
Which evaluates, again in MATLAB, to E(X) = 1, but I suspect there is an analytical method for this as well?

The part I am struggling the most with is the last bit. I can get my MGF down to:

[tex]\sum_{x=0}^{\infty} e^{tx} \pi (1-\pi)^{x}[/tex]
But I am not sure how to get rid of the infinite summation. I tried an infinite geometric series, but it only holds true for:

[tex]|e^{t}(1-\pi)| < 1[/tex]
Which means that E(X) cannot be found with the MGF.

Any ideas?
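(Editorial note on the convergence worry above: |e^t(1−π)| < 1 holds for all t < −ln(1−π), an interval that contains t = 0 in its interior, and an MGF only needs to exist on an open interval around 0. As a numerical sanity check of the geometric-series closed form M_X(t) = π/(1 − (1−π)e^t), here is a short Python sketch standing in for the MATLAB runs mentioned in the post; the function names are illustrative:)

```python
import math

# Compare the partial sum of the MGF series with the geometric-series
# closed form, for a t inside the region of convergence.
def mgf_partial(t, pi_, n_terms=200):
    """Partial sum of sum_x e^{t x} * pi * (1 - pi)^x over x = 0..n_terms-1."""
    return sum(math.exp(t * x) * pi_ * (1.0 - pi_) ** x for x in range(n_terms))

def mgf_closed(t, pi_):
    """Closed form pi / (1 - (1 - pi) e^t), valid when |e^t (1 - pi)| < 1."""
    return pi_ / (1.0 - (1.0 - pi_) * math.exp(t))

pi_ = 0.4
t = 0.2                       # e^0.2 * 0.6 ≈ 0.73 < 1, so the series converges
print(mgf_partial(t, pi_))
print(mgf_closed(t, pi_))     # the two values should agree closely
```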
 
  • #2
tangodirt said:

[quoted post #1 in full]

You should not need Matlab to do part (a); in fact, relying on software instead of thinking means you will _never_ learn the material. You need to show that sum{(1-p)^k, k=0..infinity} is 1/p. Are you really sure you have never, ever, seen this type of material before? It is just elementary algebra.

Part (b) is just a little bit trickier, but again involves only standard results that are widely available. If q = 1-p, you want to evaluate S = 0 + 1*q + 2*q^2 + 3*q^3 + ... . You can write this as q + q^2 + q^3 + ... + [q^2 + 2*q^3 + 3*q^4 + ...], and [...] = q*(q + 2*q^2 + 3*q^3 + ...) = q*S. The first summation is q*(1 + q + q^2 + ...) = q/(1-q), so we have S = q/(1-q) + q*S. This is an equation for S that you can solve. (There are many other ways of getting the result, but the above is the most "elementary".)

You can get part (c) similarly: just write out the first few terms of what you are supposed to evaluate, then look for a pattern.

RGV
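(The splitting step above can be checked numerically. A minimal Python sketch; the helper name S_partial is illustrative:)

```python
# Check the identity S = q/(1-q) + q*S for S = sum_{k>=1} k*q^k,
# together with the solved closed form S = q/(1-q)^2.
def S_partial(q, n_terms=500):
    """Partial sum of k*q^k for k = 1 .. n_terms-1."""
    return sum(k * q ** k for k in range(1, n_terms))

q = 0.3
S = S_partial(q)
print(S)                       # direct summation
print(q / (1 - q) + q * S)     # right-hand side of the identity
print(q / (1 - q) ** 2)        # closed form obtained by solving for S
```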
 
  • #3
Ray Vickson said:
You should not need Matlab to do part (a); in fact, relying on software instead of thinking means you will _never_ learn the material. You need to show that sum{(1-p)^k, k=0..infinity} is 1/p. Are you really sure you have never, ever, seen this type of material before? It is just elementary algebra.

To be honest, I cannot recall if I've seen the material or not. It seems like I have, but it's been so long that I cannot remember.

Ray Vickson said:
Part (b) is just a little bit trickier, but again, involves only standard results that are widely available If q = 1-p, you want to evaluate S = 0 + 1*q + 2*q^2 + 3*q^3 + ... . You can write this as q + q^2 + q^3 + ... + [q^2 + 2*q^3 + 3*q^4 + ... ], and [...] = q*(q + 2*q^2 + 3*q^3 +...) = q*S. The first summation is q*(1 + q + q^2 + ...) = q/(1-q), so we have S = q/(1-q) + q*S. This is an equation for S that you can solve. (There are many other ways of getting the result, but the above is the most "elementary".)

To be honest, this really confuses me. Also, when solving your final equation, I get:

[tex]S = \frac{q}{1-q} + qS[/tex]
[tex]S = \frac{q}{(q-1)^2}[/tex]

Which should return 1 for any value of q such that 0 <= q <= 1, no? For q = 0.5, it evaluates to 2, for q = 0.75, it evaluates to 12, etc.

Ray Vickson said:
You can get part (c) similarly: just write out the first few terms of what you are supposed to evaluate, then look for a pattern.

RGV

Regardless, even if I have learned about series (however long ago), I've never done series manipulation like this, as far as I am aware.
 
  • #4
tangodirt said:
[quoted post #3 in full]

Sorry: I should have said S = p*(0 + 1*q + 2*q^2 + 3*q^3 + ...), which is p times what I wrote before, so you should have gotten S = p*q/(1-q)^2 = p*q/p^2 = q/p. Here, p = 1-q. There is no reason to expect S to be 1, because S is NOT the sum of the probabilities (which would be sum p(x)); it is the expected value of X, which is sum x*p(x).

RGV
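(Presumably the earlier MATLAB result E(X) = 1 came from running the sum with π = 1/2, since q/p = 1 exactly when p = 1/2. A quick Python check of E(X) = q/p; a sketch, not the original MATLAB code:)

```python
# Verify E(X) = q/p for the pmf f(x) = p * q^x, x = 0, 1, 2, ..., with q = 1 - p.
p = 0.25
q = 1.0 - p
expectation = sum(x * p * q ** x for x in range(2000))   # partial sum of x*f(x)
print(expectation)
print(q / p)          # closed form; here 0.75 / 0.25 = 3

# With p = 0.5 the mean is q/p = 1, matching the numerical result in post #1.
print(sum(x * 0.5 * 0.5 ** x for x in range(2000)))
```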
 

1. What is a Probability Mass Function (PMF)?

A Probability Mass Function (PMF) is a statistical function that describes the probability of a discrete random variable taking on a specific value. It maps each possible outcome of a random variable to its associated probability.
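For instance, a minimal illustration in Python, using a fair six-sided die as a hypothetical example:

```python
from fractions import Fraction

# pmf of a fair six-sided die: each of the six outcomes has probability 1/6
pmf = {face: Fraction(1, 6) for face in range(1, 7)}

print(pmf[3])              # probability of rolling a 3
print(sum(pmf.values()))   # a valid pmf must sum to exactly 1
```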

2. How is a PMF different from a Probability Density Function (PDF)?

A PMF is used for discrete random variables, while a PDF is used for continuous random variables. A PMF assigns probabilities to specific values, while a PDF assigns probabilities to ranges of values. Additionally, the sum of all probabilities in a PMF is equal to 1, while the integral of a PDF over its entire range is equal to 1.

3. What is a Moment Generating Function (MGF)?

A Moment Generating Function (MGF) is a mathematical function that uniquely determines the probability distribution of a random variable. It is defined as the expected value of e^(tx), where t is a real number and x is a random variable.

4. How is an MGF used in probability and statistics?

An MGF is used to find the moments (mean, variance, etc.) of a probability distribution. It can also be used to prove properties of a probability distribution, such as the Central Limit Theorem.
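As a concrete illustration of moment extraction (a sketch: the MGF below is the one for the geometric pmf discussed in this thread, and the derivative at t = 0 is approximated by a central finite difference rather than computed symbolically):

```python
import math

# For the pmf f(x) = p*(1-p)^x, the MGF is M(t) = p / (1 - (1-p) e^t).
# The mean is M'(0); approximate it with a central difference and compare
# with the known value (1-p)/p.
def M(t, p):
    return p / (1.0 - (1.0 - p) * math.exp(t))

p = 0.5
h = 1e-6
mean_from_mgf = (M(h, p) - M(-h, p)) / (2 * h)   # ≈ M'(0)
print(mean_from_mgf)
print((1.0 - p) / p)     # exact mean; here 1.0
```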

5. What is the relationship between PMFs and MGFs?

The MGF uniquely determines the distribution, so the PMF of a random variable can in principle be recovered from its MGF; for a discrete random variable, expanding the MGF as a power series in e^t and reading off the coefficients gives the individual probabilities. The moments of the random variable (not the PMF itself) are obtained by taking derivatives of the MGF and evaluating them at t = 0.
