I'm supposed to prove that in a geometric distribution, the expected value,
<br /> \mu = \frac{1}{p}<br />
without using moment generating functions (whatever those are).
I start from the definition of the expected value:
<br /> \mu_x = E(x) = \sum x \cdot p \cdot (1-p)^{x-1}<br />
<br /> \mu_x = p \sum x \cdot (1-p)^{x-1}<br />
<br /> \mu_x = p \sum x \cdot (1-p)^x \cdot (1-p)^{-1}<br />
<br /> \mu_x = \frac{p}{1-p} \sum x \cdot (1-p)^x<br />
Now I get stuck because I don't know how to evaluate the summation. Can anyone help me out?
btw, x runs from 1 to \infty
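Not a proof, but here's a quick numerical sanity check of the claim \mu = 1/p: partial sums of the series x \cdot p \cdot (1-p)^{x-1} should approach 1/p. (The function name below is mine, just for illustration.)

```python
def geometric_mean_partial_sum(p, n_terms=10_000):
    """Partial sum of x * p * (1-p)^(x-1) for x = 1..n_terms.

    For 0 < p < 1 the tail vanishes geometrically, so a large
    finite partial sum approximates the full infinite series.
    """
    return sum(x * p * (1 - p) ** (x - 1) for x in range(1, n_terms + 1))

for p in (0.1, 0.5, 0.9):
    print(p, geometric_mean_partial_sum(p), 1 / p)
```

For each p the partial sum lands on 1/p to many decimal places, which at least confirms the target of the derivation before tackling the summation analytically.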