Maximum Likelihood Estimator of Bin(m,p)

Ted123

Homework Statement



Let [itex]\displaystyle X_1 ,..., X_n \stackrel {\text{i.i.d.}}{\sim} \text{Bin} (m,p)[/itex] where [itex]m[/itex] is known. Find the MLE [itex]\hat{p}_n[/itex] of [itex]p[/itex] and hence show that [itex]\hat{p}_n[/itex] is unbiased.

The Attempt at a Solution



Can anyone check my attempt please?

[itex]\displaystyle L(m,p) = \prod_{i=1}^n {m \choose x_i} p^{x_i} (1-p)^{m-x_i}[/itex]

[itex]\displaystyle \log (L) = \log \left[ \prod_{i=1}^n {m \choose x_i} \right] + \log \left( p^{\sum_{i=1}^n x_i} \right) + \log \left[ (1-p)^{\sum_{i=1}^n (m-x_i)} \right][/itex]

[itex]\displaystyle = \sum_{i=1}^n \log {m \choose x_i} + \log (p)\sum_{i=1}^n x_i + \log(1-p) \sum_{i=1}^n (m-x_i)[/itex]

[itex]\displaystyle \frac{\partial}{\partial p} \log(L) = \frac{1}{p} \sum_{i=1}^n x_i - \frac{1}{1-p} \sum_{i=1}^n (m-x_i)[/itex]

Setting [itex]\displaystyle \frac{\partial}{\partial p} \log(L) = 0[/itex]

[itex]\displaystyle \frac{1}{p} \sum_{i=1}^n x_i - \frac{1}{1-p} \sum_{i=1}^n (m-x_i) =0[/itex]

[itex]\displaystyle (1-p) \sum_{i=1}^n x_i - p \sum_{i=1}^n (m-x_i) =0[/itex]

[itex]\displaystyle p = \frac{\sum_{i=1}^n x_i}{\sum_{i=1}^n x_i + \sum_{i=1}^n (m-x_i)}[/itex]

The denominator reduces to [itex]nm[/itex] and we have

[itex]\displaystyle p = \frac{\sum_{i=1}^n x_i}{nm} = \frac{\overline{x}}{m}[/itex]

[itex]\displaystyle \hat{p}_n = \frac{\overline{X}}{m}[/itex]

This critical point is indeed a maximum, since [itex]\displaystyle \frac{\partial^2}{\partial p^2} \log(L) = -\frac{1}{p^2} \sum_{i=1}^n x_i - \frac{1}{(1-p)^2} \sum_{i=1}^n (m-x_i) < 0[/itex] for [itex]0 < p < 1[/itex].
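As a sanity check (not part of the required solution), the closed form [itex]\hat{p}_n = \overline{x}/m[/itex] can be compared against a direct numerical maximisation of the log-likelihood. The data below are simulated, so [itex]m[/itex], [itex]p[/itex] and [itex]n[/itex] are just illustrative choices:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

rng = np.random.default_rng(0)
m, p_true, n = 10, 0.3, 500          # illustrative values
x = rng.binomial(m, p_true, size=n)  # simulated i.i.d. Bin(m, p) sample

# Closed-form MLE derived above: p_hat = x_bar / m
p_hat = x.mean() / m

# Numerically maximise the log-likelihood over p in (0, 1)
neg_loglik = lambda p: -binom.logpmf(x, m, p).sum()
res = minimize_scalar(neg_loglik, bounds=(1e-6, 1 - 1e-6), method='bounded')

print(p_hat, res.x)  # the two estimates agree to optimiser tolerance
```

The agreement of the numerical maximiser with [itex]\overline{x}/m[/itex] supports the algebra above.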

[itex]\hat{p}_n[/itex] is unbiased since [itex]\displaystyle \mathbb{E}[\hat{p}_n] = \mathbb{E}\left[ \frac{\overline{X}}{m}\right] = \frac{1}{m} \mathbb{E}\left[\:\overline{X}\: \right] = \frac{1}{m} \mathbb{E} \left[ \frac{1}{n} \sum_{i=1}^n X_i \right] = \frac{1}{nm} \sum_{i=1}^n \mathbb{E}[X_i] = \frac{n}{nm} (mp) = p[/itex]

[itex]\therefore \hat{p}_n[/itex] is an unbiased estimator of [itex]p[/itex] .
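The unbiasedness can also be illustrated by simulation (again not required for the homework): averaging [itex]\hat{p}_n[/itex] over many simulated samples should recover [itex]p[/itex]. The constants here are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
m, p, n = 10, 0.3, 50    # illustrative values
reps = 20000             # number of simulated samples

# For each replicate, draw X_1, ..., X_n ~ Bin(m, p) and form p_hat = X_bar / m
samples = rng.binomial(m, p, size=(reps, n))
p_hats = samples.mean(axis=1) / m

print(p_hats.mean())  # close to p = 0.3, consistent with unbiasedness
```

The Monte Carlo average of [itex]\hat{p}_n[/itex] matches [itex]p[/itex] up to simulation error, as the expectation calculation predicts.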
 