Maximum Likelihood Estimator of Bin(m,p)

In summary, for i.i.d. data [itex]X_1, \dots, X_n \sim \text{Bin}(m,p)[/itex] with [itex]m[/itex] known, the MLE of [itex]p[/itex] is [itex]\hat{p}_n = \overline{X}/m[/itex], and this estimator is unbiased.
  • #1
Ted123
Homework Statement



Let [itex]\displaystyle X_1 ,..., X_n \stackrel {\text{i.i.d.}}{\sim} \text{Bin} (m,p)[/itex] where [itex]m[/itex] is known. Find the MLE [itex]\hat{p}_n[/itex] of [itex]p[/itex] and hence show that [itex]\hat{p}_n[/itex] is unbiased.

The Attempt at a Solution



Can anyone check my attempt please?

[itex]\displaystyle L(m,p) = \prod_{i=1}^n {m \choose x_i} p^{x_i} (1-p)^{m-x_i}[/itex]

[itex]\displaystyle \log (L) = \sum_{i=1}^n \log {m \choose x_i} + \log \left( p^{\sum_{i=1}^n x_i} \right) + \log \left[ (1-p)^{\sum_{i=1}^n (m-x_i)} \right][/itex]

[itex]\displaystyle = \sum_{i=1}^n \log {m \choose x_i} + \log (p)\sum_{i=1}^n x_i + \log(1-p) \sum_{i=1}^n (m-x_i)[/itex]

[itex]\displaystyle \frac{\partial}{\partial p} \log(L) = \frac{1}{p} \sum_{i=1}^n x_i - \frac{1}{1-p} \sum_{i=1}^n (m-x_i)[/itex]

Setting [itex]\displaystyle \frac{\partial}{\partial p} \log(L) = 0[/itex]

[itex]\displaystyle \frac{1}{p} \sum_{i=1}^n x_i - \frac{1}{1-p} \sum_{i=1}^n (m-x_i) =0[/itex]

[itex]\displaystyle (1-p) \sum_{i=1}^n x_i - p \sum_{i=1}^n (m-x_i) =0[/itex]

[itex]\displaystyle p = \frac{\sum_{i=1}^n x_i}{\sum_{i=1}^n x_i + \sum_{i=1}^n (m-x_i)}[/itex]

The denominator reduces to [itex]nm[/itex] and we have

[itex]\displaystyle p = \frac{\sum_{i=1}^n x_i}{nm} = \frac{\overline{x}}{m}[/itex]

[itex]\displaystyle \hat{p}_n = \frac{\overline{X}}{m}[/itex]

[itex]\hat{p}_n[/itex] is unbiased since, using [itex]\mathbb{E}[X_i] = mp[/itex] for [itex]X_i \sim \text{Bin}(m,p)[/itex], [itex]\displaystyle \mathbb{E}[\hat{p}_n] = \mathbb{E}\left[ \frac{\overline{X}}{m}\right] = \frac{1}{m} \mathbb{E}\left[\:\overline{X}\: \right] = \frac{1}{m} \mathbb{E} \left[ \frac{1}{n} \sum_{i=1}^n X_i \right] = \frac{1}{nm} \sum_{i=1}^n \mathbb{E}[X_i] = \frac{n}{nm} (mp) = p[/itex]

[itex]\therefore \hat{p}_n[/itex] is an unbiased estimator of [itex]p[/itex] .
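
As a quick numerical sanity check of this result, here is a minimal Python/NumPy sketch; the values of n, m, p and the number of simulated datasets below are illustrative choices, not part of the original problem:

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, p = 50, 10, 0.3      # illustrative sample size, known m, and true p
reps = 20000               # number of simulated datasets

# Each row is one i.i.d. sample X_1, ..., X_n ~ Bin(m, p); the MLE is X_bar / m
samples = rng.binomial(m, p, size=(reps, n))
p_hat = samples.mean(axis=1) / m

print("average of p_hat over datasets:", p_hat.mean())  # should sit close to p
print("true p:", p)
```

The average of [itex]\hat{p}_n[/itex] across the simulated datasets lands on top of the true [itex]p[/itex], which is exactly what unbiasedness predicts.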
 
  • #2




Your attempt seems correct. You have correctly derived the MLE for p and shown that it is unbiased. Your explanation is clear and concise. Good job! If you have any further questions, feel free to ask.
 

What is the Maximum Likelihood Estimator (MLE) of Bin(m,p)?

The Maximum Likelihood Estimator (MLE) for a Bin(m,p) sample is the value of the unknown parameter that maximizes the likelihood of the observed data. In the problem above the number of trials m is known, so only the success probability p is estimated, and maximizing the likelihood gives [itex]\hat{p}_n = \overline{X}/m[/itex].

How is the MLE of Bin(m,p) calculated?

With m known, the MLE of p is calculated by taking the derivative of the binomial log-likelihood with respect to p, setting it equal to zero, and solving for p, exactly as in the derivation above. In models without a closed-form solution, the likelihood is maximized numerically instead, as in the sketch below.
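
For instance, one can maximize the log-likelihood in p numerically and compare against the closed form [itex]\overline{x}/m[/itex]. This is only a sketch using SciPy; the value of m and the observed counts below are hypothetical:

```python
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import binom

m = 10                                    # known number of trials per observation
x = np.array([3, 4, 2, 5, 3, 4, 1, 3])    # hypothetical observed counts

def neg_log_lik(p):
    # Negative Bin(m, p) log-likelihood of the i.i.d. sample x
    return -binom.logpmf(x, m, p).sum()

res = minimize_scalar(neg_log_lik, bounds=(1e-6, 1 - 1e-6), method="bounded")
print("numerical MLE of p:", res.x)
print("closed form x_bar/m:", x.mean() / m)
```

The numerical maximizer agrees with [itex]\overline{x}/m[/itex] up to optimizer tolerance, confirming the closed-form answer.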

What are the assumptions of the MLE of Bin(m,p)?

The MLE of Bin(m,p) assumes that the data follow a binomial model: each observation counts the number of successes in m independent trials, every trial has the same success probability p, and the observations X_1, ..., X_n are independent of one another.

What are the advantages of using the MLE of Bin(m,p)?

The MLE of Bin(m,p) is a widely used and accepted method for estimating the success probability of a binomial distribution. The resulting estimator is consistent (it converges to the true p as the sample size increases) and asymptotically efficient (it attains the smallest possible asymptotic variance); in this problem it is also exactly unbiased, as shown above.
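
To illustrate the convergence, here is a minimal sketch; the sample sizes, number of replications, and parameter values are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(1)
m, p = 10, 0.3    # illustrative known m and true p

# The spread of p_hat = X_bar / m shrinks as the sample size n grows
for n in (10, 100, 1000, 10000):
    p_hat = rng.binomial(m, p, size=(2000, n)).mean(axis=1) / m
    print(f"n = {n:6d}   std of p_hat = {p_hat.std():.5f}")
```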

Are there any limitations to using the MLE of Bin(m,p)?

One limitation of the MLE of Bin(m,p) is that it assumes that the data follows a binomial distribution, which may not always be the case in real-world scenarios. Additionally, the method can be sensitive to outliers and may not perform well with small sample sizes.
