Maximum Likelihood Estimator + Prior

SUMMARY

The discussion centers on finding the Maximum Likelihood (ML) estimator and Bayesian estimator for the parameter π in a Bernoulli distribution, specifically X ~ B(1, π). The ML estimator is derived as π_ML = n1/n, where n1 is the number of ones in the sample. When the parameter is constrained to 1/2 ≤ π ≤ 1 and n1/n < 1/2, the estimator should be adjusted to π = 1/2, since the likelihood is decreasing on [1/2, 1] in that case. The Bayesian estimator under quadratic loss is also discussed, with emphasis on understanding the behavior of the likelihood function.

PREREQUISITES
  • Understanding of Bernoulli distributions and their parameters
  • Familiarity with Maximum Likelihood Estimation (MLE)
  • Knowledge of Bayesian estimation techniques
  • Basic calculus for optimization (derivatives)
NEXT STEPS
  • Study the properties of Bernoulli distributions and their likelihood functions
  • Learn about constraints in MLE and how they affect estimators
  • Explore Bayesian estimation under different loss functions
  • Investigate graphical methods for visualizing likelihood functions
USEFUL FOR

Statisticians, data scientists, and researchers involved in statistical modeling and estimation techniques, particularly those working with Bernoulli processes and Bayesian methods.

Scootertaj
1. Suppose that X ~ B(1, π). We sample n times and find n1 ones and n2 = n − n1 zeros.
a) What is the ML estimator of π?
b) What is the ML estimator of π given 1/2 ≤ π ≤ 1?
c) What is the probability that π is greater than 1/2?
d) Find the Bayesian estimator of π under quadratic loss with this prior.

2. The attempt at a solution
a) L = \pi^{n_1}(1-\pi)^{n_2}
Set \frac{d(\log L)}{d\pi} = \frac{n_1}{\pi} - \frac{n_2}{1-\pi} = 0 → \pi_{ML} = \frac{n_1}{n}
b) Not sure how to go about it.
c) Not sure.
d) I think I know how.
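As a quick sanity check on (a), here is a minimal simulation sketch: draw n Bernoulli samples and compare n1/n to the true parameter. The sample size and true parameter below are made-up values for illustration, not part of the problem.

```python
import random

random.seed(0)

n = 1000
p_true = 0.7  # assumed true parameter, chosen only for this demo

# Draw n Bernoulli(p_true) observations and count the ones.
sample = [1 if random.random() < p_true else 0 for _ in range(n)]
n1 = sum(sample)

# ML estimator: pi_ML = n1 / n
pi_ml = n1 / n
print(pi_ml)  # should land close to p_true for large n
```

By the law of large numbers, n1/n converges to the true π, which is consistent with the MLE derived above.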
 
Scootertaj said:
1. Suppose that X ~ B(1, π). We sample n times and find n1 ones and n2 = n − n1 zeros.
a) What is the ML estimator of π?
b) What is the ML estimator of π given 1/2 ≤ π ≤ 1?
c) What is the probability that π is greater than 1/2?
d) Find the Bayesian estimator of π under quadratic loss with this prior.

2. The attempt at a solution
a) L = \pi^{n_1}(1-\pi)^{n_2}
Set \frac{d(\log L)}{d\pi} = \frac{n_1}{\pi} - \frac{n_2}{1-\pi} = 0 → \pi_{ML} = \frac{n_1}{n}
b) Not sure how to go about it.
c) Not sure.
d) I think I know how.

In (b) you are asked to maximize L (or log L) subject to 1/2 ≤ π ≤ 1. Your solution to (a) may or may not work in this case. When does it work? When does it fail? If it fails, what must the solution be?

RGV
 
What do you mean by "fail"?
Intuitively, \pi_{ML} = \frac{n_1}{n} would "fail" in the case that \frac{n_1}{n} < 1/2.
But I'm not sure what our solution must be if it fails.
 
Scootertaj said:
What do you mean by "fail"?
Intuitively, \pi_{ML} = \frac{n_1}{n} would "fail" in the case that \frac{n_1}{n} < 1/2.
But I'm not sure what our solution must be if it fails.

"Fail" = does not succeed = is wrong = does not work. When that is the case, something must have happened; what was that? What does that tell you about the behaviour of L(π)? (Hint: draw a hypothetical graph.)

RGV
 
Well, based on the graph of \pi^{n_1}(1-\pi)^{n_2} with several different n1 and n2 values plugged in, the best choice is \pi = n_1/n when 1/2 ≤ n_1/n ≤ 1. Otherwise, since L(π) peaks at n_1/n < 1/2 and is decreasing on [n_1/n, 1], the maximum over the allowed interval [1/2, 1] sits at the boundary, so we choose \pi = 1/2.
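That boundary argument can be checked numerically. Below is a minimal sketch of the clipped estimator for (b), together with a brute-force grid maximization of log L over [1/2, 1] to confirm that the maximum sits at the boundary when n1/n < 1/2. The counts n1 = 30, n2 = 70 and the grid resolution are illustrative assumptions, not values from the problem.

```python
import math

def constrained_mle(n1, n, lower=0.5):
    """MLE of pi restricted to [lower, 1]: clip n1/n at the lower bound."""
    return max(n1 / n, lower)

def grid_argmax_loglik(n1, n2, lower=0.5, steps=10_000):
    """Brute-force maximizer of log L(pi) = n1*log(pi) + n2*log(1-pi) on [lower, 1)."""
    best_pi, best_ll = None, -math.inf
    for i in range(steps):
        pi = lower + (1 - lower) * i / steps  # grid stops just short of pi = 1
        ll = n1 * math.log(pi) + n2 * math.log(1 - pi)
        if ll > best_ll:
            best_pi, best_ll = pi, ll
    return best_pi

# Case where the unconstrained MLE n1/n = 0.3 falls below 1/2:
print(constrained_mle(30, 100))              # 0.5
print(round(grid_argmax_loglik(30, 70), 3))  # 0.5
```

The grid search agrees with the clipped estimator: when n1/n < 1/2 the log-likelihood is decreasing on [1/2, 1], so its constrained maximum is at π = 1/2; when n1/n ≥ 1/2 both methods return n1/n.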
 
