Maximum Likelihood Estimator + Prior


Homework Help Overview

The discussion revolves around finding the maximum likelihood estimator (MLE) and Bayesian estimator for the parameter π in a binomial distribution context, specifically when sampling from a Bernoulli process. Participants explore various aspects of the estimators under different constraints and loss functions.

Discussion Character

  • Mixed

Approaches and Questions Raised

  • Participants discuss the MLE for π based on the likelihood function and question how the constraints affect the solution. There is uncertainty regarding the implications of the MLE when it falls outside the specified range of 1/2 to 1. Some participants express confusion about the necessary adjustments if the MLE does not meet the constraints.

Discussion Status

The discussion is active, with participants attempting to clarify the conditions under which the MLE is valid and exploring the implications of those conditions. There is a focus on understanding the behavior of the likelihood function and how it relates to the estimators.

Contextual Notes

Participants are working under the assumption that the estimators must adhere to the constraints of the problem, specifically regarding the range of π. There is an ongoing exploration of the consequences of the MLE being less than 1/2.

Scootertaj
1. Suppose that X ~ B(1, π). We sample n times and find n1 ones and n2 = n - n1 zeros.
a) What is the ML estimator of π?
b) What is the ML estimator of π given 1/2 ≤ π ≤ 1?
c) What is the probability that π is greater than 1/2?
d) Find the Bayesian estimator of π under quadratic loss with this prior.

2. The attempt at a solution
a) L = \pi^{n_1}(1-\pi)^{n_2}
Set \frac{d(\log L)}{d\pi} = \frac{n_1}{\pi} - \frac{n_2}{1-\pi} = 0 → \pi_{ML} = \frac{n_1}{n}
b) not sure how to go about it
c) not sure
d) I think I know how.
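The closed form in (a) can be checked numerically by maximizing the log-likelihood over a fine grid. This is a minimal sketch with made-up sample counts (n = 20, n1 = 13 are assumptions for illustration, not from the problem):

```python
import numpy as np

# Hypothetical sample: n = 20 Bernoulli trials with n1 = 13 ones.
n, n1 = 20, 13
n2 = n - n1

def log_likelihood(p):
    """Log-likelihood of a Bernoulli(pi) sample with n1 ones and n2 zeros."""
    return n1 * np.log(p) + n2 * np.log(1 - p)

# Evaluate on a fine grid of the open interval (0, 1) and take the argmax.
grid = np.linspace(0.001, 0.999, 100_000)
p_hat = grid[np.argmax(log_likelihood(grid))]

print(p_hat)  # agrees with n1 / n = 0.65 to within the grid spacing
```

The grid maximizer lands on n1/n, matching the calculus result from setting the score to zero.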
 
Scootertaj said:
1. Suppose that X ~ B(1, π). We sample n times and find n1 ones and n2 = n - n1 zeros.
a) What is the ML estimator of π?
b) What is the ML estimator of π given 1/2 ≤ π ≤ 1?
c) What is the probability that π is greater than 1/2?
d) Find the Bayesian estimator of π under quadratic loss with this prior.

2. The attempt at a solution
a) L = \pi^{n_1}(1-\pi)^{n_2}
Set \frac{d(\log L)}{d\pi} = \frac{n_1}{\pi} - \frac{n_2}{1-\pi} = 0 → \pi_{ML} = \frac{n_1}{n}
b) not sure how to go about it
c) not sure
d) I think I know how.

In (b) you are asked to maximize L (or log L) subject to 1/2 ≤ π ≤ 1. Your solution to (a) may or may not work in this case. When does it work? When does it fail? If it fails, what must the solution be?

RGV
 
What do you mean by fail?
Intuitively, \pi_{ML} = \frac{n_1}{n} would "fail" in the case that \frac{n_1}{n} < 1/2.
But I'm not sure what our solution must be if it fails.
 
Scootertaj said:
What do you mean fail?
Intuitively, \pi_{ML}=\frac{n_1}{n} would "fail" in the case that it is \frac{n_1}{n} < 1/2
But, I'm not sure what our solution must be then if it fails.

"Fail" = does not succeed = is wrong = does not work. When that is the case, something must have happened; what was that? What does that tell you about the behaviour of L(π)? (Hint: draw a hypothetical graph.)

RGV
 
Well, based on the graph of \pi^{n_1}(1-\pi)^{n_2} for several different n1 and n2 values, the best choice is \pi = n_1/n when 1/2 ≤ n1/n ≤ 1. Otherwise we choose \pi = 1/2: when n1/n < 1/2, the likelihood is decreasing on [1/2, 1], so its maximum over the constrained interval occurs at the corner point 1/2.
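The conclusion above amounts to clipping the unconstrained MLE into the allowed interval. A minimal sketch (the function name and the example counts are illustrative, not from the thread):

```python
import numpy as np

def constrained_mle(n1, n, lower=0.5, upper=1.0):
    """MLE of pi subject to lower <= pi <= upper.

    The unconstrained MLE is n1/n. Since the log-likelihood is unimodal
    with its peak at n1/n, the constrained maximizer is n1/n when it lies
    inside [lower, upper], and otherwise the nearest endpoint -- i.e.,
    n1/n clipped into the interval.
    """
    return float(np.clip(n1 / n, lower, upper))

print(constrained_mle(13, 20))  # 0.65 -- interior case, same as n1/n
print(constrained_mle(6, 20))   # 0.5  -- unconstrained MLE 0.3 < 1/2
```

This matches the graphical argument: whenever n1/n falls below 1/2, the likelihood is decreasing across the whole constrained interval and the endpoint 1/2 wins.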
 
