Find the MLE of θ: Maximizing the Likelihood Function

To find the maximum likelihood estimator (MLE) of θ for the given probability density function, the likelihood function is L(θ|x) = θ^n ∏x_i^(-2), which must be maximized subject to 0 < θ ≤ min(x_1, x_2, ..., x_n). Setting the derivative of the log-likelihood to zero yields n/θ = 0, which has no solution, because the maximum lies on the boundary of the constraint region rather than at an interior critical point. Since the likelihood is strictly increasing in θ, the maximum occurs at θ̂ = min(x_1, x_2, ..., x_n). Fisher information is not needed here; handling the constraint correctly is the key step.
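As a quick numerical illustration of the result above (a sketch only; the true θ and the sampling scheme are made up for the demonstration), one can simulate from this pdf by inverse-CDF sampling and check that the sample minimum sits just above the true θ:

```python
import numpy as np

# Hypothetical setup: true theta and sample size chosen for illustration
rng = np.random.default_rng(0)
theta = 2.0
n = 1000

# Inverse-CDF sampling: F(x) = 1 - theta/x for x >= theta, so X = theta/U
u = rng.uniform(size=n)
x = theta / u

# The MLE is the sample minimum; by construction it can never fall below theta
theta_hat = x.min()
print(theta_hat)
```

Because every observation satisfies x_i ≥ θ, the estimator min(x_i) overshoots θ slightly, and the overshoot shrinks as n grows.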
dspampi

Homework Statement


Let ##X_1, X_2, \ldots, X_n## be a random sample from the pdf
##f(x|\theta) = \theta x^{-2}##, where ##0 < \theta \leq x < \infty##.

Find the MLE of ##\theta##.

My attempt:

Likelihood function: ##L(\theta|x) = \prod_{i=1}^n \theta x_i^{-2} = \theta^n \prod_{i=1}^n x_i^{-2}##

To find the MLE, I take the log of that function, take the partial derivative of ##\log L(\theta|x)## with respect to ##\theta##, and set it equal to zero, which gives ##n/\theta = 0##.

However, I realize that ##\theta \leq x## and ##\theta > 0##. What do I need to do to incorporate these constraints into my likelihood function?
In class we discussed Fisher information, and I have a guess that it is involved in this problem, but I'm not sure why, or how we could use Fisher information here.
 
It looks like you are on the right track.
You are looking to maximize
##L(\theta | x ) = \prod_{i=1}^n \theta x_i^{-2} = \theta^n \prod_{i=1}^n x_i^{-2}##
for a given vector x.
If you note that ##\prod_{i=1}^n x_i^{-2}## is a constant for any fixed vector X, how do you maximize the likelihood?
##\max L(\theta|x) = C\theta^n ##
You should take ##\theta## as large as possible: the derivative ##\frac{n}{\theta}## you found is positive for every ##\theta > 0##, so the likelihood is strictly increasing in ##\theta## and the equation ##\frac{n}{\theta} = 0## has no solution.
Now you apply the constraints. For a vector ##x = [X_1, X_2, ..., X_n]##, what is the maximum allowable ##\theta##?
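To see this hint concretely (a sketch with made-up sample values, not part of the original problem), one can evaluate the log-likelihood over the feasible region and confirm it rises monotonically, so the maximizer is the right endpoint ##\min(x_i)##:

```python
import numpy as np

# Hypothetical sample, chosen only to illustrate the shape of L
x = np.array([3.1, 4.7, 2.5, 6.0])
thetas = np.linspace(0.1, x.min(), 50)

# log L(theta|x) = n*log(theta) - 2*sum(log(x_i))
loglik = len(x) * np.log(thetas) - 2 * np.log(x).sum()

# The log-likelihood increases everywhere on (0, min(x_i)],
# so the argmax is the boundary point min(x_i)
print(thetas[np.argmax(loglik)])
```

The second term is constant in ##\theta##, so only ##n \log \theta## matters, and it grows without a critical point on the feasible interval.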
 
dspampi said:

In constrained optimization it is often wrong to set derivatives to zero, because the results violate the constraints. In your problem, you want to solve
$$\max_{\theta} L(\theta|x) = \theta^n \prod_{i=1}^n x_i^{-2}, \quad \text{subject to} \quad 0 < \theta \leq \min(x_1, x_2, \ldots, x_n)$$
The derivative will not equal zero at the optimal solution.

If ##\underline{x} = \min\, (x_1,x_2, \ldots, x_n)##, what is the solution of the problem of maximizing ##\theta^n## over the region ##0 < \theta \leq \underline{x}##?
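For completeness, the boundary argument can be written out (this only fills in the final step the hints point to). Since
$$\frac{d}{d\theta} \log L(\theta|x) = \frac{n}{\theta} > 0 \quad \text{for all } \theta \in (0, \underline{x}],$$
the log-likelihood is strictly increasing on the feasible interval, so the maximum is attained at the right endpoint:
$$\hat{\theta}_{\text{MLE}} = \min(x_1, x_2, \ldots, x_n) = X_{(1)}.$$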
 
