Find MLE of θ: Maximizing Likelihood fxn

SUMMARY

The discussion focuses on finding the maximum likelihood estimator (MLE) of θ for the probability density function f(x|θ) = θx⁻² on 0 < θ ≤ x < ∞. The likelihood function is L(θ|x) = θⁿ ∏xᵢ⁻², which is strictly increasing in θ, so the MLE is found by maximizing it subject to the constraint that θ be at most the minimum of the sample values. The maximum occurs at the boundary, θ̂ = min(x₁, x₂, ..., xₙ); setting the derivative to zero yields no solution satisfying the constraints.

PREREQUISITES
  • Understanding of Maximum Likelihood Estimation (MLE)
  • Familiarity with probability density functions (PDFs)
  • Knowledge of constrained optimization techniques
  • Basic concepts of Fisher Information
NEXT STEPS
  • Study the properties of Maximum Likelihood Estimators in statistical inference
  • Learn about constrained optimization methods in calculus
  • Explore Fisher Information and its applications in parameter estimation
  • Investigate the implications of boundary conditions in optimization problems
USEFUL FOR

Statisticians, data scientists, and students studying statistical inference or optimization techniques who are interested in understanding MLE and its constraints.

dspampi

Homework Statement


Let X₁, X₂, ..., Xₙ be a random sample from the pdf
f(x|θ) = θx⁻² where 0 < θ ≤ x < ∞

Find the MLE of θ.

My attempt:

Likelihood fxn: L(θ|x) = ∏θxᵢ⁻² = θⁿ ∏xᵢ⁻²

To find the MLE, I take the log of that function, take the partial derivative of log L(θ|x) with respect to θ, and set it equal to zero, which gives: n/θ = 0

However, I realize that θ ≤ x and θ > 0... what do I need to do to incorporate this into my likelihood function?
In class we discussed Fisher information, and I have a guess that it is involved in this problem, but I'm not sure why, or how we can use Fisher information here.
 
It looks like you are on the right track.
You are looking to maximize
##L(\theta | x ) = \prod_{i=1}^n \theta X_i^{-2} = \theta^n \prod_{i=1}^n x_i^{-2}##
for a given vector x.
If you note that ##\prod_{i=1}^n x_i^{-2}## is a constant for any fixed vector ##x##, how do you maximize the likelihood?
##\max L(\theta|x) = C\theta^n ##
You should make ##\theta## as large as possible: the derivative of the log-likelihood, ##\frac{n}{\theta}##, is strictly positive for every ##\theta > 0##, so ##\frac{n}{\theta} = 0## has no solution and ##L## is increasing in ##\theta##.
Now you apply the constraints. For a vector ##x = [X_1, X_2, ..., X_n]##, what is the maximum allowable ##\theta##?
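To see this numerically, here is a small sketch (the sample values are made up for illustration) that evaluates ##L(\theta|x) = C\theta^n## on a grid of admissible ##\theta## and confirms the likelihood keeps growing right up to the boundary:

```python
import numpy as np

# Hypothetical sample; the point is that the data-dependent factor
# prod(x_i^-2) is just a positive constant C once x is fixed.
x = np.array([2.0, 3.5, 5.0, 2.4])
n = len(x)
C = np.prod(x ** -2.0)

def likelihood(theta):
    # L(theta | x) = theta^n * C, valid only for 0 < theta <= min(x_i)
    return theta ** n * C

# Grid of admissible theta values up to the constraint boundary min(x).
grid = np.linspace(0.1, x.min(), 200)
values = likelihood(grid)

print(np.all(np.diff(values) > 0))         # L is strictly increasing in theta
print(grid[np.argmax(values)] == x.min())  # so the grid maximum is at the boundary
```

Since the likelihood is monotone in ##\theta##, any interior point is beaten by a slightly larger ##\theta##, which is why the answer must sit on the constraint boundary.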
 
dspampi said:

Homework Statement

Let X₁, X₂, ..., Xₙ be a random sample from the pdf
f(x|θ) = θx⁻² where 0 < θ ≤ x < ∞

Find the MLE of θ.

...

However, I realize that θ ≤ x and θ > 0... what do I need to do to incorporate this into my likelihood function?
In class we discussed Fisher information, and I have a guess that it is involved in this problem, but I'm not sure why, or how we can use Fisher information here.

In constrained optimization it is often wrong to set derivatives to zero, because the results violate the constraints. In your problem, you want to solve
$$\max_{\theta} L(\theta|x) = \theta^n \prod_{i=1}^n x_i^{-2}, \quad \text{subject to} \quad 0 < \theta \leq \min(x_1, x_2, \ldots, x_n)$$
The derivative will definitely not be 0 at the optimal solution.

If ##\underline{x} = \min\, (x_1,x_2, \ldots, x_n)##, what is the solution of the problem of maximizing ##\theta^n## over the region ##0 < \theta \leq \underline{x}##?
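As a sanity check on the boundary solution ##\hat{\theta} = \underline{x}##, one can simulate data and verify that the sample minimum lands just above the true ##\theta##. Since f(x|θ) = θx⁻² on [θ, ∞) is a Pareto distribution with shape 1 and scale θ, its cdf is F(x) = 1 − θ/x, so inverse-transform sampling gives x = θ/(1 − U) with U uniform on (0, 1). A minimal sketch (true θ and sample size chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate from f(x|theta) = theta * x^-2 on [theta, inf) via the
# inverse cdf: F(x) = 1 - theta/x  =>  x = theta / (1 - U).
theta_true = 2.0
u = rng.uniform(size=10_000)
sample = theta_true / (1.0 - u)

# The likelihood theta^n * prod(x_i^-2) is increasing in theta, so the
# MLE is the largest theta with theta <= x_i for all i: the sample minimum.
theta_hat = sample.min()
print(theta_hat)  # slightly above theta_true, as expected
```

Note that ##\hat{\theta} \geq \theta## always holds here, so this estimator is biased upward, though the bias shrinks as n grows.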
 
