Maximum Likelihood and Fisher Information

SUMMARY

The discussion focuses on finding the Maximum Likelihood Estimator (MLE) for the parameter θ in the probability density function f(x|θ) = θx^(-2), where 0 < θ ≤ x < ∞. The likelihood function is L(θ|x) = ∏ θx_i^(-2), and the participant attempts to derive the MLE by taking the logarithm and differentiating. However, they encounter constraints on θ, specifically that it must be greater than 0 and no larger than the smallest observed value of x. The role of Fisher Information is also mentioned, indicating its relevance in determining bounds for the MLE.

PREREQUISITES
  • Understanding of Maximum Likelihood Estimation (MLE)
  • Familiarity with Fisher Information concepts
  • Knowledge of probability density functions (pdf)
  • Basic calculus for differentiation and logarithmic functions
NEXT STEPS
  • Study the derivation of MLE for constrained parameters
  • Learn how to apply Fisher Information in parameter estimation
  • Explore examples of likelihood functions in statistical modeling
  • Investigate the implications of boundary conditions on MLE results
USEFUL FOR

Statisticians, data scientists, and students studying statistical inference who are interested in parameter estimation techniques and the application of Fisher Information in MLE problems.

dspampi

Homework Statement


Let X1, X2, ..., Xn be a random sample from the pdf
f(x|θ) = θx^(-2), where 0 < θ ≤ x < ∞.

Find the MLE of θ.

My attempt:

Likelihood function: L(θ|x) = ∏ θx_i^(-2) = θ^n ∏ x_i^(-2)

To find the MLE, I take the log of that function, take the partial derivative of log L(θ|x) with respect to θ, and set it equal to zero. This gives n/θ = 0, which has no solution.

However, I realize that θ ≤ x and θ > 0. What do I need to do to incorporate these constraints into my likelihood function?
In class we discussed Fisher Information, and I suspect it is involved in this problem somehow, but I'm not sure why, or how we could use Fisher Information here.
 
Those two bounds are hard limits: θ cannot be zero or negative, and it cannot be larger than the smallest observed x. Since setting the derivative to zero gives no interior solution, the likelihood is monotone in θ, and its maximum must occur at one of the two bounds.
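To make the boundary argument concrete, here is a small numerical sketch (the sampling setup and variable names are illustrative, not from the thread). Since log L(θ|x) = n·log θ − 2·Σ log x_i is strictly increasing in θ on (0, min x_i], the maximum sits at the upper bound, so the MLE is the sample minimum:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: draw a sample from f(x|theta) = theta * x^(-2), x >= theta.
# The CDF is F(x) = 1 - theta/x, so inverse-CDF sampling gives X = theta / (1 - U).
theta_true = 2.0
n = 50
u = rng.uniform(size=n)
x = theta_true / (1.0 - u)

def log_likelihood(theta, x):
    """log L(theta|x) = n*log(theta) - 2*sum(log x_i), zero outside 0 < theta <= min(x)."""
    if theta <= 0 or theta > x.min():
        return -np.inf  # a data point below theta has density zero
    return len(x) * np.log(theta) - 2.0 * np.log(x).sum()

# log L is strictly increasing in theta on (0, min(x)], so the maximum
# is attained at the boundary: the MLE is the sample minimum.
mle = x.min()

# spot-check monotonicity: every smaller theta gives a smaller log-likelihood
grid = np.linspace(0.1, mle, 100)
assert all(log_likelihood(t, x) <= log_likelihood(mle, x) for t in grid)
print("MLE =", mle)
```

Note that because the support of x depends on θ, the usual regularity conditions behind the Fisher Information / Cramér-Rao machinery do not hold here, which is why the calculus approach alone fails.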
 
