
Maximum Likelihood and Fisher Information

  1. Feb 5, 2016 #1
    1. The problem statement, all variables and given/known data
    Let X1, X2,....Xn be a random sample from pdf,
f(x|θ) = θx⁻², where 0 < θ ≤ x < ∞

    Find the MLE of θ

    My attempt:

Likelihood fxn: L(θ|x) = ∏ᵢ θxᵢ⁻² = θⁿ ∏ᵢ xᵢ⁻²

And to find the MLE, I take the log of that function, take the partial derivative of log L(θ|x) with respect to θ, and set it equal to zero. This gives n/θ = 0, which has no solution.

However, I realize that the constraint 0 < θ ≤ x never entered my likelihood function. What do I need to do to incorporate it?
In class we discussed Fisher information, and I have a guess that it is involved in this problem somehow, but I'm not sure why, or what Fisher information could be used for here.
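As a quick numerical sanity check of the derivative computed above (a sketch only; the sample values are made up for illustration), the log-likelihood ℓ(θ) = n log θ − 2 Σ log xᵢ can be evaluated on a grid of admissible θ values. Since ∂ℓ/∂θ = n/θ > 0, it should increase all the way up to θ = min xᵢ:

```python
import math

def log_likelihood(theta, xs):
    # l(theta) = n*log(theta) - 2*sum(log x_i), valid only for 0 < theta <= min(xs)
    return len(xs) * math.log(theta) - 2 * sum(math.log(x) for x in xs)

xs = [3.1, 2.4, 5.0, 2.2, 7.6]          # hypothetical sample, every x_i >= theta
thetas = [0.5, 1.0, 1.5, 2.0, min(xs)]  # grid inside the admissible range (0, min x_i]
vals = [log_likelihood(t, xs) for t in thetas]

# derivative n/theta is positive, so the values should be strictly increasing
assert all(vals[i] < vals[i + 1] for i in range(len(vals) - 1))
```

The monotonicity is the key observation: setting the derivative to zero fails precisely because the function has no interior stationary point on the allowed interval.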
  2. jcsd
  3. Feb 7, 2016 #2



    Staff: Mentor

Those two bounds are hard constraints: θ cannot be zero or negative, and it cannot be larger than the smallest observed x. When setting the derivative to zero yields no admissible solution, the likelihood is monotone on the allowed interval, so the maximum must lie at one of the two bounds (here the derivative n/θ is positive, so the likelihood increases in θ).
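The boundary argument can be checked with a small simulation (a sketch, not part of the thread: the true θ, the sample size, and the inverse-CDF sampler are illustrative assumptions). For this pdf the CDF is F(x) = 1 − θ/x on x ≥ θ, so X = θ/(1 − U) with U uniform on [0, 1) draws from it, and the boundary maximizer is θ̂ = min xᵢ:

```python
import random

random.seed(0)
theta_true = 2.0
n = 1000

# inverse-CDF sampling: F(x) = 1 - theta/x  =>  X = theta / (1 - U), U ~ Uniform[0, 1)
sample = [theta_true / (1.0 - random.random()) for _ in range(n)]

# the likelihood theta^n * prod(x_i^-2) increases in theta on (0, min x_i],
# so the MLE sits at the upper boundary:
theta_hat = min(sample)

# the estimate can never undershoot theta (every x_i >= theta by the support
# constraint) and should land just above it for large n
assert theta_true <= theta_hat < 2.1
```

Note that θ̂ = min xᵢ is biased upward by construction, which is typical of boundary MLEs for support parameters.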