
Statistics: Method of Moments/Maximum Likelihood Estimation

  1. Jul 18, 2009 #1
    1. The problem statement, all variables and given/known data
    f(x;theta)=Exp(-x+theta)
    Find parameter estimates for variable 'theta' using maximum likelihood Estimator and Method of Moments.

    2. Relevant equations
    log f(x; theta) = log(Exp(-x + theta)) -- for MLE
    Integral from theta to infinity of x*Exp(-x + theta) dx = xbar -- for Method of Moments

    3. The attempt at a solution
    I evaluate the log-likelihood function to get Log(L) = -x + theta and then take the derivative of the log-likelihood function with respect to theta. The problem arises when I take this derivative and set it equal to zero, since it gives me 1 = 0 with none of my parameters left in the equation.

    In performing the method of moments analysis I get that the estimate for theta is equal to xbar + 1. I don't know if this is correct, but if someone could help me see what I'm doing wrong with either of these parts it would be most appreciated.
     
  3. Jul 20, 2009 #2
    Welcome to PF.

    Is there a restriction on x and theta, such as 0<theta<x?

    If the observations are x_1, ... , x_n, then the likelihood function is L(x_1,...,x_n;theta)=product of f(x_1;theta), ... f(x_n;theta).

    Now to maximize L(theta), the usual way is to consider K(theta)=log L(theta) and take a derivative, etc. However, in this problem, you have to maximize L a different way. Think about the graph of L(theta) or K(theta). Use the conditions 0<theta<x_i and think about where the function would be maximized.

    Do you want to show your work on this?
     
  4. Jul 20, 2009 #3
    So for the MLE, the condition is that x >= theta. Visualizing what this graph would look like: if x = theta, then the function equals 1; if theta = x - 1, you would get e^-1, which is less than the value at x = theta, and this continues. So would it be correct to say that L is maximized when theta equals x?

    For the Method of Moments, I use the integral from theta to infinity of x*Exp(-x+theta). Upon evaluating the integral I get theta + 1. I set this equal to the first moment, which I am assuming would just be the sample average (xbar). From this I solved for theta to get theta = xbar + 1. I am not sure if this is correct, though, since different distributions have different first moments.
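    As a side note, the integral above can be sanity-checked numerically; here is a quick Python sketch (theta = 2 is an arbitrary test value, and the Riemann sum and truncation point are ad hoc choices, not part of the problem):

    ```python
    import math

    theta = 2.0  # arbitrary test value for the check
    dx = 1e-4    # step size for a simple left Riemann sum

    # Approximate E[X] = integral from theta to infinity of x * e^{-(x - theta)} dx.
    # The tail beyond theta + 50 is negligible (e^{-50}), so truncate there.
    EX = sum(x * math.exp(-(x - theta)) * dx
             for x in (theta + k * dx for k in range(int(50 / dx))))

    print(EX)  # should be close to theta + 1 = 3.0
    ```
    
    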
     
  5. Jul 20, 2009 #4
    MOM is almost correct. Starting with [tex]E[X]=\theta + 1[/tex], you then solved for theta incorrectly (minus sign error). Then, as you said, you replace [tex]E[X][/tex] with [tex]\bar x[/tex] to obtain the estimator of [tex]\theta[/tex].
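    The corrected estimator theta = xbar - 1 can be checked by simulation; a minimal Python sketch, assuming X = theta + E with E ~ Exp(1) (the theta_true and n values are arbitrary):

    ```python
    import random

    random.seed(0)
    theta_true = 2.5   # arbitrary "unknown" parameter for the check
    n = 100_000        # sample size, chosen large so xbar is stable

    # Draw from the shifted exponential: X = theta + E, E ~ Exp(1)
    xs = [theta_true + random.expovariate(1.0) for _ in range(n)]

    xbar = sum(xs) / n
    theta_mom = xbar - 1  # method-of-moments estimate, from E[X] = theta + 1
    print(theta_mom)      # should be close to theta_true
    ```
    
    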

    For MLE, you are beginning to think in the right spirit. However, is there a reason you are only using one x? You should imagine n observations being given, [tex] x_1, x_2, \dots, x_n[/tex] and you will estimate [tex]\theta[/tex] in terms of those.
     
  6. Jul 20, 2009 #5
    Fixing the MOM, I now have theta = xbar - 1; thank you for catching my error.

    I am still a little confused on the MLE. Do you mean that x should be evaluated as (x_1 + ... + x_n)/n (xbar)? This gives theta = xbar. Is it a problem that the MLE and MOM give different estimates?
     
  7. Jul 20, 2009 #6
    Sorry, I don't understand those comments at all.

    The likelihood function is [tex]L(\theta)=f(x_1;\theta)f(x_2;\theta)\dots f(x_n;\theta)=\Pi_{i=1}^n f(x_i;\theta) [/tex]

    You have [tex]f(x;\theta)=e^{-x+\theta}[/tex] for [tex]x>\theta[/tex] so substitute [tex]f(x_i;\theta)=e^{-x_i+\theta}[/tex] (for [tex]x_i>\theta[/tex]) into the formula for L.

    Simplify L. By "studying" L (no derivatives), you have to decide what theta would maximize L.

    Not a problem.
     
  8. Jul 20, 2009 #7
    Performing the operations suggested in the last post and simplifying, I got L = Exp(theta - n*xbar) by using properties of products and exponents (i.e., the product of Exp(-x_i) from i = 1 to n is the same as Exp(-sum of x_i from i = 1 to n)). In order to maximize this function, theta = n*xbar, or am I missing something with the constraints? (n = number of observations.)
     
  9. Jul 20, 2009 #8
    First note [tex]e^{-x_1+\theta}e^{-x_2+\theta}=e^{-x_1-x_2}e^{2\theta}[/tex] which suggests you partially simplified correctly (n*xbar is good), but partially not (theta looks wrong).

    Second, you are right that you must look at the constraints. You can't take theta=n*xbar, because that might violate theta<=x_i.

    You are getting close. Remember, theta <= every x_i.
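    The hints above point toward taking theta-hat as the smallest observation (the largest theta allowed by theta <= every x_i, since L is increasing in theta). A quick simulation sketch, assuming that conclusion, comparing it with the MOM estimate (theta_true and n are arbitrary choices):

    ```python
    import random

    random.seed(1)
    theta_true = 2.5  # arbitrary "unknown" parameter for the check
    n = 1000
    xs = [theta_true + random.expovariate(1.0) for _ in range(n)]

    # L(theta) = exp(n*theta - sum(x_i)) increases in theta, so it is maximized
    # at the largest feasible value: theta-hat = min(x_i).
    theta_mle = min(xs)

    theta_mom = sum(xs) / n - 1  # from E[X] = theta + 1

    print(theta_mle, theta_mom)  # both should be close to theta_true
    ```

    Note that theta_mle can never undershoot theta_true (every x_i >= theta), while theta_mom can land on either side of it.
    
    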
     