
Stat Theory: Need to Prove Consistent Estimator

  1. Feb 28, 2013 #1
    So I am struggling with this homework problem because I got burned out on another problem earlier today, and I just cannot get beyond what I have.

    The problem is:

    Let X be a continuous random variable with the pdf: f(x)=e^(-(x-θ)) , x > θ ,
    and suppose we have a sample of size n , { X1, X2 , … , Xn }.
    Is T = Min ( X1, X2 , … , Xn ) a consistent estimator for θ ?

    2. Relevant equations

    From my class, I know that the cdf of T is F_T(t) = 1 - e^(-n(t-θ)) and its pdf is f_T(t) = n e^(-n(t-θ)) for t > θ.
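
    For reference, a quick sketch of where these come from, assuming the sample is i.i.d.:

    [tex] F_T(t) = P\big(\min_i X_i \le t\big) = 1 - P(X_1 > t)^n = 1 - \left(e^{-(t-\theta)}\right)^n = 1 - e^{-n(t-\theta)}, \qquad t > \theta, [/tex]

    and differentiating gives f_T(t) = n e^(-n(t-θ)).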



    3. The attempt at a solution

    Now, to show that an estimator is consistent, I need to show that T is unbiased (E(T) = θ) and that Var(T) goes to 0 as n -> infinity.
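
    Writing that criterion out (it is sufficient, though as a later reply notes it is not necessary): by Markov's inequality applied to (T - θ)^2, for any ε > 0,

    [tex] P\big(|T - \theta| \ge \epsilon\big) \le \frac{E\big[(T-\theta)^2\big]}{\epsilon^2} = \frac{\operatorname{Var}(T) + \big[E(T) - \theta\big]^2}{\epsilon^2}, [/tex]

    so if both the bias and the variance go to 0 as n -> infinity, then T is consistent.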

    What I am currently stuck on is finding my E(T), as silly as that sounds.

    I know my integral needs to be:

    ∫(from 0 to theta) t*ne^(-n(t-θ)) dt

    So dusting off my integration by parts, I get my integral to be:
    n∫(from 0 to theta) t*e^(-n(t-θ)) dt

    [-t*e^(-n(t-θ))|(from 0 to theta)-(1/n)e^(-n(t-θ))|(from 0 to theta)]

    [-θ*e^(-n(θ-θ))+0-e^(-n(θ-θ))+e^(nθ)]
    [-θ-1/n+(1/n)e^(nθ)]

    I am pretty much stuck on how to get an unbiased estimator out of that.
    Thus I can only assume I did something wrong somewhere, and I need help.

    Could someone please take a look at this and let me know where I am going wrong with this?

    Thanks!
     
    Last edited: Feb 28, 2013
  3. Mar 1, 2013 #2

    Ray Vickson

    Science Advisor
    Homework Helper

    Easiest way:
    [tex] \int_{\theta}^{\infty} t f(t-\theta) \; dt = \int_{\theta}^{\infty} (t - \theta + \theta) f(t-\theta) \; dt\\
    = \theta \int_{\theta}^{\infty} f(t-\theta)\; dt + \int_{\theta}^{\infty} (t-\theta)f(t-\theta) \; dt\\
    = \theta + \int_0^{\infty} s f(s) \; ds.[/tex]
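
    Carrying this through with f(s) = n e^(-ns), s > 0 (a sketch, using the density given in the original post), the two pieces should evaluate to

    [tex] \int_{\theta}^{\infty} f(t-\theta)\, dt = 1, \qquad \int_0^{\infty} s\, n e^{-ns}\, ds = \frac{1}{n}, \qquad \text{so} \qquad E(T) = \theta + \frac{1}{n}, [/tex]

    i.e. T is biased by 1/n, and the bias vanishes as n -> infinity.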
     
  4. Mar 1, 2013 #3
    Hi Ray,
    I am not understanding how you went from ∫ t f(t-θ) dt to what you have presented above.
    Could you provide further details on the steps you listed?
    Also, should I have made my integral go from θ to infinity instead of from 0 to θ?

    Thanks!
     
  5. Mar 1, 2013 #4

    Ray Vickson

    Science Advisor
    Homework Helper

    Sorry, I cannot do more. I gave the steps in detail, one-by-one. And no: the final integral goes from 0 to ∞ because we have changed variables from t to s = t-θ. That was the whole point: we reduce the problem to a standard form that is already familiar (or should be).
     
  6. Mar 1, 2013 #5
    Hi Ray,

    What I am not understanding is how you wrote t - theta + theta and then, in the next step, pulled the theta outside of the integral.
    I just need to understand the thinking behind it because it does not yet make sense to me.

    Thanks!
     
  7. Mar 1, 2013 #6

    statdad

    Homework Helper

    "Now, to show that an estimator is consistent, I need to show that my E(T) is unbiased and my Var(T) as n->infi"

    A consistent estimator is merely one that converges in probability to the parameter it estimates - it doesn't have to be unbiased. Thus you need to show that |T - theta| converges to zero in probability (T is your estimator).
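
    Sketching that route with the cdf from the original post (note that T > θ with probability 1, so |T - θ| = T - θ): for any ε > 0,

    [tex] P\big(|T - \theta| > \epsilon\big) = P(T > \theta + \epsilon) = e^{-n\epsilon} \to 0 \quad \text{as } n \to \infty, [/tex]

    which is exactly convergence in probability of T to θ.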
     