
Homework Help: How to show that the Method of Moments estimators for the Normal distribution are consistent

  1. Jul 22, 2013 #1
    So I'm trying to show that the method of moments estimators for the normal distribution are consistent. To show consistency, I have to:

    1) Show ##E(\hat{\theta}) = \theta##

    2) Show ##\lim_{n\to\infty} \mathrm{Var}(\hat{\theta}) = 0##

    Since the normal distribution has two parameters (##\mu##, ##\sigma^2##), I have to prove that each estimator is consistent.

    To prove it for ##\hat{\mu}##:

    [tex]E(\hat{\mu}) = E\left(\frac{1}{n}\sum_{i=1}^n X_i\right)[/tex]

    Working along, I get that ##\hat{\mu}## is unbiased, as is ##\hat{\sigma}^2##, but I get tripped up showing the second condition for both estimators.

    For ##\hat{\mu}##:

    [tex]V(\hat{\mu}) = V\left(\frac{1}{n}\sum_{i=1}^n X_i\right) = \frac{1}{n^2}\sum_{i=1}^n V(X_i)[/tex]

    Similarly for ##V(\hat{\sigma}^2)##:

    How do I proceed for these two estimators from here?
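For intuition about the second condition, here is a quick numerical sketch (the sample sizes and parameter values are made up for illustration, not part of the original post) showing that the variance of the sample mean shrinks toward zero as ##n## grows:

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 5.0, 2.0
reps = 20000

def var_of_sample_mean(n):
    # Draw `reps` independent samples of size n and compute the
    # empirical variance of the resulting sample means.
    samples = rng.normal(mu, sigma, size=(reps, n))
    return samples.mean(axis=1).var()

v10 = var_of_sample_mean(10)      # roughly sigma^2 / 10 = 0.4
v1000 = var_of_sample_mean(1000)  # roughly sigma^2 / 1000 = 0.004
print(v10, v1000)
```

The simulated variances track ##\sigma^2/n##, which is what the algebraic argument should deliver.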
  3. Jul 22, 2013 #2

    Ray Vickson

    Science Advisor
    Homework Helper

    If you are estimating ##\sigma^2## using
    [tex] \text{est}(\sigma^2) = \frac{1}{n} \sum_{i=1}^n (X_i - \bar{X})^2[/tex]
    then your estimate is biased.
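A quick way to see this bias numerically (a sketch with made-up values, not from the thread): the ##1/n## estimator averages to roughly ##\frac{n-1}{n}\sigma^2## rather than ##\sigma^2##:

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma2 = 0.0, 4.0
n, reps = 5, 200000

samples = rng.normal(mu, np.sqrt(sigma2), size=(reps, n))
# np.var with ddof=0 is exactly the 1/n method-of-moments estimator
est = samples.var(axis=1, ddof=0)

mean_est = est.mean()
print(mean_est)  # near (n-1)/n * sigma2 = 3.2, not 4.0
```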
  4. Jul 22, 2013 #3

    Then I established that result incorrectly. I did this:

    For ##\hat{\sigma}^2##:

    [tex]E(\hat{\sigma}^2) = \frac{1}{n}\sum_{i=1}^n E(X_i - \bar{X})^2 = \frac{1}{n}\sum_{i=1}^n \sigma^2 = \frac{1}{n}(n\sigma^2) = \sigma^2[/tex]
  5. Jul 23, 2013 #4


    statdad
    Homework Helper

    Your error is stating that
    ##E\left(X_i - \overline{X}\right)^2 = \sigma^2##
  6. Jul 23, 2013 #5

    Ray Vickson

    Science Advisor
    Homework Helper

    As 'statdad' has explained to you, ##E(X_i - \bar{X})^2 \neq \sigma^2##.

    Take ##i = 1##, for example. We have [tex]X_1 - \bar{X} = \left(1-\frac{1}{n}\right) X_1 - \frac{1}{n} X_2 - \cdots - \frac{1}{n} X_n[/tex]
    You need to square this, expand it out, then take the expectation (and yes, I am perfectly serious!). Try it yourself; it is not as bad as you might think at first. The basic properties you need are ##E X_j^2 = \sigma^2 + \mu^2## and the fact that the different ##X_j## are independent, so that ##E X_i X_j## is easy to get for ##i \neq j##.
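This expansion can be checked with exact arithmetic for a specific ##n## (here ##n = 3##, an illustrative choice): the coefficients of the ##X_j## in ##X_1 - \bar{X}## sum to zero, so the ##\mu^2## terms cancel and only ##\left(\sum_j c_j^2\right)\sigma^2 = \frac{n-1}{n}\sigma^2## survives:

```python
from fractions import Fraction

n = 3  # illustrative sample size
# Coefficients of X_1, X_2, ..., X_n in X_1 - Xbar:
# (1 - 1/n) on X_1 and -1/n on each of the others.
c = [Fraction(1) - Fraction(1, n)] + [-Fraction(1, n)] * (n - 1)

sum_c = sum(c)
sum_c2 = sum(ci * ci for ci in c)

# E(sum_i c_i X_i)^2 = sum_i c_i^2 (sigma^2 + mu^2) + sum_{i != j} c_i c_j mu^2
#                    = sum_c2 * sigma^2 + (sum_c)^2 * mu^2
# Because sum_c == 0, the mu^2 part vanishes entirely.
print(sum_c)   # 0
print(sum_c2)  # (n-1)/n = 2/3
```

So ##E(X_1 - \bar{X})^2 = \frac{n-1}{n}\sigma^2##, which matches the ##\frac{n-1}{n}## bias factor of the ##1/n## estimator noted above.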