# How to show that the Method of Moments Estimators for the Normal Distribution are Consistent

1. Jul 22, 2013

### trap101

I'm trying to show that the method-of-moments estimators for the normal distribution are consistent. To show consistency, I have to:

1) Show $E(\hat{\theta}) = \theta$

2) Show $\lim_{n \to \infty} V(\hat{\theta}) = 0$
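(As a quick aside on why these two conditions suffice: they force the mean-squared error to zero, and convergence in mean square implies convergence in probability. A one-line sketch:

```latex
E(\hat{\theta} - \theta)^2 = V(\hat{\theta}) + \bigl[E(\hat{\theta}) - \theta\bigr]^2 \to 0,
\qquad
P\bigl(|\hat{\theta} - \theta| > \varepsilon\bigr)
  \le \frac{E(\hat{\theta} - \theta)^2}{\varepsilon^2} \to 0,
```

the second step being Chebyshev's inequality.)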

Since the normal distribution has two parameters ($\mu$, $\sigma^2$), I have to prove that each estimator is consistent.

To prove it for $\hat{\mu}$:

$$E(\hat{\mu}) = E\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right)$$

Working through it, I get that $\hat{\mu}$ is unbiased, as is $\hat{\sigma}^2$, but showing the second condition trips me up for both estimators.

For $\hat{\mu}$:

$$V(\hat{\mu}) = V\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right) = \frac{1}{n^2}\sum_{i=1}^{n} V(X_i)$$
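(Sketch of how this sum evaluates, assuming i.i.d. sampling so that each $V(X_i) = \sigma^2$:

```latex
V(\hat{\mu}) = \frac{1}{n^2}\sum_{i=1}^{n} V(X_i)
             = \frac{1}{n^2}\, n\sigma^2
             = \frac{\sigma^2}{n} \longrightarrow 0
\quad (n \to \infty),
```

so the variance condition holds for $\hat{\mu}$.)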

Similarly for $V(\hat{\sigma}^2)$:

$$\frac{1}{n^2}\sum_{i=1}^{n} V\!\left((X_i - \bar{X})^2\right)$$

How do I proceed for these two estimators from here?

2. Jul 22, 2013

### Ray Vickson

If you are estimating $\sigma^2$ using
$$\text{est}(\sigma^2) = \frac{1}{n} \sum_{i=1}^n (X_i - \bar{X})^2,$$
then that estimator is not unbiased.

3. Jul 22, 2013

### trap101

Then I must have established that result incorrectly. Here is what I did:

For $E(\hat{\sigma}^2)$:

$$E(\hat{\sigma}^2) = \frac{1}{n}\sum_{i=1}^{n} E\left[(X_i - \bar{X})^2\right] = \frac{1}{n}\sum_{i=1}^{n} \sigma^2 = \frac{1}{n}\,(n\sigma^2) = \sigma^2$$

4. Jul 23, 2013

### statdad

Note that
$$E\left(X_i - \overline{X}\right)^2 \neq \sigma^2$$

5. Jul 23, 2013

### Ray Vickson

As 'statdad' has explained to you, $E(X_i - \bar{X})^2 \neq \sigma^2$.

Take $i = 1$, for example. We have $$X_1 - \bar{X} = \left(1-\frac{1}{n}\right) X_1 - \frac{1}{n} X_2 - \cdots - \frac{1}{n} X_n$$
You need to square this, expand it out, then take the expectation; and yes, I am perfectly serious! Try it yourself; it is not as bad as you might think at first. The basic properties you need are $E X_j^2 = \sigma^2 + \mu^2$ and the fact that the different $X_j$ are independent, so that $E X_i X_j$ is easy to get for $i \neq j$.
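Carrying out this expansion gives $E(X_i - \bar{X})^2 = \frac{n-1}{n}\sigma^2$, so the $\frac{1}{n}$ estimator has expectation $\frac{n-1}{n}\sigma^2$: biased, but asymptotically unbiased, and still consistent. As a sanity check, here is a short Monte Carlo sketch in NumPy (the sample size and parameter values below are arbitrary illustrative choices):

```python
import numpy as np

# Monte Carlo check that E[(X_i - Xbar)^2] = (n-1)/n * sigma^2
# (NOT sigma^2) for an i.i.d. normal sample of size n.
rng = np.random.default_rng(0)
n, trials = 5, 200_000
mu, sigma2 = 2.0, 3.0           # arbitrary illustrative parameters

# trials independent samples of size n from N(mu, sigma2)
x = rng.normal(mu, np.sqrt(sigma2), size=(trials, n))
xbar = x.mean(axis=1, keepdims=True)

# average of (X_1 - Xbar)^2 over many replications
mc = ((x[:, [0]] - xbar) ** 2).mean()
theory = (n - 1) / n * sigma2   # = 2.4 here, versus sigma2 = 3.0

print(mc, theory)
```

With these numbers the simulated value lands near $2.4$, visibly below $\sigma^2 = 3.0$, which matches the algebra above.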