# Asymptotically unbiased & consistent estimators

1. Jul 8, 2011

### kingwinner

Theorem: If $\hat{\theta}_n$ is an unbiased estimator for $\theta$ AND $Var(\hat{\theta}_n)\rightarrow 0$ as $n\rightarrow\infty$, then $\hat{\theta}_n$ is a consistent estimator of $\theta$.

The textbook proved this theorem using Chebyshev's Inequality and the Squeeze Theorem, and I understand the proof.
BUT then there is a remark that we can replace "unbiased" by "asymptotically unbiased" in the above theorem and the result will still hold, but the textbook provided no proof. This is where I'm having a lot of trouble: I really don't see how to prove it (i.e. that asymptotically unbiased plus variance $\rightarrow 0$ implies consistent). I tried to modify the original proof, but I can't get it to work under the weaker assumption of asymptotic unbiasedness.
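For reference, the proof in the unbiased case is essentially just Chebyshev's inequality (using $E(\hat{\theta}_n)=\theta$, so the mean squared error is the variance):

$$P(|\hat{\theta}_n-\theta|\geq \varepsilon)\leq \frac{E[(\hat{\theta}_n-\theta)^2]}{\varepsilon^2}=\frac{Var(\hat{\theta}_n)}{\varepsilon^2}\rightarrow 0$$

and squeezing this probability between $0$ and a quantity tending to $0$ gives consistency.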

I'm frustrated and I hope someone can explain how to prove it. Thank you!

Last edited: Jul 8, 2011
2. Jul 8, 2011

### micromass

Hi kingwinner!

Use the triangle inequality $|\hat{\theta}_n-\theta_0|\leq |\hat{\theta}_n-E(\hat{\theta}_n)|+|E(\hat{\theta}_n)-\theta_0|$ and then Chebyshev:

$$P(|\hat{\theta}_n-\theta_0|\geq \varepsilon)\leq P(|\hat{\theta}_n-E(\hat{\theta}_n)|+|E(\hat{\theta}_n)-\theta_0|\geq \varepsilon)= P(|\hat{\theta}_n-E(\hat{\theta}_n)|\geq \varepsilon-|E(\hat{\theta}_n)-\theta_0|)\leq \frac{Var(\hat{\theta}_n)}{(\varepsilon-|E(\hat{\theta}_n)-\theta_0|)^2}\rightarrow 0$$

3. Jul 8, 2011

### kingwinner

Thanks for the help, but Chebyshev's inequality requires $$\varepsilon-|E(\hat{\theta}_n)-\theta_0|$$ $>0$, which is not necessarily true here, is it?

4. Jul 8, 2011

### micromass

It's not necessarily true, but it is true for large n. We know that

$$E(\hat{\theta}_n)\rightarrow \theta_0$$

So there is some $n_0$ such that for all $n\geq n_0$,

$$|E(\hat{\theta}_n)-\theta_0|<\varepsilon$$

and hence for all $n\geq n_0$,

$$\varepsilon-|E(\hat{\theta}_n)-\theta_0|>0$$
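If it helps to see the theorem numerically, here is a quick simulation (my own toy example, not from any textbook): the $1/n$-scaled sample variance $\frac{1}{n}\sum (X_i-\bar{X})^2$ has mean $\sigma^2(n-1)/n$, so it is biased but asymptotically unbiased, and its variance tends to $0$; consistency then predicts that $P(|\hat{\theta}_n-\sigma^2|\geq\varepsilon)$ shrinks as $n$ grows.

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2 = 4.0  # true variance of the N(0, 2^2) samples drawn below

def biased_var(x):
    # 1/n-scaled sample variance: E[.] = sigma2 * (n-1)/n, so biased
    # but asymptotically unbiased, with Var(.) ~ 2*sigma2^2/n -> 0.
    return np.mean((x - x.mean()) ** 2)

eps = 0.5
miss_prob = {}
for n in (10, 100, 10_000):
    # Monte Carlo estimate of P(|theta_hat_n - sigma2| >= eps)
    ests = np.array([biased_var(rng.normal(0.0, 2.0, n)) for _ in range(2000)])
    miss_prob[n] = float(np.mean(np.abs(ests - sigma2) >= eps))
    print(n, miss_prob[n])
```

The miss probability drops toward zero as n grows, exactly as the (asymptotically unbiased) version of the theorem says.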

5. Jul 8, 2011

### kingwinner

Thanks for the help! :) You're a legend...