Theorem: If θ̂ is an unbiased estimator of θ and Var(θ̂) → 0 as n → ∞, then θ̂ is a consistent estimator of θ.

The textbook proved this theorem using Chebyshev's inequality and the squeeze theorem, and I understand that proof.
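For reference, the textbook's argument for the unbiased case (as I understand it) can be sketched like this:

```latex
% Unbiasedness means E[\hat\theta] = \theta, so for any \varepsilon > 0,
% Chebyshev's inequality gives
0 \;\le\; P\bigl(|\hat\theta - \theta| \ge \varepsilon\bigr)
   \;=\; P\bigl(|\hat\theta - E[\hat\theta]| \ge \varepsilon\bigr)
   \;\le\; \frac{\operatorname{Var}(\hat\theta)}{\varepsilon^2}.
% Since Var(\hat\theta) \to 0 as n \to \infty, the squeeze theorem forces
% P(|\hat\theta - \theta| \ge \varepsilon) \to 0, i.e. \hat\theta \to \theta
% in probability, which is consistency.
```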

BUT then there is a remark that "unbiased" can be replaced by "asymptotically unbiased" in the theorem above and the result still holds, but the textbook provides no proof. This is where I'm having a lot of trouble: I really don't see how to prove that asymptotic unbiasedness together with variance → 0 implies consistency. I tried to modify the original proof, but I can't get it to work under the weaker assumption.
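To convince myself the claimed result is at least plausible, here is a toy simulation (my own example, not from the textbook): the sample variance with a 1/n divisor is biased for σ², since E[S²ₙ] = (n−1)/n · σ², but it is asymptotically unbiased and its variance vanishes, so by the remark it should be consistent. The Monte Carlo tail probability P(|S²ₙ − σ²| ≥ ε) should shrink as n grows:

```python
import numpy as np

# Toy check of the remark: the biased (1/n) sample-variance estimator of
# sigma^2 is asymptotically unbiased with vanishing variance, so the tail
# probability P(|S2_n - sigma^2| >= eps) should shrink toward 0 as n grows.
rng = np.random.default_rng(0)
sigma2 = 4.0   # true variance of a N(0, 2^2) population
eps = 0.1      # fixed tolerance in the consistency definition

tails = []
for n in [10, 100, 10_000]:
    # 5000 independent samples of size n, each giving one estimate S2_n.
    samples = rng.normal(0.0, 2.0, size=(5_000, n))
    s2 = samples.var(axis=1)  # default ddof=0 -> the biased 1/n estimator
    tail = np.mean(np.abs(s2 - sigma2) >= eps)
    tails.append(tail)
    print(f"n = {n:6d}   P(|S2_n - sigma2| >= {eps}) ~ {tail:.3f}")
```

The printed tail probabilities decrease toward 0 with n, which is exactly what consistency requires for this fixed ε.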

I'm frustrated and I hope someone can explain how to prove it. Thank you!
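For later readers: one route that appears to go through is to replace Chebyshev's inequality with Markov's inequality applied to the squared error, since the mean squared error splits into variance plus squared bias (this is my sketch, not the textbook's):

```latex
% Markov's inequality applied to (\hat\theta - \theta)^2:
P\bigl(|\hat\theta - \theta| \ge \varepsilon\bigr)
   \;\le\; \frac{E\bigl[(\hat\theta - \theta)^2\bigr]}{\varepsilon^2}
   \;=\; \frac{\operatorname{Var}(\hat\theta) + \bigl(E[\hat\theta] - \theta\bigr)^2}{\varepsilon^2}.
% Asymptotic unbiasedness gives E[\hat\theta] - \theta \to 0, and
% Var(\hat\theta) \to 0 by hypothesis, so the bound \to 0 and the squeeze
% theorem again yields \hat\theta \to \theta in probability.
```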

**Physics Forums | Science Articles, Homework Help, Discussion**


# Asymptotically unbiased & consistent estimators
