An unbiased estimator is a sample function:
<br />
Z_n = f(X_1, \ldots, X_n)<br />
such that, for an
i.i.d. sample from a distribution with parameter
θ that we are trying to estimate, it satisfies:
<br />
\mathrm{E}\left[Z_n \right] = \theta<br />
If this does not hold for finite
n, but holds as n \rightarrow \infty, then we say that the estimator is
asymptotically unbiased.
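A standard example of an asymptotically unbiased estimator is the divide-by-n sample variance, whose expectation is (n-1)/n times the true variance. The sketch below (assuming NumPy; the helper name biased_var and the choice of normal samples are ours) approximates \mathrm{E}\left[Z_n\right] by Monte Carlo for a few sample sizes:

```python
import numpy as np

# Monte Carlo check that the divide-by-n variance estimator is biased for
# finite n but asymptotically unbiased. The name `biased_var` is ours.
rng = np.random.default_rng(0)
true_var = 4.0  # variance of N(0, 2^2) draws

def biased_var(x):
    # Z_n = (1/n) * sum_k (X_k - mean)^2; E[Z_n] = (n-1)/n * sigma^2
    return np.mean((x - np.mean(x)) ** 2)

expected = {}
for n in (2, 10, 100):
    # average the estimator over many independent samples to approximate E[Z_n]
    expected[n] = np.mean(
        [biased_var(rng.normal(0.0, 2.0, n)) for _ in range(20000)]
    )
    print(n, expected[n], (n - 1) / n * true_var)
```

For n = 2 the expectation is only half the true variance, while for n = 100 the gap is already within one percent, illustrating the convergence as n \rightarrow \infty.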
In general, if the function
f is non-polynomial, it is very hard to check the bias of the estimator. If, on the other hand, the estimator is a (symmetric) polynomial of degree
p (involving the
pth moment), we may use the linearity of the expectation value. For example, the sample mean:
<br />
\bar{X}_n \equiv \frac{1}{n} \, \sum_{k = 1}^{n}{X_k}<br />
has the property:
<br />
\mathrm{E} \left[\bar{X}_n \right] = \frac{1}{n} \, \sum_{k = 1}^{n}{\mathrm{E} \left[ X_k \right]} = \mathrm{E} \left[ X \right]<br />
and is therefore an unbiased estimator of the mathematical expectation of the random variable
X.