# Weights of a linear estimator

## Main Question or Discussion Point

Given

$$Y_i = u + e_i, \qquad i = 1, 2, \ldots, N,$$

where the $$e_i$$ are statistically independent, $$u$$ is a parameter, and each $$e_i$$ has

$$E(e_i) = 0, \qquad \mathrm{Var}(e_i) = \sigma_i^2.$$

Find weights $$w_i$$ such that the linear estimator

$$\hat\mu = \sum_{i=1}^{N} w_i Y_i$$

has

$$E(\hat\mu) = u$$

and $$E\left[(u - \hat\mu)^2\right]$$ is a minimum.

## Answers and Replies

Homework Helper
There may be other ways to do this, but one of the most direct is outlined below.
Write

$$\hat\mu = \sum_{i=1}^{N} w_i Y_i$$

(using $$\mu$$ for the parameter called $$u$$ in the question). Since $$E(Y_i) = \mu$$, you have $$E(\hat\mu) = \sum_i w_i E(Y_i) = \mu \sum_i w_i$$, so in order for $$E(\hat\mu) = \mu$$ to hold you must have

$$\sum_{i=1}^{N} w_i = 1$$

Next, note that $$E\left[(\hat\mu - \mu)^2\right]$$ is simply the $$\textbf{variance}$$ of your estimate (since your estimate has expectation $$\mu$$).

Since the $$Y_i$$ are independent, the variance of $$\hat\mu$$ is

$$\mathrm{Var}(\hat\mu) = \sum_{i=1}^{N} w_i^2 \,\mathrm{Var}(Y_i) = \sum_{i=1}^{N} w_i^2 \sigma_i^2$$

(note that the question allows a different $$\sigma_i^2$$ for each observation, so the variances cannot be pulled out as a common factor).
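The mean and variance formulas above are easy to check numerically. A minimal simulation sketch (the specific $$\sigma_i$$ and $$w_i$$ values below are just assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

mu = 5.0                              # true parameter u
sigma = np.array([1.0, 2.0, 0.5])     # per-observation std devs (assumed)
w = np.array([0.2, 0.3, 0.5])         # any weights summing to 1

# Simulate many replications of Y_i = u + e_i with independent e_i
n_rep = 200_000
Y = mu + rng.normal(0.0, sigma, size=(n_rep, sigma.size))
mu_hat = Y @ w                        # linear estimator, one value per replication

print(mu_hat.mean())                  # close to mu, since sum(w) == 1
print(mu_hat.var())                   # close to sum(w**2 * sigma**2)
print(np.sum(w**2 * sigma**2))
```

The empirical mean and variance of `mu_hat` should match $$\mu$$ and $$\sum_i w_i^2 \sigma_i^2$$ up to simulation noise.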

You want to choose the $$w_i$$ so that this last expression is minimized, subject to the constraint $$\sum_i w_i = 1$$.

From here on, use the method of Lagrange (undetermined) multipliers.
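For completeness, here is a sketch of how that multiplier step plays out (this derivation is not spelled out in the reply above). Form the Lagrangian

$$L = \sum_{i=1}^{N} w_i^2 \sigma_i^2 - \lambda\left(\sum_{i=1}^{N} w_i - 1\right)$$

Setting $$\partial L / \partial w_i = 2 w_i \sigma_i^2 - \lambda = 0$$ gives $$w_i = \lambda / (2\sigma_i^2)$$, and imposing $$\sum_i w_i = 1$$ to eliminate $$\lambda$$ yields the inverse-variance weights

$$w_i = \frac{1/\sigma_i^2}{\sum_{j=1}^{N} 1/\sigma_j^2}$$

When all the $$\sigma_i^2$$ are equal, this reduces to $$w_i = 1/N$$, i.e. the ordinary sample mean.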