Weights of a linear estimator

  • Thread starter purplebird

Main Question or Discussion Point

Given

Y(i) = u + e(i), for i = 1, 2, ..., N,

where the e(i) are statistically independent, u is an unknown parameter, each e(i) has mean 0, and the variance of e(i) is [tex]\sigma(i)^2[/tex].

Find weights W(i) such that the linear estimator

[tex]\hat\mu = \sum_{i=1}^{N} W(i)\,Y(i)[/tex]

has

mean value of [tex]\hat\mu[/tex] = u

and [tex]E[(u-\hat\mu)^2][/tex] is a minimum.
 

Answers and Replies

statdad
Homework Helper
There may be other ways to do this, but one of the most direct is outlined below. If
[tex]
\hat\mu = \sum_{i=1}^{N} w(i)\,Y(i)
[/tex]
then in order for [tex] E(\hat \mu) = u [/tex] to be true you must have

[tex]
\sum_{i=1}^{N} w(i) = 1
[/tex]
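
To see why, take expectations term by term, using [tex] E[Y(i)] = u [/tex] for every i:

[tex]
E(\hat\mu) = \sum_{i=1}^{N} w(i)\,E[Y(i)] = u \sum_{i=1}^{N} w(i),
[/tex]

which equals u exactly when the weights sum to 1.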

Next, note that [tex] E[(\hat \mu - u)^2] [/tex] is simply the [tex] \textbf{variance} [/tex] of your estimate (since your estimate has expectation [tex] u [/tex]).

Since the [tex] Y(i) [/tex] are independent, the variance of [tex] \hat \mu [/tex] is

[tex]
\mathbf{Var}(\hat \mu) = \sum_{i=1}^{N} w(i)^2\,\mathbf{Var}(Y(i)) = \sum_{i=1}^{N} w(i)^2 \sigma(i)^2
[/tex]

You want to choose the [tex] w(i) [/tex] so that this last expression is minimized, subject to the constraint that [tex] \sum w(i) = 1 [/tex].

From here on, use the method of undetermined (Lagrange) multipliers.
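
A sketch of where the multiplier calculation leads, if you want something to check your own work against: form

[tex]
L = \sum_{i=1}^{N} w(i)^2 \sigma(i)^2 - \lambda\left(\sum_{i=1}^{N} w(i) - 1\right)
[/tex]

and set [tex] \partial L / \partial w(i) = 0 [/tex], which gives [tex] 2\,w(i)\,\sigma(i)^2 = \lambda [/tex], so each w(i) is proportional to [tex] 1/\sigma(i)^2 [/tex]. Imposing [tex] \sum w(i) = 1 [/tex] then yields

[tex]
w(i) = \frac{1/\sigma(i)^2}{\sum_{j=1}^{N} 1/\sigma(j)^2}
[/tex]

that is, each observation is weighted by the reciprocal of its variance. (When all the [tex] \sigma(i)^2 [/tex] are equal, this reduces to [tex] w(i) = 1/N [/tex], the ordinary sample mean.)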
 
