Weights of a linear estimator

  1. Apr 15, 2008 #1
    [tex]Y_i = u + e_i, \quad i = 1, 2, \dots, N[/tex]
    such that the e_i are statistically independent and u is a parameter,
    the mean of each e_i is 0,
    and its variance is [tex]\sigma_i^2[/tex].

    Find the weights W_i such that the linear estimator

    [tex]\hat\mu = \sum_{i=1}^{N} W_i Y_i[/tex]

    has mean value [tex]E(\hat\mu) = u[/tex]

    and [tex]E[(u - \hat\mu)^2][/tex] is a minimum.
  2. Jul 25, 2008 #2


    Homework Helper

    There may be other ways to do this, but one of the most direct is outlined below.
    If [tex]\hat\mu = \sum_{i=1}^{N} W_i Y_i,[/tex]
    then in order for [tex]E(\hat\mu) = u[/tex] to be true you must have

    [tex]\sum_{i=1}^{N} W_i = 1[/tex]
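
    To spell out where that constraint comes from (using only the model in the original post):

    [tex]E(\hat\mu) = \sum_{i=1}^{N} W_i\, E(Y_i) = \sum_{i=1}^{N} W_i\, u = u \sum_{i=1}^{N} W_i,[/tex]

    which equals u for every value of u exactly when [tex]\sum_i W_i = 1[/tex].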

    Next, note that [tex]E[(\hat\mu - u)^2][/tex] is simply the variance of your estimate (since your estimate has expectation u).

    Since the [tex]Y_i[/tex] are independent, the variance of [tex]\hat\mu[/tex] is

    [tex]\mathrm{Var}(\hat\mu) = \sum_{i=1}^{N} W_i^2\, \mathrm{Var}(Y_i) = \sum_{i=1}^{N} W_i^2 \sigma_i^2[/tex]
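
    Spelling out that step (using only the independence and zero-mean assumptions from the problem statement):

    [tex]\mathrm{Var}(\hat\mu) = E\Big[\Big(\sum_i W_i e_i\Big)^2\Big] = \sum_i \sum_j W_i W_j\, E(e_i e_j) = \sum_i W_i^2 \sigma_i^2,[/tex]

    since for [tex]i \ne j[/tex] independence and zero means give [tex]E(e_i e_j) = E(e_i)\,E(e_j) = 0[/tex].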

    You want to choose the [tex]W_i[/tex] so that this last expression is minimized, subject to the constraint [tex]\sum_i W_i = 1[/tex].

    From here on, use the method of undetermined (Lagrange) multipliers.
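
    A sketch of that last step, keeping the unequal variances [tex]\sigma_i^2[/tex] from the problem statement: form the Lagrangian

    [tex]L(W, \lambda) = \sum_{i=1}^{N} W_i^2 \sigma_i^2 - \lambda \Big( \sum_{i=1}^{N} W_i - 1 \Big).[/tex]

    Setting [tex]\partial L / \partial W_i = 2 W_i \sigma_i^2 - \lambda = 0[/tex] gives [tex]W_i = \lambda / (2\sigma_i^2)[/tex]; the constraint [tex]\sum_i W_i = 1[/tex] then fixes [tex]\lambda[/tex], so

    [tex]W_i = \frac{1/\sigma_i^2}{\sum_{j=1}^{N} 1/\sigma_j^2}.[/tex]

    In the equal-variance case [tex]\sigma_i^2 = \sigma^2[/tex] this reduces to [tex]W_i = 1/N[/tex], i.e. the ordinary sample mean.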