
Homework Help: Statistics Unbiased Estimators

  1. Jan 16, 2008 #1
    1. Let X1, X2, ..., Xn be independent identically distributed random variables with expected value [tex]\mu[/tex] and variance [tex]\sigma^2[/tex]. Consider the class of linear estimators of the form
    [tex]\hat{\mu} = a_1 X_1 + a_2 X_2 + \cdots + a_n X_n \qquad (1)[/tex]
    for the parameter [tex]\mu[/tex], where [itex]a_1, a_2, \ldots, a_n[/itex] are arbitrary constants.
    a) Find the expected value of the estimator [tex]\hat{\mu}[/tex].
    b) Find the variance of this estimator.
    c) When is [tex]\hat{\mu}[/tex] an unbiased estimator of [tex]\mu[/tex]?
    d) Among all linear unbiased estimators of the above form (1), find the minimum
    variance estimator.
    Hint: Use the Cauchy-Schwarz inequality.
    e) What is the variance of the best unbiased estimator of the form (1)?

    I am really lost and confused.
    For (a) I got the expected value as (X1 + X2 + ... + Xn)/n. Is this correct, or do I need to include the arbitrary constants? I thought about it again and got something totally different: I summed up a1 E[X1] + a2 E[X2] + ... + an E[Xn], and wrote it in general form as [itex]\sum a_i E[X_i][/itex].
    For (b), is the variance just [itex]Var(\hat{\mu}) = \sigma^2 \sum_{i=1}^{n} a_i^2[/itex]?
    For (c): when the Cramer-Rao inequality is satisfied?
    I have not attempted (d) or (e) yet, as I want to confirm that I am on the right track for (a), (b) and (c), which I think I'm not.
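    The expectation and variance formulas for a linear estimator can be sanity-checked numerically. A short Python sketch (the weights, mean, and standard deviation below are illustrative choices, not part of the problem):

    ```python
    import random

    # Monte Carlo check of E[mu_hat] = mu * sum(a_i) and
    # Var[mu_hat] = sigma^2 * sum(a_i^2) for mu_hat = a1*X1 + ... + an*Xn.
    random.seed(0)
    mu, sigma = 5.0, 2.0                 # illustrative population parameters
    a = [0.1, 0.3, 0.2, 0.4]             # arbitrary constants a_i (sum to 1 here)
    trials = 200_000

    samples = []
    for _ in range(trials):
        x = [random.gauss(mu, sigma) for _ in a]            # i.i.d. draws
        samples.append(sum(ai * xi for ai, xi in zip(a, x)))

    mean_hat = sum(samples) / trials
    var_hat = sum((s - mean_hat) ** 2 for s in samples) / trials

    print(mean_hat)  # close to mu * sum(a) = 5.0
    print(var_hat)   # close to sigma^2 * sum(a_i^2) = 4 * 0.30 = 1.2
    ```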
  3. Jan 17, 2008 #2



    (a) Is [itex]\hat{\mu}[/itex] defined with arbitrary constants [itex]a_i[/itex], or with [itex]a_i = 1/n[/itex]? That determines your answer.

    [itex]\sum a_i E[X_i][/itex] is right; now substitute in [itex]E[X_i] =[/itex] ___ (fill in the blank).

    (b) The formula is [itex]Var[\sum a_i X_i] = \sum a_i^2 Var[X_i][/itex]; check your answer against this.

    (c) No; what is the definition of an unbiased estimator?
  4. Jan 17, 2008 #3
    So these are my revised answers for (a)-(c):
    a: [itex]\mu (a_1 + a_2 + \cdots + a_n)[/itex]
    b: [itex]\sigma^2 (a_1 + a_2 + \cdots + a_n)[/itex]
    c: The expectation of the estimator has to equal [itex]\mu[/itex], so in (a) the factor (a1 + a2 + ...) has to equal 1.
    As for (d), I am unsure how to start, but I have noticed that the LHS of Cauchy-Schwarz can be replaced by 1^2, since it is identical to (a1 + a2 + ...) but in summation form. I don't know how to go on from there, though.
    As for (e), would it just equal the answer in (d), since they are supposed to be equal? I found in my notes that the best unbiased estimator = the minimum variance estimator.
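    The Cauchy-Schwarz step the hint points at can be checked numerically: under the unbiasedness constraint [itex]\sum a_i = 1[/itex], the quantity [itex]\sum a_i^2[/itex] is never smaller than 1/n, with equality at the equal weights a_i = 1/n. A Python sketch (n and the random trials are illustrative):

    ```python
    import random

    # With weights constrained to sum(a_i) = 1, Cauchy-Schwarz gives
    # 1 = (sum a_i * 1)^2 <= n * sum a_i^2, i.e. sum a_i^2 >= 1/n,
    # with equality at the equal weights a_i = 1/n.
    random.seed(1)
    n = 5
    equal = sum((1.0 / n) ** 2 for _ in range(n))   # = 1/n

    for _ in range(10_000):
        raw = [random.random() for _ in range(n)]
        total = sum(raw)
        a = [r / total for r in raw]                # normalized: sum(a) = 1
        assert sum(ai * ai for ai in a) >= equal - 1e-12

    print(equal)  # 1/n = 0.2, up to floating point
    ```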
  5. Jan 18, 2008 #4


    User Avatar
    Science Advisor
    Homework Helper

    b isn't right.

    The LHS of C-S is a.X, where a and X are vectors and . is the dot product.

    For this problem C-S is:

    [itex](a \cdot X)^2 \le \|a\|^2 \|X\|^2[/itex]

    where [itex]\|a\| = \sqrt{a \cdot a}[/itex] and [itex]\|X\| = \sqrt{X \cdot X}[/itex]. How did you get to 1^2?

    I don't understand your answer to e. It is asking for a variance.
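    For the variance asked about in (e): with the weights a_i = 1/n, the formula [itex]\sigma^2 \sum a_i^2[/itex] reduces to [itex]\sigma^2 / n[/itex]. A Python sketch comparing the sample mean against another unbiased weighting (all parameters below are illustrative):

    ```python
    import random

    # Part (e): with the minimum-variance unbiased weights a_i = 1/n
    # (the sample mean), the variance sigma^2 * sum(a_i^2) is sigma^2 / n.
    random.seed(2)
    mu, sigma, n = 5.0, 2.0, 4           # illustrative parameters
    trials = 200_000

    def est_var(a):
        """Simulated variance of the estimator sum(a_i * X_i)."""
        vals = [sum(ai * random.gauss(mu, sigma) for ai in a)
                for _ in range(trials)]
        m = sum(vals) / trials
        return sum((v - m) ** 2 for v in vals) / trials

    v_equal = est_var([1.0 / n] * n)          # the sample mean
    v_skewed = est_var([0.4, 0.3, 0.2, 0.1])  # also unbiased: weights sum to 1

    print(v_equal)             # close to sigma^2 / n = 1.0
    print(v_equal < v_skewed)  # the sample mean has the smaller variance
    ```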