Statistics: Unbiased Estimators

Homework Help Overview

The discussion revolves around the topic of unbiased estimators in statistics, specifically focusing on linear estimators of the expected value of a random variable. Participants are exploring the expected value, variance, and conditions for unbiasedness of these estimators.

Discussion Character

  • Exploratory, Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • Participants are attempting to derive the expected value and variance of the linear estimator, questioning whether arbitrary constants should be included in their calculations. There is also discussion about the conditions under which the estimator is unbiased and the implications of the Cramer-Rao inequality.

Discussion Status

Some participants have revised their initial answers and are seeking confirmation on their reasoning. There is ongoing exploration of the Cauchy-Schwarz inequality and its application to the problem, with some uncertainty about the definitions and relationships involved. Multiple interpretations and approaches are being discussed without a clear consensus.

Contextual Notes

Participants express confusion regarding the definitions and calculations related to unbiased estimators, particularly in relation to the arbitrary constants and the application of inequalities. There is a hint of imposed homework rules as participants are encouraged to confirm their understanding before proceeding with further parts of the problem.

braindead101
1. Let X_1, X_2, ..., X_n be independent identically distributed random variables with expected value \mu and variance \sigma^2. Consider the class of linear estimators of the form

\hat{\mu} = a_1 X_1 + a_2 X_2 + ... + a_n X_n   (1)

for the parameter \mu, where a_1, a_2, ..., a_n are arbitrary constants.
a) Find the expected value of the estimator \hat{\mu}.
b) Find the variance of this estimator.
c) When is \hat{\mu} an unbiased estimator of \mu?
d) Among all linear unbiased estimators of the form (1), find the minimum variance estimator.
Hint: use the Cauchy-Schwarz inequality.
e) What is the variance of the best unbiased estimator of the form (1)?

I am really lost and confused.
For (a) I first got (X_1 + X_2 + ... + X_n)/n. Is this correct, or do I need to include the arbitrary constants? I thought about it again and got something totally different: I summed a_1 E[X_1] + a_2 E[X_2] + ... + a_n E[X_n], and wrote a general formula for it as \sum_i a_i E[X_i].
For (b), is the variance just Var(\hat{\mu}) = \sigma^2 \sum_{i=1}^n a_i^2?
For (c), is it when the Cramer-Rao inequality is satisfied?
I have not attempted (d) or (e) yet because I want to confirm that I am on the right track for (a), (b), and (c), which I suspect I'm not.
 
(a) Is \hat{\mu} defined with arbitrary constants, or with 1/n? That tells you which expression to take the expectation of.

Your \sum_i a_i E[X_i] is right; now substitute E[X_i] = ___ (fill in the blank).

(b) The formula is Var[\sum a_i X_i] = \sum a_i^2 Var[X_i] (for independent X_i); check your answer against this.

(c) No; what is the definition of an unbiased estimator?
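The two formulas hinted at above can be sanity-checked numerically. Here is a small Monte Carlo sketch of my own (the weights and parameter values are made up for illustration): for arbitrary constants a_i, the sample mean and variance of \sum a_i X_i should land near \mu \sum a_i and \sigma^2 \sum a_i^2.

```python
import numpy as np

# Hypothetical illustration: check that for arbitrary weights a_i,
#   E[sum a_i X_i]   = mu * sum(a_i)
#   Var[sum a_i X_i] = sigma^2 * sum(a_i^2)   (X_i independent)
rng = np.random.default_rng(0)
mu, sigma = 3.0, 2.0
a = np.array([0.5, 0.3, 0.1, 0.4])          # arbitrary constants
X = rng.normal(mu, sigma, size=(1_000_000, a.size))
est = X @ a                                  # one estimate per row

print(est.mean(), mu * a.sum())              # close to mu * (a1+...+an)
print(est.var(),  sigma**2 * (a**2).sum())   # close to sigma^2 * sum(ai^2)
```

Note that unless \sum a_i = 1, the mean of the estimates is not \mu, which is exactly the unbiasedness question in part (c).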
 
So these are my revised answers for (a)-(c):
a: \mu (a_1 + a_2 + ... + a_n)
b: \sigma^2 (a_1 + a_2 + ... + a_n)
c: the expectation of the estimator has to equal \mu, so in (a) the (a_1 + a_2 + ...) part has to equal 1.
As for (d), I am unsure how to start, but I have noticed that the LHS of the Cauchy-Schwarz inequality can be replaced by 1^2, because it is identical to (a_1 + a_2 + ...) but in summation form. I don't know where to go from there, though.
As for (e), would it just equal the answer in (d), since they are supposed to be equal? I found in my notes that the best unbiased estimator = the minimum variance estimator.
 
b isn't right.

The LHS of Cauchy-Schwarz is a.X, where a and X are vectors and . is the dot product.

For this problem C-S is:

(a.X)^2 ≤ ||a||^2 ||X||^2

where ||a|| = sqrt[a.a] and ||X|| = sqrt[X.X]. How did you get to 1^2?

I don't understand your answer to (e). It is asking for a variance.
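For part (d), the Cauchy-Schwarz step applied to the vectors (a_1, ..., a_n) and (1, ..., 1) gives 1 = (\sum a_i \cdot 1)^2 ≤ (\sum a_i^2)(\sum 1^2) = n \sum a_i^2, so \sum a_i^2 ≥ 1/n, with equality iff every a_i = 1/n. A quick numerical sketch of my own (not from the thread) illustrating this bound:

```python
import numpy as np

# Sketch of the Cauchy-Schwarz step for (d): for any unbiased weight
# vector (sum a_i = 1), applying C-S with the all-ones vector gives
#   1 = (sum a_i * 1)^2 <= (sum a_i^2) * n,
# i.e. sum a_i^2 >= 1/n, with equality when every a_i = 1/n.
n = 5
rng = np.random.default_rng(1)
for _ in range(4):
    a = rng.random(n)
    a /= a.sum()                 # normalise so the estimator is unbiased
    assert (a**2).sum() >= 1.0 / n - 1e-12

equal = np.full(n, 1.0 / n)
print((equal**2).sum())          # = 1/n, the minimum of sum a_i^2
```

Combined with Var(\hat{\mu}) = \sigma^2 \sum a_i^2, this says the sample mean (a_i = 1/n) is the minimum-variance linear unbiased estimator, which also answers (e): its variance is \sigma^2/n.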
 
