Statistics: Unbiased Estimators

In summary, we are considering the class of linear estimators of the form \hat{\mu} = a_1 X_1 + a_2 X_2 + \dots + a_n X_n for the parameter \mu. The expected value of this estimator is \mu(a_1 + a_2 + \dots + a_n), and its variance is \sigma^2(a_1^2 + a_2^2 + \dots + a_n^2). For the estimator to be unbiased, the arbitrary constants must sum to 1. The minimum variance estimator can then be found with the Cauchy-Schwarz inequality, using the fact that under this constraint the LHS of the inequality equals 1^2. The variance of the best unbiased estimator of this form is \sigma^2/n, attained when every a_i = 1/n, i.e. by the sample mean.
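These formulas are easy to sanity-check numerically. Here is a minimal Monte Carlo sketch in Python/NumPy (the parameter values and weights are arbitrary choices made for illustration):

[code]
# Monte Carlo check of E[mu_hat] = mu * sum(a_i) and Var(mu_hat) = sigma^2 * sum(a_i^2).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma = 2.0, 3.0           # true mean and standard deviation (arbitrary)
a = np.array([0.5, 0.3, 0.2])  # weights summing to 1, so the estimator is unbiased
trials = 200_000

# Each row is one iid sample X_1, ..., X_n; mu_hat = sum_i a_i X_i per row.
X = rng.normal(mu, sigma, size=(trials, a.size))
mu_hat = X @ a

print(mu_hat.mean())      # ~ mu * sum(a) = 2.0
print(mu_hat.var())       # ~ sigma^2 * sum(a**2) = 9 * 0.38 = 3.42
print(sigma**2 / a.size)  # lower bound sigma^2 / n = 3.0, attained at a_i = 1/n
[/code]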
  • #1
braindead101
1. Let X_1, X_2, ..., X_n be independent identically distributed random variables with expected value [tex]\mu[/tex] and variance [tex]\sigma^2[/tex]. Consider the class of linear estimators of the form
[tex]\hat{\mu} = a_1 X_1 + a_2 X_2 + \dots + a_n X_n[/tex] (1)
for the parameter [tex]\mu[/tex], where a_1, a_2, ..., a_n are arbitrary constants.
a) Find the expected value of the estimator [tex]\hat{\mu}[/tex].
b) Find the variance of this estimator.
c) When is [tex]\hat{\mu}[/tex] an unbiased estimator of [tex]\mu[/tex]?
d) Among all linear unbiased estimators of the form (1), find the minimum variance estimator.
Hint: use the Cauchy-Schwarz inequality.
e) What is the variance of the best unbiased estimator of the form (1)?

I am really lost and confused.
For (a) I first got that the expected value is (X_1 + X_2 + ... + X_n)/n. Is this correct, or do I need to include the arbitrary constants? I thought about it again and got something totally different: I summed a_1 E[X_1] + a_2 E[X_2] + ... + a_n E[X_n], and wrote a general formula for it as [itex]\sum_i a_i E[X_i][/itex].
For (b), is the variance just [itex]Var(\hat{\mu}) = \sigma^2 \sum_{i=1}^{n} a_i^2[/itex]?
For (c), is it when the Cramer-Rao inequality is satisfied?
I have not attempted (d) or (e) yet as I want to confirm that I am on the right track for (a) (b) and (c) for which I think I'm not.
 
  • #2
(a) Is [itex]\hat{\mu}[/itex] defined with arbitrary constants, or with 1/n? That tells you which of your two answers is right.

[itex]\sum_i a_i E[X_i][/itex] is right; now substitute E[X_i] = ___ (fill in the blank).

(b) The formula is [itex]Var[\sum a_i X_i] = \sum a_i^2 Var[X_i][/itex], which holds here because the [itex]X_i[/itex] are independent; check your answer against this.

(c) No; what is the definition of an unbiased estimator?
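For completeness, the two computations these hints point at, using linearity of expectation and the independence of the [itex]X_i[/itex]:

[tex]E[\hat{\mu}] = \sum_{i=1}^n a_i E[X_i] = \mu \sum_{i=1}^n a_i, \qquad Var(\hat{\mu}) = \sum_{i=1}^n a_i^2 Var(X_i) = \sigma^2 \sum_{i=1}^n a_i^2[/tex]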
 
  • #3
So these are my revised answers for (a)-(c):
a: [itex]\mu (a_1 + a_2 + \dots + a_n)[/itex]
b: [itex]\sigma^2 (a_1 + a_2 + \dots + a_n)[/itex]
c: the expectation of the estimator has to equal [itex]\mu[/itex], so the [itex](a_1 + a_2 + \dots)[/itex] part in (a) has to equal 1.
As for (d), I am unsure how to start, but I have noticed that the LHS of the Cauchy-Schwarz inequality can be replaced by 1^2, because it is identical to [itex](a_1 + a_2 + \dots)[/itex] but in summation form. I don't know how to go on from there, though.
As for (e), would it just equal the answer in (d), since they are supposed to be equal? I found in my notes that the best unbiased estimator is the minimum variance estimator.
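To make the target of (d) and (e) concrete, here is a small illustrative sketch (arbitrary numbers, n = 4, [itex]\sigma^2 = 1[/itex]) comparing the variance factor [itex]\sum_i a_i^2[/itex] across weight vectors that each sum to 1:

[code]
# Variance factor sum(a_i^2) for several weight vectors that all sum to 1.
import numpy as np

weights = [
    np.array([1.0, 0.0, 0.0, 0.0]),      # use only X_1
    np.array([0.4, 0.3, 0.2, 0.1]),      # unequal weights
    np.array([0.25, 0.25, 0.25, 0.25]),  # equal weights a_i = 1/n
]
for a in weights:
    print(a, (a ** 2).sum())  # 1.0, 0.30, 0.25 -> smallest at a_i = 1/n
[/code]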
 
  • #4
b isn't right.

The LHS of C-S is a.X, where a and X are vectors and . is the dot product.

For this problem C-S reads:

(a.X)^2 <= (a.a)(X.X), i.e. |a.X| <= ||a|| ||X||

where ||a|| = sqrt[a.a] and ||X|| = sqrt[X.X]. How did you get to 1^2?

I don't understand your answer to e. It is asking for a variance.
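For reference, one standard way to apply Cauchy-Schwarz here pairs the weight vector a with the all-ones vector rather than with the data vector, which is where the 1^2 comes from once [itex]\sum_i a_i = 1[/itex]:

[tex]1 = \left(\sum_{i=1}^n a_i \cdot 1\right)^2 \le \left(\sum_{i=1}^n a_i^2\right)\left(\sum_{i=1}^n 1^2\right) = n \sum_{i=1}^n a_i^2[/tex]

so [itex]\sum_i a_i^2 \ge 1/n[/itex] and [itex]Var(\hat{\mu}) = \sigma^2 \sum_i a_i^2 \ge \sigma^2/n[/itex], with equality exactly when every [itex]a_i = 1/n[/itex], i.e. when [itex]\hat{\mu}[/itex] is the sample mean. That answers (e) as well: the minimum variance is [itex]\sigma^2/n[/itex].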
 

What is the definition of an unbiased estimator in statistics?

An unbiased estimator in statistics is a statistic whose expected value equals the population parameter it is estimating. This means that, averaged over repeated samples, the estimator neither systematically overshoots nor undershoots the true value of the parameter.
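In symbols, writing [itex]\hat{\theta}[/itex] for an estimator of a generic parameter [itex]\theta[/itex]:

[tex]E[\hat{\theta}] = \theta \ \text{ for every value of } \theta, \quad \text{equivalently} \quad \mathrm{bias}(\hat{\theta}) = E[\hat{\theta}] - \theta = 0.[/tex]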

How do you determine if an estimator is unbiased?

To determine whether an estimator is unbiased, calculate its expected value (the mean of its sampling distribution) and compare it to the true value of the parameter. If the expected value equals the true value for every value of the parameter, the estimator is unbiased. You can also use simulation, or a mathematical proof, to show that an estimator is unbiased; see the sketch below.
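For instance, here is a minimal simulation sketch (assuming a normal model with arbitrarily chosen parameters). It suggests that the sample median is also unbiased for the mean of a symmetric distribution, but with a larger variance than the sample mean, which is why unbiasedness alone does not single out one best estimator:

[code]
# Simulation check of unbiasedness: average many estimates, compare to the truth.
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, n, trials = 5.0, 2.0, 25, 100_000

samples = rng.normal(mu, sigma, size=(trials, n))
medians = np.median(samples, axis=1)
means = samples.mean(axis=1)

print(medians.mean(), means.mean())  # both ~ 5.0: both appear unbiased
print(medians.var(), means.var())    # ~ 0.25 vs ~ 0.16: the mean has lower variance
[/code]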

What are the consequences of using a biased estimator?

If a biased estimator is used, the resulting estimates will be systematically higher or lower than the true value of the parameter. This can lead to incorrect conclusions and decisions based on the data. It is important to use unbiased estimators in order to obtain accurate and reliable results in statistical analysis.

Can an estimator be both unbiased and consistent?

Yes, an estimator can be both unbiased and consistent. A consistent estimator is one that produces estimates that get closer and closer to the true value of the parameter as the sample size increases. It is possible for an estimator to be unbiased but not consistent, or consistent but not unbiased. However, the ideal scenario is to have an estimator that is both unbiased and consistent.

Are there any commonly used biased estimators in statistics?

Yes, there are commonly used biased estimators in statistics. For example, the sample variance computed with divisor n is a biased estimator of the population variance: its expectation is (n-1)/n times the true variance, so it is systematically too small. The unbiased version divides by n - 1 instead (Bessel's correction). It is important to be aware of the potential bias in an estimator and to choose the most appropriate one for the specific situation.
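For example, with NumPy, np.var uses the divisor n by default and passing ddof=1 switches to the divisor n - 1; a quick illustrative check on simulated data:

[code]
# Biased (divisor n) vs. unbiased (divisor n - 1) sample variance estimates.
import numpy as np

rng = np.random.default_rng(2)
data = rng.normal(0.0, 1.0, size=(100_000, 10))  # samples of size n = 10, true variance 1

biased = data.var(axis=1)            # divisor n: expectation is (n-1)/n * sigma^2 = 0.9
unbiased = data.var(axis=1, ddof=1)  # divisor n - 1 (Bessel's correction)

print(biased.mean())    # ~ 0.9, systematically below the true value 1
print(unbiased.mean())  # ~ 1.0
[/code]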
