Why are my T1 and T2 values so different if they are both unbiased estimators?

In summary, an unbiased estimator is a statistic whose expected value equals the true value of the population parameter being estimated, so it does not systematically over- or underestimate that parameter. Unbiased estimators matter because they support reliable statistical inference. Common examples include the sample mean, the sample proportion, and the sample variance computed with the n − 1 denominator. Even though an estimator is unbiased, its individual estimates can still vary widely around the true value.
  • #1
BlueScreenOD

Homework Statement

See attached.

Homework Equations

The Attempt at a Solution

I have no issues with part A. I simply took the expected value of T1 and T2 and everything turned out fine. What I'm having issues with is part B.

I have:
T1 = (4 / n) * 1997 - 2 = 0.08075
T2 = (4 / n) * 32 = 0.03334

Why are my answers so different if they are both unbiased estimators? Shouldn't they at least be somewhat close to each other?
 

Attachments

  • q19.7.GIF
  • #2
Your answers are correct. The expected value of T1 is θ. However, any observed value of T1 may or may not be close to θ.
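The key point is that unbiasedness constrains only the average of an estimator over many samples, not how far any single estimate can wander; that spread is governed by the estimator's variance, and two unbiased estimators of the same θ can have very different variances. Here is a minimal simulation sketch of that phenomenon (it uses a generic Uniform(0, θ) model rather than the distribution in the attached problem, so the estimators T1 and T2 below are only illustrative):

import numpy as np

# Sketch: X_1, ..., X_n ~ Uniform(0, theta).  Both
#   T1 = 2 * sample mean          and
#   T2 = (n + 1) / n * sample max
# are unbiased for theta, but T1 has a much larger variance,
# so a single pair of observed values can be far apart.
rng = np.random.default_rng(0)
theta, n, reps = 1.0, 20, 100_000

x = rng.uniform(0.0, theta, size=(reps, n))
t1 = 2.0 * x.mean(axis=1)
t2 = (n + 1) / n * x.max(axis=1)

print("average of T1 over many samples:", t1.mean())  # close to theta
print("average of T2 over many samples:", t2.mean())  # close to theta
print("standard deviation of T1:", t1.std())          # roughly 0.13
print("standard deviation of T2:", t2.std())          # roughly 0.05
print("one observed pair (T1, T2):", t1[0], t2[0])

Averaged over many samples, both estimators sit near θ, yet any one sample can give noticeably different values for the two of them, just as in your computation.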
 

What is an unbiased estimator?

An unbiased estimator is a statistic whose expected value equals the true value of the population parameter being estimated. In other words, it does not systematically overestimate or underestimate that parameter, although any single estimate may still miss it.
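
In symbols, if T = T(X_1, ..., X_n) is a statistic computed from a sample whose distribution depends on a parameter θ, unbiasedness means

\[
\mathbb{E}_{\theta}\!\left[\,T(X_1,\dots,X_n)\,\right] = \theta
\qquad \text{for every possible value of } \theta .
\]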

Why is it important to have unbiased estimators?

Unbiased estimators support accurate and reliable statistical inference: because the estimates are not systematically skewed above or below the true parameter value, estimation errors tend to cancel out over repeated sampling rather than accumulate in one direction.

How do you determine if an estimator is unbiased?

An estimator is unbiased if its expected value equals the parameter being estimated, for every possible value of that parameter. In practice, you check this by computing the expected value of the estimator under the assumed model and comparing it with the true parameter.
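
A standard worked example: if X_1, ..., X_n are drawn from a population with mean μ, the sample mean is unbiased for μ because expectation is linear:

\[
\mathbb{E}\!\left[\bar{X}\right]
= \mathbb{E}\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right]
= \frac{1}{n}\sum_{i=1}^{n}\mathbb{E}[X_i]
= \frac{1}{n}\cdot n\mu
= \mu .
\]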

What are some examples of unbiased estimators?

The sample mean, the sample proportion, and the sample variance computed with the n − 1 denominator (Bessel's correction) are all examples of unbiased estimators. In linear regression, the ordinary least squares (OLS) coefficient estimator is also unbiased under the standard assumptions, in particular that the errors have zero mean given the regressors.
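
As a quick numerical illustration of why the n − 1 denominator matters, here is a small simulation sketch (the normal model and the specific numbers are just illustrative assumptions):

import numpy as np

# Sketch: compare the n-denominator and (n-1)-denominator variance
# estimators on many simulated samples from N(mu, sigma^2).
rng = np.random.default_rng(1)
mu, sigma, n, reps = 0.0, 2.0, 10, 200_000

x = rng.normal(mu, sigma, size=(reps, n))

biased = x.var(axis=1, ddof=0)     # divide by n
unbiased = x.var(axis=1, ddof=1)   # divide by n - 1

print("true variance:", sigma**2)                                   # 4.0
print("average of n-denominator estimator:", biased.mean())         # about 3.6
print("average of (n-1)-denominator estimator:", unbiased.mean())   # about 4.0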

Can an unbiased estimator still have high variability?

Yes, an unbiased estimator can still have high variability. Unbiasedness only guarantees that the estimates are centered on the true parameter value on average; it says nothing about the estimator's variance, so individual estimates can still fall far from the true value.
