Finding the MVUE of a two-sided interval of a normal

  • Context: Graduate
  • Thread starter: rayge
  • Tags: Interval, Normal
SUMMARY

The discussion focuses on determining the minimum variance unbiased estimator (MVUE) of the probability P(-c ≤ X ≤ c) for a sample from a normal N(θ, 1) distribution. The challenge arises from symmetry: P(-c ≤ X ≤ c) takes the same value at θ and -θ, which suggests the MVUE may not be unique. Participants propose constructing MVUEs for P(X ≤ c) and P(X ≤ -c) and subtracting them, and also consider estimating θ from the sample mean and plugging that estimate in. Ultimately, the consensus is that a unique MVUE may not be achievable in this scenario.

PREREQUISITES
  • Understanding of minimum variance unbiased estimators (MVUE)
  • Familiarity with Rao-Blackwell and Lehmann-Scheffé theorems
  • Knowledge of normal distribution properties, specifically N(θ, 1)
  • Proficiency in statistical estimation techniques, including sample mean estimation
NEXT STEPS
  • Research the application of Rao-Blackwell theorem in constructing MVUEs
  • Study the implications of symmetry in probability distributions
  • Explore advanced statistical techniques for unbiased estimation
  • Investigate the properties of estimators in the context of normal distributions
USEFUL FOR

Statisticians, data analysts, and researchers involved in statistical estimation and hypothesis testing, particularly those working with normal distributions and unbiased estimators.

rayge
Our task is to determine if P(-c \le X \le c) has a minimum variance unbiased estimator for a sample from a distribution that is N(\theta,1). In the one-sided case, P(X \le c) = \Phi(c - \theta) is a one-to-one function of \theta, so constructing an MVUE is just a matter of applying Rao-Blackwell and Lehmann-Scheffé.

However for our case, P(-c \le X \le c) is the same for \theta and -\theta. So it seems like the MVUE isn't unique. I'm wondering if you can make a decision rule such as choosing one unbiased estimator when \theta \ge 0 and the other when \theta < 0, but then instead of two non-unique unbiased estimators, we have three. Any thoughts? Is an MVUE just not possible?
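
For concreteness, here is a short Python sketch (not part of the original thread) of the Rao-Blackwell/Lehmann-Scheffé construction in the one-sided case: conditioning the indicator of X_1 \le c on the complete sufficient statistic \bar{X} gives \Phi((c - \bar{X})\sqrt{n/(n-1)}), and a quick Monte Carlo run checks that this is unbiased for \Phi(c - \theta). The values of n, c, and \theta below are arbitrary illustrative choices.

```python
# Monte Carlo sanity check of the one-sided Rao-Blackwellized estimator (illustrative sketch)
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n, c, theta = 10, 1.5, 0.7      # arbitrary illustrative values, not from the thread
reps = 200_000

# Each row is one sample of size n from N(theta, 1)
x = rng.normal(theta, 1.0, size=(reps, n))
xbar = x.mean(axis=1)

# Rao-Blackwellizing the indicator 1{X1 <= c} on Xbar:
# X1 | Xbar ~ N(Xbar, (n - 1)/n), so
# E[1{X1 <= c} | Xbar] = Phi((c - Xbar) * sqrt(n / (n - 1)))
u_one_sided = norm.cdf((c - xbar) * np.sqrt(n / (n - 1)))

print("target Phi(c - theta):", norm.cdf(c - theta))
print("Monte Carlo mean     :", u_one_sided.mean())  # should agree up to simulation error
```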
 
rayge said:
Our task is to determine if P(-c \le X \le c) has a minimum variance unbiased estimator for a sample from a distribution that is N(\theta,1).

I assume this means that c is given and \theta is unknown. So you can't employ a rule that depends on knowing the sign of \theta.
 
What if we construct two MVUEs, one for P(X \le c) and one for P(X \le -c), and then subtract one from the other? It still seems like we have the same problem, in that P(-c \le X \le c) is not a one-to-one function of \theta...
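
A sketch of that idea (again, not from the thread): subtract the two one-sided Rao-Blackwellized estimators from the sketch above. Since each piece is unbiased for its own \Phi term, the difference is an unbiased estimator of P(-c \le X \le c) = \Phi(c - \theta) - \Phi(-c - \theta); the Monte Carlo check below illustrates this numerically with the same arbitrary n, c, \theta as before.

```python
# Difference of the two one-sided Rao-Blackwellized estimators (illustrative sketch)
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
n, c, theta = 10, 1.5, 0.7      # arbitrary illustrative values
reps = 200_000

xbar = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)
k = np.sqrt(n / (n - 1))

# Estimator of P(-c <= X <= c): one-sided estimator at c minus the one at -c
est = norm.cdf((c - xbar) * k) - norm.cdf((-c - xbar) * k)

print("target  :", norm.cdf(c - theta) - norm.cdf(-c - theta))
print("MC mean :", est.mean())  # unbiased, so these agree up to Monte Carlo error
```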
 
rayge said:
However for our case, P(-c \le X \le c) is the same for \theta and -\theta.

There is ambiguity if you estimate P(-c \le X \le c) first and try to estimate \theta from that estimate. However, the problem as you stated it doesn't insist we estimate \theta in that manner. Wouldn't the simplest try be to estimate \theta from the sample mean and then estimate P(-c \le X \le c) from that estimate?
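
To make that suggestion concrete, here is a plug-in sketch (not from the thread): substitute the sample mean \bar{X} for \theta in \Phi(c - \theta) - \Phi(-c - \theta). Because \Phi is nonlinear in \bar{X}, some bias is expected; the Monte Carlo comparison below shows how the average of this plug-in estimator compares to the true value, with the same arbitrary n, c, \theta as before.

```python
# Plug-in estimator: replace theta with the sample mean (illustrative sketch)
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n, c, theta = 10, 1.5, 0.7      # arbitrary illustrative values
reps = 200_000

xbar = rng.normal(theta, 1.0, size=(reps, n)).mean(axis=1)

# Estimate P(-c <= X <= c) by substituting xbar for theta
plug_in = norm.cdf(c - xbar) - norm.cdf(-c - xbar)

print("target  :", norm.cdf(c - theta) - norm.cdf(-c - theta))
print("MC mean :", plug_in.mean())  # a small bias is expected, since Phi is nonlinear in xbar
```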
 
