Finding the MVUE of a two-sided interval of a normal

  • Context: Graduate 
  • Thread starter: rayge
  • Tags: Interval, Normal

Discussion Overview

The discussion centers on whether there exists a minimum variance unbiased estimator (MVUE) for the probability P(-c ≤ X ≤ c) given a sample from a normal distribution N(θ, 1). The scope includes theoretical considerations of estimation and properties of unbiased estimators.

Discussion Character

  • Exploratory, Technical explanation, Debate/contested

Main Points Raised

  • One participant suggests that the MVUE for P(-c ≤ X ≤ c) may not be unique because the probability takes the same value at θ and -θ.
  • Another participant points out that since θ is unknown, a decision rule based on the sign of θ cannot be employed.
  • A different approach is proposed involving the construction of two MVUEs for P(X ≤ c) and P(X ≤ -c), but concerns are raised about the uniqueness of the MVUE in this context.
  • One participant notes the ambiguity in estimating P(-c ≤ X ≤ c) if one first estimates θ from the probability, suggesting instead to estimate θ from the sample mean before estimating the probability.

Areas of Agreement / Disagreement

Participants express differing views on the uniqueness of the MVUE and the methods for estimating it, indicating that multiple competing perspectives remain without a consensus on the existence of a unique MVUE.

Contextual Notes

There are limitations regarding the assumptions about the estimator's dependence on the sign of θ and the implications of estimating θ from the probability itself, which remain unresolved.

rayge
Our task is to determine if [itex]P(-c \le X \le c)[/itex] has a minimum variance unbiased estimator for a sample from a distribution that is [itex]N(\theta,1)[/itex]. The one-sided probability [itex]P(X \le c) = \Phi(c - \theta)[/itex] is a one-to-one function of [itex]\theta[/itex], so constructing an MVUE is just a matter of applying Rao-Blackwell and Lehmann-Scheffé.

However for our case, [itex]P(-c \le X \le c)[/itex] is the same for [itex]\theta[/itex] and [itex]-\theta[/itex]. So it seems the MVUE isn't unique. I'm wondering whether you can make a decision rule, like choosing one unbiased estimator when [itex]\theta \ge 0[/itex] and the other when [itex]\theta < 0[/itex], but then instead of two non-unique unbiased estimators we have three. Any thoughts? Is an MVUE just not possible?
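A quick numerical check of the symmetry in question (a sketch; `norm_cdf` and `interval_prob` are helper names introduced here, not anything from the thread):

```python
import math

def norm_cdf(x):
    """Standard normal CDF, written with math.erf to avoid extra dependencies."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def interval_prob(theta, c):
    """P(-c <= X <= c) when X ~ N(theta, 1): Phi(c - theta) - Phi(-c - theta)."""
    return norm_cdf(c - theta) - norm_cdf(-c - theta)

# The probability is an even function of theta, so theta and -theta
# are indistinguishable from this quantity alone.
print(interval_prob(1.0, 1.5), interval_prob(-1.0, 1.5))
```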
 
rayge said:
Our task is to determine if [itex]P(-c \le X \le c)[/itex] has a minimum variance unbiased estimator for a sample from a distribution that is [itex]N(\theta,1)[/itex].

I assume this means that [itex]c[/itex] is given and [itex]\theta[/itex] is unknown. So you can't employ a rule that depends on knowing the sign of [itex]\theta[/itex].
 
What if we construct two MVUEs, one for [itex]P(X \le c)[/itex] and one for [itex]P(X \le -c)[/itex], and then subtract one from the other? It still seems like we have the same problem: the target probability is not a one-to-one function of [itex]\theta[/itex]...
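A sketch of this subtraction idea. It assumes the conditional-probability (Rao-Blackwell) form often quoted for the one-sided estimator in an [itex]N(\theta, 1)[/itex] sample, [itex]\Phi\left((c - \bar{X})/\sqrt{1 - 1/n}\right)[/itex]; treat that scaling factor as an assumption to verify, not a given:

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def difference_estimator(sample, c):
    """Difference of two one-sided estimates built from the sample mean.

    Uses the conditional-probability form Phi((c - xbar) / sqrt(1 - 1/n))
    for P(X <= c) and subtracts the analogous estimate at -c. The
    sqrt(1 - 1/n) scaling is an assumption, not something from the thread.
    """
    n = len(sample)
    xbar = sum(sample) / n
    scale = math.sqrt(1.0 - 1.0 / n)
    return norm_cdf((c - xbar) / scale) - norm_cdf((-c - xbar) / scale)
```

By linearity of expectation, the difference of two unbiased one-sided estimates is unbiased for the two-sided probability, which a small Monte Carlo run can check.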
 
rayge said:
However for our case, [itex]P(-c \le X \le c)[/itex] is the same for [itex]\theta[/itex] and [itex]-\theta[/itex].

There is ambiguity if you estimate [itex]P(-c \le X \le c)[/itex] first and try to estimate [itex]\theta[/itex] from that estimate. However, the problem you stated doesn't insist we estimate [itex]\theta[/itex] in that manner. Wouldn't the simplest approach be to estimate [itex]\theta[/itex] from the sample mean and then estimate [itex]P(-c \le X \le c)[/itex] from that estimate?
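That plug-in approach can be sketched as follows (the function name is just for illustration; this estimator is consistent but not claimed to be unbiased):

```python
import math
import random

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def plugin_estimate(sample, c):
    """Plug-in estimate of P(-c <= X <= c): substitute the sample mean
    for theta in Phi(c - theta) - Phi(-c - theta)."""
    theta_hat = sum(sample) / len(sample)
    return norm_cdf(c - theta_hat) - norm_cdf(-c - theta_hat)

random.seed(0)
theta, c, n = 0.5, 1.5, 10_000
sample = [random.gauss(theta, 1.0) for _ in range(n)]
print(plugin_estimate(sample, c))
```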
 
