- #1
rayge
Our task is to determine whether [itex]P(-c \le X \le c)[/itex] has a minimum variance unbiased estimator (MVUE) for a sample from an [itex]N(\theta,1)[/itex] distribution. The one-sided probability [itex]P(X \le c) = \Phi(c - \theta)[/itex] determines [itex]\theta[/itex] uniquely, so there constructing an MVUE is just a matter of applying Rao-Blackwell and Lehmann-Scheffe.
However, in our case [itex]P(-c \le X \le c)[/itex] takes the same value at [itex]\theta[/itex] and [itex]-\theta[/itex], so it seems like the MVUE isn't unique. I'm wondering if you can make a decision rule like choosing one unbiased estimator when [itex]\theta \ge 0[/itex] and the other when [itex]\theta < 0[/itex], but then instead of two non-unique unbiased estimators, we have three. Any thoughts? Is an MVUE just not possible?
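To make the symmetry concrete, here is a small numerical sketch (the helper names `phi` and `coverage` are mine, not from the problem) that evaluates [itex]P(-c \le X \le c) = \Phi(c - \theta) - \Phi(-c - \theta)[/itex] and checks that it is unchanged when [itex]\theta[/itex] is replaced by [itex]-\theta[/itex]:

```python
# Sketch: the estimand p(theta) = P(-c <= X <= c) for X ~ N(theta, 1),
# written as Phi(c - theta) - Phi(-c - theta), is an even function of theta.
from math import erf, sqrt

def phi(z):
    """Standard normal CDF, expressed via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def coverage(theta, c):
    """P(-c <= X <= c) when X ~ N(theta, 1)."""
    return phi(c - theta) - phi(-c - theta)

c = 1.0
for theta in (0.0, 0.5, 2.0):
    # p(theta) == p(-theta): the parameter's sign is not identified
    # by this probability alone.
    assert abs(coverage(theta, c) - coverage(-theta, c)) < 1e-12
```

This is just the observation from the question in numerical form: the quantity being estimated cannot distinguish [itex]\theta[/itex] from [itex]-\theta[/itex], even though the model [itex]N(\theta,1)[/itex] itself can.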