Our task is to determine whether P(-c \le X \le c) has a minimum variance unbiased estimator for a sample from a N(\theta, 1) distribution. The one-sided probability P(X \le c) = \Phi(c - \theta) is one-to-one in \theta, so constructing an MVUE is just a matter of applying Rao-Blackwell and Lehmann-Scheffé.
However, in our case, P(-c \le X \le c) is the same for \theta and -\theta. So it seems like the MVUE isn't unique. I'm wondering if you could define a decision rule such as choosing one unbiased estimator when \theta \ge 0 and the other when \theta < 0, but then, instead of two non-unique unbiased estimators, we'd have three. Any thoughts? Is an MVUE just not possible?
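To make the Rao-Blackwell route concrete, here's a quick simulation I sketched (the values of \theta, c, and n are placeholders). It compares the crude unbiased estimator 1\{-c \le X_1 \le c\} with its Rao-Blackwellization, using the standard fact that for an i.i.d. N(\theta, 1) sample, X_1 \mid \bar{X} \sim N(\bar{X}, 1 - 1/n):

```python
import math
import random

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def naive_estimate(sample, c):
    """Unbiased but crude: indicator that the first observation lies in [-c, c]."""
    return 1.0 if -c <= sample[0] <= c else 0.0

def rb_estimate(sample, c):
    """Rao-Blackwellized estimator E[1{-c <= X_1 <= c} | Xbar],
    using X_1 | Xbar ~ N(Xbar, 1 - 1/n) for an i.i.d. N(theta, 1) sample."""
    n = len(sample)
    xbar = sum(sample) / n
    s = math.sqrt(1.0 - 1.0 / n)
    return Phi((c - xbar) / s) - Phi((-c - xbar) / s)

# Monte Carlo comparison at placeholder values of theta, c, n
random.seed(0)
theta, c, n, reps = 0.7, 1.0, 5, 20000
target = Phi(c - theta) - Phi(-c - theta)  # true P(-c <= X <= c)

naive, rb = [], []
for _ in range(reps):
    x = [random.gauss(theta, 1.0) for _ in range(n)]
    naive.append(naive_estimate(x, c))
    rb.append(rb_estimate(x, c))

def mean(v):
    return sum(v) / len(v)

def var(v):
    m = mean(v)
    return sum((u - m) ** 2 for u in v) / (len(v) - 1)

print(f"target = {target:.4f}")
print(f"naive : mean = {mean(naive):.4f}, var = {var(naive):.4f}")
print(f"RB    : mean = {mean(rb):.4f}, var = {var(rb):.4f}")
```

Both estimators come out unbiased in the simulation, and the Rao-Blackwellized one has much smaller variance, regardless of the sign of \theta; that's part of what makes the non-uniqueness worry above confusing to me.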