Bayesian Estimation: Posterior Mean Estimator

In summary, we can derive the Bayesian posterior mean estimator by first finding the marginal for X, then using it to calculate the posterior distribution for \theta, and finally integrating over the posterior's support [x_M, 1] to find the posterior mean. The correct solution is E(\theta | \vec{X}) = \frac{n-1}{n-2} \cdot \frac{x_M^{-(n-2)}-1}{x_M^{-(n-1)}-1}.
  • #1
economist13

Homework Statement



[tex] X_1 , \dots ,X_n \sim U[0, \theta][/tex] iid. [tex]\theta \sim U[0,1][/tex]

Derive the Bayesian posterior mean estimator.

Homework Equations



[tex] f(\theta |\vec{X}) = \frac{f( \vec{X}|\theta)f(\theta )}{f( \vec{X})}[/tex]

The Attempt at a Solution



My line of thinking...

First, the marginal for X, [tex]f( \vec{X}) = \int f( \vec{X}|\theta)f(\theta )d\theta [/tex]

Let [tex] x_M \equiv \max_i \{X_i\} [/tex]. Since it must be that [tex] \theta \geq x_M [/tex], the likelihood vanishes for [tex]\theta < x_M[/tex], so

[tex]f( \vec{X}) = \int_{x_M}^1 \frac{1}{\theta^n} d\theta = \frac{x_M^{-(n-1)}-1}{n-1}[/tex]

Then [tex] f( \theta | \vec{X}) = \frac{1/\theta^n}{\frac{x_M^{-(n-1)}-1}{n-1}} = \frac{n-1}{\theta^n(x_M^{-(n-1)} -1)}[/tex]

Then [tex] E( \theta | \vec{X}) = \int_0^1 \theta f(\theta |X) d \theta = \frac{n-1}{n-2} \frac{1}{1-x_M^{-(n-1)}}[/tex]

I think my bounds of integration are wrong, or something to that effect... where did I go wrong?
 
  • #2


Your marginal and posterior are both correct; the mistake is in the last step. The posterior is supported on [tex][x_M, 1][/tex], so the posterior mean must be integrated over those bounds, not over [tex][0,1][/tex]:

[tex] E(\theta | \vec{X}) = \int_{x_M}^1 \theta f(\theta | \vec{X}) d\theta = \frac{n-1}{x_M^{-(n-1)}-1} \int_{x_M}^1 \theta^{-(n-1)} d\theta = \frac{n-1}{n-2} \cdot \frac{x_M^{-(n-2)}-1}{x_M^{-(n-1)}-1}[/tex]

(for [tex]n > 2[/tex]; the case [tex]n = 2[/tex] produces a logarithm and has to be handled separately). Note that integrating from 0 would be inconsistent with your own observation that [tex]\theta \geq x_M[/tex]: the posterior density is zero below [tex]x_M[/tex].
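As a sanity check, the closed-form posterior mean can be compared against a direct numerical integration of the unnormalised posterior \theta^{-n} on [x_M, 1] (a minimal sketch using numpy; the sample values x_max = 0.8, n = 5 are made up for illustration):

```python
import numpy as np

def posterior_mean_closed_form(x_max, n):
    """Closed-form E(theta | X) for n > 2."""
    return (n - 1) / (n - 2) * (x_max ** -(n - 2) - 1) / (x_max ** -(n - 1) - 1)

def posterior_mean_numeric(x_max, n, m=200_000):
    """Trapezoidal integration of the unnormalised posterior theta^{-n} on [x_max, 1]."""
    theta = np.linspace(x_max, 1.0, m)
    w = theta ** (-n)  # likelihood * flat prior, up to a normalising constant

    def trap(y):
        return np.sum((y[1:] + y[:-1]) / 2) * (theta[1] - theta[0])

    return trap(theta * w) / trap(w)

x_max, n = 0.8, 5
print(posterior_mean_closed_form(x_max, n))  # ~0.8817
print(posterior_mean_numeric(x_max, n))      # agrees to several decimals
```

The two values agreeing for several (x_max, n) pairs is a quick way to convince yourself the bounds of integration are right.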

Hope this helps!
 

1. What is Bayesian estimation?

Bayesian estimation is a statistical method used to infer the parameters of a probability distribution, given some data. It is based on Bayes' theorem, which states that the posterior probability of a parameter is proportional to the likelihood of the data given that parameter multiplied by the prior probability of the parameter.

2. What is the posterior mean estimator?

The posterior mean estimator is a method used to estimate the mean of a probability distribution based on Bayesian estimation. It takes into account both the prior knowledge about the mean and the likelihood of the data to produce a more accurate estimate compared to traditional methods.

3. How is the posterior mean estimator calculated?

The posterior mean estimator is calculated by integrating the parameter against the posterior density: E(\theta | X) = \int \theta f(\theta | X) d\theta. In conjugate settings, such as a normal likelihood with known variance paired with a normal prior, this reduces to a precision-weighted average of the prior mean and the sample mean, with weights proportional to the prior precision and the total data precision respectively.
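For the conjugate normal case with known variance, the precision-weighted average can be sketched as follows (a minimal illustration; the function name and the sample numbers are made up):

```python
def normal_posterior_mean(mu0, tau2, xbar, sigma2, n):
    """Posterior mean for a N(mu0, tau2) prior and n observations with
    sample mean xbar drawn from N(theta, sigma2), sigma2 known."""
    w_prior = 1.0 / tau2  # prior precision
    w_data = n / sigma2   # total data precision
    return (w_prior * mu0 + w_data * xbar) / (w_prior + w_data)

# With a N(0, 1) prior and 4 unit-variance observations averaging 2.0,
# the data carry 4x the weight of the prior: (0*1 + 2*4) / (1 + 4)
print(normal_posterior_mean(0.0, 1.0, 2.0, 1.0, 4))  # → 1.6
```

As n grows, w_data dominates and the posterior mean approaches the sample mean, which matches the intuition that the prior matters less with more data.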

4. What are the advantages of using the posterior mean estimator?

One advantage of using the posterior mean estimator is that it combines prior knowledge with the data, which can yield a more accurate estimate than methods that ignore the prior. The full posterior also provides a measure of uncertainty in the form of a credible interval, which can be useful in decision making.

5. Are there any limitations to using the posterior mean estimator?

One limitation of the posterior mean estimator is that it relies heavily on the choice of prior distribution. If the prior is poorly informed or overly subjective, it can lead to biased estimates. Additionally, for complex models with high-dimensional data, computing the posterior mean can become computationally intensive.
