How Does Bayesian Estimation Determine Point Estimates with Prior Distributions?

Homework Help Overview

The discussion revolves around Bayesian estimation and point estimates using prior distributions. The original poster presents a problem involving the nth order statistic from a specific probability distribution and seeks to derive the Bayes solution for estimating a parameter based on given data and a prior distribution.

Discussion Character

  • Exploratory; conceptual clarification; mathematical reasoning

Approaches and Questions Raised

  • The original poster attempts to derive the posterior distribution and point estimate, but is uncertain about the integration process and the role of the observed data in the posterior.
  • Other participants question the correctness of the calculations, particularly the integration limits and the posterior's dependence on the observed data.

Discussion Status

Participants are actively engaging with the original poster's calculations, providing feedback and questioning specific steps in the reasoning. There is an acknowledgment of potential errors in the approach, and some participants are exploring the implications of the loss function in the context of the problem.

Contextual Notes

There are indications of confusion regarding the integration limits and the treatment of the prior distribution in relation to the observed data. The original poster also notes a discrepancy in the expected value derived from the posterior distribution.

rayge
Homework Statement
Let ##Y_n## be the ##n##th order statistic of a random sample of size ##n## from a distribution with pdf ##f(x|\theta)=1/\theta##, ##0 < x < \theta##, zero elsewhere. Take the loss function to be ##L(\theta, \delta(y))=[\theta-\delta(y_n)]^2##. Let ##\theta## be an observed value of the random variable ##\Theta##, which has the prior pdf ##h(\theta)=\frac{\beta \alpha^\beta}{\theta^{\beta + 1}}##, ##\alpha < \theta < \infty##, zero elsewhere, with ##\alpha > 0##, ##\beta > 0##. Find the Bayes solution ##\delta(y_n)## for a point estimate of ##\theta##.
The attempt at a solution
I've found that the conditional pdf of ##Y_n## given ##\theta## is:
$$\frac{n y_n^{n-1}}{\theta^n}$$
which allows us to find the posterior ##k(\theta|y_n)## by finding what it's proportional to:
$$k(\theta|y_n) \propto \frac{n y_n^{n-1}}{\theta^n}\cdot\frac{\beta \alpha^\beta}{\theta^{\beta + 1}}$$
Where I'm sketchy is that apparently we can just remove all terms not having to do with ##\theta##, come up with a fudge factor to make the distribution integrate to 1 over its support, and call it good. I end up with:
$$\frac{1}{\theta^{n+\beta}}$$
When I integrate from ##\alpha## to ##\infty## and solve for the fudge factor, I get ##(n+\beta)\alpha^{n+\beta}## as the scaling factor, so for my posterior I get:
$$(n+\beta)\alpha^{n+\beta}\,\frac{1}{\theta^{n+\beta}}$$
Which doesn't even have a ##y_n## term in it. Weird.

When I find the expected value of ##\theta## with this distribution, I get 1. Which isn't a very compelling point estimate. So I think I missed a ##y_n## somewhere but I don't know where. Any thoughts? Thanks in advance.
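The conditional pdf of ##Y_n## quoted above can be sanity-checked by simulation, since ##Y_n## is the maximum of ##n## iid ##\text{Uniform}(0,\theta)## draws and its cdf is ##(y/\theta)^n##. A minimal Monte Carlo sketch (the values n = 5, θ = 2, y = 1.5 are arbitrary illustration choices, not from the problem):

```python
import random

random.seed(0)

n, theta = 5, 2.0          # arbitrary illustration values
N = 200_000                # number of simulated samples

# Y_n = max of n iid Uniform(0, theta) draws; its cdf is (y/theta)^n
ys = [max(random.uniform(0, theta) for _ in range(n)) for _ in range(N)]

y = 1.5
empirical_cdf = sum(1 for v in ys if v <= y) / N
exact_cdf = (y / theta) ** n

print(abs(empirical_cdf - exact_cdf) < 0.01)
```

The empirical cdf of the simulated maxima agrees with ##(y/\theta)^n## to within Monte Carlo noise, which confirms the density ##n y_n^{n-1}/\theta^n## is the right starting point.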
 
Not an area I'm familiar with, so can't help with your specific question, but one thing does look wrong to me: if you substitute n=0 in your answer, shouldn't you get h(θ)? The power of theta seems to be one off.
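Spelling out that consistency check (a sketch): with no data (##n = 0##) the posterior should reduce to the prior ##h(\theta)##, but substituting ##n = 0## into the proposed posterior gives an exponent that is one too small:

```latex
% Substitute n = 0 (no data) into the proposed posterior:
(n+\beta)\,\alpha^{n+\beta}\,\theta^{-(n+\beta)}\Big|_{n=0}
   = \frac{\beta\,\alpha^{\beta}}{\theta^{\beta}}
\qquad\text{vs.}\qquad
h(\theta) = \frac{\beta\,\alpha^{\beta}}{\theta^{\beta+1}}
```

Carrying the prior's ##\theta^{-(\beta+1)}## through the product ##\theta^{-n}\cdot\theta^{-(\beta+1)}## gives a kernel ##\theta^{-(n+\beta+1)}##, which does reduce to the prior at ##n = 0##.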
 
I wrote a whole reply here while totally missing what you were saying. Thanks for the response! I'll check it out.
 

I'm not sure how the loss-function business enters into the calculation, but you seem to be trying to compute the Bayesian posterior density of ##\theta##, given ##Y_n = y_n##. You have made an error in that. Below, I will use ##Y,y## instead of ##Y_n,y_n##, ##C,c## instead of ##\Theta, \theta## and ##a,b## instead of ##\alpha, \beta##---just to make typing easier.

Using the given prior density, the joint density of ##(y,c)## is
$$f_{Y,C}(y,c) = \frac{b a^b}{c^{b+1}} \cdot \frac{n y^{n-1}}{c^n}, \quad 0 < y < c,\ a < c < \infty.$$
The (prior) density of ##Y## is ##f_Y(y) = \int f_{Y,C}(y,c) \, dc##, but you need to be careful about integration limits. For ##0 < y < a## we have
$$f_Y(y) = \int_{c=a}^{\infty} f_{Y,C}(y,c) \, dc = \frac{n b y^{n-1}}{a^n (b+n)}, \quad 0 < y < a.$$
For ##y > a## we have
$$f_Y(y) = \int_{c=y}^{\infty} f_{Y,C}(y,c) \, dc = \frac{n b a^b}{y^{b+1}(b+n)}, \quad y > a.$$
Thus, the posterior density of ##C## will depend on ##y##, since the denominator in ##f(c|y) = f_{Y,C}(y,c)/f_Y(y)## has two different forms for ##y < a## and ##y > a##.
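Those two closed forms for ##f_Y(y)## can be checked numerically (a minimal sketch; the values n = 3, a = 2, b = 1.5 and the test points y = 1 and y = 5 are arbitrary). The substitution ##c = L/u## maps the infinite range ##(L, \infty)## to ##u \in (0, 1)## so a simple midpoint rule suffices:

```python
def joint(y, c, n, a, b):
    # Joint density f_{Y,C}(y,c) = prior(c) * conditional pdf of Y given c,
    # nonzero only on 0 < y < c, a < c
    if not (0 < y < c and c > a):
        return 0.0
    prior = b * a**b / c**(b + 1)
    conditional = n * y**(n - 1) / c**n
    return prior * conditional

def marginal(y, n, a, b, steps=100_000):
    # f_Y(y) = integral of f_{Y,C}(y,c) dc over c > L = max(a, y);
    # substitute c = L/u (dc = -L/u^2 du) and integrate u over (0, 1)
    L = max(a, y)
    du = 1.0 / steps
    total = 0.0
    for i in range(steps):
        u = (i + 0.5) * du          # midpoint rule
        c = L / u
        total += joint(y, c, n, a, b) * (L / u**2) * du
    return total

n, a, b = 3, 2.0, 1.5

# Closed forms derived above
y1 = 1.0                                      # case 0 < y < a
f1 = n * b * y1**(n - 1) / (a**n * (b + n))
y2 = 5.0                                      # case y > a
f2 = n * b * a**b / (y2**(b + 1) * (b + n))

print(abs(marginal(y1, n, a, b) - f1) < 1e-6)
print(abs(marginal(y2, n, a, b) - f2) < 1e-6)
```

Both branches match the numerical integral, which also makes the point concrete: the normalizing constant, and hence the posterior ##f(c|y)##, genuinely depends on ##y##.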
 
