How Does Bayesian Estimation Determine Point Estimates with Prior Distributions?

SUMMARY

This discussion focuses on Bayesian estimation for point estimates using prior distributions. The problem involves the nth order statistic Y_n from a distribution with a probability density function (pdf) f(x|θ) = 1/θ for 0 < x < θ. The prior pdf is given as h(θ) = (βα^β) / (θ^(β + 1)), where α < θ < ∞, α > 0, and β > 0. The participants analyze the posterior distribution k(θ|y_n) and identify issues with the integration limits and the dependence of the posterior on the observed value y_n.

PREREQUISITES
  • Understanding of Bayesian statistics and point estimation
  • Familiarity with probability density functions (pdfs)
  • Knowledge of order statistics in statistical analysis
  • Experience with integration techniques in probability theory
NEXT STEPS
  • Study Bayesian inference and its applications in statistical modeling
  • Learn about order statistics and their significance in statistical estimation
  • Explore the concept of loss functions in Bayesian decision theory
  • Investigate the derivation of posterior distributions in Bayesian statistics
USEFUL FOR

Statisticians, data scientists, and researchers involved in Bayesian analysis and point estimation methodologies will benefit from this discussion.

rayge
Homework Statement
Let ##Y_n## be the nth order statistic of a random sample of size ##n## from a distribution with pdf ##f(x|\theta)=1/\theta##, ##0 < x < \theta##, zero elsewhere. Take the loss function to be ##L(\theta, \delta(y))=[\theta-\delta(y_n)]^2##. Let ##\theta## be an observed value of the random variable ##\Theta##, which has the prior pdf ##h(\theta)=\frac{\beta \alpha^\beta}{\theta^{\beta + 1}}##, ##\alpha < \theta < \infty##, zero elsewhere, with ##\alpha > 0##, ##\beta > 0##. Find the Bayes solution ##\delta(y_n)## for a point estimate of ##\theta##.
The attempt at a solution
I've found that the conditional pdf of ##Y_n## given ##\theta## is
$$\frac{n y_n^{n-1}}{\theta^n},$$
which allows us to find the posterior ##k(\theta|y_n)## by finding what it's proportional to:
$$k(\theta|y_n) \propto \frac{n y_n^{n-1}}{\theta^n}\cdot\frac{\beta \alpha^\beta}{\theta^{\beta + 1}}.$$
Where I'm sketchy is that apparently we can just remove all terms not having to do with ##\theta##, come up with a fudge factor to make the distribution integrate to 1 over its support, and call it good. I end up with:
$$\frac{1}{\theta^{n+\beta}}$$
When I integrate from ##\alpha## to ##\infty## and solve for the fudge factor, I get ##(n+\beta)\alpha^{n+\beta}## as the scaling factor, so for my posterior I get:
$$(n+\beta)\alpha^{n+\beta}\frac{1}{\theta^{n+\beta}}$$
which doesn't even have a ##y_n## term in it. Weird.

When I find the expected value of ##\theta## with this distribution, I get 1, which isn't a very compelling point estimate. So I think I missed a ##y_n## somewhere, but I don't know where. Any thoughts? Thanks in advance.
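[Editorial note: one way to see that something must be off is a quick Monte Carlo check, sketched below with arbitrarily chosen parameter values. It draws ##\theta## from the Pareto prior, draws ##Y_n## given ##\theta##, and averages ##\theta## over samples whose ##Y_n## lands near a chosen value. If the posterior really carried no ##y_n##, these conditional averages would not move with ##y_0##.]

```python
import random

def simulate(alpha, beta, n, trials, seed=1):
    """Draw (y_n, theta) pairs: theta from the Pareto(alpha, beta) prior,
    then y_n as the maximum of n uniforms on (0, theta)."""
    rng = random.Random(seed)
    pairs = []
    for _ in range(trials):
        theta = alpha / rng.random() ** (1.0 / beta)  # inverse-CDF Pareto draw
        y = max(rng.uniform(0.0, theta) for _ in range(n))
        pairs.append((y, theta))
    return pairs

def cond_mean(pairs, y0, width=0.05):
    """Empirical E[theta | y_n within `width` of y0]."""
    sel = [theta for (y, theta) in pairs if abs(y - y0) < width]
    return sum(sel) / len(sel)

pairs = simulate(alpha=2.0, beta=3.0, n=5, trials=400_000)
for y0 in (1.0, 3.0, 5.0):
    print(y0, cond_mean(pairs, y0))
```

The conditional averages climb with ##y_0## once ##y_0## exceeds ##\alpha##, so the true posterior must depend on ##y_n##.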
 
Not an area I'm familiar with, so can't help with your specific question, but one thing does look wrong to me: if you substitute ##n=0## in your answer, shouldn't you get ##h(\theta)##? The power of ##\theta## seems to be one off.
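[Editorial note: that sanity test can be checked numerically, as sketched below with arbitrarily chosen values of ##\alpha##, ##\beta##, and ##\theta##. Plugging ##n=0## into the posterior from the first post should recover the prior ##h(\theta)##, but the two differ by exactly one power of ##\theta##.]

```python
# Check of the n = 0 sanity test; alpha, beta, and theta values are arbitrary.
alpha, beta = 2.0, 3.0

def prior(theta):
    """The Pareto prior h(theta) from the problem statement."""
    return beta * alpha ** beta / theta ** (beta + 1)

def claimed_posterior(theta, n):
    """The posterior as computed in the first post:
    (n + beta) * alpha^(n + beta) / theta^(n + beta)."""
    return (n + beta) * alpha ** (n + beta) / theta ** (n + beta)

for theta in (3.0, 4.0, 5.0):
    # With n = 0 the ratio comes out to theta itself: one power short.
    print(theta, claimed_posterior(theta, n=0) / prior(theta))
```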
 
I wrote a whole reply here while totally missing what you were saying. Thanks for the response! I'll check it out.
 
rayge said:
Which doesn't even have a ##y_n## term in it. Weird. [...] So I think I missed a ##y_n## somewhere, but I don't know where. Any thoughts?

I'm not sure how the loss-function business enters into the calculation, but you seem to be trying to compute the Bayesian posterior density of ##\theta##, given ##Y_n = y_n##. You have made an error in that. Below, I will use ##Y,y## instead of ##Y_n,y_n##, ##C,c## instead of ##\Theta, \theta## and ##a,b## instead of ##\alpha, \beta##---just to make typing easier.

Using the given prior density, the joint density of ##(y,c)## is
$$f_{Y,C}(y,c) = \frac{b a^b}{c^{b+1}} \cdot \frac{n y^{n-1}}{c^n}, \quad 0 < y < c, \; a < c < \infty.$$
The (prior) density of ##Y## is ##f_Y(y) = \int f_{Y,C}(y,c) \, dc##, but you need to be careful about integration limits. For ##0 < y < a## we have
$$f_Y(y) = \int_{c=a}^{\infty} f_{Y,C}(y,c) \, dc = \frac{n b y^{n-1}}{a^n (b+n)}, \quad 0 < y < a.$$
For ##y > a## we have
$$f_Y(y) = \int_{c=y}^{\infty} f_{Y,C}(y,c) \, dc = \frac{n b a^b}{y^{b+1}(b+n)}, \quad y > a.$$
Thus, the posterior density of ##C## will depend on ##y##, since the denominator in ##f(c|y) = f_{Y,C}(y,c)/f_Y(y)## has two different forms for ##y < a## and ##y > a##.
 
