Bayesian Inference: Finding Distribution of W Before Experiment

  • Thread starter: libelec
  • Tags: Bayesian
SUMMARY

The discussion focuses on Bayesian inference to determine the distribution of the number of white balls (W) in an urn of 6 balls, after one white and one black ball are drawn without replacement. The prior on the number of white balls p is uniform over {0, 1, ..., 6}. The likelihood function is derived from the draws, leading to the posterior \pi(p|BW) = \frac{\frac{1}{7}\cdot\frac{p}{6}\left(1-\frac{p-1}{5}\right)1\{p=1,...,5\}}{\sum_p \frac{1}{7}\cdot\frac{p}{6}\left(1-\frac{p-1}{5}\right)}. The final posterior distribution is confirmed as \frac{6}{7}\cdot\frac{p}{6}\left(1-\frac{p-1}{5}\right)1\{p=1,...,5\}, correcting earlier misconceptions about the posterior being uniform and about sampling with replacement.

PREREQUISITES
  • Understanding of Bayesian inference principles
  • Familiarity with likelihood functions in statistics
  • Knowledge of Bernoulli and hypergeometric distributions
  • Basic concepts of prior and posterior distributions
NEXT STEPS
  • Study the derivation of likelihood functions in Bayesian statistics
  • Learn about hypergeometric distributions and their applications
  • Explore advanced Bayesian inference techniques using software like PyMC3 or Stan
  • Investigate the implications of sampling without replacement in probability theory
USEFUL FOR

Statisticians, data scientists, and students studying Bayesian statistics who are interested in understanding the application of Bayesian inference to real-world problems involving sampling and distributions.

libelec

Homework Statement


An urn contains 6 balls, each of which is either black or white. Initially, the number of white balls W is distributed uniformly over {0, 1, ..., 6}. Two balls are drawn from the urn: one white and one black.

Find the distribution of W before the experiment.

The Attempt at a Solution



So, initially, if p is the proportion of white balls, then f_p(p) = (1/7) 1{p = 0, 1/6, ..., 1}. This is the prior distribution of the parameter p.

Given the sample x=(B,W), if I consider each element of the sample a Bernoulli variable of parameter p, then the likelihood function would be L(p|B,W) = p*(1-p).

(Now, this would rule out the possibility of p being 0 or 1. I don't know how that translates into the posterior distribution of p...)

Then, the posterior distribution of p would be:

\pi (p|B,W) = \frac{\frac{1}{7}\,p(1 - p)\,1\left\{p = 0, 1/6, ..., 1\right\}}{\sum\limits_p \frac{1}{7}\,p(1 - p)\,1\left\{p = 0, 1/6, ..., 1\right\}}

My problem is that the marginal distribution of the sample (which I think is what the problem asks for) yields 5/36 1{p=0,1/6,...,1}.

This is impossible, since the sample shows at least one white ball (so p can't be 0) and one black ball (so p can't be 1). Besides, that would have to be a uniform distribution over {p = 0, 1/6, ..., 1}, and with 5/36 at each point that is impossible (5/36 · 7 = 35/36 ≠ 1).

What am I doing wrong?
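
(Added note, not part of the original post: a quick numerical check of the Bernoulli model above, using exact fractions. It reproduces the 5/36, which is the marginal probability of the sample — a single normalizing constant, not a distribution over p — and shows the resulting posterior is not uniform.)

```python
from fractions import Fraction

# Prior: p takes the values k/6 for k = 0, ..., 6, each with probability 1/7.
support = [Fraction(k, 6) for k in range(7)]
prior = {p: Fraction(1, 7) for p in support}

# Bernoulli likelihood of one white and one black draw: p * (1 - p).
likelihood = {p: p * (1 - p) for p in support}

# Marginal probability of the sample: a single number, not a function of p.
marginal = sum(prior[p] * likelihood[p] for p in support)
print(marginal)  # 5/36

# Posterior: prior * likelihood / marginal -- clearly not uniform.
posterior = {p: prior[p] * likelihood[p] / marginal for p in support}
for p in support:
    print(p, posterior[p])  # p = 0 and p = 1 get posterior probability 0
```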
 
You are doing at least a couple of things wrong here.
  1. You are implicitly assuming the posterior probability will be uniform. It won't.
  2. The observed event involves drawing two balls from the urn without replacement. You are not calculating that correctly.
  3. You are assuming the balls were drawn white and then black, in that order. While two balls were drawn, I don't think you should assume a specific ordering. (It doesn't matter in this particular problem, but it certainly does in other cases.)
 
Oh, you're right: there's no replacement.

So, should I treat the X's as Bernoulli variables or as hypergeometric variables?

I mean, if now p is the number of white balls in the urn, the likelihood of the sample is (p/6)*(1 - p/5)?
 
Almost. Suppose that a white ball then a black ball are drawn out of the urn. The probability of having drawn the white ball is indeed p/6. However, the probability of drawing a black ball on the second draw is not 1-p/5.
 
I mean 1 - (p-1)/5. Is that it?
 
Correct.
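
(Spelling out the counting behind this step, as an added note: after the white ball is removed, 5 balls remain, p − 1 of them white and 6 − p black, so)

P(\text{black second} \mid \text{white first}) = \frac{6 - p}{5} = 1 - \frac{p - 1}{5}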
 
So the likelihood function is (p/6)*(1 - (p-1)/5) 1{p = 1, ..., 5}, and the prior distribution of p is 1/7 1{p = 0, 1, ..., 6}.

Then \pi(p|B,W) = \frac{\frac{1}{7}\cdot\frac{p}{6}\left(1 - \frac{p-1}{5}\right)1\left\{p = 1, ..., 5\right\}}{\sum\limits_p \frac{1}{7}\cdot\frac{p}{6}\left(1 - \frac{p-1}{5}\right)}

This I solved as: \pi(p|B,W) = \frac{6}{7}\cdot\frac{p}{6}\left(1 - \frac{p-1}{5}\right)1\left\{p = 1, ..., 5\right\}

So, that's the posterior distribution of the number of white balls at the beginning of the experiment, right? Is that correct?
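
(An added check, not from the original thread: a minimal Python sketch that recomputes this posterior from the uniform prior and the without-replacement likelihood, and confirms it matches the \frac{6}{7}\cdot\frac{p}{6}\left(1 - \frac{p-1}{5}\right) form above. The normalizing constant 1/6 is the marginal probability of drawing white then black in that order.)

```python
from fractions import Fraction

# p = number of white balls in the urn (0, ..., 6), uniform prior.
prior = {p: Fraction(1, 7) for p in range(7)}

# Without-replacement likelihood of drawing white then black:
# (p/6) * (6 - p)/5, i.e. (p/6) * (1 - (p - 1)/5).
def likelihood(p):
    return Fraction(p, 6) * Fraction(6 - p, 5)

unnormalized = {p: prior[p] * likelihood(p) for p in prior}
marginal = sum(unnormalized.values())  # comes out to 1/6
posterior = {p: w / marginal for p, w in unnormalized.items()}

# Compare with the closed form 6/7 * (p/6) * (1 - (p - 1)/5).
for p in range(7):
    closed_form = Fraction(6, 7) * Fraction(p, 6) * (1 - Fraction(p - 1, 5))
    assert posterior[p] == closed_form
    print(p, posterior[p])
```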
 
