Bayesian Inference: Finding Distribution of W Before Experiment

libelec

Homework Statement


An urn contains 6 balls, each either black or white. Initially, the number of white balls W is uniformly distributed on {0, 1, ..., 6}. Two balls are drawn from the urn: one turns out to be white and the other black.

Find the distribution of W (the number of white balls before the experiment) given this observation.

The Attempt at a Solution



So, initially, if p is the proportion of white balls, then f_p(p) = (1/7) · 1{p = 0, 1/6, ..., 1}. This is the prior distribution of the parameter p.

Given the sample x = (B, W), if I consider each element of the sample a Bernoulli variable with parameter p, then the likelihood function would be L(p | B, W) = p(1 - p).

(Now, this would rule out the possibility of p being 0 or 1. I don't know how that translates into the posterior distribution of p...)

Then, the posterior distribution of p would be:

\pi(p \mid B, W) = \frac{\frac{1}{7}\,p(1-p)\,1\{p = 0, 1/6, \dots, 1\}}{\sum\limits_p \frac{1}{7}\,p(1-p)\,1\{p = 0, 1/6, \dots, 1\}}

My problem is that the marginal distribution of the sample (which I think is what the problem asks for) yields 5/36 1{p=0,1/6,...,1}.

This is impossible! The sample shows at least one white ball (so p can't be 0) and one black ball (so p can't be 1). Besides, that should be a uniform distribution over {p = 0, 1/6, ..., 1}, and with the value 5/36 that is impossible (5/36 · 7 = 35/36 ≠ 1).

What am I doing wrong?
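
Here is a minimal numeric sketch in Python of the calculation as set up above (assuming the with-replacement Bernoulli likelihood p(1 - p) and the uniform prior on p ∈ {0, 1/6, ..., 1}), just to show where the 5/36 comes from:

```python
from fractions import Fraction as F

# Prior: W uniform on {0, ..., 6}, so p = W/6 takes the values 0, 1/6, ..., 1.
ps = [F(w, 6) for w in range(7)]
prior = F(1, 7)

# With-replacement (Bernoulli) likelihood of the sample (one white, one black): p(1 - p).
joint = {p: prior * p * (1 - p) for p in ps}

# Marginal probability of the sample: a single number (the normalizing constant),
# not a function of p.
marginal = sum(joint.values())
print(marginal)  # 5/36

# The resulting posterior over p, as exact fractions.
posterior = {p: joint[p] / marginal for p in ps}
print(posterior)
```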
 
You are doing at least a couple of things wrong here.
  1. You are implicitly assuming the posterior probability will be uniform. It won't.
  2. The observed event involves drawing two balls from the urn without replacement. You are not calculating that correctly.
  3. You are assuming the balls were drawn white and then black, in that order. While two balls were drawn, I don't think you should assume a specific ordering. (It doesn't matter in this particular problem, but it certainly does in other cases.)
 
Oh, you're right: there's no replacement.

So, should I treat the X's as Bernoulli variables or as hypergeometric variables?

I mean, if p is now the number of white balls in the urn, the sample yields (p/6)·(1 - p/5)?
 
Almost. Suppose that a white ball then a black ball are drawn out of the urn. The probability of having drawn the white ball is indeed p/6. However, the probability of drawing a black ball on the second draw is not 1-p/5.
 
I mean 1 - (p-1)/5. Is that it?
 
Correct.
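
Putting the two steps together as a check (still assuming the balls were drawn white first, then black): given p white balls among the 6,

P(\text{white, then black} \mid p) = \frac{p}{6}\left(1 - \frac{p-1}{5}\right) = \frac{p}{6}\cdot\frac{6-p}{5} = \frac{p(6-p)}{30}.

Ignoring the order would only multiply this by 2, and that constant factor cancels when the posterior is normalized, which is why the ordering doesn't matter in this particular problem.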
 
So the likelihood function is (p/6)(1 - (p-1)/5) · 1{p = 1, ..., 5}. The prior of p is then 1/7 · 1{p = 0, 1, ..., 6}.

Then \Pi(p \mid B, W) = \frac{\frac{1}{7}\,\frac{p}{6}\left(1 - \frac{p-1}{5}\right)1\{p = 1, \dots, 5\}}{\sum\limits_p \frac{1}{7}\,\frac{p}{6}\left(1 - \frac{p-1}{5}\right)}

This I solved as: \Pi(p \mid B, W) = \frac{6}{7}\,\frac{p}{6}\left(1 - \frac{p-1}{5}\right)1\{p = 1, \dots, 5\}

So that's the posterior distribution of the number of white balls at the beginning of the experiment. Is that correct?
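
For what it's worth, a short Python sketch that checks this final posterior numerically (assuming the without-replacement likelihood derived above); it confirms the result simplifies to p(6 - p)/35 on p = 1, ..., 5 and sums to 1:

```python
from fractions import Fraction as F

prior = {w: F(1, 7) for w in range(7)}  # number of white balls W uniform on {0, ..., 6}

def likelihood(w):
    # P(white then black, without replacement | W = w) = (w/6)(1 - (w-1)/5) = w(6-w)/30
    return F(w, 6) * (1 - F(w - 1, 5))

joint = {w: prior[w] * likelihood(w) for w in range(7)}
norm = sum(joint.values())          # marginal probability of the observed sample
posterior = {w: joint[w] / norm for w in range(7)}

# Matches (6/7)(w/6)(1 - (w-1)/5) = w(6 - w)/35, supported on w = 1, ..., 5, summing to 1.
assert all(posterior[w] == F(w * (6 - w), 35) for w in range(7))
assert sum(posterior.values()) == 1
for w in range(7):
    print(w, posterior[w])  # prints, e.g., "3 9/35"
```

Using fractions.Fraction keeps everything exact, so the comparison against p(6 - p)/35 is an equality check rather than a floating-point approximation.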
 