
Bayesian inference

  1. May 27, 2010 #1
    1. The problem statement, all variables and given/known data
    An urn contains 6 balls, each of which is either black or white. Initially, the number of white balls W is distributed uniformly over {0, 1, ..., 6}. Two balls are drawn from the urn: one white and one black.

    Find the distribution of W (the number of white balls before the experiment) given this observation.

    3. The attempt at a solution

    So, initially, if p is the proportion of white balls, then f_p(p) = (1/7) 1{p = 0, 1/6, ..., 1}. This is the prior distribution of the parameter p.

    Given the sample x = (B, W), if I consider each element of the sample a Bernoulli variable with parameter p, then the likelihood function would be L(p|B,W) = p*(1-p).

    (Now, this would rule out the possibility of p being 0 or 1. I don't know how that translates into the posterior distribution of p...)

    Then, the posterior distribution of p would be:

    [tex]\pi (p|B,W) = \frac{{\frac{1}{7}p(1 - p)1\left\{ {p = 0,1/6,...,1} \right\}}}{{\sum\limits_p {\frac{1}{7}p(1 - p)1\left\{ {p = 0,1/6,...,1} \right\}} }}[/tex]

    My problem is that the marginal distribution of the sample (which I think is what the problem asks for) yields 5/36 1{p=0,1/6,...,1}.

    This is impossible, since the sample shows at least one white ball (so p can't be 0) and at least one black ball (so p can't be 1). Besides, that should be a uniform distribution over {0, 1/6, ..., 1}, and with 5/36 that is impossible (5/36 * 7 = 35/36 != 1).
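
    In code, the with-replacement computation above looks like this (just a quick sketch using Python's fractions module; it reproduces the 5/36 figure):

    [code]
    from fractions import Fraction

    # Prior: W uniform on {0, 1, ..., 6}, so p = W/6 is uniform on {0, 1/6, ..., 1}
    prior = Fraction(1, 7)

    # Bernoulli (with-replacement) likelihood of seeing one white and one black: p*(1-p)
    marginal = sum(prior * Fraction(w, 6) * (1 - Fraction(w, 6)) for w in range(7))

    print(marginal)  # 5/36
    [/code]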

    What am I doing wrong?
     
  3. May 27, 2010 #2

    D H


    You are doing at least a couple of things wrong here.
    1. You are implicitly assuming the posterior probability will be uniform. It won't.
    2. The observed event involves drawing two balls from the urn without replacement. You are not calculating that correctly.
    3. You are assuming the balls were drawn white and then black, in that order. While two balls were drawn, I don't think you should assume a specific ordering. (It doesn't matter in this particular problem, as the sketch below illustrates, but it certainly does in other cases.)
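
    To make points 2 and 3 concrete, here is a quick numerical sketch (Python with exact fractions; the function names are just illustrative). It computes the without-replacement likelihood of the observed draw for a fixed order and for either order; the two differ only by a constant factor of 2, which cancels when you normalise, so the posterior is the same either way:

    [code]
    from fractions import Fraction

    def p_white_then_black(w):
        # Ordered draw without replacement: white first (w of 6), then black (6-w of 5)
        return Fraction(w, 6) * Fraction(6 - w, 5)

    def p_one_of_each(w):
        # Unordered event "one white and one black": either order, so twice the above
        return 2 * p_white_then_black(w)

    for w in range(7):
        print(w, p_white_then_black(w), p_one_of_each(w))
    [/code]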
     
  4. May 28, 2010 #3
    Oh, you're right: there's no replacement.

    So, should I treat the X's as Bernoulli variables or as hypergeometric variables?

    I mean, if now p is the number of white balls in the urn, the sample yields (p/6)*(1 - p/5)?
     
  5. May 28, 2010 #4

    D H


    Almost. Suppose that a white ball then a black ball are drawn out of the urn. The probability of having drawn the white ball is indeed p/6. However, the probability of drawing a black ball on the second draw is not 1-p/5.
     
  6. May 28, 2010 #5
    I mean 1 - (p-1)/5. Is that it?
     
  7. May 29, 2010 #6

    D H


    Correct.
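
    Spelled out: once the white ball has been removed, 5 balls remain, of which p - 1 are white and 6 - p are black, so

    [tex]P(\text{black second} \mid \text{white first},\, W = p) = \frac{6 - p}{5} = 1 - \frac{p - 1}{5}[/tex]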
     
  8. May 29, 2010 #7
    So the likelihood function is (p/6)*(1 - (p-1)/5) 1{p = 1, ..., 5}. The prior distribution of p is then (1/7) 1{p = 0, 1, ..., 6}.

    Then

    [tex]\pi (p|B,W) = \frac{{\frac{1}{7} \cdot \frac{p}{6}\left( {1 - \frac{{p - 1}}{5}} \right)1\left\{ {p = 1,...,5} \right\}}}{{\sum\limits_{p = 1}^5 {\frac{1}{7} \cdot \frac{p}{6}\left( {1 - \frac{{p - 1}}{5}} \right)} }}[/tex]

    This I solved as:

    [tex]\pi (p|B,W) = \frac{6}{7} \cdot \frac{p}{6}\left( {1 - \frac{{p - 1}}{5}} \right)1\left\{ {p = 1,...,5} \right\}[/tex]

    So that's the posterior distribution of the number of white balls at the beginning of the experiment. Is that correct?
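
    For what it's worth, here is a quick numerical check of that posterior (a sketch in Python with exact fractions; variable names are just illustrative). It confirms the normalising constant and that the distribution sums to 1:

    [code]
    from fractions import Fraction

    # Prior on W, the initial number of white balls: uniform on {0, 1, ..., 6}
    prior = {w: Fraction(1, 7) for w in range(7)}

    # Likelihood of drawing a white ball and then a black ball, without replacement
    def likelihood(w):
        return Fraction(w, 6) * Fraction(6 - w, 5)

    # Bayes' rule
    evidence = sum(prior[w] * likelihood(w) for w in range(7))
    posterior = {w: prior[w] * likelihood(w) / evidence for w in range(7)}

    print(evidence)  # 1/6
    for w in range(7):
        # Same as (6/7)*(w/6)*(1 - (w-1)/5), i.e. w*(6 - w)/35
        print(w, posterior[w], Fraction(w * (6 - w), 35))
    print(sum(posterior.values()))  # 1
    [/code]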
     