Bayesian Inference: Finding Distribution of W Before Experiment

  • Thread starter: libelec
  • Tags: Bayesian
In summary: with 6 balls in the urn and the number of white balls initially uniform, the prior is 1/7 · 1{p = 0, 1, ..., 6}, where p denotes the number of white balls. After drawing one white and one black ball without replacement, the posterior becomes (6/7) · (p/6) · (1 − (p−1)/5) · 1{p = 1, ..., 5}. This is the distribution of the number of white balls at the beginning of the experiment, updated by the observed draw.
  • #1
libelec

Homework Statement


An urn contains 6 balls, some black and some white. Initially, the number of white balls W is distributed uniformly in {0,1,...,6}. Two balls are taken from the urn: one white and one black.

Find the distribution of W before the experiment.

The Attempt at a Solution



So, initially, if p is the proportion of white balls, then fp(p) = 1/7 1{p=0,1/6,...,1}. This is the prior distribution of the parameter p.

Given the sample x=(B,W), if I consider each element of the sample a Bernoulli variable of parameter p, then the likelihood function would be L(p|B,W) = p*(1-p).

(Now, this would rule out the possibility of p being 0 or 1. I don't know how that translates into the posterior distribution of p...)

Then, the posterior distribution of p would be:

[tex]\pi (p|B,W) = \frac{{\frac{1}{7}p(1 - p)1\left\{ {p = 0,1/6,...,1} \right\}}}{{\sum\limits_p {\frac{1}{7}p(1 - p)1\left\{ {p = 0,1/6,...,1} \right\}} }}[/tex]

My problem is that the marginal distribution of the sample (which I think is what the problem asks for) yields 5/36 1{p=0,1/6,...,1}.

This is impossible, since the sample shows at least one white ball (p can't be 0) and one black ball (p can't be 1). Besides, that should be a uniform distribution over {p = 0, 1/6, ..., 1}, and with the 5/36 that is impossible (5/36 × 7 = 35/36 ≠ 1).
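The 5/36 figure can be reproduced exactly; a minimal sketch in Python (using `fractions` for exact arithmetic), under the with-replacement Bernoulli model described above:

```python
from fractions import Fraction

# With-replacement (Bernoulli) model from the attempt above:
# prior 1/7 on p in {0, 1/6, ..., 1}, likelihood L(p) = p*(1-p).
ps = [Fraction(k, 6) for k in range(7)]
prior = Fraction(1, 7)

# Marginal probability of the sample (B, W): a single number,
# the normalizing constant of Bayes' theorem.
marginal = sum(prior * p * (1 - p) for p in ps)
print(marginal)  # 5/36

# The posterior itself does sum to 1 (and is not uniform).
posterior = {p: prior * p * (1 - p) / marginal for p in ps}
print(sum(posterior.values()))  # 1
```

Note that 5/36 here is the marginal probability of the sample, a single normalizing constant, not a distribution over p.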

What am I doing wrong?
 
  • #2
You are doing at least a couple of things wrong here.
  1. You are implicitly assuming the posterior probability will be uniform. It won't.
  2. The observed event involves drawing two balls from the urn without replacement. You are not calculating that correctly.
  3. You are assuming the balls were drawn white and then black, in that order. While two balls were drawn, I don't think you should assume a specific ordering. (It doesn't matter in this particular problem, but it certainly does in other cases.)
 
  • #3
Oh, you're right: there's no replacement.

So, should I treat the X's as Bernoulli variables or as hypergeometric variables?

I mean, if now p is the number of white balls in the urn, the sample yields (p/6)*(1-p/5)?
 
  • #4
Almost. Suppose that a white ball then a black ball are drawn out of the urn. The probability of having drawn the white ball is indeed p/6. However, the probability of drawing a black ball on the second draw is not 1-p/5.
 
  • #5
I mean 1 - (p-1)/5. Is that it?
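The value 1 − (p − 1)/5 = (6 − p)/5 can be checked by brute-force enumeration of ordered draws (a sketch, treating the 6 balls as distinguishable):

```python
from fractions import Fraction
from itertools import permutations

# For each possible count p of white balls, enumerate all ordered
# draws of 2 balls and compute P(second is black | first is white).
for p in range(1, 6):
    balls = ['W'] * p + ['B'] * (6 - p)
    draws = list(permutations(balls, 2))  # 6*5 = 30 ordered pairs
    white_first = [d for d in draws if d[0] == 'W']
    black_second = [d for d in white_first if d[1] == 'B']
    assert Fraction(len(black_second), len(white_first)) == 1 - Fraction(p - 1, 5)
```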
 
  • #7
So the likelihood function is (p/6)*(1-(p-1)/5) 1{p=1,...,5}. The prior function of p is then 1/7 1{p=0,1,...,6}

Then the posterior is

[tex]\pi (p|B,W) = \frac{{\frac{1}{7} \cdot \frac{p}{6}\left( {1 - \frac{{p - 1}}{5}} \right)1\left\{ {p = 1,...,5} \right\}}}{{\sum\limits_p {\frac{1}{7} \cdot \frac{p}{6}\left( {1 - \frac{{p - 1}}{5}} \right)} }}[/tex]

This I solved as:

[tex]\pi (p|B,W) = \frac{6}{7} \cdot \frac{p}{6}\left( {1 - \frac{{p - 1}}{5}} \right)1\left\{ {p = 1,...,5} \right\}[/tex]

So, that's the posterior distribution of the number of white balls at the beginning of the experiment, right? Is that correct?
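A numerical check (a sketch, assuming the uniform prior and the without-replacement likelihood above) confirms both that this posterior sums to 1 and that 6/7 is the right normalizing factor:

```python
from fractions import Fraction

# Prior 1/7 on p = 0..6; likelihood (p/6)*(1-(p-1)/5) on p = 1..5.
prior = Fraction(1, 7)
lik = {p: Fraction(p, 6) * (1 - Fraction(p - 1, 5)) for p in range(1, 6)}

# Bayes' theorem: posterior = prior * likelihood / marginal.
marginal = sum(prior * l for l in lik.values())
posterior = {p: prior * lik[p] / marginal for p in lik}

# It matches the closed form 6/7 * (p/6) * (1-(p-1)/5) and sums to 1.
assert posterior == {p: Fraction(6, 7) * lik[p] for p in lik}
assert sum(posterior.values()) == 1
print(marginal)  # 1/6
```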
 

What is Bayesian Inference?

Bayesian Inference is a statistical method used to update our beliefs about a certain event or phenomenon based on new evidence or information. It involves using prior beliefs and data to calculate the probability of a hypothesis being true.

How does Bayesian Inference work?

Bayesian Inference uses Bayes' theorem, which states that the posterior probability of an event is equal to the prior probability multiplied by the likelihood of the data given the hypothesis, divided by the probability of the data. This allows us to update our prior beliefs based on new evidence or data.
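As an illustration of this update rule, here is a hypothetical diagnostic-test example (all numbers invented for the sketch): a condition with 1% prior prevalence, and a test that is positive 99% of the time when the condition is present but also 5% of the time when it is absent.

```python
from fractions import Fraction

# Hypothetical numbers, chosen only for illustration.
prior = Fraction(1, 100)                # P(condition)
p_pos_given_cond = Fraction(99, 100)    # likelihood of a positive test
p_pos_given_healthy = Fraction(5, 100)  # false-positive rate

# P(data): total probability of a positive test over both hypotheses.
p_pos = prior * p_pos_given_cond + (1 - prior) * p_pos_given_healthy

# Bayes' theorem: posterior = prior * likelihood / P(data).
posterior = prior * p_pos_given_cond / p_pos
print(posterior)  # 1/6
```

Even with a seemingly accurate test, the posterior probability is only 1/6, because the small prior dominates the update.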

What is the role of prior beliefs in Bayesian Inference?

Prior beliefs play a crucial role in Bayesian Inference as they serve as the starting point for our calculations. These beliefs can come from previous experiments, expert opinions, or even subjective beliefs. As we gather new evidence, our prior beliefs are updated to become our posterior beliefs.

How is the distribution of W determined in Bayesian Inference?

In Bayesian Inference, the distribution of W is determined by using Bayes' theorem to calculate the posterior probability of W given the data. This posterior distribution takes into account our prior beliefs and the likelihood of the data, and provides a more accurate estimation of W.

What are the advantages of using Bayesian Inference?

One of the main advantages of Bayesian Inference is its ability to incorporate prior beliefs into the analysis, making it a more flexible and intuitive approach compared to other statistical methods. It also allows for continuous updating of beliefs as new evidence is gathered, making it useful in scenarios where data is constantly evolving.
