Conditional Probability Distribution

Summary
The discussion focuses on deriving the conditional probability distribution from a joint probability mass function (PMF) expressed as a product of individual PMFs. Participants are trying to understand how the condition that the sum of the random variables equals a specific value constrains the joint distribution, especially when the parameters are distinct. There is confusion about how to compute the joint distribution under this constraint and whether the condition affects the PMF. The importance of recognizing that the parameters do not directly influence the values of the random variables is emphasized. Exploring simpler cases with fewer variables is suggested as a way to clarify the concepts involved.
SP90

Homework Statement



[Problem statement given in the attached image (Screen Shot 2012-05-31 at 23.41.23.png). From the discussion: N_1, \ldots, N_k are independent Poisson random variables with parameters \lambda_1, \ldots, \lambda_k, and the task is to find the conditional distribution of (N_1, \ldots, N_k) given that \sum_{i=1}^{k} N_i = n.]


The Attempt at a Solution



I have that the joint probability mass function would be

\prod_{i=1}^{k} \frac{\lambda_{i}^{n_{i}}}{n_{i}!} e^{-\lambda_{i}}

How would I go about applying the condition \sum_{i=1}^{k} N_i = n to get the conditional distribution?
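For concreteness, a minimal numerical sketch of this product form (purely illustrative, assuming independent Poisson counts; the rates and the tuple below are arbitrary example values):

Code:
# Joint pmf of independent Poisson counts as a product of the individual pmfs.
import numpy as np
from scipy.stats import poisson

lam = [0.7, 1.3, 2.0]   # example rates lambda_1, ..., lambda_k
ns = [1, 0, 3]          # an example tuple (n_1, ..., n_k)

# P(N_1 = n_1, ..., N_k = n_k) = prod_i lambda_i^{n_i} e^{-lambda_i} / n_i!
joint = np.prod([poisson.pmf(ni, li) for ni, li in zip(ns, lam)])
print(joint)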
 

Attachments

  • Screen Shot 2012-05-31 at 23.41.23.png
    Screen Shot 2012-05-31 at 23.41.23.png
    17 KB · Views: 738
SP90 said:

I have that the joint probability mass function would be

\prod_{i=1}^{k} \frac{\lambda_{i}^{n_{i}}}{n_{i}!} e^{-\lambda_{i}}

How would I go about applying the condition \sum_{i=1}^{k} N_i = n to get the conditional distribution?

You need to compute
P(N_1 = n_1, N_2 = n_2, \ldots, N_k = n_k \mid \sum_{i=1}^k N_i = n) = \frac{P(N_1 = n_1, N_2 = n_2, \ldots, N_k = n_k \;\&\; \sum_{i=1}^k N_i = n)}{P(\sum_{i=1}^k N_i = n)} for any k-tuple (n_1, n_2, \ldots, n_k) that sums to n. Can you simplify the "event" in the numerator probability? Can you compute the denominator probability?
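A standard fact that may help with the denominator (assuming the N_i are independent Poisson random variables with parameters \lambda_i, which is what the product form of your joint pmf encodes) is that their sum is again Poisson:

\sum_{i=1}^k N_i \sim \text{Poisson}\left(\sum_{i=1}^k \lambda_i\right), \qquad P\left(\sum_{i=1}^k N_i = n\right) = \frac{\left(\sum_{i=1}^k \lambda_i\right)^{n}}{n!}\, e^{-\sum_{i=1}^k \lambda_i}.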

RGV
 
I don't understand how the condition \sum_{i=1}^k N_i = n constrains the joint distribution, given that each \lambda_{i} can be distinct.

Also, to evaluate P(\sum_{i=1}^k N_i = n), wouldn't this be the joint distribution, evaluated at N_{1}=n-\sum_{i=2}^{k}N_{i} and then N_{2}=n-\sum_{i=3}^{k}N_{i} and so on? I feel that's wrong because it doesn't seem like it'd collapse down nicely.
 
SP90 said:
I don't understand how the condition \sum_{i=1}^k N_i = n constrains the joint distribution, given that each \lambda_{i} can be distinct.

Also, to evaluate P(\sum_{i=1}^k N_i = n), wouldn't this be the joint distribution, evaluated at N_{1}=n-\sum_{i=2}^{k}N_{i} and then N_{2}=n-\sum_{i=3}^{k}N_{i} and so on? I feel that's wrong because it doesn't seem like it'd collapse down nicely.

The \lambda_i have absolutely nothing to do with the n_i; if you think they do, then you seriously misunderstand the material. The formula you wrote for the joint pmf is P(N_1 = n_1, \ldots, N_k = n_k). The only role of the \lambda_i is to control the size of this probability, but that has nothing to do with the n_i values themselves.
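To see this numerically (a purely illustrative sketch; the rates and the total below are arbitrary example values): the joint pmf is a function of whatever tuple (n_1, \ldots, n_k) you plug in, with the \lambda_i only setting the probability values, and summing it over all tuples with total n gives exactly the denominator P(\sum_{i=1}^k N_i = n), which matches the pmf of a Poisson(\sum_i \lambda_i) variable at n.

Code:
# Brute-force check: summing the joint pmf over all tuples with a fixed total
# reproduces the pmf of a Poisson(sum of rates) variable at that total.
import itertools
import numpy as np
from scipy.stats import poisson

lam = [0.7, 1.3, 2.0]   # distinct example rates
n = 4                   # example value of the conditioning sum

def joint_pmf(ns, lam):
    """P(N_1 = n_1, ..., N_k = n_k) as a product of Poisson pmfs."""
    return np.prod([poisson.pmf(ni, li) for ni, li in zip(ns, lam)])

# Enumerate every nonnegative tuple (n_1, ..., n_k) with n_1 + ... + n_k = n.
tuples = [t for t in itertools.product(range(n + 1), repeat=len(lam))
          if sum(t) == n]
lhs = sum(joint_pmf(t, lam) for t in tuples)   # P(sum of N_i = n) by enumeration
rhs = poisson.pmf(n, sum(lam))                 # Poisson(lambda_1 + ... + lambda_k) pmf at n
print(lhs, rhs)                                # agree up to floating-point error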

RGV
 
But because the joint PMF is a product of the individual PMFs, the term becomes \lambda_{1}^{n_{1}}\lambda_{2}^{n_{2}}\lambda_{3}^{n_{3}}\cdots, which means the term \sum_{i=1}^k N_i = n doesn't appear in the joint PMF (although it would if the \lambda_{i} were equal), so I don't understand how the condition \sum_{i=1}^k N_i = n is applied to give P(N_1 = n_1, N_2 = n_2, \ldots, N_k = n_k \;\&\; \sum_{i=1}^k N_i = n).
 
SP90 said:
But because the joint PMF is a product of the individual PMFs, the term becomes \lambda_{1}^{n_{1}}\lambda_{2}^{n_{2}}\lambda_{3}^{n_{3}}\cdots, which means the term \sum_{i=1}^k N_i = n doesn't appear in the joint PMF (although it would if the \lambda_{i} were equal), so I don't understand how the condition \sum_{i=1}^k N_i = n is applied to give P(N_1 = n_1, N_2 = n_2, \ldots, N_k = n_k \;\&\; \sum_{i=1}^k N_i = n).

Try first to figure out what happens when k = 2, then maybe k = 3. You should see a pattern emerging.
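For instance, a sketch of the k = 2 case (using the standard fact that N_1 + N_2 \sim \text{Poisson}(\lambda_1 + \lambda_2)): for any n_1 with 0 \le n_1 \le n,

P(N_1 = n_1 \mid N_1 + N_2 = n) = \frac{\frac{\lambda_1^{n_1}}{n_1!}e^{-\lambda_1}\,\frac{\lambda_2^{\,n-n_1}}{(n-n_1)!}e^{-\lambda_2}}{\frac{(\lambda_1+\lambda_2)^{n}}{n!}e^{-(\lambda_1+\lambda_2)}} = \binom{n}{n_1}\left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^{n_1}\left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-n_1},

a binomial distribution in n_1.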

RGV
 
