Probability: What is the conditional distribution of X?


Homework Help Overview

The discussion revolves around the conditional distribution of a random variable \( X \) given that the sum of two independent Poisson-distributed random variables \( X \) and \( Y \) equals \( m \). The parameters of the distributions are \( \lambda_1 \) and \( \lambda_2 \) respectively.

Discussion Character

  • Conceptual clarification, Mathematical reasoning, Assumption checking

Approaches and Questions Raised

  • Participants explore the relationship between the joint distribution of \( X \) and \( Y \) and the conditional distribution \( P(X=x|X+Y=m) \). There are attempts to derive expressions for the marginal distribution and to simplify sums involving Poisson parameters. Questions arise regarding the correctness of initial formulations and the use of Jacobians in a discrete context.

Discussion Status

Multiple interpretations of the problem are being explored, with some participants providing guidance on correcting initial mistakes. There is an ongoing examination of the relationships between the probabilities involved, and suggestions for expressing events in terms of intersections and unions are noted. Participants are actively engaging with the mathematical details and seeking clarification on specific points.

Contextual Notes

There are discussions about the appropriateness of using continuous methods for discrete random variables, and some participants question the assumptions made in the original problem setup. The conversation reflects a mix of correct reasoning and misunderstandings that are being addressed collaboratively.

sanctifier

Homework Statement



If the random variables ##X## and ##Y## are independent and Poisson distributed with parameters ##\lambda_1## and ##\lambda_2## respectively, what is the conditional distribution of ##X## given that ##X + Y = m##?

Homework Equations



Poisson distribution with parameter ##\lambda##: ##P(x)= \frac{\lambda^x}{x!} e^{-\lambda}##

The Attempt at a Solution



Answer:

Because ##P(X=x|X+Y=m) = \frac{P(X=x \cup X+Y=m)}{P(X+Y=m)}##

Let ##\begin{cases}u=x \\ v=x+y \end{cases}## then ##\begin{cases}x=u \\ y=v-u \end{cases}##

##\text{Jacobian} = \begin{vmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{vmatrix} = \begin{vmatrix} 1 & 0 \\ -1 & 1 \end{vmatrix} = 1##

The joint distribution of ##X## and ##Y## is ##g(x,y)= \frac{\lambda_1^x}{x!} \frac{\lambda_2^y}{y!}e^{-\lambda_1 - \lambda_2}##

Then ##f(u,v)=g(u,v-u)|\text{Jacobian}|=\frac{\lambda_1^u}{u!} \frac{\lambda_2^{v-u}}{(v-u)!}e^{-\lambda_1 - \lambda_2}##

Hence the marginal distribution of ##v## is:

##f_v(v)=\sum_{u=0}^{m}f(u,v)=e^{-\lambda_1-\lambda_2}\lambda_2^v \sum_{u=0}^{m} {\left( \frac{\lambda_1}{\lambda_2} \right)}^u \frac{1}{u!(v-u)!}##

##P(X=x|X+Y=m)=P(U=u|V=m)= \frac{f(u,m)}{f_v(m)}##

Does the sum ##\sum_{u=0}^{m} {\left( \frac{\lambda_1}{\lambda_2} \right)}^u \frac{1}{u!(v-u)!}## have a concise form?

Is the answer correct? Thank you in advance!
 
sanctifier said:
Does the sum ##\sum_{u=0}^{m} {\left( \frac{\lambda_1}{\lambda_2} \right)}^u \frac{1}{u!(v-u)!}## have a concise form?

Is the answer correct? Thank you in advance!

Notice that ## \frac{1}{u!(v-u)!} = \frac{1}{v!} {v \choose u} ##.

When finding the marginal, you should be summing over ##u## to infinity, which here just means summing up through ##u = v## (since ##{v \choose u}## is zero thereafter).

Therefore your sum is: ## \frac{1}{v!} \sum_{u=0}^{v} {v \choose u} {(\frac{\lambda_1}{\lambda_2})}^u 1^{v-u} = \frac{1}{v!}(\frac{\lambda_1}{\lambda_2} + 1)^v ##.

Here I used the binomial formula. We can further simplify this though.

## \frac{1}{v!}(\frac{\lambda_1}{\lambda_2} + 1)^v = \frac{1}{v!} (\frac{\lambda_1 + \lambda_2}{\lambda_2})^v ##.

Putting this into your formula for the marginal of ##X+Y##, we see that ##X+Y## is Poisson with parameter ##\lambda_1 + \lambda_2##. In fact this holds for any finite sum of independent Poisson random variables, and you have just proved the base case for induction.
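
A quick numerical check of this convolution identity (just a sketch; the parameter values below are arbitrary and SciPy is assumed to be available):

```python
# Verify numerically that the convolution of two independent Poisson pmfs
# matches the Poisson(lam1 + lam2) pmf, i.e. X + Y ~ Poisson(lam1 + lam2).
from scipy.stats import poisson

lam1, lam2 = 1.5, 4.0  # arbitrary example parameters

for v in range(20):
    convolution = sum(poisson.pmf(u, lam1) * poisson.pmf(v - u, lam2)
                      for u in range(v + 1))
    assert abs(convolution - poisson.pmf(v, lam1 + lam2)) < 1e-12
```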
 
This is correct.

kduna, thank you a lot!
 
sanctifier said:

Homework Statement



If the random variables ##X## and ##Y## are independent and Poisson distributed with parameters ##\lambda_1## and ##\lambda_2## respectively, what is the conditional distribution of ##X## given that ##X + Y = m##?

Homework Equations



Poisson distribution with parameter ##\lambda##: ##P(x)= \frac{\lambda^x}{x!} e^{-\lambda}##

The Attempt at a Solution



Answer:

Because ##P(X=x|X+Y=m) = \frac{P(X=x \cup X+Y=m)}{P(X+Y=m)}##

Let ##\begin{cases}u=x \\ v=x+y \end{cases}## then ##\begin{cases}x=u \\ y=v-u \end{cases}##

##\text{Jacobian} = \begin{vmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{vmatrix} = \begin{vmatrix} 1 & 0 \\ -1 & 1 \end{vmatrix} = 1##

The joint distribution of ##X## and ##Y## is ##g(x,y)= \frac{\lambda_1^x}{x!} \frac{\lambda_2^y}{y!}e^{-\lambda_1 - \lambda_2}##

Then ##f(u,v)=g(u,v-u)|\text{Jacobian}|=\frac{\lambda_1^u}{u!} \frac{\lambda_2^{v-u}}{(v-u)!}e^{-\lambda_1 - \lambda_2}##

Hence the marginal distribution of ##v## is:

##f_v(v)=\sum_{u=0}^{m}f(u,v)=e^{-\lambda_1-\lambda_2}\lambda_2^v \sum_{u=0}^{m} {\left( \frac{\lambda_1}{\lambda_2} \right)}^u \frac{1}{u!(v-u)!}##

##P(X=x|X+Y=m)=P(U=u|V=m)= \frac{f(u,m)}{f_v(m)}##

Does the sum ##\sum_{u=0}^{m} {\left( \frac{\lambda_1}{\lambda_2} \right)}^u \frac{1}{u!(v-u)!}## have a concise form?

Is the answer correct? Thank you in advance!

You made some errors.
(1)
$$P(X=x|X+Y=m) = \frac{P(X=x \cup X+Y=m)}{P(X+Y=m)} \longleftarrow \text{ wrong}$$
Can you spot the mistake?

(2) Jacobians and all those things are wrong for this problem; they apply to continuous random variables having probability density functions, but in your case all the random variables are discrete and do not have densities at all; instead, they have probability mass functions.
 
Ray Vickson said:
You made some errors.
(1)
$$P(X=x|X+Y=m) = \frac{P(X=x \cup X+Y=m)}{P(X+Y=m)} \longleftarrow \text{ wrong}$$
Can you spot the mistake?

(2) Jacobians and all those things are wrong for this problem; they apply to continuous random variables having probability density functions, but in your case all the random variables are discrete and do not have densities at all; instead, they have probability mass functions.

To (1)

Did you mean ##P(X=x|X+Y=m) = \frac{P(X=x \cap X+Y=m)}{P(X+Y=m)}##?

To (2)

Then what shall I do if the standard means don't work?
 
sanctifier said:
To (1)

Did you mean ##P(X=x|X+Y=m) = \frac{P(X=x \cap X+Y=m)}{P(X+Y=m)}##?

To (2)

Then what shall I do if the standard means don't work?

Standard means DO work; just use them correctly. For an integer ##m##, can you compute ##P(X+Y=m)##? For integers ##x, m##, can you compute ##P(X = x \; \& \; X+Y=m)##?

Hint: how would you express the event ##\{ X+Y = m\}## in terms of events like ##\{ X = j \}## and ##\{ Y = k \}##? You came very close to having it in your original post, but you did not finish the job.

It was your use of Jacobians, etc., that was wrong, not some of the final formulas you got. In other words, you almost had the right answers for the wrong reasons. Doing that on an exam would still cost you points.
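
One way to spell the hint out explicitly: the event ##\{X+Y=m\}## is the disjoint union of the events ##\{X=j\}\cap\{Y=m-j\}## for ##j = 0, 1, \dots, m##, so by independence
$$P(X+Y=m) = \sum_{j=0}^{m} P(X=j \cap Y=m-j) = \sum_{j=0}^{m} P(X=j)\,P(Y=m-j).$$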
 
Thank you for your patient reply.

As you suggested, using the independence of ##X## and ##Y##:

$$P(X=x \cap X+Y=m)=P(X=x \cap Y=m-x)= \frac{\lambda_1^x}{x!}e^{-\lambda_1} \cdot \frac{\lambda_2^{m-x}}{(m-x)!} e^{-\lambda_2}=\lambda_2^m e^{-\lambda_1-\lambda_2} \left(\frac{\lambda_1}{\lambda_2}\right)^x \frac{1}{m!} {m \choose x}$$

$$P(X+Y=m) = \sum_{x=0}^{m} \frac{\lambda_1^x}{x!} e^{-\lambda_1} \frac{\lambda_2^{m-x}}{(m-x)!} e^{-\lambda_2} =\lambda_2^m e^{-\lambda_1-\lambda_2}\sum_{x=0}^m \left(\frac{\lambda_1}{\lambda_2} \right)^x \frac{1}{m!} {m \choose x} = \lambda_2^m e^{-\lambda_1-\lambda_2} \frac{1}{m!} \left( \frac{\lambda_1}{\lambda_2}+1 \right)^m$$

$$P(X=x|X+Y=m) = \frac{P(X=x \cap X+Y=m)}{P(X+Y=m)} = \frac{P(X=x \cap Y=m-x)}{P(X+Y=m)} = \frac{\left(\frac{\lambda_1}{\lambda_2}\right)^x {m \choose x} }{ \left( \frac{\lambda_1}{\lambda_2}+1 \right)^m}$$

Is this correct?
 
sanctifier said:
Thank you for your patient reply.

As you suggested, using the independence of ##X## and ##Y##:

$$P(X=x \cap X+Y=m)=P(X=x \cap Y=m-x)= \frac{\lambda_1^x}{x!}e^{-\lambda_1} \cdot \frac{\lambda_2^{m-x}}{(m-x)!} e^{-\lambda_2}=\lambda_2^m e^{-\lambda_1-\lambda_2} \left(\frac{\lambda_1}{\lambda_2}\right)^x \frac{1}{m!} {m \choose x}$$

$$P(X+Y=m) = \sum_{x=0}^{m} \frac{\lambda_1^x}{x!} e^{-\lambda_1} \frac{\lambda_2^{m-x}}{(m-x)!} e^{-\lambda_2} =\lambda_2^m e^{-\lambda_1-\lambda_2}\sum_{x=0}^m \left(\frac{\lambda_1}{\lambda_2} \right)^x \frac{1}{m!} {m \choose x} = \lambda_2^m e^{-\lambda_1-\lambda_2} \frac{1}{m!} \left( \frac{\lambda_1}{\lambda_2}+1 \right)^m$$

$$P(X=x|X+Y=m) = \frac{P(X=x \cap X+Y=m)}{P(X+Y=m)} = \frac{P(X=x \cap Y=m-x)}{P(X+Y=m)} = \frac{\left(\frac{\lambda_1}{\lambda_2}\right)^x {m \choose x} }{ \left( \frac{\lambda_1}{\lambda_2}+1 \right)^m}$$

Is this correct?

Yes, it is correct. Good work. However, it can all be made neater and more revealing.

First, you should note that your formula for ##P(X+Y=m)## simplifies to
$$P(X+Y=m) = \frac{\mu^m e^{-\mu}}{m!}, \qquad \mu = \lambda_1 + \lambda_2,$$
so ##X+Y \sim \text{Poisson}(\lambda_1 + \lambda_2)##. This is a standard result.

Next: notice that your final formula for ##P(X=k|X+Y=m)## can be written as
$$P(X=k \mid X+Y=m) = {m \choose k} p^k (1-p)^{m-k}, \qquad p = \frac{\lambda_1}{\lambda_1+\lambda_2},$$
so ##X \mid X+Y = m## is binomial with parameters ##m## and ##p = \lambda_1/(\lambda_1+\lambda_2)##.
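
A quick numerical sanity check of this binomial identification (a sketch only; the values of ##\lambda_1##, ##\lambda_2##, and ##m## below are arbitrary, and SciPy is assumed to be available):

```python
# Check that P(X=k | X+Y=m), computed directly from the Poisson pmfs,
# equals the Binomial(m, p) pmf with p = lam1 / (lam1 + lam2).
from scipy.stats import binom, poisson

lam1, lam2, m = 2.0, 3.0, 10  # arbitrary example values
p = lam1 / (lam1 + lam2)

for k in range(m + 1):
    # P(X=k, Y=m-k) / P(X+Y=m); uses independence and X+Y ~ Poisson(lam1+lam2)
    joint = poisson.pmf(k, lam1) * poisson.pmf(m - k, lam2)
    conditional = joint / poisson.pmf(m, lam1 + lam2)
    assert abs(conditional - binom.pmf(k, m, p)) < 1e-12
```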
 
Thank you very much, Ray.
 
