Probability: What is the conditional distribution of X?

sanctifier

Homework Statement

If random variables $X$ and $Y$ are independent and Poisson distributed with parameters $\lambda_1$ and $\lambda_2$ respectively, what is the conditional distribution of $X$ given that $X + Y = m$?

Homework Equations

Poisson distribution with parameter $\lambda$: $P(x)= \frac{\lambda^x}{x!} e^{-\lambda}$

The Attempt at a Solution

Because $P(X=x|X+Y=m) = \frac{P(X=x \cup X+Y=m)}{P(X+Y=m)}$

Let $\begin{cases}u=x \\v=x+y \end{cases}$ then $\begin{cases}x=u \\y=v-u \end{cases}$

$Jacobian= \det \begin{bmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{bmatrix} = \det \begin{bmatrix}1 & 0 \\-1 & 1 \end{bmatrix} = 1$

The joint distribution of $X$ and $Y$ is $g(x,y)= \frac{\lambda_1^x}{x!} \frac{\lambda_2^y}{y!}e^{-\lambda_1 - \lambda_2}$

Then $f(u,v)=g(u,v-u)|Jacobian|=\frac{\lambda_1^u}{u!} \frac{\lambda_2^{v-u}}{(v-u)!}e^{-\lambda_1 - \lambda_2}$

Hence the marginal distribution of $v$ is:

$f_v(v)=\sum_{u=0}^{m}f(u,v)=e^{-\lambda_1-\lambda_2}\lambda_2^v \sum_{u=0}^{m} {( \frac{\lambda_1}{\lambda_2} )}^u \frac{1}{u!(v-u)!}$

$P(X=x|X+Y=m)=P(U=u|V=m)= \frac{f(u,m)}{f_v(m)}$

Does the sum $\sum_{u=0}^{m} {( \frac{\lambda_1}{\lambda_2} )}^u \frac{1}{u!(v-u)!}$ have a concise form?

sanctifier said:
Does the sum $\sum_{u=0}^{m} {( \frac{\lambda_1}{\lambda_2} )}^u \frac{1}{u!(v-u)!}$ have a concise form?

Notice that:## \frac{1}{u!(v-u)!} = \frac{1}{v!} {v \choose u} ##.

When finding the marginal, you should be summing over all ##u## from ##0## to infinity, which in practice just means summing through ##v## (since ##{v \choose u}## vanishes for ##u > v##).

Therefore your sum is: ## \frac{1}{v!} \sum_{u=0}^{v} {v \choose u} {(\frac{\lambda_1}{\lambda_2})}^u 1^{v-u} = \frac{1}{v!}(\frac{\lambda_1}{\lambda_2} + 1)^v ##.

Here I used the binomial formula. We can further simplify this though.

## \frac{1}{v!}(\frac{\lambda_1}{\lambda_2} + 1)^v = \frac{1}{v!} (\frac{\lambda_1 + \lambda_2}{\lambda_2})^v ##.

Putting this into your formula for the marginal of ##X+Y##, we see that ##X+Y## is Poisson with parameter ##\lambda_1 + \lambda_2##. In fact this holds for any finite sum of independent Poisson random variables, and you just proved the base case for induction.
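This closed form is easy to sanity-check numerically: convolve the two Poisson pmfs directly and compare with the Poisson(##\lambda_1+\lambda_2##) pmf. A minimal sketch (the parameter values ##\lambda_1 = 2##, ##\lambda_2 = 3## are arbitrary):

```python
from math import exp, factorial

def poisson_pmf(lam, k):
    """Poisson probability mass function P(K = k) for rate lam."""
    return lam**k * exp(-lam) / factorial(k)

lam1, lam2 = 2.0, 3.0

def conv_pmf(m):
    """P(X + Y = m) by direct convolution of the two independent pmfs."""
    return sum(poisson_pmf(lam1, x) * poisson_pmf(lam2, m - x)
               for x in range(m + 1))

# The convolution agrees with the Poisson(lam1 + lam2) pmf for every m checked
for m in range(10):
    assert abs(conv_pmf(m) - poisson_pmf(lam1 + lam2, m)) < 1e-12
print("X + Y matches Poisson(lambda1 + lambda2)")
```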

This is correct.

kduna, thank you very much!

sanctifier said:


(1)
$$P(X=x|X+Y=m) = \frac{P(X=x \cup X+Y=m)}{P(X+Y=m)} \longleftarrow \text{ wrong}$$
Can you spot the mistake?

(2) Jacobians and all those things are wrong for this problem; they apply to continuous random variables having probability density functions, but in your case all the random variables are discrete and do not have densities at all---instead, they have probability mass functions.

Ray Vickson said:

To (1)

Did you mean $$P(X=x|X+Y=m) = \frac{P(X=x \cap X+Y=m)}{P(X+Y=m)}?$$

To (2)

Then what should I do if the standard method doesn't work?

sanctifier said:

Standard means DO work; just use them correctly. For integer m, can you compute ##P(X+Y=m)?## For integer ##x, m## can you compute ##P(X = x \; \& \; X+Y=m)?##

Hint: how would you express the event ##\{ X+Y = m\}## in terms of events like ##\{ X = j \}## and ##\{ Y = k \}##? You came very close to having it in your original post, but you did not finish the job.

It was your use of Jacobians, etc., that was wrong, not some of the final formulas you got. In other words, you almost had the right answers for the wrong reasons. Doing that on an exam would still cost you points.

As you suggested,

$P(X=x \cap X+Y=m)=P(X=x \cap Y=m-x)= \frac{\lambda_1^x}{x!}e^{-\lambda_1} \cdot \frac{\lambda_2^{m-x}}{(m-x)!} e^{-\lambda_2}=\lambda_2^m e^{-\lambda_1-\lambda_2} \left(\frac{\lambda_1}{\lambda_2}\right)^x \frac{1}{m!} {m \choose x}$

$P(X+Y=m) = \sum_{x=0}^{m} \frac{\lambda_1^x}{x!} e^{-\lambda_1} \frac{\lambda_2^{m-x}}{(m-x)!} e^{-\lambda_2} =\lambda_2^m e^{-\lambda_1-\lambda_2}\sum_{x=0}^m (\frac{\lambda_1}{\lambda_2} )^x \frac{1}{m!} {m \choose x} = \lambda_2^m e^{-\lambda_1-\lambda_2} \frac{1}{m!} ( \frac{\lambda_1}{\lambda_2}+1 )^m$

$P(X=x|X+Y=m) = \frac{P(X=x \cap X+Y=m)}{P(X+Y=m)} = \frac{P(X=x \cap Y=m-x)}{P(X+Y=m)} = \frac{(\frac{\lambda_1}{\lambda_2})^x {m \choose x} }{ ( \frac{\lambda_1}{\lambda_2}+1 )^m}$
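These expressions can also be checked numerically against a brute-force ratio of pmfs; a quick sketch with arbitrary illustrative values ##\lambda_1 = 1.5##, ##\lambda_2 = 2.5##, ##m = 6##:

```python
from math import exp, factorial, comb

lam1, lam2, m = 1.5, 2.5, 6  # arbitrary illustrative values

def poisson_pmf(lam, k):
    """Poisson probability mass function P(K = k) for rate lam."""
    return lam**k * exp(-lam) / factorial(k)

# P(X + Y = m) by brute-force summation over the joint pmf
denom = sum(poisson_pmf(lam1, x) * poisson_pmf(lam2, m - x)
            for x in range(m + 1))

for x in range(m + 1):
    # Direct ratio: P(X = x, Y = m - x) / P(X + Y = m)
    direct = poisson_pmf(lam1, x) * poisson_pmf(lam2, m - x) / denom
    # The closed form derived above
    closed = (lam1 / lam2)**x * comb(m, x) / (lam1 / lam2 + 1)**m
    assert abs(direct - closed) < 1e-12
```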

Is this correct?

sanctifier said:


Yes, it is correct. Good work. However, it can all be made neater and more revealing.

First, you should note that your formula for ##P(X+Y=m)## simplifies to
$$P(X+Y=m) = \frac{\mu^m e^{-\mu}}{m!}, \;\; \mu = \lambda_1 + \lambda_2,$$
so ##X+Y = \text{Poisson}(\lambda_1 + \lambda_2)##. This is a standard result.

Next: notice that your final formula for ##P(X=k|X+Y=m)## can be written as
$$P(X=k | X+Y=m) = {m \choose k} p^k (1-p)^{m-k}, \;\; p = \frac{\lambda_1}{\lambda_1+\lambda_2},$$
so ##X | X+Y = m## is binomial, with parameters ##m## and ##p = \lambda_1/(\lambda_1+\lambda_2)##.
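This binomial identification can also be confirmed by simulation: draw independent Poisson pairs, keep the trials where ##X+Y=m##, and tabulate ##X##. A sketch (parameter values, seed, and sample count are arbitrary choices for illustration):

```python
import random
from math import exp, comb

random.seed(0)
lam1, lam2, m = 2.0, 3.0, 5   # arbitrary illustrative values
p = lam1 / (lam1 + lam2)

def poisson_sample(lam):
    """Knuth's method: count uniforms until their product drops below e^{-lam}."""
    threshold, k, prod = exp(-lam), 0, 1.0
    while True:
        prod *= random.random()
        if prod <= threshold:
            return k
        k += 1

# Tabulate X on those trials where X + Y happened to equal m
counts = [0] * (m + 1)
hits = 0
for _ in range(200_000):
    x, y = poisson_sample(lam1), poisson_sample(lam2)
    if x + y == m:
        counts[x] += 1
        hits += 1

# Compare the empirical conditional frequencies with Binomial(m, p)
for k in range(m + 1):
    empirical = counts[k] / hits
    binom = comb(m, k) * p**k * (1 - p)**(m - k)
    print(f"k={k}: empirical {empirical:.3f} vs binomial {binom:.3f}")
```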

Thank you very much, Ray.

1. What is the definition of conditional distribution?

The conditional distribution of a random variable X is the probability distribution of X given that another random variable Y takes on a specific value.

2. How is the conditional distribution calculated?

The conditional distribution can be calculated by dividing the joint probability of X and Y by the marginal probability of Y. This can also be represented as P(X|Y) = P(X,Y)/P(Y).
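The defining formula P(X|Y) = P(X,Y)/P(Y) can be illustrated with a toy joint table (the probability values below are made up purely for illustration):

```python
# Joint pmf of (X, Y) for X, Y in {0, 1}, stored as a dict
joint = {(0, 0): 0.10, (0, 1): 0.30,
         (1, 0): 0.20, (1, 1): 0.40}

# Marginal P(Y = 1): sum the joint pmf over x
p_y1 = sum(p for (x, y), p in joint.items() if y == 1)

# Conditional P(X = x | Y = 1) = P(X = x, Y = 1) / P(Y = 1)
cond = {x: joint[(x, 1)] / p_y1 for x in (0, 1)}
print(cond)  # cond[0] = 0.3/0.7, cond[1] = 0.4/0.7
```

Note that the conditional probabilities sum to 1, as any distribution must.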

3. What is the significance of the conditional distribution in probability?

The conditional distribution allows us to understand the relationship between two random variables and how one affects the distribution of the other. It helps in making predictions and understanding the behavior of a system.

4. Can the conditional distribution be visualized?

Yes, the conditional distribution can be visualized through a conditional probability plot or a conditional probability histogram. These graphs show the probability of X given a specific value of Y.

5. How does the conditional distribution relate to the concept of independence?

If X and Y are independent, the conditional distribution of X given Y is simply the marginal distribution of X. This means that the value of Y does not affect the distribution of X. However, if X and Y are dependent, the conditional distribution of X given Y will be different from the marginal distribution of X.
