# Probability: What is the conditional distribution of X?

1. Mar 5, 2014

### sanctifier

1. The problem statement, all variables and given/known data

If random variables $X$ and $Y$ are independent and Poisson distributed with parameters $\lambda_1$ and $\lambda_2$ respectively, what is the conditional distribution of $X$ given the condition $X + Y = m$?

2. Relevant equations

Poisson distribution with parameter $\lambda$: $P(x)= \frac{\lambda^x}{x!} e^{-\lambda}, \quad x = 0, 1, 2, \ldots$

3. The attempt at a solution

Because $P(X=x|X+Y=m) = \frac{P(X=x \cup X+Y=m)}{P(X+Y=m)}$

Let $\begin{cases}u=x \\v=x+y \end{cases}$ then $\begin{cases}x=u \\y=v-u \end{cases}$

$Jacobian= \det \begin{bmatrix} \frac{\partial x}{\partial u} & \frac{\partial x}{\partial v} \\ \frac{\partial y}{\partial u} & \frac{\partial y}{\partial v} \end{bmatrix} = \det \begin{bmatrix}1 & 0 \\-1 & 1 \end{bmatrix} = 1$

The joint distribution of $X$ and $Y$ is $g(x,y)= \frac{\lambda_1^x}{x!} \frac{\lambda_2^y}{y!}e^{-\lambda_1 - \lambda_2}$

Then $f(u,v)=g(u,v-u)|Jacobian|=\frac{\lambda_1^u}{u!} \frac{\lambda_2^{v-u}}{(v-u)!}e^{-\lambda_1 - \lambda_2}$

Hence the marginal distribution of $v$ is:

$f_v(v)=\sum_{u=0}^{m}f(u,v)=e^{-\lambda_1-\lambda_2}\lambda_2^v \sum_{u=0}^{m} {( \frac{\lambda_1}{\lambda_2} )}^u \frac{1}{u!(v-u)!}$

$P(X=x|X+Y=m)=P(U=u|V=m)= \frac{f(u,m)}{f_v(m)}$

Does the sum $\sum_{u=0}^{m} {( \frac{\lambda_1}{\lambda_2} )}^u \frac{1}{u!(v-u)!}$ have a concise form?

Is the answer correct? Thank you in advance!

2. Mar 5, 2014

### kduna

Notice that $\frac{1}{u!(v-u)!} = \frac{1}{v!} {v \choose u}$.

When finding the marginal, you should be summing $u$ to infinity, which in practice just means summing through $v$ (since ${v \choose u}$ is zero for $u > v$).

Therefore your sum is: $\frac{1}{v!} \sum_{u=0}^{v} {v \choose u} {(\frac{\lambda_1}{\lambda_2})}^u 1^{v-u} = \frac{1}{v!}(\frac{\lambda_1}{\lambda_2} + 1)^v$.

Here I used the binomial formula. We can further simplify this though.

$\frac{1}{v!}(\frac{\lambda_1}{\lambda_2} + 1)^v = \frac{1}{v!} (\frac{\lambda_1 + \lambda_2}{\lambda_2})^v$.

Putting this into your formula for the marginal of $X+Y$, we see that $X+Y$ is Poisson with parameter $\lambda_1 + \lambda_2$. In fact this holds for any finite sum of independent Poisson random variables; the two-variable case you just proved is exactly what the induction step needs.
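Not part of the original thread, but a quick numerical sanity check of this standard result: the convolution of two Poisson pmfs matches a single Poisson pmf with the summed rate. The rates chosen below are arbitrary.

```python
import math

def poisson_pmf(lam, k):
    """Poisson pmf P(K = k) with rate lam."""
    return lam**k * math.exp(-lam) / math.factorial(k)

lam1, lam2 = 1.5, 2.0  # arbitrary example rates

for m in range(10):
    # Convolution: P(X + Y = m) = sum over x of P(X = x) * P(Y = m - x)
    conv = sum(poisson_pmf(lam1, x) * poisson_pmf(lam2, m - x)
               for x in range(m + 1))
    # Should equal the Poisson(lam1 + lam2) pmf at m
    assert abs(conv - poisson_pmf(lam1 + lam2, m)) < 1e-12
```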

Last edited: Mar 5, 2014

3. Mar 6, 2014

### sanctifier

This is correct.

kduna, thanks a lot!

4. Mar 6, 2014

### Ray Vickson

You made some errors.
(1)
$$P(X=x|X+Y=m) = \frac{P(X=x \cup X+Y=m)}{P(X+Y=m)} \longleftarrow \text{ wrong}$$
Can you spot the mistake?

(2) Jacobians and all those things are wrong for this problem; they apply to continuous random variables having probability density functions, but in your case all the random variables are discrete and do not have densities at all---instead, they have probability mass functions.

5. Mar 24, 2014

### sanctifier

To (1)

Did you mean $$P(X=x|X+Y=m) = \frac{P(X=x \bigcap X+Y=m)}{P(X+Y=m)}$$

To (2)

Then what shall I do if the standard means don't work?

6. Mar 24, 2014

### Ray Vickson

Standard means DO work; just use them correctly. For integer $m$, can you compute $P(X+Y=m)$? For integers $x, m$, can you compute $P(X = x \; \& \; X+Y=m)$?

Hint: how would you express the event $\{ X+Y = m\}$ in terms of events like $\{ X = j \}$ and $\{ Y = k \}$? You came very close to having it in your original post, but you did not finish the job.

It was your use of Jacobians, etc., that was wrong, not some of the final formulas you got. In other words, you almost had the right answers for the wrong reasons. Doing that on an exam would still cost you points.

7. Mar 29, 2014

### sanctifier

As you suggested,

$P(X=x \cap X+Y=m)=P(X=x \cap Y=m-x)= \frac{\lambda_1^x}{x!}e^{-\lambda_1} *\frac{\lambda_2^{m-x}}{(m-x)!} e^{-\lambda_2}=\lambda_2^m e^{-\lambda_1-\lambda_2} (\frac{\lambda_1}{\lambda_2})^x \frac{1}{m!} {m \choose x}$

$P(X+Y=m) = \sum_{x=0}^{m} \frac{\lambda_1^x}{x!} e^{-\lambda_1} \frac{\lambda_2^{m-x}}{(m-x)!} e^{-\lambda_2} =\lambda_2^m e^{-\lambda_1-\lambda_2}\sum_{x=0}^m (\frac{\lambda_1}{\lambda_2} )^x \frac{1}{m!} {m \choose x} = \lambda_2^m e^{-\lambda_1-\lambda_2} \frac{1}{m!} ( \frac{\lambda_1}{\lambda_2}+1 )^m$

$P(X=x|X+Y=m) = \frac{P(X=x \cap X+Y=m)}{P(X+Y=m)} = \frac{P(X=x \cap Y=m-x)}{P(X+Y=m)} = \frac{(\frac{\lambda_1}{\lambda_2})^x {m \choose x} }{ ( \frac{\lambda_1}{\lambda_2}+1 )^m}$
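(Editorial addition, not from the original post: a quick Python check that the conditional pmf just derived sums to 1 over $x = 0, \ldots, m$, as any probability mass function must. The parameter values are arbitrary.)

```python
import math

lam1, lam2, m = 1.5, 2.0, 7  # arbitrary example values
r = lam1 / lam2

# Sum of (lam1/lam2)^x * C(m, x) / (lam1/lam2 + 1)^m over x = 0..m
total = sum(r**x * math.comb(m, x) / (r + 1)**m for x in range(m + 1))

# Binomial theorem guarantees the sum is exactly 1
assert abs(total - 1.0) < 1e-12
```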

Is this correct?

Last edited: Mar 29, 2014

8. Mar 29, 2014

### Ray Vickson

Yes, it is correct. Good work. However, it can all be made neater and more revealing.

First, you should note that your formula for $P(X+Y=m)$ simplifies to
$$P(X+Y=m) = \frac{\mu^m e^{-\mu}}{m!}, \;\; \mu = \lambda_1 + \lambda_2,$$
so $X+Y = \text{Poisson}(\lambda_1 + \lambda_2)$. This is a standard result.

Next: notice that your final formula for $P(X=k|X+Y=m)$ can be written as
$$P(X=k | X+Y=m) = {m \choose k} p^k (1-p)^{m-k}, \;\; p = \frac{\lambda_1}{\lambda_1+\lambda_2},$$
so $X | X+Y = m$ is binomial, with parameters $m$ and $p = \lambda_1/(\lambda_1+\lambda_2)$.
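(Editorial addition: a hedged numerical check, with arbitrary rates, that the thread's conditional formula agrees term by term with the binomial pmf $\binom{m}{k} p^k (1-p)^{m-k}$ at $p = \lambda_1/(\lambda_1+\lambda_2)$.)

```python
import math

lam1, lam2, m = 1.5, 2.0, 7  # arbitrary example values
p = lam1 / (lam1 + lam2)     # binomial success probability
r = lam1 / lam2

for k in range(m + 1):
    # Conditional pmf as derived in the thread
    cond = r**k * math.comb(m, k) / (r + 1)**m
    # Binomial(m, p) pmf
    binom = math.comb(m, k) * p**k * (1 - p)**(m - k)
    assert abs(cond - binom) < 1e-12
```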

Last edited: Mar 29, 2014

9. Mar 29, 2014

### sanctifier

Thank you very much, Ray.