Independent Poisson random variables

Monocles

Homework Statement


There are two urns, A and B. Let v be a random number of balls. Each ball is placed into urn A with probability p and into urn B with probability q = 1 - p, independently of the others. Let v_A and v_B denote the numbers of balls in A and B, respectively. Show that the random variables v_A and v_B are independent if and only if v is a Poisson random variable.

Homework Equations


The probability mass function of a Poisson random variable is

P(X = k) = \frac{\lambda^k e^{-\lambda}}{k!}, \quad k = 0, 1, 2, \ldots

The Attempt at a Solution


We showed in an earlier problem that if two variables X, Y are independent Poisson random variables, then X + Y is also a Poisson random variable. Noting that v_A + v_B = v, I think that proving that v must be a Poisson random variable might be sufficient, but I also don't know the level of rigor wanted here. It would be easy to show that v CAN be a Poisson random variable, but I don't see how to go about showing that it MUST be. I thought maybe I could show that the basis of "Poisson random variable space" is independent of the basis of "non-Poisson random variable space", but I don't know if those are even sensible ideas.

Random thing I noted that may or may not be relevant:
E(v) = E(v_A) + E(v_B). If v_A and v_B are Poisson random variables then their expectation value is their parameter \lambda.
 
I think the following might be a good direction to start with, but I'm pressed for time at the moment so I can't promise that it will bear fruit.

Suppose v is a Poisson random variable. I define random variables a_1, a_2, \ldots, a_v such that

a_j = \begin{cases} 1, & \text{if ball } j \text{ goes into urn A} \\ 0, & \text{if ball } j \text{ goes into urn B} \end{cases}

The a_j's are independent, and

v_A = \sum_{j=1}^{v} a_j

Then I can obtain the marginal probability mass function for v_A as follows:

P(v_A = k) = \sum_{m=0}^{\infty} P(v_A = k \cap v = m) = \sum_{m=0}^{\infty} P(v_A = k \mid v = m) \, P(v = m)

and similarly for v_B. The right-hand side should be straightforward from here.
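For concreteness, here is how that sum works out in the forward direction (a sketch assuming v is Poisson with parameter \lambda; given v = m, the count v_A is binomial with m trials and success probability p):

```latex
\begin{align*}
P(v_A = k) &= \sum_{m=k}^{\infty} \binom{m}{k} p^k q^{m-k} \, \frac{\lambda^m e^{-\lambda}}{m!} \\
&= \frac{(\lambda p)^k e^{-\lambda}}{k!} \sum_{m=k}^{\infty} \frac{(\lambda q)^{m-k}}{(m-k)!} \\
&= \frac{(\lambda p)^k e^{-\lambda}}{k!} \, e^{\lambda q}
= \frac{(\lambda p)^k e^{-\lambda p}}{k!}
\end{align*}
```

so v_A is Poisson with parameter \lambda p, and symmetrically v_B is Poisson with parameter \lambda q.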

So that gets you the marginal distributions, and now you need the joint distribution, and to show that v_A and v_B are statistically independent, i.e.,

P(v_A = k \cap v_B = n) = P(v_A = k) \, P(v_B = n)

If no one else chimes in, and you still need ideas, I'll try to write more later this evening.
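For the joint distribution, note that the event \{v_A = k, v_B = n\} forces v = k + n, so (again assuming v is Poisson with parameter \lambda):

```latex
\begin{align*}
P(v_A = k \cap v_B = n) &= P(v_A = k \mid v = k+n) \, P(v = k+n) \\
&= \binom{k+n}{k} p^k q^n \, \frac{\lambda^{k+n} e^{-\lambda}}{(k+n)!} \\
&= \frac{(\lambda p)^k e^{-\lambda p}}{k!} \cdot \frac{(\lambda q)^n e^{-\lambda q}}{n!}
\end{align*}
```

Each factor is a Poisson pmf (with parameters \lambda p and \lambda q respectively), so the joint pmf factors into the product of the marginals, which proves independence in the Poisson case.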
 
This helped a lot - but I still don't see how to prove that ONLY Poisson random variables will work.
 
Monocles said:
This helped a lot - but I still don't see how to prove that ONLY Poisson random variables will work.

I'm not sure how to prove that. I may be way off base, but my guess is that you could make some headway using characteristic functions. The argument would start as follows:

Suppose v_A and v_B are independent. Then

\phi_v(t) = \phi_{v_A}(t) \phi_{v_B}(t)

where the notation

\phi_x(t) denotes the characteristic function of the random variable x, defined as

\phi_x(t) = E[e^{itx}] = \sum_k e^{itk} P(x = k)

You could then use (deriving them first, if not presumed known) the characteristic functions for binomial and Poisson random variables, which are

(1 - p + pe^{it})^{n} (binomial with n trials and probability p per trial)

and

e^{\lambda(e^{it}-1)} (Poisson with parameter \lambda)

This isn't the whole story, because v_A and v_B are only conditionally binomial, given a fixed value of v, but it seems like it might be a promising avenue to try. [Of course, if this is for a class and you haven't covered characteristic functions yet, then scratch that idea...]
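As a quick numerical sanity check of that factorization (with hypothetical values λ = 2.0 and p = 0.3, not taken from the problem), one can verify that the Poisson characteristic function splits exactly into the product of Poisson(λp) and Poisson(λq) characteristic functions:

```python
import cmath

# Hypothetical parameters for a sanity check (not from the problem statement)
lam, p = 2.0, 0.3
q = 1.0 - p

def poisson_cf(mu, t):
    """Characteristic function of a Poisson(mu) variable: exp(mu * (e^{it} - 1))."""
    return cmath.exp(mu * (cmath.exp(1j * t) - 1))

# If v_A ~ Poisson(lam*p) and v_B ~ Poisson(lam*q) are independent,
# the product of their characteristic functions equals that of v ~ Poisson(lam).
for t in (0.0, 0.5, 1.0, 2.0):
    lhs = poisson_cf(lam, t)
    rhs = poisson_cf(lam * p, t) * poisson_cf(lam * q, t)
    assert abs(lhs - rhs) < 1e-12
```

The identity is exact algebraically, since \lambda p (e^{it}-1) + \lambda q (e^{it}-1) = \lambda (e^{it}-1); the loop just confirms it numerically at a few values of t.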
 
Yeah, it is for class, and we haven't gotten to characteristic functions yet. I ended up just throwing in what I had and mentioning something about the space of non-Poisson random variables hopefully not being so large that it could be dense in the space of Poisson random variables, but I was definitely just talking out of my ***.
 
Hmm, I guess without characteristic functions, you would make the analogous argument using probability mass functions, except the math might be grungier:

Suppose v_A and v_B are independent. Then the pmf of v is the convolution of the pmfs of v_A and v_B:

P(v = k) = \sum_n P(v_A = n) P(v_B = k - n)

and you would write

P(v_A = n) = \sum_j P(v_A = n \mid v = j) \, P(v = j)

and similarly for v_B

Then use the fact that v_A and v_B are conditionally binomial, given v = j. Hopefully you could then do a bunch of simplification and solve for P(v = k) and show it to be Poisson.
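A brute-force numerical check of that plan (with hypothetical values λ = 1.5 and p = 0.4, not from the problem; the infinite sums are truncated, which is harmless since the Poisson tail decays faster than geometrically):

```python
from math import comb, exp, factorial

# Hypothetical parameters for a numerical sanity check
lam, p = 1.5, 0.4
q = 1.0 - p
N = 60  # truncation point for the infinite sums (tail is negligible)

def poisson_pmf(mu, k):
    return mu**k * exp(-mu) / factorial(k)

def joint(k, n):
    """P(v_A = k, v_B = n): the event forces v = k + n, then split binomially."""
    return comb(k + n, k) * p**k * q**n * poisson_pmf(lam, k + n)

def marginal_A(k):
    return sum(joint(k, n) for n in range(N))

def marginal_B(n):
    return sum(joint(k, n) for k in range(N))

# Independence: the joint pmf should factor into the product of the marginals,
# and the marginals should themselves be Poisson(lam*p) and Poisson(lam*q).
for k in range(6):
    for n in range(6):
        assert abs(joint(k, n) - marginal_A(k) * marginal_B(n)) < 1e-12
    assert abs(marginal_A(k) - poisson_pmf(lam * p, k)) < 1e-12
    assert abs(marginal_B(k) - poisson_pmf(lam * q, k)) < 1e-12
```

This only verifies the "if" direction for one parameter choice, of course; the "only if" direction still needs the analytic argument above.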

If you get a chance, I'll be curious to know if that's the solution your instructor gives, or if there is a simpler way.
 