Independent Poisson random variables


Homework Help Overview

The problem involves two urns, A and B, where a random number of balls, denoted as v, is distributed between them with certain probabilities. The task is to demonstrate that the random variables representing the number of balls in each urn, v_a and v_b, are independent if and only if v follows a Poisson distribution.

Discussion Character

  • Exploratory, Assumption checking, Conceptual clarification

Approaches and Questions Raised

  • Participants discuss the relationship between independent Poisson random variables and the sum of those variables. There is an exploration of the implications of v being a Poisson random variable and how that relates to the independence of v_a and v_b.
  • Some participants suggest defining random variables based on the distribution of balls into urns and calculating marginal and joint distributions to explore independence.
  • Questions arise about proving that only Poisson random variables will satisfy the conditions of the problem, with suggestions to consider characteristic functions or probability mass functions.

Discussion Status

The discussion is ongoing, with participants sharing insights and potential approaches. Some have noted that they find certain directions promising, while others express uncertainty about proving the necessity of the Poisson condition. There is no explicit consensus yet, but several lines of reasoning are being explored.

Contextual Notes

Participants mention constraints related to their current coursework, such as not having covered characteristic functions, which may limit the methods they can employ. There is also a recognition of the complexity involved in proving the independence of the random variables under the given conditions.

Monocles

Homework Statement


There are two urns, A and B. Let v be a random number of balls. Each of these balls is put into urn A with probability p and into urn B with probability q = 1 - p. Let v_a and v_b denote the numbers of balls in A and B, respectively. Show that the random variables v_a and v_b are independent if and only if v is a Poisson random variable.

Homework Equations


The probability mass function of a Poisson random variable with parameter \lambda is

P(v = k) = \frac{\lambda^k e^{-\lambda}}{k!}, \quad k = 0, 1, 2, \ldots

The Attempt at a Solution


We showed in an earlier problem that if two variables X, Y are independent Poisson random variables then X + Y is also a Poisson random variable. Therefore, if v_A and v_B are Poisson random variables, then v must be as well. Noting that v_A + v_B = v, I think that proving that v must be a Poisson random variable might be sufficient, but I also don't know the level of rigor wanted here. It would be easy to show that v CAN be a Poisson random variable, but I don't see how to go about showing that it MUST be. I thought maybe I could show that the basis of "Poisson random variable space" is independent from the basis of "non-Poisson random variable space", but I don't know if those are even sensible ideas.

Random thing I noted that may or may not be relevant:
E(v) = E(v_A) + E(v_B). If v_A and v_B are Poisson random variables then their expectation value is their parameter \lambda.
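That note can be sanity-checked with a small Monte Carlo sketch in Python. The parameters λ = 3 and p = 0.4 are made up for illustration (the problem doesn't fix any values): if v ~ Poisson(λ) and each ball lands in A independently with probability p, the sample means of v_A and v_B should come out near λp and λq.

```python
import math
import random

random.seed(0)
lam, p = 3.0, 0.4   # hypothetical parameters, not from the problem
trials = 100_000

def sample_poisson(mu):
    # Knuth's method: count uniforms until their product drops below e^{-mu}
    limit, k, prod = math.exp(-mu), 0, random.random()
    while prod > limit:
        k += 1
        prod *= random.random()
    return k

total_a = total_b = 0
for _ in range(trials):
    v = sample_poisson(lam)
    v_a = sum(random.random() < p for _ in range(v))   # binomial thinning
    total_a += v_a
    total_b += v - v_a

mean_a, mean_b = total_a / trials, total_b / trials
# expect mean_a near lam * p = 1.2 and mean_b near lam * q = 1.8
```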
 
I think the following might be a good direction to start with, but I'm pressed for time at the moment so I can't promise that it will bear fruit.

Suppose v is a Poisson random variable. I define random variables a_1, a_2, \ldots, a_v such that

a_j = \left\{\begin{array}{ll}1, &amp; \textrm{if ball } j \textrm{ goes into urn A} \\ 0, &amp; \textrm{if ball } j \textrm{ goes into urn B}\end{array}\right.

The a_j's are independent, and

v_A = \sum_{j=1}^{v} a_j

Then I can obtain the marginal probability mass function for v_A as follows:

P(v_A = k) = \sum_{m=0}^{\infty} P(v_A = k \cap v = m) = \sum_{m=0}^{\infty} P(v_A = k \mid v = m) P(v = m)

and similarly for v_B. The right-hand side should be straightforward from here.

So that gets you the marginal distributions, and now you need the joint distribution, and to show that v_A and v_B are statistically independent, i.e.,

P(v_A = k \cap v_B = n) = P(v_A = k) P(v_B = n)
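Once the marginals are in hand, this factorization can be verified directly: the joint pmf is P(v = k + n) times the binomial probability of splitting k balls to A and n to B. A quick Python check (with hypothetical λ = 3, p = 0.4) shows it equals the product of two Poisson marginals.

```python
from math import comb, exp, factorial

lam, p = 3.0, 0.4   # hypothetical parameters, not from the problem
q = 1 - p

def poisson_pmf(k, mu):
    return exp(-mu) * mu**k / factorial(k)

def joint(k, n):
    # P(v_A = k and v_B = n): v must equal k + n,
    # then choose which k of the m = k + n balls go to urn A
    m = k + n
    return poisson_pmf(m, lam) * comb(m, k) * p**k * q**n

# joint pmf factors into Poisson(lam*p) x Poisson(lam*q)
for k in range(5):
    for n in range(5):
        product = poisson_pmf(k, lam * p) * poisson_pmf(n, lam * q)
        assert abs(joint(k, n) - product) < 1e-12
```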

If no one else chimes in, and you still need ideas, I'll try to write more later this evening.
 
This helped a lot - but I still don't see how to prove that ONLY Poisson random variables will work.
 
Monocles said:
This helped a lot - but I still don't see how to prove that ONLY Poisson random variables will work.

I'm not sure how to prove that. I may be way off base, but my guess is that you could make some headway using characteristic functions. The argument would start as follows:

Suppose v_A and v_B are independent. Then

\phi_v(t) = \phi_{v_A}(t) \phi_{v_B}(t)

where the notation

\phi_x(t) denotes the characteristic function of the random variable x, defined as

\phi_x(t) = E[e^{itx}] = \sum_k e^{itk} P(x = k)

You could then use (deriving if not presumed known) the characteristic functions for binomial and Poisson random variables, which are

(1 - p + pe^{it})^{n} (binomial with n trials and probability p per trial)

and

e^{\lambda(e^{it}-1)} (Poisson with parameter \lambda)

This isn't the whole story, because v_A and v_B are only conditionally binomial, given a fixed value of v, but it seems like it might be a promising avenue to try. [Of course, if this is for a class and you haven't covered characteristic functions yet, then scratch that idea...]
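For what it's worth, the endpoint of that argument is easy to confirm numerically: the Poisson characteristic function with parameter λ factors into the product of the ones with parameters λp and λq. Here's a sketch in Python (λ = 3 and p = 0.4 are made-up values):

```python
from cmath import exp as cexp
from math import pi

lam, p = 3.0, 0.4   # hypothetical parameters, not from the problem
q = 1 - p

def phi_poisson(t, mu):
    # characteristic function of Poisson(mu): exp(mu * (e^{it} - 1))
    return cexp(mu * (cexp(1j * t) - 1))

# phi_v(t) should equal phi_{v_A}(t) * phi_{v_B}(t) for all t
for i in range(9):
    t = -pi + i * pi / 4
    lhs = phi_poisson(t, lam)
    rhs = phi_poisson(t, lam * p) * phi_poisson(t, lam * q)
    assert abs(lhs - rhs) < 1e-12
```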
 
Yeah, it is for class, and we haven't gotten to characteristic functions yet. I ended up just throwing in what I had and mentioning something about the space of non-Poisson random variables hopefully not being so large that it could be dense in the space of Poisson random variables, but I was definitely just talking out of my ***.
 
Hmm, I guess without characteristic functions, you would make the analogous argument using probability mass functions, except the math might be grungier:

Suppose v_A and v_B are independent. Then the pmf of v is the convolution of the pmfs of v_A and v_B:

P(v = k) = \sum_n P(v_A = n) P(v_B = k - n)

and you would write

P(v_A = n) = \sum_j P(v_A = n \mid v = j) P(v = j)

and similarly for v_B

Then use the fact that v_A and v_B are conditionally binomial, given v = j. Hopefully you could then do a bunch of simplification and solve for P(v = k) and show it to be Poisson.
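The target of that simplification can at least be confirmed in the forward direction with a few lines of Python (again with hypothetical λ = 3, p = 0.4): convolving a Poisson(λp) pmf with a Poisson(λq) pmf gives back Poisson(λ).

```python
from math import exp, factorial

lam, p = 3.0, 0.4   # hypothetical parameters, not from the problem
q = 1 - p

def poisson_pmf(k, mu):
    return exp(-mu) * mu**k / factorial(k) if k >= 0 else 0.0

def convolved(k):
    # P(v = k) = sum_n P(v_A = n) P(v_B = k - n)
    return sum(poisson_pmf(n, lam * p) * poisson_pmf(k - n, lam * q)
               for n in range(k + 1))

# the convolution should reproduce the Poisson(lam) pmf
for k in range(8):
    assert abs(convolved(k) - poisson_pmf(k, lam)) < 1e-12
```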

If you get a chance, I'll be curious to know if that's the solution your instructor gives, or if there is a simpler way.
 
