Transformation of pmf; bivariate to single-variate

  • Context: Graduate
  • Thread starter: rayge
  • Tags: Transformation

Discussion Overview

The discussion revolves around the transformation of a probability mass function (pmf) from a bivariate distribution involving two independent binomial random variables, X_1 and X_2, to a single-variate distribution for the random variable Y defined as Y = X_1 - X_2 + n_2. Participants explore various methods to demonstrate that Y has a binomial distribution with parameters n = n_1 + n_2 and p = 1/2, including joint distributions and moment generating functions (mgf).

Discussion Character

  • Exploratory
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • One participant expresses difficulty with transformations and proposes defining a joint distribution of Y and Z (where Z = X_2) to sum over values of Z, but encounters complex algebra.
  • Another participant suggests redefining Z as n_2 - X_2, which may simplify the problem, leading to the expression f(y) = ∑(z=0 to n_2) (n_1 choose y-z)(n_2 choose n_2-z)(1/2)^(n_1+n_2).
  • A participant attempts to manipulate the sum to show it equals f(y) = (n_1+n_2 choose y)(1/2)^(n_1+n_2) but struggles to find the algebraic connection between the two forms.
  • Another participant corrects the formula for f(y) to f(y) = ∑(z=0 to n_2) (n_1 choose y-z)(n_2 choose z)(1/2)^(n_1+n_2), suggesting this might lead to a clearer path to the solution.
  • A later reply mentions using the moment generating function E(e^(X_1 t - X_2 t + n_2 t)) as a potentially easier approach than the transformation initially considered.

Areas of Agreement / Disagreement

Participants present multiple approaches and corrections, indicating that there is no consensus on the best method to demonstrate the transformation or on the correctness of the algebra involved. The discussion remains unresolved with competing viewpoints on the approach to take.

Contextual Notes

Some assumptions about the independence of the random variables and the properties of binomial distributions are implicit in the discussion. The algebraic manipulations and identities referenced may depend on specific interpretations or definitions that are not fully clarified.

rayge
Transformations always give me trouble, but this one does in particular.

Assume X_1, X_2 independent with binomial distributions of parameters n_1, n_2, and p=1/2 for each.

Show Y = X_1 - X_2 + n_2 has a binomial distribution with parameters n= n_1 + n_2, p = 1/2.
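Before wading into the algebra, a quick Monte Carlo sanity check makes the claim concrete (a sketch in Python; the function name and the choice of n_1 = 3, n_2 = 2 are illustrative, not from the thread):

```python
import random
from math import comb

def empirical_pmf_of_y(n1, n2, trials=100_000, seed=0):
    """Estimate P(Y = y) for Y = X1 - X2 + n2, where X1 ~ Bin(n1, 1/2)
    and X2 ~ Bin(n2, 1/2) are independent. Y ranges over 0..n1+n2."""
    rng = random.Random(seed)
    counts = [0] * (n1 + n2 + 1)
    for _ in range(trials):
        x1 = sum(rng.random() < 0.5 for _ in range(n1))  # Bin(n1, 1/2) draw
        x2 = sum(rng.random() < 0.5 for _ in range(n2))  # Bin(n2, 1/2) draw
        counts[x1 - x2 + n2] += 1
    return [c / trials for c in counts]

n1, n2 = 3, 2
est = empirical_pmf_of_y(n1, n2)
# Claimed pmf: Bin(n1 + n2, 1/2)
theory = [comb(n1 + n2, y) * 0.5 ** (n1 + n2) for y in range(n1 + n2 + 1)]
```

The empirical frequencies should track the Bin(n_1 + n_2, 1/2) pmf to within sampling noise, which is what the problem asks us to prove exactly.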

My first instinct was to pick a variable Z = X_2, define the joint distribution of Y and Z, and sum over all values of Z. I ran into some complex algebra when summing this joint distribution over all values of Z, from 0 to n_2. If anyone knows how to sum over all values of z in (n_1 choose y+z-n_2)(n_2 choose z) so as to get (n_1 + n_2 choose y), I would love to hear how, but I'm pretty sure this is a no-go.

My next thought was to still choose Z = X_2, but this time get the mgf of Y and Z. This boils down to (1/2 + exp(t_1)/2)^{n_1}(1/2 + exp(t_2)/2)^{n_2}. When I set t = t_1 + t_2, I get an mgf which fits what we're looking for, i.e. the binomial distribution with parameters n = n_1 + n_2 and p = 1/2. But I don't know whether that is valid algebra, as a means of obtaining the mgf of a univariate distribution from the mgf of a bivariate distribution.

Any thoughts welcome!
 
Let Z = n_2 - X_2. Then Z is binomial with the same parameters as X_2, and Y = X_1 + Z. This should be easier.
 
Thanks for the suggestion. From this I get:
$$f(y)=\sum_{z=0}^{n_2} \binom{n_1}{y-z}\binom{n_2}{n_2-z}\Big(\frac{1}{2}\Big)^{n_1+n_2}$$
What I want eventually is this:
$$f(y)=\binom{n_1+n_2}{y}\Big(\frac{1}{2}\Big)^{n_1+n_2}$$
I want very much to snap my fingers and call these equal, but I don't see it. Expanding the sum I get (apologies if I messed this up):
$$\Big(\frac{1}{2}\Big)^{n_1+n_2}\left(\frac{n_1!\,n_2!}{y!\,(n_1-y)!\,n_2!\,0!}+\frac{n_1!\,n_2!}{(y-1)!\,(n_1-y+1)!\,(n_2-1)!\,1!}+\frac{n_1!\,n_2!}{(y-2)!\,(n_1-y+2)!\,(n_2-2)!\,2!}+\cdots+\frac{n_1!\,n_2!}{(y-n_2)!\,(n_1-y+n_2)!\,0!\,n_2!}\right)$$
The expansion of what I'm looking for looks like this:
$$\Big(\frac{1}{2}\Big)^{n_1+n_2}\frac{(n_1+n_2)!}{y!\,(n_1+n_2-y)!}$$
Is there some kind of algebra magic I'm missing to get these to equal each other?
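For what it's worth, the two expressions can be compared numerically before hunting for the algebra (a small Python sketch; the helper names and the choice of n_1 = 4, n_2 = 3 are mine, not from the thread):

```python
from math import comb

def c(n, k):
    # Binomial coefficient extended to return 0 for k < 0.
    # math.comb already returns 0 for k > n, but raises on negative k.
    return comb(n, k) if k >= 0 else 0

def convolution_sum(n1, n2, y):
    # The sum in the post above: sum over z of C(n1, y-z) * C(n2, n2-z)
    return sum(c(n1, y - z) * c(n2, n2 - z) for z in range(n2 + 1))

n1, n2 = 4, 3
results = [(convolution_sum(n1, n2, y), comb(n1 + n2, y))
           for y in range(n1 + n2 + 1)]
```

For every y from 0 to n_1 + n_2 the pair of values agrees, so the two forms really are equal; the "algebra magic" asked about is a binomial-coefficient identity rather than direct factorial manipulation.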

Or maybe you were suggesting using this for the mgf approach? I'm still not sure about setting t_1 = t_2 (sorry for the typo before).
 
I believe the correct formula for f(y) should be:

$$f(y)=\sum_{z=0}^{n_2} \binom{n_1}{y-z}\binom{n_2}{z}\Big(\frac{1}{2}\Big)^{n_1+n_2}$$
 
Indeed! Anyone curious about how to get the answer from this, check out the Chu-Vandermonde Identity.

(I ended up using the moment generating function E(e^{X_1 t - X_2 t + n_2 t}), which was easier than the transformation I had been doing.)
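For later readers, that mgf computation can be spelled out. Using the independence of X_1 and X_2 and the standard Bin(n, 1/2) mgf M(t) = (1/2 + e^t/2)^n, both of which appear earlier in the thread:

```latex
\begin{aligned}
E\!\left(e^{tY}\right)
  &= E\!\left(e^{t(X_1 - X_2 + n_2)}\right)
   = e^{t n_2}\, E\!\left(e^{t X_1}\right) E\!\left(e^{-t X_2}\right) \\
  &= e^{t n_2}\left(\tfrac{1}{2} + \tfrac{1}{2}e^{t}\right)^{n_1}
     \left(\tfrac{1}{2} + \tfrac{1}{2}e^{-t}\right)^{n_2} \\
  &= \left(\tfrac{1}{2} + \tfrac{1}{2}e^{t}\right)^{n_1}
     \left(\tfrac{1}{2}e^{t} + \tfrac{1}{2}\right)^{n_2}
   = \left(\tfrac{1}{2} + \tfrac{1}{2}e^{t}\right)^{n_1 + n_2},
\end{aligned}
```

which is exactly the mgf of a binomial distribution with parameters n = n_1 + n_2 and p = 1/2, and uniqueness of mgfs finishes the argument.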
 
