# Mutual Independence of Functions of Independent Gammas

1. Feb 22, 2012

### SpringPhysics

1. The problem statement, all variables and given/known data
Let Y_i = (Z_1 + ... + Z_i)/(Z_1 + ... + Z_{i+1}) for i = 1, ..., n, and Y_{n+1} = Z_1 + ... + Z_{n+1},
where the Z_i are independent gamma(p_i) random variables for i = 1, ..., n+1.

Prove that the Yi's are mutually statistically independent.

2. Relevant equations
U ~ Dirichlet(p_1, ..., p_n; p_{n+1}) iff U = Z/T, where Z = (Z_1, ..., Z_n) is the vector of the first n independent gammas and T = Z_1 + ... + Z_{n+1} is the sum of all n+1 of them.
Moreover, U is statistically independent of T.

If X is statistically independent of Y, then f(X) is statistically independent of g(Y) for all (measurable) functions f and g.
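As a numerical sanity check (not a proof) of the Dirichlet/gamma independence property above, the sketch below samples two independent gammas with arbitrary illustrative shape parameters p1, p2 and verifies that the empirical correlation between U = Z1/(Z1+Z2) and T = Z1+Z2 is near zero. (Zero correlation is necessary for independence, though not sufficient.)

```python
# Empirical sanity check of the beta-gamma independence property:
# for independent gammas Z1, Z2, U = Z1/(Z1+Z2) is independent of T = Z1+Z2.
# The shape parameters p1, p2 are arbitrary illustrative choices.
import random

random.seed(42)
p1, p2, n_samples = 2.0, 3.0, 50_000

us, ts = [], []
for _ in range(n_samples):
    z1 = random.gammavariate(p1, 1.0)
    z2 = random.gammavariate(p2, 1.0)
    us.append(z1 / (z1 + z2))
    ts.append(z1 + z2)

def corr(xs, ys):
    """Pearson correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    vx = sum((x - mx) ** 2 for x in xs) / n
    vy = sum((y - my) ** 2 for y in ys) / n
    return cov / (vx * vy) ** 0.5

r = corr(us, ts)
print(f"corr(U, T) = {r:.4f}")  # should be close to 0
```

With 50,000 samples the sampling error of the correlation is on the order of 1/sqrt(50000), so a value well inside (-0.05, 0.05) is consistent with independence.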

3. The attempt at a solution
I was thinking of using induction, but I don't know what the base case should be. The case n = 1 seems too trivial to be representative of the problem, and I have no idea how to approach n = 2 without verifying directly that the joint density factors.

Would n = 1 suffice as the base case? If so, the n = 2 case would have the same form as the induction step from n to n+1.

Is there a theorem covering particular cases where pairwise independence implies mutual independence? Pairwise independence is easy to prove for these random variables. If not: Y_n is independent of Y_{n+1} (by the property of the Dirichlet distribution, plus the fact that functions of independent random variables are independent), so could I apply functions that break these two random variables into pieces whose mutual independence implies that of all the Y_i's? Or do I need to fall back on the definition of mutual independence and express the variables in terms of one another?

EDIT: I managed to prove mutual independence for n = 2, so now I only need the induction step to n+1. I am currently attempting a change of variables (starting from the joint density of the n+1 independent gammas) to do so. Also, does anyone have hints on the distribution of powers of beta-distributed random variables? (Specifically, of Y_i for i = 1, ..., n when every Z_i ~ gamma(1).)
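On the powers-of-beta question: when every Z_i ~ gamma(1), the Z_i are standard exponentials, and (assuming the standard beta-gamma result) Y_i = S_i/S_{i+1} ~ Beta(i, 1) with S_k = Z_1 + ... + Z_k. Since P(Y_i^i <= u) = P(Y_i <= u^{1/i}) = u, the power Y_i^i should be Uniform(0, 1). A quick simulation to sanity-check this (a sketch, not a proof):

```python
# Numerical check of the hint about powers of betas:
# with p_i = 1 the Z_i are standard exponentials, Y_i = S_i/S_{i+1} ~ Beta(i, 1),
# and therefore Y_i**i should be Uniform(0, 1).
import random

random.seed(7)
n, n_samples = 4, 50_000

powered = [[] for _ in range(n)]
for _ in range(n_samples):
    z = [random.expovariate(1.0) for _ in range(n + 1)]   # Z_1, ..., Z_{n+1}
    s = [sum(z[: k + 1]) for k in range(n + 1)]           # partial sums S_1..S_{n+1}
    for i in range(1, n + 1):
        y_i = s[i - 1] / s[i]                             # Y_i = S_i / S_{i+1}
        powered[i - 1].append(y_i ** i)                   # candidate Uniform(0,1)

for i, vals in enumerate(powered, start=1):
    mean = sum(vals) / len(vals)
    print(f"mean of Y_{i}^{i}: {mean:.3f}")               # each should be near 0.5
```

Each sample mean should sit very close to 1/2 (the Uniform(0, 1) mean), with deviations on the order of 0.29/sqrt(50000).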

EDIT 2: My professor said that we should not be using density functions to prove mutual independence... So is there a theorem saying that if a subset of the random variables is mutually independent, and all the variables in the set are pairwise independent, then the random variables in the entire set are mutually independent?
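For what it's worth, here is one possible density-free sketch of the induction step (just one route, using only the two facts listed under "Relevant equations"). Write S_k = Z_1 + ... + Z_k, so that Y_i = S_i/S_{i+1} and Y_{n+1} = S_{n+1}.

```latex
\textbf{Induction hypothesis} (for $n$ gammas): $Y_1, \dots, Y_{n-1}, S_n$ are
mutually independent, where $S_k = Z_1 + \cdots + Z_k$.

\textbf{Step 1.} $S_n \sim \mathrm{gamma}(p_1 + \cdots + p_n)$ is independent of
$Z_{n+1}$, so by the Dirichlet/beta--gamma property,
\[
  Y_n = \frac{S_n}{S_n + Z_{n+1}}
  \quad\text{is independent of}\quad
  Y_{n+1} = S_{n+1}.
\]

\textbf{Step 2.} $(Y_1, \dots, Y_{n-1})$ is a function of $(Z_1, \dots, Z_n)$;
by the induction hypothesis it is independent of $S_n$, and $Z_{n+1}$ is
independent of $(Z_1, \dots, Z_n)$, hence
\[
  (Y_1, \dots, Y_{n-1}) \perp (S_n, Z_{n+1}).
\]

\textbf{Step 3.} $(Y_n, Y_{n+1})$ is a function of $(S_n, Z_{n+1})$, so
$(Y_1, \dots, Y_{n-1}) \perp (Y_n, Y_{n+1})$; combined with Step 1 and the
induction hypothesis, $Y_1, \dots, Y_{n+1}$ are mutually independent.
```

Note this is exactly the pattern you asked about: a mutually independent subset, plus independence between that subset and the rest, upgraded to mutual independence of the whole collection via grouping (no densities needed).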

Last edited: Feb 22, 2012
