1. The problem statement, all variables and given/known data

Let Y_{i} = (Z_{1} + ... + Z_{i})/(Z_{1} + ... + Z_{i+1}) for i = 1, ..., n, and Y_{n+1} = Z_{1} + ... + Z_{n+1},

where the Z_{i} are independent with Z_{i} ~ gamma(p_{i}) for i = 1, ..., n+1.

Prove that the Y_{i}'s are mutually statistically independent.
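(Before trying to prove this, a quick numerical sanity check — a minimal Monte Carlo sketch with arbitrarily chosen shape parameters, checking that sample correlations among the Y's are near zero, which is necessary but of course not sufficient for independence:)

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3
p = [2.0, 3.0, 1.5, 2.5]   # arbitrary shape parameters p_1, ..., p_{n+1}
N = 200_000

# Z[:, j] ~ Gamma(p[j]), independent columns
Z = np.column_stack([rng.gamma(shape=pi, size=N) for pi in p])
S = np.cumsum(Z, axis=1)            # S[:, i] = Z_1 + ... + Z_{i+1}
Y = S[:, :-1] / S[:, 1:]            # Y_1, ..., Y_n (partial-sum ratios)
Y_last = S[:, -1]                   # Y_{n+1} = Z_1 + ... + Z_{n+1}

# Sample correlations among (Y_1, ..., Y_n, Y_{n+1}) should be ~0
# if the claim holds (a necessary, not sufficient, condition).
allY = np.column_stack([Y, Y_last])
corr = np.corrcoef(allY, rowvar=False)
off_diag = corr[~np.eye(n + 1, dtype=bool)]
print("max |corr| =", np.abs(off_diag).max())
```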

2. Relevant equations

U ~ Dirichlet(p_{1}, ..., p_{n}; p_{n+1}) iff U = Z/T, where Z is the vector of the first n of the independent gammas and T is the sum of all n+1 independent gammas.

Then U is statistically independent of T.

If X is statistically independent of Y, then f(X) is statistically independent of g(Y) for all (measurable) functions f and g.
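(For reference, the n = 1 instance of the Dirichlet fact is the standard beta–gamma result, which would serve as a base case:)

```latex
Z_1 \sim \mathrm{Gamma}(a),\ Z_2 \sim \mathrm{Gamma}(b)\ \text{independent}
\;\Longrightarrow\;
\frac{Z_1}{Z_1+Z_2} \sim \mathrm{Beta}(a,b),\quad
Z_1+Z_2 \sim \mathrm{Gamma}(a+b),
```

and the ratio is independent of the sum.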

3. The attempt at a solution

I was thinking of using induction, but I don't know what the base case should be: n = 1 seems too trivial to be representative of the problem, and I have no idea how to approach n = 2 directly, since that would require verifying that the joint density factors.

Would n = 1 suffice as the base case? If so, the n = 2 case would have the same form as the induction step from n to n + 1.

Is there a theorem covering particular cases where pairwise independence implies mutual independence? Pairwise independence is easy to prove for these random variables. If not: given that, for example, Y_{n} is independent of Y_{n+1} (by the property of the Dirichlet distribution and the fact that functions of independent random variables are independent), could I apply a function that decomposes the two random variables into something meaningful that would show mutual independence of all the Y_{i}'s? Or do I need to fall back on the definition of mutual independence and express the variables in terms of one another?
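To make the induction concrete, here is the structure I suspect works using only the two relevant equations above (a sketch, not a verified proof). Write S_{i} for the partial sums; each of Y_{1}, ..., Y_{i} depends on Z_{1}, ..., Z_{i+1} only through the ratios Z_{j}/S_{i+1}, so it is a function of the Dirichlet vector and hence independent of S_{i+1}:

```latex
S_i := Z_1 + \cdots + Z_i, \qquad Y_i = S_i / S_{i+1}.
% (Y_1, ..., Y_i) is a function of (Z_1/S_{i+1}, ..., Z_{i+1}/S_{i+1}),
% which is independent of S_{i+1} by the Dirichlet property, and the
% remaining gammas are independent of (Z_1, ..., Z_{i+1}), so:
(Y_1,\dots,Y_i) \;\perp\; \bigl(S_{i+1},\, Z_{i+2},\dots,Z_{n+1}\bigr),
% while the remaining Y's are functions of that right-hand group:
(Y_{i+1},\dots,Y_{n+1}) = g\bigl(S_{i+1},\, Z_{i+2},\dots,Z_{n+1}\bigr),
```

so induction on i would give mutual independence without touching any densities.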

EDIT: I managed to prove mutual independence for n = 2, so now I only have to prove the induction step to n + 1. I am currently attempting a change of variables (using the joint density of the n + 1 independent gammas) to do so. Also, would anyone have any hints for the distribution of powers of beta-distributed random variables? (Specifically, Y^{i} for i = 1, ..., n in the gamma(1) case.)
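For what it's worth, here is the change-of-variables computation I have in mind for the power question (assuming Y ~ Beta(a, 1), which is what each ratio should be when every p_{j} = 1, with a = i):

```latex
Y \sim \mathrm{Beta}(a,1) \;\Longrightarrow\; F_Y(y) = y^{a} \text{ on } (0,1),
\quad\text{so for } k > 0:\quad
P\bigl(Y^{k} \le t\bigr) = P\bigl(Y \le t^{1/k}\bigr) = t^{a/k},
```

i.e. Y^{k} ~ Beta(a/k, 1); in particular Y^{a} would be Uniform(0, 1).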

EDIT 2: My professor said that we should not be using density functions to prove mutual independence... so is there a theorem that says that if a subset of the random variables is mutually independent, and all the variables in the set are pairwise independent, then the entire set is mutually independent?

**Physics Forums | Science Articles, Homework Help, Discussion**


# Mutual Independence of Functions of Independent Gammas

