## Joint and conditional distributions

I'm having a problem evaluating a distribution.

Suppose X and Y are independent chi-square random variables, and a is some
constant greater than 0. They are not identically distributed (they have different degrees of freedom).
I want to find

P(X>a, X-Y>0). So I use the definition of conditional probability to write

P(X>a, X-Y>0)
= P(X>a | X-Y>0) * P(X-Y>0)
= P(X>a | X>Y) * P(X>Y)

Now I have an expression for P(X>a) and P(X>Y), but I am at a
loss as to how to evaluate the conditional distribution P(X>a|
X>Y).

I figured out that if Y were a constant (rather than a random variable), then I could write

P(X>a | X>Y) = { 1               if Y >= a
               { P(X>a)/P(X>Y)   if Y < a

But this does not help evaluate the probability, because it requires knowledge of the value of the random variable Y.
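Since X and Y are independent, the quantity in question can at least be sanity-checked by simulating both variables directly. A minimal Monte Carlo sketch (the degrees of freedom 5 and 3 and the threshold a = 2.0 are hypothetical values chosen only for illustration):

```python
import random

def chi2_sample(k, rng):
    # A chi-square variate with k degrees of freedom is the sum of
    # k squared independent standard normals.
    return sum(rng.gauss(0.0, 1.0) ** 2 for _ in range(k))

def estimate_joint(a, k_x, k_y, n=100_000, seed=0):
    """Monte Carlo estimate of P(X > a, X > Y) for independent
    chi-square X (k_x dof) and Y (k_y dof)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        x = chi2_sample(k_x, rng)
        y = chi2_sample(k_y, rng)
        if x > max(a, y):  # the event (X > a, X > Y)
            hits += 1
    return hits / n

print(estimate_joint(a=2.0, k_x=5, k_y=3))
```

This gives a numerical target to compare any closed-form attempt against, at the cost of Monte Carlo error of order 1/sqrt(n).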

Any help will be much appreciated.
Why are you going through the conditional probability formula? (X>a, X>Y) holds if and only if X > max{a,Y}, i.e. X - max{a,Y} > 0. Just an observation.

> Why are you going through the conditional probability formula? (X>a, X>Y) if and only if (X > max{a,Y}) or (X - max{a,Y} > 0). Just an observation
Thanks, that is a good observation. So now I know that

P(X>a,X>Y) = P(X>max(a,Y)). I would like to express this as some function of P(X>a) and P(X>Y) . That is, I know that

P(X>a, X>Y) = [ P(X>a)   if a > Y
              [ P(X>Y)   if a < Y

but I only know a, not Y (since Y is an RV). So, in other words, is there a way to determine the 'threshold' at which P(X>a,X>Y) changes from P(X>a) to P(X>Y)?
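One way to handle the random threshold is to condition on Y and average over it with the law of total probability. A sketch of this step, writing f_Y for the density of Y and using the independence of X and Y:

```latex
P(X>a,\; X>Y)
  = \int_0^\infty P\bigl(X > \max(a,y)\bigr)\, f_Y(y)\, dy
  = P(X>a)\, P(Y \le a) \;+\; \int_a^\infty P(X>y)\, f_Y(y)\, dy .
```

The first term collects the region Y <= a, where max(a,Y) = a, and the second collects Y > a; so rather than a single deterministic threshold, the switch at Y = a gets averaged over the distribution of Y.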
