# Conditional Probabilities relating quadratic forms of random variables

### Tez

Well, I'm getting pretty frustrated by this problem, which arose in my research, so I'm hoping someone here might set me on the right track.

I start with n random variables x_i, i = 1..n, each independently normally distributed with mean 0 and variance 1.

I now have two different functions of those random variables f(x1,x2...) and g(x1,x2,...). Moreover these functions can be written as quadratic forms in the random variables, i.e. there exist real symmetric matrices A, B such that
f=x'*A*x
g=x'*B*x
where x is a vector of the x_i and by ' I mean transposition.

Well, if you want to ask questions like "What is the probability that f(x) > delta?", then there is an old paper by Imhof which provides a nice analytic answer; here it is:
http://www.physicsnerd.com/Imhof.pdf

However I have a more complicated problem. I want to know the following conditional probability:

What is the probability that f(x) > 0, given that g(x) > 0?

Now the thing is, the matrices A, B are easily diagonalizable, though they don't commute. I don't think the specific form of the matrices matters, but in case it helps, here are two matrices from one n=6 instance of the problem:
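As a numerical sanity check on whatever analytic answer turns up, the conditional probability factors as P(f>0 and g>0)/P(g>0), and both pieces are easy to estimate by Monte Carlo since x is just a standard normal vector. A minimal sketch (the 2x2 example matrices at the bottom are made up for illustration, not the ones from this problem):

```python
import numpy as np

def cond_prob_mc(A, B, n_samples=200_000, seed=0):
    """Monte Carlo estimate of P(x'Ax > 0 | x'Bx > 0) for x ~ N(0, I)."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    x = rng.standard_normal((n_samples, n))
    f = np.einsum("si,ij,sj->s", x, A, x)  # f = x'Ax for each sample row
    g = np.einsum("si,ij,sj->s", x, B, x)  # g = x'Bx for each sample row
    keep = g > 0                           # condition on the event g > 0
    return np.mean(f[keep] > 0)            # = P(f>0 and g>0) / P(g>0)

# Hypothetical small example (arbitrary symmetric matrices):
A = np.array([[1.0, 0.5],
              [0.5, -1.0]])
B = np.diag([1.0, -1.0])
print(cond_prob_mc(A, B))
```

This obviously won't replace an analytic answer a la Imhof, but it gives a cheap check on one.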

A=
[cos(theta)^2-epsilon, 0, 1/2*sin(2*theta), 0, 0, 0],
[0, cos(theta)^2-epsilon, 0, 1/2*sin(2*theta), 0, 0],
[1/2*sin(2*theta), 0, sin(theta)^2-epsilon, 0, 0, 0],
[0, 1/2*sin(2*theta), 0, sin(theta)^2-epsilon, 0, 0],
[0, 0, 0, 0, -epsilon, 0],
[0, 0, 0, 0, 0, -epsilon]

theta and epsilon are two fixed but otherwise arbitrary parameters (0<epsilon<1).

B=diag(1,1,-1,-1)
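For concreteness, here is a short script (my own, with arbitrarily chosen theta and epsilon) that builds the A above and diagonalizes it numerically. Its spectrum comes out to 1-epsilon with multiplicity 2 and -epsilon with multiplicity 4, independent of theta, which is consistent with A being easily diagonalizable:

```python
import numpy as np

theta, eps = 0.7, 0.3  # arbitrary choices; any theta and 0 < eps < 1 will do

c2, s2, s2t = np.cos(theta)**2, np.sin(theta)**2, 0.5 * np.sin(2 * theta)
A = np.array([
    [c2 - eps, 0,        s2t,      0,        0,    0],
    [0,        c2 - eps, 0,        s2t,      0,    0],
    [s2t,      0,        s2 - eps, 0,        0,    0],
    [0,        s2t,      0,        s2 - eps, 0,    0],
    [0,        0,        0,        0,        -eps, 0],
    [0,        0,        0,        0,        0,    -eps],
])

# A is real symmetric, so eigvalsh gives its spectrum (ascending order).
eigvals = np.linalg.eigvalsh(A)
print(np.round(eigvals, 6))  # -eps (x4) and 1-eps (x2), theta-independent
```

The theta-independence follows from each 2x2 coupled block having trace 1-2*eps and determinant eps*(eps-1), so its eigenvalues are 1-eps and -eps regardless of theta.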

If anything comes of it, I'll certainly acknowledge anyone who helps in the paper.

Thanks
Tez