Integrating Difficult Gaussian Integrals for Multivariate Normal Distributions

Summary
The discussion centers on evaluating a complex integral involving multivariate normal distributions, specifically the expectation value of a product of functions multiplied by an exponential term. The user is attempting to simplify the integral by expanding the functions into a power series and considering the moment generating function, but finds this approach unhelpful. Suggestions include transforming the random vector into independent variables and using Isserlis's theorem for simplification. Another proposed method involves rewriting the hyperbolic cosine function in terms of exponentials, allowing for the completion of the square in the integral. Overall, while the problem is intricate, several viable strategies for evaluation are discussed.
unchained1978
I'm dealing with multivariate normal distributions, and I've run up against an integral I really don't know how to do.

Given a random vector \vec x, and a covariance matrix \Sigma, how would you go about evaluating an expectation value of the form
G=\int d^{n} x \left(\prod_{i=1}^{n} f_{i}(x_{i})\right) e^{-\frac{1}{2} \vec x \cdot \Sigma^{-1}\cdot \vec x}
I've tried expanding the f_{i}'s into a power series, and then using the moment generating function to obtain the powers of x_{i}, but this really doesn't simplify the problem much. The function I'm currently considering is f_{i}(x_{i})=\cosh(x_{i}).
I would greatly appreciate anyone's input on this problem.
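For reference, the ##n=1## case has a closed form: with a hypothetical variance ##s^2##, completing the square gives ##\int \cosh(x)\, e^{-x^2/(2s^2)}\, dx = \sqrt{2\pi}\, s\, e^{s^2/2}##, which any proposed method should reproduce. A minimal pure-Python sketch comparing this against direct quadrature:

```python
import math

def g_numeric(s, half_width=40.0, steps=200_000):
    """Midpoint-rule estimate of the n=1 integral: ∫ cosh(x) exp(-x^2/(2 s^2)) dx."""
    h = 2 * half_width / steps
    total = 0.0
    for k in range(steps):
        x = -half_width + (k + 0.5) * h
        total += math.cosh(x) * math.exp(-x * x / (2 * s * s))
    return total * h

def g_closed(s):
    """Closed form from completing the square: sqrt(2*pi) * s * exp(s^2 / 2)."""
    return math.sqrt(2 * math.pi) * s * math.exp(s * s / 2)

print(g_numeric(1.0), g_closed(1.0))
```

The two values agree to quadrature accuracy; the `half_width` and `steps` values are illustrative choices, not part of the original problem.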
 
Transform the random vector x into a random vector y consisting of independent random variables; each x_i is then a linear combination of the y_j's. cosh(x_i) becomes cosh of that linear combination, which can be expanded with the addition formula cosh(a+b) = cosh(a)cosh(b) + sinh(a)sinh(b) (it does not factor into a simple product of cosh's), and the resulting one-dimensional integrals over the independent y_j's can then be evaluated.

As you can see, it may be quite messy, but doable.
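The transformation step can be made concrete. A minimal pure-Python sketch, assuming a hypothetical 2×2 covariance matrix and a hand-rolled Cholesky factorization ##\Sigma = L L^T## (so that ##\vec x = L \vec y## with the ##y_j## independent standard normals):

```python
import math
import random

# Hypothetical example covariance matrix (not from the original problem).
sigma = [[1.0, 0.5],
         [0.5, 1.0]]

# Cholesky factor of a 2x2 symmetric positive-definite matrix, done by hand:
# Sigma = L L^T with L lower triangular.
l00 = math.sqrt(sigma[0][0])
l10 = sigma[1][0] / l00
l11 = math.sqrt(sigma[1][1] - l10 * l10)

# Draw independent standard normals y and map to x = L y, which has covariance Sigma.
random.seed(0)
y = [random.gauss(0, 1), random.gauss(0, 1)]
x = [l00 * y[0], l10 * y[0] + l11 * y[1]]
print(x)
```

Note that ##x_2 = L_{21} y_1 + L_{22} y_2##, so cosh(x_2) has a linear combination of the independent variables as its argument.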
 
I thought about that, but the argument of the cosh becomes a sum, which really doesn't simplify things at all. Thanks, though. I've also thought about using Isserlis's theorem (also known as Wick's theorem), but I can't find a statement of the theorem that's more precise than Wikipedia's.
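For reference, Isserlis's theorem (Wick's theorem) for a zero-mean Gaussian vector can be stated precisely as follows: all odd moments vanish, and the even moments are

$$\langle x_{i_1} x_{i_2} \cdots x_{i_{2k}}\rangle = \sum_{p} \prod_{\{a,b\} \in p} \Sigma_{i_a i_b},$$

where the sum runs over all ##(2k-1)!!## ways of partitioning the indices ##i_1,\dots,i_{2k}## into unordered pairs. For example, ##\langle x_1 x_2 x_3 x_4 \rangle = \Sigma_{12}\Sigma_{34} + \Sigma_{13}\Sigma_{24} + \Sigma_{14}\Sigma_{23}##.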
 
Mute said:
Another option would be to write ##\cosh(x_i) = (e^{x_i}+e^{-x_i})/2##; expanding out the product, the overall integral takes the form

$$G \propto \sum_{\sigma}\int d^{n} x \exp\left[-\frac{1}{2} \vec x \cdot \Sigma^{-1}\cdot \vec x + \vec J_\sigma \cdot \vec x\right],$$
where ##\vec J_\sigma## is a vector with entries ##\pm 1## and ##\sigma## runs over all ##2^n## assignments of the signs. You can now complete the square and evaluate each of the integrals in the sum. I don't know if you will be able to write the overall result in a nice, closed form, though.
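Completing the square in each term uses the standard Gaussian integral

$$\int d^{n} x \exp\left[-\frac{1}{2} \vec x \cdot \Sigma^{-1}\cdot \vec x + \vec J\cdot \vec x\right] = (2\pi)^{n/2}\sqrt{\det\Sigma}\,\exp\left[\frac{1}{2}\vec J\cdot\Sigma\cdot\vec J\right],$$

so the normalized expectation reduces to ##2^{-n}\sum_\sigma \exp(\frac{1}{2}\vec J_\sigma\cdot\Sigma\cdot\vec J_\sigma)##. A minimal pure-Python sketch of that sign sum, assuming a hypothetical 2×2 covariance:

```python
import math
from itertools import product

# Sign-sum formula: E[prod_i cosh(x_i)] under N(0, Sigma) equals
# 2^(-n) * sum over sign vectors J of exp(J^T Sigma J / 2).
def expected_cosh_product(sigma):
    n = len(sigma)
    total = 0.0
    for signs in product((-1.0, 1.0), repeat=n):  # all 2^n sign vectors J_sigma
        quad = sum(signs[i] * sigma[i][j] * signs[j]
                   for i in range(n) for j in range(n))  # J^T . Sigma . J
        total += math.exp(0.5 * quad)
    return total / 2 ** n

# Hypothetical example covariance (not from the original problem).
sigma = [[1.0, 0.5],
         [0.5, 1.0]]
print(expected_cosh_product(sigma))
```

For the unnormalized ##G## in the original post, multiply the result by ##(2\pi)^{n/2}\sqrt{\det\Sigma}##. As a consistency check, for ##\Sigma = I## the formula gives ##e^{n/2}##, matching the independent-variable product of ##\mathbb{E}[\cosh(x_i)] = e^{1/2}##.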
 
Mute's point is well taken. Once you write cosh(xi) = (exp(xi) + exp(-xi))/2, either approach works.
 