Integrating Difficult Gaussian Integrals for Multivariate Normal Distributions

SUMMARY

This discussion focuses on evaluating the integral of a multivariate normal distribution involving the function \( f_{i}(x_{i}) = \cosh(x_{i}) \). The integral is expressed as \( G=\int d^{n} x \left(\prod_{i=1}^{n} f_{i}(x_{i})\right) e^{-\frac{1}{2} \vec x \cdot \Sigma^{-1}\cdot \vec x} \). Participants suggest transforming the random vector \( \vec x \) into independent random variables \( \vec y \) and utilizing Isserlis's theorem for simplification. An alternative method involves rewriting \( \cosh(x_i) \) in terms of exponential functions, allowing for the completion of the square to evaluate the integral.

PREREQUISITES
  • Understanding of multivariate normal distributions
  • Familiarity with Isserlis's theorem (Wick's theorem)
  • Knowledge of moment generating functions
  • Proficiency in integral calculus and transformations of random variables
NEXT STEPS
  • Study Isserlis's theorem in detail for applications in multivariate integrals
  • Learn about transformations of random variables in multivariate distributions
  • Explore the properties and applications of moment generating functions
  • Investigate techniques for completing the square in integrals
USEFUL FOR

Mathematicians, statisticians, and data scientists working with multivariate normal distributions, particularly those involved in advanced statistical modeling and integral evaluations.

unchained1978
I'm dealing with multivariate normal distributions, and I've run up against an integral I really don't know how to do.

Given a random vector ##\vec x## and a covariance matrix ##\Sigma##, how would you go about evaluating an expectation value of the form
$$G=\int d^{n} x \left(\prod_{i=1}^{n} f_{i}(x_{i})\right) e^{-\frac{1}{2} \vec x \cdot \Sigma^{-1}\cdot \vec x}?$$
I've tried expanding the ##f_{i}##'s in a power series and then using the moment generating function to obtain the expectations of the powers of ##x_{i}##, but this doesn't simplify the problem much. The function I'm currently considering is ##f_{i}(x_{i})=\cosh(x_{i})##.
I would greatly appreciate anyone's input on this problem.
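
For what it's worth, the one-dimensional case does close up: writing ##\cosh x = (e^{x} + e^{-x})/2## and completing the square in each exponent gives
$$\int_{-\infty}^{\infty} \cosh(x)\, e^{-x^{2}/(2\sigma^{2})}\, dx = \frac{1}{2}\int_{-\infty}^{\infty} \left(e^{x} + e^{-x}\right) e^{-x^{2}/(2\sigma^{2})}\, dx = \sqrt{2\pi\sigma^{2}}\, e^{\sigma^{2}/2},$$
so I'd hope something similar happens in ##n## dimensions.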
 
Transform the random vector ##\vec x## into a random vector ##\vec y## consisting of independent random variables. Each ##x_i## is then a linear combination of the ##y_j##'s, so by the addition formula each ##\cosh(x_i)## expands into sums of products of ##\cosh## and ##\sinh## terms in the individual ##y_j##'s, and the resulting one-dimensional integrals can be evaluated.

As you can see, it may be quite messy, but doable.
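
A minimal numerical sketch of this change of variables, using an eigendecomposition of the covariance matrix (the matrix below is an arbitrary illustrative choice):

```python
import numpy as np

# Illustrative 2x2 covariance matrix (any symmetric positive-definite Sigma works)
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])

# Eigendecomposition Sigma = Q diag(evals) Q^T gives the change of variables
# x = Q @ y, where the components of y are independent with variances evals
evals, Q = np.linalg.eigh(Sigma)

# Check: the transform reproduces the original covariance
assert np.allclose(Q @ np.diag(evals) @ Q.T, Sigma)

# Each x_i is then the linear combination x_i = sum_j Q[i, j] * y_j
print(evals)  # variances of the independent y components
```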
 
I thought about that, but the argument of the cosh becomes a sum, which really doesn't simplify things at all. Thanks though. I've thought about using Isserlis's theorem, (also known as Wick's theorem) but I can't find a good summary/statement of the theorem that's more precise than Wikipedia's.
 
Another option would be to write ##\cosh(x_i) = (e^{x_i}+e^{-x_i})/2##; expanding out the products, the overall integral will look something like

$$G \propto \sum_{\sigma}\int d^{n} x \exp\left[-\frac{1}{2} \vec x \cdot \Sigma^{-1}\cdot \vec x + \vec J_\sigma \cdot \vec x\right],$$
where ##\vec J_\sigma## is a vector with entries ##\pm 1##, and ##\sigma## runs over all ##2^n## choices of the signs. You can now complete the square and evaluate each of the integrals in the sum. I don't know if you will be able to write the overall result in a nice, closed form, though.
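
Completing the square in each term gives ##\int d^{n}x \, \exp[-\tfrac{1}{2}\vec x \cdot \Sigma^{-1} \cdot \vec x + \vec J_\sigma \cdot \vec x] = (2\pi)^{n/2}|\Sigma|^{1/2} e^{\frac{1}{2}\vec J_\sigma \cdot \Sigma \cdot \vec J_\sigma}##, so the sum becomes ##G = (2\pi)^{n/2}|\Sigma|^{1/2}\, 2^{-n} \sum_\sigma e^{\frac{1}{2}\vec J_\sigma \cdot \Sigma \cdot \vec J_\sigma}##. A quick numerical sketch checking this against a Monte Carlo estimate (the covariance matrix is an arbitrary illustrative choice):

```python
import itertools
import numpy as np

# Check the sign-sum formula for n = 2 against a Monte Carlo estimate.
# (Sigma below is an arbitrary positive-definite choice for illustration.)
Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.5]])
n = Sigma.shape[0]

# Completing the square in each term of the sum gives
#   G = (2*pi)^{n/2} |Sigma|^{1/2} * 2^{-n} * sum_J exp(J . Sigma . J / 2)
prefactor = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Sigma))
G_formula = prefactor / 2**n * sum(
    np.exp(0.5 * np.array(signs) @ Sigma @ np.array(signs))
    for signs in itertools.product([1.0, -1.0], repeat=n)
)

# Monte Carlo: G = (2*pi)^{n/2} |Sigma|^{1/2} * E[prod_i cosh(x_i)], x ~ N(0, Sigma)
rng = np.random.default_rng(0)
x = rng.multivariate_normal(np.zeros(n), Sigma, size=200_000)
G_mc = prefactor * np.prod(np.cosh(x), axis=1).mean()

print(G_formula, G_mc)  # the two estimates agree to within Monte Carlo error
```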
 
Mute's point is well taken. Once you write ##\cosh(x_i) = (e^{x_i} + e^{-x_i})/2##, either approach works.
 
