Evaluating a Difficult Gaussian Integral for Multivariate Normal Distributions


Discussion Overview

The discussion revolves around evaluating a difficult integral involving multivariate normal distributions, specifically the expectation value of a product of functions ##f_i(x_i)## weighted by the Gaussian factor ##e^{-\frac{1}{2}\vec x\cdot\Sigma^{-1}\cdot\vec x}## built from the covariance matrix ##\Sigma##. The scope includes theoretical exploration and mathematical reasoning related to Gaussian integrals.

Discussion Character

  • Exploratory
  • Mathematical reasoning

Main Points Raised

  • One participant presents an integral of the form ##G=\int d^{n} x \left(\prod_{i=1}^{n} f_{i}(x_{i})\right) e^{-\frac{1}{2} \vec x \cdot \Sigma^{-1}\cdot \vec x}## and seeks methods for evaluation, specifically with ##f_{i}(x_{i})=\cosh(x_{i})##.
  • Another participant suggests transforming the random vector ##\vec x## into a new vector ##\vec y## of independent random variables, proposing that this transformation allows the integral to be evaluated, albeit messily.
  • A participant expresses concern that the transformation leads to a sum in the argument of ##\cosh##, which complicates the evaluation rather than simplifying it. They mention considering Isserlis's theorem but struggle to find a precise statement of it.
  • One participant rewrites ##\cosh(x_{i})## as a sum of exponentials and proposes that this leads to a manageable form for the integral, suggesting that completing the square will evaluate the resulting Gaussian integrals.
  • Another participant acknowledges the validity of the previous points and notes that, once ##\cosh(x_{i})## is rewritten in terms of exponentials, both approaches can be effective.

Areas of Agreement / Disagreement

Participants initially differ on how effective the proposed variable transformation is: the original poster worries that it turns each argument of ##\cosh## into a sum rather than simplifying the integral. After the suggestion to rewrite ##\cosh(x_i)## as a sum of exponentials and complete the square, there is agreement that either route works, although no fully closed-form expression for the result is settled in the thread.

Contextual Notes

The discussion highlights the complexity of integrating functions involving multivariate normal distributions and the challenges associated with transforming variables and applying theorems like Isserlis's theorem. Specific assumptions and the dependence on the definitions of the functions involved are not fully explored.

unchained1978
I'm dealing with multivariate normal distributions, and I've run up against an integral I really don't know how to do.

Given a random vector ##\vec x## and a covariance matrix ##\Sigma##, how would you go about evaluating an expectation value of the form
$$G=\int d^{n} x \left(\prod_{i=1}^{n} f_{i}(x_{i})\right) e^{-\frac{1}{2} \vec x \cdot \Sigma^{-1}\cdot \vec x}?$$
I've tried expanding the ##f_{i}##'s into a power series and then using the moment generating function to obtain the powers of ##x_{i}##, but this really doesn't simplify the problem much. The function I'm currently considering is ##f_{i}(x_{i})=\cosh(x_{i})##.
I would greatly appreciate anyone's input on this problem.
 
Transform the random vector ##\vec x## into a random vector ##\vec y## consisting of independent random variables. Each ##x_i## is a linear combination of the ##y_j##'s. ##\cosh(x_i)## will then expand into a product of ##\cosh(y_j)##'s, and the integrals can then be evaluated.

As you can see, it may be quite messy, but doable.
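
As a minimal sketch of this transformation (illustrative code, not from the thread; `Sigma` below is just a placeholder covariance matrix), a Cholesky factor ##L## with ##\Sigma = LL^{T}## realizes ##\vec x = L\vec y## with independent standard normal ##y_j##:

```python
import numpy as np

# Placeholder covariance matrix; any symmetric positive-definite Sigma works here.
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])

# Cholesky factorization Sigma = L @ L.T.  If y has independent standard normal
# components, then x = L @ y has covariance Sigma, so each x_i is the linear
# combination sum_j L[i, j] * y_j of independent variables.
L = np.linalg.cholesky(Sigma)

# Quick Monte Carlo check that the transformed samples reproduce Sigma.
rng = np.random.default_rng(0)
y = rng.standard_normal((500_000, Sigma.shape[0]))
x = y @ L.T
print(np.cov(x, rowvar=False))  # should be close to Sigma
```

An eigendecomposition of ##\Sigma## would work just as well; either way the Gaussian weight factorizes, at the price of each ##x_i## becoming a sum of the ##y_j##'s, which is exactly the complication raised in the next reply.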
 
I thought about that, but the argument of the ##\cosh## becomes a sum, which really doesn't simplify things at all. Thanks though. I've thought about using Isserlis's theorem (also known as Wick's theorem), but I can't find a good summary/statement of the theorem that's more precise than Wikipedia's.
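
For reference, one standard statement of Isserlis's (Wick's) theorem: for a zero-mean Gaussian vector ##\vec x## with covariance ##\Sigma##, all odd moments vanish and

$$E\left[x_{i_1} x_{i_2} \cdots x_{i_{2k}}\right] \;=\; \sum_{\text{pairings } p \text{ of } \{1,\dots,2k\}} \;\prod_{\{a,b\}\in p} E\left[x_{i_a} x_{i_b}\right] \;=\; \sum_{p} \prod_{\{a,b\}\in p} \Sigma_{i_a i_b},$$

where the sum runs over all ##(2k-1)!!## ways of partitioning the ##2k## indices into pairs.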
 
unchained1978 said:
(original question quoted above)

mathman said:
(suggestion quoted above)

Another option would be to write ##\cosh(x_i) = (e^{x_i}+e^{-x_i})/2##; expanding out the products, the overall integral will look something like

$$G \propto \sum_{\sigma}\int d^{n} x \exp\left[-\frac{1}{2} \vec x \cdot \Sigma^{-1}\cdot \vec x + \vec J_\sigma \cdot \vec x\right],$$
where ##\vec J_\sigma## is a vector with entries of ##\pm 1## and ##\sigma## runs over all ##2^n## assignments of the ##\pm 1## signs. You can now complete the square and evaluate each of the integrals in the sum. I don't know if you will be able to write the overall result in a nice, closed form, though.
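
Sketching where completing the square leads (a continuation of this suggestion in the same notation, not a result stated in the thread): each term in the sum is a standard Gaussian integral,

$$\int d^{n}x\, \exp\left[-\frac{1}{2}\vec x\cdot\Sigma^{-1}\cdot\vec x+\vec J_\sigma\cdot\vec x\right] \;=\; (2\pi)^{n/2}\sqrt{\det\Sigma}\;\exp\left[\frac{1}{2}\vec J_\sigma\cdot\Sigma\cdot\vec J_\sigma\right],$$

so, keeping the factor ##1/2^{n}## from the ##n## factors of ##\cosh##,

$$G \;=\; \frac{(2\pi)^{n/2}\sqrt{\det\Sigma}}{2^{n}}\sum_{\sigma\in\{\pm1\}^{n}}\exp\left[\frac{1}{2}\vec J_\sigma\cdot\Sigma\cdot\vec J_\sigma\right].$$

Since ##\vec J_\sigma## and ##-\vec J_\sigma## contribute identically, only ##2^{n-1}## distinct terms actually need to be summed; beyond that symmetry there is no obvious further simplification into a closed form.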
 
Mute's point is well taken. Once you write ##\cosh(x_i) = (e^{x_i} + e^{-x_i})/2##, either approach works.
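
As a quick sanity check of the sign-vector sum against a direct estimate of ##G## (an illustrative sketch; the helper names `closed_form` and `monte_carlo` and the example `Sigma` are placeholders, not from the thread):

```python
import itertools
import numpy as np

def closed_form(Sigma):
    """Sum over all sign vectors J in {+1,-1}^n obtained by writing each cosh as exponentials."""
    n = Sigma.shape[0]
    prefactor = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Sigma)) / 2 ** n
    total = 0.0
    for signs in itertools.product([1.0, -1.0], repeat=n):
        J = np.array(signs)
        total += np.exp(0.5 * J @ Sigma @ J)
    return prefactor * total

def monte_carlo(Sigma, samples=1_000_000, seed=0):
    """Estimate G directly via G = (2*pi)^(n/2) * sqrt(det Sigma) * E[prod_i cosh(x_i)], x ~ N(0, Sigma)."""
    n = Sigma.shape[0]
    rng = np.random.default_rng(seed)
    x = rng.multivariate_normal(np.zeros(n), Sigma, size=samples)
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(Sigma))
    return norm * np.prod(np.cosh(x), axis=1).mean()

Sigma = np.array([[1.0, 0.3],
                  [0.3, 0.8]])
print(closed_form(Sigma), monte_carlo(Sigma))  # the two values should agree to Monte Carlo accuracy
```

For moderate ##n## the ##2^{n}##-term sum is exact and cheap; the Monte Carlo estimate is only there to confirm the algebra.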
 
