Fisher matrix for multivariate normal distribution

SUMMARY

The Fisher information matrix (FIM) for the multivariate normal distribution with a parameter-independent covariance can be expressed as \(\mathcal{I}_{m,n} = \frac{\partial \mu^\mathrm{T}}{\partial \theta_m} \Sigma^{-1} \frac{\partial \mu}{\partial \theta_n}\). The derivation uses matrix derivatives, specifically \(D_{\theta} \log p(z; \mu(\theta), \Sigma) = (z-\mu(\theta))^\mathrm{T} \Sigma^{-1} D_{\theta} \mu(\theta)\). A key reference for the general case, where \(\Sigma\) also depends on \(\theta\), is the paper by Porat & Friedlander on Gaussian time series, while Klein and Neudecker's work on Gaussian vector state space models offers a direct derivation that is easier to interpret.

PREREQUISITES
  • Understanding of Fisher information matrix (FIM)
  • Knowledge of multivariate normal distribution
  • Proficiency in matrix calculus
  • Familiarity with statistical inference concepts
NEXT STEPS
  • Study the derivation of the Fisher information matrix for multivariate normal distributions
  • Review the paper by Porat & Friedlander on Gaussian time series
  • Explore Klein and Neudecker's work on Gaussian vector state space models
  • Learn about the implications of parameter dependence in covariance matrices
USEFUL FOR

Statisticians, data scientists, and researchers involved in statistical modeling and inference, particularly those working with multivariate normal distributions and Fisher information matrices.

hdb
The Fisher information matrix for the multivariate normal distribution is said in many places to simplify to
\[
\mathcal{I}_{m,n} = \frac{\partial \mu^\mathrm{T}}{\partial \theta_m}\, \Sigma^{-1}\, \frac{\partial \mu}{\partial \theta_n},
\]
even on
http://en.wikipedia.org/wiki/Fisher_information#Multivariate_normal_distribution
I am trying to come up with the derivation, but no luck so far. Does anyone have any ideas, hints, or references on how to do this?

Thank you
 
Using matrix derivatives one has \(D_x(x^\mathrm{T} A x) = x^\mathrm{T}(A + A^\mathrm{T})\). Since the log-density is, up to an additive constant,
\[
\log p(z; \mu(\theta), \Sigma) = -\tfrac{1}{2}\,(z - \mu(\theta))^\mathrm{T}\, \Sigma^{-1}\, (z - \mu(\theta)),
\]
the chain rule together with the symmetry of \(\Sigma\) gives
\[
D_{\theta} \log p(z; \mu(\theta), \Sigma) = (z - \mu(\theta))^\mathrm{T}\, \Sigma^{-1}\, D_{\theta} \mu(\theta).
\]
For simplicity, let's write \(D_{\theta} \mu(\theta) = H\). The FIM is then found as
\[
J = E\big[(D_{\theta} \log p)^\mathrm{T}\, D_{\theta} \log p\big]
= E\big[H^\mathrm{T} \Sigma^{-1} (z - \mu(\theta))(z - \mu(\theta))^\mathrm{T} \Sigma^{-1} H\big]
= H^\mathrm{T} \Sigma^{-1} \Sigma\, \Sigma^{-1} H
= H^\mathrm{T} \Sigma^{-1} H,
\]
which is equivalent to the given formula. Notice that this formula is only valid as long as \(\Sigma\) does not depend on \(\theta\). I'm still struggling to find a derivation of the more general case where \(\Sigma\) also depends on \(\theta\).

For some reason my tex code is not correctly parsed. I cannot understand why.
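As a quick numerical sanity check of this result, one can compare the Monte Carlo outer-product estimate of the FIM against the closed form \(H^\mathrm{T} \Sigma^{-1} H\). Below is a minimal sketch in Python/NumPy, assuming a hypothetical linear mean model \(\mu(\theta) = H\theta\) with made-up values for \(H\) and \(\Sigma\):

[code]
# Monte Carlo check of J = H^T Sigma^{-1} H for a Gaussian with
# theta-independent covariance and a (hypothetical) linear mean mu(theta) = H theta.
import numpy as np

rng = np.random.default_rng(0)

# Made-up problem sizes: 3-dimensional observations, 2 parameters.
H = np.array([[1.0, 0.0],
              [0.5, 1.0],
              [0.0, 2.0]])            # D_theta mu(theta); constant for a linear model
Sigma = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.0, 0.2],
                  [0.0, 0.2, 1.5]])
Sigma_inv = np.linalg.inv(Sigma)
theta = np.array([0.7, -1.2])
mu = H @ theta

# Closed-form FIM from the derivation above.
J_closed = H.T @ Sigma_inv @ H

# Monte Carlo estimate of E[(D log p)^T (D log p)],
# with score D log p = (z - mu)^T Sigma^{-1} H (one row vector per sample).
N = 200_000
z = rng.multivariate_normal(mu, Sigma, size=N)
scores = (z - mu) @ Sigma_inv @ H      # shape (N, 2); row i is the score at z_i
J_mc = scores.T @ scores / N

print(np.round(J_closed, 3))
print(np.round(J_mc, 3))               # agrees with J_closed up to Monte Carlo error
[/code]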
 
Actually, the general proof can apparently be found in Porat & Friedlander, "Computation of the Exact Information Matrix of Gaussian Time Series with Stationary Random Components," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. ASSP-34, no. 1, Feb. 1986.
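For reference, the general expression those derivations arrive at, which is also the form stated on the Wikipedia page linked in the original post, adds a trace term accounting for the \(\theta\)-dependence of \(\Sigma\):
\[
\mathcal{I}_{m,n} = \frac{\partial \mu^\mathrm{T}}{\partial \theta_m}\, \Sigma^{-1}\, \frac{\partial \mu}{\partial \theta_n} + \frac{1}{2}\,\operatorname{tr}\!\left(\Sigma^{-1} \frac{\partial \Sigma}{\partial \theta_m}\, \Sigma^{-1} \frac{\partial \Sigma}{\partial \theta_n}\right),
\]
which reduces to the mean-only formula above when \(\Sigma\) is constant in \(\theta\).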
 
edmundfo said:
R^{-1} H] = H^T R^{-1} R R^{-1} H = H^T R^{-1} H [\tex]

For some reason my tex code is not correctly parsed. I cannot understand why.

For one thing, you're using a backslash ([\tex]) instead of a forward slash ([/tex]) at the end of your code.
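For example, a correctly closed block in the raw post would look like [tex]J = H^\mathrm{T} \Sigma^{-1} H[/tex].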
 
edmundfo said:
Actually, the general proof can apparently be found in Porat & Friedlander, "Computation of the Exact Information Matrix of Gaussian Time Series with Stationary Random Components," IEEE Transactions on Acoustics, Speech, and Signal Processing, vol. ASSP-34, no. 1, Feb. 1986.
Thank you for the answers; in the meantime I have found another reference, which is a direct derivation of the same result. To me this one seems easier to interpret:

Klein, A., and H. Neudecker. "A direct derivation of the exact Fisher information matrix of Gaussian vector state space models." Linear Algebra and its Applications 321, no. 1-3 (2000): 233-238.
 
