# Fisher matrix for multivariate normal distribution

 P: 3 The Fisher information matrix for a multivariate normal distribution is stated in many places to simplify to $$\mathcal{I}_{m,n} = \frac{\partial \mu^\mathrm{T}}{\partial \theta_m} \Sigma^{-1} \frac{\partial \mu}{\partial \theta_n},$$ even on http://en.wikipedia.org/wiki/Fisher_...l_distribution. I am trying to work out the derivation, but no luck so far. Does anyone have any ideas / hints / references on how to do this? Thank you
 P: 5 Using matrix derivatives one has $$D_x(x^\mathrm{T} A x) = x^\mathrm{T}(A+A^\mathrm{T}),$$ from which it follows (applying the chain rule to the quadratic form in the Gaussian log-density) that $$D_{\theta} \log p(z ; \mu(\theta) , \Sigma) = (z-\mu(\theta))^\mathrm{T} \Sigma^{-1} D_{\theta} \mu(\theta).$$ For simplicity let's write $$D_{\theta} \mu(\theta) = H.$$ The FIM is then found as $$J = E\big[ ( D_{\theta} \log p(z ; \mu(\theta) , \Sigma))^\mathrm{T}\, D_{\theta} \log p(z ; \mu(\theta) , \Sigma)\big] = E\big[ H^\mathrm{T} \Sigma^{-1} (z - \mu(\theta)) (z - \mu(\theta))^\mathrm{T} \Sigma^{-1} H\big] = H^\mathrm{T} \Sigma^{-1} \Sigma \Sigma^{-1} H = H^\mathrm{T} \Sigma^{-1} H,$$ which is equivalent to the given formula. Notice that this formula is only valid as long as $\Sigma$ does not depend on $\theta$. I'm still struggling to find a derivation of the more general case where $\Sigma$ also depends on $\theta$. For some reason my tex code is not correctly parsed. I cannot understand why.

 P: 5 Actually the general proof can apparently be found in Porat & Friedlander, "Computation of the Exact Information Matrix of Gaussian Time Series with Stationary Random Components", IEEE Transactions on Acoustics, Speech and Signal Processing, Vol. ASSP-34, No. 1, Feb. 1986.

 P: 2,504 Fisher matrix for multivariate normal distribution

 Quote by edmundfo: R^{-1} H] = H^T R^{-1} R R^{-1} H = H^T R^{-1} H [\tex] ... For some reason my tex code is not correctly parsed. I cannot understand why.

For one thing, you're using the backslash tag [\tex] instead of the forward-slash tag [/tex] at the end of your code.
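The fixed-covariance identity $J = H^\mathrm{T}\Sigma^{-1}H$ derived above is easy to check numerically: the FIM is the expected outer product of the score, so a Monte Carlo average should match the closed form. A minimal sketch in Python/NumPy (the particular values of $\Sigma$, $H$, and $\theta$ below are arbitrary examples, not from the thread):

```python
import numpy as np

rng = np.random.default_rng(0)

# Fixed covariance Sigma and Jacobian H = D_theta mu (example values).
Sigma = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
H = np.array([[1.0, 0.0],
              [2.0, 1.0]])   # linear mean model: mu(theta) = H @ theta

Sigma_inv = np.linalg.inv(Sigma)

# Closed-form FIM from the thread: J = H^T Sigma^{-1} H
J_analytic = H.T @ Sigma_inv @ H

# Monte Carlo estimate: J = E[s s^T], with score s = H^T Sigma^{-1} (z - mu)
theta = np.array([0.3, -1.2])
mu = H @ theta
z = rng.multivariate_normal(mu, Sigma, size=200_000)
scores = (z - mu) @ Sigma_inv @ H     # each row is s^T for one sample
J_mc = scores.T @ scores / len(z)

print(np.round(J_analytic, 3))
print(np.round(J_mc, 3))
```

With a few hundred thousand samples the two matrices agree to about two decimal places, and both are symmetric, as a Fisher matrix must be.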