# Fisher Information Matrix: Equivalent Expressions

 P: 108 I don't understand the following step regarding the $(i,j)^{th}$ element of the Fisher Information Matrix, $\mathbf{J}$: $$J_{ij}\triangleq\mathcal{E}\left\{ \frac{\partial}{\partial\theta_{i}}L_{\mathbf{x}}(\boldsymbol{\theta})\,\frac{\partial}{\partial\theta_{j}} L_{\mathbf{x}}(\boldsymbol{\theta})\right\} =-\mathcal{E}\left\{ \frac{\partial^{2}}{\partial\theta_{i}\,\partial\theta_{j}}L_{\mathbf{x}}(\boldsymbol{\theta})\right\}$$ which is given in (Eq. 8.26, on p. 926 of) "Optimum Array Processing" by Harry van Trees. I don't know if the details matter, but $L_{\mathbf{x}}$ is the log-likelihood function and he is looking at the problem of estimating the non-random real vector, $\boldsymbol{\theta}$, from discrete observations of a complex Gaussian random vector, $\mathbf{x}$. Am I missing something obvious? I'm not very sharp on partial derivatives.
 Sci Advisor P: 3,555 L_X is the log of the pdf. Write down the explicit expressions for the expectation values in terms of L_X and its derivatives and use partial integration.
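[Editor's note: a sketch of the standard argument DrDu is alluding to, assuming the usual regularity conditions that allow interchanging differentiation and integration, and writing $L_{\mathbf{x}}(\boldsymbol{\theta}) = \ln p(\mathbf{x};\boldsymbol{\theta})$. Differentiating the log twice gives $$\frac{\partial^{2}L_{\mathbf{x}}}{\partial\theta_{i}\,\partial\theta_{j}} = \frac{\partial}{\partial\theta_{i}}\left(\frac{1}{p}\frac{\partial p}{\partial\theta_{j}}\right) = \frac{1}{p}\frac{\partial^{2}p}{\partial\theta_{i}\,\partial\theta_{j}} - \frac{\partial L_{\mathbf{x}}}{\partial\theta_{i}}\frac{\partial L_{\mathbf{x}}}{\partial\theta_{j}}.$$ Taking expectations, the first term on the right vanishes because $$\mathcal{E}\left\{\frac{1}{p}\frac{\partial^{2}p}{\partial\theta_{i}\,\partial\theta_{j}}\right\} = \int \frac{\partial^{2}p}{\partial\theta_{i}\,\partial\theta_{j}}\,d\mathbf{x} = \frac{\partial^{2}}{\partial\theta_{i}\,\partial\theta_{j}}\int p\,d\mathbf{x} = 0,$$ since $\int p\,d\mathbf{x} = 1$ for every $\boldsymbol{\theta}$. What remains is exactly $$\mathcal{E}\left\{\frac{\partial L_{\mathbf{x}}}{\partial\theta_{i}}\frac{\partial L_{\mathbf{x}}}{\partial\theta_{j}}\right\} = -\mathcal{E}\left\{\frac{\partial^{2}L_{\mathbf{x}}}{\partial\theta_{i}\,\partial\theta_{j}}\right\}.$$]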
P: 108
 Quote by DrDu Write down the explicit expression for the expectation values in terms of L_X and its derivatives and use partial integration.
The expressions are rather intimidating functions of complex matrices. I don't think I want to try partial integration. I tried just evaluating the derivatives to see if they would come out the same, but I ran out of paper before I had even scratched the surface.

If the result is specific to this problem, then I would be willing to take it at face value. It's just frustrating that I keep seeing the result stated without proof. It's as though it's too obvious to warrant a formal proof.

 Sci Advisor P: 3,555 The problem is that I am too lazy to TeX something which can be found in any text on statistics or via Google, e.g.: http://mark.reid.name/iem/fisher-inf...ikelihood.html
P: 108
 Quote by DrDu The problem is that I am too lazy
I see.

 Quote by DrDu http://mark.reid.name/iem/fisher-inf...ikelihood.html
Thanks - this was useful; its author claims to have spent 6 months pondering this problem, so I'm glad I'm not the only one to have had difficulty finding the solution.
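[Editor's note: the identity is also easy to check numerically, which is not in the thread but may help readers. The sketch below uses a toy scalar case, $x \sim \mathcal{N}(\theta, \sigma^2)$ with $\sigma$ known, where the score is $(x-\theta)/\sigma^2$ and the second derivative of the log-likelihood is the constant $-1/\sigma^2$; both Monte Carlo estimates should agree with the true Fisher information $1/\sigma^2$.]

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma = 2.0, 1.5
x = rng.normal(theta, sigma, size=200_000)

# First derivative of the log-likelihood (the score) at the true theta.
score = (x - theta) / sigma**2

# Second derivative of the log-likelihood: constant -1/sigma^2 for this model.
second = np.full_like(x, -1.0 / sigma**2)

# The two equivalent expressions for the Fisher information:
fisher_outer = np.mean(score**2)   # E{ (dL/dtheta)^2 }
fisher_hess = -np.mean(second)     # -E{ d^2 L / dtheta^2 }

print(fisher_outer, fisher_hess)   # both should be close to 1/sigma^2 ≈ 0.444
```

With 200,000 samples the two estimates agree to a few decimal places, illustrating (but of course not proving) the equivalence.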
 Sci Advisor P: 3,555 6 months is completely unacceptable. As I said, this topic is covered in every book on introductory statistics. Otherwise it is a good idea to search for lecture notes containing the problem: I searched for "fisher information lecture notes" and almost every set of notes contained a proof of the statement, e.g. the first one I got: http://ocw.mit.edu/courses/mathemati...s/lecture3.pdf
P: 108
 Quote by DrDu this topic is contained in every book on introductory statistics.
This is obviously incorrect.

 Quote by DrDu Elsewise it is a good idea to search for some lecture notes containing the problem
This is no longer necessary. The topic is resolved.
