Fisher Information Matrix: Equivalent Expressions

AI Thread Summary
The discussion centers on the (i,j)^{th} element of the Fisher Information Matrix (FIM), specifically the transition from the expectation of the product of first partial derivatives of the log-likelihood to the negative expectation of the second partial derivative. The original poster is frustrated that this result is often stated without proof and finds the explicit expressions too unwieldy to verify directly. Replies note that the identity follows from partial integration (differentiating the normalization of the density under the integral sign) and that proofs appear in most introductory statistics texts and lecture notes. The thread is resolved once a reference containing the proof is located.
weetabixharry
I don't understand the following step regarding the (i,j)^{th} element of the Fisher Information Matrix, \textbf{J}:
J_{ij}\triangleq\mathcal{E}\left\{ \frac{\partial}{\partial\theta_{i}}L_{\mathbf{x}}(\boldsymbol{\theta})\,\frac{\partial}{\partial\theta_{j}}L_{\mathbf{x}}(\boldsymbol{\theta})\right\} = -\mathcal{E}\left\{ \frac{\partial^{2}}{\partial\theta_{i}\,\partial\theta_{j}}L_{\mathbf{x}}(\boldsymbol{\theta})\right\}

which is given as Eq. 8.26 on p. 926 of "Optimum Array Processing" by Harry L. Van Trees. I don't know if the details matter, but L_{\mathbf{x}} is the log-likelihood function and he is looking at the problem of estimating the non-random real vector \boldsymbol{\theta} from discrete observations of a complex Gaussian random vector \mathbf{x}.

Am I missing something obvious? I'm not very sharp on partial derivatives.
 
L_X is minus the log of the pdf. Write down the explicit expression for the expectation values in terms of L_X and its derivatives and use partial integration.
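To make the hint concrete (a sketch, using Van Trees' convention L_{\mathbf{x}}(\boldsymbol{\theta}) = \ln p(\mathbf{x};\boldsymbol{\theta}); with the minus-log convention the same steps go through and the final identity holds without the minus sign): the expectations are integrals against the density p(\mathbf{x};\boldsymbol{\theta}), and the key trick is that differentiation can be pulled outside the integral because p integrates to one. For example, the score has zero mean:

\mathcal{E}\left\{\frac{\partial}{\partial\theta_{i}}L_{\mathbf{x}}(\boldsymbol{\theta})\right\} = \int \frac{1}{p}\frac{\partial p}{\partial\theta_{i}}\, p \,\mathrm{d}\mathbf{x} = \frac{\partial}{\partial\theta_{i}}\int p \,\mathrm{d}\mathbf{x} = 0

The same exchange of differentiation and integration, applied at second order, is what connects the two forms of J_{ij}.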
 
DrDu said:
Write down the explicit expression for the expectation values in terms of L_X and its derivatives and use partial integration.

The expressions are rather intimidating functions of complex matrices. I don't think I want to try partial integration. I tried just evaluating the derivatives to see if they would come out the same, but I ran out of paper before I had even scratched the surface.

If the result is specific to this problem, then I would be willing to take it at face value. It's just frustrating that I keep seeing the result stated without proof. It's as though it's too obvious to warrant a formal proof.
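For what it's worth, there is a paper-free sanity check even when the algebra is intimidating: pick a toy model, estimate both expectations by Monte Carlo, and confirm they agree. A minimal sketch in Python (illustrative model only, not Van Trees' complex Gaussian array problem: a scalar Gaussian with unknown mean \theta and known \sigma, where both sides should equal 1/\sigma^{2}):

```python
import numpy as np

# Toy model: x ~ N(theta, sigma^2), estimating the scalar theta.
# Score:     dL/dtheta   = (x - theta) / sigma^2
# Curvature: d2L/dtheta2 = -1 / sigma^2   (constant in x for this model)
# Both Fisher information expressions should give 1 / sigma^2 = 0.25.
rng = np.random.default_rng(0)
theta, sigma = 1.5, 2.0
x = rng.normal(theta, sigma, size=1_000_000)

score = (x - theta) / sigma**2
curvature = np.full_like(x, -1.0 / sigma**2)

print(np.mean(score**2))    # E{(dL/dtheta)^2}   ~ 0.25
print(-np.mean(curvature))  # -E{d2L/dtheta2}    = 0.25
```

The two estimates agree up to Monte Carlo error, and the same check works for any model where the two derivatives can be coded.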
 
Six months is completely unacceptable. As I said, this topic is contained in every book on introductory statistics.
Otherwise, it is a good idea to search for lecture notes covering the problem: I searched for "fisher information lecture notes" and almost every set of notes contained a proof of the statement, e.g. the first one I got:
http://ocw.mit.edu/courses/mathemat...ications-fall-2006/lecture-notes/lecture3.pdf
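The standard argument is only a few lines; a sketch under the usual regularity conditions (p(\mathbf{x};\boldsymbol{\theta}) positive and twice differentiable in \boldsymbol{\theta}, with differentiation and integration exchangeable). Differentiating \partial L_{\mathbf{x}}/\partial\theta_{i} = (1/p)\,\partial p/\partial\theta_{i} once more gives

\frac{\partial^{2}L_{\mathbf{x}}}{\partial\theta_{i}\,\partial\theta_{j}} = \frac{1}{p}\frac{\partial^{2}p}{\partial\theta_{i}\,\partial\theta_{j}} - \frac{\partial L_{\mathbf{x}}}{\partial\theta_{i}}\frac{\partial L_{\mathbf{x}}}{\partial\theta_{j}}

Taking expectations, the first term on the right vanishes, since \int \frac{\partial^{2}p}{\partial\theta_{i}\,\partial\theta_{j}}\,\mathrm{d}\mathbf{x} = \frac{\partial^{2}}{\partial\theta_{i}\,\partial\theta_{j}}\int p\,\mathrm{d}\mathbf{x} = 0, leaving exactly

\mathcal{E}\left\{\frac{\partial L_{\mathbf{x}}}{\partial\theta_{i}}\frac{\partial L_{\mathbf{x}}}{\partial\theta_{j}}\right\} = -\mathcal{E}\left\{\frac{\partial^{2}L_{\mathbf{x}}}{\partial\theta_{i}\,\partial\theta_{j}}\right\}

Nothing here is specific to the complex Gaussian setup; the identity holds for any sufficiently regular family of densities.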
 
DrDu said:
this topic is contained in every book on introductory statistics.
This is obviously incorrect.

DrDu said:
Otherwise, it is a good idea to search for lecture notes covering the problem
This is no longer necessary. The topic is resolved.
 
