Fisher Information Matrix: Equivalent Expressions

weetabixharry
#1
Jul6-12, 01:49 PM
P: 108
I don't understand the following step regarding the [itex](i,j)^{th}[/itex] element of the Fisher Information Matrix, [itex]\textbf{J}[/itex]:
[tex]J_{ij}\triangleq\mathcal{E}\left\{ \frac{\partial}{\partial\theta_{i}}L_{\mathbf{x}}(\boldsymbol{\theta})\,\frac{\partial}{\partial\theta_{j}}L_{\mathbf{x}}(\boldsymbol{\theta})\right\}
\\
=-\mathcal{E}\left\{ \frac{\partial^{2}}{\partial\theta_{i}\,\partial\theta_{j}}L_{\mathbf{x}}(\boldsymbol{\theta})\right\}[/tex]

which is given in (Eq. 8.26, on p. 926 of) "Optimum Array Processing" by Harry van Trees. I don't know if the details matter, but [itex]L_{\mathbf{x}}[/itex] is the log-likelihood function and he is looking at the problem of estimating the non-random real vector [itex]\boldsymbol{\theta}[/itex] from discrete observations of a complex Gaussian random vector, [itex]\mathbf{x}[/itex].

Am I missing something obvious? I'm not very sharp on partial derivatives.
DrDu
#2
Jul6-12, 02:36 PM
Sci Advisor
P: 3,555
L_x is the log of the pdf. Write down the explicit expression for the expectation values in terms of L_x and its derivatives and use integration by parts.
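Sketched explicitly (a standard argument, assuming the regularity conditions that permit differentiating under the integral sign), with [itex]p(\mathbf{x};\boldsymbol{\theta})[/itex] the pdf and [itex]L_{\mathbf{x}}=\ln p[/itex]: since the pdf integrates to one,

[tex]\int p(\mathbf{x};\boldsymbol{\theta})\, d\mathbf{x}=1
\;\Rightarrow\;
\int \frac{\partial L_{\mathbf{x}}}{\partial\theta_{j}}\, p\, d\mathbf{x}=0.[/tex]

Differentiating again, this time with respect to [itex]\theta_i[/itex]:

[tex]\int\left(\frac{\partial^{2}L_{\mathbf{x}}}{\partial\theta_{i}\,\partial\theta_{j}}\, p
+\frac{\partial L_{\mathbf{x}}}{\partial\theta_{j}}\,\frac{\partial p}{\partial\theta_{i}}\right)d\mathbf{x}=0.[/tex]

Substituting [itex]\frac{\partial p}{\partial\theta_{i}}=p\,\frac{\partial L_{\mathbf{x}}}{\partial\theta_{i}}[/itex] turns both integrals into expectations:

[tex]\mathcal{E}\left\{\frac{\partial^{2}L_{\mathbf{x}}}{\partial\theta_{i}\,\partial\theta_{j}}\right\}
+\mathcal{E}\left\{\frac{\partial L_{\mathbf{x}}}{\partial\theta_{i}}\,\frac{\partial L_{\mathbf{x}}}{\partial\theta_{j}}\right\}=0,[/tex]

which is exactly the identity in question.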
weetabixharry
#3
Jul7-12, 11:14 PM
P: 108
Quote by DrDu:
Write down the explicit expression for the expectation values in terms of L_x and its derivatives and use integration by parts.
The expressions are rather intimidating functions of complex matrices. I don't think I want to attempt integration by parts. I tried just evaluating the derivatives to see if they would come out the same, but I ran out of paper before I had even scratched the surface.

If the result is specific to this problem, then I would be willing to take it on face value. It's just frustrating that I keep seeing the result stated without proof. It's as though it's too obvious to warrant a formal proof.
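For what it's worth, the identity is easy to sanity-check numerically before wading into the complex matrix expressions. A minimal sketch (my own toy example, not from van Trees) for estimating the mean θ of a scalar Gaussian with known variance σ², where both expectations should equal the Fisher information 1/σ²:

```python
import random

# Numerical check of E[(dL/dθ)²] = -E[d²L/dθ²] for x ~ N(θ, σ²),
# where L(x; θ) = -(x - θ)²/(2σ²) + const is the log-likelihood.
random.seed(0)
theta, sigma = 2.0, 1.0
N = 200_000
samples = [random.gauss(theta, sigma) for _ in range(N)]

# Score: dL/dθ = (x - θ)/σ².  Hessian: d²L/dθ² = -1/σ².
score_sq = sum(((x - theta) / sigma**2) ** 2 for x in samples) / N  # E[(dL/dθ)²]
neg_hess = sum(1.0 / sigma**2 for _ in samples) / N                 # -E[d²L/dθ²]

print(score_sq, neg_hess)  # both close to 1/σ² = 1
```

The first term carries Monte Carlo error; the second is deterministic here because the Hessian of the Gaussian log-likelihood does not depend on x.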

DrDu
#4
Jul9-12, 03:24 AM
Sci Advisor
P: 3,555
The problem is that I am too lazy to tex something which can be found in any text on statistics or google, e.g.:

http://mark.reid.name/iem/fisher-inf...ikelihood.html
weetabixharry
#5
Jul11-12, 03:45 AM
P: 108
Quote by DrDu:
The problem is that I am too lazy to tex something which can be found in any text on statistics or google
I see.

Thanks - this was useful; its author claims to have spent 6 months pondering this problem, so I'm glad I'm not the only one to have had difficulty finding the solution.
DrDu
#6
Jul11-12, 06:05 AM
Sci Advisor
P: 3,555
Six months is completely unacceptable. As I said, this topic is contained in every book on introductory statistics.
Otherwise, it is a good idea to search for some lecture notes containing the problem: I searched for "fisher information lecture notes" and almost every set of notes contained a proof of the statement, e.g. the first one I got:
http://ocw.mit.edu/courses/mathemati...s/lecture3.pdf
weetabixharry
#7
Jul14-12, 05:47 AM
P: 108
Quote by DrDu:
this topic is contained in every book on introductory statistics.
This is obviously incorrect.

Quote by DrDu:
Elsewise it is a good idea to search for some lecture notes containing the problem
This is no longer necessary. The topic is resolved.

