Fisher Information Matrix: Equivalent Expressions

Discussion Overview

The discussion centers on the Fisher Information Matrix, particularly the equivalence of two expressions for each of its elements in terms of partial derivatives of the log-likelihood function. Participants explore the mathematical details and implications of these expressions in the context of estimating parameters from observations of a complex Gaussian random vector.

Discussion Character

  • Technical explanation
  • Debate/contested
  • Mathematical reasoning

Main Points Raised

  • One participant expresses confusion about the derivation of the (i,j)^{th} element of the Fisher Information Matrix and questions whether they are missing something fundamental regarding partial derivatives.
  • Another participant suggests writing down explicit expressions for the expectation values in terms of the log-likelihood function and its derivatives, proposing the use of partial integration.
  • A different participant notes the complexity of the functions involved and their reluctance to attempt partial integration, indicating frustration over the lack of formal proof for the stated results.
  • One participant criticizes the amount of time reportedly spent on the problem, asserting that the result is covered in every introductory statistics text, and provides a link to lecture notes containing a proof of the statement.
  • Another participant disputes the claim that the result appears in every introductory text, but indicates that further searching is unnecessary because the question has since been resolved.

Areas of Agreement / Disagreement

Participants express differing views on the complexity of the topic and the necessity of formal proofs. While some believe the topic is straightforward and well-documented, others feel that the derivation is not adequately addressed in available resources, leading to unresolved disagreements about the perceived difficulty and clarity of the subject matter.

Contextual Notes

There are indications of varying levels of familiarity with the mathematical concepts involved, and some participants reference external resources that may or may not align with the specific context of the discussion. The discussion reflects a range of assumptions about the accessibility of the material and the need for formal proofs.

weetabixharry
I don't understand the following step regarding the (i,j)^{th} element of the Fisher Information Matrix, \mathbf{J}:
J_{ij}\triangleq\mathcal{E}\left\{ \frac{\partial}{\partial\theta_{i}}L_{\mathbf{x}}(\boldsymbol{\theta})\,\frac{\partial}{\partial\theta_{j}}L_{\mathbf{x}}(\boldsymbol{\theta})\right\} =-\mathcal{E}\left\{ \frac{\partial^{2}}{\partial\theta_{i}\,\partial\theta_{j}}L_{\mathbf{x}}(\boldsymbol{\theta})\right\}

which is given in (Eq. 8.26, on p. 926 of) "Optimum Array Processing" by Harry van Trees. I don't know if the details matter, but L_{\mathbf{x}} is the log-likelihood function and he is looking at the problem of estimating the non-random real vector, \boldsymbol{\theta}, from discrete observations of a complex Gaussian random vector, \mathbf{x}.

Am I missing something obvious? I'm not very sharp on partial derivatives.
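
For reference, the standard route to this identity starts from the normalization of the pdf, written here as p(\mathbf{x};\boldsymbol{\theta}) with L_{\mathbf{x}}(\boldsymbol{\theta})=\ln p(\mathbf{x};\boldsymbol{\theta}). Under the usual regularity conditions (so that differentiation and integration may be interchanged), the score has zero mean:

\mathcal{E}\left\{ \frac{\partial L_{\mathbf{x}}(\boldsymbol{\theta})}{\partial\theta_{i}}\right\} =\int\frac{1}{p}\frac{\partial p}{\partial\theta_{i}}\,p\,d\mathbf{x}=\frac{\partial}{\partial\theta_{i}}\int p\,d\mathbf{x}=\frac{\partial}{\partial\theta_{i}}(1)=0

This lemma is the only ingredient needed for the step suggested below.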
 
L_X is the log of the pdf (the log-likelihood). Write down the explicit expression for the expectation values in terms of L_X and its derivatives and use partial integration.
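
Fleshing out that hint (a sketch under the same regularity assumptions as above): differentiate the zero-mean property of the score with respect to \theta_{j}, using \frac{\partial p}{\partial\theta_{j}}=p\,\frac{\partial L_{\mathbf{x}}}{\partial\theta_{j}}:

0=\frac{\partial}{\partial\theta_{j}}\int\frac{\partial L_{\mathbf{x}}}{\partial\theta_{i}}\,p\,d\mathbf{x}=\int\frac{\partial^{2}L_{\mathbf{x}}}{\partial\theta_{i}\,\partial\theta_{j}}\,p\,d\mathbf{x}+\int\frac{\partial L_{\mathbf{x}}}{\partial\theta_{i}}\,\frac{\partial L_{\mathbf{x}}}{\partial\theta_{j}}\,p\,d\mathbf{x}

Rearranging gives \mathcal{E}\left\{ \frac{\partial L_{\mathbf{x}}}{\partial\theta_{i}}\frac{\partial L_{\mathbf{x}}}{\partial\theta_{j}}\right\} =-\mathcal{E}\left\{ \frac{\partial^{2}L_{\mathbf{x}}}{\partial\theta_{i}\,\partial\theta_{j}}\right\}, which is exactly Eq. 8.26. Nothing here uses the Gaussian structure, so the result is not specific to this problem.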
 
DrDu said:
Write down the explicit expression for the expectation values in terms of L_X and its derivatives and use partial integration.

The expressions are rather intimidating functions of complex matrices. I don't think I want to try partial integration. I tried just evaluating the derivatives to see if they would come out the same, but I ran out of paper before I had even scratched the surface.

If the result is specific to this problem, then I would be willing to take it at face value. It's just frustrating that I keep seeing the result stated without proof. It's as though it's too obvious to warrant a formal proof.
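
A paper-free alternative is a quick Monte Carlo sanity check. The snippet below is an illustration of my own, not from the thread: it swaps the complex-vector problem for a scalar Gaussian with unknown mean \theta and known variance \sigma^{2}, where the score is (x-\theta)/\sigma^{2} and the second derivative is the constant -1/\sigma^{2}:

import numpy as np

# Sanity check: E[(dL/dtheta)^2] should equal -E[d^2L/dtheta^2].
# Toy model (chosen for illustration): x ~ N(theta, sigma^2) with
# unknown mean theta and known variance sigma^2.
rng = np.random.default_rng(0)
theta, sigma = 1.5, 2.0
x = rng.normal(theta, sigma, size=1_000_000)

score = (x - theta) / sigma**2                # dL/dtheta of the Gaussian log-likelihood
hessian = np.full_like(x, -1.0 / sigma**2)    # d^2L/dtheta^2 (constant for this model)

print(np.mean(score**2))   # ~0.25, a Monte Carlo estimate of 1/sigma^2
print(-np.mean(hessian))   # exactly 0.25, i.e. 1/sigma^2

Both numbers agree with the Fisher information 1/\sigma^{2}=0.25 for this model, as the identity predicts.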
 
Six months is completely unacceptable. As I said, this topic is covered in every book on introductory statistics.
Otherwise, it is a good idea to search for lecture notes covering the problem: I searched for "fisher information lecture notes" and almost every set of notes contained a proof of the statement, e.g. the first one I got:
http://ocw.mit.edu/courses/mathemat...ications-fall-2006/lecture-notes/lecture3.pdf
 
DrDu said:
this topic is contained in every book on introductory statistics.
This is obviously incorrect.

DrDu said:
Elsewise it is a good idea to search for some lecture notes containing the problem
This is no longer necessary. The topic is resolved.
 
