I don't understand the following step regarding the [itex](i,j)^{th}[/itex] element of the Fisher information matrix, [itex]\textbf{J}[/itex]:

[tex]J_{ij}\triangleq\mathcal{E}\left\{ \frac{\partial}{\partial\theta_{i}}L_{\textbf{x}}(\boldsymbol{\theta})\,\frac{\partial}{\partial\theta_{j}}L_{\textbf{x}}(\boldsymbol{\theta})\right\} =-\mathcal{E}\left\{ \frac{\partial^{2}}{\partial\theta_{i}\,\partial\theta_{j}}L_{\textbf{x}}(\boldsymbol{\theta})\right\}[/tex]

which is given as Eq. 8.26 on p. 926 of "Optimum Array Processing" by Harry Van Trees. I don't know if the details matter, but [itex]L_{\textbf{x}}[/itex] is the log-likelihood function, and he is considering the problem of estimating the nonrandom real vector [itex]\boldsymbol{\theta}[/itex] from discrete observations of a complex Gaussian random vector [itex]\textbf{x}[/itex].
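For reference, here is a sketch of the standard argument for this equivalence (my summary, not Van Trees's wording); it hinges on the regularity condition that differentiation with respect to [itex]\boldsymbol{\theta}[/itex] and integration over [itex]\textbf{x}[/itex] can be interchanged. Writing [itex]p(\textbf{x};\boldsymbol{\theta})[/itex] for the likelihood, so that [itex]L_{\textbf{x}}(\boldsymbol{\theta})=\ln p(\textbf{x};\boldsymbol{\theta})[/itex], the product rule gives

[tex]\frac{\partial^{2}L_{\textbf{x}}}{\partial\theta_{i}\,\partial\theta_{j}}=\frac{\partial}{\partial\theta_{i}}\left(\frac{1}{p}\frac{\partial p}{\partial\theta_{j}}\right)=\frac{1}{p}\frac{\partial^{2}p}{\partial\theta_{i}\,\partial\theta_{j}}-\frac{\partial L_{\textbf{x}}}{\partial\theta_{i}}\frac{\partial L_{\textbf{x}}}{\partial\theta_{j}}[/tex]

and the expectation of the first term on the right vanishes, because

[tex]\mathcal{E}\left\{\frac{1}{p}\frac{\partial^{2}p}{\partial\theta_{i}\,\partial\theta_{j}}\right\}=\int\frac{\partial^{2}p}{\partial\theta_{i}\,\partial\theta_{j}}\,d\textbf{x}=\frac{\partial^{2}}{\partial\theta_{i}\,\partial\theta_{j}}\int p\,d\textbf{x}=\frac{\partial^{2}}{\partial\theta_{i}\,\partial\theta_{j}}(1)=0.[/tex]

Taking the expectation of the first display and rearranging gives the stated identity.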

Am I missing something obvious? I'm not very sharp on partial derivatives.
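The identity is also easy to sanity-check numerically. Below is a minimal Monte Carlo sketch for a toy scalar model [itex]x\sim\mathcal{N}(\theta,\sigma^{2})[/itex] with known [itex]\sigma[/itex] (my own example, not the array-processing model from the book), where both sides of the identity should equal the Fisher information [itex]1/\sigma^{2}[/itex]:

```python
import numpy as np

# Monte Carlo check of  E[(dL/dθ)²] = -E[d²L/dθ²]
# for the toy model x ~ N(θ, σ²), σ known, where L(θ) = ln p(x; θ).
rng = np.random.default_rng(0)
theta, sigma, n = 2.0, 1.5, 200_000
x = rng.normal(theta, sigma, size=n)

# Analytic derivatives of the per-sample log-likelihood:
#   dL/dθ   = (x - θ) / σ²        (the score)
#   d²L/dθ² = -1 / σ²             (constant for this model)
score = (x - theta) / sigma**2
curvature = -np.ones(n) / sigma**2

lhs = np.mean(score**2)    # E[(dL/dθ)²]
rhs = -np.mean(curvature)  # -E[d²L/dθ²]

print(lhs, rhs)  # both should be close to 1/σ²
```

Both estimates converge to [itex]1/\sigma^{2}\approx 0.444[/itex], consistent with Eq. 8.26.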

**Physics Forums - The Fusion of Science and Community**


# Fisher Information Matrix: Equivalent Expressions

