skwey:
How to understand "Fisher information"?
Hello, I am trying to understand what "Fisher information" is.
It is defined as V[∂/∂θ (ln f(X,θ))] = E[(∂/∂θ (ln f(X,θ)))²].
From Wikipedia:
The Fisher information is a way of measuring the amount of information that an observable random variable X carries about an unknown parameter θ upon which the probability of X depends.
Can you please help me understand why this is the case? How can this be explained by looking at the equation?
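One way to get a feel for the defining identity is to check it numerically. The sketch below (an illustration I am adding, not part of the original question) uses a normal model N(θ, σ²) with known σ, where the score ∂/∂θ ln f(x,θ) = (x − θ)/σ² and the Fisher information per observation has the known closed form 1/σ². Simulating many draws shows that the score has mean ≈ 0, so its variance and its expected square agree, and both match 1/σ²:

```python
import numpy as np

rng = np.random.default_rng(0)
theta, sigma = 2.0, 1.5
x = rng.normal(theta, sigma, size=200_000)

# Score: derivative of the log-density of N(theta, sigma^2) w.r.t. theta,
# evaluated at the true parameter value.
score = (x - theta) / sigma**2

mean_score = score.mean()          # should be close to 0
var_score = score.var()            # V[∂/∂θ ln f(X,θ)]
mean_sq_score = (score**2).mean()  # E[(∂/∂θ ln f(X,θ))²]
fisher_info = 1 / sigma**2         # known closed form for this model

print(mean_score, var_score, mean_sq_score, fisher_info)
```

Because E[score] = 0 at the true θ, the variance of the score equals its second moment, which is exactly why the two sides of the definition coincide. Intuitively, a large score variance means the log-likelihood's slope swings sharply as data vary, so the data discriminate strongly between nearby values of θ.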