Understanding Fisher Information: Visualizations & Explanations

SUMMARY

This discussion focuses on the concept of Fisher Information, defined as the amount of information that an observed random variable conveys about an unseen parameter. The conversation highlights the need for visualizations and clear explanations, particularly regarding the variance of the score, the Cramér-Rao inequality, and the Fisher information matrix as a Riemannian metric. Participants express a desire for more intuitive understanding of these mathematical concepts, emphasizing the importance of effective communication in mathematics.

PREREQUISITES
  • Understanding of random variables and their properties
  • Familiarity with the Cramér-Rao inequality
  • Basic knowledge of Riemannian geometry
  • Experience with visualizing mathematical concepts
NEXT STEPS
  • Research the implications of Fisher Information in statistical inference
  • Explore visualizations of the Fisher information matrix
  • Study the relationship between Fisher Information and differential geometry
  • Learn about applications of the Cramér-Rao inequality in estimation theory
USEFUL FOR

Mathematicians, statisticians, data scientists, and anyone interested in deepening their understanding of statistical inference and information theory.

saviourmachine
Maybe it is a good idea to have mathematical tutorials like in the subforum about Physics. I have no homework anymore :smile: but I tend to think that doing mathematics is a skill that always pays off in the end.

The first thing I want to know is what https://en.wikipedia.org/wiki/Fisher_information is. According to Wikipedia, it is the amount of information that an observed random variable carries about an unknown parameter. So, let us consider coin-tossing examples. The unknown parameter is the degree to which the coin is biased. The observed random variable is a binary sequence of data points labelled "head" or "tail".
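For this coin example the Fisher information can be computed concretely: it is the variance of the score (the derivative of the log-likelihood with respect to the bias p), which for n tosses works out to n / (p(1-p)). Below is a minimal Python sketch of this check; the function name and the exact-summation approach are my own illustration, not something from the thread.

```python
import math

def fisher_information_bernoulli(n, p):
    """Fisher information of the bias p from n coin tosses,
    computed as E[score^2] by exact summation over all outcomes.
    (E[score] = 0, so E[score^2] is the variance of the score.)"""
    info = 0.0
    for k in range(n + 1):
        # P(k heads) under Binomial(n, p)
        prob = math.comb(n, k) * p**k * (1 - p)**(n - k)
        # score = d/dp log P(k heads) = k/p - (n-k)/(1-p)
        score = k / p - (n - k) / (1 - p)
        info += prob * score**2
    return info

n, p = 10, 0.3
print(fisher_information_bernoulli(n, p))  # agrees with n / (p * (1 - p))
print(n / (p * (1 - p)))
```

The agreement with the closed form n / (p(1-p)) is exact (up to floating point), since the sum runs over every possible outcome rather than sampling.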

I would like an explanation that uses many visualizations and is not afraid of explaining things when it heads into https://en.wikipedia.org/wiki/Information_geometry, for example. I don't need to be able to derive everything myself, but I want to understand all the terminology.

What is meant by the variance of the score? What exactly is the relation with the Cramér-Rao inequality? What is meant by the Fisher information matrix being a Riemannian metric? Is the distance between points on a differentiable manifold equal to the amount of information between those points? Can Fisher information be seen as an amount of curvature?
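On the Cramér-Rao question, the coin example makes the inequality tangible: the bound says any unbiased estimator of p has variance at least 1 / I(p), and the sample proportion k/n attains it exactly. A small Python sketch (my own illustration of this standard fact, assuming the same Binomial(n, p) setup as above):

```python
import math

def var_sample_proportion(n, p):
    """Exact variance of the estimator p_hat = k/n
    for k ~ Binomial(n, p), via first and second moments."""
    mean = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * (k / n)
               for k in range(n + 1))
    second = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) * (k / n)**2
                 for k in range(n + 1))
    return second - mean**2

n, p = 10, 0.3
# Cramér-Rao bound: 1 / I(p), with I(p) = n / (p(1-p)) for n tosses
cramer_rao_bound = p * (1 - p) / n
print(var_sample_proportion(n, p), cramer_rao_bound)  # the two coincide
```

So here the bound is not just a lower limit but is achieved, which is what makes the maximum-likelihood estimator for the coin bias efficient.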

I'm curious what kind of world this crystallizes into in someone's mind. Mathematics sometimes obscures things instead of just formalizing them. It's possible to do better! For example, http://www.av8n.com/physics/clifford-intro.htm by John Denker is an amazingly down-to-earth explanation of Clifford Algebra, a "quite" complex concept. Thanks a lot in advance!
 
It would be easier to answer specific questions about what, on the websites you quoted, you do and do not understand. There are entire books written on these subjects, which makes it difficult to answer at such a level of generality.

Whether the explanation of mathematics can be done better is probably a debate as old as mathematics itself. In my experience it depends very much on the individual it is explained to.
 
