Understanding Fisher Information: Visualizations & Explanations

In summary, the conversation touched on the idea of having mathematics tutorials, the concept of Fisher information, and the challenges and benefits of explaining complex mathematical concepts. The participants also discussed the use of visualizations, the relationship between the variance of the score and the Cramér-Rao inequality, and the Fisher information matrix as a Riemannian metric with its connection to differentiable manifolds. Finally, they explored how explanations of mathematics might be simplified and improved.
  • #1
saviourmachine
Maybe it is a good idea to have mathematics tutorials like those in the Physics subforum. I have no homework anymore :smile: but I tend to think that doing mathematics is a skill that always pays off in the end.

The first thing I want to know is what https://en.wikipedia.org/wiki/Fisher_information is. According to Wikipedia, it is the amount of information that an observed random variable carries about an unknown parameter. So, let us consider coin-tossing examples. The unknown parameter is the degree to which the coin is biased. The observed random variable is a binary sequence of data points labelled "head" or "tail".
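(For concreteness, here is the standard textbook result for exactly this coin model, writing p for the bias and assuming n independent tosses of which k come up heads:)

```latex
% Log-likelihood of n independent tosses of a coin with bias p, of which k came up heads:
\ell(p) = k \log p + (n - k) \log(1 - p)

% Fisher information: the variance of the score, equivalently minus the expected curvature:
I_n(p) = \operatorname{E}\!\left[\left(\frac{d\ell}{dp}\right)^{2}\right]
       = -\operatorname{E}\!\left[\frac{d^{2}\ell}{dp^{2}}\right]
       = \frac{n}{p(1 - p)}
```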

I would like an explanation that uses many visualizations and is not afraid to keep explaining when it heads into https://en.wikipedia.org/wiki/Information_geometry, for example. I don't have to be able to derive everything myself, but I want to understand all of the terminology.

What is meant by the variance of the score? What exactly is the relation with the Cramér-Rao inequality? What is meant by the Fisher information matrix being a Riemannian metric? Is the distance between points on a differentiable manifold equal to the amount of information between those points? Can Fisher information be seen as an amount of curvature?

I'm curious about what kind of world this crystallizes into in someone's mind. Mathematics sometimes obscures things instead of just formalizing them. It's possible to do better! John Denker's http://www.av8n.com/physics/clifford-intro.htm on Clifford algebra, for example, is an amazing, down-to-earth explanation of a "quite" complex concept. Thanks a lot in advance!
 
  • #2
It would be easier to answer if you asked specific questions about what, on the websites you quoted, you do and do not understand. There are books written on these subjects, which makes it difficult to answer in such generality.

Whether the explanation of mathematics can be done better is probably a discussion as old as mathematics itself. In my experience it depends very much on the individual it is being explained to.
 

1. What is Fisher Information?

Fisher information is a mathematical quantity that measures how much information observed data provide about an unknown parameter. It is commonly used in statistics and machine learning to quantify how sensitive a statistical model's likelihood is to changes in its parameters.
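In symbols, one standard formulation (assuming a single scalar parameter θ and a model density f(x; θ) that is smooth enough in θ) is:

```latex
% Score: derivative of the log-likelihood with respect to the parameter
s(\theta; x) = \frac{\partial}{\partial \theta} \log f(x; \theta)

% Fisher information: variance of the score (its mean is zero under regularity conditions),
% equivalently the negative expected second derivative of the log-likelihood
I(\theta) = \operatorname{E}\!\left[ s(\theta; X)^{2} \right]
          = -\operatorname{E}\!\left[ \frac{\partial^{2}}{\partial \theta^{2}} \log f(X; \theta) \right]
```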

2. How is Fisher Information visualized?

Fisher information can be visualized in various ways, for example by plotting it as a function of the parameter, or by displaying the entries of the Fisher information matrix as a heat map when there are several parameters. Such visualizations give a better sense of which parameter values, and which parameters, the data are most informative about.
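As a minimal sketch of the simplest such plot, assuming NumPy and matplotlib are available, the Fisher information of a single coin toss from post #1 can be graphed as a function of the bias p; the standard Bernoulli result I(p) = 1/(p(1 − p)) shows that a toss is most informative when the coin is heavily biased:

```python
import numpy as np
import matplotlib.pyplot as plt

# Fisher information of a single Bernoulli(p) toss: I(p) = 1 / (p * (1 - p)).
p = np.linspace(0.01, 0.99, 200)
fisher_info = 1.0 / (p * (1.0 - p))

plt.plot(p, fisher_info)
plt.xlabel("coin bias p")
plt.ylabel("Fisher information I(p)")
plt.title("One coin toss carries the most information near p = 0 or p = 1")
plt.show()
```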

3. Why is Fisher Information important?

Fisher information is important because it tells us how well a statistical model can estimate its parameters: by the Cramér-Rao inequality, the inverse of the Fisher information is a lower bound on the variance of any unbiased estimator. It also allows us to compare different models and select the one that provides the most information about the parameters of interest.

4. How is Fisher Information calculated?

Fisher information can be calculated as the negative expected value of the second derivative of the log-likelihood function, which measures the expected curvature of the log-likelihood at the true parameter. Equivalently, it is the variance of the score function (the first derivative of the log-likelihood), which measures how sensitive the log-likelihood is to changes in the parameters; since the score has expected value zero, this variance equals the expected value of the squared score.
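Here is a minimal numerical sketch of both routes for the coin model, with an arbitrary illustrative bias p_true and Monte Carlo sample size; both estimates should land near the closed-form value 1/(p(1 − p)):

```python
import numpy as np

rng = np.random.default_rng(0)
p_true = 0.3          # illustrative coin bias
n_samples = 200_000   # Monte Carlo sample size

# Simulate single coin tosses X ~ Bernoulli(p_true).
x = rng.binomial(1, p_true, size=n_samples)

# Score: derivative of log f(x; p) = x*log(p) + (1 - x)*log(1 - p) with respect to p.
score = x / p_true - (1 - x) / (1 - p_true)

# Route 1: Fisher information as the variance of the score (its mean is ~0).
fisher_from_score = score.var()

# Route 2: minus the expected second derivative of the log-likelihood.
second_derivative = -x / p_true**2 - (1 - x) / (1 - p_true) ** 2
fisher_from_curvature = -second_derivative.mean()

analytic = 1.0 / (p_true * (1.0 - p_true))  # closed form: 1 / (p(1 - p)) ≈ 4.76
print(fisher_from_score, fisher_from_curvature, analytic)
```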

5. What are the applications of Fisher Information?

Fisher information has many applications in statistics and machine learning, such as parameter estimation, model selection, and hypothesis testing. It is also used in fields like econometrics, physics, and biology to quantify how sensitive a model is to changes in its parameters.
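As a rough illustration of the parameter-estimation application, and of the Cramér-Rao question raised in post #1, the sketch below (with arbitrary illustrative experiment sizes) repeats a coin experiment many times and compares the empirical variance of the maximum-likelihood estimate of the bias with the Cramér-Rao bound 1/(n·I(p)):

```python
import numpy as np

rng = np.random.default_rng(1)
p_true = 0.3        # illustrative coin bias
n_tosses = 100      # tosses per experiment
n_repeats = 50_000  # number of repeated experiments

# The maximum-likelihood estimate of the bias is simply the fraction of heads.
tosses = rng.binomial(1, p_true, size=(n_repeats, n_tosses))
p_hat = tosses.mean(axis=1)

empirical_variance = p_hat.var()
cramer_rao_bound = (p_true * (1.0 - p_true)) / n_tosses  # = 1 / (n * I(p))

# For this model the MLE attains the bound: both values are ≈ 0.0021.
print(empirical_variance, cramer_rao_bound)
```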
