Maybe it is a good idea to have mathematical tutorials like the ones in the Physics subforum. I don't have homework anymore, but I tend to think that doing mathematics is a skill that always pays off in the end.

First, I want to know what Fisher information is. According to Wikipedia, it is the amount of information that an observed random variable carries about an unknown parameter. So, let us consider coin-tossing examples. The unknown parameter is the degree to which the coin is biased. The observed random variable is a binary sequence of data points labelled "head" or "tail".

I would like an explanation that uses many visualizations and is not afraid to keep explaining when it heads into "Information Geometry", for example. I don't need to be able to derive everything myself, but I want to understand all of the terminology. What is meant by the variance of the score? What exactly is the relation with the Cramér–Rao inequality? What is meant by the Fisher information matrix being a Riemannian metric? Is the distance between points on a differentiable manifold equal to the amount of information between those points? Can Fisher information be seen as an amount of curvature?

I'm curious what kind of world this crystallizes into in someone's mind. Mathematics sometimes obscures things instead of just formalizing them. It's possible to do better! This article by John Denker about Clifford algebra, for example, is an amazing piece of down-to-earth explanation of a "quite" complex concept.

Thanks a lot in advance!
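To make the coin-tossing example concrete, here is a small numerical sketch of "Fisher information as the variance of the score" for a single Bernoulli trial. The bias value `p = 0.3` and the sample size are my own assumptions, just for illustration; the closed form `1/(p(1-p))` is the standard Fisher information of one coin toss.

```python
import numpy as np

rng = np.random.default_rng(0)
p = 0.3        # assumed true bias of the coin (hypothetical value)
n = 200_000    # number of simulated tosses

# One coin toss: x = 1 ("head") with probability p, else 0 ("tail").
x = rng.random(n) < p

# Score of a single observation: d/dp log P(x | p)
#   = x/p - (1 - x)/(1 - p)
score = np.where(x, 1.0 / p, -1.0 / (1.0 - p))

# Fisher information is the variance of the score
# (its mean is zero, so the variance equals E[score^2]).
empirical = score.var()
analytic = 1.0 / (p * (1.0 - p))   # known closed form for a Bernoulli trial

print(empirical, analytic)   # the two numbers should be close
```

Running this, the empirical variance of the score lands close to the analytic value, which is exactly the sense in which "information" here is the variance of the score.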