This all sounds very interesting. I hadn't heard of using FI to derive physics. KL-divergence isn't quite cross-entropy, although they are closely related (https://tdhopper.com/blog/cross-entropy-and-kl-divergence). Fisher information is the curvature of the KL-divergence (https://en.wikipedia.org/wiki/Kullback–Leibler_divergence#Fisher_information_metric). If you are interested, look up information geometry, where Fisher information is used as a metric in a differential-geometry formulation of information theory.

Thanks for the references. I had not heard of Friston before, but my wife (a cognitive neuroscientist doing fMRI) most certainly had.

Frieden says that the relation between FI and K-LD (a.k.a. "cross-entropy") is that FI "is proportional to the cross-entropy between the PDF p(x) and a reference PDF that is its shifted version p(x + ∆x)." That makes intuitive sense to me, because sharp transitions would make that cross-entropy large.
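A quick numerical sketch of that relation (my own check, not from Frieden): for a Gaussian p(x) = N(0, σ²), the Fisher information for a location parameter has the closed form I = 1/σ², and the KL-divergence between p(x) and its shifted version p(x + ∆x) should behave like (1/2)·I·∆x² for small ∆x.

```python
import numpy as np

sigma = 2.0   # Gaussian width; Fisher information for location is 1/sigma^2
dx = 1e-3     # small shift applied to the PDF

# Dense grid covering the bulk of both Gaussians
x = np.linspace(-20.0, 20.0, 200001)
step = x[1] - x[0]

p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
q = np.exp(-(x + dx)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

# Riemann-sum approximation of KL(p || p_shifted)
kl = np.sum(p * np.log(p / q)) * step

fisher = 1.0 / sigma**2  # closed-form Fisher information for this family

# KL / dx^2 should approach (1/2) * Fisher information as dx -> 0
print(kl / dx**2, 0.5 * fisher)
```

For two Gaussians of equal width shifted by ∆x the KL-divergence is exactly ∆x²/(2σ²), so the two printed numbers agree closely, which is the "curvature" statement in miniature.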

Since it is possible to derive relativistic QM (including the Dirac equation) from FI (see Frieden, chapter 4), I wonder what you would get if you derived QM from the K-LD instead?