What are some good resources for learning about information theory?

SUMMARY

This discussion centers on resources for learning about information theory, particularly its applications in physics and statistical mechanics. Key recommendations include "An Introduction to Information Theory: Symbols, Signals and Noise" by Pierce and "Information Theory: A Tutorial Introduction" by Stone for gentle introductions. For a more physics-oriented approach, "Principles of Statistical Mechanics" by A. Katz and "Information Theory, Inference, and Learning Algorithms" by David MacKay are highlighted. Additional resources include Shannon's foundational paper and various physics texts that incorporate information theory concepts.

PREREQUISITES
  • Basic understanding of statistical mechanics
  • Familiarity with entropy concepts
  • Knowledge of Shannon's information theory
  • Exposure to mathematical approximations like Stirling's approximation
NEXT STEPS
  • Read "Information Theory, Inference, and Learning Algorithms" by David MacKay
  • Explore Shannon's original paper on entropy
  • Study "Statistical Physics of Particles" by Mehran Kardar
  • Investigate the chapter on the noisy channel coding theorem in MacKay's book
USEFUL FOR

Students and professionals in physics, statisticians, and anyone interested in the intersection of information theory and statistical mechanics will benefit from this discussion.

AndreasC
While studying statistical mechanics and a few other subjects, I keep running into "information theory". For instance, I've seen it invoked in connection with entropy and with some theorems of statistical mechanics, and I even heard about it in a Carl Bender lecture, where he said that information-theoretic techniques can produce series that approximate functions or solutions better than Padé or Taylor series. I find all of this very interesting, but I have no idea what to look for to learn more; it seems to be one of those subjects that people from different disciplines use in very different ways. Does anyone have any good sources for gaining some understanding?
 
For me, the best book for physicists is

A. Katz, Principles of Statistical Mechanics, W. H. Freeman
and Company, San Francisco and London (1967).
 
http://www.inference.org.uk/mackay/itprnn/book.html
Information Theory, Inference, and Learning Algorithms
David MacKay

There are two major ideas in information theory. The first is the quantification of information, which is closely related to the entropy of statistical physics; in physics we usually get at it through Stirling's approximation. In MacKay's book you can find this material in the discussion of the asymptotic equipartition property.
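
A quick numerical sketch of that first point, since it is easy to see concretely (this is my own illustration, not an excerpt from MacKay; the distribution p and the block lengths N below are arbitrary example values): the log of the number of ways to arrange N symbols with fixed frequencies, divided by N, approaches the Shannon entropy as N grows, and Stirling's approximation is exactly what makes that work.

Python:
# Compare (1/N) * ln(multinomial coefficient) with the Shannon entropy H(p).
# The two converge as N grows; this is the counting argument behind the
# asymptotic equipartition property mentioned above.
from math import lgamma, log

def entropy(p):
    """Shannon entropy of a discrete distribution, in nats."""
    return -sum(pi * log(pi) for pi in p if pi > 0)

def log_multiplicity(counts):
    """ln of the multinomial coefficient N! / (n_1! n_2! ...)."""
    n_total = sum(counts)
    return lgamma(n_total + 1) - sum(lgamma(n + 1) for n in counts)

p = [0.5, 0.3, 0.2]              # example probabilities (arbitrary choice)
for n in (10, 100, 10_000):      # example block lengths
    counts = [round(n * pi) for pi in p]
    print(n, log_multiplicity(counts) / sum(counts), entropy(p))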

The second major idea concerns how well information can be transmitted in the presence of noise. In MacKay's book it is covered in the chapter on the noisy-channel coding theorem. As far as I understand, this idea is not used much in physics.
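
To make that second idea concrete with the standard textbook example (again my own illustration, with arbitrary flip probabilities): a binary symmetric channel that flips each transmitted bit with probability f has capacity C = 1 - H2(f) bits per channel use, where H2 is the binary entropy, and the noisy-channel coding theorem says reliable communication is possible at any rate below C.

Python:
# Capacity of a binary symmetric channel, C = 1 - H2(f).
from math import log2

def h2(f):
    """Binary entropy function, in bits."""
    if f in (0.0, 1.0):
        return 0.0
    return -f * log2(f) - (1 - f) * log2(1 - f)

def bsc_capacity(f):
    """Bits per channel use for a channel flipping each bit with probability f."""
    return 1.0 - h2(f)

for f in (0.0, 0.01, 0.1, 0.5):  # example flip probabilities
    print(f, bsc_capacity(f))
# At f = 0.5 the output carries no information about the input, so C = 0.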

Two physics books that discuss basic information theory are

https://www.amazon.com/dp/0521873428/?tag=pfamazon01-20
https://ocw.mit.edu/courses/physics...-fall-2013/lecture-notes/MIT8_333F13_Lec6.pdf (free notes on which the book is based)
Statistical Physics of Particles
Mehran Kardar

https://www.amazon.com/dp/0486497550/?tag=pfamazon01-20
Science and Information Theory
Leon Brillouin

An example of information-like quantities in the research literature is:

https://arxiv.org/abs/1007.4825
Renyi entropy, mutual information, and fluctuation properties of Fermi liquids
Brian Swingle
 
I love The Information by James Gleick. One of my favorite reads; not technical, but fascinating.
 
W. T. Grandy, "Resource letter ITP-1: Information theory in physics," Am. J. Phys. 65, 466–476 (1997)

is a guide to the literature.
 
Elements of Information Theory by Cover and Thomas is one of the best general texts. Information, Physics, and Computation by Mézard and Montanari applies ideas from information theory and the physics of disordered systems to computation.

I also second David MacKay’s book.
 
caz said:
W. T. Grandy, "Resource letter ITP-1: Information theory in physics," Am. J. Phys. 65, 466–476 (1997)

is a guide to the literature.
Very interesting, I'll check this out first and then look at the rest.
 
I also highly recommend E. T. Jaynes's Phys. Rev. article showing the link between information theory and statistical mechanics:
Jaynes's papers at Wash U
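
In case it helps to know what to expect before opening them: as I understand the central result of those papers, maximizing the Gibbs–Shannon entropy subject to normalization and a fixed mean energy reproduces the canonical (Boltzmann) distribution. A sketch of the standard Lagrange-multiplier calculation:
$$\mathcal{L} = -\sum_i p_i \ln p_i - \alpha\Big(\sum_i p_i - 1\Big) - \beta\Big(\sum_i p_i E_i - \langle E \rangle\Big),$$
and setting ##\partial \mathcal{L} / \partial p_i = 0## gives
$$p_i = \frac{e^{-\beta E_i}}{Z}, \qquad Z = \sum_i e^{-\beta E_i},$$
with ##\beta## fixed by the constraint ##\langle E \rangle = -\partial \ln Z / \partial \beta##.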
 
