paralleltransport
Dear all,
I'm trying to have an intuition of what Kolmogorov Entropy for a dynamical system means. In particular,
1. What is Kolmogorov entropy trying to quantify? (What are the units of KS entropy: bits, or bits per second?)
2. What is the relation of KS entropy to Shannon Entropy?
3. In what way does Kolmogorov entropy successfully answer question (1)? Why is the definition intuitively the "right" one?
I've been trying to read up on it, but the literature seems dominated by math jargon that a physics person finds a little intimidating. My background: I know statistical mechanics at the graduate level.
For comparison, here is how I would answer these three questions for Shannon entropy:
1. It quantifies the minimum number of bits (or symbols of some alphabet) needed to uniquely specify a particular state of a system. This also tells us how "unpredictable" the system is: if more bits are needed, you need more information to really "pin down" the microstate of the system.
2. Well, Shannon entropy is Shannon entropy, so this relation is trivial here.
3. The expectation -E[\ln p] reduces correctly in very simple cases. For example, suppose I have N balls, one of which is heavier, with each ball equally likely. What is the minimum number of bits needed to identify it? Each outcome has p = {1 \over N}, and if you compute the entropy you get \log N, which makes sense: \log N should be the number of symbols needed to code for a microstate.
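As a quick numeric sanity check of the uniform-ball example above (the helper name and the choice N = 8 are my own, just for illustration):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum_i p_i * log2(p_i)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# N equally likely outcomes (one heavy ball among N):
N = 8
uniform = [1 / N] * N

# For the uniform distribution, H = log2(N) exactly,
# i.e. 3 bits are needed to single out one of 8 balls.
print(shannon_entropy(uniform))  # 3.0
print(math.log2(N))              # 3.0
```

With log base 2 the answer comes out in bits; using \ln instead just rescales it to nats.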
Thank you!