Kolmogorov Entropy: Intuitive Understanding & Relation to Shannon Entropy

In summary: Kolmogorov entropy is often used to quantify the complexity of systems, such as computer programs or biological cells. By measuring the amount of information needed to predict a system's future states, it provides a quantitative measure of how unpredictable the system is, and it allows the complexity of different systems to be compared: for example, computer programs with higher Kolmogorov entropy are usually considered more complex than those with lower entropy.
  • #1
paralleltransport
Dear all,

I'm trying to have an intuition of what Kolmogorov Entropy for a dynamical system means. In particular,

1. What is Kolmogorov entropy trying to quantify? (What are the units of KS entropy: bits, or bits/second?)
2. What is the relation of KS entropy to Shannon Entropy?
3. In what way does Kolmogorov entropy successfully quantify what question (1) asks about? Why is the definition intuitively the "right" one?

I've been trying to read up on it, but the literature seems dominated by math jargon that leaves a physics person a little intimidated. My background: I know statistical mechanics at the graduate level.

For example, for Shannon entropy the answers would be:

1. It tries to quantify the minimum # of bits (or symbols of some alphabet) needed to uniquely specify a particular state of a system. This also tells us how "unpredictable" the system is: if more bits are needed, you need more information to really "pin down" the microstate of the system.

2. Well, Shannon entropy is Shannon entropy, so this one is trivial in that case.

3. $\mathbb{E}[-\ln p]$ reduces nicely in very simple cases: for example, say I have $N$ balls, one of which is heavier, each equally likely. What is the minimum # of bits? Here $p = \frac{1}{N}$, and if you compute it, it makes sense that $\log(N)$ should be the # of symbols needed to encode a microstate. (A quick numerical check is sketched below.)
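Here is that quick numerical check (a minimal Python sketch of my own; the value of N and the biased example are only illustrative): the Shannon entropy of a uniform distribution over N outcomes is exactly log2(N) bits, and a non-uniform distribution over the same outcomes needs fewer bits on average.

```python
import math

def shannon_entropy_bits(probs):
    """H = -sum_i p_i log2(p_i), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 8
uniform = [1 / N] * N                  # N equally likely balls
print(shannon_entropy_bits(uniform))   # 3.0 = log2(8): 3 bits pin down the heavy ball

biased = [0.5, 0.25, 0.125, 0.125]     # non-uniform case: fewer bits needed on average
print(shannon_entropy_bits(biased))    # 1.75 < log2(4) = 2
```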

Thank you!
 
  • #2
Kolmogorov entropy (also called Kolmogorov-Sinai or KS entropy; it is a different notion from the algorithmic Kolmogorov complexity discussed further below) is a measure of the unpredictability of a dynamical system. It is defined as the rate of growth of the number of distinct trajectory patterns the system can produce as it is observed for longer times, so its units are bits per unit time: typically bits/second for a flow, or bits per iteration for a map.

KS entropy is related to Shannon entropy in that both quantify information, but in different ways. Shannon entropy measures the average information content of a single observation, given its probability distribution. KS entropy measures the average amount of new information each further observation of the trajectory supplies, i.e. the rate at which information must keep arriving to pin down the state of the system.

The definition is intuitively the "right" one because it captures the idea that the more unpredictable a system is, the more information is needed to specify its exact state. For example, if you have two balls and one of them is heavier, you need log(2) = 1 bit of information to identify which one it is; with N equally likely balls you need log(N) bits. KS entropy asks the same question of a trajectory: how many additional bits per unit time are needed to keep identifying which coarse-grained state the system is in.
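As a concrete, purely illustrative numerical sketch of "bits per iteration": the logistic map x -> 4x(1-x) is known to have KS entropy ln 2, i.e. 1 bit per iteration. If each observation is coarse-grained into a 0/1 symbol with the partition [0, 1/2) / [1/2, 1], the Shannon entropy H_n of length-n symbol blocks grows by about 1 bit for each added symbol, and that growth rate is the KS entropy estimate. The script below is my own sketch (the map, partition, transient length, and sample size are assumptions), not something from the thread:

```python
import math
import random
from collections import Counter

def symbol_sequence(n_steps, seed=0):
    """Iterate the logistic map x -> 4x(1-x) and record a 0/1 symbol per step
    using the binary partition [0, 0.5) / [0.5, 1]."""
    x = random.Random(seed).random()
    for _ in range(100):              # discard a transient
        x = 4.0 * x * (1.0 - x)
    symbols = []
    for _ in range(n_steps):
        symbols.append(0 if x < 0.5 else 1)
        x = 4.0 * x * (1.0 - x)
    return symbols

def block_entropy_bits(symbols, n):
    """Shannon entropy (in bits) of the empirical distribution of length-n blocks."""
    counts = Counter(tuple(symbols[i:i + n]) for i in range(len(symbols) - n + 1))
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values())

s = symbol_sequence(200_000)
for n in range(1, 8):
    # entropy rate estimate: extra bits gained by observing one more symbol
    h = block_entropy_bits(s, n + 1) - block_entropy_bits(s, n)
    print(f"block length {n + 1}: entropy rate ~ {h:.3f} bits/iteration")
# the estimates should sit near 1 bit/iteration, i.e. h_KS = ln 2 per step
```

For a flow sampled every Dt seconds, one would divide the per-step rate by Dt to get bits/second, which is why KS entropy is a rate rather than a plain number of bits; that is the units part of question (1).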
 

What is Kolmogorov Entropy?

Kolmogorov Entropy in this algorithmic sense, also known as algorithmic entropy or Kolmogorov complexity, is a measure of the randomness or complexity of a string of data (this algorithmic notion is distinct from the Kolmogorov-Sinai entropy of a dynamical system discussed above). It is based on the idea that the complexity of a string can be measured by the length of the shortest computer program that can produce that string.
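Kolmogorov complexity itself cannot be computed exactly, but the length achieved by a general-purpose compressor is a common, computable upper-bound proxy: the compressed data plus a fixed decompressor is itself a program that reproduces the string. A minimal sketch (the use of zlib and the particular strings are my own illustrative choices):

```python
import os
import zlib

def compressed_size(data: bytes) -> int:
    """zlib-compressed length: a computable upper-bound stand-in for the
    (uncomputable) Kolmogorov complexity of the data."""
    return len(zlib.compress(data, 9))

repetitive = b"ab" * 5000          # highly structured: a tiny program could print it
random_ish = os.urandom(10_000)    # essentially incompressible

print("repetitive:", compressed_size(repetitive), "bytes")   # a few tens of bytes
print("random:    ", compressed_size(random_ish), "bytes")   # close to 10,000 bytes
```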

How is Kolmogorov Entropy related to Shannon Entropy?

Kolmogorov Entropy and Shannon Entropy are both measures of information, but they approach it from different perspectives. Shannon Entropy is defined from the probability distribution a string is drawn from, while Kolmogorov Entropy depends only on the individual string itself. They are mathematically related: for strings drawn from a well-behaved (computable) source, the expected Kolmogorov complexity of a length-n string equals n times the Shannon entropy per symbol up to lower-order corrections, so the complexity per symbol approaches the Shannon entropy as the string length grows.
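That relation can be seen numerically by again using a compressor as a rough stand-in for the shortest program (this is my own sketch; zlib is not an optimal code, so the numbers only track the Shannon entropy approximately, and from above): generate long i.i.d. biased-coin sequences and compare compressed bits per symbol with H(p) = -p log2(p) - (1-p) log2(1-p).

```python
import math
import random
import zlib

def shannon_entropy_bits(p):
    """Entropy of a biased coin with P(1) = p, in bits per symbol."""
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def compressed_bits_per_symbol(p, n=400_000, seed=0):
    """Compressed length, in bits per coin flip, of an i.i.d. biased sequence.
    Flips are packed 8 to a byte so the compressor can see the redundancy."""
    rng = random.Random(seed)
    packed = bytearray()
    for _ in range(n // 8):
        byte = 0
        for _ in range(8):
            byte = (byte << 1) | (1 if rng.random() < p else 0)
        packed.append(byte)
    return 8 * len(zlib.compress(bytes(packed), 9)) / n

for p in (0.5, 0.9, 0.99):
    print(f"p = {p}:  H = {shannon_entropy_bits(p):.3f} bits/symbol,  "
          f"compressed ~ {compressed_bits_per_symbol(p):.3f} bits/symbol")
```

The compressed rate sits somewhat above H(p) because zlib is not an optimal code; a better compressor would get closer, and for typical sequences from such a source the (uncomputable) Kolmogorov complexity per symbol converges to H(p).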

Can Kolmogorov Entropy be calculated for any type of data?

In principle, Kolmogorov Entropy is defined for any type of data, but only once that data has been encoded as a finite string (conventionally a binary string). This is because the definition is phrased in terms of computer programs, which operate on such strings; since any digital data can be serialized to a binary string, this is not a strong restriction in practice.

Why is Kolmogorov Entropy useful?

Kolmogorov Entropy has applications in various fields such as computer science, information theory, and physics. It can be used to measure the complexity of data, assess the randomness of a sequence, and identify patterns in data. It also has implications for the study of complexity and emergence in complex systems.

Are there any limitations or criticisms of Kolmogorov Entropy?

One limitation of Kolmogorov Entropy is that it is not computable, even in principle: no algorithm can find the shortest program that produces a given string, so in practice one works with upper bounds such as compressed file sizes. Its value also depends on the choice of universal Turing machine used in the definition, although only up to an additive constant that becomes negligible for long strings. There have also been criticisms of its applicability to real-world data, since practical estimates rely on idealized assumptions that do not always hold.
