
A Kolmogorov Entropy

  1. Apr 2, 2016 #1
    Dear all,

    I'm trying to develop an intuition for what the Kolmogorov entropy of a dynamical system means. In particular,

    1. What is Kolmogorov entropy trying to quantify? (What are the units of KS entropy — bits, or bits per second?)
    2. What is the relation of KS entropy to Shannon entropy?
    3. In what way does Kolmogorov entropy successfully quantify whatever question (1) asks about? Why is the definition intuitively the "right" one?

    I've been trying to read up on it, but the literature seems dominated by mathematical jargon that a physics person finds a bit intimidating. For background, I know statistical mechanics at the graduate level.

    For comparison, here is how I would answer the same three questions for Shannon entropy:

    1. It quantifies the minimum number of bits (or symbols of some alphabet) needed to uniquely specify a particular state of a system. This also tells us how "unpredictable" the system is: if more bits are needed, you need more information to really pin down the microstate.

    2. Trivially answered here — Shannon entropy is its own point of comparison.

    3. The expectation ##\mathbb{E}[-\ln p] = -\sum_i p_i \ln p_i## reduces sensibly in simple cases. For example, say I have N balls, one of which is heavier, each equally likely to be the heavy one. What is the minimum number of bits to identify it? Each outcome has ##p = {1 \over N}##, and if you compute the entropy you get ##\ln N##, which makes sense: ##\log_2 N## is the number of binary symbols needed to code for a microstate.
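    The worked example in (3) can be checked numerically. Below is a minimal sketch (my own illustration, not from any particular library): it computes the Shannon entropy ##-\sum_i p_i \log_2 p_i## in bits and confirms that a uniform distribution over N outcomes gives ##\log_2 N##.

    ```python
    import math

    def shannon_entropy_bits(probs):
        """Shannon entropy in bits: -sum_i p_i * log2(p_i).

        Terms with p = 0 contribute nothing (lim p->0 of p*log p is 0),
        so they are skipped.
        """
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # N equally likely outcomes (one of N balls is the heavy one):
    N = 8
    uniform = [1.0 / N] * N
    H = shannon_entropy_bits(uniform)
    print(H)  # 3.0 == log2(8): bits needed to single out one ball
    ```

    With a biased distribution the entropy drops below ##\log_2 N##, matching the intuition that a more predictable system needs fewer bits to pin down.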

    Thank you!
  3. Apr 7, 2016 #2
    Thanks for the post! This is an automated courtesy bump. Sorry you aren't generating responses at the moment. Do you have any further information, come to any new conclusions or is it possible to reword the post?
