
The entropy of a set

  1. Jan 11, 2005 #1
    What is the entropy of a set?

    One of these two should serve as a general guideline:

    # A measure of the disorder or randomness in a closed system.
    # A measure of the loss of information in a transmitted message.

    I've seen topological entropy (Bowen) and the entropy of random variables, but what about the entropy of sets?
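    For concreteness, the second notion (the entropy of a random variable) is Shannon entropy, H(X) = -Σ p(x) log_2 p(x). A minimal sketch in Python, with made-up distributions just for illustration:

    import math

    def shannon_entropy(probs):
        # Shannon entropy H = -sum p * log2(p) of a discrete distribution;
        # terms with p = 0 contribute nothing, by the usual convention.
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
    print(shannon_entropy([0.9, 0.1]))   # biased coin: ~0.469 bits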
     
  2. Jan 13, 2005 #2
    mutual information

    What I'm really getting at is the so-called "mutual information" that one set A has about another set B.

    This is defined in information theory if A and B are random variables.

    I want it if they are general sets.

    I had a 'thought.' Maybe I can look at the smallest sigma-algebra G containing A and B (I don't mean their intersection) and invent a nontrivial probability measure on G. This turns A and B into events. The formula I've seen for mutual information is:
    I(A;B) = log_2( P(A ∩ B) / (P(A) P(B)) ).

    But what would be a nontrivial probability measure, so that P(A), P(B) ∈ [0,1] and the whole space has measure 1? Is there some canonical nontrivial P that I can construct? How would I do this?
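    One candidate, assuming A and B are subsets of some finite universe U, is the uniform counting measure P(S) = |S| / |U|: every subset of U becomes an event and the whole space gets measure 1. A sketch under that assumption (U, A, B below are purely illustrative):

    import math

    def pmi(A, B, U):
        # Pointwise mutual information I(A;B) = log2( P(A ∩ B) / (P(A) P(B)) )
        # under the uniform counting measure P(S) = |S| / |U| on a finite universe U.
        pA = len(A) / len(U)
        pB = len(B) / len(U)
        pAB = len(A & B) / len(U)
        if pAB == 0:
            return float('-inf')  # disjoint sets: the ratio degenerates
        return math.log2(pAB / (pA * pB))

    U = set(range(12))        # illustrative finite universe
    A = {0, 1, 2, 3, 4, 5}    # P(A) = 1/2
    B = {3, 4, 5, 6, 7, 8}    # P(B) = 1/2
    print(pmi(A, B, U))       # log2((3/12) / (1/4)) = 0.0, i.e. "independent" under P

    With this choice, I(A;B) = 0 exactly when |A ∩ B| · |U| = |A| · |B|, so the measure is nontrivial, but the answer depends entirely on the choice of U.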
     