The entropy of a set

1. Jan 11, 2005

phoenixthoth

What is the entropy of a set?

One of these two should be a general guideline:

1. A measure of the disorder or randomness in a closed system.
2. A measure of the loss of information in a transmitted message (made precise by the sketch below).
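
The second sense is the one information theory makes precise: for a discrete probability distribution, H = -Σ p_i log_2(p_i). A minimal Python sketch of that formula (the coin distributions are just illustrative):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete
    probability distribution, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin carries less entropy.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```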

I've seen topological entropy (Bowen) and the entropy of random variables, but what about the entropy of sets?

2. Jan 13, 2005

phoenixthoth

mutual information

What I'm really getting at is the so-called "mutual information" that one set A has about another set B.

This is defined in information theory if A and B are random variables.
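
For reference, the standard definition for discrete random variables X and Y is

I(X;Y) = Σ_{x,y} p(x,y) log_2( p(x,y) / (p(x) p(y)) ),

and the per-pair term log_2( p(x,y) / (p(x) p(y)) ) is the "pointwise mutual information" of the two events {X=x} and {Y=y}, which is exactly the shape of the formula I quote below.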

I want it if they are general sets.

I had a 'thought.' Maybe I can look at the smallest sigma-algebra G containing A and B (I don't mean their intersection) and invent a nontrivial probability measure on G. This turns A and B into events. The formula I've seen for mutual information is then:
I(A;B) = log_2( P(A ∩ B) / (P(A) P(B)) ).

But what would be a nontrivial probability measure, so that P(A), P(B) ∈ [0,1]? Also, the whole space (the largest set in G, namely A ∪ B) must have measure 1. Is there some canonical nontrivial P that I can construct? How would I do this?
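
As for a concrete (if not canonical) choice: for finite sets you can take Ω = A ∪ B, or any finite superset of it, as the sample space and use the normalized counting measure P(S) = |S| / |Ω|. Here is a minimal Python sketch of that idea; this measure is just one possibility I'm assuming, not a standard construction, and the output shows why no single choice is canonical:

```python
from math import log2

def set_mutual_information(A, B, universe=None):
    """I(A;B) = log2( P(A ∩ B) / (P(A) P(B)) ) under the normalized
    counting measure P(S) = |S| / |universe|.

    'universe' is the sample space; it defaults to A ∪ B. Requires
    A and B to overlap, otherwise P(A ∩ B) = 0 and the log diverges.
    """
    A, B = set(A), set(B)
    omega = set(universe) if universe is not None else A | B
    assert A <= omega and B <= omega, "universe must contain A and B"
    p = lambda S: len(S) / len(omega)   # normalized counting measure
    return log2(p(A & B) / (p(A) * p(B)))

# With omega = A ∪ B the result is never positive: P(A ∪ B) = 1 forces
# P(A ∩ B) = P(A) + P(B) - 1 <= P(A) P(B).
print(set_mutual_information({1, 2, 3}, {2, 3, 4}))                      # ~ -0.17
# The same sets inside a larger universe give a positive value, which
# is why this choice of P is not canonical.
print(set_mutual_information({1, 2, 3}, {2, 3, 4}, universe=range(10)))  # ~ +1.15
```

So the machinery works, but the number you get is an artifact of the measure (and the universe) imposed on the sets, not of the bare sets themselves.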