SW VandeCarr said:
You haven't lost me, because I was already lost regarding this concept. I've never heard of the idea of the entropy of a sum or product of numbers, natural or otherwise. The simplest and clearest way to think of entropy, IMO (see my post 3), is as a function of the number of states in which a system can exist, the function being the logarithm of the probability of a state in the standard Boltzmann formulation, as followed by Shannon and Gibbs. A number or a series of numbers (assume natural numbers) doesn't have entropy unless you think of it as a system which can exist in several states (including, trivially, one state, where H = 0). So we can say the permutations of some series of distinct symbols (they don't need to be numbers) represent states of that series and, assuming a distribution, assign a value to the system's entropy. A uniform distribution is convenient, since the equation reduces to simply H = -log p(x) when you sum over the states.
Now, when you deal with sums and products, you are dealing specifically with numbers. The only way I can think of the entropy of a sum or product is in terms of the number of possible combinations which yield the sum or product. For addition this could be the number of partitions of the (natural) number sum, or simply the number of compositions of the sum. The latter is easier because it can take repeated components; for example, for 3: 3+0; 2+1; 1+1+1; so three states.
I've already shown how I would handle the product (post 3). It's not at all clear to me how, if you simply take the log of AB, that's somehow the entropy of AB. Where does this idea come from, and what's the reasoning behind it?
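Just to make sure I follow your counting approach, here is a rough Python sketch of it as I understand it, assuming (as you do) a uniform distribution over the states, so that H reduces to the log of the number of states. I've enumerated the unordered decompositions (partitions), which gives the three states you list for 3; compositions, which count the order of the components separately, would give 2^(n-1) states instead.

from math import log2

def partitions(n, max_part=None):
    # Yield the partitions of n as tuples of non-increasing parts.
    if max_part is None:
        max_part = n
    if n == 0:
        yield ()
        return
    for first in range(min(n, max_part), 0, -1):
        for rest in partitions(n - first, first):
            yield (first,) + rest

n = 3
states = list(partitions(n))              # [(3,), (2, 1), (1, 1, 1)]
print(states)
# Uniform distribution over the states => H = log2(number of states)
print("H =", log2(len(states)), "bits")   # log2(3) ~ 1.585 bits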
Oops, I thought I had you going along with my proposed entropy of A + B... so let's drop the discussion of A times B for now and concentrate on the sum.
(I use S for entropy)
Here is the proposed method for sums:
I.e., if T = A + B,
then S = -(A/T)*log(A/T) - (B/T)*log(B/T).
Note that it may be more correct to call S the entropy of a mixture, and that if A = B, then S = 1 bit (using a base-2 log). I.e., any 50:50 mixture has an entropy of 1 bit; any other mixture results in S of less than one bit. If either A or B is zero, then S = 0. So, if two donuts taste exactly the same, then I can assume that they were made from dough with the same entropy... a correlation of their entropies via taste.
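Here's a quick numerical sketch of that formula in Python (just an illustration, using base-2 logs and treating 0*log(0) as 0), checking the properties claimed above:

from math import log2

def mixture_entropy(a, b):
    # S = -(A/T)*log2(A/T) - (B/T)*log2(B/T), with T = A + B
    t = a + b
    s = 0.0
    for x in (a, b):
        p = x / t
        if p > 0:                  # treat 0*log(0) as 0
            s -= p * log2(p)
    return s

print(mixture_entropy(5, 5))   # 1.0 bit: any 50:50 mixture
print(mixture_entropy(3, 1))   # ~0.811 bits: any other mixture gives less than 1 bit
print(mixture_entropy(4, 0))   # 0.0: S = 0 if either A or B is zero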
Has my intuition finally gotten the best of me?
Also note:
"The absense of proof is not proof of absense."
(all I've got going for me)