Entropy of a product of positive numbers

AI Thread Summary
The discussion revolves around calculating the entropy of the product of two positive numbers, A and B, using Shannon's entropy principles. Participants explore the relationship between the entropy of sums and products, noting that while the entropy of a sum can be expressed as a combination of contributions from each term, the entropy of a product is less straightforward. The conversation highlights the need for a finite state space to define entropy meaningfully and suggests that the entropy of a product could be related to the number of combinations yielding that product. There is also a debate on whether the entropy of a product can be equated to the logarithm of the product itself, with some participants expressing skepticism about this approach. Overall, the thread reflects a deep inquiry into the nature of entropy in mathematical contexts, particularly regarding sums and products of numbers.
ClamShell
I accept that the entropy (Shannon's) of a sum of positive
numbers is the sum of -P_n * LOG(P_n) over the numbers
in the sum, where P_n is the fractional contribution
of each term. I.e., if T = A + B,

then S = -(A/T)*LOG(A/T) - (B/T)*LOG(B/T).

But what about the entropy of A*B?

Please don't respond with "why do it?", just give me some idea
of how you might attempt to do it.
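
A quick worked example of that formula (numbers mine, for illustration): if A = 1 and B = 7, then T = 8 and

S = -(1/8)*LOG2(1/8) - (7/8)*LOG2(7/8) = 0.375 + 0.169 ≈ 0.544 bits,

with the maximum S = 1 bit occurring when A = B.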
 
I don't know why, but you can define it as the entropy of log(AB).
 
ClamShell said:
I accept that the entropy (Shannon's) of a sum of positive
numbers is the sum of -P_n * LOG(P_n) over the numbers
in the sum, where P_n is the fractional contribution
of each term. I.e., if T = A + B,

then S = -(A/T)*LOG(A/T) - (B/T)*LOG(B/T).

But what about the entropy of A*B?

Please don't respond with "why do it?", just give me some idea
of how you might attempt to do it.

By definition entropy is a function of the number of states in which a system can exist. If you apply this concept to a sum or product, it's only feasible to calculate it if you restrict the number of states to a finite value (assuming every combination has the same probability). Then the product or sum must be compositions of natural numbers. Given this, the entropy of the product AB is a function of the number of all combinations XY for which AB is the specific product. Since multiplication is commutative, XY and YX count as the same state.

If you have chosen some continuous probability distribution for the compositions XY, then you can integrate over some finite interval [a, b] covering the real-number combinations for the product AB. The entropy value is expressed in terms of log(p); typically log base 2 or base e is chosen:

h[f] = -\int^{b}_{a} f(x) \log f(x)\, dx

where f is the pdf.

I'm pretty sure a double integral is not needed, since the constraint XY = AB fixes one variable once the other is chosen, so X and Y are not independent.
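
As a concrete check of that integral (a minimal sketch, mine rather than the poster's, assuming numpy and scipy are available): for a uniform pdf on [a, b] the differential entropy reduces to log2(b - a), which direct numerical integration reproduces.

```python
import numpy as np
from scipy.integrate import quad

def differential_entropy(f, a, b):
    """Numerically evaluate h[f] = -int_a^b f(x) log2 f(x) dx."""
    integrand = lambda x: -f(x) * np.log2(f(x)) if f(x) > 0 else 0.0
    value, _ = quad(integrand, a, b)
    return value

# Uniform pdf on [a, b]; its differential entropy should equal log2(b - a).
a, b = 1.0, 5.0
uniform_pdf = lambda x: 1.0 / (b - a)
print(differential_entropy(uniform_pdf, a, b))  # ~2.0 bits
print(np.log2(b - a))                           # 2.0
```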
 
JSuarez said:
I don't know why, but you can define it as the entropy of log(AB).

Yes, of course, and log(AB) = log(A) + log(B)

And, 1 = log(A)/log(AB) + log(B)/log(AB)

defined as, 1 = P_A + P_B (proportions of log(AB) or probabilities)

gives, S(AB) = -(P_A)ln(P_A) - (P_B)ln(P_B) (napiers)

Can define, but cannot prove to my satisfaction.
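
For what it's worth, here is a minimal sketch of this proposed definition (code and example values mine, not from the thread), just to see what numbers it produces for A, B > 1:

```python
import math

def product_entropy(A, B):
    """Proposed 'entropy' of A*B: treat log(A)/log(AB) and log(B)/log(AB)
    as probabilities and apply the Shannon formula (base 2). Needs A, B > 1."""
    pa = math.log(A) / math.log(A * B)
    pb = math.log(B) / math.log(A * B)
    return -(pa * math.log2(pa) + pb * math.log2(pb))

print(product_entropy(4, 4))    # 1.0 bit: equal factors give a 50:50 split
print(product_entropy(2, 100))  # ~0.56 bits: unequal factors give less than 1 bit
```

It mirrors the sum case: 1 bit whenever A = B, less otherwise.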
 
I guess I don't understand why you say the entropy of a product of two positive whole numbers is the sum of their logarithms. The product AB=100 can be written in just 5 ways: 1*100; 2*50; 4*25; 5*20; 10*10. Assuming they are equally probable, the Shannon information for any outcome is -log_{2}(1/5) = 2.322 bits.

Are you saying that the entropy of AB=100 is just its base 2 log = 6.6439 bits? Why so?
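
For concreteness, a minimal sketch of this counting argument (code mine, not from the thread): enumerate the unordered factor pairs of N and take log2 of the count.

```python
import math

def factor_pairs(n):
    """Unordered pairs (x, y) of positive integers with x * y == n and x <= y."""
    return [(x, n // x) for x in range(1, math.isqrt(n) + 1) if n % x == 0]

pairs = factor_pairs(100)
print(pairs)                  # [(1, 100), (2, 50), (4, 25), (5, 20), (10, 10)]
print(math.log2(len(pairs)))  # ~2.322 bits, i.e. -log2(1/5)
```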
 
SW VandeCarr said:
I guess I don't understand why you say the entropy of a product of two positive whole numbers is the sum of their logarithms. The product AB=100 can be written in just 5 ways: 1*100; 2*50; 4*25; 5*20; 10*10. Assuming they are equally probable, the Shannon information for any outcome is -log_{2}(1/5) = 2.322 bits.

Are you saying that the entropy of AB=100 is just its base 2 log = 6.6439 bits? Why so?

This problem might ultimately lead to a perspective change for
defining entropy...but I hope not.

On one hand, S = log(# of combinations),

on the other, S = -sum (over n) of P_n times log(P_n) (of contributions or probabilities)

And it is easy to reconcile the two if the probability distribution is uniform.
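
For the record, the reconciliation under a uniform distribution is the standard one-line computation (not specific to this thread): with N equally likely combinations, P_n = 1/N for every n, so

S = -sum (over n) of (1/N)*log(1/N) = N*(1/N)*log(N) = log(N) = log(# of combinations).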

But if we think of the probabilities as contributions to the answer, we seem
to depart from the Boltzmann definition for entropy (perhaps not).

Any departure from a uniform probability distribution means that
S will be smaller and therefore contain more information.

If I haven't lost you, let's limit the problem to discrete probability
distributions over integers...not the integers themselves.
Hope to hear from you SW.
 
ClamShell said:
If I haven't lost you, let's limit the problem to discrete probability
distributions over integers...not the integers themselves.
Hope to hear from you SW.

You haven't lost me because I was already lost regarding this concept. I've never heard of the idea of the entropy of a sum or product of numbers, natural or otherwise. The simplest and clearest way to think of entropy IMO (see my post 3) is as a function of the number of states in which a system can exist, the function being the logarithm of the probability of a state in the standard Boltzmann formulation, followed by Shannon and Gibbs.

A number or a series of numbers (assume natural numbers) doesn't have entropy unless you think of it as a system which can exist in several states (including trivially one state where H=0). So we can say the permutations of some series of distinct symbols (they don't need to be numbers) represent states of that series and, assuming a distribution, assign a value to the system's entropy. A uniform distribution is convenient since the equation reduces to simply H = -log p(x) when you sum over the states.

Now when you deal with sums and products you are dealing specifically with numbers. The only way I can think of the entropy of a sum or product is in terms of the number of possible combinations which yield the sum or product. For addition this could be the number of partitions of the (natural) number sum, such as for 3: 3; 2+1; 1+1+1; so three states.
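
A minimal sketch of this state-counting view of a sum (code mine; it enumerates partitions, where the order of the parts does not matter):

```python
import math

def partitions(n, max_part=None):
    """All partitions of n into positive parts, listed largest part first."""
    if max_part is None:
        max_part = n
    if n == 0:
        return [[]]
    result = []
    for part in range(min(n, max_part), 0, -1):
        for rest in partitions(n - part, part):
            result.append([part] + rest)
    return result

states = partitions(3)
print(states)                  # [[3], [2, 1], [1, 1, 1]] -> three states
print(math.log2(len(states)))  # ~1.585 bits under a uniform distribution
```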

I've already shown how I would handle the product (post 3). It's not at all clear to me how if you simply take the log of AB, that's somehow the entropy of AB. Where does this idea come from and what's the reasoning behind it?

EDIT: Rereading your post, it seems you are talking about summing over the terms p(x_i)log(p(x_i)) of the Shannon equation, so I misunderstood you regarding addition. However, I still don't quite follow the product idea. You multiply probabilities in computing likelihoods, and you can talk about the contribution of different probabilities to the "entropy" of the likelihood function. However, I don't think that's what you're asking about. I certainly do not see the connection with the idea that log(A*B) is the "entropy" of AB where A and B are natural numbers.
 
SW VandeCarr said:
You haven't lost me because I was already lost regarding this concept. I've never heard of the idea of the entropy of a sum or product of numbers, natural or otherwise. The simplest and clearest way to think of entropy IMO (see my post 3) is as a function of the number of states in which a system can exist, the function being the logarithm of the probability of a state in the standard Boltzmann formulation, followed by Shannon and Gibbs. A number or a series of numbers (assume natural numbers) doesn't have entropy unless you think of it as a system which can exist in several states (including trivially one state where H=0). So we can say the permutations of some series of distinct symbols (they don't need to be numbers) represent states of that series and, assuming a distribution, assign a value to the system's entropy. A uniform distribution is convenient since the equation reduces to simply H = -log p(x) when you sum over the states.

Now when you deal with sums and products you are dealing specifically with numbers. The only way I can think of the entropy of a sum or product is in terms of the number of possible combinations which yield the sum or product. For addition this could be the number of partitions of the (natural) number sum (or the number of compositions, which would also distinguish different orderings of the parts). The partitions of 3, for example, are 3; 2+1; 1+1+1; so three states.

I've already shown how I would handle the product (post 3). It's not at all clear to me how if you simply take the log of AB, that's somehow the entropy of AB. Where does this idea come from and what's the reasoning behind it?

Oops, I thought I had you going along with my proposed entropy of A + B...so
let's drop the discussion of A times B, for now and concentrate on the sum.
(I use S for entropy)

Here is the proposed method for sums:

IE, if T = A + B,

then S = -(A/T)*LOG(A/T) - (B/T)*LOG(B/T).

Note that it may be more correct to call S the entropy of
a mixture. And if A = B, then S = 1 bit (using a base 2 log),
i.e., any 50:50 mixture has entropy equal to 1 bit. Any other
mixture results in S less than one bit. If either A or B is zero,
S = 0. So, if two donuts taste exactly the same, then I can
assume that they were made from dough with the same
entropy...a correlation of their entropies via taste.
Has my intuition finally gotten the best of me?

Also note:
"The absense of proof is not proof of absense."
(all I got going for me)
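
A quick numerical check of the mixture claims above (code mine, not part of the post):

```python
import math

def mixture_entropy(A, B):
    """S = -(A/T)*log2(A/T) - (B/T)*log2(B/T) with T = A + B.
    A zero component contributes nothing, by the convention 0*log(0) = 0."""
    T = A + B
    S = 0.0
    for p in (A / T, B / T):
        if p > 0:
            S -= p * math.log2(p)
    return S

print(mixture_entropy(5, 5))  # 1.0 bit for any 50:50 mixture
print(mixture_entropy(1, 3))  # ~0.811 bits, less than 1
print(mixture_entropy(0, 7))  # 0.0 when either component is zero
```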
 
ClamShell said:
Oops, I thought I had you going along with my proposed entropy of A + B...so
let's drop the discussion of A times B, for now and concentrate on the sum.

I edited my previous post. I have no problems with summing the terms of the Shannon equation. I scanned over that and got stuck on the product, then forgot what you said about addition.

With addition, it seems you are simply talking about the equilibrium of entropy in mixtures. The entropy contribution of each component of the mixture depends, in physical terms, not only on the molar proportions but also on the specific heat of each component. If the components differ only in temperature and molar proportions, and the temperatures are not too far apart, then your simple approach is essentially correct. At high temperatures and densities, new statistical models come into play (also at very low temperatures).

So tell me about why log(A*B) is the entropy of A*B.
 
  • #10
Sorry to have agreed with JSuarez. I only agreed with
him/her because it's the right place to start. Here is
my reply. It's the same logic as the sum, but
the sum is of logs, and dividing by log(AB) (a base change)
produces a probability-like sum equal to 1. I really
appreciate your more rigorous approach to the
subject. Note that the logs are because of the
product...the ln's are for the entropy (although
I prefer base 2).


"Yes, of course, and log(AB) = log(A) + log(B)

And, 1 = log(A)/log(AB) + log(B)/log(AB)

defined as, 1 = P_A + P_B (proportions of log(AB) or probabilities)

gives, S(AB) = -(P_A)ln(P_A) - (P_B)ln(P_B) (napiers)

Can define, but cannot prove to my satisfaction."
 
  • #11
ClamShell said:
Sorry to have agreed with JSuarez.

Don't be. JSuarez knows what he's talking about. I was under the impression you were talking about the entropy of the product of positive numbers as pure numbers. It seems you are talking about A and B as the states of a single two state system.
And, 1 = log(A)/log(AB) + log(B)/log(AB)

What's this? Why are you dividing logs?

defined as, 1 = P_A + P_B (proportions of log(AB) or probabilities)

OK

gives, S(AB) = -(P_A)ln(P_A) - (P_B)ln(P_B) (napiers)

What's this? For Shannon entropy you don't subtract, you add the two states. No need to show this, however, when the distribution is uniform: H = -log_b(p(x))

Since A and B are mutually exclusive states, under a uniform distribution (like heads and tails), the Shannon entropy value of the system AB is one bit.

H(AB) = -log_2(1/2) = 1 bit.
 
  • #12
Hi SW,

I think A & B are pure numbers, not states of a 2-state system.
If A = e^a & B = e^b, then the expression for the
probabilities is:

P_A + P_B = 1 = a/(a+b) + b/(a+b), and we return to the

entropy of sums which you agree (?) with.

But with the requirement that A & B are not only positive
but also greater than 1.

The problem I am fooling around with at the moment is this:

A = (1+x) & B = 1/(1-x), so AB = (1+x)/(1-x)

To what purpose?...For fooling around and looking for patterns.
May turn out to be a blind alley...I just want to borrow your
torch. My goal is to apply this method to infinite products
with some degree of confidence. Infinite product identities
are what got me started with this SUM of -(P_n)log(P_n) stuff.
I could show you some infinite product identities that would
blow your mind (mine's still in recovery)...but that's another
thread.
 
  • #13
Dear SW VandeCarr,
Did you mean to finish this thread or
will you appear later?
Anxiously waiting by the phone...
 