Is Shannon Entropy subjective?

1. Jan 24, 2016

T S Bailey

If a system has multiple possible states, its Shannon entropy depends on the probabilities of the outcomes: it is largest when they are equally likely and smaller when some outcomes are more predictable. A predictable outcome isn't very informative, after all. But this seems to rely on the predictive ability of whoever is making the observation/measurement, which suggests that the amount of Shannon entropy of a system depends on who you ask.
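To make the point concrete, here is a minimal sketch (the function name and the example distributions are my own illustration, not from the thread) computing Shannon entropy in bits for a few discrete distributions:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable: 1 bit per toss.
print(shannon_entropy([0.5, 0.5]))    # 1.0
# A heavily biased coin is largely predictable, so less informative.
print(shannon_entropy([0.99, 0.01]))  # ~0.081
# A certain outcome carries no information at all.
print(shannon_entropy([1.0]))         # 0.0
```

Note the entropy is a property of the probability distribution you assign, which is exactly where the "who you ask" question enters: two observers with different information may assign different distributions to the same system.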

Last edited: Jan 24, 2016
2. Jan 24, 2016

Staff: Mentor

Especially if you ask a Bayesian or a frequentist.

3. Jan 25, 2016

chiro

The ordinary Shannon entropy of a discrete-state system gives you an idea of the information density, or content, of that system.

The more random a system is, the more information is needed to describe it. Conversely, if you obtain information about a high-entropy system, that information reduces the remaining uncertainty (entropy) of the system itself.

Also realize that entropy, like anything else in mathematics (and in language in particular), has a basis.

You can define all sorts of entropies corresponding to the distribution you are looking at, and that includes all possible conditional distributions.

The entropy measures can be defined for any such distribution (as long as it has a finite sample space, for the proper discrete entropies), and they satisfy a number of algebraic constraints that are consistent both mathematically and probabilistically.
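One of those algebraic constraints is the chain rule, H(X, Y) = H(X) + H(Y|X). A small sketch checking it numerically on a made-up joint distribution (the numbers are arbitrary, chosen only for illustration):

```python
import math
from collections import Counter

def H(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Made-up joint distribution p(x, y) over a 2x2 sample space.
joint = {('a', 0): 0.4, ('a', 1): 0.1, ('b', 0): 0.2, ('b', 1): 0.3}

# Marginal p(x).
px = Counter()
for (x, _), p in joint.items():
    px[x] += p

# Conditional entropy H(Y|X) = sum_x p(x) * H(Y | X=x).
H_joint = H(joint.values())
H_x = H(px.values())
H_y_given_x = sum(
    px[x] * H([joint[(x, y)] / px[x] for y in (0, 1)])
    for x in px
)

# Chain rule: H(X, Y) = H(X) + H(Y|X), up to floating-point error.
print(H_joint, H_x + H_y_given_x)
```

Any joint distribution with a finite sample space satisfies this identity, which is the kind of probabilistically consistent algebraic structure referred to above.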