Is Shannon Entropy Dependent on Perspective?

  • Context: Graduate
  • Thread starter: T S Bailey
  • Tags: Entropy, Shannon entropy

SUMMARY

The discussion centers on the concept of Shannon entropy and its dependence on the observer's perspective, particularly contrasting Bayesian and frequentist viewpoints. It establishes that the Shannon entropy of a system varies with the probability distribution assigned to its outcomes, with higher entropy indicating greater uncertainty and information density. The conversation emphasizes that entropy measures are determined entirely by the underlying probability distribution and adhere to strict algebraic and probabilistic constraints, underscoring the mathematical foundation of entropy in information theory.

PREREQUISITES
  • Understanding of Shannon entropy and its mathematical formulation
  • Familiarity with Bayesian and frequentist statistical paradigms
  • Knowledge of probability distributions and their properties
  • Basic concepts of information theory and uncertainty measurement
NEXT STEPS
  • Explore the mathematical derivation of Shannon entropy in discrete state systems
  • Study the differences between Bayesian and frequentist interpretations of probability
  • Investigate conditional entropy and its applications in information theory
  • Learn about distribution invariance in entropy measures and its implications
USEFUL FOR

Statisticians, data scientists, information theorists, and anyone interested in the foundational concepts of entropy and its applications in various statistical frameworks.

T S Bailey
If a system has multiple possible states, then its Shannon entropy depends on the probability distribution over the outcomes; it is highest when all outcomes are equally likely. A predictable outcome isn't very informative, after all. But this seems to rely on the predictive ability of whoever makes the observation/measurement, which suggests that the Shannon entropy of a system depends on who you ask.
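
A minimal sketch of that dependence, using hypothetical distributions that two observers might assign to the same three-state system (the function name and the numbers are illustrative, not from the thread):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical distributions over the same three-state system.
uniform = [1/3, 1/3, 1/3]      # an observer with no predictive ability
peaked = [0.98, 0.01, 0.01]    # an observer who can predict the outcome well

print(shannon_entropy(uniform))  # ~1.585 bits (log2(3), the maximum for three states)
print(shannon_entropy(peaked))   # ~0.16 bits (a predictable outcome carries little information)
```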
 
Especially if you ask a Bayesian or a frequentist.
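
To make the quip concrete: a Bayesian and a frequentist can assign different distributions to the same system, and each assignment yields a different entropy. A hypothetical sketch (the specific numbers are assumptions chosen for illustration):

```python
import math

def H(probs):
    # Shannon entropy in bits
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The same coin, two perspectives (numbers are hypothetical).
bayesian_prior = [0.5, 0.5]   # indifference prior: 1 bit of uncertainty
observed_freqs = [0.9, 0.1]   # frequencies estimated from a sample of a biased coin

print(H(bayesian_prior))   # 1.0 bit
print(H(observed_freqs))   # ~0.47 bits
```

Same coin, different distributions, different entropies; that is the sense in which the answer depends on who you ask.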
 
The ordinary Shannon entropy of a discrete-state system gives you an idea of the information density, or content, of that system.

The more random a system is, the more information is needed to describe it. Conversely, for a system with a given amount of entropy, acquiring information about it decreases the remaining uncertainty about the system itself.
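
As an illustration of entropy as description length (a standard information-theory identity; the distributions below are chosen for the example): a uniform distribution over 8 states needs a full 3-bit label per outcome, while a heavily skewed one needs far fewer bits on average:

```python
import math

def H(probs):
    # Shannon entropy in bits = minimum average description length per outcome
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(H([1/8] * 8))            # 3.0 bits: log2(8), a full 3-bit label per outcome
print(H([0.93] + [0.01] * 7))  # ~0.56 bits: mostly predictable, so short descriptions on average
```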

Also realize that entropy, like anything else in mathematics (and in language in particular), has a basis: it is always defined relative to some distribution.

You can define all sorts of entropies corresponding to the distribution you are looking at, and that includes all possible conditional distributions.

These entropy measures are determined entirely by the distribution itself, independent of how the outcomes are labeled (as long as the sample space is finite, for the proper entropies), and they satisfy a number of algebraic constraints that are consistent both mathematically and probabilistically.
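
A sketch of one such conditional entropy, using a hypothetical joint distribution over two binary variables (all numbers are assumptions for the example); note the standard inequality H(Y|X) ≤ H(Y), i.e., conditioning on information never increases entropy:

```python
import math

def H(probs):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint distribution p(x, y) over two binary variables.
joint = {(0, 0): 0.4, (0, 1): 0.1,
         (1, 0): 0.1, (1, 1): 0.4}

# Marginals p(x) and p(y).
p_x = {v: sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)}
p_y = {v: sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)}

# Conditional entropy H(Y|X) = sum over x of p(x) * H(Y | X = x).
h_y_given_x = sum(
    p_x[v] * H([joint[(v, y)] / p_x[v] for y in (0, 1)])
    for v in (0, 1)
)

print(H(p_y.values()))  # H(Y) = 1.0 bit (the marginal of Y is 50/50)
print(h_y_given_x)      # H(Y|X) ~ 0.72 bits: knowing X reduces the uncertainty about Y
```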
 