Uncertainty: balance of two aspects?

SUMMARY

This discussion centers on two aspects of uncertainty: how far a distribution is from equiprobability, and how dispersed its values are. It distinguishes between Shannon entropy, which depends only on the probabilities of the outcomes, and variance, which also accounts for the magnitudes of the payoffs. The conversation notes that while entropy is the more natural tool in information geometry, variance carries information about convergence rates and is favored in classical probability because of its relationship with the central limit theorem. The participants emphasize the importance of understanding both notions when evaluating the uncertainty of simple random variables.

PREREQUISITES
  • Understanding of Shannon entropy and its applications
  • Familiarity with variance and standard deviation in probability theory
  • Knowledge of convergence concepts in statistics, particularly the central limit theorem
  • Basic grasp of information geometry and its relevance to uncertainty
NEXT STEPS
  • Explore the implications of KL divergence in random walk problems
  • Research Stein's method and its application to convergence rates
  • Study Schur convexity and Schur concavity as properties of functions on probability vectors
  • Examine the role of entropy in measuring unpredictability in various contexts
USEFUL FOR

Statisticians, data scientists, and researchers interested in the mathematical foundations of uncertainty and its applications in probability theory and information geometry.

nomadreid
There are two aspects of uncertainty:
(a) how far the distribution is from the situation where all possibilities are equally probable;
(b) how spread out the values are.
In discussions about (Shannon) entropy and information, the first aspect is emphasized, whereas in discussions about the standard distribution, the second one is emphasized. Entropy and standard distribution are not the same, however. Which one then gives a better picture of "uncertainty"?
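
To make the two notions concrete, here is a minimal Python sketch computing both quantities for a simple three-valued random variable; the values and probabilities are purely illustrative.

Code:
# Minimal sketch (illustrative numbers only): entropy depends only on the
# probabilities, while the standard deviation also depends on the values.
import math

values = [0.0, 1.0, 10.0]   # outcomes of a simple random variable
probs  = [0.5, 0.4, 0.1]    # their probabilities

# Shannon entropy in bits -- aspect (a): distance from equiprobability
entropy = -sum(p * math.log2(p) for p in probs if p > 0)

# Standard deviation -- aspect (b): spread of the values
mean = sum(p * x for p, x in zip(probs, values))
std = math.sqrt(sum(p * (x - mean) ** 2 for p, x in zip(probs, values)))

print(f"entropy = {entropy:.3f} bits, std deviation = {std:.3f}")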
 
StoneTemplePython
I presume we're dealing with simple random variables here (i.e. ones with finitely many outcomes). I'm not sure what the "standard distribution" is, though. There are lots of ways to measure dispersion -- the typical one is the standard deviation / variance, so I'll assume that's what's being used here.

A few things to think about:
- Entropy does not consider the magnitudes of the payoffs involved -- but variance does. For entropy, if you think about this in terms of the classic binary guessing formulation, it should be clear. If you are worried about how much your fortune will swing after repeatedly making bets, then you likely are (or should be) more interested in the variance. (The sketch at the end of this post makes this contrast concrete.)
- In terms of convergence information: variance (and really the third moment, if using Stein's method) gives you some information on the rate of convergence to a Gaussian for sums of i.i.d. random variables. Entropy does not, but it does give you (in the form of KL divergence) information on the type/composition you're likely to see in certain random walk / threshold-crossing problems (Sanov's theorem).
- Variance and entropy are in some sense "opposites": variance is a Schur convex function, while entropy is Schur concave. But again, variance mixes probabilities and payoffs, whereas entropy ignores the magnitudes of the payoffs and looks only at the probabilities.
- Classical probability emphasizes variance, for reasons related primarily to (a) the ease of working with moments and (b) the central limit theorem. You'll see a lot more about entropy in information geometry, though that is something of an esoteric topic.

So one immediate question is whether or not your 'uncertainty' is sensitive to the magnitude of deviation from 'typical' outcomes.
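
A small Python sketch of the first and third bullet points above (the payoffs and probability vectors are invented for illustration): rescaling the payoffs changes the standard deviation but leaves the entropy untouched, and concentrating probability mass (majorization) can only lower the entropy.

Code:
import math

def entropy_bits(probs):
    # Shannon entropy in bits; ignores the payoff values entirely
    return -sum(p * math.log2(p) for p in probs if p > 0)

def std_dev(probs, values):
    # Standard deviation; mixes probabilities and payoff magnitudes
    mean = sum(p * x for p, x in zip(probs, values))
    return math.sqrt(sum(p * (x - mean) ** 2 for p, x in zip(probs, values)))

probs  = [0.5, 0.4, 0.1]
values = [0.0, 1.0, 10.0]
scaled = [100 * x for x in values]  # multiply every payoff by 100

# Entropy is unchanged by the rescaling; the std deviation grows 100-fold.
print(entropy_bits(probs), std_dev(probs, values))
print(entropy_bits(probs), std_dev(probs, scaled))

# Schur concavity of entropy: p majorizes q (its sorted partial sums
# dominate: 0.7 >= 0.4 and 0.9 >= 0.75), so H(p) <= H(q).
p = [0.7, 0.2, 0.1]
q = [0.4, 0.35, 0.25]
print(entropy_bits(p), entropy_bits(q))  # approx 1.157 < 1.559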
 
nomadreid
Thanks, StoneTemplePython. Yes, I am dealing with simple random variables. Sorry, that "standard distribution" was a typo; I meant "standard deviation". Thanks for the list of criteria; your parting sentence is a good lens to look at the individual cases through. I am interested in uncertainty as unpredictability; although the uncertainty principles, for example, are given in terms of standard deviations, the case for situations in which entropy is advantageous is made on page 28 of https://authors.library.caltech.edu/66493/2/chap10_15.pdf :
"Furthermore, the variance does not characterize very well the unpredictability of the measurement outcomes; entropy would be a more informative measure."
 
