Uncertainty: balance of two aspects?

In summary, uncertainty has two aspects: (a) how far the distribution departs from one in which all possibilities are equally probable, and (b) how spread out the values are. Discussions of Shannon entropy and information emphasize the first aspect, while discussions of standard deviation focus on the second. Entropy and standard deviation are not the same, however, and have different uses in measuring uncertainty. Variance takes the magnitudes of the payoffs into account, whereas entropy depends only on the probabilities. Classical probability emphasizes variance, while information geometry puts more emphasis on entropy. The choice between variance and entropy as a measure of uncertainty depends on whether sensitivity to the size of deviations from typical outcomes matters or whether unpredictability is the main concern.
  • #1
nomadreid
There are two aspects of uncertainty:
(a) how far the distribution differs from the situation where all possibilities are equally probable, and
(b) how spread out the values are.
In discussions about (Shannon) entropy and information, the first aspect is emphasized, whereas in discussions about the standard distribution, the second one is emphasized. Entropy and standard distribution are not the same, however. Which one, then, gives a better picture of "uncertainty"?
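To make the two aspects concrete, here is a minimal sketch (an editorial illustration in Python/NumPy, not part of the original post): over the same payoff values, Shannon entropy tracks how close the probabilities are to uniform (aspect (a)), while the standard deviation tracks how spread out the payoffs are around their mean (aspect (b)).

```python
# Minimal sketch (editorial illustration, not from the original post):
# for fixed payoff values, entropy responds to how close the probabilities
# are to uniform, while the standard deviation responds to the spread of
# the payoffs around their mean.
import numpy as np

def shannon_entropy(probs):
    """Shannon entropy in bits; maximal when all outcomes are equally likely."""
    probs = np.asarray(probs, dtype=float)
    probs = probs[probs > 0]            # drop zero-probability outcomes
    return -np.sum(probs * np.log2(probs))

def std_dev(values, probs):
    """Standard deviation of a simple (finite-outcome) random variable."""
    values = np.asarray(values, dtype=float)
    probs = np.asarray(probs, dtype=float)
    mean = np.sum(values * probs)
    return np.sqrt(np.sum(probs * (values - mean) ** 2))

values  = [0, 1, 2]                     # hypothetical payoffs
uniform = [1/3, 1/3, 1/3]
skewed  = [0.9, 0.05, 0.05]

print(shannon_entropy(uniform), shannon_entropy(skewed))  # ~1.585 vs ~0.569 bits
print(std_dev(values, uniform), std_dev(values, skewed))  # ~0.816 vs ~0.477
```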
 
  • #2
StoneTemplePython
I presume we're dealing with simple random variables here (i.e. ones with finitely many outcomes). I'm not sure what the "standard distribution" is though. There are lots of ways to measure dispersion -- the typical way is std deviation / variance, so I'll assume that's what's being used here.

A few things to think about:
- entropy does not consider the magnitude of the payoffs involved -- but variance does. For entropy, if you think about this in terms of the classic binary guessing formulation, it should be clear. If you are worried about how much your fortune will swing after repeatedly making bets, then you likely are (or should be) more interested in variance information (see the sketch at the end of this post).
- in terms of convergence information: variance (and really the third moment, if using Stein's method) gives you some information on the rate of convergence to a Gaussian for sums of iid random variables. Entropy does not, but it does give you (in the form of KL divergence) information on the type/composition you're likely to see in certain random walk / threshold-crossing problems (Sanov's theorem); a short sketch of the KL term follows this list.
- variance and entropy are in some sense 'opposites': variance is a Schur-convex function while entropy is Schur-concave. But again, variance mixes probabilities and payoffs, while entropy ignores the magnitudes of the payoffs and looks only at the probabilities.
- classical probability emphasizes variance (for reasons related primarily to (a) the ease of using moments and (b) the central limit theorem).
You'll see a lot more about entropy in information geometry, though this is something of an esoteric topic.
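As a rough illustration of the KL-divergence remark above (my own numbers, not from the thread): Sanov's theorem says the probability of observing an empirical type q from n iid draws of p decays, to first order in the exponent, like exp(-n·D(q‖p)).

```python
# Minimal sketch (assumed numbers): the KL divergence D(q||p) that controls
# the large-deviation exponent in Sanov's theorem.
import numpy as np

def kl_divergence(q, p):
    """D(q||p) in nats for discrete distributions on the same support."""
    q = np.asarray(q, dtype=float)
    p = np.asarray(p, dtype=float)
    mask = q > 0
    return np.sum(q[mask] * np.log(q[mask] / p[mask]))

p = [0.5, 0.5]     # fair coin
q = [0.8, 0.2]     # atypical empirical frequency
n = 100            # number of iid draws

d = kl_divergence(q, p)
print(f"D(q||p) = {d:.4f} nats")
print(f"P(empirical type ~ q) ~ exp(-n*D) = {np.exp(-n * d):.2e}")  # up to polynomial factors
```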

So one immediate question is whether or not your 'uncertainty' is sensitive to the magnitude of deviation from 'typical' outcomes.
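A minimal sketch of that question, with made-up payoffs: two random variables with identical probabilities but payoffs on different scales have exactly the same entropy, while the variance grows with the square of the payoff scale.

```python
# Minimal sketch (made-up payoffs): same probabilities, different payoff scales.
# Entropy depends only on the probabilities; variance also feels the payoffs.
import numpy as np

p = np.array([0.5, 0.25, 0.25])
small = np.array([0.0, 1.0, 2.0])   # payoffs in dollars, say
large = 100 * small                 # the same bets at 100x the stakes

entropy = -np.sum(p * np.log2(p))   # 1.5 bits in both cases

def variance(x, p):
    mean = np.sum(x * p)
    return np.sum(p * (x - mean) ** 2)

print(entropy)                                 # 1.5
print(variance(small, p), variance(large, p))  # 0.6875 vs 6875.0
```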
 
  • #3
nomadreid
Thanks, StoneTemplePython. Yes, I am dealing with simple random variables. Sorry, "standard distribution" was a typo; I meant "standard deviation". Thanks for listing the criteria; your parting sentence is a good lens through which to look at the individual cases. I am interested in uncertainty as unpredictability; although the uncertainty principles, for example, are given in terms of standard deviations, the case for situations in which entropy is advantageous is made on page 28 of https://authors.library.caltech.edu/66493/2/chap10_15.pdf :
"Furthermore, the variance does not characterize very well the unpredictability of the measurement outcomes; entropy would be a more informative measure."
 

1. What is uncertainty?

Uncertainty refers to the lack of certainty or knowledge about a particular situation or event. It is the state of being unsure or having doubts about something.

2. What are the two aspects of uncertainty?

The two aspects of uncertainty are aleatory and epistemic. Aleatory uncertainty is related to inherent randomness or variability in a system, while epistemic uncertainty is caused by a lack of knowledge or understanding about a system.

3. How is uncertainty measured?

Uncertainty can be measured using various statistical and mathematical methods, such as probability distributions, confidence intervals, and sensitivity analyses. The specific method used depends on the type of uncertainty being considered.
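As a hedged illustration (not part of the original answer), one common recipe is a confidence interval for a sample mean; the snippet below uses synthetic measurements and the normal approximation.

```python
# Minimal sketch (synthetic data): a ~95% confidence interval for a sample mean.
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=10.0, scale=2.0, size=100)   # hypothetical measurements

mean = sample.mean()
std_err = sample.std(ddof=1) / np.sqrt(len(sample))  # standard error of the mean
z = 1.96                                             # ~95% coverage under normality

lower, upper = mean - z * std_err, mean + z * std_err
print(f"mean = {mean:.2f}, 95% CI = ({lower:.2f}, {upper:.2f})")
```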

4. How does uncertainty affect decision making?

Uncertainty can have a significant impact on decision making, as it introduces risk and potential consequences. It is important to consider and manage uncertainty in decision making processes to minimize potential negative outcomes.

5. Can uncertainty be reduced or eliminated?

While uncertainty cannot be completely eliminated, it can be reduced through various methods such as increasing knowledge and understanding of a system, using more accurate and precise measurements, and implementing risk management strategies.
