nomadreid
There are two aspects of uncertainty:
(a) how far different from the situation where all possibilities are of equal probability
(b) how spread out the values are.
In discussions about (Shannon) entropy and information, the first aspect is emphasized, whereas in discussions about the standard deviation, the second one is emphasized. Entropy and standard deviation are not the same, however. Which one, then, gives a better picture of "uncertainty"?
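To make the contrast concrete, here is a minimal sketch (with made-up example distributions) showing that the two measures can rank distributions in opposite orders: entropy depends only on the probabilities, while the standard deviation also depends on how far apart the values are.

```python
import math

def entropy(probs):
    # Shannon entropy in bits: -sum p*log2(p), aspect (a).
    # Depends only on the probabilities, not the values.
    return -sum(p * math.log2(p) for p in probs if p > 0)

def std_dev(values, probs):
    # Standard deviation of the value distribution, aspect (b).
    # Depends on how spread out the values themselves are.
    mean = sum(v * p for v, p in zip(values, probs))
    var = sum(p * (v - mean) ** 2 for v, p in zip(values, probs))
    return math.sqrt(var)

# A: two far-apart values, equally likely
values_a, probs_a = [0, 10], [0.5, 0.5]
# B: four close-together values, equally likely
values_b, probs_b = [0, 1, 2, 3], [0.25] * 4

print(entropy(probs_a), std_dev(values_a, probs_a))  # 1.0 bit, std 5.0
print(entropy(probs_b), std_dev(values_b, probs_b))  # 2.0 bits, std ~1.118
```

Here B has twice the entropy of A (more equally likely outcomes) but a much smaller standard deviation (its values sit close together), so the two notions of "uncertainty" genuinely come apart.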