Hi everyone, I was reading the Wikipedia article about information entropy today.

I need some help understanding why, in the definition of entropy,

http://upload.wikimedia.org/math/6/a/3/6a33010c16b1d526bc5daee924e3d363.png

i.e. H(X) = sum_i p(x_i) log2(1/p(x_i)), we use the log2(1/p(x_i)) factor.

I have read in the article that

"An intuitive understanding of information entropy relates to the amount of uncertainty about an event associated with a given probability distribution. "

Then why isn't the first part of the equation, sum_i p(x_i), enough on its own? p(x_i) already denotes the amount of uncertainty of an event.
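Just to make concrete what I mean, here is a small Python sketch (my own illustration, not from the article) comparing the two quantities for two hypothetical coin distributions. The sum of the probabilities alone is 1 for any distribution, while the full entropy formula distinguishes them:

```python
import math

def entropy(probs):
    # Shannon entropy: H = sum_i p(x_i) * log2(1 / p(x_i))
    # (terms with p = 0 contribute nothing, so they are skipped)
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Hypothetical example distributions: a fair coin vs. a biased coin
fair = [0.5, 0.5]
biased = [0.9, 0.1]

# The "first part" alone, sum_i p(x_i), is 1 for every distribution:
print(sum(fair), sum(biased))            # 1.0 1.0
# The entropy, with the log2(1/p) factor, tells them apart:
print(entropy(fair), entropy(biased))    # 1.0 vs. roughly 0.469
```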

**Physics Forums | Science Articles, Homework Help, Discussion**

The friendliest, high quality science and math community on the planet! Everyone who loves science is here!

# Shannon's entropy definition

