# Shannon's entropy definition

1. Oct 23, 2006

### dervast

Hi everyone, I was reading the Wikipedia article on information entropy today.
I need some help understanding why, in the entropy formula
http://upload.wikimedia.org/math/6/a/3/6a33010c16b1d526bc5daee924e3d363.png
we weight each event by log2(1/p(xi)).
I read in the article that
"An intuitive understanding of information entropy relates to the amount of uncertainty about an event associated with a given probability distribution."
So why isn't the first part of the equation, Sum p(xi), enough on its own? Doesn't p(xi) already denote the amount of uncertainty of an event?
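For concreteness, here is a small sketch of the two quantities being compared, assuming the standard definition H(X) = Sum p(xi) * log2(1/p(xi)) from the linked formula. It shows why the bare sum of probabilities cannot measure uncertainty: it is 1 for every distribution, while the log-weighted sum distinguishes a fair coin from a heavily biased one.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = sum p * log2(1/p), in bits.

    Terms with p == 0 are skipped, using the convention 0 * log(1/0) = 0.
    """
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin versus a heavily biased coin:
fair = [0.5, 0.5]
biased = [0.99, 0.01]

print(sum(fair), sum(biased))      # bare sums: 1.0 1.0 -- identical, uninformative
print(shannon_entropy(fair))       # 1.0 bit (maximum uncertainty for two outcomes)
print(shannon_entropy(biased))     # ~0.081 bits (outcome is nearly certain)
```

The function name and variable names here are illustrative, not from the article; the definition itself matches the linked Wikipedia formula.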
