Shannon's entropy definition

  1. Oct 23, 2006 #1
    Hi everyone. I was reading the Wikipedia article on information entropy today.
    I need some help understanding why, in the entropy formula
    http://upload.wikimedia.org/math/6/a/3/6a33010c16b1d526bc5daee924e3d363.png
    each event is weighted by log2(1/p(xi)).
    The article says:
    "An intuitive understanding of information entropy relates to the amount of uncertainty about an event associated with a given probability distribution."
    So why isn't the first part of the equation, Sum p(xi), enough on its own? Doesn't p(xi) already express the amount of uncertainty of an event?
     
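As a small illustration of the formula the poster links to (a sketch, not part of the original thread): the weights p(xi) alone always sum to 1 for any distribution, so it is the log2(1/p(xi)) factor that distinguishes more and less uncertain distributions. The function name below is hypothetical.

```python
import math

def shannon_entropy(probs):
    """Entropy in bits: H = sum over i of p(xi) * log2(1/p(xi)).

    Terms with p = 0 contribute nothing (the conventional 0*log(1/0) = 0).
    """
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# A fair coin (two outcomes, p = 0.5 each) gives 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is less uncertain, so its entropy is lower,
# even though its probabilities also sum to 1.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```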
