
Shannon's entropy definition

  1. Oct 23, 2006 #1
    Hi everyone. I was reading Wikipedia's article on information entropy today, and I need some help understanding why the entropy of an event uses log2(1/p(xi)).
    The article says:
    "An intuitive understanding of information entropy relates to the amount of uncertainty about an event associated with a given probability distribution."
    So why isn't the first part of the equation, Sum p(xi), enough on its own? Doesn't p(xi) already denote the amount of uncertainty of an event?
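To make the question concrete, here is a short Python sketch (not from the thread, just an illustration): the sum of the p(xi) alone is always 1 for any probability distribution, so it cannot measure uncertainty, whereas the weighted sum of log2(1/p(xi)) does distinguish a fair coin from a biased one.

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = sum over i of p(xi) * log2(1/p(xi)), in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

fair = [0.5, 0.5]     # fair coin
biased = [0.9, 0.1]   # biased coin

# The bare sum of probabilities is 1 for both, so it carries no information:
print(sum(fair), sum(biased))   # 1.0 for each distribution

# The entropy differs: maximal for the fair coin, lower for the biased one.
print(entropy(fair))    # 1.0 bit
print(entropy(biased))  # about 0.47 bits
```

The biased coin has lower entropy because its outcome is more predictable, which matches the intuition quoted from the article.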