
Entropy (information entropy)

  1. Apr 19, 2007 #1
    Hey to the nice community. I am reading about information entropy, a quantity that can be calculated by H(X) = -Sum[p(x) * log2(p(x))].
    Then I want to calculate the entropy for the following problem. Consider a random variable that has a uniform distribution over 32 outcomes, and I want to find the entropy:
    H(X) = -Sum[(1/32) * log2(1/32)], where I sum that quantity 32 times. The problem is that my result is -5, but the book says that the entropy is 5 bits. Why do we remove the minus sign from entropy calculations?
     
  2. Apr 19, 2007 #2

    Dick

    Science Advisor
    Homework Helper

    The result is 5, not -5: log2(1/32) = -5, and the minus sign in front of the sum makes the total positive.
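    A quick numerical check (a minimal Python sketch, standard library only; the variable names are just for illustration):

        import math

        # Uniform distribution over 32 outcomes: p(x) = 1/32 for every x.
        # H(X) = -Sum[p(x) * log2(p(x))], summed over all 32 outcomes.
        p = 1 / 32
        H = -sum(p * math.log2(p) for _ in range(32))
        print(H)  # 5.0 -- each term is (1/32)*(-5); the leading minus makes the total positive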
     
  3. Apr 19, 2007 #3
    My Excel calculates =LOG(1/32;0.5) as 5, not -5 as you mention.
     
  4. Apr 19, 2007 #4
    Thanks a lot for your reply.
    Now I need something more.
    Why, in a binary symmetric channel, is the capacity given by
    C = 1 + p*log2(p) + (1-p)*log2(1-p)?

    I only know that the capacity is defined as C = max I(X;Y).
    By the way, what does the ';' mean in I(X;Y)?
    Unfortunately my book doesn't explain these things, so if you can reply or point me to some good links, that would be really nice.
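    For reference, the formula above is equivalent to C = 1 - H(p), where H(p) is the binary entropy function, and the ';' in I(X;Y) simply separates the two random variables whose mutual information is taken. A minimal Python sketch of the capacity formula (function names chosen only for illustration):

        import math

        def binary_entropy(p):
            # H(p) = -p*log2(p) - (1-p)*log2(1-p): entropy of a biased coin.
            if p in (0.0, 1.0):
                return 0.0
            return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

        def bsc_capacity(p):
            # Binary symmetric channel with crossover probability p:
            # C = 1 + p*log2(p) + (1-p)*log2(1-p) = 1 - H(p) bits per channel use.
            return 1.0 - binary_entropy(p)

        print(bsc_capacity(0.0))   # 1.0 -- a noiseless channel carries one full bit
        print(bsc_capacity(0.11))  # ~0.5
        print(bsc_capacity(0.5))   # 0.0 -- pure noise, nothing gets through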
     
  5. Apr 19, 2007 #5

    Dick

    Science Advisor
    Homework Helper

    LOG(1/32;0.5)? Isn't that log base 1/2? I thought you wanted log base 2. Why not LOG(1/32;2)? As for your follow-up question - I don't know! You might want to start a new thread and see if you can find someone who does.
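    The same base mix-up is easy to reproduce outside Excel; a minimal Python sketch:

        import math

        # Excel's LOG(number; base) takes the base as its second argument.
        print(math.log(1/32, 2))    # ~ -5.0: log base 2, what the entropy formula needs
        print(math.log(1/32, 0.5))  # ~  5.0: log base 1/2, same magnitude but opposite sign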
     
  6. Apr 19, 2007 #6
    Thanks a lot, you are right: I had used the LOG function incorrectly. As for my second question, I'll wait and see if someone replies.
    Thanks a lot for the help though.
     