Trying to understand binary entropy function


Homework Help Overview

The discussion revolves around the binary entropy function in statistics, specifically its calculation and interpretation. The original poster expresses confusion regarding the application of the formula for entropy when dealing with a fair coin toss.

Discussion Character

  • Conceptual clarification, Mathematical reasoning

Approaches and Questions Raised

  • The original poster attempts to calculate the entropy using the formula provided but arrives at an incorrect result due to using the wrong logarithm base. Other participants suggest the need for binary logarithms and discuss the implications of understanding logarithmic bases.

Discussion Status

Participants are exploring the correct application of logarithms in the entropy formula. Some guidance has been offered regarding the use of binary logarithms, and there is an ongoing exploration of how to compute these on calculators. The original poster is also reflecting on their understanding of the concept of entropy.

Contextual Notes

The original poster notes that their inquiry is for personal study rather than a coursework requirement, and there are constraints regarding posting in the appropriate forum for textbook-style questions.

joebloggs
Ok firstly I am new to statistics but have a layman's interest in entropy.

On Wikipedia I came across this article on the binary entropy function (http://en.wikipedia.org/wiki/Binary_entropy_function).

It says... If Pr(X = 1) = p, then Pr(X = 0) = 1 − p and the entropy of X is given by
H(X) = -p log p - (1-p) log (1-p)

For a fair coin p = 1/2, so if I plug that into the equation I get

H(X) = -0.5 log 0.5 - 0.5 log 0.5

When I do this on my calculator I come up with H(X) = 0.301

The actual answer should be H(X) = 1,
since this is the situation of maximum uncertainty (entropy, in this case): it is most difficult to predict the outcome of the next toss, and the result of each toss of the coin delivers a full 1 bit of information.

What am I doing wrong? Am I not putting it into my calculator correctly?
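(A quick sketch of the discrepancy, not from the thread itself: evaluating the same formula with the calculator's base-10 "log" button versus a binary log reproduces both numbers mentioned above.)

```python
import math

p = 0.5  # fair coin: Pr(X = 1) = 1/2

# With base-10 logs (what a calculator's "log" button computes):
h_base10 = -p * math.log10(p) - (1 - p) * math.log10(1 - p)

# With base-2 (binary) logs, as the entropy formula intends:
h_base2 = -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(round(h_base10, 3))  # 0.301 -- the calculator result above
print(h_base2)             # 1.0 -- the expected one bit
```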
 
According to the article you're supposed to use binary logarithms (base 2) in the formula. You've entered plain old base-10 logarithms into your calculator.
 
oh ok. How do I enter binary logarithms on my calculator?

Sorry about this, I've just realized that I should have posted this thread in the coursework forum.
 
Oh - coursework: I'll get in trouble if I answer it. However, if you just think about the meaning of the binary logarithm, you should see immediately what the binary log of 0.5 is, and then the answer to your equation follows without a calculator.
 
Ok I think I see what you are saying. 2 raised to the power of -1 = 0.5. So there is the 1 bit. This is not a problem I have to do for any paper or course (I study environmental planning). This is just personal study, I'm interested in getting a more thorough understanding of information entropy and how it relates to thermodynamic entropy.

The sticky thread said that for any textbook style questions it had to go to the coursework forum. But I don't know how to shift it. I've emailed one of the moderators about it.

On my calculator I have a log (to the base 10) button and a natural log button. The only way I know to put something in log to the base 2 is to go log(x)/log2.

Is there a better way to get the log to the base 2 of a number on the calculator?

Cheers for your help though :)
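(The change-of-base trick mentioned above, log₂ x = log x / log 2, can be checked in a few lines of Python; a hypothetical sketch, and either the base-10 or the natural log button works as long as both logs use the same base.)

```python
import math

def log2_via_base10(x):
    # Change of base: log2(x) = log10(x) / log10(2)
    return math.log10(x) / math.log10(2)

def log2_via_natural(x):
    # The same trick with the natural-log (ln) button
    return math.log(x) / math.log(2)

print(log2_via_base10(0.5))  # -1.0 (up to floating-point error)
print(log2_via_natural(8))   # 3.0 (up to floating-point error)
```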
 
