Trying to understand binary entropy function

Summary
The discussion revolves around understanding the binary entropy function, specifically the calculation of entropy for a fair coin toss. The user initially miscalculates the entropy, arriving at 0.301 instead of the correct value of 1, due to using base 10 logarithms instead of binary logarithms. Clarification is provided that the binary logarithm can be computed using the formula log(x)/log(2) on standard calculators. The user expresses a desire to deepen their understanding of information entropy and its relation to thermodynamic entropy, despite realizing the thread may belong in a coursework forum. The conversation highlights the importance of using the correct logarithmic base in entropy calculations.
joebloggs
OK, firstly, I am new to statistics but have a layman's interest in entropy.

On Wikipedia I came across this article on the binary entropy function (http://en.wikipedia.org/wiki/Binary_entropy_function).

It says... If Pr(X = 1) = p, then Pr(X = 0) = 1 − p and the entropy of X is given by
H(X) = -p log p - (1-p) log (1-p)

For a fair coin p = 1/2, so if I plug that into the equation I get

H(X) = -0.5 log 0.5 - 0.5 log 0.5

When I do this on my calculator I come up with H(X) = 0.301

The actual answer should be H(X) = 1,
as this is the situation of maximum uncertainty (entropy, in this case): it is most difficult to predict the outcome of the next toss, and the result of each toss of the coin delivers a full 1 bit of information.

What am I doing wrong? Am I not putting it into my calculator correctly?
 
According to the article, you're supposed to use binary (base-2) logarithms in the formula. You've entered plain old base-10 logarithms in your calculator.
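A minimal Python sketch of the same calculation (illustrative only; the helper name binary_entropy and the use of Python's math module are choices made for this example, not anything from the thread) reproduces both numbers:

```python
import math

def binary_entropy(p, log=math.log2):
    # H(X) = -p*log(p) - (1-p)*log(1-p); the base is set by whichever log function is passed in.
    return -p * log(p) - (1 - p) * log(1 - p)

# With binary (base-2) logarithms, a fair coin gives the expected full bit:
print(binary_entropy(0.5, math.log2))    # 1.0

# With base-10 logarithms the same formula gives the 0.301 figure,
# because -log10(0.5) = log10(2) is approximately 0.30103:
print(binary_entropy(0.5, math.log10))   # approximately 0.301
```

The formula itself is unchanged; only the base of the logarithm differs, and with base 2 the answer comes out in bits.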
 
Oh, OK. How do I enter binary logarithms on my calculator?

Sorry about this, I've just realized that I should have posted this thread in the coursework forum.
 
Oh, coursework: I'll get in trouble if I answer it. However, if you just think about the meaning of the binary logarithm, you should immediately see what the binary log of 0.5 is, and then see the answer to your equation without using a calculator.
 
OK, I think I see what you are saying: 2 raised to the power of -1 equals 0.5, so the binary log of 0.5 is -1, and there is the 1 bit. This is not a problem I have to do for any paper or course (I study environmental planning). This is just personal study; I'm interested in getting a more thorough understanding of information entropy and how it relates to thermodynamic entropy.

The sticky thread said that any textbook-style questions have to go to the coursework forum, but I don't know how to move this one. I've emailed one of the moderators about it.

On my calculator I have a log (base 10) button and a natural log button. The only way I know to take the log to the base 2 of something is to compute log(x)/log(2).

Is there a better way to get the log to the base 2 of a number on the calculator?

Cheers for your help though :)
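On the log(x)/log(2) point, a quick Python sketch (again illustrative only, not part of the original posts) shows that the change of base works with either the base-10 or the natural-log button and matches a direct base-2 logarithm:

```python
import math

x = 0.5

# Change of base: log2(x) = log(x) / log(2), using whichever log the calculator offers.
via_log10 = math.log10(x) / math.log10(2)   # base-10 button
via_ln    = math.log(x) / math.log(2)       # natural-log button
direct    = math.log2(x)                    # direct binary logarithm

print(via_log10, via_ln, direct)   # all three print -1.0
```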
 