Why is the minus sign removed in information entropy calculations?

  • Thread starter dervast
  • Start date
  • Tags
    Entropy
In summary, the conversation discusses the calculation of information entropy using the formula H(X) = -Σ p(x) log2 p(x). The original poster tries to compute the entropy of a random variable with a uniform distribution over 32 outcomes and is confused about the result being a negative value. The culprit turns out to be an Excel formula: LOG(1/32;0.5) computes the log base 1/2 rather than base 2, so LOG(1/32;2) should be used instead. The conversation also touches on the capacity of a binary symmetric channel and the meaning of the ";" in the mutual information notation I(X;Y), for which the poster is looking for further references.
  • #1
dervast
Hey to the nice community. I am reading about information entropy, a quantity that can be calculated with H(X) = -Σ p(x) log2 p(x).
Now I want to calculate the entropy for the following problem: consider a random variable with a uniform distribution over 32 outcomes, and find its entropy.
H(X) = -Σ (1/32) log2(1/32), where I sum the quantity 32 times. The problem is that my result is -5, while the book says the entropy is 5 bits. Why is the minus sign removed in entropy calculations?
 
  • #2
The result is 5: log2(1/32) = -5, and the leading minus sign in the formula flips the sum to +5.
 
  • #3
My Excel computes =LOG(1/32;0.5)
as 5, not -5 as you mention.
 
  • #4
Thanks a lot for your reply.
Now I need something more.
Why, for a binary symmetric channel, is the capacity calculated as
C = 1 + p log p + (1-p) log(1-p)?

I only know that the capacity is defined as C = max I(X;Y).
By the way, what does the ";" mean in I(X;Y)?
Unfortunately my book doesn't explain these things, so if you can reply or point me to some good links that would be really nice.
 
  • #5
LOG(1/32;0.5)? Isn't that the log base 1/2? I thought you wanted log base 2, so why not LOG(1/32;2)? As for your follow-up question, I don't know! You might want to start a new thread and see if you can find someone who does.
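The base mix-up described above can be reproduced in Python, whose `math.log` takes the base as its second argument, just like Excel's LOG (a quick sketch, not part of the original thread):

```python
import math

x = 1 / 32

# Log base 1/2, which is what LOG(1/32;0.5) computes: gives +5
print(math.log(x, 0.5))  # ≈ 5.0

# Log base 2, which is what the entropy formula needs: gives -5
print(math.log(x, 2))    # ≈ -5.0
```

The two results differ only in sign because log base 1/2 of x equals -log base 2 of x.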
 
  • #6
Thanks a lot, you are right: I had used the LOG function incorrectly. As for my second question, I'll wait and see if someone replies.
Thanks again for the help.
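For anyone landing on this thread later: the binary symmetric channel capacity formula from post #4 can be checked numerically. A minimal sketch, assuming the logs are base 2 so capacity comes out in bits (the helper name bsc_capacity is ours, not from any library):

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 + p*log2(p) + (1-p)*log2(1-p), i.e. C = 1 - H(p)."""
    if p in (0.0, 1.0):
        return 1.0  # p*log2(p) -> 0 as p -> 0; noiseless or deterministic-flip channel
    return 1 + p * math.log2(p) + (1 - p) * math.log2(1 - p)

print(bsc_capacity(0.0))  # 1.0 -> perfect channel, one full bit per use
print(bsc_capacity(0.5))  # 0.0 -> output is independent of input, nothing gets through
```

The formula is 1 minus the binary entropy of the crossover probability p: the noisier the channel, the more of each transmitted bit is "spent" on the uncertainty the channel introduces.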
 

1. What is entropy?

Entropy is a measure of the disorder or randomness present in a system. In the context of information theory, entropy is a measure of the uncertainty or unpredictability of a message or data set.

2. How is entropy calculated?

In information theory, entropy is calculated using the Shannon entropy formula: H(X) = -Σ p(x) log2 p(x), where p(x) is the probability of a specific outcome or symbol in a data set X. This formula can also be used to calculate the entropy of a system in statistical thermodynamics.
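The formula above is straightforward to compute directly. A minimal Python sketch (the helper name shannon_entropy is ours, not from any library), applied to the uniform 32-outcome distribution from the thread:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H(X) = -sum(p * log2(p)),
    skipping zero-probability outcomes (their contribution tends to 0)."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Uniform distribution over 32 outcomes, as in the question above
uniform = [1 / 32] * 32
print(shannon_entropy(uniform))  # 5.0
```

Note that the minus sign is part of the formula itself: each log2 p(x) is negative or zero (since p(x) <= 1), so negating the sum yields a non-negative entropy.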

3. How does entropy relate to information?

Entropy and information are closely related concepts. Entropy measures the average amount of information gained by observing an outcome of a source: a highly predictable source has low entropy and each outcome conveys little new information, while a highly random source has high entropy and each outcome is more informative. This is why entropy sets a lower bound on the average number of bits needed to encode a message from that source.

4. What is the significance of entropy in science?

Entropy has many important applications in various fields of science, including information theory, thermodynamics, and statistics. In information theory, it helps measure the efficiency of data compression and communication systems. In thermodynamics, it is a fundamental concept that describes the direction of natural processes and the efficiency of energy conversion. In statistics, it is used to measure the uncertainty or variability in a data set.

5. Can entropy be decreased or reversed?

In a closed system, entropy can only increase or stay the same, according to the second law of thermodynamics. However, in certain systems, such as living organisms, it is possible to maintain or decrease entropy locally by expending energy. This process is known as negentropy or entropy reduction.
