Shannon entropy - used to calculate the number of bits needed to encode a symbol

In summary, encoding one symbol in binary form takes 3 bits, so the 6 symbols of "We are" take 6*3 = 18 bits, as the linked calculator shows. The number of bits needed is related to the entropy, which measures the average number of bits per symbol required to encode a source. It is also possible to use fewer than 3 bits per symbol on average, since not all 8 patterns that 3 bits allow are actually needed. The choice of codeword for "W" is up to the individual, as long as a table describing the encoding accompanies the message.
  • #1
Outrageous
To encode a symbol in binary form I need 3 bits, and I have 6 symbols.
So I need 6*3 = 18 bits to encode "We are" into binary form, as shown at http://www.shannonentropy.netmark.pl/calculate
My question: if 3 bits encode one symbol, then I have to use 18 bits, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _.
How do I encode "W" into _ _ _ ? Each _ is filled with a 1 or a 0.
The 3 bits are calculated from the entropy. How does this relate to the entropy? Please help, I'd really appreciate it.
 
  • #2
You need less than 3 bits on average, as you do not use all 8 symbols you could encode with 3 bits. For example, you can choose one symbol and encode it with just two bits (this "blocks" two of the 3-bit strings).

How to encode"W" into _ _ _ ?
That is completely your choice. Pick anything you like, just provide a table where you describe the encoded symbols.
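
For instance, here is a minimal sketch (in Python; the particular codewords are an arbitrary, illustrative choice, not the only valid one) of such a table and the fixed-length encoding it produces:

```python
# A possible (arbitrary) fixed-length code table for the symbols in "We are".
# Any assignment works, as long as every symbol gets a distinct codeword
# and the table is supplied alongside the encoded message.
code_table = {
    "W": "000",
    "e": "001",
    " ": "010",
    "a": "011",
    "r": "100",
}

message = "We are"
encoded = "".join(code_table[ch] for ch in message)

print(encoded)        # 000001010011100001 -> 6 characters * 3 bits
print(len(encoded))   # 18
```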

I moved your thread to our homework section.
 
  • #3
mfb said:
You need less than 3 bits on average, as you do not use all 8 symbols you could encode with 3 bits. For example, you can choose one symbol and encode it with just two bits (this "blocks" two of the 3-bit strings).

That is completely your choice. Pick anything you like, just provide a table where you describe the encoded symbols.

I moved your thread to our homework section.
Thanks. Could you please point me to some simple links to read?
 
  • #4
I don't have links, but every textbook and a lot of websites should cover that.
 
  • #5


Shannon entropy is a measure of the uncertainty or randomness in a system. In this case, it is being used to calculate the minimum number of bits needed to represent a symbol in binary form. The formula for Shannon entropy is H = -∑P(x)log2P(x), where P(x) is the probability of a particular symbol occurring.

In the given example there are 6 symbols, and if we treat them as equally likely, each has probability 1/6. Plugging this into the Shannon entropy formula gives H = -6*(1/6)*log2(1/6) = log2(6) ≈ 2.58 bits per symbol. A fixed-length binary code must use a whole number of bits per symbol, so we round up to ceil(log2 6) = 3 bits.
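
As a quick check, a small Python sketch under the same equal-probability assumption:

```python
import math

n_symbols = 6
entropy = math.log2(n_symbols)       # H = log2(6) ≈ 2.585 bits per symbol
fixed_bits = math.ceil(entropy)      # round up to a whole number of bits

print(entropy)      # 2.584962500721156
print(fixed_bits)   # 3
```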

So, with a fixed-length code, each symbol takes 3 bits. For the message "We are", which is 6 characters long, that gives 6*3 = 18 bits in total.

Now, for your question about encoding "W" in 3 bits: the entropy does not dictate a particular codeword. Any 3-bit pattern will do, for example W = 000, as long as every symbol gets a distinct code and the table is supplied with the message. Note also that in "We are" the symbols are not actually equally likely: "e" occurs twice, so p(e) = 1/3 while the other four symbols each have probability 1/6. The entropy of that distribution is H = -(1/3)log2(1/3) - 4*(1/6)log2(1/6) ≈ 2.25 bits per symbol, which is below log2(6) ≈ 2.58; this is why a variable-length code, as suggested in post #2, can use fewer than 3 bits per symbol on average.
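
To illustrate this, here is a short Python sketch (the message and the frequency-based probabilities are just this thread's example) that computes the entropy from the observed character counts:

```python
import math
from collections import Counter

message = "We are"
counts = Counter(message)      # 'e' occurs twice, every other symbol once
total = len(message)

# Shannon entropy H = -sum p(x) * log2 p(x), with p(x) taken from the
# observed character frequencies in the message.
entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())

print(counts)    # Counter({'e': 2, 'W': 1, ' ': 1, 'a': 1, 'r': 1})
print(entropy)   # ≈ 2.25 bits per symbol, less than log2(6) ≈ 2.58
```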

Overall, Shannon entropy is a useful tool: it gives a lower bound on the average number of bits per symbol needed to represent the output of a source. It is important to note, however, that the bound depends on the assumed symbol probabilities, and a practical code may need slightly more bits per symbol than the entropy suggests.
 

1. What is Shannon entropy and how is it used in information theory?

Shannon entropy is a measure of the uncertainty or randomness in a system. In information theory, it is used to calculate the minimum number of bits required to encode a symbol or set of symbols. This allows for efficient communication and storage of information.

2. How is Shannon entropy calculated?

Shannon entropy is calculated using the formula H = -Σp(x)log2p(x), where p(x) represents the probability of a symbol occurring. This formula takes into account both the frequency and the randomness of the symbols in a system.

3. Why is Shannon entropy important in data compression?

Shannon entropy is important in data compression because it allows us to determine the minimum number of bits needed to encode a symbol or set of symbols. This helps in reducing the size of data and making it more efficient to store and transmit.
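
To make this concrete, here is a self-contained Python sketch of a Huffman code, one standard variable-length scheme that approaches the entropy bound. The helper huffman_code and the example message are illustrative, not something from the thread above:

```python
import heapq
from collections import Counter

def huffman_code(message):
    """Build a prefix code where frequent symbols get shorter codewords."""
    counts = Counter(message)
    # Heap entries are (frequency, tie_breaker, {symbol: partial_codeword});
    # the integer tie_breaker keeps comparisons away from the dicts.
    heap = [(freq, i, {sym: ""}) for i, (sym, freq) in enumerate(counts.items())]
    heapq.heapify(heap)
    if len(heap) == 1:                      # degenerate one-symbol message
        return {next(iter(counts)): "0"}
    tie = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)   # two least frequent subtrees
        f2, _, right = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in left.items()}
        merged.update({s: "1" + c for s, c in right.items()})
        heapq.heappush(heap, (f1 + f2, tie, merged))
        tie += 1
    return heap[0][2]

message = "We are"
table = huffman_code(message)
encoded = "".join(table[ch] for ch in message)

print(table)          # the frequent 'e' gets a 2-bit codeword, rarer symbols 2-3 bits
print(len(encoded))   # 14 bits, versus 6 * 3 = 18 bits with a fixed-length code
```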

4. How does Shannon entropy relate to the concept of information entropy?

Shannon entropy is the standard measure of information entropy: it quantifies the average amount of information produced by a source. The more uncertain or random the source, the higher its entropy and the more information each symbol carries on average.

5. Can Shannon entropy be used in other fields besides information theory?

Yes, Shannon entropy has applications in various fields such as genetics, linguistics, and finance. It can be used to analyze the complexity and randomness in different systems and make predictions about their behavior.
