Shannon entropy - used to calculate the bits needed to encode a symbol


Discussion Overview

The discussion revolves around the calculation of bits needed to encode symbols using Shannon entropy, specifically addressing how many bits are required for encoding a set of symbols and the relationship between the number of bits and entropy. The scope includes theoretical aspects of information encoding and practical encoding choices.

Discussion Character

  • Technical explanation, Homework-related, Conceptual clarification

Main Points Raised

  • One participant claims that 3 bits are needed to encode each of the 6 symbols, leading to a total of 18 bits for the phrase "We are."
  • Another participant argues that less than 3 bits are needed on average since not all possible symbols are utilized, suggesting that some symbols could be encoded with fewer bits.
  • A participant emphasizes that the choice of how to encode a specific symbol, such as "W," is flexible and can be defined by the user, provided a table of encoded symbols is created.
  • There is a request for additional resources or links to further reading on the topic, but one participant notes they do not have specific links, suggesting that textbooks and various websites cover the subject.

Areas of Agreement / Disagreement

Participants do not reach a consensus on the exact number of bits needed for encoding, with differing views on the average bits required and the flexibility of encoding choices. The discussion remains unresolved regarding the specifics of encoding based on entropy.

Contextual Notes

The discussion does not clarify the assumptions behind the calculations of bits required or the definitions of symbols used, leaving some aspects of the encoding process and its relation to entropy open to interpretation.

Outrageous:
To encode one symbol in binary form I need 3 bits, and I have 6 symbols.
So I need 6 × 3 = 18 bits to encode "We are" in binary, as shown at http://www.shannonentropy.netmark.pl/calculate
My question: at 3 bits per symbol, I have to fill 18 bits: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _.
How do I encode "W" into _ _ _? Each _ is filled with a 1 or a 0.
The 3 bits were calculated from the entropy, but how does this relate to the entropy? Please help. Really appreciated.
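To see where the entropy number comes from, here is a short Python sketch (not from the thread, just an illustration) that computes the Shannon entropy of the phrase "We are" from its symbol frequencies:

```python
import math
from collections import Counter

text = "We are"
freq = Counter(text)  # 'W':1, 'e':2, ' ':1, 'a':1, 'r':1
n = len(text)         # 6 symbols in total

# Shannon entropy: H = -sum(p_i * log2(p_i)), in bits per symbol
H = -sum((c / n) * math.log2(c / n) for c in freq.values())

print(round(H, 3))      # 2.252 bits per symbol
print(round(H * n, 1))  # 13.5 bits: the theoretical lower bound for the phrase
```

Note that the entropy (about 2.25 bits per symbol) is below 3 bits, which is why a fixed 3-bit code is not optimal here: the entropy gives the average number of bits per symbol an ideal code would need, not the length of any one codeword.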
 
mfb:
You need less than 3 bits on average, as you do not use all 8 symbols you could encode with 3 bits. For example, you can choose one symbol and encode it with just two bits (this "blocks" two 3-bit-strings).

How to encode"W" into _ _ _ ?
That is completely your choice. Pick anything you like, just provide a table where you describe the encoded symbols.

I moved your thread to our homework section.
 
mfb said:
You need less than 3 bits on average, as you do not use all 8 symbols you could encode with 3 bits. For example, you can choose one symbol and encode it with just two bits (this "blocks" two 3-bit-strings).

That is completely your choice. Pick anything you like, just provide a table where you describe the encoded symbols.

I moved your thread to our homework section.
Thanks. Can you please give me some simple links to read?
 
I don't have specific links, but every textbook and plenty of websites cover this.
 