Max Entropy of 16-Symbol Source

  • Context: MHB
  • Thread starter: jNull
  • Tags: Entropy, Max, Source

Discussion Overview

The discussion revolves around the concept of entropy, specifically focusing on determining the maximum entropy of a source with 16 symbols. Participants explore the theoretical underpinnings of entropy, its mathematical formulation, and intuitive explanations of the concept.

Discussion Character

  • Exploratory
  • Technical explanation
  • Conceptual clarification
  • Debate/contested

Main Points Raised

  • Some participants introduce the formula for entropy, stating that for a random variable with n possible symbols, the entropy is defined as \( H(X) = - \sum_{k=1}^{n} P(x_{k})\log_{2} P(x_{k}) \).
  • One participant explains that the maximum entropy occurs when each symbol has an equal probability of \( \frac{1}{16} \), leading to a maximum uncertainty in the system.
  • Another participant emphasizes the minimum entropy scenario where one symbol has a probability of 1, resulting in zero uncertainty.
  • There is a discussion about the convention used in defining \( 0 \times (-\infty) = 0 \) in the context of entropy, with some participants expressing concern about the implications of such conventions on fundamental results.
  • One participant argues that the use of the term "convention" does not imply arbitrariness, but rather is well-founded within the context of entropy.

Areas of Agreement / Disagreement

Participants agree on the mathematical formulation of entropy but differ on how to regard the treatment of \( 0 \times (-\infty) \): one worries that a fundamental result should not rest on a mere "convention", while another maintains that the convention is well-founded in context and not at all arbitrary. The exchange ends without a fully reconciled position.

Contextual Notes

There are references to specific mathematical proofs and conventions that may not be universally accepted or understood, indicating a potential limitation in the discussion's clarity for those unfamiliar with the underlying mathematics.

jNull
Hi, I am studying entropy and I am new to the concept. I don't know where to start with this question:
State the maximum entropy of a 16-symbol source.

Thank you.
 
jNull said:
Hi, I am studying entropy and I am new to the concept. I don't know where to start with this question:
State the maximum entropy of a 16-symbol source.

Thank you.

Welcome to MHB, jNull!... In information theory, the entropy of a random variable X that can take n possible symbols is defined as...

$\displaystyle H(X) = - \sum_{k=1}^{n} P(x_{k})\ \log_{2} P(x_{k})\qquad (1)$

... where $P(x_{k})$ is the probability $P \{X=x_{k}\}$ ...

In your case, $n=16$...

Kind regards

$\chi$ $\sigma$
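As a quick numerical check of definition (1), here is a minimal Python sketch (not from the thread; the helper name `entropy_bits` is just an illustrative choice) that evaluates $H(X)$ for a given probability vector:

```python
import math

def entropy_bits(probs):
    """Shannon entropy H(X) = -sum_k p_k * log2(p_k), in bits.

    Zero-probability terms are skipped, which matches the convention
    0 * log2(0) = 0 discussed later in the thread.
    """
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Uniform 16-symbol source: every symbol has probability 1/16 -> 4.0 bits.
print(entropy_bits([1 / 16] * 16))

# One symbol certain, the rest impossible -> 0.0 bits.
print(entropy_bits([1.0] + [0.0] * 15))
```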
 
jNull said:
Hi, I am studying entropy and I am new to the concept. I don't know where to start with this question:
State the maximum entropy of a 16-symbol source.

Thank you.
Intuitively, entropy is a measurement of randomness or lack of information. In the case of the 16 symbols, you might think of them as being 16 doors. Behind one of the doors is a brand new S-Class Mercedes, yours to drive off with if you choose the right door. If you have some inside information telling you for certain that the car is behind a particular door, say door number 7, then you would assign the probability $1$ to door 7 and probability $0$ to each of the other 15 doors. There would then be no uncertainty about the situation, and the entropy of the system would be $0$. At the opposite extreme, if you had no prior information about the situation then you would have to assign the probability $1/16$ to each of the doors, and the entropy ("lack of information") of the system would be maximised.

Coming back to the mathematics of the situation, the fundamental formula for entropy is the one given by chisigma, $$H(X) = -\sum_{k=1}^nP(x_k)\log_2(P(x_k))$$ (with the convention that $0\times (-\infty) = 0$, so that if a probability $P(x_k)$ is $0$ then $P(x_k)\log_2(P(x_k))$ is taken to be $0$). For the 16-symbol source, the entropy is minimised when one probability is $1$ and the others are all $0$. That gives the minimum entropy as $0$. The entropy is maximised in the situation where there is a complete lack of information, namely when $P(x_k) = 1/16$ for $1\leqslant k\leqslant 16$.
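Spelling out that maximum numerically (a step the post above leaves implicit), the uniform distribution gives
$$H_{\max}(X) = -\sum_{k=1}^{16}\frac{1}{16}\log_2\frac{1}{16} = 16\cdot\frac{1}{16}\cdot\log_2 16 = 4\ \text{bits},$$
while the degenerate distribution (one probability equal to $1$) gives $H_{\min}(X) = 0$.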
 
Opalg said:
... coming back to the mathematics of the situation, the fundamental formula for entropy is the one given by chisigma, $$H(X) = -\sum_{k=1}^nP(x_k)\log_2(P(x_k))$$ (with the convention that $0\times (-\infty) = 0$, so that if a probability $P(x_k)$ is $0$ then $P(x_k)\log_2(P(x_k))$ is taken to be $0$)...

A rigorous proof of the fact that, for the function $\displaystyle f(x) = x\ln x$, we have $f(0)=0$ has been given in...

http://mathhelpboards.com/analysis-50/never-ending-dispute-2060.html?highlight=ending+dispute

Having studied information theory for decades, I would be very concerned if a fundamental result rested on a 'convention' that some 'imaginative mind' could change one day...


Merry Christmas from Serbia

$\chi$ $\sigma$
 
chisigma said:
A rigorous proof of the fact that, for the function $\displaystyle f(x) = x\ln x$, we have $f(0)=0$ has been given in...

http://mathhelpboards.com/analysis-50/never-ending-dispute-2060.html?highlight=ending+dispute

Having studied information theory for decades, I would be very concerned if a fundamental result rested on a 'convention' that some 'imaginative mind' could change one day...
The "convention" is of course completely well-founded in the context of the entropy function, and the use of the word does not in any way imply that there is something arbitrary or negotiable about it. But in the absence of some such context, the expression $0\times \infty$ is not well-defined. That is why I wanted to emphasise the need to define $f(0) = 0$ for the function $f(x) = x\log_2(x).$
 
