# Information content of Dirac delta function

friend
I understand that the Dirac delta function can be regarded as a distribution, and that one can calculate the Shannon entropy, or information content, of any distribution. So what is the information content of the Dirac delta function? I think it is probably identically zero, but I'd like to see a proof. I could not find anything specific online about this, so any help would be appreciated. Thanks.

Preno
Those are two entirely different senses of "distribution". The Dirac delta is a distribution in the sense of distribution theory (a generalized function), whereas Shannon entropy applies to probability distributions.

friend
I found this information on the Web from the book:
Ecosystem Ecology: A New Synthesis, edited by David G. Raffaelli and Christopher L. J. Frid, on page 46

He calculated the Shannon entropy of a Dirac delta function to be zero. Actually, he seems to be calculating it for the discrete Kronecker delta. I wonder how one would extend this to the continuous case of the Dirac delta. Thanks.

#### Attachments

• Entropy_of_delta.gif
60 KB
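The discrete case mentioned above is easy to check directly. Below is a minimal sketch (the helper name `shannon_entropy` is my own, not from the book): the Kronecker-delta distribution puts all probability mass on one outcome, and its Shannon entropy comes out to exactly zero.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H = -sum p_i log2 p_i, with the 0*log(0) = 0 convention."""
    return sum(-pi * math.log2(pi) for pi in p if pi > 0)

# Kronecker-delta distribution: all mass on a single outcome.
kronecker = [0.0, 0.0, 1.0, 0.0, 0.0]
print(shannon_entropy(kronecker))  # 0.0 - a certain outcome carries no surprise

# For comparison, a uniform distribution over 5 outcomes:
uniform = [0.2] * 5
print(shannon_entropy(uniform))    # log2(5), about 2.32 bits
```

This matches the book's result: for a discrete distribution concentrated on one point, the only nonzero term is 1 · log(1) = 0.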
Preno
The Dirac delta is not a probability density function, so it does not have a well-defined differential entropy. (The delta does correspond to a legitimate probability distribution - p(X<x) = 0 for x <= 0 and p(X<x) = 1 for x > 0 - but that distribution is not generated by any probability density function.) However, you could define its entropy as the limit of the entropies of successive approximations to the Dirac delta. These can be taken to be increasingly narrower and higher bell curves or rectangles of constant unit area. It's easy to show that the limit is zero - the limit is in effect x*log(x) as x goes to 0.

friend
... However, you could define its entropy as the limit of the entropies of successive approximations to the Dirac delta. These can be taken to be increasingly narrower and higher bell curves or rectangles of constant unit area. It's easy to show that the limit is zero - the limit is in effect x*log(x) as x goes to 0.

So if we take the limit of the delta function after we calculate its entropy, then it goes to zero, right? If it's not too much trouble, could you show me the math of this situation? Thanks.

Preno
Well, as I said, the Dirac delta distribution can be approximated, for example, by a sequence of rectangles $R_n$ centered at zero, of width $1/n$ and height $n$.

$$H_n = - \int_{-1/(2n)}^{1/(2n)} R_n(x) \cdot \log (R_n(x)) \,\textrm{d}x = - \frac{1}{n} \cdot n \cdot \log(n) = -\log(n)$$

So in the limit you actually get $$-\infty$$ rather than 0. (But that's okay: unlike discrete Shannon entropy, differential entropy can be negative, and here it diverges as the approximations sharpen.)
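The calculation above can be checked numerically. This is a small sketch (the helper name `rect_entropy_numeric` is my own): it integrates $-R_n \log R_n$ over the support of the rectangle with a midpoint rule and compares the result with the closed form $-\log(n)$.

```python
import math

def rect_entropy_numeric(n, steps=10000):
    """Midpoint-rule estimate of -integral R_n(x) log R_n(x) dx,
    where R_n has height n on [-1/(2n), 1/(2n)] and is 0 elsewhere."""
    a, b = -1.0 / (2 * n), 1.0 / (2 * n)
    dx = (b - a) / steps
    h = 0.0
    for _ in range(steps):
        p = n                      # R_n(x) = n everywhere on its support
        h -= p * math.log(p) * dx  # accumulate -p log p
    return h

for n in (1, 10, 100, 1000):
    print(n, rect_entropy_numeric(n), -math.log(n))
# The numeric value tracks -log(n), which goes to -infinity as n grows:
# each narrower, taller rectangle has strictly lower differential entropy.
```

Since $R_n$ is constant on its support, the quadrature is exact up to rounding; the point of the check is just that $H_n = -\log(n)$ decreases without bound, so the entropy of the delta "limit" is $-\infty$, not 0.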