Information content of Dirac delta function

Summary
The Dirac delta is a distribution in the sense of generalized functions, not an ordinary probability density, so it has no well-defined Shannon entropy of its own. One proposal in the thread is to define its entropy as the limit of the entropies of increasingly narrow approximations, such as bell curves or rectangles of constant unit area. An initial guess that this limit is zero turns out to be wrong: for rectangles of width 1/n and height n the differential entropy is -log(n), which diverges to negative infinity (differential entropy, unlike discrete Shannon entropy, can be negative). The discussion highlights the subtleties of defining entropy for objects that are not ordinary probability density functions.
friend
I understand that the Dirac delta function can be taken as a distribution. And that one can calculate the Shannon entropy or information content of any distribution. So what is the information content of the Dirac delta function? I think it is probably identically zero, but I'd like to see the proof of it. I could not find anything specific on-line about this. So any help would be appreciated. Thanks.
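For reference, the quantity in question for a continuous density ##f## is the standard differential (continuous Shannon) entropy, which is what the replies below evaluate for approximations to the delta:

$$h(f) = -\int_{-\infty}^{\infty} f(x)\,\log f(x)\,\mathrm{d}x$$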
 
Those are two entirely different senses of "distribution". The Dirac delta is a distribution in the sense of generalized functions (the distributions of functional analysis), whereas Shannon entropy is defined for probability distributions.
 
I found this information on the Web from the book:
Ecosystem Ecology: A New Synthesis, edited by David G. Raffaelli and Christopher L. J. Frid, on page 46

He calculates the Shannon entropy of a Dirac delta function to be zero. Actually, he seems to be calculating it for the discrete Kronecker delta. I wonder how one would generalize this to the continuous case of the Dirac delta. Thanks.
 

Attachment: Entropy_of_delta.gif
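For the discrete case the book considers, a minimal Python sketch (my own, not taken from the book) shows that a distribution putting all of its probability mass on a single outcome, like the Kronecker delta, has Shannon entropy exactly zero:

```python
import numpy as np

# Discrete Shannon entropy H = sum_i p_i * log2(1 / p_i), with 0 * log(0) taken as 0.
def shannon_entropy(p):
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                                    # drop zero-probability outcomes
    return float(np.sum(nz * np.log2(1.0 / nz)))     # entropy in bits

# A Kronecker-delta-like distribution: all probability mass on one outcome.
print(shannon_entropy([0, 0, 1, 0, 0]))   # 0.0 -- a certain outcome carries no information
# For comparison, a uniform distribution over five outcomes:
print(shannon_entropy([0.2] * 5))         # log2(5), about 2.32 bits
```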
The Dirac delta is not a function, so it does not have a well-defined entropy. (If you use the convention P(X < x) rather than P(X <= x), the delta does have a corresponding cumulative distribution - P(X < x) = 0 for x <= 0 and P(X < x) = 1 for x > 0 - but this distribution is not generated by a probability density function.) However, you could define its entropy as the limit of the entropies of successive approximations to the Dirac delta. These can be taken to be increasingly narrower and higher bell curves or rectangles of constant unit area. It's easy to show that the limit is zero - the limit is in effect x*log(x) as x goes to 0.
 
Preno said:
... However, you could define its entropy as the limit of the entropies of successive approximations to the Dirac delta. These can be taken to be increasingly narrower and higher bell curves or rectangles of constant unit area. It's easy to show that the limit is zero - the limit is in effect x*log(x) as x goes to 0.

So if we take the limit after we calculate the entropy of each approximation, it goes to zero, right? If it's not too much trouble, could you show me the math for this situation? Thanks.
 
Well, as I said, the Dirac delta distribution can be approximated, for example, by a sequence of rectangles ##R_n## centered at zero, of width ##1/n## and height ##n##.

$$H_n = - \int_{-1/(2n)}^{1/(2n)} R_n(x) \log\big(R_n(x)\big)\,\mathrm{d}x = - \frac{1}{n} \cdot n \log(n) = -\log(n)$$

So actually you get ##-\infty## rather than 0 as ##n \to \infty##. (But that's okay, because differential entropy can be negative.)
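As a quick numerical check (a sketch in Python, using natural logarithms so the entropies are in nats, and using the standard closed form ##\tfrac12\log(2\pi e\sigma^2)## for a Gaussian of standard deviation ##\sigma## as the bell-curve comparison), both families of approximations diverge to ##-\infty## as they narrow:

```python
import numpy as np

# Rectangle approximation R_n (width 1/n, height n):
#   H_n = -(1/n) * n * log(n) = -log(n).
# Gaussian approximation with sigma = 1/n:
#   H = 0.5 * log(2 * pi * e * sigma^2), which also diverges to -infinity as sigma -> 0.
for n in [1, 10, 100, 1000, 10**6]:
    H_rect = -np.log(n)
    sigma = 1.0 / n
    H_gauss = 0.5 * np.log(2 * np.pi * np.e * sigma**2)
    print(f"n = {n:>7}   rectangle H_n = {H_rect:9.3f}   Gaussian h = {H_gauss:9.3f}")
```

Both columns track ##-\log(n)## up to a constant, so the choice of approximating family does not rescue a finite limit.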
 
