
Previously I posted a question about the Dirac delta function and was told it is not a true function, but rather a distribution. However, I have to admit I still do not understand why its integral from negative infinity to positive infinity is unity. I've thought about this and came up with the following:

Consider a normal distribution in the limit as its variance goes to 0. Since the integral of a normal density is constrained to be unity, the peak height (the value of the density at the mean) must go to infinity as the variance approaches zero. This limiting family is the Dirac delta. Is this reasonable, or am I wandering in the wilderness?
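To make the limiting picture concrete: the family of normal densities

$$\delta_\sigma(x) = \frac{1}{\sigma\sqrt{2\pi}}\,e^{-x^2/(2\sigma^2)}$$

satisfies $\int_{-\infty}^{\infty}\delta_\sigma(x)\,dx = 1$ for every $\sigma > 0$, while the peak value $\delta_\sigma(0) = \frac{1}{\sigma\sqrt{2\pi}}$ diverges as $\sigma \to 0$. Here is a minimal numerical sketch of that behavior in Python (assuming NumPy and SciPy are available; the sigma values and grid resolution are arbitrary illustrative choices):

```python
import numpy as np
from scipy.stats import norm

# A "nascent delta": a Gaussian centered at x = 0 whose width sigma shrinks.
# Its integral stays 1 for every sigma, while its peak height
# 1 / (sigma * sqrt(2*pi)) blows up as sigma -> 0.
for sigma in [1.0, 0.1, 0.01, 0.001]:
    x = np.linspace(-10 * sigma, 10 * sigma, 200_001)  # grid spanning +/- 10 sigma
    pdf = norm.pdf(x, loc=0.0, scale=sigma)            # normal density with mean 0
    area = np.trapz(pdf, x)                            # trapezoidal estimate of the integral
    print(f"sigma={sigma:<6} peak={pdf.max():12.6g} area={area:.6f}")
```

The printed area stays at 1.000000 even as the peak grows by three orders of magnitude, which is the sense in which the shrinking Gaussians approximate the delta.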

