# Homework Help: Delta function

1. Mar 5, 2009

### psid

1. The problem statement, all variables and given/known data

How does one prove that $$\int^\infty_{-\infty}\lim_{\epsilon \rightarrow 0}(1/\pi)\frac{\epsilon g(x)}{(x-a)^{2}+\epsilon^{2}}dx=g(a)$$?

2. Mar 5, 2009

### George Jones

Staff Emeritus
One doesn't, since the result isn't true. The limit has to be outside the integral sign.

Mathematical Physics by Butkov has a nice proof on pages 238-239. The idea is, for fixed $\delta > 0$, to write

$$\lim_{\epsilon \rightarrow 0} \int_{-\infty}^{\infty} = \lim_{\epsilon \rightarrow 0} \int_{-\infty}^{a-\delta} + \lim_{\epsilon \rightarrow 0} \int_{a-\delta}^{a+\delta} + \lim_{\epsilon \rightarrow 0} \int_{a+\delta}^{\infty},$$

and then to assume $g$ is bounded, say $|g(x)| \leq M$, to show that the first and last terms go to zero. For instance,

$$\left| \frac{1}{\pi} \int_{a+\delta}^{\infty} \frac{\epsilon\, g(x)}{(x-a)^{2}+\epsilon^{2}}\,dx \right| \leq \frac{M}{\pi} \left( \frac{\pi}{2} - \arctan\frac{\delta}{\epsilon} \right) \rightarrow 0 \quad \text{as } \epsilon \rightarrow 0.$$

For $\delta$ small and $g$ continuous, $g(x)$ is approximately equal to the constant value $g(a)$ over the middle interval, so pull this outside of the middle integral, or, more rigorously, use the mean value theorem for integrals. The remaining integral of the kernel over $[a-\delta, a+\delta]$ is $\frac{2}{\pi} \arctan\frac{\delta}{\epsilon}$, which tends to $1$ as $\epsilon \rightarrow 0$, so letting $\delta \rightarrow 0$ afterwards gives $g(a)$.
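As a quick numerical sanity check of the limit (this snippet is not from the thread; `lorentzian_average` is a hypothetical helper, and $g(x) = \cos x$, $a = 0.7$ are arbitrary choices for a bounded continuous $g$):

```python
import numpy as np

def lorentzian_average(g, a, eps, half_width=200.0, n=1_000_001):
    """Approximate (1/pi) * integral of eps*g(x) / ((x-a)^2 + eps^2)
    by a Riemann sum on a truncated, uniform grid."""
    x = np.linspace(a - half_width, a + half_width, n)
    dx = x[1] - x[0]
    kernel = (eps / np.pi) / ((x - a) ** 2 + eps ** 2)
    return float(np.sum(kernel * g(x)) * dx)

# As eps shrinks, the value approaches g(a) = cos(0.7) ~ 0.7648
for eps in (1.0, 0.1, 0.01):
    print(eps, lorentzian_average(np.cos, 0.7, eps))
```

The truncation at `half_width` is harmless here because the kernel decays like $1/x^{2}$; the error from the discarded tails is of order $\epsilon / \text{half\_width}$.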

3. Mar 6, 2009

### psid

Thanks, got it now. The limit was indeed supposed to be outside the integral sign.
