# Properly show this limit to be true

1. Oct 10, 2013

### dipole

1. The problem statement, all variables and given/known data

$g(x)$ is a function with a discontinuity at $x_0$ s.t.,

$$\Delta g_0 = \lim_{ \epsilon \to 0} (g(x_0 + \epsilon) - g(x_0 - \epsilon) )$$

3. The attempt at a solution

I'd like to show that the following limit,

$$\lim_{\epsilon \to 0} \int_{x_0-\epsilon}^{x_0+\epsilon}g'(x)\varphi(x)dx = \Delta g_0 \varphi(x_0)$$

where $\varphi(x)$ is some smooth test function that vanishes at $\infty$.

Intuitively I know this makes sense, but I'm having trouble showing it formally - any ideas/tips/advice?

edit: corrected mistake in original post.

Last edited: Oct 10, 2013
2. Oct 10, 2013

### I like Serena

Hi dipole!

Let's see...

Suppose we pick for g(x) the step function H(x) which is 0 for negative x and 1 for positive x.
And let's pick $x_0=0$ and $\varphi(x)=x+1$.

Then your integral evaluates to:
$$\lim_{\varepsilon \downarrow 0} \int_{-\varepsilon}^{+\varepsilon} H(x)\cdot 1 dx = \lim_{\varepsilon \downarrow 0} \int_{0}^{+\varepsilon} 1 dx = \lim_{\varepsilon \downarrow 0} \varepsilon = 0$$

But $\Delta g_0 \phi(x_0) = 1 \cdot 1 = 1$.

I think your statement is not true.
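The step-function computation above can also be checked numerically; a quick sketch (the function names and grid size are my own choices, not from the thread):

```python
# Numerical check of the counterexample: g = H (0 for x <= 0, 1 for
# x > 0), x0 = 0, integrating H over [-eps, eps] as in the post.

def H(x):
    return 1.0 if x > 0 else 0.0

def step_integral(eps, n=100000):
    # midpoint Riemann sum of H(x) over [-eps, eps]
    h = 2 * eps / n
    return sum(H(-eps + (i + 0.5) * h) for i in range(n)) * h

for eps in (1e-1, 1e-2, 1e-3):
    print(eps, step_integral(eps))  # value is ~eps, so the limit is 0
```

The integral shrinks proportionally to $\varepsilon$, confirming the limit of $0$ rather than $1$.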

3. Oct 10, 2013

### BruceW

I was going to say something similar. Even if we choose a function $\varphi$ that vanishes at infinity, I think the statement still will not be true... Maybe you accidentally wrote the question down wrong?

4. Oct 10, 2013

### I like Serena

Good point. I forgot that $\varphi$ should vanish at infinity.
Let me rectify that. Let's pick:
$$\varphi(x)=\begin{cases}x+1 & \text{if } -1 < x < 1 \\ 0 & \text{otherwise}\end{cases}$$

5. Oct 10, 2013

### BruceW

It needs to be smooth too. I was thinking of something like the sinc function.

6. Oct 10, 2013

### Office_Shredder

Staff Emeritus
I don't see what the global properties of $\varphi$ have to do with this. If it really bothers you, it's possible to smooth down $\varphi$ from I like Serena's last post without changing its values anywhere except arbitrarily close to 1 and -1.

Despite the arbitrary requirement on $\varphi$ at infinity in the OP, the statement only depends on $\varphi$ arbitrarily close to $x_0$. Whether it's smooth away from $x_0$, or vanishes, or blows up, or whatever, changes neither the left-hand side nor the right-hand side.

I believe a correct statement would be that
$$\lim_{\epsilon \to 0} \frac{1}{2\epsilon} \int_{x_0-\epsilon}^{x_0+\epsilon} g(x) \varphi(x) dx = \Delta g_0 \varphi(x_0)$$

Last edited: Oct 10, 2013
7. Oct 10, 2013

### I like Serena

Heh. Yeah. You're right.
Let me pick yet another function that has the same characteristics as the one I originally picked:
$$\varphi(x)=\frac{x+1}{x^2+1}$$

8. Oct 10, 2013

### I like Serena

True enough.

Hmm, let's make the same choices I already did.
And let's pick $\varphi(x)=x+1$ in the neighbourhood of 0, and such that it satisfies the criteria further away from 0.

Then... it still doesn't work out!

9. Oct 10, 2013

### I like Serena

Okay. Let me give it a try.
$$\lim_{\varepsilon \to 0} \int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)dx = \Delta g_0 \varphi(x_0)$$

10. Oct 10, 2013

### BruceW

uhhh... dipole's original equation is now starting to make sense to me... I'll probably come around again though hehe

11. Oct 10, 2013

### pasmith

This cannot possibly be true. Suppose it were true for some function $\varphi$. What happens when I look at $\varphi + \theta$ for some suitable smooth function $\theta$ which is constant and non-zero on some neighbourhood of $x_0$? The limit on the left, if it exists, should be the same as before: Once $\epsilon$ is sufficiently small I have $\theta' = 0$ on $[x_0 -\epsilon, x_0 + \epsilon]$. Hence I'm integrating the same function over the same domain, so I should get the same result. But the right hand side has suddenly jumped from $\Delta g_0 \varphi(x_0)$ to $\Delta g_0 (\varphi(x_0) + \theta(x_0))$. Plainly this is impossible.

Last edited: Oct 10, 2013
12. Oct 10, 2013

### BruceW

Your one makes sense to me now. Maybe this is what dipole meant to write as the question.

13. Oct 10, 2013

### dipole

Yes, this is correct. I arrived at the original expression by integrating the one you've written by parts (over an infinite interval), throwing away the boundary term (which is why I included the fact that $\varphi(\infty) = 0$), and then isolating the remaining discontinuity in what was left. However, that is clearly wrong. :P

Sorry about the confusion. However, now I'd like to show formally what you've posted is correct. Clearly it should be, because in the region of the discontinuity the derivative $g'(x)$ is going to behave like a $\delta (x)$, which is ultimately what I'm trying to derive here.

Any tips now that this is sorted out a bit more? :)

14. Oct 10, 2013

### I like Serena

Apparently he did. :)

Well, it's not an easy one to prove mathematically.

But from my physics background it's obvious.

$$\int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)dx = \int_{x_0-\varepsilon}^{x_0+\varepsilon} \frac{dg}{dx}\varphi(x)dx = \int dg \varphi(x) \to \Delta g_0 \varphi(x_0)$$
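The heuristic above can be checked numerically by replacing the jump in $g$ with a steep smooth ramp; a sketch (the smoothing width and grid size are my own choices, not from the thread):

```python
import math

# Replace the jump by g(x) = (1 + tanh(x/delta))/2 with delta << eps,
# so g'(x) is a tall narrow bump.  phi is the test function picked
# earlier in the thread; here Delta g_0 = 1 and phi(0) = 1, so the
# integral of g'(x)*phi(x) over [-eps, eps] should approach 1.

def phi(x):
    return (x + 1) / (x * x + 1)

def g_prime(x, delta):
    # derivative of (1 + tanh(x/delta)) / 2
    t = math.tanh(x / delta)
    return (1 - t * t) / (2 * delta)

def jump_integral(eps, delta, n=200000):
    # midpoint Riemann sum of g'(x) * phi(x) over [-eps, eps]
    h = 2 * eps / n
    total = 0.0
    for i in range(n):
        x = -eps + (i + 0.5) * h
        total += g_prime(x, delta) * phi(x)
    return total * h

print(jump_integral(1e-2, 1e-4))  # close to Delta g_0 * phi(0) = 1
```

With the ramp much narrower than the integration window, the result is $\Delta g_0\,\varphi(x_0)$ to high accuracy, matching the $\int dg\,\varphi(x)$ argument.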

15. Oct 10, 2013

### dipole

Right, intuitively when we get close to $x_0$, the function $\varphi(x)$ is essentially constant over the interval, and we can take it outside the integral, and the result follows from the first equation I wrote.

That's the physicist in me speaking too - but yeah, I would like to show it rigorously if it can be done in half a page or so. :)
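In symbols, this intuition can be sketched as follows (a sketch only; writing $\{g'\}$ for the pointwise derivative away from $x_0$, assumed bounded near $x_0$ - notation not used earlier in the thread). The distributional derivative of a function with a jump splits as

$$g'(x) = \{g'\}(x) + \Delta g_0\,\delta(x - x_0),$$

so

$$\int_{x_0-\epsilon}^{x_0+\epsilon} g'(x)\varphi(x)\,dx = \int_{x_0-\epsilon}^{x_0+\epsilon} \{g'\}(x)\varphi(x)\,dx + \Delta g_0\,\varphi(x_0),$$

and the first term is bounded by $2\epsilon \sup|\{g'\}\varphi| \to 0$.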

16. Oct 10, 2013

### klondike

$$\int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)dx = g(x)\varphi(x)\Big\vert_{x_0-\varepsilon}^{x_0+\varepsilon}-\int_{x_0-\varepsilon}^{x_0+\varepsilon} g(x)\varphi'(x)dx,$$ and $\int_{x_0-\varepsilon}^{x_0+\varepsilon} g(x)\varphi'(x)dx=0$ if $\varphi$ is continuous. So $$\int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)dx =\Delta g_0 \varphi(x_0)$$
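One step left implicit here is why the boundary term gives $\Delta g_0 \varphi(x_0)$; a sketch, assuming the one-sided limits $g(x_0^\pm)$ exist and $\varphi$ is continuous at $x_0$:

$$g(x)\varphi(x)\Big\vert_{x_0-\varepsilon}^{x_0+\varepsilon} = g(x_0+\varepsilon)\varphi(x_0+\varepsilon) - g(x_0-\varepsilon)\varphi(x_0-\varepsilon) \xrightarrow{\ \varepsilon \to 0\ } \big(g(x_0^+) - g(x_0^-)\big)\varphi(x_0) = \Delta g_0\,\varphi(x_0).$$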

17. Oct 11, 2013

### dipole

That seems like it should be true, but it's not obvious to me why the second statement is true. Do you have any way to justify it?

18. Oct 11, 2013

### I like Serena

Yes. That looks good.

The second statement is true because $g(x)\varphi'(x)$ is bounded on the interval, so:
$$\left|\int_{x_0-\varepsilon}^{x_0+\varepsilon} g(x)\varphi'(x)dx \right| \le 2\varepsilon \cdot \sup |g(x)\varphi'(x)| \to 0$$

There is one problem though.
The original integral has an undefined value at $x_0$, since $g'(x_0)$ does not exist.
Luckily we can work around that by splitting the integral into the part below $x_0$ and the part above $x_0$.
The end result is the same.