Properly show this limit to be true

dipole

Homework Statement



##g(x)## is a function with a discontinuity at ##x_0## such that

$$\Delta g_0 = \lim_{\epsilon \to 0} \big( g(x_0 + \epsilon) - g(x_0 - \epsilon) \big)$$

The Attempt at a Solution



I'd like to show that the following limit holds:

$$\lim_{\epsilon \to 0} \int_{x_0-\epsilon}^{x_0+\epsilon} g'(x)\varphi(x)\,dx = \Delta g_0 \varphi(x_0)$$

where ##\varphi(x)## is some smooth test function that vanishes at ##\infty##.

Intuitively I know this makes sense, but I'm having trouble showing it formally. Any ideas/tips/advice?

edit: corrected mistake in original post.
 
dipole said:

Homework Statement



##g(x)## is a function with a discontinuity at ##x_0## such that

$$\Delta g_0 = \lim_{\epsilon \to 0} \big( g(x_0 + \epsilon) - g(x_0 - \epsilon) \big)$$


The Attempt at a Solution



I'd like to show that the following limit holds:

$$\lim_{\epsilon \to 0} \int_{x_0-\epsilon}^{x_0+\epsilon} g(x)\varphi'(x)\,dx = \Delta g_0 \varphi(x_0)$$

where ##\varphi(x)## is some smooth test function that vanishes at ##\infty##.

Intuitively I know this makes sense, but I'm having trouble showing it formally. Any ideas/tips/advice?

Hi dipole! :smile:

Let's see...

Suppose we pick for ##g(x)## the step function ##H(x)##, which is 0 for negative ##x## and 1 for positive ##x##.
And let's pick ##x_0=0## and ##\varphi(x)=x+1##.

Then your integral evaluates to:
$$\lim_{\varepsilon \downarrow 0} \int_{-\varepsilon}^{+\varepsilon} H(x)\cdot 1 dx = \lim_{\varepsilon \downarrow 0} \int_{0}^{+\varepsilon} 1 dx = \lim_{\varepsilon \downarrow 0} \varepsilon = 0$$

But ##\Delta g_0 \varphi(x_0) = 1 \cdot 1 = 1##.

I think your statement is not true.
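(A quick numerical sanity check of this counterexample, added for illustration; the script and its names are not from the thread, and it just approximates the integral with a Riemann sum.)

```python
import numpy as np

# Counterexample above: g = H (unit step at 0), phi(x) = x + 1, x0 = 0.
# The quoted claim's left-hand side is the integral of H(x)*phi'(x) over [-eps, eps].
def lhs(eps, n=100001):
    x = np.linspace(-eps, eps, n)
    step = (x > 0).astype(float)      # H(x)
    phi_prime = np.ones_like(x)       # d/dx (x + 1) = 1
    return np.sum(step * phi_prime) * (x[1] - x[0])   # simple Riemann sum

for eps in (1.0, 0.1, 0.01, 0.001):
    print(eps, lhs(eps))   # shrinks to 0, while Delta_g0 * phi(0) = 1 * 1 = 1
```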
 
I was going to say something similar. Even if we choose a function ##\varphi## that vanishes at infinity, I think the statement still will not be true... Maybe you accidentally wrote the question down wrong?
 
BruceW said:
I was going to say something similar. Even if we choose a function ##\varphi## that vanishes at infinity, I think the statement still will not be true... Maybe you accidentally wrote the question down wrong?

Good point. I forgot that ##\varphi## should vanish at infinity.
Let me rectify that. Let's pick:
$$\varphi(x)=\begin{cases}x+1 & \text{if } -1 < x < 1 \\ 0 & \text{otherwise}\end{cases}$$
 
It needs to be smooth too. I was thinking of something like the sinc function.
 
I don't see what the global properties of ##\varphi## have to do with this. If it really bothers you, it's possible to smooth down ##\varphi## from I Like Serena's last post without changing its values anywhere except arbitrarily close to 1 and -1.

Despite the arbitrary requirement on ##\varphi## at infinity in the OP, the statement only depends on ##\varphi## arbitrarily close to ##x_0##. Whether it's smooth away from ##x_0##, or vanishes or blows up or whatever, changes neither the left-hand side nor the right-hand side.

I believe a correct statement would be that
$$\lim_{\epsilon \to 0} \frac{1}{2\epsilon} \int_{x_0-\epsilon}^{x_0+\epsilon} g(x) \varphi(x)\,dx = \Delta g_0 \varphi(x_0)$$
 
BruceW said:
It needs to be smooth too. I was thinking of something like the sinc function.

Heh. Yeah. You're right.
Let me pick yet another function that has the same characteristics as the one I originally picked:
$$\varphi(x)=\frac{x+1}{x^2+1}$$
 
Office_Shredder said:
I don't see what the global properties of ##\varphi## have to do with this. If it really bothers you, it's possible to smooth down ##\varphi## from I Like Serena's last post without changing its values anywhere except arbitrarily close to 1 and -1.

Despite the arbitrary requirement on ##\varphi## at infinity in the OP, the statement only depends on ##\varphi## arbitrarily close to ##x_0##. Whether it's smooth away from ##x_0##, or vanishes or blows up or whatever, changes neither the left-hand side nor the right-hand side.

True enough.

I believe a correct statement would be that
$$\lim_{\epsilon \to 0} \frac{1}{2\epsilon} \int_{x_0-\epsilon}^{x_0+\epsilon} g(x) \varphi(x)\,dx = \Delta g_0 \varphi(x_0)$$

Hmm, let's make the same choices I already did.
And let's pick ##\varphi(x)=x+1## in the neighbourhood of 0, and such that it satisfies the criteria further away from 0.

Then... it still doesn't work out!
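(Editorial fill-in of the computation with the same choices, ##g = H##, ##x_0 = 0##, and ##\varphi(x) = x + 1## near ##0##:)

$$\lim_{\varepsilon \downarrow 0} \frac{1}{2\varepsilon}\int_{-\varepsilon}^{+\varepsilon} H(x)(x+1)\,dx
= \lim_{\varepsilon \downarrow 0} \frac{1}{2\varepsilon}\left(\varepsilon + \frac{\varepsilon^2}{2}\right)
= \frac{1}{2} \ne 1 = \Delta g_0\,\varphi(x_0)$$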
 
Okay. Let me give it a try.
How about:
$$\lim_{\varepsilon \to 0} \int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)dx = \Delta g_0 \varphi(x_0)$$
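(A numerical sketch of this corrected statement, added for illustration and not part of the thread: it mimics the jump with a smoothed step whose width is much smaller than the integration half-width, so that ##g'## acts like the spike the statement relies on; the function choices and parameter values are assumptions.)

```python
import numpy as np

# Smoothed stand-in for a unit jump at x0 = 0: g(x) = 0.5*(1 + tanh(x/delta)),
# so g'(x) = 0.5/(delta*cosh(x/delta)**2) and Delta_g0 is effectively 1.
# Test function: phi(x) = (x + 1)/(x**2 + 1), smooth and vanishing at infinity.
delta = 1e-4                       # jump smoothing width
eps = 1e-2                         # integration half-width (delta << eps)
x = np.linspace(-eps, eps, 200001)

g_prime = 0.5 / (delta * np.cosh(x / delta) ** 2)
phi = (x + 1.0) / (x ** 2 + 1.0)

lhs = np.sum(g_prime * phi) * (x[1] - x[0])   # Riemann sum for the integral
rhs = 1.0 * 1.0                               # Delta_g0 * phi(x0) = 1 * phi(0)

print(lhs, rhs)   # both close to 1
```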
 
  • #10
uhhh... dipole's original equation is now starting to make sense to me... I'll probably come around again though hehe
 
  • #11
dipole said:

Homework Statement



##g(x)## is a function with a discontinuity at ##x_0## such that

$$\Delta g_0 = \lim_{\epsilon \to 0} \big( g(x_0 + \epsilon) - g(x_0 - \epsilon) \big)$$


The Attempt at a Solution



I'd like to show that the following limit holds:

$$\lim_{\epsilon \to 0} \int_{x_0-\epsilon}^{x_0+\epsilon} g(x)\varphi'(x)\,dx = \Delta g_0 \varphi(x_0)$$

where ##\varphi(x)## is some smooth test function that vanishes at ##\infty##.

Intuitively I know this makes sense, but I'm having trouble showing it formally. Any ideas/tips/advice?

This cannot possibly be true. Suppose it were true for some function ##\varphi##. What happens when I look at ##\varphi + \theta## for some suitable smooth function ##\theta## which is constant and non-zero on some neighbourhood of ##x_0##? The limit on the left, if it exists, should be the same as before: once ##\epsilon## is sufficiently small I have ##\theta' = 0## on ##[x_0 - \epsilon, x_0 + \epsilon]##. Hence I'm integrating the same function over the same domain, so I should get the same result. But the right-hand side has suddenly jumped from ##\Delta g_0 \varphi(x_0)## to ##\Delta g_0 (\varphi(x_0) + \theta(x_0))##. Plainly this is impossible.
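(For a concrete choice of such a ##\theta##, an editorial example using the standard smooth cutoff built from ##\psi(t) = e^{-1/t}## for ##t > 0## and ##\psi(t) = 0## for ##t \le 0##:

$$\theta(x) = \frac{\psi\big(4 - (x-x_0)^2\big)}{\psi\big(4 - (x-x_0)^2\big) + \psi\big((x-x_0)^2 - 1\big)}$$

is smooth, equals ##1## for ##|x - x_0| \le 1##, and vanishes for ##|x - x_0| \ge 2##, so ##\varphi + \theta## is still a smooth test function vanishing at ##\infty## while ##\theta(x_0) = 1 \ne 0##.)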
 
  • #12
I like Serena said:
Okay. Let me give it a try.
How about:
$$\lim_{\varepsilon \to 0} \int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)dx = \Delta g_0 \varphi(x_0)$$
Yours makes sense to me now. Maybe this is what dipole meant to write as the question.
 
  • #13
I like Serena said:
Okay. Let me give it a try.
How about:
$$\lim_{\varepsilon \to 0} \int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)dx = \Delta g_0 \varphi(x_0)$$

Yes, this is correct. I arrived at the original expression I wrote by integrating the one you've written by parts (over an infinite interval), throwing away the boundary term (which is why I included the fact that ##\varphi(\infty) = 0##), and then isolating the remaining discontinuity in what was left. However, that is clearly wrong. :P

Sorry about the confusion. However, now I'd like to show formally that what you've posted is correct. Clearly it should be, because in the region of the discontinuity the derivative ##g'(x)## is going to behave like a ##\delta(x)##, which is ultimately what I'm trying to derive here.

Any tips now that this is sorted out a bit more? :)
 
  • #14
BruceW said:
Yours makes sense to me now. Maybe this is what dipole meant to write as the question.

Apparently he did. :)

dipole said:
Yes, this is correct. I arrived at the original expression I wrote by integrating the one you've written by parts (over an infinite interval), throwing away the boundary term (which is why I included the fact that ##\varphi(\infty) = 0##), and then isolating the remaining discontinuity in what was left. However, that is clearly wrong. :P

Sorry about the confusion. However, now I'd like to show formally that what you've posted is correct. Clearly it should be, because in the region of the discontinuity the derivative ##g'(x)## is going to behave like a ##\delta(x)##, which is ultimately what I'm trying to derive here.

Any tips now that this is sorted out a bit more? :)

Well, it's not an easy one to prove mathematically.

But from my physics background it's obvious.

$$\int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)dx
= \int_{x_0-\varepsilon}^{x_0+\varepsilon} \frac{dg}{dx}\varphi(x)dx
= \int dg \varphi(x)
\to \Delta g_0 \varphi(x_0)$$
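(An editorial aside: one way to make the middle step precise, assuming ##g## is of bounded variation near ##x_0## with its only discontinuity there, is to read it as a Riemann–Stieltjes integral,

$$\int_{x_0-\varepsilon}^{x_0+\varepsilon} \varphi(x)\,dg(x) = \Delta g_0\,\varphi(x_0) + \int_{x_0-\varepsilon}^{x_0+\varepsilon} \varphi(x)\,dg_c(x),$$

where ##g_c## is the continuous part of ##g##; the last integral is bounded by ##\sup|\varphi|## times the total variation of ##g_c## on ##[x_0-\varepsilon, x_0+\varepsilon]##, which tends to ##0##.)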
 
  • #15
Right: intuitively, as we get close to ##x_0##, the function ##\varphi(x)## is essentially constant over the interval, so we can take it outside the integral, and the result follows from the first equation I wrote.

That's the physicist in me speaking too, but yeah, I would like to show it rigorously if it can be done in half a page or so. :)
 
  • #16
How about this:
$$\int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)\,dx
= g(x)\varphi(x)\Big\vert_{x_0-\varepsilon}^{x_0+\varepsilon}-\int_{x_0-\varepsilon}^{x_0+\varepsilon} g(x)\varphi'(x)\,dx$$ and $$\int_{x_0-\varepsilon}^{x_0+\varepsilon} g(x)\varphi'(x)\,dx \to 0$$ as ##\varepsilon \to 0## if ##\varphi## is continuous, so $$\lim_{\varepsilon \to 0}\int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)\,dx
=\Delta g_0 \varphi(x_0)$$

dipole said:
Right: intuitively, as we get close to ##x_0##, the function ##\varphi(x)## is essentially constant over the interval, so we can take it outside the integral, and the result follows from the first equation I wrote.

That's the physicist in me speaking too, but yeah, I would like to show it rigorously if it can be done in half a page or so. :)
 
  • #17
That seems like it should be true, but it's not obvious to me why the second statement is true. Do you have any way to justify it?
 
  • #18
dipole said:
That seems like it should be true, but it's not obvious to me why the second statement is true. Do you have any way to justify it?

Yes. That looks good.

The second statement is true because ##g(x)\varphi'(x)## is bounded on the interval, so:
$$\left|\int_{x_0-\varepsilon}^{x_0+\varepsilon} g(x)\varphi'(x)\,dx \right| \le 2\varepsilon \cdot \sup |g(x)\varphi'(x)| \to 0$$

There is one problem though.
The original integral has an undefined value at ##x_0##, since ##g'(x_0)## does not exist.
Luckily we can work around that by splitting the integral into the part below ##x_0## and the part above ##x_0##.
The end result is the same.
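(Putting the last few posts together, a compact version of the argument might read as follows; this is a sketch that reads the integral of ##g'## across the jump via the integration-by-parts identity, i.e. in the distributional sense, and assumes ##g## is bounded near ##x_0## with one-sided limits ##g(x_0^\pm)##.

$$\int_{x_0-\varepsilon}^{x_0+\varepsilon} g'(x)\varphi(x)\,dx
= \Big[g(x)\varphi(x)\Big]_{x_0-\varepsilon}^{x_0+\varepsilon}
- \int_{x_0-\varepsilon}^{x_0+\varepsilon} g(x)\varphi'(x)\,dx$$

The boundary term tends to ##\big(g(x_0^+) - g(x_0^-)\big)\varphi(x_0) = \Delta g_0\,\varphi(x_0)## because ##\varphi## is continuous at ##x_0##, while the remaining integral is bounded in absolute value by ##2\varepsilon \sup|g\,\varphi'| \to 0##.)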
 