Is the Rate of Decay of I_λ as λ→0 equal to o(λ^2)?

  • Thread starter: Azupol123
  • Tags: Integral
Azupol123

Homework Statement


Let ##g:\mathbb{R}\to\mathbb{R}## be an infinitely differentiable function satisfying ##g(x)=0## for all ##|x|>1##.

Consider the integral ##I_\lambda = \int_{-\infty}^{\infty} g(x)\sin(\lambda x^2)\,dx##.

Prove that ##I_\lambda \to 0## and identify the rate of decay as ##\lambda \to 0##.


I have no idea how to start this. I thought maybe differentiation under the integral or using Bessel's inequality?
 
I can get you started. First, since g(x) = 0 for |x| > 1, you are integrating only from -1 to 1. We also see that as ##\lambda \rightarrow 0##, ##\sin(\lambda x^2) \rightarrow 0##. If g(x) is infinitely differentiable, can you show it is bounded on [-1, 1]? And then what happens to that integral as ##\lambda \rightarrow 0##?
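If it helps to see the limit numerically, here is a small Python sketch (just an illustration, not part of the proof). The particular g used, the standard smooth bump ##e^{-1/(1-x^2)}## on (-1, 1) extended by 0, is my own assumption; any concrete ##C^\infty## function with that support would do.

```python
# Minimal numerical sketch (illustration only, not part of the proof).
# Assumption: g is the standard smooth bump exp(-1/(1 - x^2)) on (-1, 1),
# extended by 0 elsewhere.
import numpy as np
from scipy.integrate import quad

def g(x):
    return np.exp(-1.0 / (1.0 - x**2)) if abs(x) < 1 else 0.0

def I(lam):
    # g vanishes for |x| > 1, so it is enough to integrate over [-1, 1]
    val, _ = quad(lambda x: g(x) * np.sin(lam * x**2), -1.0, 1.0)
    return val

for lam in [1.0, 0.1, 0.01, 0.001]:
    print(f"lambda = {lam:6g}   I_lambda ~ {I(lam):.3e}")
```

The printed values shrink toward 0 with ##\lambda##, which is the behaviour we are trying to prove.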
 
Alright, I think I see where you're going. I can show it's bounded by setting ##m = \min g## on [-1, 1] and ##M = \max g## on [-1, 1], so that ##m \cdot I_\lambda < I_\lambda < M \cdot I_\lambda##.

It seems like I'd do a series expansion about ##\lambda = \infty##, but I have no idea what the expansion would look like due to the nature of g.
 
Azupol123 said:
Alright, I think I see where you're going. I can show it's bounded by setting ##m = \min g## on [-1, 1] and ##M = \max g## on [-1, 1], so that ##m \cdot I_\lambda < I_\lambda < M \cdot I_\lambda##.

It seems like I'd do a series expansion about ##\lambda = \infty##, but I have no idea what the expansion would look like due to the nature of g.

Your bounds are correct, but not quite to the point here. The boundedness of g tells us that as ##\sin(\lambda x^2)## sinks down to 0, g won't pull it back up and ruin our limit.

Also, you can't just say g is bounded; you have to give a reason. What is it?

In terms of the decay, I wasn't sure what derivative we need to look at (not much of a physicist). I suppose it is dλ/d?, but I don't know what the ? would be. Do you know?
 
Azupol123 said:
That makes sense. g is bounded because any infinitely differentiable function over an interval is analytic, and then the following holds: http://en.wikipedia.org/wiki/Analytic_function#Alternative_characterizations (at least I think that's the reason).

As far as the decay goes, I don't know what the ? would be. This is for a PDE course, and that's all I was given as a question.

Your reason for g being bounded is correct, but it's overkill, plus there is a mistake. It is enough to say g is differentiable on the interval, thus continuous, and any continuous function on a closed interval is bounded.

If g is in ##C^\infty##, that does not mean it is analytic. The standard example is g(x) = 0 for x ≤ 0; g(x) = ##e^{-1/x^2}## for x > 0. You can check that g is infinitely differentiable at x = 0 and that all its derivatives are 0 there. Thus its Taylor's series is identically zero, which converges fine to g for x ##\le## 0 but is nowhere near the mark for x > 0.

For a function to be analytic at a point, its Taylor's series must converge to it within some interval around that point. For real analytic functions that is the definition.
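If you want to convince yourself of that, here is a small sympy check (again just an illustration): the first few derivatives of ##e^{-1/x^2}## all tend to 0 as ##x \to 0^+##, so the glued function has an identically zero Taylor's series at 0.

```python
# Symbolic check (illustration only): every derivative of exp(-1/x^2) shown
# here tends to 0 as x -> 0+, so gluing it to 0 for x <= 0 gives a C^infinity
# function whose Taylor series at 0 is identically zero.
import sympy as sp

x = sp.symbols('x', positive=True)
expr = sp.exp(-1 / x**2)

for k in range(5):
    print(f"limit of f^({k})(x) as x -> 0+ :", sp.limit(expr, x, 0, dir='+'))
    expr = sp.diff(expr, x)
```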

Re the decay, I would think that is with respect to time, but I don't see a t in the problem. Could you get this clarified?
 
Hey, sorry for the late response, but thanks for elucidating analytic functions. As far as the decay is concerned, I can't get it clarified till tomorrow, but I really appreciate your help!
 
Alright, so I think I got it. I can bound the integral with the following:

##m \cdot I_\lambda < I_\lambda < M \cdot I_\lambda##

Now, the period of ##\sin(\lambda x^2)## is ##2\pi/(\lambda x)##, which goes to zero as ##\lambda \to \infty##. So I think the decay rate would be ##1/(\lambda x)##?
 
back to you later today
 
Azupol123 said:
Alright, so I think I got it. I can bound the integral with the following:

##m \cdot I_\lambda < I_\lambda < M \cdot I_\lambda##

Now, the period of ##\sin(\lambda x^2)## is ##2\pi/(\lambda x)##, which goes to zero as ##\lambda \to \infty##. So I think the decay rate would be ##1/(\lambda x)##?

The period of ##\sin(\lambda x^2)## is ##\sqrt{\pi}/\lambda##.

Continuing with the problem: We have shown that ##I_\lambda \rightarrow 0## as ##\lambda \rightarrow 0##. From this point I am guessing, so don't quote me; but you can ask if this is the right kind of thing:

By decay I think he means how fast does ##I_\lambda \rightarrow 0##? So we are looking at ##dI/d\lambda##. That can be computed by differentiating under the integral sign, which gives

##\int_{-1}^1 g(x)\,x^2 \cos(\lambda x^2)\,dx##. Since ##\cos(u) \approx 1 - u^2/2##, the integral is approximately

##\int_{-1}^1 g(x)\,x^2\,[1 - (\lambda x^2)^2/2]\,dx = \int_{-1}^1 g(x)\,x^2\,dx - \frac{1}{2}\lambda^2 \int_{-1}^1 g(x)\,x^6\,dx##. Both integrals are constants with respect to ##\lambda##, so we can write this last expression as ##A - B\lambda^2##. This would be the kind of answer that is needed: we would say the decay is ##o(\lambda^2)##, where the "o" stands for order.
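Here is a small numerical sketch of that expansion (again just an illustration; it reuses the same assumed bump g as in the earlier snippet). It compares ##dI/d\lambda = \int_{-1}^1 g(x)\,x^2\cos(\lambda x^2)\,dx## with ##A - B\lambda^2##:

```python
# Numerical sketch (illustration only) of dI/dlambda ~ A - B*lambda^2,
# using the same assumed bump g as in the earlier snippet.
import numpy as np
from scipy.integrate import quad

def g(x):
    return np.exp(-1.0 / (1.0 - x**2)) if abs(x) < 1 else 0.0

A = quad(lambda x: g(x) * x**2, -1.0, 1.0)[0]          # \int g(x) x^2 dx
B = 0.5 * quad(lambda x: g(x) * x**6, -1.0, 1.0)[0]    # (1/2) \int g(x) x^6 dx

for lam in [0.5, 0.1, 0.01]:
    dI = quad(lambda x: g(x) * x**2 * np.cos(lam * x**2), -1.0, 1.0)[0]
    print(f"lambda = {lam:4g}   dI/dlambda ~ {dI:.6f}   A - B*lambda^2 ~ {A - B * lam**2:.6f}")
```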

Order is a measure of the general size of something. In the case of convergence, we consider it to be a power of whatever variable is going to 0. The particular constant factors the variable may be multiplied by are irrelevant. In this case the variable is ##\lambda##; if you are defining a derivative, it is usually h. When we are considering comparative bigness, we usually think of it as a power of 10, as in "your financial calculation is off by an order of magnitude".
 