Taking the first derivative of a function with improper integral

Main Question or Discussion Point

Hello--

I have a function:

$$u(t,\tau)=\frac{1}{\pi}\int_{0}^{\infty}\! G(\omega)\, d\omega$$

$$G(\omega)=4\sqrt{\pi}\frac{\omega^{2}}{\omega_{0}^{3}}\exp\left(-\frac{\omega^{2}}{\omega_{0}^{2}}\right)\cos\left(\omega t-\left(\frac{\omega}{\omega_{0}}\right)^{-\gamma}\omega\tau\right)\exp\left(-\frac{1}{2Q}\left(\frac{\omega}{\omega_{0}}\right)^{-\gamma}\omega t\right)$$

Now what I would like to do is take the first time derivative of $$u(t,\tau)$$ and find the roots of $$\partial u(t,\tau) / \partial t = 0$$.

How would I get started? I think that I need to somehow get rid of the improper integral so that I can then take the first derivative. I've noticed that for $$\omega \rightarrow \infty$$, $$G(\omega) \rightarrow 0$$.
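As a starting point, the improper integral can at least be evaluated numerically: the Gaussian factor $$\exp(-\omega^2/\omega_0^2)$$ kills the tail, so truncating at a few multiples of $$\omega_0$$ is harmless. A minimal Python sketch, using only the standard library and illustrative parameter values (the specific $$\omega_0$$, $$Q$$, $$\gamma$$ below are assumptions, not values from this thread):

```python
import math

# Illustrative parameters (assumptions, not from the problem statement):
OMEGA0 = 2.0 * math.pi * 30.0   # dominant angular frequency
Q = 100.0                        # quality factor (thread says 25 <= Q <= 400)
GAMMA = 1.0 / (math.pi * Q)      # one choice of gamma mentioned later

def G(w, t, tau):
    """Integrand G(omega); the w -> 0 limit is 0 because of the w**2 factor."""
    if w == 0.0:
        return 0.0
    r = (w / OMEGA0) ** (-GAMMA)
    return (4.0 * math.sqrt(math.pi) * w**2 / OMEGA0**3
            * math.exp(-(w / OMEGA0)**2)
            * math.cos(w * t - r * w * tau)
            * math.exp(-r * w * t / (2.0 * Q)))

def u(t, tau, w_max_factor=8.0, n=4000):
    """(1/pi) * int_0^inf G dw by the trapezoid rule, truncated at
    w_max_factor * OMEGA0; the Gaussian decay makes the discarded tail
    negligible."""
    w_max = w_max_factor * OMEGA0
    h = w_max / n
    s = 0.5 * (G(0.0, t, tau) + G(w_max, t, tau))
    for i in range(1, n):
        s += G(i * h, t, tau)
    return s * h / math.pi

# Sanity check: at t = tau = 0 the cosine and damping factors are both 1,
# and the remaining Gaussian integral evaluates to exactly u(0, 0) = 1.
```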

Answers and Replies

Once you've demonstrated convergence of the improper integral, you could probably differentiate under the integral sign, i.e., switch the order of differentiation and integration. I don't remember the precise conditions, but I do know that for distributions they are pretty general. If I recall correctly, it's OK to differentiate under the integral sign to check whether the resulting expression is a continuous linear functional, because then you can conclude that the original expression was indeed a distribution, and that would justify the interchange.

Be warned that I was exposed to a little bit of distribution theory before I learned anything beyond basic analysis, so perhaps someone more knowledgeable on the topic could give better/correct suggestions.

That sounds good, snipez90 - I'll have to investigate. Thank you very much for your response.

The only thing that is slightly perplexing is showing convergence of the integral of $$G(\omega)$$. How should I begin?

Well, for a start I think that

$$4\sqrt{\pi}\frac{\omega^{2}}{\omega_{0}^{3}} \rightarrow \infty$$

as $$\omega \rightarrow \infty$$ and

$$\cos\left(\omega t-\left(\frac{\omega}{\omega_{0}}\right)^{-\gamma}\omega\tau\right)$$

will oscillate as $$\omega \rightarrow \infty$$ since we are dealing with a cosine function, and

$$\exp\left(-\frac{1}{2Q}\left(\frac{\omega}{\omega_{0}}\right)^{-\gamma}\omega t\right) \rightarrow 0$$

as $$\omega \rightarrow \infty$$, for choices of $$\gamma, Q, \omega_0, t$$ particular to my problem. This may suggest that $$G(\omega) \rightarrow 0$$ as $$\omega \rightarrow \infty$$.

Now, how would I analytically perform the integration of $$G(\omega)$$ so that I can find $$u(t,\tau)$$? I understand that the integration could be done before or after the differentiation with respect to time.

Do you have the specific bounds on say, gamma? For instance, as $\omega \rightarrow \infty$ the argument within the exponential goes to zero if $\gamma > 1.$

Thank you for your response, snipez90.

Yes, often $$\gamma > 1$$, but I also think that there's a possibility of $$\gamma \approx \frac{1}{\pi Q}$$, where $$Q$$ is a real number.

Moreover, $$25 \leq Q \leq 400$$.

All right, well I guess if $\gamma \leq 1$ it doesn't really matter since you'll still get convergence in that last exponential product of the integrand. Moreover, since exponential growth dominates polynomial growth,

$$\frac{\omega^{2}}{\omega_{0}^{3}}\cdot\exp\left(-\frac{\omega^{2}}{\omega_{0}^{2}}\right) \rightarrow 0 \hspace{2mm} \mbox{as} \hspace{2mm} \omega \rightarrow \infty,$$

so convergence of $G(\omega)$ certainly doesn't seem like a problem.
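The domination argument can be made concrete: since $$|\cos| \leq 1$$ and the damping exponential is $$\leq 1$$ for $$t \geq 0$$, we have $$|G(\omega)| \leq 4\sqrt{\pi}\,\frac{\omega^2}{\omega_0^3}\exp(-\omega^2/\omega_0^2)$$, an envelope independent of $$t, \tau, \gamma, Q$$ that integrates to exactly $$\pi$$. A quick numerical sketch (Python, taking $$\omega_0 = 1$$ since the bound just rescales):

```python
import math

OMEGA0 = 1.0  # the envelope's integral is the same for any omega0 after rescaling

def envelope(w):
    """Pointwise bound on |G(w)| for t >= 0: |cos| <= 1 and the damping
    exponential is <= 1, so only the Gaussian prefactor survives."""
    return 4.0 * math.sqrt(math.pi) * w**2 / OMEGA0**3 * math.exp(-(w / OMEGA0)**2)

# With x = w/omega0:  int_0^inf 4*sqrt(pi) x^2 exp(-x^2) dx
#   = 4*sqrt(pi) * (sqrt(pi)/4) = pi,
# so the envelope is integrable and |u(t, tau)| <= (1/pi)*pi = 1.
h, n = 0.002, 5000  # midpoint rule on [0, 10]; tail beyond 10 is ~exp(-100)
approx = sum(envelope((i + 0.5) * h) for i in range(n)) * h
```

So dominated convergence applies, confirming that the improper integral converges absolutely for every $$t \geq 0$$.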

Thank you once again for your response!

I agree that convergence of the integral of $$G(\omega)$$ doesn't seem to be a problem. Now I wonder whether it would be possible to eliminate the improper integral before or after the differentiation.

Although the improper integral can be dealt with numerically, could I get rid of it analytically? Or perhaps my best bet would be to use a series expansion to approximate the improper integral?

If that integral can be dealt with analytically, I don't really see where to begin: omega appears everywhere, and we have a product of a quadratic, a cosine, and two exponential functions. Ignoring what I said earlier about distributions for now, differentiating under the integral sign (i.e., differentiating the integrand first) with respect to the variable of your choice (I guess here it's t) can be justified if the partial derivative of the integrand with respect to t exists and you can find an integrable dominating function H such that

$$\left|\frac{\partial G}{\partial t}\right| \leq H.$$

(I left out some conditions regarding measurability/Lebesgue integrability of the functions in the integrand, since everything making up the integrand is continuous.)
These certainly aren't the most general conditions, but I have a feeling they suffice for this purpose. Out of curiosity, what exactly is the context of this calculation?
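Such a dominating function exists here: $$|\partial G/\partial t| \leq A(\omega)\,\bigl(\omega + \frac{r\omega}{2Q}\bigr)$$ with $$A(\omega)$$ the Gaussian prefactor and $$r = (\omega/\omega_0)^{-\gamma}$$, which behaves like $$\omega^{3-\gamma}$$ near 0 and decays like a Gaussian at infinity. A sketch that differentiates under the integral sign and cross-checks against a central finite difference of u (Python, with illustrative parameter values that are assumptions, not taken from the thread):

```python
import math

# Illustrative parameters (assumptions, not from the thread):
OMEGA0, Q, GAMMA = 1.0, 50.0, 0.5

def A(w):
    """Gaussian prefactor of the integrand."""
    return 4.0 * math.sqrt(math.pi) * w**2 / OMEGA0**3 * math.exp(-(w / OMEGA0)**2)

def G(w, t, tau):
    if w == 0.0:
        return 0.0
    r = (w / OMEGA0) ** (-GAMMA)
    return A(w) * math.cos(w * t - r * w * tau) * math.exp(-r * w * t / (2.0 * Q))

def dG_dt(w, t, tau):
    """Hand-computed partial of G with respect to t (product rule on cos * exp)."""
    if w == 0.0:
        return 0.0
    r = (w / OMEGA0) ** (-GAMMA)
    theta = w * t - r * w * tau
    damp = math.exp(-r * w * t / (2.0 * Q))
    return A(w) * damp * (-w * math.sin(theta) - r * w / (2.0 * Q) * math.cos(theta))

def integral(f, t, tau, w_max=8.0, n=4000):
    """Midpoint-rule approximation of (1/pi) * int_0^inf f dw."""
    h = w_max / n
    return sum(f((i + 0.5) * h, t, tau) for i in range(n)) * h / math.pi

# Differentiate-then-integrate vs. a central finite difference of u itself:
t, tau, eps = 1.5, 0.3, 1e-5
du_under_integral = integral(dG_dt, t, tau)
du_finite_diff = (integral(G, t + eps, tau) - integral(G, t - eps, tau)) / (2.0 * eps)
```

The two values agreeing to high precision is exactly what the dominated-convergence justification predicts.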

Yeah, that kind of confirms my own suspicion that the improper integral can't really be evaluated in closed form. Like you say, the best way to proceed is to find the partial derivative of the integrand.

$$u(t, \tau)$$ models an acoustic wavelet which has propagated through a lossy material. The speed and attenuation of the wavelet are dependent on the angular frequency $$\omega$$ and the $$Q$$ factor, which governs the attenuation.

By taking the first time derivative of $$u(t, \tau)$$, and finding the roots of $$\partial u(t,\tau) / \partial t = 0$$, I would like to find the width of the wavelet, which is the total time between the two side-lobes.
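Putting the pieces together, the extrema (and hence the side-lobe times) can be located by scanning $$\partial u/\partial t$$ for sign changes and bisecting. A sketch with illustrative parameters (taking $$\tau = 0$$, where in the undamped $$Q \to \infty$$ limit the wavelet reduces to the Ricker-type shape $$u(t) = (1 - t^2/2)\exp(-t^2/4)$$ with side-lobe troughs at $$t = \pm\sqrt{6}$$, which gives a sanity check):

```python
import math

# Illustrative parameters (assumptions, not from the thread):
OMEGA0, Q, GAMMA, TAU = 1.0, 50.0, 0.5, 0.0

def dG_dt(w, t):
    """Partial derivative of the integrand with respect to t."""
    if w == 0.0:
        return 0.0
    r = (w / OMEGA0) ** (-GAMMA)
    theta = w * t - r * w * TAU
    A = 4.0 * math.sqrt(math.pi) * w**2 / OMEGA0**3 * math.exp(-(w / OMEGA0)**2)
    damp = math.exp(-r * w * t / (2.0 * Q))
    return A * damp * (-w * math.sin(theta) - r * w / (2.0 * Q) * math.cos(theta))

def du_dt(t, w_max=8.0, n=2000):
    """du/dt by differentiating under the integral sign (midpoint rule)."""
    h = w_max / n
    return sum(dG_dt((i + 0.5) * h, t) for i in range(n)) * h / math.pi

def bisect(f, a, b, tol=1e-10):
    """Standard bisection; assumes f changes sign on [a, b]."""
    fa = f(a)
    while b - a > tol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fa * fm <= 0.0:
            b = m
        else:
            a, fa = m, fm
    return 0.5 * (a + b)

def extrema(t_lo=0.0, t_hi=6.0, steps=150):
    """Times where du/dt changes sign: the wavelet's peaks and troughs.
    The width would then be the gap between the outermost side-lobe times."""
    ts = [t_lo + i * (t_hi - t_lo) / steps for i in range(steps + 1)]
    vals = [du_dt(t) for t in ts]
    return [bisect(du_dt, ts[i], ts[i + 1])
            for i in range(steps) if vals[i] * vals[i + 1] < 0.0]
```

With weak damping (large Q) the first root lands near the undamped value $$\sqrt{6} \approx 2.449$$; for the two-sided width one would scan both sides of the main lobe with the actual problem parameters.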

Thanks snipez90.