Exchanging derivative and improper integral

Galileo
I was wondering. When is the following legal?

\frac{\partial}{\partial y}\int_{-\infty}^{+\infty}f(x,y)dx=\int_{-\infty}^{+\infty}\frac{\partial f(x,y)}{\partial y}dx

I know the rule when the limits of integration are bounded, but here there are four limits involved: one for the derivative, one for the integral itself, and two for the limits of integration, which go to -infinity and +infinity.
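For a concrete, well-behaved integrand the two sides can at least be compared symbolically. A minimal sketch assuming sympy, with f(x,y) = exp(-y x^2) and y > 0 as a test case of my own (it only illustrates the identity for one f, not when it is legal in general):

import sympy as sp

x = sp.symbols('x', real=True)
y = sp.symbols('y', positive=True)
f = sp.exp(-y * x**2)   # a test integrand that decays fast enough in x

# Left-hand side: integrate over the whole real line first, then differentiate.
lhs = sp.diff(sp.integrate(f, (x, -sp.oo, sp.oo)), y)

# Right-hand side: differentiate under the integral sign, then integrate.
rhs = sp.integrate(sp.diff(f, y), (x, -sp.oo, sp.oo))

print(sp.simplify(lhs - rhs))   # 0, so both sides agree for this particular f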
 
As long as the integrals on both sides EXIST, the equation is true.

It is possible, for rather special f, for one of the integrals to exist while the other doesn't.
 
HallsofIvy said:
As long as the integrals on both sides EXIST, the equation is true.

It is possible, for rather special f, for one of the integrals to exist while the other doesn't.

May I ask for more?

It seems to me that the partial must be exponentially decaying for the integral to exist. However, are there other functions which make the integral converge but are "less severe" than exponentially decaying? Is 1/(x^2) exponentially decaying?

What constraints must be imposed on the function f(x,y) itself for the relation to hold? Can we just say that its derivative must behave as I suggested above, or is there some other criterion? Do I need to just review this too?
 
Well, f(x,y) = y·exp(-x^2) + 1/x, does that satisfy your question? The integral on the LHS would not exist, but the integral on the right would. And no, 1/x^2 doesn't decay exponentially; that is, (k^x)/(x^2) would not tend to a constant as x tends to infinity (for k larger than 1).
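A small sympy sketch of both points (my own check, not part of the post above): the y-derivative of this f is exp(-x^2), which is integrable over the whole line, while k^x/x^2 still blows up for any k > 1, so 1/x^2 decays more slowly than any exponential.

import sympy as sp

x = sp.symbols('x', real=True)

# The y-derivative of y*exp(-x^2) + 1/x is exp(-x^2); its improper integral converges:
print(sp.integrate(sp.exp(-x**2), (x, -sp.oo, sp.oo)))   # sqrt(pi)

# 1/x^2 is not exponentially decaying: k^x / x^2 -> oo for k > 1 (k = 2 here).
print(sp.limit(2**x / x**2, x, sp.oo))                    # oo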
 
saltydog said:
May I ask for more?

It seems to me that the partial must be exponentially decaying for the integral to exist.


Take f(x,y)=\frac{1}{x^{2}y^{2}+3} and see what happens.

saltydog said:
However, are there other functions which make the integral converge but are "less severe" than exponentially decaying?

I've just given you an example.


saltydog said:
Is 1/(x^2) exponentially decaying?

What do you think...? :rolleyes:

saltydog said:
What constraints must be imposed on the function f(x,y) itself for the relation to hold?

That the
\int_{-\infty}^{+\infty} f(x,y) dx

doesn't "blow up"...And that the function "f" should never "blow up" and the same with its first order derivatives...

saltydog said:
Do I need to just review this too?

No, you need to review the exponential function... :wink:

Daniel.
 
matt grime said:
Well, f(x,y) = y·exp(-x^2) + 1/x, does that satisfy your question? The integral on the LHS would not exist, but the integral on the right would. And no, 1/x^2 doesn't decay exponentially; that is, (k^x)/(x^2) would not tend to a constant as x tends to infinity (for k larger than 1).

But if the integral on the LHS doesn't exist, then you can't take the partial of it, right? So, assuming the LHS integral exists, under what conditions imposed on f(x,y) will the relation hold? I'll review.

Alright, I read Daniel's too. Thanks guys. I'll work your example through, as well as some others, and also review the definition of "exponential decay".

Salty
 
Well, for:

f(x,y)=\frac{1}{x^{2}y^{2}+3}

I proved by direct substitution:

\frac{\partial}{\partial y}\int_{-\infty}^{+\infty}f(x,y)dx=\int_{-\infty}^{+\infty}\frac{\partial f(x,y)}{\partial y}dx

It wasn't easy for me. I ain't proud.

Salty
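For what it's worth, the same computation goes through symbolically. A small sympy cross-check of the example above (my own sketch, taking y > 0 so the improper integrals are straightforward):

import sympy as sp

x = sp.symbols('x', real=True)
y = sp.symbols('y', positive=True)
f = 1 / (x**2 * y**2 + 3)

F = sp.integrate(f, (x, -sp.oo, sp.oo))                  # pi/(sqrt(3)*y) for y > 0
lhs = sp.diff(F, y)                                       # -pi/(sqrt(3)*y**2)
rhs = sp.integrate(sp.diff(f, y), (x, -sp.oo, sp.oo))     # the same expression

print(sp.simplify(lhs - rhs))   # 0: both sides of the relation agree for this f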
 
No, it would have been as difficult for anyone as it was for you... Such integrals are not too simple, and only a computer can do them in record time.

Daniel.
 
dextercioby said:
No, it would have been as difficult for anyone as it was for you... Such integrals are not too simple, and only a computer can do them in record time.

Daniel.

Well, I did eventually use Mathematica to get the antiderivative for the RHS, but I did use my own reasoning to assess the limits.

Thanks,
Salty
 
  • #10
That's not good... You could have done it without the computer. It doesn't really matter. Hopefully you'll get clear on the exponential decay, as I think it's more important.

Daniel.
 
  • #11
Of course, I meant an ordinary derivative sign on the left side:

\frac{d}{dy}\int_{-\infty}^{+\infty}f(x,y)dx=\int_{-\infty}^{+\infty}\frac{\partial f(x,y)}{\partial y}dx

Anyway, any ideas on how to prove this?
 
  • #12
dextercioby said:
That's not good... You could have done it without the computer. It doesn't really matter. Hopefully you'll get clear on the exponential decay, as I think it's more important.

Daniel.

I figured I'd lose the thumbs up.

That's a hard integral, it looked like integration by parts, and I really want to continue working on residues for another post.

Salty
 
  • #13
I am a little surprised by the assertion of HallsofIvy that the two expressions are equal whenever they both exist, as I cannot find any reference for such a strong statement in my tiny library of analysis books. Can you give me one?

Counterexamples seem to exist even in the finite-interval case for such a strong assertion, viz. Gelbaum and Olmsted, Counterexamples in Analysis, page 123: the function f(x,y) = (x^3/y^2) e^(-x^2/y) if y > 0, and 0 if y = 0,

defined in the closed upper half plane y ≥ 0. The problem is that this function, although continuous in each variable separately, is not continuous at (0,0) as a function of two variables. Then the integral from y = 0 to y = 1 of (∂f/∂x)(0,y) dy is zero, while d/dx of the integral of f(x,y) dy, evaluated at x = 0, is 1.

(Note the roles of the variables x and y are interchanged from the case here.)

Is there something different about the open interval case?
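A rough symbolic check of this counterexample, assuming sympy and supplying an antiderivative in y by hand (my own sketch, not from Gelbaum and Olmsted), with the roles of x and y as in that book:

import sympy as sp

x = sp.symbols('x', real=True)
y = sp.symbols('y', positive=True)
f = x**3 / y**2 * sp.exp(-x**2 / y)        # f(x, 0) is defined to be 0

# An antiderivative of f in y is x*exp(-x^2/y), so the integral over 0 < y <= 1
# equals x*exp(-x^2) (the boundary term at y -> 0+ vanishes for every x).
assert sp.simplify(sp.diff(x * sp.exp(-x**2 / y), y) - f) == 0
F = x * sp.exp(-x**2)

print(sp.diff(F, x).subs(x, 0))            # 1: d/dx of the integral, at x = 0
print(sp.diff(f, x).subs(x, 0))            # 0: the other side's integrand vanishes at x = 0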

Actually, here is a simpler example given to me by a friend that seems to work in the present setting as well:

let f(x,y) = 1, if 0 ≤ y ≤ 1 and 0 ≤ x ≤ y^2 (1-y)^2,
and f(x,y) = 0 elsewhere.

Then for fixed x, as a function of y, this is equal to 1 at most on a short interval and zero elsewhere. Hence, given any y, for all but at most one x, the derivative ∂/∂y(f(x,y)) exists and is zero. In particular the integral wrt x, over the whole real line, of the derivative ∂/∂y(f(x,y)) is zero as a function of y.

On the other hand, for fixed y between 0 and 1, the integral wrt x, over the whole real line, equals y^2 (1-y)^2, while for other y, it equals zero. Thus the derivative d/dy of the integral of f(x,y)dx equals 2y(1-y)(1-2y), for 0 ≤ y ≤ 1, and 0 elsewhere.

hence the integral of the derivative does not equal the derivative of the integral.

Does this seem ok? I am pretty weak at real analysis.
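A quick sympy check of the algebra in this example (my own, not the friend's): the derivative of the "area" y^2(1-y)^2 is indeed 2y(1-y)(1-2y), which is not identically zero on (0,1).

import sympy as sp

y = sp.symbols('y', real=True)
area = y**2 * (1 - y)**2                   # integral of f(x, y) dx, for 0 <= y <= 1

deriv = sp.diff(area, y)
print(sp.factor(deriv))                     # 2*y*(y - 1)*(2*y - 1), i.e. 2y(1-y)(1-2y)
print(deriv.subs(y, sp.Rational(1, 4)))     # 3/16, so it is not identically zero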
 
  • #14
If the interval is finite, equality holds if f(x,y) is continuous and if \frac{\partial}{\partial y}f(x,y) is also continuous.
I know how to prove that theorem. It just seems to me that this is a different case, since you are interchanging not 2 limits, but 4. The limit for the derivative comes last on the LHS, but first on the RHS.
Since the order in which limits are taken may not always be interchanged, I just thought it was natural that this case has to be treated differently.
 
  • #15
the usual hypothesis seems to be, by analogy with the continuity hypothesis on f(x,y) and on ∂f/∂y, that these same two functions are at least bounded above in absolute value by some integrable functions, and maybe the derivative ∂f/∂y should exist everywhere for almost all x.

this stuff is confusing to me.
 
  • #16
Hmm, how about this:
\lim_{y \to b}\lim_{x \to a}f(x,y)=\lim_{x \to a}\lim_{y \to b}f(x,y)=f(a,b)
if f(x,y) is continuous.

Now if
\lim_{x \to \pm\infty}f(x,y)
exists and if f(x,y) is continuous, then
\lim_{x_1 \to -\infty}\lim_{x_2 \to \infty}\int_{x_1}^{x_2}f(x,y)dx=H(y)
is still simply a continuous function of y.

So

\frac{d}{dy}\left(\lim_{x_1 \to -\infty}\lim_{x_2 \to \infty}\int_{x_1}^{x_2}f(x,y)dx\right)=\lim_{h \to 0}\frac{H(y+h)-H(y)}{h}=
\lim_{x_1 \to -\infty}\lim_{x_2 \to \infty}\left(\lim_{h \to 0}\int_{x_1}^{x_2}\frac{f(x,y+h)-f(x,y)}{h}dx\right)
where the last step uses the continuity of H(y).
So now we can use our old (Leibniz') theorem for finite intervals.

The restrictions are that f(x,y) and f_y(x,y) are uniformly continuous on a 'strip' -\infty < x < \infty, \quad \alpha \leq y \leq \beta.
So that it holds for all y in (\alpha,\beta).

EDIT: Oh yeah, and all the limits have to exist. :wink:
 
  • #17
you are using the usual hypotheses that both f and ∂f/∂y be continuous in some form. That was my point: the problem is not with the size of the interval of integration, but with the lack of global hypotheses on f. If you want to use Lebesgue integration, you can relax these hypotheses somewhat, as discussed in any standard book on analysis, like Rudin [Real and Complex Analysis], Lang [Real Analysis], or Dieudonne [Foundations of Modern Analysis, vol. 2], using "dominated convergence".
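For reference, the dominated-convergence version mentioned above can be stated roughly like this (my paraphrase of the standard textbook statement, not a quotation from those books): suppose that for each y in an open interval J the function f(x,y) is integrable in x over the real line, that \frac{\partial f}{\partial y}(x,y) exists for every y in J and almost every x, and that there is an integrable function g with

\left|\frac{\partial f}{\partial y}(x,y)\right| \leq g(x)

for all y in J and almost every x. Then for every y in J,

\frac{d}{dy}\int_{-\infty}^{+\infty}f(x,y)dx=\int_{-\infty}^{+\infty}\frac{\partial f(x,y)}{\partial y}dx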
 
  • #18
Just in case someone hits this from google (like I did), I have a reference.

Buck, in Advanced Calculus, says:

Theorem If \int_c^\infty f(x,u) \, du converges to F(x) for all x, a \leq x \leq b, and if f and f_1 = \partial f/\partial x are continuous for a \leq x \leq b, c \leq u < \infty, and if \int_c^\infty f_1(x,u) \, du is uniformly convergent for x in [a,b], then for any x in [a,b],

F'(x) = \frac{d}{dx} \int_c^\infty f(x,u) \, du = \int_c^\infty f_1(x,u) \, du
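A small sympy illustration of the theorem with a textbook-style choice of my own (not Buck's): f(x,u) = exp(-x u), c = 0, on an interval a ≤ x ≤ b with a > 0, where the integral of f_1 = -u exp(-x u) converges uniformly because it is dominated by u exp(-a u).

import sympy as sp

x = sp.symbols('x', positive=True)
u = sp.symbols('u', positive=True)
f = sp.exp(-x * u)

F = sp.integrate(f, (u, 0, sp.oo))                        # 1/x
lhs = sp.diff(F, x)                                        # -1/x**2
rhs = sp.integrate(sp.diff(f, x), (u, 0, sp.oo))           # -1/x**2 again

print(sp.simplify(lhs - rhs))   # 0: F'(x) equals the integral of f_1, as the theorem asserts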
 
