
Infinite Differentiation and Integration?

  1. Jan 29, 2005 #1
    I was just wondering whether there is any theory, or any theorems, about the convergence of a function's derivative/antiderivative taken an infinite number of times. It seems to me that there are four ways this can behave: it blows up to infinity, approaches zero, approaches some function, or alternates between several functions. Based on my preliminary investigations, there are even functions that do a combination of these. Anyway, I'm basically looking for some reading material on the subject, and was hoping that one of you has read something about it.
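    For a concrete picture, here is a minimal sympy sketch (the function choices are mine, purely for illustration) of all four behaviors under repeated differentiation:

    Code:
    import sympy as sp

    x = sp.symbols('x')

    # One test function for each behavior under repeated d/dx:
    # exp(2*x) blows up (an extra factor of 2 each time), exp(x/2) decays
    # (factor 1/2), exp(x) is a fixed point, and sin(x) cycles with period 4.
    examples = [sp.exp(2*x), sp.exp(x/2), sp.exp(x), sp.sin(x)]

    for f in examples:
        seq = [f]
        for _ in range(4):
            seq.append(sp.diff(seq[-1], x))
        print(seq)

    After four derivatives sin(x) comes back to itself, while the coefficients 2^n and (1/2)^n drive exp(2x) to infinity and exp(x/2) toward zero.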

    Furthermore, I'm looking for some reading material about techniques for evaluating/approximating integrals that have no closed form, without using a sum of sampled values over the interval as the approximation. Any help is vastly appreciated.
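    One non-quadrature approach, sketched here just to show the idea: expand the integrand in a series and integrate term by term. For Si(x), the Taylor series of sin(t)/t gives Si(x) = Σ (-1)^k x^(2k+1) / ((2k+1)(2k+1)!), which the snippet below (plain Python, no sampling of the interval) sums directly:

    Code:
    import math

    def si_series(x, n_terms=20):
        # Si(x) = integral from 0 to x of sin(t)/t dt, computed by
        # integrating the Taylor series of sin(t)/t term by term.
        return sum((-1)**k * x**(2*k + 1) / ((2*k + 1) * math.factorial(2*k + 1))
                   for k in range(n_terms))

    print(si_series(1.0))  # about 0.946083, matching tabulated Si(1)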
  3. Jan 30, 2005 #2
    Ok, actually, my two questions are related. Basically, what I'm trying to find is a method for approximating integrals whose integrand is a product of two "simpler" functions, by applying integration by parts an infinite number of times. I've had success with certain ones, such as the integrals corresponding to Si(x) and Ci(x). For example...
    $$\int \frac{\cos x}{x}\,dx = \frac{\sin x}{x} - \frac{\cos x}{x^2} - \frac{2!\,\sin x}{x^3} + \frac{3!\,\cos x}{x^4} + \frac{4!\,\sin x}{x^5} - \cdots$$
    Obviously, this is separated from the actual Ci(x) by a constant of integration, but that's not really the point. The point is that with every integration by parts, though the average value of the sin/cos factor stays the same, the negative power of x keeps growing, so each successive term is smaller than the one before it (for x greater than 1, at least for the first several terms). It would therefore seem that the partial sums settle down for x greater than 1, and upon graphing it, I found that it is indeed an asymptotic expansion for Ci(x) (once the appropriate constant is added, of course).
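    As a rough numerical check (this assumes scipy; scipy.special.sici returns the pair (Si(x), Ci(x))), truncations of the expansion above can be compared against Ci(x) directly:

    Code:
    import math
    from scipy.special import sici

    def ci_asymptotic(x, n_pairs=3):
        # Partial sums of the expansion above, grouped as
        # Ci(x) ~ (sin x / x)(1 - 2!/x^2 + 4!/x^4 - ...)
        #       - (cos x / x^2)(1 - 3!/x^2 + 5!/x^4 - ...)
        s = sum((-1)**k * math.factorial(2*k) / x**(2*k) for k in range(n_pairs))
        c = sum((-1)**k * math.factorial(2*k + 1) / x**(2*k) for k in range(n_pairs))
        return math.sin(x) / x * s - math.cos(x) / x**2 * c

    for x in (5.0, 10.0, 20.0):
        print(x, ci_asymptotic(x), sici(x)[1])  # agreement improves as x grows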

    However, for other functions, determining convergence isn't quite so simple. It occurred to me that if the derivative of one of the multiplied functions approaches zero, and the antiderivative of the other one doesn't grow, then this type of series will converge. That's why I'm trying to find the radius of convergence.
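    Here is a small sympy sketch of that experiment (my own throwaway framing, not a library routine): it generates the boundary terms from repeatedly integrating ∫ f·g dx by parts, differentiating f and antidifferentiating g each round, so you can watch how fast the terms shrink:

    Code:
    import sympy as sp

    x = sp.symbols('x')
    f = 1/x          # the factor that gets differentiated each round
    g = sp.cos(x)    # the factor that gets antidifferentiated each round

    # Repeated integration by parts:
    #   integral(f*g) = sum_k (-1)**k * f^(k) * G_(k+1) + remainder,
    # where G_(k) is the k-th antiderivative of g.
    F, G, terms = f, g, []
    for k in range(5):
        G = sp.integrate(G, x)          # next antiderivative of g
        terms.append((-1)**k * F * G)   # boundary term from round k
        F = sp.diff(F, x)               # next derivative of f

    print(terms)  # sin(x)/x, -cos(x)/x**2, -2*sin(x)/x**3, 6*cos(x)/x**4, ...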
  4. Jan 30, 2005 #3
    I think you may find this topic covered, in roughly the context you are looking for, in any introductory analysis book that treats sequences and series. Although my professor did not do much with it in our class, our book had some material on convergence and uniform convergence of sequences of functions, and I believe of their derivatives too. It preceded the section on power series, which seems to be what you are after with the radius of convergence.
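    For reference, the relevant theorem (as I remember it, so check the exact hypotheses in the book) is that uniform convergence of the derivatives is what lets you swap limit and derivative:

    $$f_n' \to g \ \text{uniformly on } [a,b], \quad \{f_n(x_0)\}\ \text{convergent for some } x_0 \in [a,b] \;\Longrightarrow\; f_n \to f\ \text{uniformly, with } f' = g.$$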
  5. Feb 3, 2005 #4
    The solution is simple: write out the nth term of the expansion and look at it as an ordinary infinite series. Apply the standard tests for convergence, etc. Your case of "if the derivative gets smaller and the integral stays similar..." is just a special case of that method. So is the case where the derivative stays similar and the integral gets smaller... or where both grow separately but their product shrinks. :tongue:

    P.S. Don't forget to include both the sum of the first N terms and the remainder integral involving the Nth term, making sure you track the two together.
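    For the Ci(x) series above that prescription is quick to carry out. Bounding the sin/cos factors by 1 (so this only tracks the size of the terms), the nth term has magnitude t_n = n!/x^(n+1), and

    $$\left|\frac{t_{n+1}}{t_n}\right| = \frac{n+1}{x} \to \infty \quad (n \to \infty),$$

    so the full series diverges for every fixed x. The terms shrink only while n < x, which is why a truncation around n ≈ x works best: exactly the asymptotic (rather than convergent) behavior noted above.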