Infinite Differentiation and Integration?

In summary, the conversation revolves around the convergence of a function's derivative/antiderivative when the operation is applied an infinite number of times. Four types of limiting behavior can occur: blowing up to infinity, approaching zero, approaching a fixed function, or cycling between several functions. The original poster is looking for reading material and help with techniques for evaluating/approximating integrals that have no closed form, without using a sum of values in the interval as an approximation. They are also interested in a method for approximating integrals whose integrand is a product of two simpler functions by applying integration by parts an infinite number of times. The conversation also touches on the radius of convergence and how it can be determined by writing out the nth term of the expansion and applying standard tests for convergence.
  • #1
Manchot
I was just wondering if there is any theory, or are any theorems, involving the convergence of a function's derivative/antiderivative taken an infinite number of times. It seems to me that there are four ways this can happen: it blows up to infinity, approaches zero, approaches some function, or alternates between several functions. Based on my preliminary investigations, there are even functions that exhibit a combination of these behaviors. Anyway, I'm basically looking for some reading material on the subject, and was hoping that one of you has read something about it.
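To make the four behaviors concrete, here is a minimal sketch (mine, using SymPy; the example functions are chosen for illustration and are not from the thread) that iterates the derivative and prints the first few results:

[code]
# A minimal sketch of iterating the derivative and watching the limiting
# behavior; the example functions below are my own illustrative choices.
import sympy as sp

x = sp.symbols('x')

examples = {
    "exp(2*x) (blows up)":    sp.exp(2 * x),  # nth derivative: 2**n * exp(2x)
    "exp(x/2) (decays to 0)": sp.exp(x / 2),  # nth derivative: (1/2)**n * exp(x/2)
    "exp(x) (fixed point)":   sp.exp(x),      # differentiation returns the same function
    "sin(x) (4-cycle)":       sp.sin(x),      # cycles sin -> cos -> -sin -> -cos
}

for label, f in examples.items():
    print(label)
    g = f
    for k in range(1, 5):
        g = sp.diff(g, x)
        print(f"  derivative #{k}: {g}")
[/code]

Scaling the exponent controls which regime you land in: the nth derivative of e^{ax} is a^n e^{ax}, which blows up for |a| > 1, dies out pointwise for |a| < 1, and is fixed for a = 1, while sin(x) gives the cycling case.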

Furthermore, I'm looking for some reading material about techniques for evaluating/approximating integrals without a closed form, without using a sum of values in the interval as an approximation. Any help is vastly appreciated.
 
  • #2
Ok, actually, my two questions are related. Basically, what I'm trying to find is a method for approximating integrals whose integrands are a product of two "simpler" ones, by using integration by parts an infinite number of times. I've had success with certain ones, such as the integrals that correspond to Si(x) and Ci(x). For example...

[tex]\int \frac{\sin x}{x}\,dx
\\
= -\frac{\cos x}{x} - \int \frac{\cos x}{x^2}\,dx
\\
= -\frac{\cos x}{x} - \frac{\sin x}{x^2} - 2\int \frac{\sin x}{x^3}\,dx
\\
= \cdots[/tex]

Obviously, this is separated from the actual Si(x) by a constant, but that's not really the point. The point is that with each integration, though the average magnitude of the sine/cosine factor stays the same, the negative power of x keeps growing, meaning that each successive integral is smaller than the ones before it (for x greater than 1). It would therefore seem that the series converges for x greater than 1, and upon graphing it, I found that it is indeed an asymptotic expansion for Si(x) (when the appropriate constant is added, of course).
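A quick numerical check of this asymptotic behavior (my own sketch, not from the thread): continuing the integration by parts above, the kth boundary term is (k-1)!·c_k(x)/x^k, where c_k cycles through -cos, -sin, +cos, +sin. The sketch below assumes SciPy's scipy.special.sici for the reference value of Si(x).

[code]
# Partial sums of the integration-by-parts boundary terms, compared with
# Si(x): they improve at first, then diverge, for fixed x -- the signature
# of an asymptotic expansion.
import math
from scipy.special import sici  # sici(x) returns the pair (Si(x), Ci(x))

def parts_partial_sum(x, n_terms):
    """Sum_{k=1}^{N} (k-1)! * c_k(x) / x**k, with c_k cycling
    -cos, -sin, +cos, +sin (from repeated integration by parts)."""
    cycle = [lambda t: -math.cos(t), lambda t: -math.sin(t),
             lambda t: math.cos(t), lambda t: math.sin(t)]
    total = 0.0
    for k in range(1, n_terms + 1):
        total += math.factorial(k - 1) * cycle[(k - 1) % 4](x) / x**k
    return total

x = 10.0
si_exact = sici(x)[0]
for n in (2, 4, 8, 12, 16):
    # Adding the constant pi/2 matches the antiderivative to Si (Si(0) = 0).
    approx = math.pi / 2 + parts_partial_sum(x, n)
    print(f"N={n:2d}: approx={approx:.10f}  error={abs(approx - si_exact):.2e}")
[/code]

For fixed x the ratio of consecutive term magnitudes is k/x, so the partial sums improve until k is roughly x and then degrade: useful as an approximation only when truncated near that point.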

However, for other functions, determining convergence isn't quite so simple. It occurred to me that if the repeated derivatives of one of the multiplied functions approach zero, and the repeated antiderivatives of the other one don't grow, then this type of series will converge. That's why I'm trying to find the radius of convergence.
 
  • #3
I think you may find this topic covered, somewhat in the context you are looking for, in any introductory analysis book where sequences and series are treated. Although my professor did not do much with it in our class, our book had some material on convergence and uniform convergence of sequences of functions, and, I believe, of their derivatives too. It preceded the section on power series, which seems to be what you are after with the radius of convergence.
 
  • #4
The solution is simple: write out the nth term of the expansion and look at it as an ordinary infinite series. Apply the standard tests for convergence, etc. Your case of "if the derivative gets smaller and the integral stays similar..." is just a special case of this method. So is "the derivative stays similar and the integral gets smaller," or "both get bigger separately but their product gets smaller"... :tongue:

P.S. Don't forget that the expansion is really the sum of the first N boundary terms plus the integral remainder from the Nth step; make sure you track both and that they converge together.
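Applied to the sin(x)/x expansion above, this recipe settles the question quickly. A sketch (assuming SymPy) using the nth boundary-term magnitude (n-1)!/x^n derived earlier:

[code]
# Ratio test on the nth boundary term of the sin(x)/x expansion:
# |term_{n+1}| / |term_n| = n/x, which grows without bound for any fixed x,
# so the full series diverges everywhere -- it is only asymptotic.
import sympy as sp

n, x = sp.symbols('n x', positive=True)
term = sp.factorial(n - 1) / x**n            # magnitude of the nth term
ratio = sp.combsimp(term.subs(n, n + 1) / term)
print(ratio)                                 # n/x
print(sp.limit(ratio, n, sp.oo))             # oo -> diverges for every fixed x
[/code]

This is consistent with the graphing observation in post #2: the truncated series approximates Si(x) well for large x even though the full series never converges.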
 

1. What is Infinite Differentiation?

In the sense discussed in this thread, infinite differentiation means applying the derivative to a function over and over and asking how the resulting sequence of derivatives behaves in the limit: it may blow up to infinity, approach zero, approach a fixed function, or cycle among several functions.

2. How is Infinite Differentiation used in science?

In science, repeated differentiation is used to describe physical systems that change continuously and smoothly. Derivatives of various orders are essential in fields such as physics, engineering, and economics for understanding and predicting the behavior of these systems.

3. What is Infinite Integration?

Likewise, infinite integration means repeatedly taking the indefinite integral (antiderivative) of a function. Since integration is the inverse operation of differentiation, each step produces a function whose derivative is the previous one, up to a constant of integration.
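As a small illustration (a sketch of mine using SymPy, in the thread's sense of repeatedly antidifferentiating; SymPy omits the arbitrary constants):

[code]
# Repeated indefinite integration of sin(x): like repeated differentiation,
# it cycles with period 4, up to the constants of integration SymPy drops.
import sympy as sp

x = sp.symbols('x')
f = sp.sin(x)
for k in range(1, 5):
    f = sp.integrate(f, x)   # indefinite integral, integration constant omitted
    print(f"antiderivative #{k}: {f}")
# prints -cos(x), -sin(x), cos(x), sin(x)
[/code]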

4. What is the difference between Finite and Infinite Differentiation and Integration?

The difference lies in how many times the operation is applied. Finite differentiation and integration apply the derivative or antiderivative a fixed number of times, while infinite differentiation and integration ask what happens to the sequence of results as the number of applications grows without bound.

5. How do I solve problems involving Infinite Differentiation and Integration?

To solve problems involving infinite differentiation and integration, you need to understand the fundamental rules of differentiation and integration, then apply them repeatedly to generate the sequence of derivatives or antiderivatives. Standard convergence tests then tell you how that sequence, or the associated series, behaves. It is also helpful to practice on concrete functions to build intuition.
