Convergence to the derivative

In summary, the difference quotients f_n converge pointwise to f' whenever f is differentiable, and they converge uniformly on ]a, b[ if f' is uniformly continuous there (for instance, if f' extends continuously to [a, b]). For a general continuously differentiable f on the open interval, the mean value theorem still gives uniform convergence on every compact subinterval of ]a, b[.
  • #1
jostpuur
Let

[tex]
f:]a,b[\to\mathbb{R}
[/tex]

be a continuously differentiable function, where [itex]]a,b[\subset\mathbb{R}[/itex] is some interval, and define

[tex]
f_n:]a,b[\to\mathbb{R},\quad
f_n(x)=\begin{cases}
\frac{f(x+1/n)-f(x)}{1/n}, & x\in\,]a,b-1/n[\\
\textrm{something continuous}, & x\in[b-1/n,b[
\end{cases}
[/tex]

Clearly we have the pointwise limit [itex]f_n(x)\to f'(x)[/itex] as [itex]n\to\infty[/itex], but how common is it that this convergence is uniform? Is there some well-known theorem that says the convergence is uniform under suitable assumptions?
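
As a quick numerical sanity check (not a proof): if [itex]f'[/itex] extends continuously to the closed interval, the sup-norm error of the forward difference quotient should shrink as n grows. The sketch below is a minimal illustration; the concrete function, interval, and sampling grid are assumptions chosen for the example, not part of the question.

[code]
import numpy as np

# Forward difference quotient f_n(x) = (f(x + 1/n) - f(x)) / (1/n),
# sampled on ]a, b - 1/n[ where the formula defines f_n.
def sup_error(f, fprime, a, b, n, samples=10_000):
    x = np.linspace(a + 1e-9, b - 1.0 / n - 1e-9, samples)
    fn = (f(x + 1.0 / n) - f(x)) * n
    return np.max(np.abs(fn - fprime(x)))

# Illustrative choice: f = sin on ]0, 3[; f' = cos extends continuously
# to [0, 3], so the error should behave roughly like 1/(2n).
for n in (10, 100, 1000):
    print(n, sup_error(np.sin, np.cos, 0.0, 3.0, n))
[/code]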
 
  • #2
Yes, there is a standard argument, but continuous differentiability on the open interval is not quite enough by itself. By the mean value theorem, for each [itex]x\in]a,b-1/n[[/itex] there is some [itex]\xi_n(x)\in]x,x+1/n[[/itex] with [itex]f_n(x)=f'(\xi_n(x))[/itex], so the error [itex]|f_n(x)-f'(x)|=|f'(\xi_n(x))-f'(x)|[/itex] is controlled by how much f' varies over distances smaller than 1/n.

Consequently, if f' is uniformly continuous on ]a, b[ (for instance, if f' extends continuously to the closed interval [a, b]), then the convergence f_n → f' is uniform on ]a, b[. If f' is merely continuous on the open interval, the same argument still gives uniform convergence on every compact subinterval [c, d] ⊂ ]a, b[, since f' is uniformly continuous there, but uniform convergence on all of ]a, b[ can fail: the standard counterexample is f(x) = 1/x on ]0, 1[.

A related compactness tool is the Arzelà–Ascoli theorem (sometimes loosely called a uniform convergence theorem, though by itself it only yields uniformly convergent subsequences). If the sequence {f_n} is equicontinuous and uniformly bounded on a compact set, it has a uniformly convergent subsequence; since every such subsequential limit must equal the pointwise limit f', the whole sequence converges uniformly there. Here, equicontinuity means that for every ε > 0 there exists some δ > 0 such that for all n and all x, y with |x−y| < δ we have |f_n(x)−f_n(y)| < ε, and uniform boundedness means that there exists some M > 0 with |f_n(x)| ≤ M for all n and all x. Both hold on compact subintervals of ]a, b[ when f is continuously differentiable, because the mean value theorem bounds the difference quotients of f by nearby values of f'.
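
Written out, the key estimate is the display below, where [itex]\omega_{f'}(\delta)=\sup\{|f'(s)-f'(t)| : |s-t|\le\delta\}[/itex] is the modulus of continuity of f' (notation introduced here for convenience); it tends to 0 with δ exactly when f' is uniformly continuous:

[tex]
\sup_{x\in]a,b-1/n[}|f_n(x)-f'(x)|
=\sup_{x\in]a,b-1/n[}|f'(\xi_n(x))-f'(x)|
\le\omega_{f'}(1/n)\xrightarrow[n\to\infty]{}0.
[/tex]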
 

1. What is convergence to the derivative?

Convergence to the derivative describes how a sequence of functions approaches the derivative of a given function. Here the sequence consists of difference quotients, and the question is whether they tend to f'(x), pointwise or uniformly, as the step between sample points approaches zero.

2. Why is convergence to the derivative important?

Convergence to the derivative is important because it allows us to approximate the derivative of a function with a sequence of simpler functions. This underlies finite-difference methods in numerical analysis and in the numerical solution of differential equations, as sketched below.
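
A minimal sketch of the finite-difference idea applied to a differential equation: the forward Euler method replaces y'(t) by the same kind of difference quotient with step h = 1/n. The test equation y' = −y and the step sizes are illustrative assumptions.

[code]
import math

# Forward Euler: replace y'(t) by (y(t + h) - y(t)) / h and solve for
# the next value, exactly the difference-quotient construction above.
def euler(f, y0, t0, t1, n):
    h = (t1 - t0) / n
    t, y = t0, y0
    for _ in range(n):
        y += h * f(t, y)
        t += h
    return y

# y' = -y with y(0) = 1 has exact solution exp(-t).
for n in (10, 100, 1000):
    approx = euler(lambda t, y: -y, 1.0, 0.0, 1.0, n)
    print(n, approx, abs(approx - math.exp(-1.0)))
[/code]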

3. How is convergence to the derivative calculated?

It is checked from the definition of the derivative: form the difference quotient (f(x+h) − f(x))/h and estimate its distance from f'(x) as h approaches zero. For uniform convergence, one estimates the supremum of this error over the whole interval, as in the display below.
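
In the notation of the thread, uniform convergence of the difference quotients to the derivative means:

[tex]
\sup_{x\in]a,b-1/n[}\left|\frac{f(x+1/n)-f(x)}{1/n}-f'(x)\right|
\xrightarrow[n\to\infty]{}0.
[/tex]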

4. What are some applications of convergence to the derivative?

Convergence to the derivative has many applications in mathematics and science, such as in optimization problems, numerical integration, and solving differential equations. It is also used in physics and engineering to model and analyze various systems.

5. Are there any limitations to convergence to the derivative?

Yes. The function must be differentiable to begin with, and pointwise convergence does not guarantee uniform convergence: on an open interval the difference quotients converge uniformly only under extra hypotheses such as uniform continuity of f', and f(x) = 1/x on ]0, 1[ is a standard counterexample, as illustrated below. The rate of convergence also depends on the smoothness of f and on the chosen difference scheme.
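
A quick numerical illustration of the endpoint failure for f(x) = 1/x on ]0, 1[ (the sampling grid is an assumption chosen to expose the bad region near 0):

[code]
import numpy as np

# f(x) = 1/x on ]0, 1[: f'(x) = -1/x**2 is continuous but not uniformly
# continuous near 0. Over the whole open interval the sup-norm error is
# in fact infinite for every n; sampling down to x = 1/n already shows
# the error growing (roughly like n**2 / 2) instead of shrinking.
def sup_error(n, samples=10_000):
    h = 1.0 / n
    x = np.linspace(h, 1.0 - h, samples)
    fn = (1.0 / (x + h) - 1.0 / x) / h   # forward difference quotient
    return np.max(np.abs(fn - (-1.0 / x**2)))

for n in (10, 100, 1000):
    print(n, sup_error(n))
[/code]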
