Convergence to the derivative

  • Thread starter jostpuur

Main Question or Discussion Point

Let

[tex]
f:]a,b[\to\mathbb{R}
[/tex]

be a continuously differentiable function, where [itex]]a,b[\subset\mathbb{R}[/itex] is some interval, and define

[tex]
f_n:]a,b[\to\mathbb{R},\quad
f_n(x)=\left\{\begin{array}{ll}
\frac{f(x+1/n)-f(x)}{1/n}, & x\in\,]a,\,b-1/n[\\
\textrm{something continuous}, & x\in [b-1/n,\, b[
\end{array}\right.
[/tex]

Clearly we have the pointwise limit [itex]f_n(x)\to f'(x)[/itex] as [itex]n\to\infty[/itex], but how common is it for this convergence to be uniform? Is there some well-known theorem saying that the convergence is uniform under suitable assumptions?
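As a sketch of one standard sufficient condition (not from the original post, and stated here under the extra assumption that [itex]f'[/itex] is uniformly continuous on [itex]]a,b[[/itex], e.g. when [itex]f'[/itex] extends continuously to [itex][a,b][/itex]): for each [itex]x\in\,]a,\,b-1/n[[/itex] the mean value theorem gives some [itex]\xi_x\in\,]x,\,x+1/n[[/itex] with

[tex]
f_n(x)=\frac{f(x+1/n)-f(x)}{1/n}=f'(\xi_x),
\qquad\textrm{hence}\qquad
|f_n(x)-f'(x)|=|f'(\xi_x)-f'(x)|\le\omega_{f'}(1/n),
[/tex]

writing [itex]\omega_{f'}[/itex] for the modulus of continuity of [itex]f'[/itex]. For a uniformly continuous [itex]f'[/itex] we have [itex]\omega_{f'}(1/n)\to 0[/itex], so the supremum of [itex]|f_n-f'|[/itex] over [itex]]a,\,b-1/n[[/itex] tends to zero; what happens on the patched piece [itex][b-1/n,\,b[[/itex] of course depends on which continuous extension is chosen there.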
 