Prove the sequence converges uniformly

wackikat

Homework Statement


Let f_n be a sequence of functions which converges pointwise on [0,1], where each f_n is Lipschitz with the same constant C. Prove that the sequence converges uniformly.

Homework Equations



A function is called Lipschitz with Lipschitz constant C if |f(x)-f(y)| <= C|x-y| for all x,y in its domain.

Let f_n be a sequence of functions defined on a set S. f is the pointwise limit of f_n if, for all t in S, lim as n goes to infinity of f_n(t) = f(t).


The Attempt at a Solution


I know that somehow I must show that for all epsilon > 0 there exists N in the naturals such that sup over t in [0,1] of |f_n(t) - f(t)| < epsilon whenever n > N.
Or show that the uniform Cauchy criterion holds: for all epsilon > 0 there exists N in the naturals such that |f_n(t) - f_m(t)| < epsilon for all m, n > N and all t in [0,1].
 
Go for the uniform Cauchy criterion; use the triangle inequality and the fact that |x - y| is at most 1 on [0,1].
 
Hint: Pointwise convergence implies uniform convergence on any finite set of points. Since [0,1] is compact, you can choose points x1,...,xk such that the distances between consecutive points are arbitrarily small.
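The finite-set step in this hint can be written out explicitly (a sketch; the thresholds N_i are names introduced here for illustration):

```latex
Fix $\varepsilon > 0$ and a finite set $\{x_1, \dots, x_k\} \subset [0,1]$.
Pointwise convergence gives, for each $i$, an $N_i$ with
\[
  |f_n(x_i) - f(x_i)| < \varepsilon \quad \text{for all } n > N_i .
\]
Taking $N = \max\{N_1, \dots, N_k\}$, this single $N$ works for every $x_i$
at once, which is exactly uniform convergence on the finite set.
```

The max over finitely many thresholds is what fails on an infinite set, which is why compactness and the Lipschitz bound are needed to pass to all of [0,1].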
 
I've tried using Cauchy, but I just seem to end back with a term I started with.
Here's what I tried.
|f_n(x) - f_m(x)| = |f_n(x) - f_n(y) + f_n(y) - f_m(x)| <= |f_n(x) - f_n(y)| + |f_n(y) - f_m(x)| <= C|x - y| + |f_n(y) - f_m(x)| = C|x - y| + |f_n(y) - f_n(x) + f_n(x) - f_m(x)| <=
2C|x - y| + |f_n(x) - f_m(x)|

As for yyat's hint, I don't believe that is true. The sequence of functions could converge to a discontinuous f(x), which would mean there could not be uniform convergence.
If we knew pointwise convergence implies uniform convergence on any finite set of points, then we would not need the fact that the functions are Lipschitz.
 
wackikat said:
As for yyat's hint, I don't believe that is true. The sequence of functions could converge to a discontinuous f(x), which would mean there could not be uniform convergence.

Any function defined on a finite set of points is continuous.

If we knew pointwise convergence implies uniform convergence on any finite set of points, then we would not need the fact that the functions are Lipschitz.

Why? You want to prove uniform convergence on [0,1], which is not a finite set. The Lipschitz continuity is crucial here.
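Putting the two hints together, here is a sketch of how the estimate closes (the grid point x_j nearest to x and the grid spacing delta are names introduced here for illustration):

```latex
\begin{align*}
|f_n(x) - f_m(x)|
  &\le |f_n(x) - f_n(x_j)| + |f_n(x_j) - f_m(x_j)| + |f_m(x_j) - f_m(x)| \\
  &\le C|x - x_j| + |f_n(x_j) - f_m(x_j)| + C|x - x_j| \\
  &\le 2C\delta + |f_n(x_j) - f_m(x_j)| .
\end{align*}
% Choose the grid fine enough that \delta \le \varepsilon / (4C), then use
% uniform convergence on the finite grid to pick N with
% |f_n(x_j) - f_m(x_j)| < \varepsilon/2 for all j and all m, n > N.
% Then |f_n(x) - f_m(x)| < \varepsilon for every x in [0,1], so the
% uniform Cauchy criterion holds.
```

The key difference from the circular attempt above is that the middle term compares f_n and f_m at the *same* grid point, where pointwise convergence controls it, rather than returning to the point x.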
 