Pointwise vs uniform convergence

SUMMARY

The discussion centers on the convergence of the sequence of functions f_{n}(x) = x^{n} / (1 + x^{n}) on the interval [0, 1]. The pointwise limit is f(x) = 0 for x in [0, 1) and f(1) = 1/2. The user argues that the supremum of |f_{n}(x) − f(x)| over [0, 1] tends to 0, which would give uniform convergence, yet the discontinuity of f suggests uniform convergence cannot hold; a reply questions whether that supremum really tends to 0. The apparent contradiction highlights the difference between pointwise and uniform convergence in real analysis.

PREREQUISITES
  • Understanding of pointwise and uniform convergence in real analysis
  • Familiarity with limits and supremum concepts
  • Knowledge of continuity and its implications on convergence
  • Basic proficiency in analyzing sequences of functions
NEXT STEPS
  • Study the definitions and differences between pointwise and uniform convergence
  • Explore examples of sequences of functions that converge uniformly and those that do not
  • Investigate the implications of discontinuities on convergence types
  • Learn about the Arzelà-Ascoli theorem and its applications in convergence analysis
USEFUL FOR

Mathematics students, particularly those studying real analysis, educators teaching convergence concepts, and researchers exploring function sequences and their properties.

sephiseraph
Howdy Ho, partner.

I have a sequence of functions [tex]{f_{n}}[/tex] with [tex]f_{n}(x) := x^{n} / (1 + x^{n})[/tex], and I am investigating the pointwise limit of the sequence [tex]f_{n}[/tex] on [0, 1] to determine whether the convergence is uniform.

I found the pointwise limit to be [tex]f(x) = \lim_{n\rightarrow\infty} x^{n} / (1 + x^{n})[/tex], i.e. f(x) = 0 for x in [0, 1) and f(1) = 1/2.
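A quick numeric sanity check of that pointwise limit (my own sketch, not part of the original post): for a fixed x < 1, x^n → 0, so f_n(x) → 0, while f_n(1) = 1/2 for every n.

```python
# Sketch: evaluate f_n at a few sample points for a large n.
# For fixed x < 1, x**n -> 0, so f_n(x) -> 0; at x = 1, f_n(1) = 1/2 exactly.
def f_n(x, n):
    return x**n / (1 + x**n)

for x in (0.0, 0.5, 0.9, 0.99, 1.0):
    print(x, f_n(x, 10_000))
```

The values for x < 1 are effectively 0, and the value at x = 1 is exactly 0.5, matching the piecewise limit above.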

My problem is determining whether the sequence converges uniformly. Intuition tells me that, since each f_{n} is continuous but f is not, there cannot be uniform convergence on [0, 1] (a uniform limit of continuous functions is continuous). However, we also have

[tex]\sup\left\{|f_{n}(x) - f(x)| : x \in \left[0, 1\right]\right\} = \sup\left\{x^{n} / (1 + x^{n}) : x \in \left[0, 1\right)\right\}[/tex], which tends to 0 as n tends to infinity, and that is a sufficient condition for uniform convergence. Where have I gone wrong, or how do I make sense of this?
 
Are you sure [itex]\sup\{|f_n(x) - f(x)| \colon x \in [0,1]\}[/itex] tends to 0?
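A numerical check of that supremum (my own sketch, not part of the thread) is instructive: sampling points that crowd toward x = 1, where the supremum is attained, suggests it does not shrink with n.

```python
# Sketch: estimate sup_{x in [0,1]} |f_n(x) - f(x)| on a grid
# of points approaching x = 1 from the left, plus x = 1 itself.
def f_n(x, n):
    return x**n / (1 + x**n)

def f(x):
    return 0.5 if x == 1 else 0.0  # the pointwise limit

for n in (10, 100, 1000):
    xs = [1 - 10.0**(-k) for k in range(1, 12)] + [1.0]
    sup_est = max(abs(f_n(x, n) - f(x)) for x in xs)
    print(n, sup_est)
```

For every n the estimate sits near 1/2, not near 0: taking x close enough to 1 (for instance x = 1 − 1/n) keeps x^n bounded away from 0, so f_n(x) stays near 1/2 while f(x) = 0. This is consistent with the intuition that the discontinuous limit rules out uniform convergence.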
 
