Convergence of series of functions

In summary, the thread discusses the pointwise, uniform, and locally uniform convergence of a series of functions. The participants use the Weierstrass criterion and related theorems to analyse the convergence, discuss how to find the supremum and prove convergence, and touch on the definition of locally uniform convergence.
  • #1
twoflower
Hi all,
I have a few questions about this exercise:

Analyse pointwise, uniform and local uniform convergence of this series of functions:

[tex]
\sum_{k=2}^{\infty}\log \left(1 + \frac{x^2}{k \log^2 k} \right)
[/tex]

I'm trying to do it using Weierstrass' criterion. To recall it, it says

[tex]
\mbox{Let } f_n \mbox{ be defined on } \emptyset \neq M \subset \mathbb{R}\mbox{, and let }
S_n := \sup_{x \in M} \left| f_{n}(x)\right|, \ n \in \mathbb{N}. \mbox{ If }
\sum_{n=1}^{\infty} S_n < \infty\mbox{, then } \sum_{n=1}^{\infty} f_{n}(x) \rightrightarrows \mbox{ on } M.
[/tex]
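(As a standard illustration of the criterion, not part of the exercise: for [itex]f_n(x) = \frac{\sin(nx)}{n^2}[/itex] on [itex]M = \mathbb{R}[/itex] we have [itex]S_n \leq \frac{1}{n^2}[/itex] and [itex]\sum \frac{1}{n^2} < \infty[/itex], so that series converges uniformly on [itex]\mathbb{R}[/itex].)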

How to find

[tex]
\sup_{x \in M} \left| f_{n}(x)\right|
[/tex]
?

The derivative is
[tex]
\left(\log \left(1 + \frac{x^2}{k \log^2 k} \right)\right)^{'} = \frac{2x}{k\log^2 k + x^2}
[/tex]
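(For the record, the chain-rule step behind this: with [itex]u = \frac{x^2}{k \log^2 k}[/itex] we get [itex]\left(\log(1+u)\right)^{'} = \frac{u^{'}}{1+u} = \frac{2x/(k\log^2 k)}{1 + x^2/(k\log^2 k)} = \frac{2x}{k\log^2 k + x^2}[/itex].)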

This means the function is increasing for [itex]x > 0[/itex]. Letting [itex]x[/itex] go to infinity would cause problems (the supremum over all of [itex]\mathbb{R}[/itex] is infinite), so I will take [itex]x \in [-K, K][/itex], where [itex]0 < K < \infty[/itex].

Then
[tex]
\sup_{x \in M} \left| f_{n}(x)\right| = \log \left( 1 + \frac{K^2}{n \log^{2} n}\right)
[/tex]

But I don't know how to prove that
[tex]
\sum_{n=2}^{\infty} \log \left( 1 + \frac{K^2}{n \log^{2} n}\right) \mbox{ converges}
[/tex]

Could someone point me to the right direction please?

Thank you.
 
  • #2
twoflower said:
How to find
[tex]\sup_{x \in M} \left| f_{n}(x)\right|
[/tex]
?

You don't necessarily have to find the sequence of suprema exactly. If you find a sequence of upper bounds whose series converges, then by the ordinary comparison test the series of suprema converges too!

In other words, if you find a sequence [itex]a_n[/itex] such that [itex]|f_n(x)| \leq a_n[/itex] [itex]\forall x \in M[/itex] (at least for [itex]n > N[/itex]), and such that [itex]\sum a_n < \infty[/itex], then since [itex]S_n \leq a_n[/itex] (by the definition of the supremum), we have [itex]\sum S_n < \infty[/itex] (comparison test for numerical series).
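For this particular series, one possible choice (just a sketch): since [itex]\log(1+t) \leq t[/itex] for [itex]t \geq 0[/itex],
[tex]
\left|\log\left(1 + \frac{x^2}{n \log^2 n}\right)\right| \leq \frac{K^2}{n \log^2 n} =: a_n \quad \mbox{for } x \in [-K, K],
[/tex]
and [itex]\sum_{n=2}^{\infty} \frac{1}{n \log^2 n} < \infty[/itex] by the integral test (or by Cauchy condensation), so [itex]\sum a_n < \infty[/itex].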
 
  • #3
quasar987 said:
You don't necessarily have to find the sequence of suprema exactly. If you find a sequence of upper bounds whose series converges, then by the ordinary comparison test the series of suprema converges too!

In other words, if you find a sequence [itex]a_n[/itex] such that [itex]|f_n(x)| \leq a_n[/itex] [itex]\forall x \in M[/itex] (at least for [itex]n > N[/itex]), and such that [itex]\sum a_n < \infty[/itex], then since [itex]S_n \leq a_n[/itex] (by the definition of the supremum), we have [itex]\sum S_n < \infty[/itex] (comparison test for numerical series).

Thank you quasar987, I had also thought of comparing it with some convergent series, but I couldn't come up with a suitable one. Now I'm trying to solve the problem using the theorem on interchanging summation and differentiation, and that may help...
 
  • #4
I think I found something, but you'd better double check. I proved that the series does not converge pointwise.

1) [itex]Dom f_n = \mathbb{R}[/itex].

2) Consider an element [itex]x_0 \in \mathbb{R}[/itex].

3) Then, according to a theorem for numerical series,

[tex]\sum f_n(x_0) \ \ \mbox{converges} \Leftrightarrow \sum 2^n f_{2^n}(x_0) \ \ \mbox{converges}[/tex]

Now calculate [itex]2^n f_{2^n}(x_0)[/itex] and evaluate the limit as n goes to infinity (you'll need to use l'Hôpital's rule once). I find that the result is infinity. But according to a theorem for numerical series, if [itex]\lim_{n \to \infty} a_n \neq 0 [/itex], then [itex]\sum a_n[/itex] diverges. So our series of functions does not converge pointwise for any element of [itex]\mathbb{R}[/itex], and hence it does not converge uniformly on any interval either.

But like I said, double check this, it seems kinda dubious.

(what does local convergence mean?)
 
  • #5
quasar987 said:
I think I found something, but you'd better double check. I proved that the series does not converge pointwise.

1) [itex]Dom f_n = \mathbb{R}[/itex].

2) Consider an element [itex]x_0 \in \mathbb{R}[/itex].

3) Then, according to a theorem for numerical series,

[tex]\sum f_n(x_0) \ \ \mbox{converges} \Leftrightarrow \sum 2^n f_{2^n}(x_0) \ \ \mbox{converges}[/tex]

Now calculate [itex]2^n f_{2^n}(x_0)[/itex] and evaluate the limit as n goes to infinity (you'll need to use l'Hôpital's rule once). I find that the result is infinity. But according to a theorem for numerical series, if [itex]\lim_{n \to \infty} a_n \neq 0 [/itex], then [itex]\sum a_n[/itex] diverges. So our series of functions does not converge pointwise for any element of [itex]\mathbb{R}[/itex], and hence it does not converge uniformly on any interval either.

But like I said, double check this, it seems kinda dubious.

(what does local convergence mean?)

Well, first I'll show you how I tried:

We have a theorem that says:

[tex]
\mbox{Let } f_n, \ n \in \mathbb{N}, \mbox{ be defined and have finite derivatives } f_{n}^{'} \mbox{ on } (a, b) \subset \mathbb{R}. \mbox{ Suppose that}
[/tex]

[tex]
\mbox{(i) } \sum_{n=1}^{\infty} f_{n}^{'} \stackrel{loc}{\rightrightarrows} \mbox{ on } (a,b)
[/tex]

[tex]
\mbox{(ii) } \exists x_0 \in (a,b) \mbox{ such that } \sum_{n=1}^{\infty} f_{n}(x_0) \mbox{ converges.}
[/tex]

[tex]
\mbox{Then } \sum f_{n} \stackrel{loc}{\rightrightarrows} \mbox{ on } (a,b).
[/tex]

So I have

[tex]
f_{n}^{'} = \frac{2x}{n\log^{2}n + x^2}
[/tex]

Now this is the function whose uniform convergence we want to analyse. Where is its supremum attained? The derivative of [itex]f_{n}^{'}[/itex] is

[tex]
\frac{2n\log^{2}n - 2x^2}{\left(n\log^{2}n + x^2\right)^2} = 0 \Leftrightarrow x = \pm \sqrt{n\log^{2}n}
[/tex]

So [itex]f_{n}^{'}[/itex] attains its maximum at [itex]x = \sqrt{n\log^{2}n}[/itex]; however, because [itex]x \in [-K,K][/itex] is a fixed interval while this critical point moves to the right as [itex]n[/itex] grows, for large [itex]n[/itex] the supremum over [itex][-K,K][/itex] is attained at the endpoints [itex]x = \pm K[/itex]. So we have

[tex]
\sup_{x \in [-K,K]} \left| \frac{2x}{n\log^{2}n + x^2}\right| = \frac{2K}{n\log^{2}n + K^2} =: S_n
[/tex]

Then
[tex]
\sum S_n < \infty \Rightarrow \sum f_{n}^{'} \rightrightarrows \mbox{ on } [-K, K]
[/tex]
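To spell out the remaining steps (one way to finish): [itex]S_n \leq \frac{2K}{n\log^{2}n}[/itex] and [itex]\sum_{n=2}^{\infty} \frac{1}{n\log^{2}n} < \infty[/itex] by the integral test, so indeed [itex]\sum S_n < \infty[/itex]. Since every point of [itex]\mathbb{R}[/itex] has a neighbourhood contained in some [itex][-K, K][/itex], this gives [itex]\sum f_{n}^{'} \stackrel{loc}{\rightrightarrows}[/itex] on [itex]\mathbb{R}[/itex]; and [itex]f_n(0) = 0[/itex] for every [itex]n[/itex], so condition (ii) holds with [itex]x_0 = 0[/itex]. The theorem then gives [itex]\sum f_{n} \stackrel{loc}{\rightrightarrows}[/itex] on [itex]\mathbb{R}[/itex].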

According to the given results it's OK; I just want to ask whether you think it's mathematically correct (i.e. acceptable to write it this way on a test).
 
  • #6
What's the definition of local convergence?

According to the given results it's OK; I just want to ask whether you think it's mathematically correct (i.e. acceptable to write it this way on a test).
I'm not one to answer that. I'm just learning this stuff, like you.
 
  • #7
quasar987 said:
What's the definition of local convergence?

[tex]
\mbox{We say that } \left\{ f_{n}\right\}_{n=1}^{\infty} \mbox{ converges locally uniformly to f on M and write}
[/tex]

[tex]
f_{n} \overset{loc}{\rightrightarrows} f \mbox{ on M, if}
[/tex]

[tex]
\mbox{for each } x \in M \mbox{ there exists } \delta > 0 \mbox{ such that } \left\{f_{n}\right\}_{n=1}^{\infty} \mbox{ converges uniformly to } f \mbox{ on } M\ \cap \ U(x, \delta) \mbox{ (where } U(x, \delta) \mbox{ is the } \delta\mbox{-neighbourhood of } x \mbox{).}
[/tex]
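A standard example, for illustration: [itex]\sum_{n=0}^{\infty} x^n[/itex] converges locally uniformly on [itex](-1,1)[/itex] — on each [itex][-q, q][/itex] with [itex]q < 1[/itex] the terms are bounded by [itex]q^n[/itex] and Weierstrass applies — but it does not converge uniformly on all of [itex](-1,1)[/itex].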
 
  • #8
[tex]
f_{n}^{'} = \frac{2x}{n\log^{2}n + x^2} = 0 \Leftrightarrow x = \pm \sqrt{n\log^{2}n}
[/tex]


How do you get this? I would agree that (for n fixed), f ' = 0 <==> x = 0.
 
  • #9
quasar987 said:
[tex]
f_{n}^{'} = \frac{2x}{n\log^{2}n + x^2} = 0 \Leftrightarrow x = \pm \sqrt{n\log^{2}n}
[/tex]


How do you get this? I would agree that (for n fixed), f ' = 0 <==> x = 0.

A typo, I corrected it in my post.
 

What is meant by "convergence of series of functions"?

Convergence of a series of functions refers to the behavior of the partial sums f_1 + f_2 + ... + f_N as the number of terms N increases. In other words, it is the study of whether and how these partial sums approach a limit function.

What are the different types of convergence of series of functions?

The main types are pointwise convergence, uniform convergence, and locally uniform convergence (one also speaks of absolute convergence). Pointwise convergence means the series converges at each individual point of the domain. Uniform convergence means the partial sums converge to the limit function uniformly over the whole domain, i.e. the supremum of the error tends to zero. Locally uniform convergence means every point has a neighborhood on which the convergence is uniform. Absolute convergence means the series of absolute values converges at each point, regardless of the order in which the terms are added.

How is convergence of series of functions determined?

The convergence of a series of functions is determined by examining the limit of its partial sums as the number of terms grows. This can be done using various convergence tests, such as the comparison test, ratio test, and root test for pointwise and absolute convergence, and the Weierstrass M-test for uniform convergence.
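As a rough numerical illustration of the idea (a sanity check, not a proof; the series and the test point are taken from the thread above), the following Python snippet computes partial sums at a fixed point:

[code]
import math

# Partial sums of  sum_{k>=2} log(1 + x^2 / (k * (log k)^2))  at a fixed x.
# The increments shrink as N grows, consistent with (slow) pointwise convergence.

def partial_sum(x, N):
    """Sum of the terms k = 2, ..., N at the point x."""
    return sum(math.log(1.0 + x * x / (k * math.log(k) ** 2))
               for k in range(2, N + 1))

x = 3.0  # an arbitrary test point
for N in (10, 100, 1000, 10000, 100000):
    print(N, partial_sum(x, N))
[/code]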

What is the importance of convergence of series of functions in mathematics?

The study of convergence of series of functions is important in many areas of mathematics, such as analysis, calculus, and differential equations. It allows us to understand the behavior of functions and their limits, and to justify operations such as term-by-term differentiation and integration.

What are some real-world applications of convergence of series of functions?

Convergence of series of functions has many real-world applications, including in physics, engineering, and economics. For example, it is used to analyze the behavior of electric fields, heat transfer, and financial markets. It also plays a crucial role in numerical analysis, which is used in computer simulations and modeling.
