Convergence of series of functions

AI Thread Summary
The discussion centers on analysing the convergence of a series of logarithmic function terms. The original poster applies Weierstrass' criterion to study pointwise, uniform, and locally uniform convergence but gets stuck proving that the series of suprema converges. A respondent suggests a comparison criterion with a convergent majorant series, and separately sketches a Cauchy-condensation argument for pointwise divergence that he himself flags as dubious. The poster then applies a theorem on interchanging summation and differentiation, and the thread closes with the definition of locally uniform convergence and the correction of a typo.
twoflower
Hi all,
I have a few questions about this exercise:

Analyse pointwise, uniform and local uniform convergence of this series of functions:

\sum_{k=2}^{\infty}\log \left(1 + \frac{x^2}{k \log^2 k} \right)

I'm trying to do it using Weierstrass' criterion. To recall it, it says

\mbox{Let } f_n \mbox{ be defined on } \emptyset \neq M \subset \mathbb{R}\mbox{, let } S_n := \sup_{x \in M} \left| f_{n}(x)\right|, n \in \mathbb{N}. \mbox{ If } \sum_{n=1}^{\infty} S_n < \infty\mbox{, then } \sum_{n=1}^{\infty} f_{n} \rightrightarrows \mbox{ on } M.

How to find

\sup_{x \in M} \left| f_{n}(x)\right|
?

The derivative is
\left(\log \left(1 + \frac{x^2}{k \log^2 k} \right)\right)' = \frac{2x}{k\log^2 k + x^2}

This means the function is increasing for x > 0. Letting x range over all of \mathbb{R} would cause problems, so I will restrict to x \in [-K, K], where 0 < K < \infty.

Then
\sup_{x \in M} \left| f_{n}(x)\right| = \log \left( 1 + \frac{K^2}{n \log^{2} n}\right)

But I don't know how to prove that
\sum_{n=2}^{\infty} \log \left( 1 + \frac{K^2}{n \log^{2} n}\right) \mbox{ converges}

Could someone point me to the right direction please?

Thank you.
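For intuition, the partial sums of the series of suprema can be probed numerically (a minimal sketch in Python; the choice K = 2 and the cutoffs are arbitrary, and slowly stabilizing partial sums only suggest convergence, they do not prove it):

```python
import math

def term(n, K):
    """n-th term of the series of suprema: log(1 + K^2 / (n log^2 n))."""
    return math.log(1.0 + K**2 / (n * math.log(n) ** 2))

def partial_sum(N, K=2.0):
    """Partial sum from n = 2 to N."""
    return sum(term(n, K) for n in range(2, N + 1))

# Successive decades add less and less, consistent with convergence:
for N in (10**3, 10**4, 10**5):
    print(N, partial_sum(N))
```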
 
twoflower said:
How to find
\sup_{x \in M} \left| f_{n}(x)\right|
?

You don't necessarily have to find the sequence of suprema. If you find a convergent series of upper bounds, then by the usual comparison criterion, the series of suprema converges too!

In other words, if you find a sequence a_n such that |f_n(x)| \leq a_n for all x \in M, at least for n > N, and such that \sum a_n < \infty, then since S_n \leq a_n (definition of the supremum), we have \sum S_n < \infty (comparison criterion for numerical series)
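Concretely, the elementary bound log(1 + t) <= t for t >= 0 suggests the candidate a_n = K^2 / (n log^2 n), whose series converges (e.g. by Cauchy condensation). A small sketch checking the bound termwise (Python; K = 2 and the sampled x values are arbitrary choices, not part of the thread):

```python
import math

K = 2.0

def f(n, x):
    """Term of the original series: log(1 + x^2 / (n log^2 n))."""
    return math.log(1.0 + x**2 / (n * math.log(n) ** 2))

def a(n):
    """Candidate majorant a_n = K^2 / (n log^2 n), via log(1 + t) <= t."""
    return K**2 / (n * math.log(n) ** 2)

# |f_n(x)| <= a_n should hold for every x in [-K, K]:
assert all(f(n, x) <= a(n) + 1e-15
           for n in range(2, 200)
           for x in (-K, -1.0, 0.0, 0.5, K))
print("bound holds on the sampled grid")
```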
 
quasar987 said:
You don't necessarily have to find the sequence of suprema. If you find a convergent series of upper bounds, then by the usual comparison criterion, the series of suprema converges too!

In other words, if you find a sequence a_n such that |f_n(x)| \leq a_n for all x \in M, at least for n > N, and such that \sum a_n < \infty, then since S_n \leq a_n (definition of the supremum), we have \sum S_n < \infty (comparison criterion for numerical series)

Thank you quasar987, I had also thought of comparing it with some convergent series, but I hadn't come up with a suitable one. Now I'm trying to solve the problem using the theorem about interchanging summation and differentiation, and it may help...
 
I think I found something, but you'd better double check. I proved that the series does not converge pointwise.

1) Dom f_n = \mathbb{R}.

2) Consider an element x_0 \in \mathbb{R}.

3) Then, according to the Cauchy condensation test for numerical series,

\sum f_n(x_0) \ \ \mbox{converges} \Leftrightarrow \sum 2^n f_{2^n}(x_0) \ \ \mbox{converges}

Now calculate 2^n f_{2^n}(x_0) and evaluate the limit as n goes to infinity (you'll need to use l'Hospital's rule once). I find that the result is infinity. But according to the divergence test for numerical series, if a_n does not tend to 0, \sum a_n diverges. So our series of functions does not converge pointwise for any element of \mathbb{R} ==> it does not converge uniformly on any interval.

But like I said, double check this, it seems kinda dubious.

(what does local convergence mean?)
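Since the post itself asks for a double check, the condensed terms 2^n f_{2^n}(x_0) are easy to probe numerically (a sketch; x_0 = 1 and the natural logarithm are assumed here, and math.log1p is used to avoid precision loss for tiny arguments):

```python
import math

def condensed_term(n, x0=1.0):
    """2^n * f_{2^n}(x0), where f_k(x) = log(1 + x^2 / (k log^2 k))."""
    k = 2.0 ** n
    # log1p(t) = log(1 + t), accurate even when t is near machine epsilon
    return k * math.log1p(x0**2 / (k * math.log(k) ** 2))

for n in (2, 5, 10, 20, 40):
    print(n, condensed_term(n))
```

In this probe the terms shrink roughly like x_0^2 / (n \log 2)^2 rather than blowing up, so the limit computed by hand is worth re-examining.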
 
quasar987 said:
I think I found something, but you'd better double check. I proved that the series does not converge pointwise.

1) Dom f_n = \mathbb{R}.

2) Consider an element x_0 \in \mathbb{R}.

3) Then, according to the Cauchy condensation test for numerical series,

\sum f_n(x_0) \ \ \mbox{converges} \Leftrightarrow \sum 2^n f_{2^n}(x_0) \ \ \mbox{converges}

Now calculate 2^n f_{2^n}(x_0) and evaluate the limit as n goes to infinity (you'll need to use l'Hospital's rule once). I find that the result is infinity. But according to the divergence test for numerical series, if a_n does not tend to 0, \sum a_n diverges. So our series of functions does not converge pointwise for any element of \mathbb{R} ==> it does not converge uniformly on any interval.

But like I said, double check this, it seems kinda dubious.

(what does local convergence mean?)

Well, first I'll show you how I tried:

We have a theorem saying this:

\mbox{Let } f_n, n \in \mathbb{N}, \mbox{ be defined and have finite derivatives } f_{n}' \mbox{ on } (a, b) \subset \mathbb{R}. \mbox{ Let}

\mbox{(i) } \sum_{n=1}^{\infty} f_{n}' \stackrel{loc}{\rightrightarrows} \mbox{ on } (a,b)

\mbox{(ii) } \exists x_0 \in (a,b) \mbox{ such that } \sum_{n=1}^{\infty} f_{n}(x_0) < \infty \mbox{ (converges).}

\mbox{Then } \sum f_{n} \stackrel{loc}{\rightrightarrows} \mbox{ on } (a,b).

So I have

f_{n}' = \frac{2x}{n\log^{2}n + x^2}

Now this is the function whose uniform convergence we want to analyse. Where is its supremum? The derivative of f_{n}' is

\frac{2n\log^{2}n - 2x^2}{\left(n\log^{2}n + x^2\right)^2} = 0 \Leftrightarrow x = \pm \sqrt{n\log^{2}n}

So f_{n}' attains its maximum at x = \sqrt{n\log^{2}n}; however, because x \in [-K,K] while this critical point moves to the right as n grows, for large n the supremum over the interval is attained at its rightmost point. So we have

\sup_{x \in [-K,K]} \left| \frac{2x}{n\log^{2}n + x^2}\right| = \frac{2K}{n\log^{2}n + K^2} =: S_n

Then
\sum S_n < \infty \Rightarrow \sum f_{n}' \rightrightarrows \mbox{ on } [-K, K]

According to the known results it's OK; I just want to ask whether you think it's mathematically correct (i.e. acceptable to write it this way on a test).
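As a numerical cross-check of this last step (a sketch; K = 2 is an arbitrary choice, and the closed form for S_n is the one from the post, valid once the critical point has left [-K, K]):

```python
import math

K = 2.0

def S(n):
    """Majorant for sup |f_n'| on [-K, K]: 2K / (n log^2 n + K^2)."""
    return 2.0 * K / (n * math.log(n) ** 2 + K**2)

def partial(N):
    """Partial sum of S_n from n = 2 to N."""
    return sum(S(n) for n in range(2, N + 1))

# Increments over successive decades shrink, consistent with convergence:
for N in (10**3, 10**4, 10**5):
    print(N, partial(N))
```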
 
What's the definition of local convergence?

twoflower said:
According to the known results it's OK; I just want to ask whether you think it's mathematically correct (i.e. acceptable to write it this way on a test).
I'm not one to answer that. I'm just learning this stuff, like you.
 
quasar987 said:
What's the definition of local convergence?

\mbox{We say that } \left\{ f_{n}\right\}_{n=1}^{\infty} \mbox{ converges locally uniformly to } f \mbox{ on } M \mbox{ and write}

f_{n} \overset{loc}{\rightrightarrows} f \mbox{ on } M, \mbox{ if}

\mbox{for each } x \in M \mbox{ there exists } \delta > 0 \mbox{ such that } \left\{f_{n}\right\}_{n=1}^{\infty} \mbox{ converges uniformly to } f \mbox{ on } M \cap U(x, \delta).
 
f_{n}' = \frac{2x}{n\log^{2}n + x^2} = 0 \Leftrightarrow x = \pm \sqrt{n\log^{2}n}


How do you get this? I would agree that (for n fixed), f_{n}' = 0 \Leftrightarrow x = 0.
 
quasar987 said:
f_{n}' = \frac{2x}{n\log^{2}n + x^2} = 0 \Leftrightarrow x = \pm \sqrt{n\log^{2}n}


How do you get this? I would agree that (for n fixed), f_{n}' = 0 \Leftrightarrow x = 0.

A typo, I corrected it in my post.
 