MHB Sequence of functions: pointwise & uniform convergence

Summary
The sequence of functions \( f_n(x) = n^{\alpha}xe^{-nx} \) converges pointwise to 0 on \([0, +\infty)\) as \( n \to \infty \). The maximum of \( f_n \) occurs at \( x = \frac{1}{n} \), yielding \( f_n\left(\frac{1}{n}\right) = \frac{n^{\alpha-1}}{e} \). On \([a, +\infty)\) with \( a > 0 \) the convergence is uniform for every \( \alpha > 0 \), since the supremum there is eventually \( f_n(a) \), which tends to 0. On \([0, +\infty)\) the convergence is uniform if and only if \( \alpha < 1 \), because only then does the maximum \( \frac{n^{\alpha-1}}{e} \) tend to 0. The discussion emphasizes the relationship between the parameter \( \alpha \) and the convergence behavior of the function sequence.
mathmari
Hey! 😊

Let $0<\alpha \in \mathbb{R}$ and $(f_n)_n$ be a sequence of functions defined on $[0, +\infty)$ by: \begin{equation*}f_n(x)=n^{\alpha}xe^{-nx}\end{equation*} - Show that $(f_n)$ converges pointwise on $[0,+\infty)$.

For an integer $m>\alpha$ we have, for all $x\geq 0$ and $n\geq 1$, that \begin{equation*}0 \leq n^{\alpha}xe^{-nx} \leq n^{m}xe^{-nx}\end{equation*}
For $x> 0$ we have that \begin{equation*}\lim_{n\rightarrow +\infty}n^{m}xe^{-nx}=\lim_{n\rightarrow +\infty}\frac{n^{m}x}{e^{nx}}\ \overset{m\text{-fold L'Hôpital in }n}{=}\ \lim_{n\rightarrow +\infty}\frac{m!}{x^{m-1}e^{nx}}=0\end{equation*} For $x= 0$ we have that \begin{equation*}\lim_{n\rightarrow +\infty}f_n(0)=\lim_{n\rightarrow +\infty}0=0\end{equation*} So for every $x\geq 0$ the sequence $n^{\alpha}xe^{-nx}$ is squeezed between $0$ and $n^{m}xe^{-nx}$, and since the upper bound tends to $0$, the squeeze theorem gives $\displaystyle{\lim_{n\rightarrow +\infty}n^{\alpha}xe^{-nx}=0}$.

Therefore $f_n(x)$ converges pointwise to $0$.

Is everything correct? :unsure:
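As a purely numerical sanity check of that pointwise limit (an illustrative sketch with $\alpha=2$; of course not part of the proof):

```python
import numpy as np

alpha = 2.0  # illustrative exponent alpha > 0; any positive value behaves the same way
f = lambda n, x: n**alpha * x * np.exp(-n * x)

for x in (0.0, 0.1, 1.0, 5.0):
    # for each fixed x >= 0, f_n(x) should shrink towards 0 as n grows
    print(x, [f(n, x) for n in (10, 100, 1000)])
```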
- Calculate $\max_{x\in [0, +\infty)}f_n(x)$ and conclude that $f_n$ converges uniformly on $[a, +\infty)$ for $a>0$.

We have that
\begin{align*}&f_n(x)=n^{\alpha}xe^{-nx}\\ &\rightarrow f_n'(x)=n^{\alpha}e^{-nx}-n^{\alpha+1}xe^{-nx}=\left (n^{\alpha}-n^{\alpha+1}x\right )e^{-nx} \\ &\rightarrow f_n'(x)=0 \Rightarrow \left (n^{\alpha}-n^{\alpha+1}x\right )e^{-nx}=0 \Rightarrow n^{\alpha}-n^{\alpha+1}x=0 \Rightarrow x=\frac{1}{n} \\ &f_n\left (\frac{1}{n}\right )=\frac{n^{\alpha-1}}{e}\end{align*}
Since we have the interval $[\alpha, +\infty)$ we also have to check the value of the function at $x=\alpha$, right?
We get $f_n(\alpha)=n^{\alpha}\alpha e^{-n\alpha}$. But how can we compare it with $\frac{n^{\alpha-1}}{e}$ ? Or do we have to check if $f_n(x)$ is increasing or decreasing? :unsure:
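As a quick numerical cross-check of that critical point and maximum value (an illustrative sketch with $\alpha=\tfrac12$; not part of the argument):

```python
import numpy as np

alpha = 0.5  # illustrative exponent
for n in (5, 50, 500):
    x = np.linspace(0.0, 10.0, 200001)   # dense grid on [0, 10]
    fn = n**alpha * x * np.exp(-n * x)
    i = fn.argmax()
    # the numeric argmax should sit near 1/n, and the value near n^(alpha-1)/e
    print(n, x[i], fn[i], 1 / n, n**(alpha - 1) / np.e)
```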
- Show that $f_n$ converges uniformly on $[0, +\infty)$ iff $\alpha<1$.

The maximum is $\frac{n^{\alpha-1}}{e}$ (since at $x=0$ we have $f_n(0)=0$, which is smaller), and for $\alpha<1$ this maximum tends to $0$, which means that $f_n$ converges uniformly, right?
This shows that if $\alpha<1$ then $f_n$ converges uniformly, or not? It is left to show that if $f_n$ converges uniformly then $\alpha<1$, or not? :unsure:
 
mathmari said:
- Show that $(f_n)$ converges pointwise on $[0,+\infty)$.

Is everything correct?

Hey mathmari!

It looks correct to me. (Nod)

mathmari said:
- Calculate $\max_{x\in [0, +\infty)}f_n(x)$ and conclude that $f_n$ converges uniformly on $[a, +\infty)$ for $a>0$.

We have that
\begin{align*}&f_n(x)=n^{\alpha}xe^{-nx}\\ &\rightarrow f_n'(x)=n^{\alpha}e^{-nx}-n^{\alpha+1}xe^{-nx}=\left (n^{\alpha}-n^{\alpha+1}x\right )e^{-nx} \\ &\rightarrow f_n'(x)=0 \Rightarrow \left (n^{\alpha}-n^{\alpha+1}x\right )e^{-nx}=0 \Rightarrow n^{\alpha}-n^{\alpha+1}x=0 \Rightarrow x=\frac{1}{n} \\ &f_n\left (\frac{1}{n}\right )=\frac{n^{\alpha-1}}{e}\end{align*}
Since we have the interval $[\alpha, +\infty)$ we also have to check the value of the function at $x=\alpha$, right?
We get $f_n(\alpha)=n^{\alpha}\alpha e^{-n\alpha}$. But how can we compare it with $\frac{n^{\alpha-1}}{e}$ ? Or do we have to check if $f_n(x)$ is increasing or decreasing? :unsure:

I'm a bit confused about the latin $a$ versus the greek $\alpha$.
Aren't they distinct? 🤔
Note that we can prove the statement for any latin $a>0$ independent of the value of the greek $\alpha$ that is in the exponent.

You have found that $f_n'$ has a zero at $x=\frac 1n$. We also have that $f_n'$ is positive for $x<\frac 1n$ and negative for $x>\frac 1n$.
Therefore for a given $n$ we have that $f_n$ has a maximum at $x=\frac 1n$.

Let $N=\left\lceil \frac 1 {a}\right\rceil$.
Then for any $n> N$ we have that $f_n$ is decreasing on $[a,\infty)$.
So for such $n$ the function $f_n$ has its maximum at $x=a$.
If $n$ becomes bigger, does this maximum increase or not?
If it doesn't, we have uniform convergence don't we? 🤔

mathmari said:
- Show that $f_n$ converges uniformly on $[0, +\infty)$ iff $\alpha<1$.

The maximum is $\frac{n^{\alpha-1}}{e}$ (since at $x=0$ we have $f_n(0)=0$, which is smaller), and for $\alpha<1$ this maximum tends to $0$, which means that $f_n$ converges uniformly, right?
This shows that if $\alpha<1$ then $f_n$ converges uniformly, or not? It is left to show that if $f_n$ converges uniformly then $\alpha<1$, or not?

Indeed. Or rather, that $f_n$ does not converge uniformly when $\alpha\ge 1$.
What happens to the maximum of $f_n$ when $n$ increases with $\alpha\ge 1$? 🤔
 
Klaas van Aarsen said:
I'm a bit confused about the latin $a$ versus the greek $\alpha$.
Aren't they distinct? 🤔
Note that we can prove the statement for any latin $a>0$ independent of the value of the greek $\alpha$ that is in the exponent.

In the interval $[a, +\infty)$ we have the latin $a$, and in the function we have the greek letter $\alpha$.
Klaas van Aarsen said:
You have found that $f_n'$ has a zero at $x=\frac 1n$. We also have that $f_n'$ is positive for $x<\frac 1n$ and negative for $x>\frac 1n$.
Therefore for a given $n$ we have that $f_n$ has a maximum at $x=\frac 1n$.

Let $N=\left\lceil \frac 1 {a}\right\rceil$.
Then for any $n> N$ we have that $f_n$ is decreasing on $[a,\infty)$.
So for such $n$ the function $f_n$ has its maximum at $x=a$.
If $n$ becomes bigger, does this maximum increase or not?
If it doesn't, we have uniform convergence don't we? 🤔

So that means: since $f_n$ is decreasing on $\left[\frac{1}{n}, +\infty\right)$, for $\frac{1}{n}<a$ the whole interval $[a, +\infty)$ lies in the decreasing part, which means that the maximum of $f_n$ on $[a, +\infty)$ is $f_n(a)$, right? :unsure:

To check the uniform convergence we have to check if $\displaystyle{\lim_{n\rightarrow +\infty}f_n(a)=0}$, right? :unsure:
Klaas van Aarsen said:
Indeed. Or rather, that $f_n$ does not converge uniformly when $\alpha\ge 1$.
What happens to the maximum of $f_n$ when $n$ increases with $\alpha\ge 1$? 🤔

In this case the maximum is $f_n\left (\frac{1}{n}\right )=\frac{n^{\alpha-1}}{e}$ and we have that $\displaystyle{\lim_{n\rightarrow +\infty}\frac{n^{\alpha-1}}{e}=\begin{cases} +\infty& \text{ if } \alpha-1>0 \\ \frac{1}{e} & \text{ if } \alpha-1=0 \\ 0 & \text{ if } \alpha-1<0 \end{cases}}$.

Can we say then that $f_n$ converges uniformly iff $\alpha-1<0 \Rightarrow \alpha<1$ ? Or doesn't the "iff" follow in that way? :unsure:
 
mathmari said:
So that means: since $f_n$ is decreasing on $\left[\frac{1}{n}, +\infty\right)$, for $\frac{1}{n}<a$ the whole interval $[a, +\infty)$ lies in the decreasing part, which means that the maximum of $f_n$ on $[a, +\infty)$ is $f_n(a)$, right?

To check the uniform convergence we have to check if $\displaystyle{\lim_{n\rightarrow +\infty}f_n(a)=0}$, right?

Yes.
That is, it will suffice if that is the case.

Now how can we do that? 🤔

mathmari said:
In this case the maximum is $f_n\left (\frac{1}{n}\right )=\frac{n^{\alpha-1}}{e}$ and we have that $\displaystyle{\lim_{n\rightarrow +\infty}\frac{n^{\alpha-1}}{e}=\begin{cases} +\infty& \text{ if } \alpha-1>0 \\ \frac{1}{e} & \text{ if } \alpha-1=0 \\ 0 & \text{ if } \alpha-1<0 \end{cases}}$.

Can we say then that $f_n$ converges uniformly iff $\alpha-1<0 \Rightarrow \alpha<1$ ? Or doesn't the "iff" follow in that way?

We have proven that if $\alpha<1$ then $f_n$ converges uniformly (to the zero function).
However, the proof that if $f_n$ converges uniformly, then $\alpha<1$, is still incomplete.
That is, we have the edge case $\alpha=1$ where $f_n$ might still be uniformly convergent. 🤔

So suppose $\alpha=1$ and $f_n$ is uniformly convergent.
Then we must have that there is a function $f$ such that:
$$\lim_{n\rightarrow\infty}\,\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\}=0$$
We have found that in this case $f_n(x)$ is at most $\frac 1 e$.
Could there be such a function $f$ that is also at most $\frac 1 e$? 🤔
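To make that edge case concrete, here is a small numerical illustration (a sketch, not part of the argument): for $\alpha=1$ the graph of $f_n$ is a bump of fixed height $\frac 1e$ whose peak sits at $x=\frac 1n$, so it never flattens out, it only slides towards $0$.

```python
import numpy as np

for n in (10, 100, 1000):
    x = np.linspace(0.0, 1.0, 1_000_001)
    fn = n * x * np.exp(-n * x)   # the case alpha = 1
    i = fn.argmax()
    # the peak location (~ 1/n) moves towards 0, but the peak height stays ~ 1/e
    print(n, x[i], fn[i])
```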
 
Klaas van Aarsen said:
Yes.
That is, it will suffice if that is the case.

Now how can we do that? 🤔

Just to clarify something... We have shown that $x=\frac{1}{n}$ is a critical point. We have the following:
$$f_n''(x)=-n^{\alpha+1}e^{-nx}-n\left (n^{\alpha}-n^{\alpha+1}x\right )e^{-nx}=\left (-n^{\alpha+1}-n^{\alpha+1}+n^{\alpha+2}x\right )e^{-nx}=\left (-2n^{\alpha+1}+n^{\alpha+2}x\right )e^{-nx} \\ f_n''\left(\frac{1}{n}\right )=\left (-2n^{\alpha+1}+n^{\alpha+1}\right )e^{-1}=-n^{\alpha+1}e^{-1}<0$$ That means that at $x=\frac{1}{n}$ the function $f_n(x)$ has a maximum.

So (by the sign of $f_n'$) for $x<\frac{1}{n}$ the function is increasing and for $x>\frac{1}{n}$ the function is decreasing.

So if $a<\frac{1}{n}$ we have that $f_n(a)<f_n\left (\frac{1}{n}\right )$, and if $a>\frac{1}{n}$ we also have $f_n(a)<f_n\left (\frac{1}{n}\right )$, but in that case $\frac{1}{n}$ is not in the interval $[a, +\infty)$, so the maximum is at the boundary $x=a$.

Is that correct so far? :unsure:
 
Yep. That looks correct. (Nod)
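For the record, that calculus can also be double-checked symbolically; a minimal sympy sketch (illustrative only, not part of the argument):

```python
import sympy as sp

n, x, alpha = sp.symbols('n x alpha', positive=True)
f = n**alpha * x * sp.exp(-n * x)

f1 = sp.diff(f, x)
f2 = sp.diff(f, x, 2)
print(sp.solve(sp.Eq(f1, 0), x))     # expect the single critical point [1/n]
print(sp.simplify(f2.subs(x, 1/n)))  # expect -n**(alpha + 1)*exp(-1), i.e. negative
print(sp.simplify(f.subs(x, 1/n)))   # expect the maximum value n**(alpha - 1)*exp(-1)
```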
 
Klaas van Aarsen said:
Yep. That looks correct. (Nod)

If $a<\frac{1}{n}$ the maximum is $f_n\left(\frac{1}{n}\right )=n^{\alpha-1}e^{-1}$.
We have that \begin{equation*}\lim_{n\rightarrow +\infty}f_n\left(\frac{1}{n}\right )=\lim_{n\rightarrow +\infty}n^{\alpha-1}e^{-1}=\lim_{n\rightarrow +\infty}\frac{n^{\alpha-1}}{e}\end{equation*}

If $a>\frac{1}{n}$ the maximum is $f_n(a)=n^{\alpha}a e^{-na}$.
We have that \begin{equation*}\lim_{n\rightarrow +\infty}f_n\left(a\right )=\lim_{n\rightarrow +\infty}n^{\alpha}a e^{-na}=\lim_{n\rightarrow +\infty}\frac{n^{\alpha}a}{e^{na}}\end{equation*}

To calculate the limit in each case, do we have to distinguish cases for $\alpha$?
 
We can ignore the case $a<\frac 1n$, since it suffices to check $n$ that are 'large enough'. 🧐

So we want to know what $f_n(a)$ does for increasing n that are large enough.
I don't think it is necessary to check cases for greek $\alpha$.
How about checking the behavior of $g(n)=f_n(a)$? 🤔
 
Klaas van Aarsen said:
We can ignore the case $a<\frac 1n$, since it suffices to check $n$ that are 'large enough'. 🧐

So we want to know what $f_n(a)$ does for increasing n that are large enough.
I don't think it is necessary to check cases for greek $\alpha$.
How about checking the behavior of $g(n)=f_n(a)$? 🤔

\begin{align*}g(n)&=f_n(a)=n^{\alpha}a e^{-na}=\frac{n^{\alpha}a}{ e^{na}} \\ g'(n)&=\frac{a \alpha n^{\alpha-1}e^{na}-n^{\alpha}a^2e^{na}}{e^{2na}}=\frac{a \alpha n^{\alpha-1}-n^{\alpha}a^2}{e^{na}} \\ g''(n)&=\frac{\left (a \alpha (\alpha-1) n^{\alpha-2}-\alpha n^{\alpha-1}a^2\right )e^{na}-a\left (a \alpha n^{\alpha-1}-n^{\alpha}a^2\right )e^{na}}{e^{2na}}\\ & =\frac{a \alpha (\alpha-1) n^{\alpha-2}-\alpha n^{\alpha-1}a^2-a^2 \alpha n^{\alpha-1}+n^{\alpha}a^3}{e^{na}}\\ & =\frac{a \alpha (\alpha-1) n^{\alpha-2}-2\alpha n^{\alpha-1}a^2+n^{\alpha}a^3}{e^{na}} \\ g'(n)&=0 \Rightarrow an^{\alpha-1} \left ( \alpha -na\right )=0 \Rightarrow \alpha -na=0 \Rightarrow n=\frac{\alpha}{a} \\ g'' \left (\frac{\alpha}{a}\right )&=\frac{a \alpha (\alpha-1) \left (\frac{\alpha}{a}\right )^{\alpha-2}-2\alpha \left (\frac{\alpha}{a}\right )^{\alpha-1}a^2+\left (\frac{\alpha}{a}\right )^{\alpha}a^3}{e^{\alpha}}\\ & =\frac{a \alpha (\alpha-1) \alpha^{\alpha-2}a^{-\alpha+2}-2\alpha \alpha^{\alpha-1}a^{-\alpha+1} a^2+\alpha^{\alpha}a^{-\alpha}a^3}{e^{\alpha}}=\frac{ (\alpha-1) \alpha^{\alpha-1}a^{-\alpha+3}-2 \alpha^{\alpha}a^{-\alpha+3} +\alpha^{\alpha}a^{-\alpha+3}}{e^{\alpha}}\\ & =\frac{ (\alpha-1) \alpha^{\alpha-1}a^{-\alpha+3}- \alpha^{\alpha}a^{-\alpha+3} }{e^{\alpha}}=\frac{ \alpha \alpha^{\alpha-1}a^{-\alpha+3}-\alpha^{\alpha-1}a^{-\alpha+3}- \alpha^{\alpha}a^{-\alpha+3} }{e^{\alpha}}\\ & =\frac{ \alpha^{\alpha}a^{-\alpha+3}-\alpha^{\alpha-1}a^{-\alpha+3}- \alpha^{\alpha}a^{-\alpha+3} }{e^{\alpha}}=\frac{ -\alpha^{\alpha-1}a^{-\alpha+3}}{e^{\alpha}}<0\end{align*} So $g_n$ has a maximum at $n=\frac{\alpha}{a}$.

Is that correct? How does this help us? I got stuck right now. :unsure:
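(As an aside, that long computation can be cross-checked symbolically; a minimal sympy sketch, illustrative only:)

```python
import sympy as sp

n, a, alpha = sp.symbols('n a alpha', positive=True)
g = n**alpha * a * sp.exp(-n * a)   # g(n) = f_n(a)

g1 = sp.diff(g, n)
print(sp.solve(sp.Eq(g1, 0), n))                          # expect the single critical point [alpha/a]
print(sp.simplify(sp.diff(g, n, 2).subs(n, alpha / a)))   # expect a negative expression
```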
 
mathmari said:
So $g_n$ has a maximum at $n=\frac{\alpha}{a}$.

Is that correct? How does this help us? I got stuck right now.

Looks correct.
Doesn't that mean that $g(n)$ is decreasing for sufficiently large $n$? 🤔
Its limit is $0$ isn't it?
So $\lim\limits_{n\to\infty} \sup \{|f_n(x)| : x\in [a,\infty)\} = 0$, isn't it?
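A quick numerical check of that last step, with illustrative values $a=0.5$ and $\alpha=2$ (so the critical point of $g$ is at $n=\frac{\alpha}{a}=4$); only a sketch, not part of the argument:

```python
import numpy as np

a, alpha = 0.5, 2.0                          # illustrative endpoint a > 0 and exponent alpha
g = lambda n: n**alpha * a * np.exp(-n * a)  # g(n) = f_n(a); for large n this is the sup of f_n on [a, oo)

for n in (1, 2, 4, 8, 16, 32, 64):
    print(n, g(n))
# g grows up to n = alpha/a = 4 and then decreases monotonically towards 0,
# so sup{ |f_n(x)| : x in [a, oo) } -> 0, i.e. the convergence on [a, oo) is uniform.
```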
 
Ahh, I got it! As for the last question:
Klaas van Aarsen said:
We have proven that if $\alpha<1$ then $f_n$ converges uniformly (to the zero function).
However, the proof that if $f_n$ converges uniformly, then $\alpha<1$, is still incomplete.
That is, we have the edge case $\alpha=1$ where $f_n$ might still be uniformly convergent. 🤔

So suppose $\alpha=1$ and $f_n$ is uniformly convergent.
Then we must have that there is a function $f$ such that:
$$\lim_{n\rightarrow\infty}\,\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\}=0$$
We have found that in this case $f_n(x)$ is at most $\frac 1 e$.
Could there be such a function $f$ that is also at most $\frac 1 e$? 🤔

In this case the maximum is $f_n\left (\frac{1}{n}\right )=\frac{n^{\alpha-1}}{e}$ and we have that $\displaystyle{\lim_{n\rightarrow +\infty}\frac{n^{\alpha-1}}{e}=\begin{cases} +\infty& \text{ if } \alpha-1>0 \\ \frac{1}{e} & \text{ if } \alpha-1=0 \\ 0 & \text{ if } \alpha-1<0 \end{cases}}$.

If $\alpha<1$ then $f_n$ converges uniformly (to the zero function).

If $f_n$ converges uniformly, then $\displaystyle{\lim_{n\rightarrow\infty}\,\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\}=0}$.
How do we continue from here with the cases for $\alpha$? I got stuck right now. :unsure:
 
mathmari said:
If $f_n$ converges uniformly, then $\displaystyle{\lim_{n\rightarrow\infty}\,\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\}=0}$.
How do we continue from here with the cases for $\alpha$? I got stuck right now.

We want to prove that uniform convergence of $f_n$ implies that $\alpha<1$.

Let's try a proof by contradiction. 🤔
Suppose that it doesn't. Then there must be an $\alpha \ge 1$ such that $f_n$ converges uniformly.

If $\alpha>1$, then $\sup f_n(x)\to\infty$, so whatever we pick for $f$, we won't have that $\displaystyle{\lim_{n\rightarrow\infty}\,\sup\{\,\left|f_n(x)-f(x)\right|: x \in [0,\infty) \,\}=0}$.
Therefore $\alpha=1$.
For $\alpha=1$ we have that $f_n(x)$ has a maximum of $\frac 1e$.
So $f$ must also have a maximum of $\frac 1e$.
Suppose $f(x)=\frac 1e$ for some $x=x_0$.
We have seen that if $x_0>0$, then $f_n(x_0)\to 0$, so $|f_n(x_0)-f(x_0)| = |f_n(x_0)-\frac 1e| \to \frac 1e \ne 0$, which contradicts that $f_n$ converges uniformly.
Consequently $x_0=0$.
But then $|f_n(x_0)-f(x_0)| =|0-\frac 1e|=\frac 1e\ne 0$, which is again a contradiction.

Therefore $\alpha<1$, which completes the proof. :geek:
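Putting the pieces together (and using that a uniform limit would have to coincide with the pointwise limit $0$), the whole 'iff' condenses into one line:
$$f_n \to 0 \text{ uniformly on } [0,+\infty) \iff \sup_{x\ge 0}\left|f_n(x)\right| = \frac{n^{\alpha-1}}{e} \xrightarrow[n\to\infty]{} 0 \iff \alpha < 1.$$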
 
