Convergence of Sequences in [0,1]

In summary: the thread works out both pointwise and uniform convergence on [0,1] for three sequences of functions, first by identifying the pointwise limits and then by bounding the worst-case difference ##|s_n(x) - s(x)|## over the whole interval.
  • #1

Homework Statement



Determine the convergence, both pointwise and uniform, on [0,1] for the following sequences:

(i) ##s_n(x) = n^2x^2(1 - \cos(\frac{1}{nx})), \ x ≠ 0; \quad s_n(0) = 0##
(ii) ##s_n(x) = \frac{nx}{x+n}##
(iii) ##s_n(x) = n\sin(\frac{x}{n})##

Homework Equations



##s_n(x) → s(x)## as ##n→∞##

The Attempt at a Solution



(i) So for this one, I can split it into a piecewise function since ##s_n(x)## has been defined at the origin.

Taking the limit as n → ∞, I get a limit function that takes two values: s(0) = 0 and s(x) = 1/2 for x ≠ 0.
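(For fixed ##x ≠ 0##, the value 1/2 can be seen from the small-angle expansion ##1 - \cos t = \frac{t^2}{2} + O(t^4)## with ##t = \frac{1}{nx}##: ##n^2x^2(1 - \cos\frac{1}{nx}) = \frac{1}{2} + O(\frac{1}{n^2x^2}) → \frac{1}{2}## as ##n → ∞##, while ##s_n(0) = 0## for every n, so the limit function is discontinuous at 0.)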

Therefore ##s_n(x)## converges pointwise, but not uniformly on [0,1].

(ii) As n → ∞ ##s_n(x) → x = s(x)## for all x in [0,1]. Hence ##s_n(x)## converges pointwise AND uniformly on [0,1].

(iii) Exact same limit as (ii) so the same answer will follow.
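(Similarly, for fixed ##x ≠ 0## the limit in (iii) follows from ##\lim_{t → 0} \frac{\sin t}{t} = 1##, since ##n\sin\frac{x}{n} = x \cdot \frac{\sin(x/n)}{x/n} → x## as ##n → ∞##, and ##s_n(0) = 0 → 0## as well.)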

Do these look okay? It seems way too straightforward so it has me a bit worried.

If anyone could confirm it would be great :). Thanks.
 
  • #2

Yes, the conclusions about convergence are correct. But I don't see any reason why you think the convergence is uniform. A sequence of continuous functions can converge to a continuous function in a way that is not uniform. You need a better reason than just that the limit is continuous.
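(A standard illustration of that point, for reference: ##f_n(x) = nx(1-x)^n## is continuous on [0,1] and converges pointwise to the continuous limit 0, yet ##f_n(\frac{1}{n+1}) = (\frac{n}{n+1})^{n+1} → \frac{1}{e}##, so the convergence is not uniform.)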
 
Last edited:
  • #3
Dick said:
Yes, the conclusions about convergence are correct. But I don't see any reason why you think the convergence is uniform. A sequence of continuous functions can converge to a continuous function in a way that is not uniform.

As in for the second and third ones? I think I see what you mean.


(ii)

##s_n(x) = 0 → 0## as ##n→∞## for ##x = 0##.
##s_n(x) = n/(n+1) → 1## as ##n→∞## for ##x = 1##.

So we get only pointwise convergence for (ii) and not uniform for all x in [0,1].

(iii)

##s_n(x) = 0 → 0## as ##n→∞## for ##x = 0##.
##s_n(x) = n\sin(1/n) → 1## as ##n→∞## for ##x = 1##.

So as is in (ii) we only get pointwise and not uniform for all x in the interval [0,1].
 
  • #4

No, that's not what I meant at all. You were correct that they are uniformly convergent. You just didn't give a good reason (or any reason) why.
 
  • #5
Dick said:
No, that's not what I meant at all. You were correct that they are uniformly convergent. You just didn't give a good reason (or any reason) why.

Are you implying I need to go all the way back to the epsilon definition for this?
 
  • #6
Zondrina said:
Are you implying I need to go all the way back to the epsilon definition for this?

No, I wouldn't go back that far. Don't you have any other theorems that will show that convergence is uniform? Here's a hint. In both of those cases ##s_n(x)## is differentiable and the derivative is bounded on [0,1] independent of n. Could that be useful?
 
  • #7
Dick said:
No, I wouldn't go back that far. Don't you have any other theorems that will show that convergence is uniform? Here's a hint. In both of those cases ##s_n(x)## is differentiable and the derivative is bounded on [0,1] independent of n. Could that be useful?

Uhm I think I have this theorem I just found :

Suppose ##\{s_n'(x)\}## converges uniformly on an interval I.

If each ##s_n'(x)## is continuous and ##\{s_n(x_0)\}## converges for some ##x_0 \in I##, then ##\{s_n(x)\}##
converges uniformly on I to s(x) and :

##\lim_{n→∞} s_n'(x) = s'(x)##

EDIT : So ##s_n'(x) = \frac{n^2}{(n+x)^2}##

So ##s_n'(x)## is continuous for all x in the interval.

If we choose x = 0, we see the sequence will converge to 1.

My only question is about the hypothesis "Suppose {s_n'(x)} converges uniformly on an interval I". Would I say: as n goes to infinity, {s_n'(x)} goes to 1, which also happens to be the derivative of x, just like the theorem is describing?
 
Last edited:
  • #8
Zondrina said:
Uhm I think I have this theorem I just found :

Suppose ##\{s_n'(x)\}## converges uniformly on an interval I.

If each ##s_n'(x)## is continuous and ##\{s_n(x_0)\}## converges for some ##x_0 \in I##, then ##\{s_n(x)\}##
converges uniformly on I to s(x) and :

##\lim_{n→∞} s_n'(x) = s'(x)##

Well, that just displaces the problem from showing ##s_n(x)## converges uniformly to showing ##s_n'(x)## converges uniformly. If you can do that then go ahead. I was thinking of something more like: if ##|s_n'(x)|<M## for some constant M and ##s_n## converges pointwise, then it converges uniformly. If you can't find a theorem like that then think how you might prove it yourself. [0,1] is compact.
 
  • #9
Dick said:
Well, that just displaces the problem from showing ##s_n(x)## converges uniformly to showing ##s_n'(x)## converges uniformly. If you can do that then go ahead. I was thinking of something more like: if ##|s_n'(x)|<M## for some constant M and ##s_n## converges pointwise, then it converges uniformly. If you can't find a theorem like that then think how you might prove it yourself. [0,1] is compact.

Wait. So you're saying that if I can bound all the positive terms of my derivative by some constant M_n, then if I have just pointwise convergence of s_n, then it converges uniformly.

##|s_n'(x)| = \frac{n^2}{(n+x)^2} ≤ \frac{n^2}{n^2} = 1 = M_n##

Since we already have shown s_n(x) is pointwise convergent, then because its derivative is bounded we can conclude that s_n(x) is uniformly convergent.
 
  • #10
Zondrina said:
Wait. So you're saying that if I can bound all the positive terms of my derivative by some constant M_n, then if I have just pointwise convergence of s_n, then it converges uniformly.

##|s_n'(x)| = \frac{n^2}{(n+x)^2} ≤ \frac{n^2}{n^2} = 1 = M_n##

Since we already have shown s_n(x) is pointwise convergent, then because its derivative is bounded we can conclude that s_n(x) is uniformly convergent.

No, no subscript n. You've shown ##|s_n'(x)|## is bounded by 1. Independent of n. That's important. Showing each ##|s_n'(x)|## is bounded doesn't help. They all have to be bounded by the same constant. And they are, as you've shown. Can you say how this, together with pointwise convergence of ##s_n(x)##, would show that ##s_n(x)## converges uniformly? Think compactness of [0,1] and epsilon. Wish I could think of the name of a theorem to refer to.
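One way to cash out that hint, just a sketch of a standard argument (not necessarily the intended theorem): by the mean value theorem the bound ##|s_n'(x)| ≤ 1## gives ##|s_n(x) - s_n(y)| ≤ |x - y|## for every n, and letting n → ∞ also ##|s(x) - s(y)| ≤ |x - y|##. Given ##ε > 0##, pick finitely many points ##0 = x_0 < x_1 < \dots < x_k = 1## with spacing less than ##ε/3##, and then N so large that ##|s_n(x_i) - s(x_i)| < ε/3## for every i whenever ##n > N## (possible because there are only finitely many ##x_i##). For an arbitrary ##x \in [0,1]##, taking the nearest ##x_i##,

##|s_n(x) - s(x)| ≤ |s_n(x) - s_n(x_i)| + |s_n(x_i) - s(x_i)| + |s(x_i) - s(x)| < ε/3 + ε/3 + ε/3 = ε,##

independently of x, which is exactly uniform convergence.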
 
  • #11
Dick said:
No, no subscript n. You've shown ##|s_n'(x)|## is bounded by 1. Independent of n. That's important. Showing each ##|s_n'(x)|## is bounded doesn't help. They all have to be bounded by the same constant. And they are, as you've shown. Can you say how this, together with pointwise convergence of ##s_n(x)##, would show that ##s_n(x)## converges uniformly? Think compactness of [0,1] and epsilon. Wish I could think of the name of a theorem to refer to.

Hmm I just read about what a compact space is, but sadly I don't think I'm allowed to use any of this ( the thing about the limit points made sense ).

I still think I was correct in an earlier post in saying this ( Slightly modified now ) :

For ##x = 0, \space s_n(x) = 0 → 0## as ##n→∞##.
For ##x ≠ 0, \space s_n(x) = nx/(n+x) → x## as ##n→∞##.

So we have two values for the limiting function s(x) depending on x.

I don't think this is incorrect at all and would result in pointwise, but not uniform convergence.

Differentiating, I would get :

For x = 0. ##s_n'(x) = 0 → 0## as n→∞.
For x ≠ 0. ##s_n'(x) = \frac{n^2}{(n+x)^2} → 1## as n→∞.

So the convergence is uniform.

Since ##\{s_n(0)\}## converges, we have one point where ##\{s_n(x)\}## converges and we know that ##\{s_n'(x)\}## converges uniformly so it follows that ##\{s_n(x)\}## converges uniformly.
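(For what it's worth, the hypothesis that ##\{s_n'(x)\}## converges uniformly on [0,1] can be checked with a direct bound rather than just the pointwise limits: ##|\frac{n^2}{(n+x)^2} - 1| = \frac{2nx + x^2}{(n+x)^2} ≤ \frac{2n+1}{n^2}## for ##x \in [0,1]##, and the right-hand side tends to 0 independently of x.)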
 
Last edited:
  • #12
Zondrina said:
Hmm I just read about what a compact space is, but sadly I don't think I'm allowed to use any of this ( the thing about the limit points made sense ).

I still think I was correct in an earlier post in saying this ( Slightly modified now ) :

For ##x = 0, \space s_n(x) = 0 → 0## as ##n→∞##.
For ##x ≠ 0, \space s_n(x) = nx/(n+x) → 1## as ##n→∞##.

I don't think this is incorrect at all and would result in pointwise, but not uniform convergence.

That. Is. Completely. Incorrect. The convergence is uniform. It's late and I'm running out of ways to convince you. But nx/(n+x) does not converge to 1. It converges to x. You had that part right to begin with. What changed your mind?
 
  • #13
Dick said:
That. Is. Completely. Incorrect. The convergence is uniform. It's late and I'm running out of ways to convince you. But nx/(n+x) does not converge to 1. It converges to x. You had that part right to begin with. What changed your mind?

I had a moment of doubting myself for some reason. I added a bunch to the prior post that makes more sense than what I had before.
 
  • #14
Zondrina said:
I had a moment of doubting myself for some reason. I added a bunch to the prior post that makes more sense than what I had before.

You just don't seem to be grappling with what uniform means. I just had a great idea. Why don't you graph s_n(x)=nx/(x+n) for some values of n, like 1, 10, 100. Does that give you some idea how you could directly prove the convergence is uniform (by epsilon ideas - not theorems)?
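A minimal Python sketch of that suggestion (assuming numpy and matplotlib are available), just to visualize how the graphs hug s(x) = x more and more tightly as n grows:

Python:
import numpy as np
import matplotlib.pyplot as plt

# Plot s_n(x) = n*x/(x + n) for a few n and compare with the limit s(x) = x.
x = np.linspace(0.0, 1.0, 400)
for n in (1, 10, 100):
    plt.plot(x, n * x / (x + n), label=f"n = {n}")
plt.plot(x, x, "k--", label="s(x) = x")   # the pointwise limit
plt.xlabel("x")
plt.ylabel("s_n(x)")
plt.legend()
plt.show()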
 
  • #15
Dick said:
You just don't seem to be grappling with what uniform means. I just had a great idea. Why don't you graph s_n(x)=nx/(x+n) for some values of n, like 1, 10, 100. Does that give you some idea how you could directly prove the convergence is uniform (by epsilon ideas - not theorems)?

Going back to epsilons would probably make me feel more comfortable anyway.

##\forall ε > 0, \exists N \space | \space n>N \Rightarrow |s_n(x) - s(x)| < ε, \forall x \in [0,1]##

##|s_n(x) - s(x)| = |\frac{nx}{n+x} - x| = \frac{x^2}{n+x} ≤ \frac{1}{n+1}##

So choosing ##n > \frac{1}{ε} - 1 \Rightarrow |s_n(x) - s(x)| < ε, \forall x \in [0,1]##

Hence (ii) converges both pointwise and uniformly.
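A quick numerical sanity check of that bound (a minimal sketch in Python, assuming numpy; a fine grid stands in for the supremum over [0,1]):

Python:
import numpy as np

# For s_n(x) = n*x/(x + n) on [0,1], the worst-case error |s_n(x) - x| = x^2/(n + x)
# should be attained at x = 1 and equal 1/(n + 1).
x = np.linspace(0.0, 1.0, 10001)
for n in (1, 10, 100, 1000):
    err = np.abs(n * x / (x + n) - x)          # |s_n(x) - s(x)| on the grid
    print(n, err.max(), 1.0 / (n + 1))         # grid maximum vs. the bound 1/(n+1)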
 
  • #16

That's the idea.
 
  • #17
Dick said:
That's the idea.

See, I understand when I go back to the epsilon definition for some reason, but the theorems he's provided for us are not sufficient for this problem I think.

(iii)

##\forall ε>0, \exists N \space | \space n>N \Rightarrow |s_n(x) - s(x)| < ε, \forall x \in [0,1]##

##|s_n(x) - s(x)| = |n\sin(x/n) - x| ≤ n\sin(1/n) - 1##

Having a small hitch with the arithmetic here. I can't isolate n in terms of epsilon.

EDIT : Wait, what if :

##|s_n(x) - s(x)| = |n\sin(x/n) - x| ≤ |n||\sin(x/n)| + |x| ≤ n + 1##

So choosing ##n > ε -1 \Rightarrow |s_n(x) - s(x)| < ε, \forall x \in [0,1]##
 
  • #18

|x| isn't very small on [0,1]. That's not going to work. Look at the function x-n*sin(x/n). Is it increasing or decreasing on [0,1]?
 
  • #19
Dick said:
|x| isn't very small on [0,1]. That's not going to work. Look at the function x-n*sin(x/n). Is it increasing or decreasing on [0,1]?

Wouldn't it be increasing? Its derivative is 1 - cos(x/n).

##2nπ## seems to be relevant here as it is the root of the derivative.
 
Last edited:
  • #20
Zondrina said:
Wouldn't it be increasing? Its derivative is 1 - cos(x/n).

##2nπ## seems to be relevant here as it is the root of the derivative.

Sure. It's increasing. It's zero at x=0 and increases to 1-n*sin(1/n) at x=1. Does that give you any ideas about how to pick an N corresponding to an epsilon?
 
Last edited:
  • #21
Dick said:
Sure. It's increasing. It's zero at x=0 and increases to 1-n*sin(1/n) at x=1. Does that give you any ideas about how to pick an N corresponding to an epsilon?

No, I've looked at this for awhile and it doesn't seem obvious.

My question is why consider x-n*sin(x/n)? All that does is consider s(x) - s_n(x) instead of the other way around.

What did I do wrong in the edit portion of my post a few posts ago? I applied the triangle inequality and then saw n+1 as an upper bound since |sin(x)| ≤ 1 and |x| = x since x is positive on [0,1] as is n.
 
  • #22
Zondrina said:
No, I've looked at this for awhile and it doesn't seem obvious.

My question is why consider x-n*sin(x/n)? All that does is consider s(x) - s_n(x) instead of the other way around.

What did I do wrong in the edit portion of my post a few posts ago? I applied the triangle inequality and then saw n+1 as an upper bound since |sin(x)| ≤ 1 and |x| = x since x is positive on [0,1] as is n.

The triangle inequality is true, but it's not useful. |x-n*sin(x/n)| gets small as n increases. |x|+|n*sin(x/n)| doesn't. It approaches 2*|x| which you are not going to be able to make less than epsilon. By showing it's increasing you've shown |x-n*sin(x/n)|<1-n*sin(1/n) for all x in [0,1]. What's the limit of the right side of that inequality?
 
  • #23
Dick said:
The triangle inequality is true, but it's not useful. |x-n*sin(x/n)| gets small as n increases. |x|+|n*sin(x/n)| doesn't. It approaches 2*|x| which you are not going to be able to make less than epsilon. By showing it's increasing you've shown |x-n*sin(x/n)|<1-n*sin(1/n) for all x in [0,1]. What's the limit of the right side of that inequality?

##|x-n\sin(x/n)| ≤ 1-n\sin(1/n), \forall x \in [0,1]##

The limit of the right side is 0 as n goes to infinity.
 
  • #24
Zondrina said:
##|x-n\sin(x/n)| ≤ 1-n\sin(1/n), \forall x \in [0,1]##

The limit of the right side is 0 as n goes to infinity.

So? Conclusion?
 
  • #25
Dick said:
So? Conclusion?

Er..

I've shown ##|s(x) - s_n(x)|## is increasing and that it's bounded by ##1-n\sin(1/n)## for all x in [0,1].

Wait, that means it's uniformly convergent, because I've shown that if I take the limiting function s(x) and take away s_n(x) I get zero as n goes off to infinity, so that ##\lim_{n→∞} s_n(x) = s(x)## for all x in [0,1].
 
  • #26
Zondrina said:
Er..

I've shown ##|s(x) - s_n(x)|## is increasing and that it's bounded by ##1-n\sin(1/n)## for all x in [0,1].

Wait, that means it's uniformly convergent, because I've shown that if I take the limiting function s(x) and take away s_n(x) I get zero as n goes off to infinity, so that ##\lim_{n→∞} s_n(x) = s(x)## for all x in [0,1].

Yes, you've shown that the difference ##|s(x) - s_n(x)|## on [0,1] is uniformly bounded by the difference at x=1. And that goes to zero. This is a good way to prove a lot of uniform convergence problems. Find the point where the difference is a maximum and then show that goes to zero.
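The same kind of numerical sanity check works here (a minimal Python sketch, assuming numpy; a fine grid stands in for the maximum over [0,1]):

Python:
import numpy as np

# For s_n(x) = n*sin(x/n) on [0,1], the maximum of |x - n*sin(x/n)| should sit at
# x = 1 and equal 1 - n*sin(1/n), which tends to 0 as n grows.
x = np.linspace(0.0, 1.0, 10001)
for n in (1, 10, 100, 1000):
    err = np.abs(x - n * np.sin(x / n))        # |s(x) - s_n(x)| on the grid
    print(n, err.max(), 1.0 - n * np.sin(1.0 / n))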
 
  • #27
Dick said:
Yes, you've shown that the difference ##|s(x) - s_n(x)|## on [0,1] is uniformly bounded by the difference at x=1. And that goes to zero. This is a good way to prove a lot of uniform convergence problems. Find the point where the difference is a maximum and then show that goes to zero.

So if checking out ##|s_n(x) - s(x)|## doesn't work, I should switch it up to ##|s(x) - s_n(x)|## and consider things at their maximum so I can get a bound. Since the difference will be bounded by 0 in this case it results in uniform convergence.

Could I ask you then, what If I got the right side limit equal to one? Does that mean one of the terms in the sequence doesn't go to zero? So the difference is bounded by 1 and so we wouldn't have uniform convergence ( At least I think ).
 
  • #28
Zondrina said:
So if checking out ##|s_n(x) - s(x)|## doesn't work, I should switch it up to ##|s(x) - s_n(x)|## and consider things at their maximum so I can get a bound. Since the difference will be bounded by 0 in this case it results in uniform convergence.

Could I ask you then, what If I got the right side limit equal to one? Does that mean one of the terms in the sequence doesn't go to zero? So the difference is bounded by 1 and so we wouldn't have uniform convergence ( At least I think ).

You are focussing on completely the wrong things here. ##|s_n(x) - s(x)|## and ##|s(x) - s_n(x)|## are EXACTLY the same thing. I just wrote x-n*sin(x/n) because it's positive. I could have written it the other way and had you show it's decreasing. Same difference. I didn't change anything by doing that. If the right side difference were approaching 1 then you wouldn't even have pointwise convergence, so there would be no reason to even ask about uniform convergence. Right? What I wanted you to carry away is what I said in the last post. If you can find a point where the difference is a maximum (which is what we did here) then answering the uniform question is easy.
 
Last edited:
  • #29
Dick said:
You are focussing on completely the wrong things here. ##|s_n(x) - s(x)|## and ##|s(x) - s_n(x)|## are EXACTLY the same thing. I just wrote x-n*sin(x/n) because it's positive. I could have written it the other way and had you show it's decreasing. Same difference. I didn't change anything by doing that. If the right side difference were approaching 1 then you wouldn't even have pointwise convergence, so there would be no reason to even ask about uniform convergence. Right? What I wanted you to carry away is what I said in the last post. If you can find a point where the difference is a maximum (which is what we did here) then answering the uniform question is easy.

Yes, thank you very much for the information. This clears everything up for me.

So just for completeness. I'll do ##|s_n(x) - s(x)|## which I now know is decreasing, so choosing x=1, I can bound it by ##n\sin(1/n) - 1## which goes to zero as n goes to infinity so the same result would follow.
 
  • #30
Zondrina said:
Yes, thank you very much for the information. This clears everything up for me.

So just for completeness. I'll do ##|s_n(x) - s(x)|## which I now know is decreasing, so choosing x=1, I can bound it by ##n\sin(1/n) - 1## which goes to zero as n goes to infinity so the same result would follow.

Well, yes. Just interchanging the order just interchanges increasing/decreasing and signs. It doesn't make any real difference to the problem. But ##|s_n(x) - s(x)|## is still increasing as a function of x on [0,1]. Not decreasing. It's an absolute value. I'm still not sure you are clear on what's important and what is not.
 
  • #31
Dick said:
Well, yes. Just interchanging the order just interchanges increasing/decreasing and signs. It doesn't make any real difference to the problem. But ##|s_n(x) - s(x)|## is still increasing as a function of x on [0,1]. Not decreasing. It's an absolute value. I'm still not sure you are clear on what's important and what is not.

Oh, so it doesn't make any difference at all. I see now.

It's just that |a-b| = |b-a| looked wrong to me for a moment (can't even explain why). Now that that's clear I think I'm good.
 
  • #32
Zondrina said:
Oh, so it doesn't make any difference at all. I see now.

It's just that |a-b| = |b-a| looked wrong to me for a moment (can't even explain why). Now that that's clear I think I'm good.

That is great!
 

1. What is the definition of convergence of sequences in [0,1]?

The convergence of a sequence in [0,1] refers to the behavior of a sequence of numbers in [0,1] that approaches a specific value within the interval: as the sequence progresses, the terms get arbitrarily close to that number.

2. How do you determine if a sequence in [0,1] converges or diverges?

To determine if a sequence in [0,1] converges or diverges, you can use the definition of convergence which states that a sequence converges if and only if for any positive number ε, there exists a positive integer N such that for all n ≥ N, the absolute value of the difference between the nth term of the sequence and the limiting value is less than ε. If this condition is not met, then the sequence diverges.

3. Can a sequence in [0,1] converge to a value outside of the interval?

No, a convergent sequence of points in [0,1] must have its limit in [0,1]. This is because [0,1] is a closed interval: if the limit L lay outside [0,1], then eventually every term would be closer to L than the distance from L to the interval, and hence outside [0,1], which is impossible.

4. What is the significance of convergence of sequences in [0,1] in real-world applications?

The convergence of sequences in [0,1] has many real-world applications, particularly in fields such as engineering, physics, and computer science. It is used to model and analyze various phenomena, such as the behavior of electrical circuits, the motion of objects, and the performance of algorithms. It is also essential in numerical analysis and optimization, where approximations and iterative methods rely on the convergence of sequences.

5. Are there any special types of convergence in sequences in [0,1]?

Yes, there are several types of convergence for sequences of functions on [0,1], including pointwise convergence, uniform convergence, and almost everywhere convergence. Pointwise convergence means the sequence converges at each individual point of [0,1]. Uniform convergence is stronger: the worst-case difference ##\sup_{x \in [0,1]} |s_n(x) - s(x)|## must tend to zero, so a single N works for every x at once. Almost everywhere convergence is a weaker notion that allows exceptions on a set of points of measure zero.
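For example, sequence (i) in the thread above converges pointwise to a function that is 0 at x = 0 and 1/2 elsewhere, which is discontinuous, so the convergence cannot be uniform; sequence (ii) satisfies ##\sup_{x\in[0,1]}|s_n(x) - x| = \frac{1}{n+1} → 0## and therefore converges uniformly.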
