# Homework Help: Is sequence 1/(n-1) necessarily bounded?

1. Oct 11, 2014

### Axel Harper

1. The problem statement, all variables and given/known data
Courant states that a convergent sequence is necessarily bounded; that is, for all n, the absolute value of the term $a_n$ is less than or equal to some number M. My question is: does this apply to the sequence given by $a_n = 1/(n-1)$?

2. Relevant equations
As n approaches infinity, $a_n$ approaches zero, so the sequence converges.

3. The attempt at a solution
At n = 1, 1/(n-1) is larger than any number M, which suggests the sequence is not bounded. Should we only consider the terms for n > 1, in which case the sequence would be bounded by 1 and would still converge to 0?

2. Oct 11, 2014

### Staff: Mentor

Since a1 is undefined, the sequence makes sense only for n >= 2.

3. Oct 11, 2014

### vela

Staff Emeritus
A sequence is a function. Rather than writing f(n), however, we use the convenient notation $a_n$. In the example you've given, n=1 isn't in the domain of the function, so there is no $a_1$ in that sequence. So, yes, you only consider the terms for n>1.

Also, I wouldn't say that at n=1, 1/(n-1) is larger than any number M because 1/(n-1) is not defined for n=1. If it's not defined, it doesn't make sense to talk about whether it's larger or smaller than some real number M.

What you're thinking of is the real function $f: \mathbb{R}\setminus\{1\} \to \mathbb{R}$ where f(x) = 1/(x-1). As before, f(1) isn't defined, and it doesn't make sense to say f(1) is smaller or bigger than some number. Consider the fact that as x approaches 1, the function diverges to $\pm\infty$ depending on which side you approach x=1 from; so is f(1) really big or really small?
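A quick numerical sketch of that last point (just an illustration, not part of the proof): evaluating f(x) = 1/(x-1) at points slightly below and slightly above x = 1 shows the two one-sided behaviors pulling in opposite directions, which is why no single value could sensibly be assigned to f(1).

```python
# Evaluate f(x) = 1/(x - 1) just to the left and right of x = 1.
# The left-hand values run off to -infinity, the right-hand ones to +infinity.
def f(x):
    return 1.0 / (x - 1.0)

for eps in (1e-2, 1e-4, 1e-6):
    print(f"f(1 - {eps}) = {f(1 - eps):+.1e}   f(1 + {eps}) = {f(1 + eps):+.1e}")
```

The printed magnitudes grow without bound as eps shrinks, while the signs on the two sides stay opposite.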

4. Oct 11, 2014

### Fredrik

Staff Emeritus
The sequence $\big(\frac{1}{n-1}\big)_{n=2}^\infty$ is convergent, and its limit is 0, as you said. But there's no sequence $\big(\frac{1}{n-1}\big)_{n=1}^\infty$, because $\frac 1 0$ is not larger than any number. It's just undefined.

A convergent sequence is bounded because "convergent" means that for all $\varepsilon>0$ there's a number x such that all but a finite number of terms are in the interval $(x-\varepsilon,x+\varepsilon)$. This makes it very easy to see that there's an $r>0$ such that all terms are in the interval $(x-r,x+r)$: take $\varepsilon=1$, say, and then choose r larger than 1 and larger than $|a_k-x|$ for each of the finitely many exceptional terms $a_k$.
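For the concrete sequence in the question, this is easy to check numerically (an illustration, not a proof): with the index starting at n = 2, every term $\frac{1}{n-1}$ satisfies $|a_n| \le 1$, and the terms approach the limit 0.

```python
# The sequence a_n = 1/(n - 1), defined for n >= 2, is bounded by M = 1
# and converges to 0.
def a(n):
    return 1.0 / (n - 1)

terms = [a(n) for n in range(2, 1002)]
M = 1.0
print(all(abs(t) <= M for t in terms))  # → True: every term is within the bound
print(terms[-1])                        # → 0.001: late terms are close to 0
```

Of course no finite check can replace the $\varepsilon$ argument above; it just shows the bound M = 1 at work on the actual terms.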

Edit: I wrote this about 45 minutes ago, but got distracted by something before I could submit it.