Find which initial conditions lead to convergence

In summary, the set ##B## is defined as the set of real numbers ##b_1## for which the sequence given by ##b_{n+1} = \frac{1+b_n^2}{2}## converges as ##n## approaches infinity. It can be shown that ##B## equals the interval ##[-1,1]##: the sequence is increasing for ##b_1 \neq \pm 1##, it is bounded above by ##1## exactly when ##b_1 \in (-1,1)##, and for ##b_1 = \pm 1## it is constant equal to ##1## from the second term on.
  • #1

Homework Statement


Let ##b_1\in \mathbb{R}## be given and for ##n=1,2,\dots## let $$b_{n+1} := \frac{1+b_n^2}{2}.$$ Define the set $$B := \{b_1\in\mathbb{R} \mid \lim_{n\to\infty}b_n \text{ exists}\}.$$

Identify the set ##B##.

Homework Equations




The Attempt at a Solution


I claim that ##B = [-1,1]##.

First, we note that for all ##b_1 \neq \pm 1## the sequence is increasing. The base case holds because if ##b_1 \neq 1## then ##(b_1-1)^2 > 0##, i.e. ##\displaystyle \frac{1+b_1^2}{2} > b_1##. Suppose that for some ##k## we have ##b_{k} \ge b_{k-1}##. Then $$b_{k+1} - b_k = \frac{1+b_{k}^2}{2} - \frac{1+b_{k-1}^2}{2}= \frac{(b_k+b_{k-1})(b_k-b_{k-1})}{2},$$ and the latter expression is nonnegative: ##b_k - b_{k-1} \ge 0## by the inductive hypothesis, and ##b_k + b_{k-1} = \frac{1+b_{k-1}^2}{2} + b_{k-1} = \frac{(1+b_{k-1})^2}{2} \ge 0##.
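For a concrete illustration (example values of my own, not from the problem statement): with ##b_1 = 0## the first few terms are ##b_2 = \tfrac{1}{2}##, ##b_3 = \tfrac{5}{8}##, ##b_4 = \tfrac{89}{128}##, so the successive differences ##\tfrac{1}{2}, \tfrac{1}{8}, \tfrac{9}{128}## are indeed positive and the terms creep up toward ##1##.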

If ##b_1 \in (-\infty, -1) \cup (1,\infty)##, then the sequence diverges to ##\infty##. Indeed, it is increasing, so it either converges or tends to ##\infty##; if it converged, the limit ##L## would satisfy ##L = \frac{1+L^2}{2}##, i.e. ##L = 1##, which is impossible because ##b_2 = \frac{1+b_1^2}{2} > 1## and the terms only increase from there. So suppose that ##b_1\in (-1,1)##. Then, by induction, ##b_k \in (-1,1)## for every ##k##, since ##|b_k| < 1## implies ##\frac{1}{2} \le b_{k+1} = \frac{1+b_k^2}{2} < \frac{1+1}{2} = 1##; in particular the sequence is bounded above by ##1##. So by the monotone convergence theorem, the sequence converges in this case. Also, if ##b_1=\pm 1##, then ##b_n = 1## for all ##n>1##, and so the sequence converges to ##1##. Hence, ##B = [-1,1]##.
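As a quick numerical sanity check (not part of the proof), here is a minimal Python sketch; the function name stays_bounded, the iteration count and the bound are arbitrary illustrative choices. It leans on the monotonicity argument above: a monotone sequence converges if and only if it is bounded, so boundedness of the iterates serves as a proxy for membership in ##B##. For ##b_1## just outside ##[-1,1]## the escape is very slow, so far more iterations would be needed near the boundary.

Code:
def stays_bounded(b1, n_iter=1000, bound=10.0):
    """Iterate b_{n+1} = (1 + b_n^2) / 2 from b1 and report whether the
    iterates stay below `bound` for n_iter steps (illustrative cutoffs)."""
    b = b1
    for _ in range(n_iter):
        b = (1.0 + b * b) / 2.0
        if b > bound:
            return False  # the increasing sequence has escaped: divergence
    return True

# sample starting values on both sides of the claimed boundary of B = [-1, 1]
for b1 in (-1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5):
    print(f"b1 = {b1:5.2f}  stays bounded: {stays_bounded(b1)}")
# expected: unbounded for -1.5 and 1.5, bounded for the values in [-1, 1]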
 
  • #2
I think a direct proof instead of an induction is shorter: ##b_{n+1}=\dfrac{1+b_n^2}{2}>b_n \Longleftrightarrow (b_n-1)^2>0## and done.
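Written out in full, the difference is $$b_{n+1} - b_n = \frac{1+b_n^2}{2} - b_n = \frac{b_n^2 - 2b_n + 1}{2} = \frac{(b_n-1)^2}{2} \ge 0,$$ which is zero only when ##b_n = 1##.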
The rest is o.k., except for the typo ##b_1 \in (-\infty, 1) \cup (1,\infty)\longrightarrow b_1 \in (-\infty, -1) \cup (1,\infty)##
 
