MHB Fixed Point, Jacobi & Newton Methods, Linear Systems

AI Thread Summary
The discussion revolves around fixed points, the Jacobi method, and Newton's method for solving equations and linear systems. It establishes that the only fixed point of the function g(x) = x - x^3 is x* = 0, and discusses how to show that a sequence defined by x_{k+1} = g(x_k) converges to this point when started from an initial value in (-1, 0). The Jacobi method is analyzed for two orderings of a linear system: it converges for the first, while for the second the spectral radius of the Jacobi iteration matrix exceeds 1, so convergence fails. Additionally, the convergence of Newton's method is examined for the sequence generated from a specific starting point, with the thread ultimately questioning whether the findings about convergence and the limit are complete. The thread highlights the importance of convergence conditions for iterative methods.
mathmari
Hey! :giggle:

Question 1 :
Let $g(x)=x-x^3$. The point $x=0$ is a fixed point for $g$. Show that if $x^{\star}$ is a fixed point of $g$, i.e. $g(x^{\star})=x^{\star}$, then $x^{\star}=0$. If $(x_k)$ is the sequence defined by $x_{k+1}=g(x_k)$, $k=0,1,2,\ldots$, show that if $0>x_0>-1$ then $(x_k)$ is increasing and converges to $0$.

My solution :
$g(x)=x-x^3$
$g(0)=0$
$g(x^{\star})=x^{\star} \Rightarrow x^{\star} -{x^{\star}}^3=x^{\star} \Rightarrow {x^{\star}}^3=0 \Rightarrow x^{\star} =0$
$x_{k+1}=g(x_k)$
$0>x_0>-1$
We have that $g'(x)=1-3x^2$ and $g'(x)=0 \Rightarrow 3x^2=1 \Rightarrow x^2=\frac{1}{3} \Rightarrow x=\pm\frac{1}{\sqrt{3}}$.
Then $g'(x)>0$ for $-\frac{1}{\sqrt{3}}<x<\frac{1}{\sqrt{3}}$ and $g'(x)\leq 0$ for $x\leq -\frac{1}{\sqrt{3}}$ and $x\geq \frac{1}{\sqrt{3}}$.
So $g$ is increasing on $\left (-\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}\right )$, so the sequence $(x_k)$ is increasing in that interval.
Then $x_0<0 \Rightarrow x_{k+1}=g(x_k)<g(x^{\star})=x^{\star}$. The sequence is increasing and bounded above, so it converges to $x^{\star}=0$.
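A quick numerical check of this iteration (a minimal Python sketch; the starting value in $(-1,0)$ and the number of steps are arbitrary choices) lets one watch the iterates:

```python
# Fixed-point iteration x_{k+1} = g(x_k) for g(x) = x - x^3,
# started from an arbitrarily chosen value in (-1, 0).

def g(x):
    return x - x**3

x = -0.9  # assumed starting value with -1 < x_0 < 0
for k in range(10):
    x = g(x)
    print(f"x_{k + 1} = {x:.6f}")
```

Printing the iterates makes it easy to see whether the sequence is increasing and approaching $0$.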
Question 2 :
We consider the linear system \begin{align*}&3x+y=1 \\ &x+2y=3\end{align*} If we know that the Jacobi method converges when applied to the above linear system, check whether the Jacobi method converges when applied to the linear system \begin{align*}&x+2y=3 \\ &3x+y=1\end{align*}

My solution :
For the initial system we have the symmetric matrix $A=\begin{pmatrix}3 & 1 \\ 1 & 2\end{pmatrix}$. For the second system we have the matrix $\tilde{A}=\begin{pmatrix}1 & 2 \\ 3 & 1\end{pmatrix}$. This matrix is neither symmetric nor diagonally dominant. So we don't have convergence.
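For illustration, a small sketch that checks the strict row diagonal dominance condition for both coefficient matrices (note that strict diagonal dominance is only a sufficient condition for convergence of the Jacobi method):

```python
# Strict (row) diagonal dominance: |a_ii| > sum of |a_ij| over j != i, for every row.

def strictly_diagonally_dominant(A):
    n = len(A)
    return all(abs(A[i][i]) > sum(abs(A[i][j]) for j in range(n) if j != i)
               for i in range(n))

A       = [[3, 1], [1, 2]]   # coefficient matrix of the first system
A_tilde = [[1, 2], [3, 1]]   # coefficient matrix of the second system

print(strictly_diagonally_dominant(A))        # True
print(strictly_diagonally_dominant(A_tilde))  # False
```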
Question 3 :
Let $x,y,\epsilon_1, \epsilon_2\in \mathbb{R}$ such that \begin{align*}&3x+y=7+\epsilon_1 \\ &4x+2y=10+\epsilon_2\end{align*} Show that $|x-2|+|y-1|\leq 3(|\epsilon_1|+|\epsilon_2|)$.

My solution :
From the first equation we get $y=7+\epsilon_1-3x\ \ \ \ \ (\star)$.
Substituting this in the second equation we get \begin{align*}4x+2(7+\epsilon_1-3x)=10+\epsilon_2 &\Rightarrow 4x+14+2\epsilon_1-6x=10+\epsilon_2 \\ & \Rightarrow -2x=-4+\epsilon_2-2\epsilon_1 \\ & \Rightarrow x=2-\frac{\epsilon_2}{2}+\epsilon_1\end{align*}
Substituting this into $(\star)$ we get \begin{equation*}y=7+\epsilon_1-6+\frac{3}{2}\epsilon_2-3\epsilon_1=1-2\epsilon_1+\frac{3}{2}\epsilon_2\end{equation*}
Then \begin{align*}|x-2|+|y-1|&=\left |2-\frac{\epsilon_2}{2}+\epsilon_1-1\right |+\left |1-2\epsilon_1+\frac{3}{2}\epsilon_2-1\right |\\ & = \left |\epsilon_1-\frac{\epsilon_2}{2}\right |+\left |-2\epsilon_1+\frac{3}{2}\epsilon_2\right |\\ & \leq |\epsilon_1|+\frac{|\epsilon_2|}{2}+2|\epsilon_1|+\frac{3}{2}|\epsilon_2| \\ & = 3|\epsilon_1|+2|\epsilon_2| \\ & \leq 3|\epsilon_1|+3|\epsilon_2| = 3\left (|\epsilon_1|+|\epsilon_2| \right )\end{align*}
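As a quick numerical sanity check of this bound (a small sketch; the perturbations are sampled arbitrarily), one can plug in the closed forms for $x$ and $y$ derived above and compare both sides:

```python
import random

# x = 2 + e1 - e2/2 and y = 1 - 2*e1 + 3*e2/2 solve the perturbed system,
# as derived above; verify |x - 2| + |y - 1| <= 3(|e1| + |e2|).

for _ in range(5):
    e1 = random.uniform(-1, 1)
    e2 = random.uniform(-1, 1)
    x = 2 + e1 - e2 / 2
    y = 1 - 2 * e1 + 3 * e2 / 2
    lhs = abs(x - 2) + abs(y - 1)
    rhs = 3 * (abs(e1) + abs(e2))
    print(f"lhs = {lhs:.4f}, rhs = {rhs:.4f}, bound holds: {lhs <= rhs}")
```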
Question 4 :
Let $f(x)=x-x^3$. Let $(x_k)$ be the sequence that we get if we apply Newton's method to approximate a root. If $x_0=-\frac{1}{\sqrt{5}}$, does the sequence converge? If yes, find the limit.

My solution :
We have that $f'(x)=1-3x^2$.
Then \begin{equation*}x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)} \Rightarrow x_{n+1}=x_n-\frac{x_n-x_n^3}{1-3x_n^2}=-\frac{2x_n^3}{1-3x_n^2}\end{equation*}
Let $g(x)=-\frac{2x^3}{1-3x^2}$.
$x_1=g(x_0)=g\left (-\frac{1}{\sqrt{5}}\right )$.
\begin{equation*}\left |g'(x)\right |=\left |-\frac{6x^2}{(1-3x^2)^2}\right |=\frac{6x^2}{(1-3x^2)^2}\end{equation*}
We have that \begin{equation*}|g'(x)|<1 \Rightarrow 6x^2<1-6x^2+9x^4 \Rightarrow 9x^4-12x^2+1<0\end{equation*}
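A small sketch of this iteration (with the Newton step written exactly as above and $x_0=-\frac{1}{\sqrt{5}}$; the number of steps is an arbitrary choice) makes the behaviour of the first few terms easy to inspect:

```python
import math

# Newton's method for f(x) = x - x^3 with f'(x) = 1 - 3x^2,
# started at x_0 = -1/sqrt(5).

def newton_step(x):
    return x - (x - x**3) / (1 - 3 * x**2)

x = -1 / math.sqrt(5)
print(f"x_0 = {x:+.6f}")
for k in range(6):
    x = newton_step(x)
    print(f"x_{k + 1} = {x:+.6f}")
```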
Are my solutions correct? :unsure:
 
mathmari said:
Then $g'(x)>0$ for $-\frac{1}{\sqrt{3}}<x<\frac{1}{\sqrt{3}}$ and $g'(x)\leq 0$ for $x\leq -\frac{1}{\sqrt{3}}$ and $x\geq \frac{1}{\sqrt{3}}$.
So $g$ is increasing on $\left (-\frac{1}{\sqrt{3}}, \frac{1}{\sqrt{3}}\right )$, so the sequence $(x_k)$ is increasing in that interval.
Then $x_0<0 \Rightarrow x_{k+1}=g(x_k)<g(x^{\star})=x^{\star}$. The sequence is increasing and bounded above, so it converges to $x^{\star}=0$.

Let's see...
Consider $f: x\mapsto x-1$, which has $f'(x)=1>0$ so it's increasing.
But a sequence $(x_k)$ with $x_{k+1}=f(x_k)$ won't be increasing, will it? :oops:
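A two-line numerical check of this counterexample:

```python
# f(x) = x - 1 is increasing, yet the orbit x_{k+1} = f(x_k) is strictly decreasing.
x = 0.0
for _ in range(5):
    x = x - 1
    print(x)  # -1.0, -2.0, -3.0, -4.0, -5.0
```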
 
Klaas van Aarsen said:
Let's see...
Consider $f: x\mapsto x-1$, which has $f'(x)=1>0$ so it's increasing.
But a sequence $(x_k)$ with $x_{k+1}=f(x_k)$ won't be increasing, will it? :oops:

Ahh yes, the sequence is decreasing... So what could we do instead? :unsure:
 
mathmari said:
Ahh yes, the sequence is decreasing... So what could we do instead?
Hint: we need that $|x_{k+1}|=|g(x_k)|<|x_k|$. 🤔
 
mathmari said:
Question 2 :
We consider the linear system \begin{align*}&3x+y=1 \\ &x+2y=3\end{align*} If we know that the Jacobi method converges when applied to the above linear system, check whether the Jacobi method converges when applied to the linear system \begin{align*}&x+2y=3 \\ &3x+y=1\end{align*}

My solution :
For the initial system we have the symmetric matrix $A=\begin{pmatrix}3 & 1 \\ 1 & 2\end{pmatrix}$. For the second system we have the matrix $\tilde{A}=\begin{pmatrix}1 & 2 \\ 3 & 1\end{pmatrix}$. This matrix is neither symmetric nor diagonally dominant. So we don't have convergence.

The convergence guarantee for the Jacobi method is stated for strictly diagonally dominant systems. It doesn't tell us what the method will do for systems that are not.
I think we need to try to apply it to see if it will converge or not. 🤔
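Following that suggestion, a minimal sketch of the componentwise Jacobi iteration for the second system (the zero starting vector and the number of steps are arbitrary choices) could look like this:

```python
# Jacobi iteration for  x + 2y = 3,  3x + y = 1:
#   x_{k+1} = 3 - 2*y_k
#   y_{k+1} = 1 - 3*x_k
x, y = 0.0, 0.0  # assumed starting guess
for k in range(8):
    x, y = 3 - 2 * y, 1 - 3 * x  # simultaneous update (old values on the right)
    print(f"k = {k + 1}: x = {x:.1f}, y = {y:.1f}")
```

Printing a few iterates shows whether they settle down or blow up.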
 
Klaas van Aarsen said:
Hint: we need that $|x_{k+1}|=|g(x_k)|<|x_k|$. 🤔

We have that $|x_{k+1}|=|g(x_k)|=|x_k-x_k^3|=|x_k|\cdot |1-x_k^2|$. So we have to show that $|1-x_k^2|<1$, right? :unsure:
Klaas van Aarsen said:
The convergence guarantee for the Jacobi method is stated for strictly diagonally dominant systems. It doesn't tell us what the method will do for systems that are not.
I think we need to try to apply it to see if it will converge or not. 🤔

So there is no relation between the two systems as far as the convergence of the method is concerned, is there?

From the second system we get : $$\begin{pmatrix}1 & 2 \\ 3 & 1\end{pmatrix}=\begin{pmatrix}0 & 0 \\ 3 & 0\end{pmatrix}+\begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}+\begin{pmatrix}0 & 2 \\ 0 & 0\end{pmatrix} =L+D+U$$
The method converges if the spectral radius of $D^{-1}(D-A)$ is less than $1$, right?
We have that $$D^{-1}(D-A)=\begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}\begin{pmatrix}0 & -2 \\ -3 & 0\end{pmatrix}=\begin{pmatrix}0 & -2 \\ -3 & 0\end{pmatrix}$$ The eigenvalues are $\pm \sqrt{6}$, so the spectral radius is greater than $1$, and so we don't have convergence.
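A quick check of this eigenvalue computation, as a NumPy sketch:

```python
import numpy as np

# Jacobi iteration matrix B = D^{-1}(D - A) for the second ordering.
A = np.array([[1.0, 2.0],
              [3.0, 1.0]])
D = np.diag(np.diag(A))
B = np.linalg.inv(D) @ (D - A)

eigenvalues = np.linalg.eigvals(B)
print(eigenvalues)            # approximately +-2.449..., i.e. +-sqrt(6)
print(max(abs(eigenvalues)))  # spectral radius, about 2.449 > 1
```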

Is that correct? :unsure:
 
mathmari said:
We have that $|x_{k+1}|=|g(x_k)|=|x_k-x_k^3|=|x_k|\cdot |1-x_k^2|$. So we have to show that $|1-x_k^2|<1$, right?

That could work.

Alternatively we have $|x_{k+1}|=|g(x_k)|<|x_k| \implies \left|\frac{g(x_k) - 0}{x_k-0}\right| < 1$.
That is, if $g$ is Lipschitz continuous on the interval with a constant less than $1$, then the iterative method converges. 🤔
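To get a feel for this condition, a small sketch that tabulates the ratio $\left|\frac{g(x)-0}{x-0}\right|=|1-x^2|$ at a few arbitrarily chosen points of $(-1,0)$:

```python
# For g(x) = x - x^3 and x != 0 we have g(x)/x = 1 - x^2,
# which is strictly less than 1 in absolute value for every x in (-1, 0).

def g(x):
    return x - x**3

for x in (-0.9, -0.5, -0.1, -0.01):
    print(f"x = {x:+.2f}   |g(x)/x| = {abs(g(x) / x):.6f}")
```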
mathmari said:
So there is no relation between the two systems as far as the convergence of the method is concerned, is there?

From the second system we get : $$\begin{pmatrix}1 & 2 \\ 3 & 1\end{pmatrix}=\begin{pmatrix}0 & 0 \\ 3 & 0\end{pmatrix}+\begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}+\begin{pmatrix}0 & 2 \\ 0 & 0\end{pmatrix} =L+D+U$$
The method converges if the spectral radius of $D^{-1}(D-A)$ is less than $1$, right?
We have that $$D^{-1}(D-A)=\begin{pmatrix}1 & 0 \\ 0 & 1\end{pmatrix}\begin{pmatrix}0 & -2 \\ -3 & 0\end{pmatrix}=\begin{pmatrix}0 & -2 \\ -3 & 0\end{pmatrix}$$ The eigenvalues are $\pm \sqrt{6}$, so the spectral radius is greater than $1$, and so we don't have convergence.

Is that correct?
The method indeed converges if the spectral radius is less than $1$.

The converse is not necessarily true though. (Worried)
 
mathmari said:
Question 3 :
...

Then \begin{align*}|x-2|+|y-1|&=\left |2-\frac{\epsilon_2}{2}+\epsilon_1-{\color{red}\mathbf{1}}\right |+\left |1-2\epsilon_1+\frac{3}{2}\epsilon_2-1\right |\\ & = \left |\epsilon_1-\frac{\epsilon_2}{2}\right |+\left |-2\epsilon_1+\frac{3}{2}\epsilon_2\right |\\ & \leq |\epsilon_1|+\frac{|\epsilon_2|}{2}+2|\epsilon_1|+\frac{3}{2}|\epsilon_2| \\ & = 3|\epsilon_1|+2|\epsilon_2| \\ & \leq 3|\epsilon_1|+3|\epsilon_2| = 3\left (|\epsilon_1|+|\epsilon_2| \right )\end{align*}

All good. (Nod)
You do have a ${\color{red}\mathbf{1}}$ that I've marked in red that should be a $2$ though.

mathmari said:
Question 4 :
Let $f(x)=x-x^3$. Let $(x_k)$ be the sequence that we get if we apply Newton's method to approximate a root. If $x_0=-\frac{1}{\sqrt{5}}$, does the sequence converge? If yes, find the limit.

My solution :
We have that $f'(x)=1-3x^2$.
Then \begin{equation*}x_{n+1}=x_n-\frac{f(x_n)}{f'(x_n)} \Rightarrow x_{n+1}=x_n-\frac{x_n-x_n^3}{1-3x_n^2}=-\frac{2x_n^3}{1-3x_n^2}\end{equation*}
Let $g(x)=-\frac{2x^3}{1-3x^2}$.
$x_1=g(x_0)=g\left (-\frac{1}{\sqrt{5}}\right )$.
\begin{equation*}\left |g'(x)\right |=\left |-\frac{6x^2}{(1-3x^2)^2}\right |=\frac{6x^2}{(1-3x^2)^2}\end{equation*}
We have that \begin{equation*}|g'(x)|<1 \Rightarrow 6x^2<1-6x^2+9x^4 \Rightarrow 9x^4-12x^2+1<0\end{equation*}
This seems incomplete. o_O
How do you know it converges?
What did you find for the limit?
 