Neighbourhood of Convergence of Sequence

  • Context: MHB 
  • Thread starter: Sudharaka
  • Tags: Convergence, Sequence

Discussion Overview

The discussion revolves around the convergence of the sequence produced by Newton's iteration for finding a root of a function \(f\) with a continuous second-order derivative. Participants explore the conditions under which the sequence converges to the root, particularly the existence of a neighbourhood of \(x_0\) such that every starting point \(x_1\) in it yields a convergent sequence.

Discussion Character

  • Exploratory
  • Technical explanation
  • Mathematical reasoning
  • Debate/contested

Main Points Raised

  • Some participants propose using Taylor's theorem to express the function \(f\) and analyze the error in the iterative sequence.
  • Others argue that the convergence of the sequence depends on ensuring that the initial guess \(x_1\) is sufficiently close to \(x_0\), leading to the definition of a neighbourhood \(U(x_0)\).
  • A counterexample involving the function \(f(x) = x^{1/3}\) is presented, but it is challenged on the grounds that it does not meet the criteria of having a continuous second-order derivative.
  • Participants discuss the implications of the continuity of \(f'\) and the condition \(f'(x_0) \neq 0\) for the existence of a neighbourhood where \(f'(x) \neq 0\) holds.
  • There is a focus on the relationship between the errors \(\varepsilon_n\) and \(\varepsilon_{n+1}\) and the conditions required for the sequence to remain within the defined neighbourhood.
  • Some participants express uncertainty about how to ensure that subsequent iterations \(x_n\) remain within the neighbourhood once \(x_1\) is chosen from it.

Areas of Agreement / Disagreement

Participants generally agree on the need for a neighbourhood around \(x_0\) for convergence, but there are multiple competing views on how to establish and ensure this neighbourhood effectively. The discussion remains unresolved regarding the specific conditions that guarantee all iterates remain within the neighbourhood.

Contextual Notes

Limitations include the need for careful consideration of boundary conditions and the continuity of derivatives, as well as the potential for \(f''(\gamma_n)\) to be zero, which could affect the convergence analysis.

Sudharaka
Hi everyone, :)

Can somebody give me a hint to solve this problem? :)

Problem:

Let \(f\) be a function defined on \([a,\,b]\) with continuous second order derivative. Let \(x_0\in (a,\,b)\) satisfy \(f(x_0)=0\) but \(f'(x_0)\neq 0\). Prove that there is a neighbourhood of \(x_0\), say \(U(x_0)\), such that for all \(x_1\in U(x_0)\), the following sequence,

\[x_{n+1}=x_{n}-\frac{f(x_n)}{f'(x_n)}\]

where \(n=1,\,2,\,\cdots\) is convergent.
 
Sudharaka said:
Hi everyone, :)

Can somebody give me a hint to solve this problem? :)

Problem:

Let \(f\) be a function defined on \([a,\,b]\) with continuous second order derivative. Let \(x_0\in (a,\,b)\) satisfy \(f(x_0)=0\) but \(f'(x_0)\neq 0\). Prove that there is a neighbourhood of \(x_0\), say \(U(x_0)\), such that for all \(x_1\in U(x_0)\), the following sequence,

\[x_{n+1}=x_{n}-\frac{f(x_n)}{f'(x_n)}\]

where \(n=1,\,2,\,\cdots\) is convergent.

Use Taylor's theorem:
$$f(x_n) = f(x_0) + f'(x_0)(x_n - x_0) + \frac 1 {2!} f''(\xi_n)(x_n - x_0)^2$$
where $\xi_n$ is between $x_0$ and $x_n$.

Let $\varepsilon_n = x_0 - x_n$.
This is the error with respect to the root of f in iteration n.
Consider the error $\varepsilon_{n+1}$ in terms of $\varepsilon_n$.
If it approaches zero, you're done.
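Purely as a numerical illustration of that hint (not part of the proof), here is a minimal sketch of the iteration applied to an assumed example function, \(f(x)=x^3-2\), whose root is \(2^{1/3}\); starting close enough to the root, the error roughly squares at each step:

```python
# Illustration only: Newton's iteration x_{n+1} = x_n - f(x_n)/f'(x_n)
# applied to an assumed example, f(x) = x^3 - 2 with root 2**(1/3).

def newton(f, fprime, x1, steps=6):
    root = 2 ** (1 / 3)          # known root, used only to print the error
    x = x1
    for n in range(1, steps + 1):
        x = x - f(x) / fprime(x)
        print(f"n={n}  x={x:.12f}  |error|={abs(x - root):.3e}")
    return x

f = lambda x: x ** 3 - 2
fprime = lambda x: 3 * x ** 2

# Starting "close enough" to the root, the error shrinks roughly quadratically.
newton(f, fprime, x1=1.5)
```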
 
I like Serena said:
Use Taylor's theorem:
$$f(x_n) = f(x_0) + f'(x_0)(x_n - x_0) + \frac 1 {2!} f''(\xi_n)(x_n - x_0)^2$$
where $\xi_n$ is between $x_0$ and $x_n$.

Let $\varepsilon_n = x_0 - x_n$.
This is the error with respect to the root of f in iteration n.
Consider the error $\varepsilon_{n+1}$ in terms of $\varepsilon_n$.
If it approaches zero, you're done.

Thanks very much for the reply but I am not sure whether I get you. If the error approaches zero then the sequence converges. But how does that guarantee the existence of a neighbourhood \(U(x_0)\) where each element \(x_1\) makes the sequence convergent?
 
Sudharaka said:
Thanks very much for the reply but I am not sure whether I get you. If the error approaches zero then the sequence converges. But how does that guarantee the existence of a neighbourhood \(U(x_0)\) where each element \(x_1\) makes the sequence convergent?

If you work it out, you'll find there are some boundary conditions.
To satisfy those boundary conditions you need to take $x_1$ "close enough" to $x_0$, or equivalently $\varepsilon_1$ "close enough" to 0.
This is represented by a neighbourhood $U(x_0)$ that is "small enough".
 
Sudharaka said:
Hi everyone, :)

Can somebody give me a hint to solve this problem? :)

Problem:

Let \(f\) be a function defined on \([a,\,b]\) with continuous second order derivative. Let \(x_0\in (a,\,b)\) satisfy \(f(x_0)=0\) but \(f'(x_0)\neq 0\). Prove that there is a neighbourhood of \(x_0\), say \(U(x_0)\), such that for all \(x_1\in U(x_0)\), the following sequence,

\[x_{n+1}=x_{n}-\frac{f(x_n)}{f'(x_n)}\]

where \(n=1,\,2,\,\cdots\) is convergent.

A counterexample should be $\displaystyle f(x)= x^{\frac{1}{3}}$ because the difference equation becomes...

$\displaystyle \Delta_{n} = x_{n+1} - x_{n} = - 3\ x_{n}$

... that diverges for any $x_{1} \ne 0$...
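
For reference, the computation behind that difference equation (a short sketch):

\[f(x)=x^{1/3},\qquad f'(x)=\tfrac{1}{3}x^{-2/3},\qquad \frac{f(x_n)}{f'(x_n)}=3x_n,\]

so \(x_{n+1}=x_n-3x_n=-2x_n\) and \(|x_{n+1}|=2|x_n|\), which grows without bound for any \(x_1\neq 0\).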

Kind regards

$\chi$ $\sigma$
 
chisigma said:
A counterexample should be $\displaystyle f(x)= x^{\frac{1}{3}}$ because the difference equation becomes...

It doesn't satisfy the criteria.
f'(0) is not defined, so it does not have a continuous second order derivative around x=0.
 
I like Serena said:
If you work it out, you'll find there are some boundary conditions.
To satisfy those boundary conditions you need to take $x_1$ "close enough" to $x_0$, or equivalently $\varepsilon_1$ "close enough" to 0.
This is represented by a neighbourhood $U(x_0)$ that is "small enough".

Thanks, now I understand it more clearly. So we can get \(\epsilon_{n+1}\) in terms of \(\epsilon_n\) as mentioned >>here<<.

\[\epsilon_{n+1}=-\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_{n}\]

I hope I am correct up to this point. Am I?
 
Sudharaka said:
Thanks, now I understand it more clearly. So we can get \(\epsilon_{n+1}\) in terms of \(\epsilon_n\) as mentioned >>here<<.

\[\epsilon_{n+1}=-\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_{n}\]

I hope I am correct up to this point. Am I?

Almost.
You dropped a square.

Note that you've already used that both f' and f'' exist, and that $f'(x_n)\ne 0$.
Are you aware of the conditions involved?

And I see you picked $\gamma_n$. Didn't you like $\xi_n$?

 
I like Serena said:
Almost.
You dropped a square.

Note that you've already used that both f' and f'' exist, and that $f'(x_n)\ne 0$.
Are you aware of the conditions involved?

Sorry, that's a typo. :) It should be,

\[\epsilon_{n+1}=-\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_{n}^2\]

It's given that \(f\) is twice differentiable. Hence \(f'\) and \(f''\) exist. But I thought that \(f'(x_n)\neq 0\) is implied by the equation generating the terms of the sequence. What puzzled me from the beginning when solving this problem is how to incorporate the fact that \(f'(x_0)\neq 0\).
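
For reference, one way to arrive at that recursion (a sketch, expanding \(f\) about \(x_n\) rather than \(x_0\) and evaluating at the root): for some \(\gamma_n\) between \(x_0\) and \(x_n\),

\[0 = f(x_0) = f(x_n) + f'(x_n)\,\epsilon_n + \tfrac{1}{2}f''(\gamma_n)\,\epsilon_n^2,\]

and dividing by \(f'(x_n)\) gives

\[\frac{f(x_n)}{f'(x_n)} = -\epsilon_n - \frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_n^2,\]

so that

\[x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)} = x_0 + \frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_n^2 \quad\Longrightarrow\quad \epsilon_{n+1} = x_0 - x_{n+1} = -\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_n^2.\]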

I like Serena said:
And I see you picked $\gamma_n$. Didn't you like $\xi_n$?

;) No I hate that symbol. First I never can write it properly and in this context I don't like to use it because it looks like \(\epsilon\) and there's a chance I will confuse the two.
 
  • #10
Sudharaka said:
Sorry, that's a typo. :) It should be,

\[\epsilon_{n+1}=-\frac{f''(\gamma_n)}{2f'(x_n)}\epsilon_{n}^2\]

It's given that \(f\) is twice differentiable. Hence \(f'\) and \(f''\) exist. But I thought that \(f'(x_n)\neq 0\) is implied by the equation generating the terms of the sequence. What puzzled me from the beginning when solving this problem is how to incorporate the fact that \(f'(x_0)\neq 0\).

It's the other way around.
You can only use $f'(x_n)\neq 0$ because you have $f'(x_0)\neq 0$.

Due to the fact that f' is continuous and that $f'(x_0)\neq 0$, you can infer that there will be some open interval around $x_0$ that will have $f'(x) \ne 0$ for x in that interval.

Only within that interval can you use the Taylor expansion as given.
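
Spelled out (a sketch, using only the continuity of \(f'\) at \(x_0\)): there is a \(\delta>0\) such that \(|f'(x)-f'(x_0)|<\tfrac{1}{2}|f'(x_0)|\) whenever \(|x-x_0|<\delta\), and then

\[|f'(x)| \ge |f'(x_0)| - |f'(x)-f'(x_0)| > \tfrac{1}{2}|f'(x_0)| > 0\]

for all such \(x\).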
 
  • #11
I like Serena said:
It's the other way around.
You can only use $f'(x_n)\neq 0$ because you have $f'(x_0)\neq 0$.

Due to the fact that f' is continuous and that $f'(x_0)\neq 0$, you can infer that there will be some open interval around $x_0$ that will have $f'(x) \ne 0$ for x in that interval.

Only within that interval can you use the Taylor expansion as given.

This occurred to me previously, but I was confused by the fact that if we take that interval, how do we know for sure that all the values \(x_n\) lie in it? That is, we choose \(x_1\) from that interval, then calculate the value \(x_2\). Now how can we guarantee that \(x_2\) also lies in that same interval?
 
  • #12
Sudharaka said:
This occurred to me previously, but I was confused by the fact that if we take that interval, how do we know for sure that all the values \(x_n\) lie in it? That is, we choose \(x_1\) from that interval, then calculate the value \(x_2\). Now how can we guarantee that \(x_2\) also lies in that same interval?

That happens if we can make sure that $|\varepsilon_{n+1}| < |\varepsilon_n|$.
 
  • #13
I like Serena said:
That happens if we can make sure that $|\varepsilon_{n+1}| < |\varepsilon_n|$.

Yes, I think I am getting the hang of this. So, taking absolute values, we have

\[|\epsilon_{n+1}|=\left|\frac{f''(\gamma_n)}{2f'(x_n)}\right||\epsilon_{n}|^2\]

To get $|\varepsilon_{n+1}| < |\varepsilon_n|$, we should have,

\[|\epsilon_{n}|<\left|\frac{2f'(x_n)}{f''(\gamma_n)}\right|\]

And this is the interval that we are looking for. Am I correct? :)
 
  • #14
Sudharaka said:
Yes, I think I am getting the hang of this. So, taking absolute values, we have

\[|\epsilon_{n+1}|=\left|\frac{f''(\gamma_n)}{2f'(x_n)}\right||\epsilon_{n}|^2\]

To get $|\varepsilon_{n+1}| < |\varepsilon_n|$, we should have,

\[|\epsilon_{n}|<\left|\frac{2f'(x_n)}{f''(\gamma_n)}\right|\]

And this is the interval that we are looking for. Am I correct? :)

Yes.
With the additional constraints that we're inside the interval $(a,b)$ and that $f'(x) \ne 0$.
Also note that $f''(\gamma_n)$ could be zero, so we have to allow for that.
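
One way to pin those constants down (a sketch, not necessarily the intended argument): choose \(\delta>0\) so that \(I=[x_0-\delta,\,x_0+\delta]\subset(a,\,b)\) and \(f'\neq 0\) on \(I\), and set \(m=\min_{x\in I}|f'(x)|>0\) and \(M=\max_{x\in I}|f''(x)|\), which exist because \(f'\) and \(f''\) are continuous. Shrinking \(\delta\) further if necessary so that \(q=\frac{M\delta}{2m}<1\), every \(x_n\in I\) gives

\[|\epsilon_{n+1}| \le \frac{M}{2m}|\epsilon_n|^2 \le q\,|\epsilon_n| \le \delta,\]

so \(x_{n+1}\in I\) as well, and by induction \(|\epsilon_n|\le q^{n-1}|\epsilon_1|\to 0\). Taking \(U(x_0)=(x_0-\delta,\,x_0+\delta)\) then works even if \(f''\) vanishes somewhere on \(I\).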
 
  • #15
I like Serena said:
Yes.
With the additional constraints that we're inside the interval $(a,b)$ and that $f'(x) \ne 0$.
Also note that $f''(\gamma_n)$ could be zero, so we have to allow for that.

Well, so one last question. We have to assume that \(f''(\gamma_n)\neq 0\). This is just something we have to assume and cannot be deduced from the given details. Am I correct? :)
 
  • #16
Sudharaka said:
Well, so one last question. We have to assume that \(f''(\gamma_n)\neq 0\). This is just something we have to assume and cannot be deduced from the given details. Am I correct? :)

Not really.
If f''(x) = 0, we have instantaneous convergence to the root.
 
  • #17
I like Serena said:
Not really.
If f''(x) = 0, we have instantaneous convergence to the root.

And that I believe tells us \(f\) is a straight line. Isn't it? But what's wrong with that?
 
  • #18
Sudharaka said:
And that I believe tells us \(f\) is a straight line. Isn't it? But what's wrong with that?

Yep and nothing's wrong with that.
And btw, it is not given that f''(x) = 0 everywhere.
What it does mean, is that $|\varepsilon_{n+1}| < |\varepsilon_{n}|$ if $\varepsilon_{n} \ne 0$, which is what we wanted.
 
  • #19
I like Serena said:
Yep and nothing's wrong with that.
And btw, it is not given that f''(x) = 0 everywhere.
What it does mean, is that $|\varepsilon_{n+1}| < |\varepsilon_{n}|$ if $\varepsilon_{n} \ne 0$, which is what we wanted.

Yes, I am sorry; I was too tired to see that \(f''(\gamma_n)=0\) does not mean that \(f\) is a straight line. However, if \(f''(\gamma_n)=0\) then \(x_{n+1}=x_0\), so obviously,

\[0=|\varepsilon_{n+1}| < |\varepsilon_{n}|\]

I guess this is what you meant. Am I correct? :)
 
  • #20
Sudharaka said:
Yes, I am sorry; I was too tired to see that \(f''(\gamma_n)=0\) does not mean that \(f\) is a straight line. However, if \(f''(\gamma_n)=0\) then \(x_{n+1}=x_0\), so obviously,

\[0=|\varepsilon_{n+1}| < |\varepsilon_{n}|\]

I guess this is what you meant. Am I correct?

Yes.
 
  • #21
I like Serena said:
Yes.

Thank you sooooooooooooooooooo much for all your help. I think I understood every bit and piece of the problem, though it took a considerable amount of time. :)
 
