I think there is something wrong

I think there is something wrong in this exercise, which I came across by chance in a book on calculus and analysis while looking for a rigorous definition of angle. It says:

Let f : (a,b) → ℝ be a differentiable function, suppose that f' is bounded, and suppose that f has a root r in (a,b). For x ≠ r, let J_x denote the open interval between x and r, and suppose that if f(x) > 0 then f is convex on J_x, while if f(x) < 0 then f is concave on J_x. Prove that for any x0 in (a,b), the Newton sequence with initial point x0 converges to a root.

The problem here is that we can take an initial point x0 where f'(x0) = 0. For simplicity, consider f(x) := x^3 for all x in (-4,4), with x0 = 0.
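(For reference, "the Newton sequence" means the standard iteration
$$x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)},$$
which for ##f(x) = x^3## reduces to ##x_{n+1} = \tfrac{2}{3}x_n## when ##x_n \neq 0##, and is the undefined quotient ##0/0## at ##x_n = 0##.)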
 
I think we must replace "f' is bounded on (a,b)" by "f' is non-zero throughout (a,b)"
 
MIB said:
I think there is something wrong in this exercise, which I came across by chance in a book on calculus and analysis while looking for a rigorous definition of angle. It says:

Let f : (a,b) → ℝ be a differentiable function, suppose that f' is bounded, and suppose that f has a root r in (a,b). For x ≠ r, let J_x denote the open interval between x and r, and suppose that if f(x) > 0 then f is convex on J_x, while if f(x) < 0 then f is concave on J_x. Prove that for any x0 in (a,b), the Newton sequence with initial point x0 converges to a root.

The problem here is that we can take an initial point x0 where f'(x0) = 0. For simplicity, consider f(x) := x^3 for all x in (-4,4), with x0 = 0.

There is no problem in this particular example because the initial point you chose is the root of your function, so you've already converged to the root and aren't going anywhere.

Try coming up with an example where the derivative vanishes at a point that is not the root of the function. Be sure to double-check that the concavity conditions of the theorem are still met in this case. If you can show that you can't pick a function with f'(x0) = 0 for which the concavity requirements are satisfied, then there is no problem with the theorem as stated.
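
A quick numerical experiment makes it easy to test candidate functions. Here is a minimal Python sketch of the Newton iteration (my own illustration, not code from the thread), with an explicit guard for the case where the derivative vanishes:

```python
# A minimal sketch of the Newton iteration (an illustration, not from
# the book), with a guard for the f'(x_n) = 0 case under discussion.

def newton(f, fprime, x0, tol=1e-12, max_iter=100):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n) from x0; stop if f' vanishes."""
    x = x0
    for _ in range(max_iter):
        d = fprime(x)
        if d == 0:
            raise ZeroDivisionError(f"f'({x}) = 0: Newton step undefined")
        x_next = x - f(x) / d
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    return x

# The example from the original post: f(x) = x^3 on (-4, 4).
f = lambda x: x**3
fp = lambda x: 3 * x**2

print(newton(f, fp, 1.0))   # converges to the root 0: each step is x -> (2/3) x
# newton(f, fp, 0.0)        # would raise: the step is 0/0, but x0 = 0 is the root
```

On ##f(x) = x^3## this converges from any nonzero start (each step multiplies ##x_n## by ##2/3##), and the only start that trips the guard is ##x_0 = 0##, which is the root itself.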
 
Mute said:
There is no problem in this particular example because the initial point you chose is the root of your function, so you've already converged to the root and aren't going anywhere.

Try coming up with an example where the derivative vanishes at a point that is not the root of the function. Be sure to double-check that the concavity conditions of the theorem are still met in this case. If you can show that you can't pick a function with f'(x0) = 0 for which the concavity requirements are satisfied, then there is no problem with the theorem as stated.

I know it can be very hard to think of an example where the derivative vanishes at a point which is not a root, and I think it is in fact impossible. The problem is that the exercise says "for any x0 in (a,b) the Newton sequence with initial point x0 converges to a root", and this is wrong, because the root r itself lies in (a,b), and the derivative may vanish there (as at x0 = 0 for f(x) = x^3).
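
For what it's worth, the impossibility can be seen directly from the convexity hypothesis (a sketch, not spelled out in the thread). Suppose ##f(x_0) > 0## and ##x_0 < r##, so ##f## is convex and ##f'## is non-decreasing on ##J_{x_0} = (x_0, r)##. The mean value theorem gives a ##c \in (x_0, r)## with
$$f'(c) = \frac{f(r) - f(x_0)}{r - x_0} = \frac{-f(x_0)}{r - x_0} < 0.$$
For ##t \in (x_0, c)## the mean value theorem again gives ##\frac{f(t) - f(x_0)}{t - x_0} = f'(\xi) \le f'(c)## for some ##\xi \in (x_0, t)##, and letting ##t \to x_0^{+}## yields ##f'(x_0) \le f'(c) < 0##. The other three cases are symmetric, so under the hypotheses ##f'## can only vanish at a root.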
 
OK, I restated it as follows:

Let f : (a,b) → ℝ be a differentiable function, suppose that f' is bounded, and suppose that f has a root r in (a,b). For x ≠ r, let J_x denote the open interval between x and r, and suppose that if f(x) > 0 then f is convex on J_x, while if f(x) < 0 then f is concave on J_x. Prove that for any x0 in (a,b) at which the derivative does not vanish, the Newton sequence with initial point x0 converges to a root.

And then I proved it easily.
 