
Homework Help: Proof of convergence theory in optimization

  1. Apr 16, 2013 #1
    1. The problem statement, all variables and given/known data

    The question is:

    Suppose that lim [itex]x_k=x_*[/itex], where [itex]x_*[/itex] is a local minimizer of the nonlinear function [itex]f[/itex]. Assume that [itex]\triangledown^2 f(x_*)[/itex] is symmetric positive definite. Prove that the sequence [itex]\left \{ f(x_k)-f(x_*) \right \}[/itex] converges linearly if and only if [itex]\left \{ ||x_k-x_*|| \right \}[/itex] converges linearly. Prove that the two sequences converge at the same rate, regardless of what the rate is. What is the relationship between the rate constant for the two sequences?

    2. Relevant equations


    3. The attempt at a solution
    I guess we may use the orthogonal diagonalization of a symmetric matrix together with the Taylor expansion [itex]f(x_k)-f(x_*)=\triangledown f(x_*)^T(x_k-x_*)+\frac{1}{2}(x_k-x_*)^T\triangledown^2 f(\xi)(x_k-x_*)[/itex], where [itex]\triangledown f(x_*)=0[/itex] since [itex]x_*[/itex] is a local minimizer...... But I got stuck here. Any suggestions?
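    As a numerical sanity check of the rate relationship being asked about, here is a small sketch (my own illustration, not from the thread): for a model quadratic [itex]f(x)=\frac{1}{2}x^THx[/itex] with [itex]H[/itex] symmetric positive definite, if [itex]||x_k-x_*||[/itex] shrinks linearly with rate [itex]r[/itex], one can compare the observed rate of [itex]f(x_k)-f(x_*)[/itex]. All names and the choice of [itex]H[/itex] below are illustrative assumptions.

    ```python
    # Illustrative sketch: compare the linear rate of ||x_k - x*|| with the
    # linear rate of f(x_k) - f(x*) for a model quadratic. Hypothetical setup.
    import numpy as np

    H = np.array([[3.0, 1.0],
                  [1.0, 2.0]])          # symmetric positive definite Hessian

    def f(x):
        # f(x) = 1/2 x^T H x has minimizer x* = 0 with f(x*) = 0
        return 0.5 * x @ H @ x

    r = 0.5                             # assumed linear rate of ||x_k - x*||
    d = np.array([1.0, -1.0])           # fixed approach direction
    xs = [d * r**k for k in range(1, 12)]

    err = [np.linalg.norm(x) for x in xs]   # ||x_k - x*||
    gap = [f(x) for x in xs]                # f(x_k) - f(x*)

    # successive ratios estimate the linear rate constants
    rate_err = err[-1] / err[-2]
    rate_gap = gap[-1] / gap[-2]

    print("rate of ||x_k - x*||:", rate_err)   # ~ r
    print("rate of f(x_k)-f(x*):", rate_gap)   # ~ r**2
    ```

    Along this sequence the function-value gap contracts by roughly [itex]r^2[/itex] per step while the distance contracts by [itex]r[/itex], which is worth keeping in mind when reading the problem's claim that the two sequences "converge at the same rate."
    
    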
  3. Apr 16, 2013 #2

    Ray Vickson

    Science Advisor
    Homework Helper

    My answer is that the result you are being asked to prove is wrong.