- #1
edwardnash
Hi there,
I am new to optimization theory. I just went through solving linear equations using gradient descent, and now I am looking into Newton's method, which uses second-order derivatives. I was wondering whether we really need the full Hessian matrix for this method to work. Could we compute only the diagonal elements of the Hessian, rather than all of them, and get an approximation to Newton's method? I'd appreciate help from anybody familiar with these methods.
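Concretely, here is the kind of thing I mean: a minimal sketch in Python/NumPy of a "diagonal Newton" update, where each gradient component is divided by only the corresponding diagonal Hessian entry. The toy quadratic and all function names here are just my own made-up example, so I'm not sure this is the right way to do it:

```python
import numpy as np

def f(x):
    # toy quadratic with different curvature in each coordinate
    return 0.5 * (3.0 * x[0] ** 2 + 0.5 * x[1] ** 2)

def grad(x):
    # gradient of f
    return np.array([3.0 * x[0], 0.5 * x[1]])

def hess_diag(x):
    # only the diagonal entries of the Hessian of f
    return np.array([3.0, 0.5])

x = np.array([4.0, -2.0])
for _ in range(20):
    # "diagonal Newton" step: scale each gradient component
    # by the inverse of its own diagonal curvature entry,
    # instead of solving with the full Hessian
    x = x - grad(x) / hess_diag(x)

print(x)
```

For this particular quadratic the Hessian happens to be exactly diagonal, so the update lands on the minimizer right away; my real question is how well this kind of approximation behaves when the off-diagonal entries are not zero.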
thanks,
ed