Convergence of the iterates of the gradient method with constant stepsize
The gradient method with constant step length is the simplest method for solving unconstrained optimisation problems involving a continuously differentiable function with Lipschitz-continuous gradient. The motivation for this post came after reading this Wikipedia article, where it is stated that under certain assumptions the sequence of iterates converges to a local optimum, but no further discussion is provided. Continue reading →
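As a minimal sketch of the method under discussion (assuming a differentiable objective with L-Lipschitz gradient and the common constant stepsize choice 1/L; the quadratic below is only an illustrative example, not taken from the post):

```python
import numpy as np

# Gradient method with constant stepsize on the quadratic
# f(x) = 0.5 * x^T A x - b^T x  (assumed example).
# Its gradient A x - b is Lipschitz with constant L = ||A||_2.

A = np.array([[3.0, 1.0], [1.0, 2.0]])   # symmetric positive definite
b = np.array([1.0, -1.0])

def grad(x):
    return A @ x - b

L = np.linalg.norm(A, 2)    # Lipschitz constant of the gradient
alpha = 1.0 / L             # constant stepsize, a standard safe choice

x = np.zeros(2)
for k in range(200):
    x = x - alpha * grad(x) # x_{k+1} = x_k - alpha * grad f(x_k)

print(x, np.linalg.solve(A, b))  # final iterate vs. the exact minimiser
```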
Error bounds for second order approximation
Here we prove an approximation bound for twice continuously differentiable functions $f$ with $M$-Lipschitzian Hessian, that is, $\|\nabla^2 f(x) - \nabla^2 f(y)\| \le M \|x - y\|$ for all $x, y$. In particular, we show that for all $x, y$,
$$\left| f(y) - f(x) - \langle \nabla f(x), y - x \rangle - \tfrac{1}{2} \langle \nabla^2 f(x)(y - x), y - x \rangle \right| \le \frac{M}{6} \|y - x\|^3.$$
This is stated as Lemma 1.2.4 in: Y. Nesterov, Introductory Lectures on Convex Optimization – A Basic Course, Kluwer Academic Publishers, 2004. Continue reading →
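A quick numerical sanity check of the bound (an illustration added here, not part of the original post): for $f(x) = \sin(x)$ the third derivative is bounded by 1, so the second derivative is Lipschitz with $M = 1$ and the inequality should hold on random pairs of points.

```python
import numpy as np

# Check |f(y) - f(x) - f'(x)(y-x) - 0.5 f''(x)(y-x)^2| <= (M/6) |y-x|^3
# for f = sin, whose second derivative -sin is Lipschitz with M = 1.
f, df, d2f, M = np.sin, np.cos, lambda t: -np.sin(t), 1.0

rng = np.random.default_rng(0)
for _ in range(1000):
    x, y = rng.uniform(-5, 5, size=2)
    lhs = abs(f(y) - f(x) - df(x) * (y - x) - 0.5 * d2f(x) * (y - x) ** 2)
    rhs = (M / 6) * abs(y - x) ** 3
    assert lhs <= rhs + 1e-12   # the bound from Lemma 1.2.4 holds

print("bound verified on 1000 random pairs")
```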