Submitted

A Unified Convergence Analysis of First Order Convex Optimization Methods via Strong Lyapunov Functions

Long Chen and Hao Luo

arXiv

ABSTRACT:

We present a unified convergence analysis for first order convex optimization methods based on the concept of a strong Lyapunov condition. Combining this condition with suitable time scaling factors, we handle both the convex and the strongly convex case and establish contraction properties of Lyapunov functions for many existing ordinary differential equation models.
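
To fix ideas, here is a hedged sketch of the kind of condition meant; the notation is illustrative and the precise definition is in the paper. Given a flow x'(t) = G(x(t)), a Lyapunov function E satisfies a strong Lyapunov condition with rate c > 0 if

    -\nabla E(x) \cdot G(x) \ge c\, E(x) + P(x), \qquad P(x) \ge 0,

since along trajectories d/dt\, E(x(t)) = \nabla E(x(t)) \cdot G(x(t)) \le -c\, E(x(t)), and Gronwall's inequality yields the exponential contraction E(x(t)) \le e^{-ct} E(x(0)).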
We then derive prevailing first order optimization algorithms, such as proximal gradient methods, heavy ball methods (also known as momentum methods), Nesterov accelerated gradient methods, and accelerated proximal gradient methods, from numerical discretizations of the corresponding dynamical systems.
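
To illustrate this derivation pattern with the two simplest instances (a minimal runnable sketch on an illustrative quadratic objective, not the paper's algorithms verbatim): explicit Euler applied to the gradient flow x' = -\nabla f(x) yields gradient descent, and a finite difference discretization of the heavy ball ODE x'' + \gamma x' + \nabla f(x) = 0 yields the momentum method.

    import numpy as np

    # Illustrative strongly convex quadratic f(x) = 0.5 x^T A x - b^T x.
    A = np.array([[3.0, 0.5], [0.5, 1.0]])
    b = np.array([1.0, -2.0])

    def grad_f(x):
        return A @ x - b

    x_star = np.linalg.solve(A, b)  # exact minimizer, for reference

    # Explicit Euler on the gradient flow x' = -grad f(x) with step size h
    # is exactly gradient descent: x_{k+1} = x_k - h grad f(x_k).
    h = 0.2
    x = np.zeros(2)
    for _ in range(300):
        x = x - h * grad_f(x)

    # Discretizing x'' + gamma x' + grad f(x) = 0 with
    # x'' ~ (x_{k+1} - 2 x_k + x_{k-1}) / s^2 and x' ~ (x_k - x_{k-1}) / s
    # rearranges to the heavy ball update with beta = 1 - gamma s, alpha = s^2:
    # x_{k+1} = x_k + beta (x_k - x_{k-1}) - alpha grad f(x_k).
    alpha, beta = 0.1, 0.8
    y_prev = np.zeros(2)
    y = np.zeros(2)
    for _ in range(300):
        y, y_prev = y + beta * (y - y_prev) - alpha * grad_f(y), y

    print("gradient descent error:", np.linalg.norm(x - x_star))
    print("heavy ball error:      ", np.linalg.norm(y - x_star))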
We also apply the strong Lyapunov condition at the discrete level, which yields a more systematic analysis framework. Another contribution is a novel second order dynamical system, called the Hessian-driven Nesterov accelerated gradient flow, which can be used to design and analyze accelerated first order methods for smooth and non-smooth convex optimization.
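
For orientation only: Hessian-driven models in the literature (e.g. Attouch and coauthors) augment an inertial flow with a \nabla^2 f damping term, schematically

    x''(t) + \gamma\, x'(t) + \beta\, \nabla^2 f(x(t))\, x'(t) + \nabla f(x(t)) = 0,

where the Hessian term \beta \nabla^2 f(x) x' = \beta\, (d/dt) \nabla f(x(t)) attenuates oscillations and can be evaluated through gradients only. This generic form is indicative of the flavor; the precise Hessian-driven Nesterov accelerated gradient flow is defined in the paper.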