Optimal Convergence Rates for Nesterov Acceleration
RONDEPIERRE, Aude
Institut de Mathématiques de Toulouse UMR5219 [IMT]
Équipe Recherche Opérationnelle, Optimisation Combinatoire et Contraintes [LAAS-ROC]
Language
English
Journal article
This item was published in
SIAM Journal on Optimization. 2019-12, vol. 29, n° 4, p. 3131–3153
Society for Industrial and Applied Mathematics
English Abstract
In this paper, we study the behavior of solutions of the ODE associated with Nesterov acceleration. It has been well known since the pioneering work of Nesterov that the rate of convergence $O(1/t^2)$ is optimal for the class of convex functions with Lipschitz-continuous gradient. In this work, we show that better convergence rates can be obtained under additional geometrical conditions, such as the Łojasiewicz property. More precisely, we prove the optimal convergence rates that can be obtained depending on the geometry of the function $F$ to minimize. These convergence rates are new, and they shed new light on the behavior of Nesterov acceleration schemes. In particular, we prove that the classical Nesterov scheme may yield convergence rates that are worse than those of the classical gradient descent scheme on sharp functions: for instance, the convergence rate for strongly convex functions is not geometric for the classical Nesterov scheme (while it is for the gradient descent algorithm). This shows that applying classical Nesterov acceleration to convex functions without examining the geometrical properties of the objective function may lead to sub-optimal algorithms.
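As a minimal numerical sketch of the two schemes compared in the abstract (not the paper's analysis or experiments): on a one-dimensional strongly convex quadratic $F(x) = \frac{\mu}{2}x^2$, gradient descent with step $1/L$ contracts geometrically, while the classical convex-case Nesterov scheme uses the momentum coefficient $k/(k+3)$. The constants, test function, and iteration count below are illustrative assumptions, not taken from the paper.

```python
# Illustrative comparison (hypothetical setup): gradient descent vs. the
# classical Nesterov scheme on F(x) = (mu/2) x^2, a strongly convex quadratic.
mu, L = 0.1, 1.0                     # strong convexity / gradient Lipschitz constants
F = lambda x: 0.5 * mu * x ** 2
grad = lambda x: mu * x

x_gd = 1.0                           # gradient descent iterate
x_prev, y = 1.0, 1.0                 # Nesterov iterates: previous x and extrapolated y

for k in range(50):
    # Gradient descent: x_{k+1} = x_k - (1/L) grad F(x_k), geometric contraction.
    x_gd -= (1.0 / L) * grad(x_gd)
    # Classical Nesterov: gradient step at y_k, then momentum k/(k+3).
    x_new = y - (1.0 / L) * grad(y)
    y = x_new + (k / (k + 3)) * (x_new - x_prev)
    x_prev = x_new

# Both methods drive F toward its minimum value 0; the abstract's point is
# that only gradient descent is guaranteed a geometric rate in this setting.
print(F(x_gd), F(x_prev))
```

After 50 iterations, gradient descent has contracted by a factor $(1-\mu/L)^{50} = 0.9^{50}$, so its objective value is far below the starting value $F(1) = 0.05$; the Nesterov iterate also decreases the objective, but without a guaranteed geometric rate.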
English Keywords
Lyapunov functions
rate of convergence
Łojasiewicz property
ODEs
optimization
ANR Project
Centre International de Mathématiques et d'Informatique (de Toulouse) - ANR-11-LABX-0040
Université Fédérale de Toulouse - ANR-11-IDEX-0002
Origin
HAL imported