Optimal Convergence Rates for Nesterov Acceleration
RONDEPIERRE, Aude
Institut de Mathématiques de Toulouse UMR5219 [IMT]
Équipe Recherche Opérationnelle, Optimisation Combinatoire et Contraintes [LAAS-ROC]
Language: English
Journal article
Published in: SIAM Journal on Optimization, 2019-12, vol. 29, no. 4, p. 3131–3153
Society for Industrial and Applied Mathematics
Abstract (English)
In this paper, we study the behavior of solutions of the ODE associated with Nesterov acceleration. It has been well known since the pioneering work of Nesterov that the rate of convergence $O(1/t^2)$ is optimal for the class of convex functions with Lipschitz gradient. In this work, we show that better convergence rates can be obtained under additional geometrical conditions, such as the Łojasiewicz property. More precisely, we prove the optimal convergence rates that can be obtained depending on the geometry of the function $F$ to minimize. The convergence rates are new, and they shed new light on the behavior of Nesterov acceleration schemes. We prove in particular that the classical Nesterov scheme may provide convergence rates that are worse than those of the classical gradient descent scheme on sharp functions: for instance, the convergence rate for strongly convex functions is not geometric for the classical Nesterov scheme (while it is for the gradient descent algorithm). This shows that applying the classical Nesterov acceleration to convex functions without examining the geometrical properties of the objective function may lead to sub-optimal algorithms.
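The abstract's claim can be illustrated numerically. The sketch below (not taken from the paper; the step sizes, test function, and iteration count are illustrative choices) runs plain gradient descent and the classical Nesterov scheme, with the standard momentum coefficient $k/(k+3)$, on the strongly convex quadratic $F(x) = \tfrac{\mu}{2}x^2$, without telling either method the strong-convexity parameter $\mu$. Gradient descent contracts geometrically by the factor $(1 - \mu/L)$ per step, while the classical Nesterov iterates oscillate and decay only polynomially, consistent with the sub-optimality discussed above.

```python
def grad_descent(grad, x0, step, n_iter):
    """Plain gradient descent: x_{k+1} = x_k - step * grad(x_k)."""
    x = x0
    for _ in range(n_iter):
        x = x - step * grad(x)
    return x

def nesterov(grad, x0, step, n_iter):
    """Classical Nesterov scheme for convex F (momentum k/(k+3)),
    which does not exploit strong convexity."""
    x, y = x0, x0
    for k in range(n_iter):
        x_next = y - step * grad(y)          # gradient step at extrapolated point
        y = x_next + (k / (k + 3)) * (x_next - x)  # momentum extrapolation
        x = x_next
    return x

# Illustrative strongly convex test: F(x) = (mu/2) x^2, grad F(x) = mu x,
# with mu = 0.1, Lipschitz constant L = 1, step = 1/L.
mu = 0.1
grad = lambda x: mu * x
x_gd = grad_descent(grad, x0=1.0, step=1.0, n_iter=200)   # decays like (1 - mu)^k
x_nes = nesterov(grad, x0=1.0, step=1.0, n_iter=200)      # oscillates, polynomial decay
```

Both iterates converge to the minimizer $0$, but only gradient descent does so at a geometric rate here, which is the phenomenon the abstract points to.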
Keywords (English)
Lyapunov functions
rate of convergence
Lojasiewicz property
ODEs
optimization
ANR projects
Centre International de Mathématiques et d'Informatique (de Toulouse) - ANR-11-LABX-0040
Université Fédérale de Toulouse - ANR-11-IDEX-0002
Origin: Imported from HAL