The degrees of freedom of partly smooth regularizers
Language
en
Journal article
This document was published in
Annals of the Institute of Statistical Mathematics, August 2017, vol. 69, no. 4, pp. 791–832
Springer Verlag
Abstract (in English)
We study regularized regression problems where the regularizer is a proper, lower-semicontinuous, convex and partly smooth function relative to a Riemannian submanifold. This encompasses several popular examples including the Lasso, the group Lasso, the max and nuclear norms, as well as their composition with linear operators (e.g., total variation or fused Lasso). Our main sensitivity analysis result shows that the predictor moves locally stably along the same active submanifold as the observations undergo small perturbations. This plays a pivotal role in deriving a closed-form expression for the divergence of the predictor with respect to the observations. We also show that, for many regularizers, including polyhedral ones or the analysis group Lasso, this divergence formula holds Lebesgue a.e. When the perturbation is random (with an appropriate continuous distribution), this allows us to derive an unbiased estimator of the degrees of freedom and the prediction risk. Our results unify and go beyond those already known in the literature.
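For context, a short sketch (in our own notation, not quoted from the paper) of the standard degrees-of-freedom identity the abstract builds on: under a Gaussian model $y = \mu_0 + \varepsilon$ with $\varepsilon \sim \mathcal{N}(0, \sigma^2 \mathrm{Id}_n)$ and a weakly differentiable predictor $\hat\mu(y)$,
\[
\mathrm{df}(\hat\mu) \;=\; \frac{1}{\sigma^2}\sum_{i=1}^n \mathrm{Cov}\big(\hat\mu_i(y), y_i\big)
\;=\; \mathbb{E}\big[\mathrm{div}\,\hat\mu(y)\big]
\;=\; \mathbb{E}\Big[\sum_{i=1}^n \frac{\partial \hat\mu_i}{\partial y_i}(y)\Big],
\]
the second equality being Stein's lemma. Plugging a closed-form expression for the divergence, such as the one the abstract announces, into the right-hand side yields an unbiased estimator of the degrees of freedom; in the Lasso case this recovers the well-known estimate given by the number of nonzero coefficients of a solution.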
Keywords (in English)
Manifold
O-minimal structures
Model selection
Sparsity
Degrees of freedom
Semi-algebraic sets
Total variation
Group Lasso
Partial smoothness
European project
Sparsity, Image and Geometry to Model Adaptively Visual Processings
Origin
Imported from HAL