hal.structure.identifier: Parcimonie et Nouveaux Algorithmes pour le Signal et la Modélisation Audio [PANAMA]
hal.structure.identifier: Dynamic Networks: Temporal and Structural Capture Approach [DANTE]
dc.contributor.author: GRIBONVAL, Rémi
hal.structure.identifier: Institut für Mathematik [Potsdam]
hal.structure.identifier: Understanding the Shape of Data [DATASHAPE]
hal.structure.identifier: Laboratoire de Mathématiques d'Orsay [LMO]
dc.contributor.author: BLANCHARD, Gilles
hal.structure.identifier: Parcimonie et Nouveaux Algorithmes pour le Signal et la Modélisation Audio [PANAMA]
hal.structure.identifier: GIPSA Pôle Géométrie, Apprentissage, Information et Algorithmes [GIPSA-GAIA]
dc.contributor.author: KERIVEN, Nicolas
hal.structure.identifier: Parcimonie et Nouveaux Algorithmes pour le Signal et la Modélisation Audio [PANAMA]
hal.structure.identifier: Institut de Mathématiques de Bordeaux [IMB]
dc.contributor.author: TRAONMILIN, Yann
dc.date.accessioned: 2024-04-04T02:46:16Z
dc.date.available: 2024-04-04T02:46:16Z
dc.date.issued: 2021-08-21
dc.identifier.issn: 2520-2316
dc.identifier.uri: https://oskar-bordeaux.fr/handle/20.500.12278/191535
dc.description.abstractEn: We describe a general framework, compressive statistical learning, for resource-efficient large-scale learning: the training collection is compressed in one pass into a low-dimensional sketch (a vector of random empirical generalized moments) that captures the information relevant to the considered learning task. A near-minimizer of the risk is computed from the sketch through the solution of a nonlinear least squares problem. We investigate sufficient sketch sizes to control the generalization error of this procedure. The framework is illustrated on compressive PCA, compressive clustering, and compressive Gaussian mixture modeling with fixed known variance. The latter two are further developed in a companion paper.
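The one-pass sketching step described in the abstract (compressing the training collection into a vector of random empirical generalized moments) can be sketched with random Fourier features; the data, sketch size, and frequency distribution below are illustrative choices for a toy example, not the paper's tuned parameters:

```python
import numpy as np

def sketch(X, W):
    """Random-moment sketch: empirical average of the random Fourier
    features x -> exp(i w^T x) over the training collection X.
    X: (n, d) data matrix; W: (m, d) random frequency matrix."""
    Z = X @ W.T                        # (n, m) projections w^T x
    return np.exp(1j * Z).mean(axis=0)  # m-dimensional complex sketch

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 2))  # toy training collection (n=1000, d=2)
W = rng.normal(size=(50, 2))    # m=50 random frequencies (illustrative law)
s = sketch(X, W)                # the 50-dim sketch now stands in for the data
```

Because the sketch is an empirical average, it can be accumulated in a single streaming pass over the data; the learning step (not shown) would then fit model parameters to `s` by nonlinear least squares.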
dc.description.sponsorship: Algorithmes, Approximations, Parcimonie et Plongements pour l'IA - ANR-19-CHIA-0009
dc.description.sponsorship: Approches statistiquement et computationnellement efficaces pour l'intelligence artificielle - ANR-19-CHIA-0021
dc.language.iso: en
dc.publisher: EMS Publishing House
dc.subject.en: dimension reduction
dc.subject.en: statistical learning
dc.subject.en: kernel mean embedding
dc.subject.en: sketching
dc.subject.en: random features
dc.subject.en: excess risk control
dc.subject.en: random moments
dc.title.en: Compressive Statistical Learning with Random Feature Moments
dc.type: Journal article
dc.identifier.doi: 10.4171/msl/20
dc.subject.hal: Mathématiques [math]/Statistiques [math.ST]
dc.subject.hal: Informatique [cs]/Apprentissage [cs.LG]
dc.subject.hal: Informatique [cs]/Théorie de l'information [cs.IT]
dc.identifier.arxiv: 1706.07180
dc.description.sponsorshipEurope: PLEASE: Projections, Learning, and Sparsity for Efficient data-processing
bordeaux.journal: Mathematical Statistics and Learning
bordeaux.page: 113–164
bordeaux.volume: 3
bordeaux.hal.laboratories: Institut de Mathématiques de Bordeaux (IMB) - UMR 5251*
bordeaux.issue: 2
bordeaux.institution: Université de Bordeaux
bordeaux.institution: Bordeaux INP
bordeaux.institution: CNRS
bordeaux.peerReviewed: yes
hal.identifier: hal-01544609
hal.version: 1
hal.popular: no
hal.audience: International
hal.origin.link: https://hal.archives-ouvertes.fr//hal-01544609v1
bordeaux.COinS: ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=Mathematical%20Statistics%20and%20Learning&rft.date=2021-08-21&rft.volume=3&rft.issue=2&rft.spage=113%E2%80%93164&rft.epage=113%E2%80%93164&rft.eissn=2520-2316&rft.issn=2520-2316&rft.au=GRIBONVAL,%20R%C3%A9mi&BLANCHARD,%20Gilles&KERIVEN,%20Nicolas&TRAONMILIN,%20Yann&rft.genre=article