Show simple item record

hal.structure.identifier: Hausdorff Center for Mathematics [HCM]
dc.contributor.author: BUNGERT, Leon
hal.structure.identifier: Department of Electrical Engineering - Technion [Haïfa] [EE-Technion]
dc.contributor.author: HAIT-FRAENKEL, Ester
hal.structure.identifier: Institut de Mathématiques de Bordeaux [IMB]
dc.contributor.author: PAPADAKIS, Nicolas
hal.structure.identifier: Department of Electrical Engineering - Technion [Haïfa] [EE-Technion]
dc.contributor.author: GILBOA, Guy
dc.date.accessioned: 2024-04-04T02:45:56Z
dc.date.available: 2024-04-04T02:45:56Z
dc.date.issued: 2021
dc.identifier.uri: https://oskar-bordeaux.fr/handle/20.500.12278/191508
dc.description.abstractEn: Neural networks have revolutionized the field of data science, yielding remarkable solutions in a data-driven manner. For instance, in the field of mathematical imaging, they have surpassed traditional methods based on convex regularization. However, a fundamental theory supporting the practical applications is still in the early stages of development. We take a fresh look at neural networks and examine them via nonlinear eigenvalue analysis. The field of nonlinear spectral theory is still emerging, providing insights about nonlinear operators and systems. In this paper we view a neural network as a complex nonlinear operator and attempt to find its nonlinear eigenvectors. We first discuss the existence of such eigenvectors and analyze the kernel of ReLU networks. Then we study a nonlinear power method for generic nonlinear operators. For proximal operators associated to absolutely one-homogeneous convex regularization functionals, we can prove convergence of the method to an eigenvector of the proximal operator. This motivates us to apply a nonlinear power method to networks which are trained to act similarly to a proximal operator. In order to take the non-homogeneity of neural networks into account, we define a modified version of the power method. We perform extensive experiments for different proximal operators and on various shallow and deep neural networks designed for image denoising. Proximal eigenvectors will be used for geometric analysis of graphs, such as clustering or the computation of distance functions. For simple neural nets, we observe the influence of training data on the eigenvectors. For state-of-the-art denoising networks, we show that eigenvectors can be interpreted as (un)stable modes of the network when contaminated with noise or other degradations.
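The abstract's central scheme, a normalized power iteration applied to a proximal operator, can be illustrated for one concrete absolutely one-homogeneous functional. The sketch below is a hypothetical, minimal instance, not the paper's exact algorithm: it assumes J(u) = λ‖u‖₁, whose proximal operator is soft-thresholding, and iterates u ← prox(u)/‖prox(u)‖ until a fixed point, i.e. a vector satisfying prox(u*) = μ·u* for some eigenvalue μ, is reached. The function and variable names are illustrative only.

```python
import numpy as np

def soft_threshold(u, lam):
    # Proximal operator of the absolutely one-homogeneous functional lam*||u||_1
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def nonlinear_power_method(T, u0, n_iter=1000, tol=1e-10):
    # Generic normalized fixed-point (power) iteration u <- T(u)/||T(u)||
    u = u0 / np.linalg.norm(u0)
    for _ in range(n_iter):
        v = T(u)
        nv = np.linalg.norm(v)
        if nv < tol:           # u landed in the kernel of T
            return u
        u_next = v / nv
        if np.linalg.norm(u_next - u) < tol:
            return u_next      # converged to a nonlinear eigenvector
        u = u_next
    return u

rng = np.random.default_rng(0)
u0 = rng.standard_normal(50)
lam = 0.1
u_star = nonlinear_power_method(lambda u: soft_threshold(u, lam), u0)

# Check the eigenvector relation prox(u*) = mu * u*.
# Since ||u_star|| = 1, the Rayleigh-type quotient <prox(u*), u*> recovers mu.
v = soft_threshold(u_star, lam)
mu = np.dot(v, u_star)
residual = np.linalg.norm(v - mu * u_star)
```

For soft-thresholding, the fixed points are unit vectors whose nonzero entries all share the same magnitude, so the iteration progressively prunes small entries until such an eigenvector remains; this mirrors, in a toy setting, the paper's convergence statement for proximals of absolutely one-homogeneous functionals.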
dc.language.iso: en
dc.publisher: Society for Industrial and Applied Mathematics
dc.title.en: Nonlinear Power Method for Computing Eigenvectors of Proximal Operators and Neural Networks
dc.type: Journal article
dc.identifier.doi: 10.1137/20M1384154
dc.subject.hal: Computer Science [cs]/Signal and Image Processing
dc.identifier.arxiv: 2003.04595
dc.description.sponsorshipEurope: Nonlocal Methods for Arbitrary Data Sources
bordeaux.journal: SIAM Journal on Imaging Sciences
bordeaux.page: 1114–1148
bordeaux.volume: 14
bordeaux.hal.laboratories: Institut de Mathématiques de Bordeaux (IMB) - UMR 5251
bordeaux.issue: 3
bordeaux.institution: Université de Bordeaux
bordeaux.institution: Bordeaux INP
bordeaux.institution: CNRS
bordeaux.peerReviewed: yes
hal.identifier: hal-03323046
hal.version: 1
hal.popular: no
hal.audience: International
hal.origin.link: https://hal.archives-ouvertes.fr//hal-03323046v1


Files in this item


There are no files associated with this item.

This item appears in the following Collection(s)
