
hal.structure.identifier: Institut de Mathématiques de Bordeaux [IMB]
hal.structure.identifier: Laboratoire Bordelais de Recherche en Informatique [LaBRI]
dc.contributor.author: PIERRE, Fabien
hal.structure.identifier: Institut de Mathématiques de Bordeaux [IMB]
dc.contributor.author: AUJOL, Jean-François
hal.structure.identifier: Laboratoire Bordelais de Recherche en Informatique [LaBRI]
dc.contributor.author: BUGEAU, Aurélie
hal.structure.identifier: Institut de Mathématiques de Bordeaux [IMB]
dc.contributor.author: PAPADAKIS, Nicolas
hal.structure.identifier: Institut Polytechnique de Bordeaux [Bordeaux INP]
hal.structure.identifier: Laboratoire Bordelais de Recherche en Informatique [LaBRI]
dc.contributor.author: TA, Vinh-Thong
dc.date.issued: 2015
dc.description.abstractEn: This paper provides a new method to colorize gray-scale images. While the computation of the luminance channel is directly performed by a linear transformation, the colorization process is an ill-posed problem that requires some priors. In the literature, two classes of approaches exist. The first class includes manual methods that require the user to manually add colors to the image to colorize. The second class includes exemplar-based approaches, where a color image with a similar semantic content is provided as input to the method. These two types of priors have their own advantages and drawbacks. In this paper, a new variational framework for exemplar-based colorization is proposed. A nonlocal approach is used to find relevant colors in the source image in order to suggest colors for the gray-scale image. The spatial coherency of the result, as well as the final color selection, is provided by a nonconvex variational framework based on total variation. An efficient primal-dual algorithm is provided, and a proof of its convergence is given. In this work, we also extend the proposed exemplar-based approach to combine both exemplar-based and manual methods, providing a single framework that unifies the advantages of both approaches. Finally, experiments and comparisons with state-of-the-art methods illustrate the efficiency of our proposal.

1. Introduction. The colorization of a gray-scale image consists of adding color information to it. It is useful in the entertainment industry to make old productions more attractive. The reverse operation is based on perceptual assumptions and is today an active research area [28], [13], [37]. Colorization can also be used to add information in order to help further analysis of the image by a user (e.g., sensor fusion [43]). It can also be used for art restoration; see, e.g., [17] or [41]. It is an old subject that began with the ability of screens and devices to display colors.

A seminal approach consists of mapping each level of gray into a color space [18]. Nevertheless, not all colors can be recovered without an additional prior. In the existing approaches, priors can be added in two ways: with a direct addition of color on
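The abstract states that the luminance channel is obtained from the color image by a direct linear transformation. As a minimal sketch of such a transformation (not the paper's exact formulation), the standard ITU-R BT.601 luma weights can be used; treating these particular coefficients as the paper's choice is an assumption:

```python
import numpy as np

# Assumption: the paper only says the luminance is a linear transformation of
# the color channels. The ITU-R BT.601 weights below are one standard choice;
# the authors' exact coefficients may differ.
BT601 = np.array([0.299, 0.587, 0.114])

def luminance(rgb):
    """Compute the luminance channel of an H x W x 3 RGB image
    as a per-pixel linear combination of the R, G, B values."""
    return rgb @ BT601

rgb = np.ones((2, 2, 3))      # a tiny all-white test image
Y = luminance(rgb)
print(Y.shape)                # (2, 2): one luminance value per pixel
print(np.allclose(Y, 1.0))    # True: the weights sum to 1.0
```

Because the weights sum to one, a gray pixel (R = G = B) keeps its value unchanged, which is the usual sanity check for a luma transform.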
dc.language.iso: en
dc.publisher: Society for Industrial and Applied Mathematics
dc.subject.en: 65K10
dc.subject.en: 94A08
dc.subject.en: colorization
dc.subject.en: optimization
dc.subject.en: non-local methods
dc.subject.en: 68U10 (AMS subject classification)
dc.subject.en: 49M29
dc.title.en: Luminance-Chrominance Model for Image Colorization
dc.type: Journal article
dc.identifier.doi: 10.1137/140979368
dc.subject.hal: Computer Science [cs]/Image Processing
bordeaux.journal: SIAM Journal on Imaging Sciences
bordeaux.page: 536–563
bordeaux.peerReviewed: yes
hal.identifier: hal-01166919
hal.version: 1
hal.popular: no
hal.audience: International
hal.origin.link: https://hal.archives-ouvertes.fr//hal-01166919v1