
dc.rights.license: open (en_US)
hal.structure.identifier: Laboratoire Bordelais de Recherche en Informatique [LaBRI]
hal.structure.identifier: ESTIA - Institute of technology [ESTIA]
dc.contributor.author: CLAY, Alexis
hal.structure.identifier: Laboratoire Bordelais de Recherche en Informatique [LaBRI]
hal.structure.identifier: ESTIA - Institute of technology [ESTIA]
dc.contributor.author: COUTURE, Nadine
ORCID: 0000-0001-7959-5227
IDREF: 111534275
hal.structure.identifier: Communication Langagière et Interaction Personne-Système [CLIPS - IMAG]
dc.contributor.author: NIGAY, Laurence
dc.date.accessioned: 2025-03-28T09:29:02Z
dc.date.available: 2025-03-28T09:29:02Z
dc.date.issued: 2009-12-08
dc.date.conference: 2009-09-10
dc.identifier.uri: https://oskar-bordeaux.fr/handle/20.500.12278/205746
dc.description.abstractEn: In the field of affective computing, one of the most exciting motivations is to enable a computer to sense users' emotions. To achieve this goal, an interactive application has to incorporate emotional sensitivity. Following an engineering approach, the key point is then to define a unifying software architecture that allows any interactive system to become emotionally sensitive. Most research focuses on identifying and validating interpretation systems and/or emotional characteristics from different modalities. However, there is little focus on modeling a generic software architecture for emotion recognition. Therefore, we propose an integrative approach and define such a generic software architecture based on the grounding theory of multimodality. We state that emotion recognition should be multimodal and serve as a tool for interaction. As such, we use results on multimodality in interactive applications to propose the emotion branch, a component-based architecture model for emotion recognition systems that integrates within general models for interactive systems. The emotion branch unifies existing emotion recognition application architectures following the usual three-level schema: capturing signals from sensors, extracting and analyzing emotionally relevant characteristics from the obtained data, and interpreting these characteristics into an emotion. We illustrate the feasibility and the advantages of the emotion branch with a test case that we developed for gesture-based emotion recognition.
dc.language.iso: EN (en_US)
dc.title.en: Engineering affective computing: a unifying software architecture
dc.type: Conference paper (Communication dans un congrès) (en_US)
dc.identifier.doi: 10.1109/ACII.2009.5349541 (en_US)
dc.subject.hal: Computer Science [cs]/Human-Computer Interaction [cs.HC] (en_US)
bordeaux.page: 1-6 (en_US)
bordeaux.hal.laboratories: ESTIA - Recherche (en_US)
bordeaux.institution: Université de Bordeaux (en_US)
bordeaux.conference.title: 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops (ACII 2009) (en_US)
bordeaux.country: nl (en_US)
bordeaux.title.proceeding: Affective Computing and Intelligent Interaction (ACII 2009) (en_US)
bordeaux.conference.city: Amsterdam (en_US)
bordeaux.import.source: hal
hal.identifier: hal-00408182
hal.version: 1
hal.invited: no (en_US)
hal.proceedings: yes (en_US)
hal.conference.end: 2009-09-12
hal.popular: no (en_US)
hal.audience: International (en_US)
hal.export: false
workflow.import.source: hal
dc.rights.cc: No CC license (Pas de Licence CC) (en_US)
bordeaux.COinS: ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.date=2009-12-08&rft.spage=1-6&rft.epage=1-6&rft.au=CLAY,%20Alexis&COUTURE,%20Nadine&NIGAY,%20Laurence&rft.genre=unknown
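
The abstract above describes the emotion branch as a three-level, component-based pipeline: capture signals from sensors, extract and analyze emotionally relevant characteristics, and interpret those characteristics as an emotion. Below is a minimal, hypothetical Python sketch of that schema; the component names, the gesture features, and the toy interpretation rule are illustrative assumptions and are not taken from the paper.

# Hypothetical sketch of the three-level "emotion branch" schema (capture ->
# analyze -> interpret). Names and data below are illustrative, not from the paper.
from dataclasses import dataclass
from typing import Dict, List


@dataclass
class GestureSample:
    """Raw data captured from a (hypothetical) motion sensor."""
    timestamps: List[float]      # seconds
    hand_heights: List[float]    # vertical hand position, metres


class CaptureComponent:
    """Level 1: capture signals from a sensor (stand-in for a real driver)."""
    def read(self) -> GestureSample:
        return GestureSample(
            timestamps=[0.0, 0.1, 0.2, 0.3],
            hand_heights=[1.0, 1.2, 1.5, 1.7],
        )


class AnalysisComponent:
    """Level 2: extract emotionally relevant characteristics from the raw data."""
    def extract(self, sample: GestureSample) -> Dict[str, float]:
        duration = sample.timestamps[-1] - sample.timestamps[0]
        amplitude = max(sample.hand_heights) - min(sample.hand_heights)
        speed = amplitude / duration if duration > 0 else 0.0
        return {"amplitude": amplitude, "speed": speed}


class InterpretationComponent:
    """Level 3: interpret the characteristics as an emotion label."""
    def interpret(self, features: Dict[str, float]) -> str:
        # Toy rule: large, fast gestures are read as "joy"; anything else "neutral".
        if features["amplitude"] > 0.5 and features["speed"] > 1.0:
            return "joy"
        return "neutral"


def emotion_branch() -> str:
    """Chain the three levels in sequence, mirroring the schema in the abstract."""
    sample = CaptureComponent().read()
    features = AnalysisComponent().extract(sample)
    return InterpretationComponent().interpret(features)


if __name__ == "__main__":
    print(emotion_branch())  # prints "joy" for the fixed sample above

Keeping each level in its own component means a sensor, feature extractor, or interpreter can be swapped without touching the rest of the branch, which is the kind of modularity a unifying architecture is meant to provide.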


Files in this item


There are no files associated with this item.
