dc.rights.license: open (en_US)
dc.contributor.author: GRATEROL, Wilfredo
dc.contributor.author: DIAZ-AMADO, Jose
dc.contributor.author: CARDINALE, Yudith
hal.structure.identifier: ESTIA INSTITUTE OF TECHNOLOGY
dc.contributor.author: DONGO, Irvin
dc.contributor.author: LOPES-SILVA, Edmundo
dc.contributor.author: SANTOS-LIBARINO, Cleia
dc.date.accessioned: 2023-04-05T14:01:45Z
dc.date.available: 2023-04-05T14:01:45Z
dc.date.issued: 2021-02
dc.identifier.issn: 1424-8220 (en_US)
dc.identifier.uri: https://oskar-bordeaux.fr/handle/20.500.12278/172804
dc.description.abstractEn: For social robots, knowledge regarding human emotional states is an essential part of adapting their behavior or associating emotions to other entities. Robots gather the information from which emotion detection is processed via different media, such as text, speech, images, or videos. The multimedia content is then properly processed to recognize emotions/sentiments, for example, by analyzing faces and postures in images/videos based on machine learning techniques or by converting speech into text to perform emotion detection with natural language processing (NLP) techniques. Keeping this information in semantic repositories offers a wide range of possibilities for implementing smart applications. We propose a framework to allow social robots to detect emotions and to store this information in a semantic repository, based on EMONTO (an EMotion ONTOlogy), an ontology to represent emotions. As a proof-of-concept, we develop a first version of this framework focused on emotion detection in text, which can be obtained directly as text or by converting speech to text. We tested the implementation with a case study of tour-guide robots for museums that rely on a speech-to-text converter based on the Google Application Programming Interface (API) and a Python library, a neural network to label the emotions in texts based on NLP transformers, and EMONTO integrated with an ontology for museums; thus, it is possible to register the emotions that artworks produce in visitors. We evaluate the classification model, obtaining equivalent results compared with a state-of-the-art transformer-based model and with a clear roadmap for improvement.
dc.language.iso: EN (en_US)
dc.title.en: Emotion Detection for Social Robots Based on NLP Transformers and an Emotion Ontology
dc.type: Journal article (en_US)
dc.identifier.doi: 10.3390/s21041322 (en_US)
dc.subject.hal: Computer Science [cs] (en_US)
bordeaux.journal: Sensors (en_US)
bordeaux.page: 1322 (en_US)
bordeaux.volume: 21 (en_US)
bordeaux.hal.laboratories: ESTIA - Recherche (en_US)
bordeaux.issue: 4 (en_US)
bordeaux.institution: Université de Bordeaux (en_US)
bordeaux.institution: Bordeaux INP (en_US)
bordeaux.institution: Bordeaux Sciences Agro (en_US)
bordeaux.peerReviewed: yes (en_US)
bordeaux.inpress: no (en_US)
bordeaux.import.source: hal
hal.identifier: hal-03217670
hal.version: 1
hal.export: false
workflow.import.source: hal
dc.rights.cc: No CC license (en_US)
bordeaux.COinS: ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=Sensors&rft.date=2021-02&rft.volume=21&rft.issue=4&rft.spage=1322&rft.epage=1322&rft.eissn=1424-8220&rft.issn=1424-8220&rft.au=GRATEROL,%20Wilfredo&DIAZ-AMADO,%20Jose&CARDINALE,%20Yudith&DONGO,%20Irvin&LOPES-SILVA,%20Edmundo&rft.genre=article
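
As a rough illustration of the pipeline described in the abstract above (speech converted to text, then emotions labelled with an NLP transformer), here is a minimal Python sketch. The SpeechRecognition library, the Hugging Face model name, and the audio file path are assumptions for illustration only; they are not taken from the paper, and the ontology-population step is omitted.

# Minimal sketch, not the authors' implementation: transcribe speech with the
# Google Web Speech API via the SpeechRecognition library, then label the text
# with a transformer-based emotion classifier from Hugging Face.
import speech_recognition as sr
from transformers import pipeline

# Illustrative emotion-classification model; the paper does not name one.
emotion_classifier = pipeline(
    "text-classification",
    model="j-hartmann/emotion-english-distilroberta-base",
)

def transcribe(audio_path: str) -> str:
    """Convert a recorded utterance (WAV/AIFF/FLAC file) to text."""
    recognizer = sr.Recognizer()
    with sr.AudioFile(audio_path) as source:
        audio = recognizer.record(source)
    return recognizer.recognize_google(audio)

def detect_emotion(text: str) -> dict:
    """Return the top emotion label and its score, e.g. {'label': 'joy', 'score': 0.97}."""
    return emotion_classifier(text)[0]

if __name__ == "__main__":
    text = transcribe("visitor_comment.wav")  # hypothetical audio file
    print(text, detect_emotion(text))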

