Show simple item record

dc.rights.license: open [en_US]
hal.structure.identifier: ESTIA INSTITUTE OF TECHNOLOGY
dc.contributor.author: CLAY, Alexis
hal.structure.identifier: ESTIA INSTITUTE OF TECHNOLOGY
dc.contributor.author: COUTURE, Nadine
  ORCID: 0000-0001-7959-5227
  IDREF: 111534275
hal.structure.identifier: ESTIA INSTITUTE OF TECHNOLOGY
dc.contributor.author: DECARSIN, Elodie
hal.structure.identifier: Laboratoire Bordelais de Recherche en Informatique [LaBRI]
dc.contributor.author: DESAINTE-CATHERINE, Myriam
hal.structure.identifier: Laboratoire Bordelais de Recherche en Informatique [LaBRI]
dc.contributor.author: VULLIARD, Pierre Henri
  IDREF: 228223326
hal.structure.identifier: Laboratoire Bordelais de Recherche en Informatique [LaBRI]
dc.contributor.author: LARRALDE, Joseph
dc.date.accessioned: 2024-12-20T09:23:53Z
dc.date.available: 2024-12-20T09:23:53Z
dc.date.issued: 2012-05-21
dc.date.conference: 2012-05-21
dc.identifier.uri: https://oskar-bordeaux.fr/handle/20.500.12278/204043
dc.description.abstractEn: The augmented ballet project aims at gathering research from several fields and directing it towards a single application case: adding virtual elements (visual and acoustic) to a live dance performance, and allowing the dancer to interact with them. In this paper, we describe a novel interaction that we used in the frame of this project: using the dancer's movements to recognize the emotions he expresses, and using these emotions to generate musical audio flows that evolve in real time. The originality of this interaction is threefold. First, it covers the whole interaction cycle from the input (the dancer's movements) to the output (the generated music). Second, this interaction is not direct but goes through a high level of abstraction: the dancer's emotional expression is recognized and serves as the source of music generation. Third, this interaction was designed and validated through constant collaboration with a choreographer, culminating in an augmented ballet performance in front of a live audience.
dc.description.sponsorship: Cultural experience: Augmented Reality and Emotion - ANR-07-RIAM-0002 [en_US]
dc.language.iso: EN [en_US]
dc.subject.en: Interactive sonification
dc.subject.en: motion
dc.subject.en: gesture and music
dc.subject.en: interaction
dc.subject.en: live performance
dc.subject.en: musical human-computer interaction
dc.title.en: Movement to emotions to music: using whole body emotional expression as an interaction for electronic music generation
dc.type: Communication dans un congrès [en_US]
dc.subject.hal: Informatique [cs]/Interface homme-machine [cs.HC] [en_US]
dc.subject.hal: Informatique [cs]/Son [cs.SD] [en_US]
bordeaux.hal.laboratories: ESTIA - Recherche [en_US]
bordeaux.institution: Université de Bordeaux [en_US]
bordeaux.conference.title: New Interfaces for Musical Expression [en_US]
bordeaux.country: us [en_US]
bordeaux.title.proceeding: Proceedings of the 12th International Conference on New Interfaces for Musical Expression (NIME'12) [en_US]
bordeaux.conference.city: Ann Arbor [en_US]
bordeaux.import.source: hal
hal.identifier: hal-00771385
hal.version: 1
hal.invited: non [en_US]
hal.proceedings: oui [en_US]
hal.conference.end: 2012-05-23
hal.popular: non [en_US]
hal.audience: Internationale [en_US]
hal.export: false
workflow.import.source: hal
dc.rights.cc: Pas de Licence CC [en_US]
bordeaux.COinS: ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.date=2012-05-21&rft.au=CLAY,%20Alexis&COUTURE,%20Nadine&DECARSIN,%20Elodie&DESAINTE-CATHERINE,%20Myriam&VULLIARD,%20Pierre%20Henri&rft.genre=unknown

