Show simple item record
Movement to emotions to music: using whole body emotional expression as an interaction for electronic music generation
dc.rights.license | open | en_US |
hal.structure.identifier | ESTIA INSTITUTE OF TECHNOLOGY | |
dc.contributor.author | CLAY, Alexis | |
hal.structure.identifier | ESTIA INSTITUTE OF TECHNOLOGY | |
dc.contributor.author | COUTURE, Nadine (ORCID: 0000-0001-7959-5227; IDREF: 111534275) | |
hal.structure.identifier | ESTIA INSTITUTE OF TECHNOLOGY | |
dc.contributor.author | DECARSIN, Elodie | |
hal.structure.identifier | Laboratoire Bordelais de Recherche en Informatique [LaBRI] | |
dc.contributor.author | DESAINTE-CATHERINE, Myriam | |
hal.structure.identifier | Laboratoire Bordelais de Recherche en Informatique [LaBRI] | |
dc.contributor.author | VULLIARD, Pierre Henri (IDREF: 228223326) | |
hal.structure.identifier | Laboratoire Bordelais de Recherche en Informatique [LaBRI] | |
dc.contributor.author | LARRALDE, Joseph | |
dc.date.accessioned | 2024-12-20T09:23:53Z | |
dc.date.available | 2024-12-20T09:23:53Z | |
dc.date.issued | 2012-05-21 | |
dc.date.conference | 2012-05-21 | |
dc.identifier.uri | https://oskar-bordeaux.fr/handle/20.500.12278/204043 | |
dc.description.abstractEn | The augmented ballet project aims to gather research from several fields and direct it towards a common application case: adding virtual elements (visual and acoustic) to a live dance performance and allowing the dancer to interact with them. In this paper, we describe a novel interaction used within this project: the dancer's movements are used to recognize the emotions they express, and these emotions in turn drive the generation of musical audio flows that evolve in real time. The originality of this interaction is threefold. First, it covers the whole interaction cycle, from the input (the dancer's movements) to the output (the generated music). Second, the interaction is not direct but goes through a high level of abstraction: the dancer's emotional expression is recognized and serves as the source of the music generation. Third, the interaction was designed and validated through constant collaboration with a choreographer, culminating in an augmented ballet performance in front of a live audience. | |
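The abstract describes a three-stage interaction cycle (body movement → recognized emotion → generated music). The sketch below is a minimal, hypothetical Python illustration of such a pipeline; the feature names, emotion labels, and mappings to musical parameters are illustrative assumptions and do not reflect the authors' actual implementation.

```python
# Hypothetical sketch of a movement -> emotion -> music parameter pipeline.
# All names, emotion labels, and mappings are illustrative assumptions,
# not the system described in the paper.
from dataclasses import dataclass
from typing import Dict


@dataclass
class MovementFeatures:
    """Whole-body movement descriptors extracted from motion data."""
    speed: float      # average limb speed, normalized to 0..1
    expansion: float  # how "open" the posture is, normalized to 0..1


def recognize_emotion(features: MovementFeatures) -> Dict[str, float]:
    """Map movement features to a weighted set of expressed emotions."""
    # Simple heuristic for illustration: fast + expansive -> joy,
    # fast + closed -> anger, slow + closed -> sadness, slow + open -> calm.
    return {
        "joy": features.speed * features.expansion,
        "anger": features.speed * (1.0 - features.expansion),
        "sadness": (1.0 - features.speed) * (1.0 - features.expansion),
        "calm": (1.0 - features.speed) * features.expansion,
    }


def emotion_to_music_params(emotions: Dict[str, float]) -> Dict[str, float]:
    """Translate emotion weights into real-time music generation parameters."""
    return {
        "tempo_bpm": 60 + 80 * (emotions["joy"] + emotions["anger"]),
        "brightness": emotions["joy"] + emotions["calm"],  # e.g. filter cutoff
        "dissonance": emotions["anger"],                    # e.g. chord tension
        "density": 1.0 - emotions["sadness"],               # notes per beat
    }


if __name__ == "__main__":
    # One frame of the interaction cycle: movement in, music parameters out.
    frame = MovementFeatures(speed=0.8, expansion=0.3)
    print(emotion_to_music_params(recognize_emotion(frame)))
```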
dc.description.sponsorship | Cultural experience: Augmented Reality and Emotion - ANR-07-RIAM-0002 | en_US |
dc.language.iso | EN | en_US |
dc.subject.en | Interactive sonification | |
dc.subject.en | motion | |
dc.subject.en | gesture and music | |
dc.subject.en | interaction | |
dc.subject.en | live performance | |
dc.subject.en | musical human-computer interaction | |
dc.title.en | Movement to emotions to music: using whole body emotional expression as an interaction for electronic music generation | |
dc.type | Conference paper | en_US |
dc.subject.hal | Computer Science [cs]/Human-Computer Interaction [cs.HC] | en_US |
dc.subject.hal | Computer Science [cs]/Sound [cs.SD] | en_US |
bordeaux.hal.laboratories | ESTIA - Recherche | en_US |
bordeaux.institution | Université de Bordeaux | en_US |
bordeaux.conference.title | New Interfaces for Musical Expression | en_US |
bordeaux.country | us | en_US |
bordeaux.title.proceeding | Proceedings of the 12th International Conference on New Interfaces for Musical Expression (NIME'12) | en_US |
bordeaux.conference.city | Ann Arbor | en_US |
bordeaux.import.source | hal | |
hal.identifier | hal-00771385 | |
hal.version | 1 | |
hal.invited | no | en_US |
hal.proceedings | yes | en_US |
hal.conference.end | 2012-05-23 | |
hal.popular | no | en_US |
hal.audience | International | en_US |
hal.export | false | |
workflow.import.source | hal | |
dc.rights.cc | No CC license | en_US |