Show simple item record

dc.rights.license: open [en_US]
hal.structure.identifier: ESTIA INSTITUTE OF TECHNOLOGY
dc.contributor.author: BOTTECCHIA, Sébastien
    ORCID: 0000-0001-9414-9778
    IDREF: 157656802
hal.structure.identifier: ESTIA INSTITUTE OF TECHNOLOGY
dc.contributor.author: CANOU, Joseph
hal.structure.identifier: ESTIA INSTITUTE OF TECHNOLOGY
dc.contributor.author: GOMEZ, David
    IDREF: 154885509
hal.structure.identifier: Laboratoire Bordelais de Recherche en Informatique [LaBRI]
dc.contributor.author: CHAUMETTE, Serge
    IDREF: 034300236
hal.structure.identifier: ESTIA INSTITUTE OF TECHNOLOGY
dc.contributor.author: COUTURE, Nadine
    ORCID: 0000-0001-7959-5227
    IDREF: 111534275
dc.date.accessioned: 2023-05-23T14:09:21Z
dc.date.available: 2023-05-23T14:09:21Z
dc.date.issued: 2018-10-10
dc.date.conference: 2018-10-10
dc.identifier.uri: https://oskar-bordeaux.fr/handle/20.500.12278/182272
dc.description.abstractEn: This work aims to explore the use of a gesture-based interaction language called SoundPainting to interact with a swarm of unmanned air systems (UAS). A swarm is not just a collection of UASs; it is much more, both in terms of supported features and in terms of issues. The goal of this communication is to present the issues linked to interacting with a semiautonomous swarm through gestures. The key point is that SoundPainting was designed for directing a set of improvising live performers, and we therefore link this ability to improvise to the decision-making capacity of the autonomous drones. SoundPainting already integrates the notion of groups of entities and makes it possible to address a single entity of a set/subset while still being able to address the set as a whole. Indeed, SoundPainting allows a real exchange and an adaptive dialogue between the soundpainter (here, the pilot) and the group, enabling contextual interpretation by each individual and generating rich interaction and dialogue. In the proposed approach, a pilot performs SoundPainting gestures in front of a Kinect sensor (©Microsoft); the gesture recognized by a gesture recognition system is then sent to the command-and-control system in order to direct the drones' movements. In a more general context, our perspective, beyond the definition of a SoundPainting-based gestural interaction for individual UASs or for swarms of UASs, is to experiment with it, first within a simulator, then on flying platforms, and finally in real-world scenarios. We hope to come up with answers to the following questions: How relevant are the gestures? How reliable is their detection? What is the user experience?
dc.language.iso: EN [en_US]
dc.title.en: Interacting with a swarm of semiautonomous drones with SoundPainting Gestures
dc.type: Other scientific communication (conference without proceedings, poster, seminar, etc.) [en_US]
dc.subject.hal: Computer Science [cs]/Human-Computer Interaction [cs.HC] [en_US]
dc.subject.hal: Computer Science [cs]/Computer Vision and Pattern Recognition [cs.CV] [en_US]
dc.subject.hal: Computer Science [cs]/Embedded Systems [en_US]
dc.subject.hal: Computer Science [cs]/Robotics [cs.RO] [en_US]
bordeaux.hal.laboratories: ESTIA - Recherche [en_US]
bordeaux.institution: Université de Bordeaux [en_US]
bordeaux.institution: Bordeaux INP [en_US]
bordeaux.institution: Bordeaux Sciences Agro [en_US]
bordeaux.conference.title: Unmanned and Swarming Conference: Research Challenges for Future Unmanned Systems and Autonomous Swarming [en_US]
bordeaux.country: fr [en_US]
bordeaux.conference.city: Bordeaux [en_US]
bordeaux.peerReviewed: no [en_US]
bordeaux.import.source: hal
hal.identifier: hal-01973067
hal.version: 1
hal.export: false
workflow.import.source: hal
dc.rights.cc: No CC License [en_US]
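
The abstract above describes a simple pipeline: SoundPainting gestures performed in front of a Kinect sensor are recognized and forwarded as commands to the swarm's command-and-control layer, which can address the whole group or a single entity. Below is a minimal illustrative sketch of that dispatch step; the gesture labels, command names, and the SwarmController interface are hypothetical assumptions made for illustration, not the authors' actual implementation.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class SwarmController:
    """Hypothetical command interface toward the UAS swarm (illustration only)."""
    drone_ids: List[str] = field(default_factory=lambda: ["uas-1", "uas-2", "uas-3"])
    addressed: List[str] = field(default_factory=list)  # currently addressed subset

    def address(self, ids: List[str]) -> None:
        # SoundPainting can target the whole group or a single entity of a set/subset.
        self.addressed = ids

    def send(self, command: str) -> None:
        # Stand-in for the real control link toward each addressed drone.
        for drone in (self.addressed or self.drone_ids):
            print(f"{drone} <- {command}")


# Hypothetical gesture labels, as a Kinect-based recognizer might emit them.
GESTURE_TO_ACTION: Dict[str, Callable[[SwarmController], None]] = {
    "whole_group":  lambda c: c.address(c.drone_ids),
    "single_uas_1": lambda c: c.address(["uas-1"]),
    "play":         lambda c: c.send("take_off"),
    "off":          lambda c: c.send("hold_position"),
}


def on_gesture_recognized(label: str, ctrl: SwarmController) -> None:
    """Dispatch one recognized gesture to the command layer (sketch only)."""
    action = GESTURE_TO_ACTION.get(label)
    if action is not None:
        action(ctrl)
    # Unknown gestures are ignored; the semiautonomous drones keep deciding locally.


if __name__ == "__main__":
    ctrl = SwarmController()
    for g in ["whole_group", "play", "single_uas_1", "off"]:
        on_gesture_recognized(g, ctrl)

In this sketch the recognizer and control layer are decoupled by a plain lookup table, which mirrors the abstract's separation between gesture recognition and the command system; any real system would replace the print calls with the actual drone control link.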


Files in this item


No files are associated with this item.

