
dc.rights.license: open (en_US)
hal.structure.identifier: Institut de Neurosciences cognitives et intégratives d'Aquitaine [INCIA]
dc.contributor.author: LENTO, Bianca
hal.structure.identifier: Institut de Neurosciences cognitives et intégratives d'Aquitaine [INCIA]
dc.contributor.author: SEGAS, Effie
hal.structure.identifier: Institut de Neurosciences cognitives et intégratives d'Aquitaine [INCIA]
dc.contributor.author: LECONTE, Vincent
hal.structure.identifier: Institut de Neurosciences cognitives et intégratives d'Aquitaine [INCIA]
dc.contributor.author: DOAT, Emilie
dc.contributor.author: DANION, Frederic
dc.contributor.author: PETERI, Renaud
hal.structure.identifier: Laboratoire Bordelais de Recherche en Informatique [LaBRI]
dc.contributor.author: BENOIS-PINEAU, Jenny
hal.structure.identifier: Institut de Neurosciences cognitives et intégratives d'Aquitaine [INCIA]
dc.contributor.author: DE RUGY, Aymar
dc.date.accessioned: 2024-10-14T09:56:33Z
dc.date.available: 2024-10-14T09:56:33Z
dc.date.issued: 2024-08-30
dc.identifier.issn: 2052-4463 (en_US)
dc.identifier.uri: https://oskar-bordeaux.fr/handle/20.500.12278/202474
dc.description.abstract (En): 3D-ARM-Gaze is a public dataset designed to provide natural arm movements together with visual and gaze information when reaching objects across a wide reachable space from a precisely controlled, comfortably seated posture. Participants picked and placed objects in various positions and orientations in a virtual environment; a specific procedure maximized the workspace explored while ensuring a consistent seated posture by guiding participants to a predetermined neutral posture via visual feedback on trunk and shoulder position. These experimental settings enabled the capture of natural arm movements with high median success rates (>98% of objects reached) and minimal compensatory movements. The dataset comprises more than 2.5 million samples recorded from 20 healthy participants performing 14,000 single pick-and-place movements (700 per participant). While initially designed to explore novel prosthesis control strategies based on natural eye-hand and arm coordination, this dataset will also be useful to researchers interested in core sensorimotor control, humanoid robotics, and human-robot interactions, as well as for the development and testing of associated solutions in gaze-guided computer vision.
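The dataset figures quoted in the abstract can be sanity-checked with a short script. The numbers (20 participants, 700 movements each, more than 2.5 million samples) come from the abstract itself; the per-movement figure is only a derived estimate, not a documented property of the dataset.

```python
# Sanity-check the dataset bookkeeping quoted in the abstract.
participants = 20
movements_per_participant = 700
total_movements = participants * movements_per_participant
print(total_movements)  # 14000, matching the abstract

# Derived estimate only: average samples per pick-and-place movement,
# treating the ">2.5 million samples" figure as exactly 2.5 million.
total_samples = 2_500_000
samples_per_movement = total_samples / total_movements
print(round(samples_per_movement))  # roughly 179 samples per movement
```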
dc.description.sponsorship: Intuitive control of a wrist prosthesis based on natural movement and visual information (Contrôle intuitif d'une prothèse de poignet à base de mouvement naturel et d'information visuelle) - ANR-23-CE19-0031 (en_US)
dc.language.iso: EN (en_US)
dc.rights: Attribution-NonCommercial-NoDerivs 3.0 United States
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/3.0/us/
dc.subject.en: Arm
dc.subject.en: Fixation
dc.subject.en: Movement
dc.subject.en: Virtual Reality
dc.title.en: 3D-ARM-Gaze: a public dataset of 3D Arm Reaching Movements with Gaze information in virtual reality
dc.title.alternative: Sci Data (en_US)
dc.type: Journal article (Article de revue) (en_US)
dc.identifier.doi: 10.1038/s41597-024-03765-4 (en_US)
dc.subject.hal: Sciences du Vivant [q-bio]/Neurosciences [q-bio.NC] (en_US)
dc.identifier.pubmed: 39214999 (en_US)
bordeaux.journal: Scientific Data (en_US)
bordeaux.volume: 11 (en_US)
bordeaux.hal.laboratories: Institut de neurosciences cognitives et intégratives d'Aquitaine (INCIA) - UMR 5287 (en_US)
bordeaux.issue: 1 (en_US)
bordeaux.institution: Université de Bordeaux (en_US)
bordeaux.institution: CNRS (en_US)
bordeaux.peerReviewed: yes (oui) (en_US)
bordeaux.inpress: no (non) (en_US)
bordeaux.identifier.funderID: Direction Générale de l’Armement (en_US)
bordeaux.import.source: hal
hal.identifier: hal-04684166
hal.version: 1
hal.popular: no (non) (en_US)
hal.audience: International (Internationale) (en_US)
hal.export: false
workflow.import.source: hal
dc.rights.cc: CC BY-NC-ND (en_US)
bordeaux.COinS: ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=Scientific%20Data&rft.date=2024-08-30&rft.volume=11&rft.issue=1&rft.eissn=2052-4463&rft.issn=2052-4463&rft.au=LENTO,%20Bianca&SEGAS,%20Effie&LECONTE,%20Vincent&DOAT,%20Emilie&DANION,%20Frederic&rft.genre=article
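The COinS field above is an OpenURL ContextObject (Z39.88-2004) serialized as a percent-encoded query string, so its citation fields can be recovered with a standard query-string parser. A minimal sketch using Python's standard library, applied to the well-formed key=value pairs from this record (note that in the string above, co-author names after the first `rft.au=` are concatenated with bare `&` and carry no key, so a standard parser will only recover the properly keyed pairs):

```python
from urllib.parse import parse_qs

# Well-formed key=value pairs taken from the bordeaux.COinS field above.
coins = ("ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal"
         "&rft.jtitle=Scientific%20Data&rft.date=2024-08-30"
         "&rft.volume=11&rft.issue=1&rft.eissn=2052-4463"
         "&rft.issn=2052-4463&rft.au=LENTO,%20Bianca&rft.genre=article")

# parse_qs percent-decodes values and groups repeated keys into lists;
# tokens without an '=' sign (the malformed co-author names) are dropped.
fields = {key: values[0] for key, values in parse_qs(coins).items()}
print(fields["rft.jtitle"], fields["rft.date"], fields["rft.volume"])
# Scientific Data 2024-08-30 11
```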

