Show simple item record
Asymmetric multi-task learning for interpretable gaze-driven grasping action forecasting
dc.rights.license | open | en_US |
dc.contributor.author | GONZALEZ-DIAZ, Ivan | |
dc.contributor.author | MOLINA-MORENO, Miguel | |
hal.structure.identifier | Laboratoire Bordelais de Recherche en Informatique [LaBRI] | |
dc.contributor.author | BENOIS-PINEAU, Jenny | |
hal.structure.identifier | Institut de Neurosciences cognitives et intégratives d'Aquitaine [INCIA] | |
dc.contributor.author | DE RUGY, Aymar | |
dc.date.accessioned | 2024-09-30T08:42:13Z | |
dc.date.available | 2024-09-30T08:42:13Z | |
dc.date.issued | 2024-07-18 | |
dc.identifier.issn | 2168-2208 | en_US |
dc.identifier.uri | https://oskar-bordeaux.fr/handle/20.500.12278/202009 | |
dc.description.abstractEn | This work tackles the problem of automatically predicting the grasping intention of humans observing their environment, using eye-tracker glasses and video cameras that record the scene view. Our target application is assistance to people with motor disabilities and potential cognitive impairments through assistive robotics. Our proposal leverages the analysis of human attention, captured as gaze fixations recorded by an eye-tracker on the first-person video, since the anticipation of prehension actions is a well-studied and well-known phenomenon. We propose a multi-task system that simultaneously addresses the prediction of human attention in the near future and the anticipation of grasping actions. In our model, visual attention is modeled as a competitive process between a discrete set of states, each associated with a well-known gaze-movement pattern from visual psychology. We further consider an asymmetric multi-task problem, in which attention modeling is an auxiliary task that helps regularize the learning of the main action-prediction task, and propose a constrained multi-task loss that naturally handles this asymmetry. Our model outperforms other losses for dynamic multi-task learning, current dominant deep architectures for general action forecasting, and models specifically tailored to predicting grasping intention. In particular, it achieves state-of-the-art performance on three datasets for egocentric action anticipation, with an average precision of 0.569 and 0.524 on the GITW and Sharon datasets, respectively, and an accuracy of 89.2% and a success rate of 51.7% on the Invisible dataset. | |
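The abstract's key idea — an auxiliary attention task that may regularize, but not dominate, the main action-forecasting task — can be illustrated with a toy combination rule. This is a minimal sketch only: the paper's actual constrained loss is not reproduced here, and `max_ratio` is a hypothetical knob introduced purely for illustration.

```python
def constrained_multitask_loss(main_loss, aux_loss, max_ratio=0.5):
    """Combine a main and an auxiliary loss asymmetrically.

    The auxiliary contribution is capped at `max_ratio` times the main
    loss, so the auxiliary (attention) task can only regularize the main
    (action-forecasting) task, never dominate it. Both the cap rule and
    `max_ratio` are illustrative assumptions, not taken from the paper.
    """
    capped_aux = min(aux_loss, max_ratio * main_loss)
    return main_loss + capped_aux

# An oversized auxiliary loss is truncated to half the main loss:
total = constrained_multitask_loss(main_loss=1.0, aux_loss=2.0)  # -> 1.5
```

The asymmetry is what distinguishes this from a plain weighted sum: gradients from the auxiliary term are bounded relative to the main objective, matching the abstract's framing of attention prediction as a helper rather than a co-equal task.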
dc.language.iso | EN | en_US |
dc.rights | Attribution-NonCommercial-NoDerivs 3.0 United States | * |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/3.0/us/ | * |
dc.subject.en | Grasping Action Forecasting | |
dc.subject.en | Multi-Task Learning | |
dc.subject.en | Interpretable Attention Prediction | |
dc.subject.en | Constrained Loss | |
dc.title.en | Asymmetric multi-task learning for interpretable gaze-driven grasping action forecasting | |
dc.title.alternative | IEEE J Biomed Health Inform | en_US |
dc.type | Journal article | en_US
dc.identifier.doi | 10.1109/JBHI.2024.3430810 | en_US |
dc.subject.hal | Sciences du Vivant [q-bio]/Neurosciences [q-bio.NC] | en_US |
dc.identifier.pubmed | 39024089 | en_US |
bordeaux.journal | IEEE Journal of Biomedical and Health Informatics | en_US |
bordeaux.page | 1-17 | en_US |
bordeaux.hal.laboratories | Institut de neurosciences cognitives et intégratives d'Aquitaine (INCIA) - UMR 5287 | en_US |
bordeaux.institution | Université de Bordeaux | en_US |
bordeaux.institution | CNRS | en_US |
bordeaux.peerReviewed | yes | en_US
bordeaux.inpress | no | en_US
bordeaux.identifier.funderID | Spanish National Plan for Scientific and Technical Research and Innovation | en_US |
bordeaux.import.source | hal | |
hal.identifier | hal-04684197 | |
hal.version | 1 | |
hal.popular | no | en_US
hal.audience | International | en_US
hal.export | false | |
workflow.import.source | hal | |
dc.rights.cc | CC BY-NC-ND | en_US |
bordeaux.COinS | ctx_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.jtitle=IEEE%20Journal%20of%20Biomedical%20and%20Health%20Informatics&rft.date=2024-07-18&rft.spage=1-17&rft.epage=1-17&rft.eissn=2168-2208&rft.issn=2168-2208&rft.au=GONZALEZ-DIAZ,%20Ivan&MOLINA-MORENO,%20Miguel&BENOIS-PINEAU,%20Jenny&DE%20RUGY,%20Aymar&rft.genre=article |