Detection of Risky Situations for Frail Adults With Hybrid Neural Networks on Multimodal Health Data
Language
EN
Journal article
This item is published in
IEEE MultiMedia. 2022-02-01, vol. 29, n° 1, p. 7-17
Abstract (in English)
In healthcare applications, the multimedia methodology is applied to multimodal signals and visual data. This article focuses on the detection of risk situations for frail people from lifelog multimodal signals and video recorded with wearable sensors. We propose a hybrid 3D convolutional neural network (3DCNN) and gated recurrent unit (GRU) deep architecture (3DCNN-GRU) with two branches. The first branch is a GRU network with a global attention block for classification of multisensor signal data. The second branch is a 3DCNN whose windowing is synchronized with the multidimensional time-series signals. The two branches of the network are fused, yielding promising results: the method achieves 83.26% accuracy on the BIRDS dataset. Benchmarking is also carried out on a publicly available action-recognition dataset.
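The two-branch fusion described in the abstract can be sketched in miniature. The following NumPy toy stands in for the real architecture: the paper's GRU with a global attention block is replaced here by a simple attention-pooled summary of the signal, and the 3DCNN by crude hand-crafted clip features; all shapes, weights, and function names are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def signal_branch(signal, w_att):
    """Attention-pooled summary of a (T, C) multisensor time series
    (toy stand-in for the paper's GRU + global attention branch)."""
    scores = signal @ w_att                      # (T,) one score per time step
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                     # softmax over time
    return weights @ signal                      # (C,) attention-weighted average

def video_branch(video):
    """Crude spatiotemporal features from a (T, H, W) grayscale clip
    (toy stand-in for the paper's 3DCNN branch)."""
    per_frame = video.mean(axis=(1, 2))          # (T,) mean frame intensity
    return np.array([per_frame.mean(), per_frame.var()])

def fuse_and_classify(sig_feat, vid_feat, w_out):
    """Late fusion: concatenate branch embeddings, linear softmax classifier."""
    fused = np.concatenate([sig_feat, vid_feat])
    logits = fused @ w_out                       # (n_classes,)
    probs = np.exp(logits - logits.max())
    return probs / probs.sum()

# Hypothetical toy dimensions: 16 synchronized steps, 4 sensor channels,
# 8x8 frames, 3 risk classes.
T, C, H, W, n_classes = 16, 4, 8, 8, 3
signal = rng.standard_normal((T, C))
video = rng.standard_normal((T, H, W))
w_att = rng.standard_normal(C)
w_out = rng.standard_normal((C + 2, n_classes))

probs = fuse_and_classify(signal_branch(signal, w_att),
                          video_branch(video), w_out)
print(probs)                                     # class probabilities, sum to 1
```

The key design point mirrored here is late fusion with synchronized windowing: both branches summarize the *same* time window before their embeddings are concatenated and classified jointly.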