Please use this identifier to cite or link to this item: http://hdl.handle.net/10553/73290
Title: An attention recurrent model for human cooperation detection
Authors: Freire-Obregón, David 
Castrillón-Santana, Modesto 
Barra, Paola
Bisogni, Carmen
Nappi, Michele
UNESCO Classification: 120304 Artificial intelligence
220990 Digital processing. Images
Issue Date: 2020
Journal: Computer Vision and Image Understanding
Abstract: Cooperative user behaviour is mandatory and valuable to guarantee data acquisition quality in forensic biometrics. In this paper, we consider human cooperative behaviour in front of wearable security cameras and propose a deep-learning-based pipeline for human cooperation detection. Recurrent neural networks (RNNs) have recently shown remarkable performance on tasks such as image captioning, video analysis, and natural language processing. Our proposal describes an RNN architecture aimed at detecting whether a human is exhibiting adversarial behaviour by trying to avoid the camera; the input data are obtained by analysing the noise patterns of human movement. More specifically, we provide not only an extensive analysis of the proposed pipeline under different configurations and a wide variety of RNN types, but also an ensemble of the generated models that outperforms each individual model. The experiments have been carried out on videos captured with a mobile device camera (GOTCHA Dataset), and the results demonstrate the robustness of the proposed method.
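Illustrative note: the following is a minimal, hypothetical sketch of the kind of approach the abstract describes (an RNN over per-frame movement features plus an ensemble of trained models), not the authors' published code. The class and function names, the 64-dimensional feature input, and the choice of a GRU in PyTorch are assumptions made here for illustration only.

# Hypothetical sketch (assumptions, not the authors' implementation):
# a GRU-based sequence classifier over per-frame motion-noise features,
# and a simple ensemble that averages the probabilities of several models.
import torch
import torch.nn as nn

class CooperationRNN(nn.Module):
    def __init__(self, feat_dim=64, hidden_dim=128, num_layers=1):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, hidden_dim, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_dim, 2)   # cooperative vs. adversarial

    def forward(self, x):            # x: (batch, time, feat_dim)
        _, h = self.rnn(x)           # h: (num_layers, batch, hidden_dim)
        return self.head(h[-1])      # logits from the last hidden state

@torch.no_grad()
def ensemble_predict(models, x):
    # Average softmax outputs of several trained models, then take argmax.
    probs = torch.stack([torch.softmax(m(x), dim=-1) for m in models])
    return probs.mean(dim=0).argmax(dim=-1)

# Toy usage: 8 clips, 30 time steps, 64-dim motion features per step.
models = [CooperationRNN() for _ in range(3)]
x = torch.randn(8, 30, 64)
print(ensemble_predict(models, x))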
URI: http://hdl.handle.net/10553/73290
ISSN: 1077-3142
DOI: 10.1016/j.cviu.2020.102991
Source: Computer Vision and Image Understanding [ISSN 1077-3142], v. 197-198, (August 2020)
Appears in Collections: Artículos

SCOPUS™ Citations: 19 (checked on Dec 15, 2024)
Web of Science™ Citations: 17 (checked on Dec 15, 2024)
Page view(s): 120 (checked on Jun 22, 2024)


Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.