Persistent identifier to cite or link this item:
https://accedacris.ulpgc.es/jspui/handle/10553/154924
Title: Predicting Soccer Penalty Kick Direction Using Human Action Recognition
Authors: Freire Obregón, David Sebastián; Santana Jaria, Oliverio Jesús; Lorenzo-Navarro, Javier; Hernández Sosa, José Daniel; Castrillón Santana, Modesto Fernando
UNESCO classification: 2405 Biometrics
Keywords: Action Prediction; Human Action Recognition; Penalty Kick; Soccer; Vision Transformers
Publication date: 2026
Serial publication: Lecture Notes in Computer Science
Conference: 23rd International Conference on Image Analysis and Processing (ICIAP) 2025
Abstract: Action anticipation has become a prominent topic in Human Action Recognition (HAR). However, its application to real-world sports scenarios remains limited by the availability of suitable annotated datasets. This work presents a novel dataset of manually annotated soccer penalty kicks for predicting shot direction from pre-kick player movements. To benchmark this dataset, we propose a deep learning classifier that integrates HAR-based feature embeddings with contextual metadata. We evaluate twenty-two backbone models across seven architecture families (MViTv2, MViTv1, SlowFast, Slow, X3D, I3D, C2D), achieving up to 63.9% accuracy in predicting shot direction (left or right), outperforming the real goalkeepers' decisions. These results demonstrate the dataset's value for anticipatory action recognition and validate our model's potential as a generalizable approach for sports-based predictive tasks.
URI: https://accedacris.ulpgc.es/jspui/handle/10553/154924
ISBN: 978-3-032-10184-6
ISSN: 0302-9743
DOI: 10.1007/978-3-032-10185-3_21
Source: Lecture Notes in Computer Science [ISSN 0302-9743], v. 16167 LNCS, p. 260-272, (January 2026)
Collection: Conference proceedings
Items in ULPGC accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.
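
The abstract above describes a classifier that fuses HAR backbone embeddings with contextual metadata to predict shot direction. The following is a minimal illustrative sketch of that kind of fusion head, not the authors' code: the embedding size, metadata encoding, and MLP head (PenaltyDirectionClassifier, embed_dim, meta_dim) are assumptions made for the example.

```python
# Illustrative sketch only (not the paper's implementation): fuse a pooled video
# backbone embedding (e.g., from an MViTv2 / SlowFast / X3D feature extractor)
# with a small contextual-metadata vector and classify left vs. right.
import torch
import torch.nn as nn


class PenaltyDirectionClassifier(nn.Module):
    def __init__(self, embed_dim=768, meta_dim=8, hidden_dim=256, num_classes=2):
        super().__init__()
        # Project contextual metadata (assumed pre-encoded as a numeric vector).
        self.meta_proj = nn.Sequential(nn.Linear(meta_dim, 32), nn.ReLU())
        # MLP head over the concatenated video embedding and metadata projection.
        self.head = nn.Sequential(
            nn.Linear(embed_dim + 32, hidden_dim),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(hidden_dim, num_classes),  # logits for left / right
        )

    def forward(self, har_embedding, metadata):
        # har_embedding: (B, embed_dim) pooled features from a HAR backbone
        # metadata:      (B, meta_dim) encoded contextual attributes
        fused = torch.cat([har_embedding, self.meta_proj(metadata)], dim=-1)
        return self.head(fused)


if __name__ == "__main__":
    model = PenaltyDirectionClassifier()
    logits = model(torch.randn(4, 768), torch.randn(4, 8))
    print(logits.shape)  # torch.Size([4, 2])
```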