Please use this identifier to cite or link to this item: https://accedacris.ulpgc.es/jspui/handle/10553/158201
Title: Video Action Recognition in SoC FPGAs Driven by Neural Architecture Search
Authors: González Suárez,Daniel De Jesús 
Hernández Fernández, Pedro 
Fernández, Víctor
Marrero Callicó, Gustavo Iván 
UNESCO Classification: 3307 Electronic technology
Keywords: Neural Architecture Search
FPGA
System on Chip
Video Action Recognition
Reinforcement Learning, et al
Issue Date: 2025
Project: OASIS Open AI-driven Stack for enhanced HPEC platforms in Integrated Systems
Conference: 40th Conference on Design of Circuits and Integrated Systems (DCIS) 2025. Santander
Abstract: This work presents a hardware-aware Neural Architecture Search (NAS) framework for video-based human action recognition, targeting real-time deployment on FPGA-based System-on-Chip (SoC) platforms. The proposed method explores a constrained search space of Convolutional Neural Network (CNN)–Recurrent Neural Network (RNN) architectures aligned with a hardware-software pipeline where CNNs are mapped to FPGA Deep Learning Processing Units (DPUs) and RNNs to embedded ARM cores. A reinforcement learning (RL)-based controller, guided by a position-based discounted reward strategy, progressively learns to generate architectures that emphasize high-impact design decisions. Experiments on the UCF101 dataset demonstrate that the proposed architectures achieve 81.07% accuracy, among the highest reported for CNN-RNN models relying exclusively on spatial information. The results validate the effectiveness of the proposed framework in driving hardware-compatible and performance-optimized architecture exploration.
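
The abstract's "position-based discounted reward strategy" can be illustrated with a minimal REINFORCE-style sketch: a scalar reward (e.g., validation accuracy of a candidate architecture) is assigned to each controller decision, discounted by the decision's position so that earlier, higher-impact choices receive more credit. The code below is an illustrative assumption, not the authors' implementation; the search space, discount scheme, and evaluation function are placeholders.

```python
# Hypothetical sketch of a position-based discounted reward for an
# RL-driven NAS controller (REINFORCE-style). All names, the toy search
# space, and the discount scheme are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy search space: at each position the controller picks one option
# (e.g., CNN backbone, RNN cell type, hidden size).
SEARCH_SPACE = [4, 3, 3, 2]          # number of choices per decision position
GAMMA = 0.9                          # position-based discount factor

# Controller parameters: independent softmax logits per position.
logits = [np.zeros(n) for n in SEARCH_SPACE]

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def sample_architecture():
    """Sample one architecture (a list of choice indices) from the controller."""
    return [rng.choice(n, p=softmax(l)) for l, n in zip(logits, SEARCH_SPACE)]

def evaluate(arch):
    """Placeholder for training/evaluating the CNN-RNN candidate.
    In the paper this would be accuracy on UCF101 under the DPU/ARM
    hardware constraints; here it is a toy score in [0, 1]."""
    return float(sum(arch)) / sum(n - 1 for n in SEARCH_SPACE)

def position_discounted_rewards(reward, num_positions, gamma=GAMMA):
    """Assign the scalar reward to each decision, discounted by position,
    so earlier (assumed higher-impact) choices receive more credit."""
    return [reward * (gamma ** t) for t in range(num_positions)]

def reinforce_update(arch, reward, lr=0.1, baseline=0.0):
    """One REINFORCE step using the position-discounted credit."""
    credits = position_discounted_rewards(reward - baseline, len(arch))
    for pos, (choice, credit) in enumerate(zip(arch, credits)):
        probs = softmax(logits[pos])
        grad = -probs
        grad[choice] += 1.0              # d log pi / d logits for the sampled choice
        logits[pos] += lr * credit * grad

# Tiny search loop with a moving-average baseline.
baseline = 0.0
for step in range(200):
    arch = sample_architecture()
    r = evaluate(arch)
    reinforce_update(arch, r, baseline=baseline)
    baseline = 0.9 * baseline + 0.1 * r

print("greedy architecture after search:", [int(np.argmax(l)) for l in logits])
```

Under these assumptions, the discount `gamma ** t` is what makes the controller concentrate learning signal on the first positions of the decision sequence, matching the abstract's emphasis on high-impact design decisions.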
URI: https://accedacris.ulpgc.es/jspui/handle/10553/158201
ISBN: 979-8-3315-8091-9
DOI: 10.1109/DCIS67520.2025.11281932
Appears in Collections:Ponencias
File: Adobe PDF (451.07 kB)
Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.