Persistent identifier to cite or link this item: http://hdl.handle.net/10553/54978
Title: Ego-motion classification for body-worn videos
Authors: Meng, Zhaoyi
Sánchez, Javier 
Morel, Jean Michel
Bertozzi, Andrea L.
Brantingham, P. Jeffrey
UNESCO classification: 220990 Digital image processing
Publication date: 2018
Series: Mathematics and Visualization
Conference: International Conference on Imaging, Vision and Learning Based on Optimization and PDEs, IVLOPDE 2016
Abstract: Portable cameras record dynamic first-person video footage, and these videos contain information on the motion of the individual on whom the camera is mounted, referred to as the ego. We address the task of discovering ego-motion from the video itself, without any external calibration information. We investigate the use of similarity transformations between successive video frames to extract signals reflecting ego-motions and their frequencies. We use novel graph-based unsupervised and semi-supervised learning algorithms to segment the video frames into different ego-motion categories. Our methods achieve very accurate results on both choreographed test videos and ego-motion videos provided by the Los Angeles Police Department. (An illustrative sketch of the frame-to-frame similarity estimation follows the record fields below.)
URI: http://hdl.handle.net/10553/54978
ISSN: 1612-3786
DOI: 10.1007/978-3-319-91274-5_10
Source: Tai X.-C., Bae E., Lysaker M. (eds.) Imaging, Vision and Learning Based on Optimization and PDEs. IVLOPDE 2016. Mathematics and Visualization. Springer, Cham
Collection: Conference proceedings
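
The abstract describes extracting ego-motion signals from similarity transformations between successive video frames. The following is a minimal, illustrative sketch of that extraction step, not the authors' code: it tracks corner features across consecutive frames with OpenCV and robustly fits a 4-parameter similarity transform (scale, rotation, translation) to each frame pair. The video path and all parameter values are assumptions chosen for illustration.

# Minimal sketch (assumptions throughout): per-frame similarity transforms
# between consecutive frames, yielding scale/rotation/translation signals.
import numpy as np
import cv2

def similarity_signals(video_path):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    if not ok:
        return np.empty((0, 4))
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    signals = []  # one row per frame pair: (scale, angle, tx, ty)
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Detect corner features in the previous frame and track them
        # into the current frame with pyramidal Lucas-Kanade optical flow.
        pts0 = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                       qualityLevel=0.01, minDistance=8)
        if pts0 is not None and len(pts0) >= 4:
            pts1, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts0, None)
            good = status.ravel() == 1
            if good.sum() >= 4:
                # Robustly fit a similarity (uniform scale + rotation + translation).
                M, _ = cv2.estimateAffinePartial2D(pts0[good], pts1[good],
                                                   method=cv2.RANSAC)
                if M is not None:
                    a, b = M[0, 0], M[1, 0]
                    scale = np.hypot(a, b)        # uniform scale factor
                    angle = np.arctan2(b, a)      # in-plane rotation (radians)
                    signals.append((scale, angle, M[0, 2], M[1, 2]))
        prev_gray = gray
    cap.release()
    return np.array(signals)

# Example use (hypothetical file name):
# sig = similarity_signals("bodycam_clip.mp4")

The per-frame scale, rotation, and translation signals produced by such a step could then be analyzed for their oscillation frequencies and passed to a clustering or semi-supervised segmentation stage, in the spirit of the graph-based learning the abstract mentions; the exact pipeline used in the paper may differ.
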
Items in ULPGC accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.