Persistent identifier to cite or link this item:
http://hdl.handle.net/10553/130231
Title: A dynamic programming approach for fast and robust object pose recognition from range images
Authors: Zach, Christopher; Peñate Sánchez, Adrián; Pham, Minh Tri
UNESCO classification: 1203 Computer science
Keywords: Three-dimensional displays; Sensors; Solid modeling; Robustness; Feature extraction, et al.
Publication date: 2015
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Serial publication: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
Conference: IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Abstract: Joint object recognition and pose estimation solely from range images is an important task, e.g. in robotics applications and in automated manufacturing environments. The lack of color information and the limitations of current commodity depth sensors make this task a challenging computer vision problem, and a standard random-sampling-based approach is prohibitively time-consuming. We propose to address this difficult problem by generating promising inlier sets for pose estimation through early rejection of clear outliers with the help of local belief propagation (or dynamic programming). By exploiting data parallelism our method is fast, and we also do not rely on a computationally expensive training phase. We demonstrate state-of-the-art performance on a standard dataset and illustrate our approach on challenging real sequences.
URI: http://hdl.handle.net/10553/130231
ISBN: 978-1-4673-6964-0; 978-1-4673-6963-3
ISSN: 1063-6919
DOI: 10.1109/CVPR.2015.7298615
Source: IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2015 [ISSN: 1063-6919], pp. 196-203 (June 2015).
Collection: Conference proceedings
Items in ULPGC accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.
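
To make the abstract's core idea concrete, here is a minimal, hypothetical sketch in Python/NumPy of dynamic programming over a chain of sampled scene points: each point carries a small set of candidate model correspondences, preservation of pairwise distances under rigid motion serves as the compatibility cue, and a chain whose best accumulated cost exceeds a tolerance is rejected early, before any pose is fitted. This is not the authors' code; the function names (dp_inlier_chain, kabsch), the chain topology, and the tolerance parameter tau are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def dp_inlier_chain(scene_pts, cand_model_pts, tau=0.01):
        """Viterbi-style dynamic programming over a chain of scene points.

        scene_pts:       (K, 3) array of sampled scene points forming a chain.
        cand_model_pts:  list of K arrays, each (N_i, 3), of candidate model
                         points for the corresponding scene point.
        tau:             assumed per-edge tolerance on the pairwise distance
                         discrepancy (hypothetical parameter).

        Returns the best candidate index per scene point, or None if the chain
        is rejected early as containing clear outliers.
        """
        K = len(scene_pts)
        cost = np.zeros(len(cand_model_pts[0]))   # DP costs for step 0
        back = []                                  # backpointers per step
        for i in range(1, K):
            d_scene = np.linalg.norm(scene_pts[i] - scene_pts[i - 1])
            # Pairwise distances between all candidates of steps i and i-1.
            d_model = np.linalg.norm(
                cand_model_pts[i][:, None, :] - cand_model_pts[i - 1][None, :, :],
                axis=2)                            # shape (N_i, N_{i-1})
            pen = np.abs(d_model - d_scene)        # rigidity violation
            total = pen + cost[None, :]            # accumulate chain cost
            back.append(total.argmin(axis=1))      # best predecessor per candidate
            cost = total.min(axis=1)
            if cost.min() > tau * i:               # early rejection of the sample
                return None
        # Backtrack the minimum-cost assignment along the chain.
        idx = [int(cost.argmin())]
        for bp in reversed(back):
            idx.append(int(bp[idx[-1]]))
        return idx[::-1]

    def kabsch(P, Q):
        """Least-squares rigid transform (R, t) with R @ P[i] + t ~= Q[i],
        via the standard Kabsch/Procrustes algorithm."""
        Pc, Qc = P - P.mean(0), Q - Q.mean(0)
        U, _, Vt = np.linalg.svd(Pc.T @ Qc)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])  # avoid reflection
        R = (U @ D @ Vt).T
        return R, Q.mean(0) - R @ P.mean(0)

Under these assumptions, the correspondences surviving dp_inlier_chain for a sampled chain would be passed to kabsch to hypothesize a rigid pose, so that the expensive pose fitting and verification run only on DP-filtered candidate sets rather than on blind random samples. The per-step minimization over predecessors is the min-sum (Viterbi) form of the local belief propagation the abstract mentions, and each sampled chain can be evaluated independently, which is one way the described data parallelism could be realized.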