Please use this identifier to cite or link to this item: http://hdl.handle.net/10553/130231
Title: A dynamic programming approach for fast and robust object pose recognition from range images
Authors: Zach, Christopher
Peñate Sánchez, Adrián 
Pham, Minh Tri
UNESCO Classification: 1203 Computer science
Keywords: Three-dimensional displays
Sensors
Solid modeling
Robustness
Feature extraction, et al
Issue Date: 2015
Publisher: Institute of Electrical and Electronics Engineers (IEEE) 
Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition 
Conference: IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR) 
Abstract: Joint object recognition and pose estimation solely from range images is an important task in, e.g., robotics applications and automated manufacturing environments. The lack of color information and limitations of current commodity depth sensors make this task a challenging computer vision problem, and a standard random-sampling-based approach is prohibitively time-consuming. We propose to address this difficult problem by generating promising inlier sets for pose estimation, rejecting clear outliers early with the help of local belief propagation (or dynamic programming). By exploiting data parallelism our method is fast, and we also do not rely on a computationally expensive training phase. We demonstrate state-of-the-art performance on a standard dataset and illustrate our approach on challenging real sequences.
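
The sketch below is a rough, hypothetical illustration of the general idea named in the abstract (pruning candidate model-scene correspondences by a dynamic-programming pass over pairwise geometric consistency, then estimating a rigid pose from the survivors); it is not the authors' implementation, and all function names, the chain structure over candidates, and the thresholds are assumptions made for illustration only.

    import numpy as np

    def pairwise_consistent(m_i, s_i, m_j, s_j, tol=0.01):
        # Two correspondences are consistent under a rigid motion if the
        # model-space and scene-space distances between their points agree.
        return abs(np.linalg.norm(m_i - m_j) - np.linalg.norm(s_i - s_j)) < tol

    def chain_dp_inliers(model_pts, scene_pts, candidates, tol=0.01):
        # candidates[k] is a (non-empty) list of scene indices that might match
        # model point k. A Viterbi-style pass keeps, for each model point, the
        # candidate extending the longest pairwise-consistent chain; clearly
        # inconsistent candidates are rejected early.
        n = len(candidates)
        best_len = [[1] * len(c) for c in candidates]
        back = [[-1] * len(c) for c in candidates]
        for k in range(1, n):
            for a, s_a in enumerate(candidates[k]):
                for b, s_b in enumerate(candidates[k - 1]):
                    if pairwise_consistent(model_pts[k], scene_pts[s_a],
                                           model_pts[k - 1], scene_pts[s_b], tol):
                        if best_len[k - 1][b] + 1 > best_len[k][a]:
                            best_len[k][a] = best_len[k - 1][b] + 1
                            back[k][a] = b
        # Trace back the chain (for brevity, starting from the last model point).
        k = n - 1
        a = int(np.argmax(best_len[k]))
        pairs = []
        while k >= 0 and a >= 0:
            pairs.append((k, candidates[k][a]))
            a = back[k][a]
            k -= 1
        return list(reversed(pairs))

    def rigid_pose_kabsch(P, Q):
        # Least-squares rotation R and translation t with R @ P_i + t ~= Q_i (Kabsch).
        cP, cQ = P.mean(0), Q.mean(0)
        H = (P - cP).T @ (Q - cQ)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T
        return R, cQ - R @ cP

Under these assumptions, one would call chain_dp_inliers to obtain the surviving (model, scene) index pairs and then fit the pose with rigid_pose_kabsch on the corresponding 3D points; the paper itself additionally exploits data parallelism to run such pruning fast.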
URI: http://hdl.handle.net/10553/130231
ISBN: 978-1-4673-6964-0
978-1-4673-6963-3
ISSN: 1063-6919
DOI: 10.1109/CVPR.2015.7298615
Source: IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2015. [ISSN: 1063-6919], pp. 196-203 (June 2015).
Appears in Collections: Actas de congresos

Items in accedaCRIS are protected by copyright, with all rights reserved, unless otherwise indicated.