RGB-D Object Tracking: A Particle Filter Approach on GPU


This paper presents a particle filtering approach for 6-DOF object pose tracking using an RGB-D camera. Our particle filter is massively parallelized on a modern GPU so that it achieves real-time performance even with several thousand particles. Given an a priori 3D mesh model, the proposed approach renders the object model onto texture buffers on the GPU, and the rendered results are used directly by our parallelized likelihood evaluation. Both photometric (color) and geometric (3D points and surface normals) features are employed to determine the likelihood of each particle with respect to a given RGB-D scene. Our approach is compared with a tracker in the Point Cloud Library (PCL), quantitatively on synthetic RGB-D sequences and qualitatively on real ones.
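The abstract describes a particle filter whose per-particle likelihood combines photometric and geometric errors, followed by resampling. Below is a minimal sketch of that weight-update/resampling loop; the Gaussian error kernels, their sigmas, and the simplified 6-DOF pose representation are illustrative assumptions, not the paper's actual (GPU-rendered) likelihood.

```python
import numpy as np

rng = np.random.default_rng(0)

def likelihood(photo_err, geo_err, sigma_c=0.1, sigma_g=0.05):
    """Combine color and geometry errors into an unnormalized weight.
    Gaussian kernels with illustrative sigmas (assumptions, not from the paper)."""
    return (np.exp(-photo_err**2 / (2 * sigma_c**2))
            * np.exp(-geo_err**2 / (2 * sigma_g**2)))

def update(weights, photo_err, geo_err):
    """One measurement update: reweight each particle and normalize."""
    w = weights * likelihood(photo_err, geo_err)
    return w / w.sum()

def resample(particles, weights):
    """Systematic resampling: high-weight particles are duplicated."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    return particles[idx]

# Toy demo: 1000 particles as 6-DOF pose perturbations with random errors.
n = 1000
particles = rng.normal(size=(n, 6))
photo_err = rng.random(n)
geo_err = rng.random(n)
weights = update(np.full(n, 1.0 / n), photo_err, geo_err)
particles = resample(particles, weights)
```

In the paper this evaluation is performed in parallel on the GPU against rendered model views, rather than serially as above.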


Changhyun Choi
College of Computing,
Georgia Tech
heanylab [at] gmail.com

Henrik Christensen
College of Computing,
Georgia Tech
hic [at] cc.gatech.edu






This work has in part been sponsored by the Boeing Corporation. The support is gratefully acknowledged.


[bibtex key=choi13:_rgb_d_objec_track]

Posted in Conference, IROS, Multi-robot Semantic Mapping, Publications.