RGB-D Object Tracking: A Particle Filter Approach on GPU

Abstract

This paper presents a particle filtering approach for 6-DOF object pose tracking using an RGB-D camera. Our particle filter is massively parallelized on a modern GPU, so it achieves real-time performance even with several thousand particles. Given an a priori 3D mesh model, the proposed approach renders the object model into texture buffers on the GPU, and the rendered results are used directly by our parallelized likelihood evaluation. Both photometric (color) and geometric (3D point and surface normal) features are employed to determine the likelihood of each particle with respect to a given RGB-D scene. Our approach is compared with a tracker from the Point Cloud Library (PCL), quantitatively on synthetic RGB-D sequences and qualitatively on real ones.
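
The core of the method is the GPU-parallelized likelihood evaluation: for each particle, the model rendered under that particle's pose is compared against the observed RGB-D frame using color, 3D point, and surface-normal errors. The CUDA sketch below illustrates one way such a per-particle scoring step could look, with one thread per particle; the Feature layout, the Gaussian error terms, and all names (evaluateLikelihood, sigmaC, sigmaP, sigmaN, samplesPerParticle) are illustrative assumptions, not the authors' code.

// A minimal sketch (illustrative only): one CUDA thread scores one particle by
// comparing model samples rendered under that particle's pose against the
// observed RGB-D frame. The Feature layout, Gaussian error terms, and all
// parameter names below are assumptions for illustration.
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

struct Feature {          // one rendered or observed pixel sample
    float3 color;         // RGB in [0, 1]
    float3 point;         // 3D point in the camera frame (meters)
    float3 normal;        // unit surface normal
};

__device__ float dot3(float3 a, float3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

__device__ float sqdist3(float3 a, float3 b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx*dx + dy*dy + dz*dz;
}

// One thread per particle: accumulate photometric (color) and geometric
// (3D point, surface normal) errors over the particle's rendered samples
// and convert them into an unnormalized importance weight.
__global__ void evaluateLikelihood(const Feature* rendered,   // numParticles * samplesPerParticle
                                   const Feature* observed,   // matching observed samples
                                   float* weights,
                                   int numParticles, int samplesPerParticle,
                                   float sigmaC, float sigmaP, float sigmaN)
{
    int p = blockIdx.x * blockDim.x + threadIdx.x;
    if (p >= numParticles) return;

    float logLik = 0.0f;
    for (int s = 0; s < samplesPerParticle; ++s) {
        Feature r = rendered[p * samplesPerParticle + s];
        Feature o = observed[p * samplesPerParticle + s];
        float ec = sqdist3(r.color, o.color);        // color error
        float ep = sqdist3(r.point, o.point);        // 3D point error
        float en = 1.0f - dot3(r.normal, o.normal);  // normal misalignment
        logLik += -ec / (2.0f * sigmaC * sigmaC)
                  -ep / (2.0f * sigmaP * sigmaP)
                  -en / (2.0f * sigmaN * sigmaN);
    }
    weights[p] = expf(logLik / samplesPerParticle);  // unnormalized particle weight
}

int main() {
    const int numParticles = 1024, samplesPerParticle = 64;
    const int n = numParticles * samplesPerParticle;

    // Dummy data: rendered and observed samples are identical, so every weight is ~1.
    std::vector<Feature> h_feat(n);
    for (int i = 0; i < n; ++i) {
        h_feat[i].color  = make_float3(0.5f, 0.5f, 0.5f);
        h_feat[i].point  = make_float3(0.0f, 0.0f, 1.0f);
        h_feat[i].normal = make_float3(0.0f, 0.0f, 1.0f);
    }

    Feature *d_rend, *d_obs;
    float *d_w;
    cudaMalloc(&d_rend, n * sizeof(Feature));
    cudaMalloc(&d_obs,  n * sizeof(Feature));
    cudaMalloc(&d_w, numParticles * sizeof(float));
    cudaMemcpy(d_rend, h_feat.data(), n * sizeof(Feature), cudaMemcpyHostToDevice);
    cudaMemcpy(d_obs,  h_feat.data(), n * sizeof(Feature), cudaMemcpyHostToDevice);

    int threads = 256, blocks = (numParticles + threads - 1) / threads;
    evaluateLikelihood<<<blocks, threads>>>(d_rend, d_obs, d_w,
                                            numParticles, samplesPerParticle,
                                            0.1f, 0.02f, 0.3f);

    std::vector<float> h_w(numParticles);
    cudaMemcpy(h_w.data(), d_w, numParticles * sizeof(float), cudaMemcpyDeviceToHost);
    printf("weight[0] = %f\n", h_w[0]);

    cudaFree(d_rend); cudaFree(d_obs); cudaFree(d_w);
    return 0;
}

In the paper the rendered samples come directly from OpenGL texture buffers on the GPU, avoiding a round trip to the CPU; here they are simply filled with dummy host data for illustration.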

Authors

Changhyun Choi
College of Computing,
Georgia Tech
heanylab [at] gmail.com

Henrik Christensen
College of Computing,
Georgia Tech
hic [at] cc.gatech.edu


Paper


Video

Acknowledgement

This work was sponsored in part by the Boeing Corporation. The support is gratefully acknowledged.

Citation

  • [DOI] C. Choi and H. I. Christensen, “RGB-D object tracking: A particle filter approach on GPU,” in Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference on, Tokyo, 2013, pp. 1084-1091.
    [Bibtex]
    @INPROCEEDINGS{choi13:_rgb_d_objec_track,
    author={Changhyun Choi and Christensen, H. I.},
    booktitle={Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference on},
    title={RGB-D object tracking: A particle filter approach on GPU},
    year={2013},
    month={Nov},
    pages={1084-1091},
    abstract={This paper presents a particle filtering approach for 6-DOF object pose tracking using an RGB-D camera. Our particle filter is massively parallelized in a modern GPU so that it exhibits real-time performance even with several thousand particles. Given an a priori 3D mesh model, the proposed approach renders the object model onto texture buffers in the GPU, and the rendered results are directly used by our parallelized likelihood evaluation. Both photometric (colors) and geometric (3D points and surface normals) features are employed to determine the likelihood of each particle with respect to a given RGB-D scene. Our approach is compared with a tracker in the PCL both quantitatively and qualitatively in synthetic and real RGB-D sequences, respectively.},
    keywords={feature extraction;graphics processing units;image colour analysis;image sequences;image texture;maximum likelihood estimation;object tracking;particle filtering (numerical methods);pose estimation;rendering (computer graphics);solid modelling;3D points feature;6-DOF object pose tracking;GPU;PCL;RGB-D object tracking;RGB-D sequence;a priori 3D mesh model;colors feature;degrees-of-freedom;geometric feature;graphics processing unit;object rendering;parallelized likelihood evaluation;particle filter approach;photometric feature;red-green-blue depth;surface normals feature;synthetic sequence;texture buffers;Cameras;Graphics processing units;Image color analysis;Rendering (computer graphics);Robots;Solid modeling;Three-dimensional displays},
    doi={10.1109/IROS.2013.6696485},
    ISSN={2153-0858},
    address={Tokyo},
    url_link={http://dx.doi.org/10.1109/IROS.2013.6696485}
    }