OmniMapper: A Modular Multimodal Mapping Framework

Abstract

Simultaneous Localization and Mapping (SLAM) is not a problem with a one-size-fits-all solution. The literature includes a variety of SLAM approaches targeted at different environments, platforms, sensors, CPU budgets, and applications. We propose OmniMapper, a modular multimodal framework and toolbox for solving SLAM problems. The system can be used to generate pose graphs, do feature-based SLAM, and also includes tools for semantic mapping. Multiple measurement types from different sensors can be combined for multimodal mapping. It is open with standard interfaces to allow easy integration of new sensors and feature types. We present a detailed description of the mapping approach, as well as a software framework that implements this, and present detailed descriptions of its applications to several domains including mapping with a service robot in an indoor environment, large-scale mapping on a PackBot, and mapping with a handheld RGBD camera.
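The idea of combining multiple measurement types in a single pose graph can be illustrated with a toy least-squares example. The sketch below is purely illustrative and is not OmniMapper's actual API (the framework itself is a C++ library built on factor-graph optimization); the 1-D poses, noise values, and helper functions are all invented for this example. Three noisy wheel-odometry constraints and one more precise constraint from a second sensor are fused jointly by minimizing a weighted sum of squared residuals.

```python
# Toy multimodal pose-graph fusion (hypothetical sketch, not OmniMapper code).
# 1-D poses x1..x3; x0 is fixed at the origin. Each factor constrains
# x_j - x_i to a measured value z with standard deviation sigma, and all
# factors are combined in one weighted least-squares problem.

def solve_linear_system(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in reversed(range(n)):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def fuse_factors(factors, n_poses):
    """factors: list of (i, j, z, sigma) meaning x_j - x_i ~= z.
    Index -1 denotes the fixed origin pose x0 = 0. Returns [x1..xn]."""
    H = [[0.0] * n_poses for _ in range(n_poses)]  # J^T W J
    g = [0.0] * n_poses                            # J^T W z
    for i, j, z, sigma in factors:
        w = 1.0 / (sigma * sigma)
        if i >= 0:
            H[i][i] += w
            g[i] -= w * z
        if j >= 0:
            H[j][j] += w
            g[j] += w * z
        if i >= 0 and j >= 0:
            H[i][j] -= w
            H[j][i] -= w
    return solve_linear_system(H, g)

# Wheel odometry (sigma 0.1 m) plus a precise constraint from a second
# modality (sigma 0.02 m), fused in one optimization:
factors = [
    (-1, 0, 1.0, 0.1),   # x1 - x0 = 1.0  (odometry)
    (0, 1, 1.0, 0.1),    # x2 - x1 = 1.0  (odometry)
    (1, 2, 1.0, 0.1),    # x3 - x2 = 1.0  (odometry)
    (-1, 2, 3.3, 0.02),  # x3 - x0 = 3.3  (second sensor)
]
x = fuse_factors(factors, 3)
print(x)  # the low-noise constraint dominates: x3 ends up near 3.3
```

Because every measurement, regardless of sensor, reduces to a weighted factor in the same graph, adding a new modality only requires supplying its residual and noise model; this is the design property the abstract refers to as "standard interfaces" for new sensors and feature types.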

Authors

Alexander J. B. Trevor
College of Computing,
Georgia Tech
atrevor [at] cc.gatech.edu
John G. Rogers III
United States Army Research Lab,
Adelphi, MD
john.g.rogers59.civ [at] mail.mil
Henrik Christensen
College of Computing,
Georgia Tech
hic [at] cc.gatech.edu

Paper


Video

Code/Data

The software framework is available as open source at http://www.omnimapper.org

Acknowledgement

This work was financially supported by the Boeing Corporation and the US Army Research Lab.

Citation

  • [DOI] A. J. B. Trevor, J. G. Rogers, and H. I. Christensen, “OmniMapper: a modular multimodal mapping framework,” in Robotics and Automation (ICRA), 2014 IEEE International Conference on, 2014, pp. 1983-1990.
    [Bibtex]
    @INPROCEEDINGS{trevor14:_omnim,
    author={Trevor, Alexander J.B. and Rogers, John G. and Christensen, Henrik I.},
    booktitle={Robotics and Automation (ICRA), 2014 IEEE International Conference on},
    title={OmniMapper: A modular multimodal mapping framework},
    year={2014},
    month={May},
    pages={1983-1990},
    abstract={Simultaneous Localization and Mapping (SLAM) is not a problem with a one-size-fits-all solution. The literature includes a variety of SLAM approaches targeted at different environments, platforms, sensors, CPU budgets, and applications. We propose OmniMapper, a modular multimodal framework and toolbox for solving SLAM problems. The system can be used to generate pose graphs, do feature-based SLAM, and also includes tools for semantic mapping. Multiple measurement types from different sensors can be combined for multimodal mapping. It is open with standard interfaces to allow easy integration of new sensors and feature types. We present a detailed description of the mapping approach, as well as a software framework that implements this, and present detailed descriptions of its applications to several domains including mapping with a service robot in an indoor environment, large-scale mapping on a PackBot, and mapping with a handheld RGBD camera.},
    keywords={Simultaneous localization and mapping;Three-dimensional displays;Time measurement;Trajectory},
    doi={10.1109/ICRA.2014.6907122},
    url_link={http://dx.doi.org/10.1109/ICRA.2014.6907122},
    }
Posted in ICRA.