Abstract
Direct search for objects as part of navigation poses a challenge for small items. Utilizing context in the form of object-object relationships enables efficient hierarchical search for targets. Most current approaches tend to incorporate sensory input directly into a reward-based learning scheme, without learning about object relationships in the natural environment, and thus generalize poorly across domains. We present Memory-utilized Joint hierarchical Object Learning for Navigation in Indoor Rooms (MJOLNIR), a target-driven navigation algorithm that considers the inherent relationships between a target object and the more salient contextual objects occurring in its surroundings. Extensive experiments conducted across multiple environment settings show 82.9% and 93.5% gains over existing state-of-the-art navigation methods in terms of success rate (SR) and success weighted by path length (SPL), respectively. We also show that our model converges much faster than other algorithms, without suffering from the well-known overfitting problem. Additional details regarding the supplementary material and code are available at https://sites.google.com/eng.ucsd.edu/mjolnir.
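For reference, SR and SPL are the standard evaluation metrics for this task: SR is the fraction of episodes in which the agent reaches the target, and SPL (Anderson et al., 2018) weights each success by the ratio of the shortest-path length to the path the agent actually took, SPL = (1/N) * sum_i S_i * l_i / max(p_i, l_i). The following minimal Python sketch computes both; the function names and the NumPy-based episode representation are our own illustration, not taken from the paper.

    import numpy as np

    def success_rate(successes):
        # Fraction of episodes in which the agent reached the target.
        return float(np.mean(successes))

    def spl(successes, shortest_lengths, actual_lengths):
        # Success weighted by Path Length (Anderson et al., 2018):
        # SPL = (1/N) * sum_i S_i * l_i / max(p_i, l_i), where l_i is the
        # shortest-path length from start to target and p_i is the path
        # length the agent actually traversed in episode i.
        s = np.asarray(successes, dtype=float)
        l = np.asarray(shortest_lengths, dtype=float)
        p = np.asarray(actual_lengths, dtype=float)
        return float(np.mean(s * l / np.maximum(p, l)))

    # Example: three episodes, two successful; the second took a detour.
    print(success_rate([1, 1, 0]))                            # 0.667
    print(spl([1, 1, 0], [4.0, 6.0, 5.0], [4.0, 9.0, 7.0]))   # ~0.556

Note that SPL upper-bounds SR: an agent that succeeds only via near-shortest paths scores close to its SR, while detours pull SPL below it.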
Authors
Yiding Qiu
Contextual Robotics Institute,
UC San Diego
yiqiu@eng.ucsd.edu
Anwesan Pal
Contextual Robotics Institute,
UC San Diego
a2pal@eng.ucsd.edu
Henrik I. Christensen
Contextual Robotics Institute,
UC San Diego
hichristensen@eng.ucsd.edu
Paper
Video
Code/Data
Supplementary material, including code and videos of the different experiments, is available at https://sites.google.com/eng.ucsd.edu/mjolnir.
Acknowledgement
The authors would like to thank the Army Research Laboratory (ARL) Distributed and Collaborative Intelligent Systems and Technology (DCIST) Collaborative Technology Alliance (W911NF-10-2-0016) for supporting this research.
Citation
Copyright
The documents contained in these directories are included by the contributing authors as a means to ensure timely dissemination of scholarly and technical work on a non-commercial basis. Copyright and all rights therein are maintained by the authors or by other copyright holders, notwithstanding that they have offered their works here electronically. It is understood that all persons copying this information will adhere to the terms and constraints invoked by each author's copyright. These works may not be reposted without the explicit permission of the copyright holder.