Multimodal fusion for autonomous navigation via deep reinforcement learning with sparse rewards and hindsight experience replay
Affiliation:1. School of Electronic Engineering, Kumoh National Institute of Technology, Gumi, Gyeongbuk 39177, Republic of Korea; 2. The Division of AI Software Convergence, Dongguk University, Seoul 04620, Republic of Korea
Abstract:The multimodal perception of intelligent robots is essential for achieving collision-free and efficient navigation. Autonomous navigation is enormously challenging when perception relies on only vision or only LiDAR sensor data, because the complementary information provided by different sensors is missing. This paper proposes a simple yet efficient deep reinforcement learning (DRL) framework with sparse rewards and hindsight experience replay (HER) to achieve multimodal navigation. By adopting depth images and pseudo-LiDAR data generated by an RGB-D camera as input, a multimodal fusion scheme enhances perception of the surrounding environment relative to a single sensor. To avoid the misleading guidance that dense rewards can impose on the agent, sparse rewards are used to specify its navigation task. Additionally, the HER technique is introduced to address the resulting sparse-reward navigation problem and accelerate learning of the optimal policy. The results show that the proposed model achieves state-of-the-art performance in terms of success, crash, and timeout rates, as well as generalization capability.
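The abstract's core mechanism pairs a sparse success/failure reward with HER, which relabels stored transitions using goals the agent actually reached so that failed episodes still yield learning signal. The following is a minimal sketch of that idea only, not the authors' implementation: the goal representation (2-D target positions), the success radius, the buffer layout, and the "future" relabeling strategy with k=4 are all illustrative assumptions.

```python
import random
import numpy as np

# Sketch of hindsight experience replay (HER) with a sparse reward,
# assuming goals are 2-D target positions and each transition stores
# (state, action, next_state, achieved position, goal). Names and
# thresholds are illustrative, not taken from the paper.

GOAL_RADIUS = 0.3  # assumed distance at which the navigation goal counts as reached


def sparse_reward(achieved_pos, goal_pos):
    """Return 0 on reaching the goal, -1 otherwise (no dense shaping)."""
    return 0.0 if np.linalg.norm(achieved_pos - goal_pos) < GOAL_RADIUS else -1.0


class HindsightReplayBuffer:
    def __init__(self, capacity=100_000, k_future=4):
        self.capacity = capacity
        self.k_future = k_future  # relabeled goals sampled per real transition
        self.storage = []

    def store_episode(self, episode):
        """episode: list of dicts with keys state, action, next_state, pos, goal."""
        for t, tr in enumerate(episode):
            # Store the original transition with its original goal.
            self._add(tr["state"], tr["action"], tr["next_state"],
                      tr["goal"], sparse_reward(tr["pos"], tr["goal"]))
            # HER ("future" strategy): relabel with positions actually reached
            # later in the episode, turning failed rollouts into successes.
            for _ in range(self.k_future):
                future = episode[random.randint(t, len(episode) - 1)]
                new_goal = future["pos"]
                self._add(tr["state"], tr["action"], tr["next_state"],
                          new_goal, sparse_reward(tr["pos"], new_goal))

    def _add(self, state, action, next_state, goal, reward):
        if len(self.storage) >= self.capacity:
            self.storage.pop(0)
        self.storage.append((state, action, next_state, goal, reward))

    def sample(self, batch_size):
        return random.sample(self.storage, min(batch_size, len(self.storage)))
```

The point of the relabeling step is that crashed or timed-out rollouts become successful examples for the goals the robot did reach, so a plain 0/-1 reward still produces frequent learning signal for the off-policy DRL agent.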
Keywords:Hindsight experience replay  Obstacle avoidance  Deep reinforcement learning  Sparse rewards  Multimodal navigation
This article is indexed in ScienceDirect and other databases.