Visual-attention-based 3D Environment Modeling for Mobile Robots
Cite this article: Guo Binghua, Dai Hongyue, Li Zhonghua. A Visual-attention-based 3D Mapping Method for Mobile Robots [J]. Acta Automatica Sinica, 2017, 43(7): 1248-1256.
Authors: Guo Binghua, Dai Hongyue, Li Zhonghua
Affiliation: 1. Zhaoqing University, Zhaoqing 526061, China
Funding: the Foundation of Guangdong Educational Committee (2014KTSCX191) and the National Natural Science Foundation of China (61201087)
Abstract: Human visual attention is highly selective. Imitating these mechanisms can make a robot's modeling of its surroundings more efficient, intelligent, and robust. This paper proposes a visual-attention-based 3D environment modeling method for mobile robots. The method uses the rate of change of an obstacle distance-potential function as the saliency measure, and combines feature points extracted from the scene by the mobile robot with a fast mean-shift algorithm to detect salient objects in the robot's surroundings. The detected saliency serves as the grid prior model; combined with the sensor model and a projection method, Bayesian estimation is then used to build the grid model of the environment. The resulting model was experimentally validated and its performance evaluated in both indoor and outdoor environments.

Keywords: 3D modeling, grid model, mobile robots, visual attention
Received: 2015-10-19

A Visual-attention-based 3D Mapping Method for Mobile Robots
Binghua Guo, Hongyue Dai, Zhonghua Li. A Visual-attention-based 3D Mapping Method for Mobile Robots [J]. Acta Automatica Sinica, 2017, 43(7): 1248-1256.
Authors: Binghua Guo, Hongyue Dai, Zhonghua Li
Affiliation: 1. Department of Electrical and Information Engineering, Zhaoqing University, Zhaoqing 526061, China; 2. School of Data and Computer Science, Sun Yat-sen University, Guangzhou 510275, China
Abstract: Human visual attention is highly selective. An artificial vision system that imitates this mechanism increases the efficiency, intelligence, and robustness of mobile robots in environment modeling. This paper presents a 3-D modeling method based on visual attention for mobile robots. The method uses the distance-potential gradient as a motion contrast measure and combines visual features extracted from the scene with a mean-shift segmentation algorithm to detect conspicuous objects in the surrounding environment. It takes the saliency of objects as prior information, uses Bayes' theorem to fuse the sensor model with the grid prior model, and uses a projection method to create and update the 3-D environment model. Experimental results and a performance evaluation illustrate the capability of the approach to generate accurate 3-D maps.
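The Bayesian fusion step described in the abstract is, in standard occupancy-grid practice, an independent per-cell update that combines a prior with inverse-sensor-model readings, usually carried out in log-odds form. The sketch below is illustrative only (it is not the paper's code); the use of a per-cell saliency value as the grid prior is an assumption made to mirror the abstract's description, and all function names are hypothetical.

```python
import numpy as np

def logit(p):
    """Convert a probability to log-odds."""
    return np.log(p / (1.0 - p))

def to_prob(log_odds):
    """Convert log-odds back to a probability."""
    return 1.0 / (1.0 + np.exp(-log_odds))

def bayes_update(log_odds, meas_prob):
    """Fuse one inverse-sensor-model reading per cell.

    In log-odds form, the Bayesian update reduces to addition,
    which is why occupancy grids are usually stored this way.
    """
    return log_odds + logit(meas_prob)

# Hypothetical saliency prior: more salient cells start with a
# higher prior probability of being occupied (per the abstract,
# saliency serves as the grid prior model).
saliency_prior = np.array([0.5, 0.6, 0.7])
grid = logit(saliency_prior)

# Two sensor readings, each suggesting occupancy (p = 0.8) in every cell.
for _ in range(2):
    grid = bayes_update(grid, np.full(3, 0.8))

probs = to_prob(grid)  # posterior occupancy probabilities per cell
```

Because the update is additive in log-odds, cells with a higher saliency prior end up with a strictly higher posterior after the same measurements, which is the intended effect of using attention as prior information.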
Keywords:3-D Mapping  grid model  mobile robots  visual attention