Space and time sensor fusion using an active camera for mobile robot navigation
Authors: Tae-Seok Jin, Kwon-Soon Lee, Jang-Myung Lee
Affiliations: (1) Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505, Japan; (2) Department of Electrical Engineering, Dong-A University, Busan, Korea; (3) Department of Electronics Engineering, Pusan National University, Busan, Korea
Abstract: We propose a sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to allow accurate measurements, such as the distance to an obstacle or the location of the service robot itself. In conventional fusion schemes, measurements depend only on the current data sets. As a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of a measurement. In this approach, however, instead of adding more sensors to the system, the temporal sequences of the data sets are stored and utilized to improve the measurements. The theoretical basis is illustrated by examples, and its effectiveness is demonstrated through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both unstructured and structured environments.

This work was presented in part at the 8th International Symposium on Artificial Life and Robotics, Oita, Japan, January 24–26, 2003.
Keywords: Multisensor data fusion; Image processing; Localization; Navigation; Mobile robot
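The abstract's core idea, re-expressing a measurement taken at an earlier instant in the robot's current coordinate frame and fusing it with the present measurement instead of adding sensors, can be illustrated with a minimal sketch. The planar motion model, the function names, and the inverse-variance weighting below are illustrative assumptions, not the paper's exact STSF formulation.

```python
import numpy as np

def transform_to_current_frame(prev_point, dx, dy, dtheta):
    """Re-express a 2D point observed at time t-1 in the robot frame at time t,
    given the planar motion (dx, dy, dtheta) the robot made between the two instants."""
    c, s = np.cos(dtheta), np.sin(dtheta)
    shifted = np.asarray(prev_point, dtype=float) - np.array([dx, dy])
    # Rotate by -dtheta to compensate for the robot's change of heading.
    return np.array([c * shifted[0] + s * shifted[1],
                     -s * shifted[0] + c * shifted[1]])

def fuse(z_prev, var_prev, z_curr, var_curr):
    """Inverse-variance weighting of two estimates of the same quantity;
    the fused variance is smaller than either input variance."""
    w_prev, w_curr = 1.0 / var_prev, 1.0 / var_curr
    z_fused = (w_prev * z_prev + w_curr * z_curr) / (w_prev + w_curr)
    return z_fused, 1.0 / (w_prev + w_curr)

if __name__ == "__main__":
    # Obstacle seen at t-1 at (2.0, 0.5) m in the old robot frame; the robot then
    # moved 0.3 m forward and turned 5 degrees before the next measurement.
    prev_in_current = transform_to_current_frame([2.0, 0.5], dx=0.3, dy=0.0,
                                                 dtheta=np.deg2rad(5.0))
    curr_obs = np.array([1.73, 0.36])                         # noisy current observation
    fused, var = fuse(prev_in_current, 0.04, curr_obs, 0.02)  # variances are illustrative
    print("fused obstacle position:", fused, "fused variance:", var)
```

In the paper's setting, the transformed past data would come from the active camera together with the robot's odometry; here a single scalar variance per observation stands in for the full measurement model.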
|