Design and implementation of an omnidirectional vision system for robot perception
Affiliation: 1. Intelligent Robotics Institute, School of Mechatronical Engineering, Beijing Institute of Technology, Beijing 100081, China; 2. Shenzhen Institute of Geriatrics, Shenzhen Second People’s Hospital, Shenzhen 518035, China; 3. Fujian Provincial Key Laboratory of Information Processing and Intelligent Control (Minjiang University), Fuzhou 350121, China
Abstract: To meet the demand for surround detection by a humanoid robot, we developed an omnidirectional vision system for robot perception (OVROP) with five degrees of freedom (DOFs). OVROP has a modular design and consists of three main parts: hardware, control architecture, and visual processing (omnidirectional vision and stereovision). Because OVROP is equipped with universal hardware and software interfaces, it can be applied to various types of robots. Our performance evaluation shows that OVROP can accurately detect and track an object over a 360° field of view (FOV). In addition, undistorted omnidirectional perception of the surroundings is achieved through calibration of both the monocular and stereo cameras. Furthermore, preliminary experimental results show that OVROP can perceive a desired object within 160 ms in most cases. OVROP can therefore provide detailed information about the surrounding environment for full-scope, real-time robot perception.
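The abstract attributes the undistorted omnidirectional view to calibration of the monocular and stereo cameras, but the paper's calibration code is not reproduced here. The sketch below is a minimal, assumed illustration of the monocular step using OpenCV's standard chessboard pipeline; the board dimensions, square size, and file paths are hypothetical, and the authors' actual method may differ (e.g., an omnidirectional camera model rather than the pinhole model used here).

```python
# Minimal sketch (not the authors' code): standard OpenCV chessboard
# calibration and undistortion, of the kind OVROP's monocular calibration
# step could use. Board size, square size, and paths are assumptions.
import glob

import cv2
import numpy as np

BOARD = (9, 6)        # inner-corner grid of the assumed chessboard
SQUARE_SIZE = 0.025   # assumed square edge length in metres

# 3-D reference points of one board view, lying in the z = 0 plane.
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE_SIZE

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib/*.png"):   # assumed folder of calibration shots
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]       # (width, height)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
        obj_points.append(objp)
        img_points.append(corners)

# Estimate the intrinsic matrix K and distortion coefficients, then
# undistort a live frame with them.
rms, K, dist, _, _ = cv2.calibrateCamera(
    obj_points, img_points, image_size, None, None)
frame = cv2.imread("frame.png")         # assumed test frame
undistorted = cv2.undistort(frame, K, dist)
print(f"RMS reprojection error: {rms:.3f} px")
```

For the stereo pair, the same corner detections would feed cv2.stereoCalibrate and cv2.stereoRectify; that step is omitted here for brevity.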
| |
Keywords: