Outdoor autonomous navigation using SURF features
Authors: Masayoshi Tabuse, Toshiki Kitaoka, Dai Nakai
Affiliation: 1. Graduate School of Life and Environmental Science, Kyoto Prefectural University, Kyoto, Japan; 2. X-TRANS, Osaka, Japan; 3. Kyoto Prefectural Subaru High School, Kyoto, Japan
Abstract: In this article, we propose a speeded-up robust features (SURF)-based approach for outdoor autonomous navigation. In this approach, we capture environmental images using an omni-directional camera and extract features from these images using SURF. We treat these features as landmarks to estimate the robot's self-location and direction of motion. SURF features are invariant under scale changes and rotation, and are robust against image noise, changes in lighting conditions, and changes of viewpoint. SURF features are therefore well suited to the self-location estimation and navigation of a robot. The mobile robot navigation method consists of two modes: a teaching mode and a navigation mode. In the teaching mode, we teach the robot a navigation course. In the navigation mode, the mobile robot autonomously navigates along the taught course. In our experiment, the outdoor teaching course was about 150 m long, the average speed was 2.9 km/h, and the maximum trajectory error was 3.3 m. The processing time of SURF was several times shorter than that of the scale-invariant feature transform (SIFT), so the navigation speed of the mobile robot was similar to the walking speed of a person.
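The abstract does not give implementation details for how features from the current camera image are matched against the taught landmarks. A common approach with SURF/SIFT-style descriptors is nearest-neighbour matching with Lowe's ratio test, sketched below in plain Python. The function names, the toy 4-D descriptors (real SURF descriptors are 64-D), and the ratio threshold of 0.8 are illustrative assumptions, not taken from the article.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_descriptors(query, taught, ratio=0.8):
    """Match each query descriptor to the taught-image descriptors
    using nearest-neighbour search with Lowe's ratio test.

    Returns a list of (query_index, taught_index) pairs.
    """
    matches = []
    for qi, q in enumerate(query):
        # Distances from this query descriptor to every taught
        # descriptor, sorted smallest first.
        dists = sorted((euclidean(q, t), ti) for ti, t in enumerate(taught))
        if len(dists) >= 2:
            best, second = dists[0], dists[1]
            # Accept only if the best match is clearly better
            # than the runner-up (reduces ambiguous matches).
            if best[0] < ratio * second[0]:
                matches.append((qi, best[1]))
        elif dists:
            matches.append((qi, dists[0][1]))
    return matches

# Toy example: two query descriptors, three taught landmarks.
taught = [[0.0, 0.0, 1.0, 0.0], [1.0, 1.0, 0.0, 0.0], [0.5, 0.5, 0.5, 0.5]]
query = [[0.05, 0.0, 0.95, 0.0], [0.9, 1.1, 0.0, 0.1]]
print(match_descriptors(query, taught))  # → [(0, 0), (1, 1)]
```

In a navigation system like the one described, the accepted matches would then vote on the robot's self-location and heading relative to the taught course.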
This article is indexed in SpringerLink and other databases.
|