Real-time and robust hand tracking with a single depth camera
Authors: Ziyang Ma, Enhua Wu
Affiliation: 1. State Key Laboratory of Computer Science, Institute of Software, Chinese Academy of Sciences, Beijing, China; 2. University of Chinese Academy of Sciences, Beijing, China; 3. University of Macau, Macau, China
Abstract: In this paper, we introduce a novel, real-time and robust hand tracking system capable of tracking articulated hand motion in full degrees of freedom (DOF) using a single depth camera. Unlike most previous systems, our system initializes and recovers from tracking loss automatically. This is achieved through an efficient two-stage k-nearest neighbor database search method proposed in the paper, which searches a pre-rendered database of small hand depth images designed to provide good initial guesses for model-based tracking. We also propose a robust objective function and improve the Particle Swarm Optimization algorithm with a resampling-based strategy for model-based tracking, which explores the full-DOF hand motion space more efficiently than previous methods and yields continuous solutions. Our system runs at 40 fps on a GeForce GTX 580 GPU, and experimental results show that it outperforms state-of-the-art model-based hand tracking systems in both speed and accuracy. This work is of significance to various applications in human–computer interaction and virtual reality.
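
Below is a minimal, hedged sketch of the kind of resampling-enhanced Particle Swarm Optimization described in the abstract, written in Python with NumPy. It is not the authors' implementation: the DOF count, swarm size, resampling interval, and the placeholder `objective` function are illustrative assumptions; the real system evaluates a depth-discrepancy objective against a rendered hand model on the GPU and can seed the swarm with initial guesses from the k-nearest neighbor database search.

```python
import numpy as np

DOF = 26             # assumed hand DOF count (illustrative)
N_PARTICLES = 64
N_GENERATIONS = 30
RESAMPLE_EVERY = 10  # assumed resampling interval (illustrative)


def objective(pose):
    """Placeholder discrepancy score; the real system compares a rendered
    hand model against the observed depth image."""
    return float(np.sum(pose ** 2))  # stand-in with a minimum at the origin


def pso_with_resampling(lo, hi, init_guesses=None, rng=None):
    """PSO over box bounds [lo, hi] with a periodic resampling step."""
    if rng is None:
        rng = np.random.default_rng(0)

    # Initialize particles; optionally seed some from database k-NN guesses.
    x = rng.uniform(lo, hi, size=(N_PARTICLES, DOF))
    if init_guesses is not None:
        x[: len(init_guesses)] = init_guesses
    v = np.zeros_like(x)

    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[np.argmin(pbest_f)].copy()

    w, c1, c2 = 0.72, 1.5, 1.5  # standard PSO coefficients
    for gen in range(1, N_GENERATIONS + 1):
        r1 = rng.random((N_PARTICLES, DOF))
        r2 = rng.random((N_PARTICLES, DOF))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)

        f = np.array([objective(p) for p in x])
        improved = f < pbest_f
        pbest[improved] = x[improved]
        pbest_f[improved] = f[improved]
        gbest = pbest[np.argmin(pbest_f)].copy()

        # Resampling: redraw the worse half of the swarm around the better
        # personal bests so the search does not collapse prematurely.
        if gen % RESAMPLE_EVERY == 0:
            order = np.argsort(pbest_f)
            best, worst = order[: N_PARTICLES // 2], order[N_PARTICLES // 2:]
            noise = rng.normal(scale=0.05 * (hi - lo), size=(len(worst), DOF))
            x[worst] = np.clip(pbest[best] + noise, lo, hi)
            v[worst] = 0.0

    return gbest


if __name__ == "__main__":
    lo, hi = -np.ones(DOF), np.ones(DOF)
    print(pso_with_resampling(lo, hi))
```

The resampling step here simply redraws the worse half of the swarm around the better personal bests, which is one common way to keep a swarm from stagnating in a high-dimensional pose space; the paper's specific resampling strategy may differ.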
Keywords:
|