EigenTracking: Robust Matching and Tracking of Articulated Objects Using a View-Based Representation
Authors: Michael J. Black, Allan D. Jepson
Affiliations: (1) Xerox Palo Alto Research Center, 3333 Coyote Hill Road, Palo Alto, CA 94304, USA; (2) Department of Computer Science, University of Toronto, Toronto, Ontario M5S 3H5, Canada
Abstract: This paper describes an approach for tracking rigid and articulated objects using a view-based representation. The approach builds on and extends work on eigenspace representations, robust estimation techniques, and parameterized optical flow estimation. First, we note that the least-squares image reconstruction of standard eigenspace techniques has a number of problems, and we reformulate the reconstruction problem as one of robust estimation. Second, we define a “subspace constancy assumption” that allows us to exploit techniques for parameterized optical flow estimation to simultaneously solve for the view of an object and the affine transformation between the eigenspace and the image. To account for large affine transformations between the eigenspace and the image, we define a multi-scale eigenspace representation and a coarse-to-fine matching strategy. Finally, we use these techniques to track objects over long image sequences in which the objects simultaneously undergo both affine image motions and changes of view. In particular, we use this “EigenTracking” technique to track and recognize the gestures of a moving hand.
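To make the first idea concrete, the following Python sketch illustrates robust subspace reconstruction as opposed to plain least squares: the coefficients c that reconstruct an image y from an eigenbasis U are estimated while down-weighting pixels the subspace cannot explain (e.g. occlusions). This is a minimal illustration, not the authors' implementation; the Geman-McClure error norm, the IRLS solver, the function name robust_eigenspace_coeffs, and all parameter values are assumptions made for the example.

import numpy as np

def robust_eigenspace_coeffs(y, U, sigma=0.5, n_iters=20):
    # Estimate coefficients c so that U @ c approximates y, down-weighting
    # outlier pixels with a Geman-McClure robust norm and solving by
    # iteratively reweighted least squares (IRLS).
    c, *_ = np.linalg.lstsq(U, y, rcond=None)    # least-squares initialization
    for _ in range(n_iters):
        r = y - U @ c                            # per-pixel residuals
        # IRLS weight w(r) = psi(r)/r for rho(r) = r^2 / (r^2 + sigma^2)
        w = 2.0 * sigma**2 / (r**2 + sigma**2)**2
        Uw = U * w[:, None]                      # row-weighted basis
        c = np.linalg.solve(U.T @ Uw, Uw.T @ y)  # weighted normal equations
    return c, w                                  # coefficients and per-pixel weights

# Toy usage: U would come from a PCA/SVD of training views; the simulated
# occluded pixels (first 40 entries) receive low weight and barely affect c.
rng = np.random.default_rng(0)
U, _ = np.linalg.qr(rng.standard_normal((400, 10)))
y = U @ rng.standard_normal(10)
y[:40] += 5.0
c, w = robust_eigenspace_coeffs(y, U)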
Keywords: eigenspace methods; robust estimation; view-based representations; gesture recognition; parametric models of optical flow; tracking; object recognition; motion analysis
This article is indexed in SpringerLink and other databases.