Accurate object contour tracking based on boundary edge selection
Authors: Myung-Cheol Roh
Affiliation: a Department of Computer Science and Engineering, Korea University, Anam-dong, Seongbuk-ku, Seoul 136-713, Korea
b Department of Computer Engineering, Hongik University, Sangsu-dong, Mapo-ku, Seoul, Korea
Abstract: In this paper, a novel method is proposed for accurate subject tracking in a video stream with a changing background and a moving camera, achieved by selecting only the boundary edges of the tracked subject. Boundary edge selection proceeds in two steps: (1) background edges are removed using edge motion, and (2) from the remaining edges, boundary edges are selected using the image derivative along the normal direction of the tracked contour. Accurate tracking follows from selecting only boundary edge pixels, which reduces the influence of irrelevant edges. To remove background edges, the motion of the tracked subject and of each edge is computed, and edges whose motion direction differs from the subject's are discarded. To select boundary edges along the contour normal direction, the image gradient is computed at every edge pixel, and pixels with large gradient values are retained. Multi-level Canny edge maps are used to obtain an appropriate level of scene detail: because the detail level of the edge map can be adjusted, tracking remains possible even when the tracked object boundary contains complex edges. A final routing step refines the contour: the computed contour is checked against a strong Canny edge map, and strong Canny edge pixels near the contour are incorporated using Dijkstra's minimum-cost routing. The experimental results demonstrate that the proposed approach is robust to complex-textured scenes in a mobile-camera environment.
Keywords: Object contour tracking; Boundary edge selection; Optical flow; Contour normal direction; Multi-level edge map
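The boundary-edge selection step described in the abstract — keeping only edge pixels with a large image derivative along the contour normal — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function name `select_boundary_edges`, the threshold parameter, and the central-difference gradient are assumptions for the sketch.

```python
import numpy as np

def select_boundary_edges(image, contour, grad_threshold=0.5):
    """Hypothetical helper: keep contour points whose image derivative
    along the contour normal direction exceeds a threshold."""
    # Image gradients via central differences (np.gradient returns
    # derivatives per axis: rows first, then columns).
    gy, gx = np.gradient(image.astype(float))
    selected = []
    n = len(contour)
    for i, (y, x) in enumerate(contour):
        # Tangent estimated from neighbouring contour points (closed
        # contour assumed); the normal is the tangent rotated 90 degrees.
        y_prev, x_prev = contour[(i - 1) % n]
        y_next, x_next = contour[(i + 1) % n]
        ty, tx = y_next - y_prev, x_next - x_prev
        norm = np.hypot(ty, tx) or 1.0
        ny, nx = -tx / norm, ty / norm  # unit normal
        # Directional derivative of the image along the normal.
        deriv = abs(gy[y, x] * ny + gx[y, x] * nx)
        if deriv >= grad_threshold:
            selected.append((y, x))
    return selected
```

On a synthetic step edge, contour points lying on the intensity discontinuity survive the threshold while points over flat regions are rejected, which is the effect the paper relies on to suppress irrelevant edges.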
This article is indexed in ScienceDirect and other databases.