The ChaLearn gesture dataset (CGD 2011)
Authors: Isabelle Guyon, Vassilis Athitsos, Pat Jangyodsuk, Hugo Jair Escalante
Affiliation:1. ChaLearn, 955 Creston Road, Berkeley, CA, 94708-1501, USA
2. University of Texas at Arlington, Arlington, TX, USA
3. INAOE, Puebla, Mexico
Abstract: This paper describes the data used in the ChaLearn gesture challenges that took place in 2011/2012, whose results were discussed at the CVPR 2012 and ICPR 2012 conferences. The task can be described as: user-dependent, small vocabulary, fixed camera, one-shot learning. The data include 54,000 hand and arm gestures recorded with an RGB-D Kinect™ camera. The data are organized into batches of 100 gestures pertaining to a small gesture vocabulary of 8–12 gestures, recorded by the same user. Short continuous sequences of 1–5 randomly selected gestures are recorded. We provide manual annotations (temporal segmentation into individual gestures, alignment of RGB and depth images, and body part location) and a library of functions to preprocess and automatically annotate the data. We also provide a subset of batches in which the user's horizontal position is randomly shifted or scaled. We report on the results of the challenge and distribute sample code to facilitate developing new solutions. The data, data-collection software, and the gesture vocabularies are downloadable from http://gesture.chalearn.org. We have set up a forum for researchers working on these data at http://groups.google.com/group/gesturechallenge.
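To make the batch organization concrete, here is a minimal Python sketch of the data layout the abstract describes: each batch holds gestures from one user over a vocabulary of 8–12 classes, with one labeled training example per class (the one-shot setting) and unsegmented test sequences of 1–5 gestures, about 100 gestures in total per batch. All names (GestureBatch, make_batch, the train_video_* placeholders) are hypothetical illustrations, not the dataset's actual file or API names; the real data are distributed as video files.

```python
import random
from dataclasses import dataclass, field


@dataclass
class GestureBatch:
    """One CGD 2011-style batch: ~100 gestures from a single user,
    drawn from a small vocabulary of 8-12 gesture classes."""
    vocabulary: list              # gesture class labels, e.g. 1..V with 8 <= V <= 12
    train: dict = field(default_factory=dict)  # one example per class (one-shot)
    test: list = field(default_factory=list)   # sequences of 1-5 unsegmented gestures


def make_batch(rng: random.Random, vocab_size: int) -> GestureBatch:
    """Build a synthetic batch mirroring the layout described in the abstract."""
    vocab = list(range(1, vocab_size + 1))
    batch = GestureBatch(vocabulary=vocab)
    # One-shot training set: exactly one recorded example per gesture class.
    batch.train = {g: f"train_video_{g}" for g in vocab}  # hypothetical placeholder IDs
    # Test set: short continuous sequences of 1-5 randomly selected gestures,
    # accumulated until the batch reaches roughly 100 gestures in total.
    total = len(vocab)
    while total < 100:
        seq = [rng.choice(vocab) for _ in range(rng.randint(1, 5))]
        batch.test.append(seq)
        total += len(seq)
    return batch


if __name__ == "__main__":
    rng = random.Random(0)
    batch = make_batch(rng, vocab_size=rng.randint(8, 12))
    n_gestures = len(batch.train) + sum(len(s) for s in batch.test)
    print(f"{len(batch.vocabulary)} classes, {n_gestures} gestures in this batch")
```

In the actual challenge, a recognizer sees only the one-shot training examples and must temporally segment and label each test sequence; the sketch above only reproduces the batch structure, not the video content.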
This article is indexed by SpringerLink and other databases.