A motion capture‐based control‐space approach for walking mannequins
Authors: Julien Pettré, Jean-Paul Laumond
Abstract: Virtual mannequins need to navigate in order to interact with their environment. Their autonomy in accomplishing navigation tasks is ensured by locomotion controllers. Control inputs can be user-defined or computed automatically to achieve high-level operations (e.g. obstacle avoidance). This paper presents a locomotion controller based on a motion-capture editing technique. The controller's inputs are the instantaneous linear and angular velocities of the walk. Our solution works in real time and supports continuous changes of its inputs at any time. The controller combines three main components to synthesize locomotion animations in a four-stage process. First, the Motion Library stores motion-capture samples; each sample is analysed to compute its quantitative characteristics. Second, these characteristics are represented in a linear control space. This geometric representation is well suited to selecting and weighting three motion samples with respect to the input state. Third, locomotion cycles are synthesized by blending the selected motion samples; blending is performed in the frequency domain. Lastly, successive postures are extracted from the synthesized cycles to complete the animation of the moving mannequin. The method is demonstrated in this paper in a locomotion-planning context. Copyright © 2006 John Wiley & Sons, Ltd.
Keywords: digital mannequins; locomotion control; motion capture; motion blending; motion planning
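
The four-stage pipeline in the abstract lends itself to a compact illustration. The sketch below is our own reading of the idea, not the authors' implementation: each motion-capture sample is treated as a point in the 2D control space of (linear velocity, angular velocity), the space is triangulated, the input state is located inside one triangle, the triangle's barycentric coordinates become the weights of the three selected samples, and the selected cycles are blended through their Fourier coefficients. The motion data, joint count, cycle length, and the `n_coeffs` truncation are all hypothetical placeholders.

```python
# A minimal sketch of the control-space approach described in the
# abstract; all motion data below is synthetic.
import numpy as np
from scipy.spatial import Delaunay

N_FRAMES, N_JOINTS = 64, 20  # hypothetical cycle length and joint count
rng = np.random.default_rng(0)

# Hypothetical Motion Library: each entry pairs a measured control state
# (linear velocity v, angular velocity omega) with a locomotion cycle of
# joint angles, shape (frames, joints).
library = [((v, w), rng.standard_normal((N_FRAMES, N_JOINTS)))
           for v, w in [(0.0, 0.0), (1.5, 0.0), (0.7, 0.8),
                        (0.7, -0.8), (1.2, 0.4)]]

# Linear control space: triangulate the sample states so that any
# reachable input (v, omega) falls inside exactly one triangle of
# three samples.
states = np.array([s for s, _ in library])
tri = Delaunay(states)

def select_and_weight(v, omega):
    """Select the three samples whose control-space triangle contains
    the input and return their barycentric coordinates as weights."""
    p = np.array([v, omega])
    s = int(tri.find_simplex(p))
    if s < 0:
        raise ValueError("input state outside the control space")
    T, r = tri.transform[s, :2], tri.transform[s, 2]
    b = T @ (p - r)                       # first two barycentric coords
    return tri.simplices[s], np.append(b, 1.0 - b.sum())

def blend_cycles(indices, weights, n_coeffs=8):
    """Blend the selected cycles in the frequency domain: keep the first
    few Fourier coefficients of each cycle, combine them with the
    barycentric weights, and resynthesize one locomotion cycle."""
    spectra = [np.fft.rfft(library[i][1], axis=0) for i in indices]
    blended = sum(w * S[:n_coeffs] for w, S in zip(weights, spectra))
    return np.fft.irfft(blended, n=N_FRAMES, axis=0)

# Controller inputs: instantaneous linear and angular velocities.
idx, w = select_and_weight(v=0.9, omega=0.2)
cycle = blend_cycles(idx, w)              # one blended locomotion cycle
print(idx, np.round(w, 3), cycle.shape)   # postures extracted per frame
```

Since a weighted sum of Fourier coefficients is linear, this toy blend coincides with a time-domain blend once the cycles share a common length and phase; the practical appeal of the frequency domain in this setting is that captured cycles of differing durations and phases can be normalized and combined coefficient by coefficient, with truncation acting as a low-pass smoothing step.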