Example‐Based Retargeting of Human Motion to Arbitrary Mesh Models
Authors: Ufuk Celikcan, Ilker O. Yaz, Tolga Capin
Affiliation: 1. Department of Computer Engineering, Hacettepe University, Ankara, Turkey; 2. Microsoft, Redmond, WA, USA; 3. Department of Computer Engineering, TED University, Ankara, Turkey. The authors developed this work while at the Department of Computer Engineering, Bilkent University, Turkey.
Abstract: We present a novel method for retargeting human motion to arbitrary 3D mesh models with as little user interaction as possible. Traditional motion‐retargeting systems try to preserve the original motion while satisfying several motion constraints. Our method uses a few pose‐to‐pose examples provided by the user to extract the desired semantics behind the retargeting process, without limiting the transfer to a purely literal one. Thus, mesh models with structures and/or motion semantics different from those of humanoid skeletons become possible targets. Considering also that most publicly available mesh models lack additional structure (e.g. a skeleton), our method dispenses with the need for such a structure by means of a built‐in surface‐based deformation system. As deformation for animation purposes may require non‐rigid behaviour, we augment existing rigid deformation approaches to provide volume‐preserving and squash‐and‐stretch deformations. We demonstrate our approach on well‐known mesh models along with several publicly available motion‐capture sequences.
Keywords: animation systems; deformations; motion capture; retargeting; I.3.7 [Computer Graphics]: Three‐Dimensional Graphics and Realism — Animation