Presence and interaction in mixed reality environments |
| |
Authors: | Arjan Egges, George Papagiannakis, Nadia Magnenat-Thalmann |
| |
Affiliation: | (1) Center for Advanced Gaming and Simulation, Department of Information and Computing Sciences, Utrecht University, PO Box 80.089, 3508 TB Utrecht, The Netherlands; (2) MIRALab, University of Geneva, Geneva, Switzerland |
| |
Abstract: | In this paper, we present a simple and robust mixed reality (MR) framework that allows for real-time interaction with virtual
humans in mixed reality environments under consistent illumination. We examine three crucial parts of this system: interaction,
animation and global illumination of virtual humans for an integrated and enhanced presence. The interaction system comprises
a dialogue module, which is interfaced with a speech recognition and synthesis system. In addition to speech output, the dialogue
system generates face and body motions, which are in turn managed by the virtual human animation layer. Our fast animation
engine can handle various types of motion, such as normal key-frame animations, or motions that are generated on the fly
by adapting previously recorded clips. Real-time idle motions are an example of the latter category. All these different motions
are generated and blended online, resulting in flexible and realistic animation. Our robust rendering method operates in
accordance with the animation layer and is based on a precomputed radiance transfer (PRT) illumination model extended for
virtual humans, resulting in a realistic rendition of such interactive virtual characters in mixed reality environments.
Finally, we present a scenario that illustrates the interplay and application of our methods, integrated under a single framework
for presence and interaction in MR. |
| |
Keywords: | Presence; Interaction; Animation; Real-time rendering; Mixed reality |
This article is indexed in SpringerLink and other databases.
|