GPU-based rendering for deformable translucent objects
Authors: Yi Gong (Email author: ygong@cad.zju.edu.cn), Wei Chen, Long Zhang, Yun Zeng, Qunsheng Peng
Affiliation: (1) State Key Lab of CAD&CG, Zhejiang University, Hangzhou 310027, China
Abstract: In this paper we introduce an approximate image-space approach for real-time rendering of deformable translucent models that flattens the geometry and lighting information of objects into textures and computes multiple scattering in texture space. We decompose the process into two stages, called gathering and scattering, corresponding to the computation of incident and exitant irradiance respectively. For the gathering of incident irradiance, we derive a simplified illumination model that is amenable to deformable models using two auxiliary textures. In the scattering stage, we adopt two modes for efficient evaluation of the view-dependent scattering. Our approach is implemented by fully exploiting the capabilities of graphics processing units (GPUs). It achieves visually plausible results at real-time frame rates for deformable models on commodity desktop PCs.
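The gathering stage described above accumulates, for each surface point, the incident irradiance from all other points weighted by a sub-surface diffusion kernel. The sketch below is not the paper's GPU implementation; it is a minimal CPU illustration of the underlying gather computation using the classic dipole diffusion reflectance (Jensen et al. 2001), with illustrative scattering coefficients and an internal-reflection term simplified to A = 1. The function names and default parameters are assumptions for this example only.

```python
import numpy as np

def dipole_Rd(r, sigma_a=0.0021, sigma_s_prime=2.19):
    # Dipole diffusion reflectance R_d(r): the fraction of light entering
    # at distance r that exits at the shading point. Coefficients are
    # illustrative (not taken from the paper); A = 1 for simplicity.
    sigma_t_prime = sigma_a + sigma_s_prime          # reduced extinction
    alpha_prime = sigma_s_prime / sigma_t_prime      # reduced albedo
    sigma_tr = np.sqrt(3.0 * sigma_a * sigma_t_prime)  # effective transport
    z_r = 1.0 / sigma_t_prime                        # real source depth
    z_v = z_r * (1.0 + 4.0 / 3.0)                    # virtual source, A = 1
    d_r = np.sqrt(r * r + z_r * z_r)
    d_v = np.sqrt(r * r + z_v * z_v)
    return (alpha_prime / (4.0 * np.pi)) * (
        z_r * (sigma_tr * d_r + 1.0) * np.exp(-sigma_tr * d_r) / d_r**3
        + z_v * (sigma_tr * d_v + 1.0) * np.exp(-sigma_tr * d_v) / d_v**3
    )

def gather(positions, irradiance, areas):
    # "Gathering": for each texel i, sum the incident irradiance of every
    # texel j weighted by the diffusion kernel and the texel's area,
    # yielding the sub-surface contribution before the scattering stage.
    n = positions.shape[0]
    out = np.zeros(n)
    for i in range(n):
        r = np.linalg.norm(positions - positions[i], axis=1)
        out[i] = np.sum(dipole_Rd(r) * irradiance * areas)
    return out
```

In the paper this sum runs over texels of the flattened geometry/lighting textures on the GPU, which is what makes it feasible per frame for deforming models; the brute-force loop here is only meant to make the gather structure explicit.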
Keywords: Sub-surface scattering; BSSRDF; Translucency; Real-time rendering
This article is indexed in SpringerLink and other databases.