Spatially varying image based lighting using HDR-video
Authors: Jonas Unger, Joel Kronander, Per Larsson, Stefan Gustavson, Joakim Löw, Anders Ynnerman
Affiliation: Media and Information Technology, Linköping University, Sweden
Abstract: Illumination is one of the key components in the creation of realistic renderings of scenes containing virtual objects. In this paper, we present a set of novel algorithms and data structures for visualization, processing and rendering with real world lighting conditions captured using High Dynamic Range (HDR) video. The presented algorithms enable rapid construction of general and editable representations of the lighting environment, as well as extraction and fitting of sampled reflectance to parametric BRDF models. For efficient representation and rendering of the sampled lighting environment function, we consider an adaptive (2D/4D) data structure for storage of light field data on proxy geometry describing the scene. To demonstrate the usefulness of the algorithms, they are presented in the context of a fully integrated framework for spatially varying image based lighting. We show reconstructions of example scenes and resulting production quality renderings of virtual furniture with spatially varying real world illumination including occlusions.
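The abstract describes rendering virtual objects under captured real-world illumination. As a minimal illustration of the underlying image-based-lighting principle (not the paper's spatially varying light-field pipeline), the sketch below estimates diffuse irradiance at a surface from a single latitude-longitude HDR environment map using cosine-weighted Monte Carlo sampling. The lat-long layout, +Y-up convention, and all function names here are illustrative assumptions.

```python
import numpy as np

def sample_env(env, d):
    # Nearest-neighbour lookup of radiance for unit direction d in a
    # latitude-longitude (equirectangular) HDR map, +Y up (assumed layout).
    h, w, _ = env.shape
    theta = np.arccos(np.clip(d[1], -1.0, 1.0))      # polar angle from +Y
    phi = np.arctan2(d[2], d[0])                      # azimuth
    u = int(((phi / (2.0 * np.pi)) % 1.0) * (w - 1))
    v = int((theta / np.pi) * (h - 1))
    return env[v, u]

def orthonormal_basis(n):
    # Build tangent/bitangent around a unit normal n.
    a = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t = np.cross(a, n)
    t /= np.linalg.norm(t)
    return t, np.cross(n, t)

def diffuse_irradiance(env, n, n_samples=4096, seed=0):
    # Monte Carlo estimate of E = integral of L(omega) * cos(theta) d(omega)
    # over the hemisphere around n.  With a cosine-weighted pdf
    # (cos(theta) / pi) the estimator simplifies to pi * mean(L).
    rng = np.random.default_rng(seed)
    t, b = orthonormal_basis(n)
    acc = np.zeros(3)
    for _ in range(n_samples):
        u1, u2 = rng.random(), rng.random()
        r, ph = np.sqrt(u1), 2.0 * np.pi * u2
        local = np.array([r * np.cos(ph),
                          np.sqrt(max(0.0, 1.0 - u1)),
                          r * np.sin(ph)])
        d = local[0] * t + local[1] * n + local[2] * b
        acc += sample_env(env, d)
    return np.pi * acc / n_samples
```

For a constant environment of unit radiance, every sample returns 1 and the estimate reduces to exactly pi per channel, which matches the analytic irradiance of a uniform hemisphere.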
Keywords: High dynamic range video; Image based lighting; Scene capture and processing; Photo realistic rendering