Similar Documents
20 similar documents found (search time: 31 ms)
1.
We propose an analysis of numerical integration based on sampling theory, whereby the integration error caused by aliasing is suppressed by pre‐filtering. We derive a pre‐filter for evaluating the illumination integral, yielding filtered importance sampling, a simple GPU‐based rendering algorithm for image‐based lighting. Furthermore, we extend the algorithm with real‐time visibility computation. Free from any pre‐computation, the algorithm supports fully dynamic scenes and, above all, is simple to implement.
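To make the idea concrete, here is a minimal CPU sketch of filtered importance sampling for a diffuse surface, assuming a hypothetical `env_mip_lookup(direction, lod)` callback that returns RGB radiance from a pre‐filtered (mip‐mapped) environment map; the mip level is chosen from the solid angle each sample covers relative to a texel, which is the pre‐filtering step that suppresses aliasing. A glossy BRDF would instead importance‐sample its own lobe with a matching pre‐filter.

```python
import numpy as np

def sample_cosine_hemisphere(u1, u2):
    """Cosine-weighted direction about +Z; pdf = cos(theta) / pi."""
    r, phi = np.sqrt(u1), 2.0 * np.pi * u2
    x, y = r * np.cos(phi), r * np.sin(phi)
    z = np.sqrt(max(0.0, 1.0 - u1))
    return np.array([x, y, z]), z / np.pi

def filtered_importance_sample(env_mip_lookup, n_samples=64, env_resolution=256):
    """Estimate diffuse image-based lighting with pre-filtered samples.

    env_mip_lookup(direction, lod) is assumed to return RGB radiance from a
    pre-filtered (mip-mapped) cube environment map -- a hypothetical callback.
    The caller multiplies the result by the surface albedo.
    """
    texel_solid_angle = 4.0 * np.pi / (6.0 * env_resolution * env_resolution)
    total = np.zeros(3)
    for _ in range(n_samples):
        u1, u2 = np.random.rand(), np.random.rand()
        wi, pdf = sample_cosine_hemisphere(u1, u2)
        # Solid angle associated with this sample; wider for low-pdf samples.
        sample_solid_angle = 1.0 / (n_samples * pdf)
        # Pick a coarser mip level when the sample covers many texels.
        lod = max(0.0, 0.5 * np.log2(sample_solid_angle / texel_solid_angle))
        # For the Lambertian BRDF (albedo/pi) with a cosine pdf (cos/pi),
        # f * cos / pdf reduces to the albedo, so we just average radiance.
        total += env_mip_lookup(wi, lod)
    return total / n_samples
```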

2.
We propose a novel rendering method which supports interactive BRDF editing as well as relighting on a 3D scene. For interactive BRDF editing, we linearize an analytic BRDF model with basis BRDFs obtained from a principal component analysis. For each basis BRDF, the radiance transfer is precomputed and stored in vector form. At rendering time, the illumination of a point is computed by multiplying the radiance transfer vectors of the basis BRDFs by the incoming radiance from gather samples and then linearly combining the results weighted by user‐controlled parameters. To improve the level of accuracy, a set of sub‐area samples associated with a gather sample refines the glossy reflection of the geometric details without increasing the precomputation time. We demonstrate this approach with a number of examples to verify the real‐time performance of relighting and BRDF editing on 3D scenes with complex lighting and geometry.
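The run-time combination step can be summarized in a few lines; the following numpy sketch assumes a hypothetical layout in which each basis BRDF contributes one precomputed transfer vector over the gather samples.

```python
import numpy as np

def shade_point(transfer_vectors, incoming_radiance, brdf_weights):
    """Relight one surface point from precomputed basis-BRDF transfer.

    transfer_vectors : (K, S) precomputed transfer, one row per basis BRDF,
                       one column per gather sample (hypothetical layout).
    incoming_radiance: (S, 3) RGB radiance arriving from the gather samples.
    brdf_weights     : (K,)  user-controlled coefficients of the edited BRDF.
    """
    # Radiance contribution of each basis BRDF: (K, 3)
    per_basis = transfer_vectors @ incoming_radiance
    # Linear combination weighted by the edited BRDF coefficients: (3,)
    return brdf_weights @ per_basis

# Toy usage: 4 basis BRDFs, 128 gather samples.
rng = np.random.default_rng(0)
T = rng.random((4, 128)) / 128.0
L = rng.random((128, 3))
w = np.array([0.7, 0.2, 0.05, 0.05])
print(shade_point(T, L, w))
```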

3.
We introduce an image‐based representation, called volumetric billboards, allowing for the real‐time rendering of semi‐transparent and visually complex objects arbitrarily distributed in a 3D scene. Our representation offers a full parallax effect from any viewing direction and improved anti‐aliasing of distant objects. It correctly handles transparency between multiple and possibly overlapping objects without requiring any primitive sorting. Furthermore, volumetric billboards can be easily integrated into common rasterization‐based renderers, which allows for their concurrent use with polygonal models and standard rendering techniques such as shadow‐mapping. The representation is based on volumetric images of the objects and on a dedicated real‐time volume rendering algorithm that takes advantage of the GPU geometry shader. Our examples demonstrate the applicability of the method in many cases, including level‐of‐detail representation for multiple intersecting complex objects, volumetric textures, animated objects and construction of high‐resolution objects by assembling instances of low‐resolution volumetric billboards.

4.
Indirect illumination is an important element for realistic image synthesis, but its computation is expensive and highly dependent on the complexity of the scene and of the BRDF of the involved surfaces. While off‐line computation and pre‐baking can be acceptable for some cases, many applications (games, simulators, etc.) require real‐time or interactive approaches to evaluate indirect illumination. We present a novel algorithm to compute indirect lighting in real time that avoids costly precomputation steps and is not restricted to low‐frequency illumination. It is based on a hierarchical voxel octree representation generated and updated on the fly from a regular scene mesh, coupled with an approximate voxel cone tracing that allows for a fast estimation of the visibility and incoming energy. Our approach can manage two light bounces for both Lambertian and glossy materials at interactive framerates (25–70 FPS). It exhibits an almost scene‐independent performance and can handle complex scenes with dynamic content thanks to an interactive octree‐voxelization scheme. In addition, we demonstrate that our voxel cone tracing can be used to efficiently estimate ambient occlusion.
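As a rough illustration of the cone-tracing step (not the paper's GPU octree implementation), the sketch below marches a cone through a hypothetical dense voxel mip pyramid, sampling coarser levels as the cone widens and compositing front to back.

```python
import numpy as np

def trace_cone(pyramid, origin, direction, half_angle, max_dist, base_voxel_size):
    """Accumulate radiance/occlusion along a cone through a voxel mip pyramid.

    pyramid[l] is a dense (res, res, res, 2) array at level l storing
    (radiance, opacity) pre-filtered from level 0 -- a stand-in for the
    GPU 3D mip chain used by voxel cone tracing. Positions live in [0,1)^3
    and `direction` is assumed normalized.
    """
    color, alpha = 0.0, 0.0
    t = base_voxel_size  # start one voxel away to avoid self-intersection
    while t < max_dist and alpha < 0.95:
        diameter = max(base_voxel_size, 2.0 * np.tan(half_angle) * t)
        level = min(len(pyramid) - 1, int(np.log2(diameter / base_voxel_size)))
        res = pyramid[level].shape[0]
        p = origin + t * direction
        idx = np.clip((p * res).astype(int), 0, res - 1)
        sample_c, sample_a = pyramid[level][tuple(idx)]
        # Front-to-back compositing of the pre-filtered sample.
        color += (1.0 - alpha) * sample_a * sample_c
        alpha += (1.0 - alpha) * sample_a
        t += 0.5 * diameter  # step proportional to the cone width
    return color, alpha
```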

5.
We propose a unified rendering approach that jointly handles motion and defocus blur for transparent and opaque objects at interactive frame rates. Our key idea is to create, in an initial rasterization step, a sampled representation of all parts of the scene geometry that are potentially visible at any point in time for the duration of a frame. We store the resulting temporally‐varying fragments (t‐fragments) in a bounding volume hierarchy which is rebuilt every frame using a fast spatial median construction algorithm. This makes our approach suitable for interactive applications with dynamic scenes and animations. Next, we perform spatial sampling to determine all t‐fragments that intersect with a specific viewing ray at any point in time. Viewing rays are sampled according to the lens uv‐sampling for depth‐of‐field effects. In a final temporal sampling step, we evaluate the predetermined viewing ray/t‐fragment intersections for one or multiple points in time. This allows us to incorporate all standard shading effects, including transparency. We describe the overall framework, present our GPU implementation, and evaluate our rendering approach with respect to scalability, quality, and performance.
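The lens uv-sampling mentioned above is the standard thin-lens construction; a small sketch, assuming a camera-space setup looking down −Z and a hypothetical `pixel_dir` produced by the pinhole camera:

```python
import numpy as np

def concentric_disk_sample(u1, u2):
    """Map a unit-square sample to the unit disk (Shirley-Chiu mapping)."""
    ox, oy = 2.0 * u1 - 1.0, 2.0 * u2 - 1.0
    if ox == 0.0 and oy == 0.0:
        return 0.0, 0.0
    if abs(ox) > abs(oy):
        r, theta = ox, (np.pi / 4.0) * (oy / ox)
    else:
        r, theta = oy, np.pi / 2.0 - (np.pi / 4.0) * (ox / oy)
    return r * np.cos(theta), r * np.sin(theta)

def thin_lens_ray(pixel_dir, lens_radius, focal_distance, u1, u2):
    """Perturb a pinhole camera ray (camera space, looking down -Z)
    into a thin-lens ray for depth-of-field sampling."""
    lx, ly = concentric_disk_sample(u1, u2)
    lens_point = np.array([lens_radius * lx, lens_radius * ly, 0.0])
    # Point on the plane of focus hit by the original pinhole ray.
    t_focus = focal_distance / -pixel_dir[2]
    focus_point = t_focus * pixel_dir
    new_dir = focus_point - lens_point
    return lens_point, new_dir / np.linalg.norm(new_dir)
```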

6.
We present Forward Light Cuts, a novel approach to real‐time global illumination using forward rendering techniques. We focus on unshadowed diffuse interactions for the first indirect light bounce in the context of large models, such as the complex scenes usually encountered in CAD application scenarios. Our approach efficiently generates and uses a multiscale radiance cache by exploiting the geometry‐specific stages of the graphics pipeline, namely the tessellator unit and the geometry shader. To do so, we assimilate virtual point lights to the scene's triangles and design a stochastic decimation process chained with a partitioning strategy that accounts for both close‐by strong light reflections and distant regions from which numerous virtual point lights collectively contribute strongly to the end pixel. Our probabilistic solution is supported by a mathematical analysis and a number of experiments covering a wide range of application scenarios. As a result, our algorithm requires no precomputation of any kind, is compatible with dynamic viewpoints, lighting conditions, geometry and materials, and scales to tens of millions of polygons on current graphics hardware.
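Stochastic decimation of virtual point lights can be kept unbiased by reweighting the survivors with the inverse of their keep probability; the sketch below shows that general idea with a simple distance-based importance, not the paper's triangle-based partitioning scheme.

```python
import numpy as np

def decimate_vpls(vpl_positions, vpl_flux, shading_point, keep_budget):
    """Russian-roulette style decimation of virtual point lights.

    The keep probability grows with a VPL's potential contribution (here a
    simple 1/d^2 falloff); surviving flux is divided by that probability so
    the expected contribution is unchanged. This illustrates only the
    unbiased-decimation idea, not the paper's exact scheme.
    """
    d2 = np.sum((vpl_positions - shading_point) ** 2, axis=1) + 1e-6
    importance = vpl_flux / d2
    p_keep = np.minimum(1.0, keep_budget * importance / importance.sum())
    survivors = np.random.rand(len(vpl_flux)) < p_keep
    return vpl_positions[survivors], vpl_flux[survivors] / p_keep[survivors]

# Toy usage: 10k VPLs reduced to roughly 256 reweighted survivors.
rng = np.random.default_rng(1)
pos = rng.random((10000, 3))
flux = rng.random(10000)
kept_pos, kept_flux = decimate_vpls(pos, flux, np.array([0.5, 0.5, 0.5]), 256)
print(len(kept_pos), kept_flux.sum())
```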

7.
One of the most accurate yet still practical representations of material appearance is the Bidirectional Texture Function (BTF). The BTF can be viewed as an extension of the Bidirectional Reflectance Distribution Function (BRDF) with additional spatial information that captures local visual effects such as shadowing, interreflection, subsurface scattering, etc. However, the shift from BRDF to BTF represents not only a huge leap in the realism of material reproduction, but also brings high memory and computational costs stemming from the storage and processing of massive BTF data. In this work, we argue that each opaque material, regardless of its surface structure, can be safely substituted by a BRDF without introducing a significant perceptual error when viewed from an appropriate distance. Therefore, we ran a set of psychophysical studies over 25 materials to determine so‐called critical viewing distances, i.e. the minimal distances at which the material's spatial structure (texture) cannot be visually discerned. Our analysis determined typical distances for several material categories often used in interior design applications. Furthermore, we propose a combination of computational features that can predict such distances without the need for a psychophysical study. We show that our work can significantly reduce rendering costs in applications that process complex virtual scenes.

8.
Rendering animations of scenes with deformable objects, camera motion, and complex illumination, including indirect lighting and arbitrary shading, is a long‐standing challenge. Prior work has shown that complex lighting can be accurately approximated by a large collection of point lights. In this formulation, rendering of animation sequences becomes the problem of efficiently shading many surface samples from many lights across several frames. This paper presents a tensor formulation of the animated many‐light problem, where each element of the tensor expresses the contribution of one light to one pixel in one frame. We sparsely sample rows and columns of the tensor, and introduce a clustering algorithm to select a small number of representative lights to efficiently approximate the animation. Our algorithm achieves efficiency by reusing representatives across frames while minimizing temporal flicker. We demonstrate our algorithm in a variety of scenes that include deformable objects, complex illumination and arbitrary shading, and show that a surprisingly small number of representative lights is sufficient for high‐quality rendering. We believe our algorithm will find practical use in applications that require fast previews of complex animation.
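A toy version of the row/column sampling and clustering idea, with the tensor flattened over pixels and frames; the similarity measure and greedy center selection here are deliberate simplifications of the paper's algorithm.

```python
import numpy as np

def representative_lights(flat_tensor, n_rows=64, n_clusters=16, seed=0):
    """Pick representative lights from a (pixels*frames) x lights matrix.

    flat_tensor is the many-light tensor flattened over pixels and frames.
    A random subset of rows is used to cluster the light columns; each
    cluster is replaced by one light scaled to carry the cluster's energy.
    A toy stand-in for the paper's sparse sampling + clustering.
    """
    rng = np.random.default_rng(seed)
    rows = rng.choice(flat_tensor.shape[0], size=n_rows, replace=False)
    reduced = flat_tensor[rows]                      # (n_rows, n_lights)
    norms = np.linalg.norm(reduced, axis=0) + 1e-12
    directions = reduced / norms                     # unit "profile" per light
    # Greedy clustering: seed the centers with the strongest lights.
    centers = np.argsort(norms)[-n_clusters:]
    labels = np.argmax(directions.T @ directions[:, centers], axis=1)
    reps, scales = [], []
    for c in range(n_clusters):
        members = np.where(labels == c)[0]
        if len(members) == 0:
            continue
        rep = members[np.argmax(norms[members])]     # strongest member represents cluster
        reps.append(rep)
        scales.append(norms[members].sum() / norms[rep])
    return np.array(reps), np.array(scales)
```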

9.
Measured reflection data such as the bidirectional texture function (BTF) represent spatial variation under the full hemisphere of view and light directions and offer a very realistic visual appearance. Despite its high‐dimensional nature, recent compression techniques allow rendering of BTFs in real time. Nevertheless, a still unsolved problem is that there is no representation suited for real‐time rendering that can be used by designers to modify the BTF's appearance. For intuitive editing, a set of low‐dimensional comprehensible parameters, stored as scalars, colour values or texture maps, is required. In this paper we present a novel way to represent BTF data by introducing the geometric BRDF (g‐BRDF), which describes both the underlying meso‐ and micro‐scale structure in a very compact way. Both are stored in texture maps with only a few additional scalar parameters that can all be modified at runtime, giving the designer full control over the material's appearance in the final real‐time application. The g‐BRDF not only allows intuitive editing, but also reduces the measured data to a small set of textures, yielding a very effective compression method. In contrast to common material representations combining heightfields and BRDFs, our g‐BRDF is physically based and derived from direct measurement, thus representing real‐world surface appearance. In addition, we propose an algorithm for fully automatic decomposition of a given measured BTF into the g‐BRDF representation.

10.
Particle‐based simulation techniques, like the discrete element method or molecular dynamics, are widely used in many research fields. In real‐time explorative visualization it is common to render the resulting data using opaque spherical glyphs with local lighting only. Due to massive overlaps, however, inner structures of the data are often occluded, rendering visual analysis impossible. Furthermore, local lighting is not sufficient, as several important features like complex shapes, holes, rifts or filaments cannot be perceived well. To address both problems we present a new technique that jointly supports transparency and ambient occlusion in a consistent illumination model. Our approach is based on the emission‐absorption model of volume rendering. We provide analytic solutions to the volume rendering integral for several density distributions within a spherical glyph. Compared to constant transparency, our approach preserves the three‐dimensional impression of the glyphs much better. We approximate ambient illumination with a fast hierarchical voxel cone‐tracing approach, which builds on a new real‐time voxelization of the particle data. Our implementation achieves interactive frame rates for millions of static or dynamic particles without any preprocessing. We illustrate the merits of our method on real‐world data sets, gaining several new insights.
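For the simplest case of a constant density inside the glyph, the emission-absorption integral has a closed form: the transmittance depends only on the chord length of the ray through the sphere. A minimal sketch of that case (other density profiles also admit closed forms):

```python
import numpy as np

def sphere_glyph_alpha(ray_origin, ray_dir, center, radius, sigma):
    """Opacity of a constant-density spherical glyph along a ray.

    For a homogeneous density sigma the volume-rendering integral reduces
    to alpha = 1 - exp(-sigma * chord_length). ray_dir must be normalized.
    """
    oc = ray_origin - center
    b = np.dot(ray_dir, oc)
    c = np.dot(oc, oc) - radius * radius
    disc = b * b - c
    if disc <= 0.0:
        return 0.0                         # ray misses the sphere
    chord = 2.0 * np.sqrt(disc)            # distance between the two hit points
    return 1.0 - np.exp(-sigma * chord)

print(sphere_glyph_alpha(np.array([0., 0., -5.]), np.array([0., 0., 1.]),
                         np.array([0., 0., 0.]), 1.0, sigma=1.5))
```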

11.
Interactive rendering with dynamic natural lighting and changing view is a long‐standing goal in computer graphics. Recently, precomputation‐based methods for all‐frequency relighting have made substantial progress in this direction. Many of the most successful algorithms are based on a factorization of the BRDF into incident and outgoing directions, enabling each term to be precomputed independently of viewing direction and re‐combined at run‐time. However, there has so far been no theoretical understanding of the accuracy of this factorization, nor of the number of terms needed. In this paper, we conduct a theoretical and empirical analysis of the BRDF in‐out factorization. For Phong BRDFs, we obtain analytic results, showing that the number of terms needed grows linearly with the Phong exponent, while the factors correspond closely to spherical harmonic basis functions. More generally, the number of terms is quadratic in the frequency content of the BRDF along the reflected or half‐angle direction. This analysis gives clear practical guidance on the number of factors needed for a given material. Different objects in a scene can each be represented with the correct number of terms needed for that particular BRDF, enabling both accuracy and interactivity.
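A quick numerical check of the factorization (not the paper's analytic derivation): tabulate a Phong BRDF over incident × outgoing hemisphere directions, take an SVD, and count the singular values above a relative threshold. Note that the angular resolution must grow with the exponent for the specular lobe to be resolved.

```python
import numpy as np

def hemisphere_dirs(res):
    """Grid of unit directions on the upper hemisphere (theta x phi)."""
    theta = (np.arange(res) + 0.5) / res * (0.5 * np.pi)
    phi = (np.arange(2 * res) + 0.5) / (2 * res) * (2.0 * np.pi)
    t, p = np.meshgrid(theta, phi, indexing="ij")
    d = np.stack([np.sin(t) * np.cos(p), np.sin(t) * np.sin(p), np.cos(t)], -1)
    return d.reshape(-1, 3)

def phong_in_out_matrix(exponent, res=16):
    """Phong lobe tabulated as a (num_wi x num_wo) matrix for SVD factorization."""
    wi = hemisphere_dirs(res)
    wo = hemisphere_dirs(res)
    refl = wi * np.array([-1.0, -1.0, 1.0])          # mirror wi about the normal (+Z)
    return np.maximum(0.0, refl @ wo.T) ** exponent

def terms_needed(matrix, rel_tol=1e-2):
    """Number of singular values above rel_tol times the largest one."""
    s = np.linalg.svd(matrix, compute_uv=False)
    return int(np.sum(s > rel_tol * s[0]))

for n in (8, 32, 128):
    print(f"Phong exponent {n:4d}: ~{terms_needed(phong_in_out_matrix(n))} terms")
```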

12.
Generating photo‐realistic images through Monte Carlo rendering requires an efficient representation of light–surface interaction and techniques for importance sampling. Various models with good representation abilities have been developed, but only a few of them come with an importance sampling procedure. In this paper, we propose a method which provides both a good bidirectional reflectance distribution function (BRDF) representation and an efficient importance sampling procedure. Our method is based on representing the BRDF as a function of tensor products. Four‐dimensional measured BRDF tensor data are factorized using Tucker decomposition. A large data set is used for comparing the proposed BRDF model with a number of well‐known BRDF models. It is shown that the underlying model provides a good approximation to measured BRDFs.
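A minimal HOSVD-style Tucker factorization in plain numpy, standing in for the paper's decomposition of a 4D measured-BRDF tensor; a real pipeline would more likely use a dedicated tensor library, and the random tensor here is only a placeholder for measured data.

```python
import numpy as np

def unfold(tensor, mode):
    """Mode-n unfolding: move axis `mode` to the front and flatten the rest."""
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def mode_multiply(tensor, matrix, mode):
    """n-mode product: multiply the tensor by a matrix along one mode."""
    out = np.tensordot(matrix, tensor, axes=(1, mode))
    return np.moveaxis(out, 0, mode)

def tucker_hosvd(tensor, ranks):
    """Truncated HOSVD: factor matrices from each unfolding, then the core."""
    factors = []
    for mode, r in enumerate(ranks):
        u, _, _ = np.linalg.svd(unfold(tensor, mode), full_matrices=False)
        factors.append(u[:, :r])
    core = tensor
    for mode, u in enumerate(factors):
        core = mode_multiply(core, u.T, mode)
    return core, factors

def tucker_reconstruct(core, factors):
    out = core
    for mode, u in enumerate(factors):
        out = mode_multiply(out, u, mode)
    return out

# Toy 4D "BRDF" tensor over (theta_in, phi_in, theta_out, phi_out).
rng = np.random.default_rng(0)
brdf = rng.random((16, 16, 16, 16))
core, factors = tucker_hosvd(brdf, ranks=(8, 8, 8, 8))
approx = tucker_reconstruct(core, factors)
print("relative error:", np.linalg.norm(brdf - approx) / np.linalg.norm(brdf))
```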

13.
14.
State‐of‐the‐art car paint shows not only interesting and subtle angular dependency but also significant spatial variation. Especially in sunlight, these variations remain visible even at distances of up to a few meters and give the coating a strong impression of depth, which cannot be reproduced by a single BRDF model and the kind of procedural noise textures typically used. Instead of explicitly modeling the effect particles responsible, we propose to use image‐based reflectance measurements of real paint samples and to represent their spatially varying part by Bidirectional Texture Functions (BTF). We use classical BRDF models like Cook‐Torrance to represent the reflection behavior of the base paint and the highly specular finish, and demonstrate how the parameters of these models can be derived from the BTF measurements. For rendering, the image‐based spatially varying part is compressed and efficiently synthesized. This paper introduces the first hybrid analytical and image‐based representation for car paint and enables the photo‐realistic rendering of all significant effects of highly complex coatings.
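The analytic part of such a hybrid model is the classic Cook-Torrance lobe; a sketch of its evaluation with the Beckmann distribution, Schlick Fresnel and the Cook-Torrance geometry term (the roughness and F0 values below are placeholders, not fitted paint parameters):

```python
import numpy as np

def cook_torrance(n, v, l, roughness, f0):
    """Standard Cook-Torrance specular lobe: D * F * G / (4 (n.v)(n.l)).

    Beckmann normal distribution, Schlick Fresnel, Cook-Torrance
    shadowing/masking. Fitted roughness/F0 per paint layer would come
    from the BTF measurements; the values used here are placeholders.
    """
    h = (v + l) / np.linalg.norm(v + l)
    nv, nl = max(np.dot(n, v), 1e-4), max(np.dot(n, l), 1e-4)
    nh, vh = max(np.dot(n, h), 1e-4), max(np.dot(v, h), 1e-4)
    m2 = roughness * roughness
    # Beckmann normal distribution function.
    d = np.exp((nh * nh - 1.0) / (m2 * nh * nh)) / (np.pi * m2 * nh ** 4)
    # Schlick's Fresnel approximation.
    f = f0 + (1.0 - f0) * (1.0 - vh) ** 5
    # Cook-Torrance shadowing/masking term.
    g = min(1.0, 2.0 * nh * nv / vh, 2.0 * nh * nl / vh)
    return d * f * g / (4.0 * nv * nl)

n = np.array([0.0, 0.0, 1.0])
v = np.array([0.0, 0.3, 1.0]); v /= np.linalg.norm(v)
l = np.array([0.2, -0.1, 1.0]); l /= np.linalg.norm(l)
print(cook_torrance(n, v, l, roughness=0.15, f0=0.05))
```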

15.
Irradiance Caching is one of the most widely used algorithms to speed up global illumination. In this paper, we propose an algorithm based on the Irradiance Caching scheme that allows us (1) to adjust the density of cached records according to illumination changes and (2) to efficiently render high‐frequency illumination changes. To achieve this, a new record footprint is presented. Whereas the original method uses records with circular footprints depending only on geometrical features, our record footprints have a more complex shape which accounts for both geometry and irradiance variations. Irradiance values are computed using a classical Monte Carlo ray tracing method that simplifies the determination of nearby objects and the pre‐computation of the shape of the influence zone of the current record. By gathering the irradiance due to all the incident rays, illumination changes are evaluated to adjust the records' footprints. As a consequence, the record footprints are smaller where illumination gradients are high. With this technique, the record density depends on the irradiance variations. Strong variations of irradiance (due to direct contributions, for example) can be handled and evaluated accurately. Caching direct illumination is of high importance, especially in the case of scenes having many light sources with complex geometry as well as surfaces exposed to daylight. Recomputing direct illumination for the whole image can be very time‐consuming, especially for walkthrough animation rendering or for high‐resolution pictures. Storing such contributions in the irradiance cache seems to be an appropriate solution to accelerate the final rendering pass.
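For context, the classic irradiance-cache interpolation weight that such footprint schemes build on is sketched below; the `gradient_scale` knob is a hypothetical stand-in for the paper's irradiance-dependent footprint, shrinking a record's influence zone where variation is strong.

```python
import numpy as np

def record_weight(p, n, rec_pos, rec_normal, rec_harmonic_dist, gradient_scale=1.0):
    """Ward-style irradiance-cache weight for a shading point (p, n).

    gradient_scale < 1 mimics the idea of shrinking a record's footprint
    where irradiance variation is strong (a hypothetical knob, not the
    paper's exact footprint shape).
    """
    r = rec_harmonic_dist * gradient_scale
    d = np.linalg.norm(p - rec_pos)
    denom = d / r + np.sqrt(max(0.0, 1.0 - np.dot(n, rec_normal)))
    return 1.0 / max(denom, 1e-6)

def interpolate_irradiance(p, n, records, accuracy=0.5):
    """Weighted average of valid records; returns None if a new record is needed."""
    total, weight_sum = 0.0, 0.0
    for rec in records:  # rec: dict with pos, normal, harmonic_dist, irradiance, grad_scale
        w = record_weight(p, n, rec["pos"], rec["normal"],
                          rec["harmonic_dist"], rec["grad_scale"])
        if w > 1.0 / accuracy:
            total += w * rec["irradiance"]
            weight_sum += w
    return total / weight_sum if weight_sum > 0.0 else None
```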

16.
Ambient occlusion is a cheap but effective approximation of global illumination. Recently, screen‐space ambient occlusion (SSAO) methods, which sample the frame buffer as a discretization of the scene geometry, have become very popular for real‐time rendering. We present temporal SSAO (TSSAO), a new algorithm which exploits temporal coherence to produce high‐quality ambient occlusion in real time. Compared to conventional SSAO, our method reduces both the noise and the blurring artefacts caused by strong spatial filtering, faithfully representing fine‐grained geometric structures. Our algorithm caches and reuses previously computed SSAO samples, and adaptively applies more samples and spatial filtering only in regions that do not yet have enough information available from previous frames. The method works well for both static and dynamic scenes.
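The temporal refinement loop boils down to a reprojection test and a count-weighted blend per pixel; a minimal sketch, with the validity test reduced to a world-space position comparison:

```python
import numpy as np

def refine_ao(ao_new, pos_current, pos_reprojected, history_ao, history_count,
              max_samples=32, pos_tolerance=0.01):
    """One temporal-SSAO update for a single pixel.

    history_ao / history_count hold the accumulated AO and how many frames
    contributed to it; a failed reprojection test (disocclusion or moving
    geometry) resets the history so the pixel is resampled from scratch.
    """
    if np.linalg.norm(pos_current - pos_reprojected) > pos_tolerance:
        return ao_new, 1                              # invalid history: start over
    count = min(history_count + 1, max_samples)       # clamp so old data can fade out
    weight = 1.0 / count
    return (1.0 - weight) * history_ao + weight * ao_new, count
```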

17.
We present a new Precomputed Radiance Transfer (PRT) algorithm based on a two‐dimensional representation of isotropic BRDFs. Our approach involves precomputing matrices that allow quickly mapping the environment lighting, which is represented in the global coordinate system, and the surface BRDFs, which are represented in a bivariate domain, to the local hemisphere at a surface location where the reflection integral is evaluated. When the lighting and BRDFs are represented in a wavelet basis, these rotation matrices are sparse and can be efficiently stored and combined with pre‐computed visibility at run‐time. Compared to prior techniques that also precompute wavelet rotation matrices, our method allows full control over the lighting and materials due to the way the BRDF is represented. Furthermore, this bivariate parameterization preserves sharp specular peaks and grazing effects that are attenuated in conventional parameterizations. We demonstrate a prototype rendering system that achieves real‐time framerates while lighting and materials are edited.

18.
This work presents a new representation used as a rendering primitive for surfaces. Our representation is defined by an arbitrary cubic cell complex: a projection‐based parameterization domain for surfaces where geometry and appearance information are stored as tile textures. This representation is used by our ray casting rendering algorithm, called projection mapping, which can render the geometry and appearance details of surfaces from arbitrary viewpoints. The projection mapping algorithm uses a fragment shader based on the linear and binary searches of the relief mapping algorithm. Instead of rendering the surface in the traditional way, only the front faces of our rendering primitive (the arbitrary cubic cell complex) are drawn, and the geometry and appearance details of the surface are then rendered using projection mapping. Alternatively, another method is proposed for mapping appearance information onto complex surfaces using our arbitrary cubic cell complexes. In this case, instead of reconstructing the geometry as in projection mapping, the original mesh of the surface is passed directly to the rendering algorithm. This algorithm is applied to the texture mapping of cultural heritage sculptures.
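The linear-then-binary search inherited from relief mapping can be sketched as follows, assuming a hypothetical `height_at(x, y)` lookup and ray entry/exit points expressed in the tile's texture space with depth running from 0 (top surface) to 1 (bottom):

```python
import numpy as np

def relief_intersect(height_at, entry, exit, linear_steps=32, binary_steps=8):
    """Find where a ray first dips below a heightfield.

    height_at(x, y) returns the stored height in [0, 1] (hypothetical lookup);
    entry/exit are the ray's (x, y, depth) where it enters and leaves the
    tile volume, with depth 0 at the top surface and 1 at the bottom.
    """
    entry, exit = np.asarray(entry, float), np.asarray(exit, float)
    # Linear search: first sample that falls below the surface.
    prev_t, hit_t = 0.0, 1.0
    for i in range(1, linear_steps + 1):
        t = i / linear_steps
        p = entry + t * (exit - entry)
        if p[2] >= 1.0 - height_at(p[0], p[1]):   # below the surface
            hit_t = t
            break
        prev_t = t
    # Binary search refines the interval [prev_t, hit_t].
    for _ in range(binary_steps):
        mid = 0.5 * (prev_t + hit_t)
        p = entry + mid * (exit - entry)
        if p[2] >= 1.0 - height_at(p[0], p[1]):
            hit_t = mid
        else:
            prev_t = mid
    return entry + hit_t * (exit - entry)
```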

19.
We introduce a screen‐space statistical filtering method for real‐time rendering with global illumination. It is inspired by the statistical filtering proposed by Meyer et al., which reduces the noise in global illumination over a period of time by estimating the principal components from all rendered frames. Our work extends their method to achieve nearly real‐time performance on modern GPUs. More specifically, our method employs the candid covariance‐free incremental PCA to overcome several limitations of the original algorithm by Meyer et al., such as its high computational cost and memory usage, which hinder its implementation on GPUs. By combining reprojection and per‐pixel weighting techniques, our method also handles view changes and object movement in dynamic scenes.
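The core of candid covariance-free incremental PCA (CCIPCA) is a per-sample update of the eigenvector estimates with deflation, so no covariance matrix is ever built; a minimal numpy sketch of that recurrence (the amnesic parameter, when positive, favors recent frames, which is useful for dynamic scenes):

```python
import numpy as np

def ccipca_update(components, counts, x, amnesia=0.0):
    """One CCIPCA step: fold a new sample x into the eigenvector estimates.

    components[k] stores the (unnormalized) k-th eigenvector estimate; its
    norm estimates the eigenvalue. The residual is deflated before updating
    the next component, so no covariance matrix is ever formed.
    """
    residual = np.asarray(x, dtype=float).copy()
    for k in range(len(components)):
        n = counts[k]                      # samples already folded into component k
        if n == 0:
            components[k] = residual.copy()
        else:
            v = components[k]
            w_old = (n - amnesia) / (n + 1.0)
            w_new = (1.0 + amnesia) / (n + 1.0)
            components[k] = w_old * v + w_new * residual * (residual @ v) / np.linalg.norm(v)
        counts[k] += 1
        unit = components[k] / np.linalg.norm(components[k])
        residual = residual - (residual @ unit) * unit   # deflate for next component
    return components, counts

# Toy usage: 200 "frames" with one dominant direction of variation.
rng = np.random.default_rng(0)
frames = rng.normal(size=(200, 64)) * np.array([5.0] + [1.0] * 63)
comps, counts = [np.zeros(64) for _ in range(3)], [0, 0, 0]
for f in frames:
    ccipca_update(comps, counts, f)
print([round(float(np.linalg.norm(c)), 2) for c in comps])  # rough eigenvalue estimates
```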

20.
Shadow mapping has been extensively used for real‐time shadow rendering in 3D computer games, though it suffers from inherent aliasing problems due to its image‐based nature. This paper presents an enhanced variant of light space perspective shadow maps to optimize the perspective aliasing distribution in the general case where the light and view directions are not orthogonal. To be mathematically sound, the generalized representation of perspective aliasing errors is derived in detail. Our experiments show the enhanced shadow quality obtained with our algorithm in dynamic scenes.
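For reference, implementations of light space perspective shadow maps commonly place the near plane of the warping frustum at n = (z_n + √(z_n·z_f)) / sin γ, where γ is the angle between the light and view directions; treat this as the commonly quoted baseline rather than the paper's refined result, which redistributes the error for such general configurations.

```python
import numpy as np

def lispsm_near_plane(z_near, z_far, gamma):
    """Near-plane distance of the LiSPSM warping frustum.

    Uses the commonly cited choice n = (z_n + sqrt(z_n * z_f)) / sin(gamma),
    where gamma is the angle between light and view directions. The paper
    above derives a refined aliasing distribution; this is the baseline it
    improves on (assumption: the widely used formula, not the paper's).
    """
    sin_gamma = max(np.sin(gamma), 1e-4)   # avoid blow-up as gamma approaches 0
    return (z_near + np.sqrt(z_near * z_far)) / sin_gamma

for g_deg in (90.0, 60.0, 30.0):
    print(g_deg, lispsm_near_plane(1.0, 400.0, np.radians(g_deg)))
```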
