Efficient and robust 3D line drawings using difference-of-Gaussian
Affiliation: 1. Hangzhou Dianzi University, Hangzhou, Zhejiang Province 310018, China; 2. Nanyang Technological University, 50 Nanyang Avenue, Singapore 639798, Singapore; 3. Central South University, Changsha, Hunan Province 410083, China; 4. Fraunhofer IDM@NTU, 50 Nanyang Avenue, Singapore 639798, Singapore
Abstract: Line drawings are widely used for sketches, animations, and technical illustrations because they are expressive and easy to draw. Existing computer-generated lines, such as suggestive contours, apparent ridges, and demarcating curves, adopt a two-pass framework: in the first pass, certain geometric features or properties are extracted or computed in object space; in the second pass, the line drawings are rendered by iterating over each polygonal face or edge. These approaches are known to be very sensitive to mesh quality and usually require appropriate preprocessing (e.g., smoothing or remeshing) of the input meshes. This paper presents a simple yet robust approach to generating view-dependent line drawings for 3D models. Inspired by image edge detectors, we compute the difference-of-Gaussian (DoG) of illumination on the 3D model. Under moderate assumptions, we show that all the expensive computations can be done in a precomputation stage. Our method naturally integrates object and image space: we compute the geometric features in object space and then adopt a simple fragment shader to render the lines in image space. As a result, our algorithm is more efficient than existing object-space approaches, since the lines are generated in a single pass without iterating over mesh edges or faces. Furthermore, our method is more flexible and robust than existing algorithms in that it requires no preprocessing of the input 3D models. Finally, the difference-of-Gaussian operator can be extended to an anisotropic setting guided by local geometric features. Promising experimental results on a wide range of real-world models demonstrate the effectiveness and robustness of our method.
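To make the core idea concrete, the following is a minimal, purely image-space sketch of DoG-based line extraction: render a Lambertian illumination image and threshold its difference-of-Gaussian response. This illustrates only the DoG-of-illumination principle, not the paper's actual method, which precomputes the Gaussians in object space and evaluates the lines in a fragment shader; the sphere renderer, the sigma ratio k, and the threshold tau below are all illustrative assumptions.

import numpy as np
from scipy.ndimage import gaussian_filter

def lambertian_sphere(size=256, light=(0.5, 0.5, 1.0)):
    """Render the n·l illumination of a unit sphere on a size x size grid."""
    l = np.asarray(light, dtype=float)
    l /= np.linalg.norm(l)
    ys, xs = np.mgrid[-1:1:size * 1j, -1:1:size * 1j]
    r2 = xs**2 + ys**2
    inside = r2 <= 1.0
    zs = np.sqrt(np.clip(1.0 - r2, 0.0, None))
    # For a unit sphere the surface normal equals the surface position.
    illum = np.clip(xs * l[0] + ys * l[1] + zs * l[2], 0.0, None)
    illum[~inside] = 0.0
    return illum

def dog_lines(illum, sigma=1.6, k=1.6, tau=0.005):
    """Mark pixels where the DoG response of the illumination exceeds tau.

    sigma, k (ratio of the two Gaussian widths), and tau are assumed
    tuning parameters, not values from the paper.
    """
    dog = gaussian_filter(illum, sigma) - gaussian_filter(illum, k * sigma)
    return dog > tau  # boolean mask of line pixels

if __name__ == "__main__":
    mask = dog_lines(lambertian_sphere())
    print(f"{mask.sum()} line pixels detected")

In the paper's setting the analogous smoothing is done over the mesh, which is what allows the expensive Gaussian convolutions to be moved into the precomputation stage and leaves only a cheap per-fragment test at render time.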