Full text (subscription)   20 articles
Free   0 articles

By subject:
  Metal Technology   1
  General Industrial Technology   1
  Metallurgical Industry   4
  Automation Technology   14

By year:
  2022   1
  2016   1
  2013   2
  2011   2
  2009   1
  2008   2
  2007   2
  2006   2
  2004   1
  2003   1
  2002   2
  2001   1
  1994   1
  1993   1

Sort order: 20 results found (search time: 15 ms)
1.
Volume illustration: nonphotorealistic rendering of volume models   (Total citations: 3; self-citations: 0; citations by others: 3)
Accurately and automatically conveying the structure of a volume model is a problem which has not been fully solved by existing volume rendering approaches. Physics-based volume rendering approaches create images which may match the appearance of translucent materials in nature but may not embody important structural details. Transfer function approaches allow flexible design of the volume appearance but generally require substantial hand-tuning for each new data set in order to be effective. We introduce the volume illustration approach, combining the familiarity of a physics-based illumination model with the ability to enhance important features using non-photorealistic rendering techniques. Since the features to be enhanced are defined on the basis of local volume characteristics rather than volume sample values, the application of volume illustration techniques requires less manual tuning than the design of a good transfer function. Volume illustration provides a flexible unified framework for enhancing the structural perception of volume models through the amplification of features and the addition of illumination effects.
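The abstract's key idea is enhancement driven by local volume characteristics (such as the gradient) rather than raw sample values. As a hedged sketch of one common operator of this kind, not necessarily the authors' exact formulation, and with made-up knobs `k_strength` and `k_exponent`, a gradient-magnitude opacity boost might look like:

```python
import numpy as np

def boundary_enhanced_opacity(volume, base_opacity=0.1, k_strength=2.0, k_exponent=1.0):
    """Boost per-voxel opacity where the local gradient magnitude is large,
    so material boundaries stand out without a hand-tuned transfer function."""
    gz, gy, gx = np.gradient(volume.astype(float))
    grad_mag = np.sqrt(gx**2 + gy**2 + gz**2)
    # Normalize gradient magnitude to [0, 1] over the whole volume.
    grad_norm = grad_mag / (grad_mag.max() + 1e-12)
    # Opacity modulation driven by local structure, not raw sample values.
    return np.clip(base_opacity * (1.0 + k_strength * grad_norm**k_exponent), 0.0, 1.0)

# Example: a solid sphere in a uniform background; only its surface is enhanced.
z, y, x = np.mgrid[-16:16, -16:16, -16:16]
vol = (np.sqrt(x**2 + y**2 + z**2) < 10).astype(float)
alpha = boundary_enhanced_opacity(vol)
```

Homogeneous regions keep the base opacity while boundary voxels are amplified, which is the structural-perception effect the paper describes.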
2.
Texture-based transfer functions for direct volume rendering   (Total citations: 1; self-citations: 0; citations by others: 1)
Visualization of volumetric data faces the difficult task of finding effective parameters for the transfer functions. Those parameters can determine the effectiveness and accuracy of the visualization. Frequently, volumetric data includes multiple structures and features that need to be differentiated. However, if those features have the same intensity and gradient values, existing transfer functions are limited in their ability to illustrate those similar features with different rendering properties. We introduce texture-based transfer functions for direct volume rendering. In our approach, the voxel's resulting opacity and color are based on local textural properties rather than individual intensity values. For example, if the intensity values of the vessels are similar to those on the boundary of the lungs, our texture-based transfer function will analyze the textural properties in those regions and color them differently even though they have the same intensity values in the volume. The use of texture-based transfer functions has several benefits. First, structures and features with the same intensity and gradient values can be automatically visualized with different rendering properties. Second, segmentation or prior knowledge of the specific features within the volume is not required for classifying these features differently. Third, textural metrics can be combined and/or maximized to capture and better differentiate similar structures. We demonstrate our texture-based transfer function for direct volume rendering with synthetic and real-world medical data to show the strength of our technique.
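To make the idea concrete: the paper maps local textural properties, not intensities, to opacity. A minimal sketch, assuming local standard deviation as the textural metric (the paper's actual metrics are likely richer, e.g. co-occurrence statistics), shows how two regions with identical mean intensity can receive different opacities:

```python
import numpy as np

def local_std(volume, size=3):
    """Per-voxel standard deviation over a size^3 neighborhood (valid windows only)."""
    win = np.lib.stride_tricks.sliding_window_view(volume, (size, size, size))
    return win.std(axis=(-3, -2, -1))

def texture_opacity(volume, size=3, scale=4.0):
    """Map a local textural property (here, neighborhood std) to opacity, so regions
    with equal intensity but different texture render with different properties."""
    t = local_std(volume.astype(float), size)
    return np.clip(scale * t / (t.max() + 1e-12), 0.0, 1.0)

# Two regions with the same mean intensity 0.5: one flat, one noisy (textured).
rng = np.random.default_rng(0)
vol = np.full((8, 8, 16), 0.5)
vol[:, :, 8:] += rng.normal(0.0, 0.1, (8, 8, 8))  # textured half
alpha = texture_opacity(vol)
```

An intensity-based transfer function would assign both halves the same opacity; the textural metric separates them with no segmentation step, which is the second benefit listed above.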
3.
Analyzing, visualizing, and illustrating changes within time-varying volumetric data is challenging due to the dynamic changes occurring between timesteps. The changes and variations in computational fluid dynamic volumes and atmospheric 3D datasets do not follow any particular transformation. Features within the data move at different speeds and directions, making the tracking and visualization of these features a difficult task. We introduce a texture-based feature tracking technique to overcome some of the current limitations found in the illustration and visualization of dynamic changes within time-varying volumetric data. Our texture-based technique tracks various features individually and then uses the tracked objects to better visualize structural changes. We show the effectiveness of our texture-based tracking technique with both synthetic and real-world time-varying data. Furthermore, we highlight the specific visualization, annotation, registration, and feature isolation benefits of our technique. For instance, we show how our texture-based tracking can lead to insightful visualizations of time-varying data. Such visualizations, more than traditional visualization techniques, can assist domain scientists to explore and understand dynamic changes.
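The core mechanism, re-locating a feature in the next timestep by matching a texture descriptor rather than raw positions, can be sketched in a toy 2D form. The three-number signature below is a deliberately crude stand-in for whatever textural descriptor the paper actually uses:

```python
import numpy as np

def texture_signature(patch):
    """A tiny texture descriptor: mean, std, and mean absolute finite difference."""
    d = np.abs(np.diff(patch, axis=0)).mean() + np.abs(np.diff(patch, axis=1)).mean()
    return np.array([patch.mean(), patch.std(), d])

def track_feature(prev_frame, next_frame, top_left, size, search=4):
    """Re-locate a feature in the next timestep by texture-signature matching
    within a local search window around its previous position."""
    r0, c0 = top_left
    target = texture_signature(prev_frame[r0:r0 + size, c0:c0 + size])
    best, best_pos = np.inf, (r0, c0)
    for r in range(max(0, r0 - search), min(next_frame.shape[0] - size, r0 + search) + 1):
        for c in range(max(0, c0 - search), min(next_frame.shape[1] - size, c0 + search) + 1):
            cand = texture_signature(next_frame[r:r + size, c:c + size])
            score = np.linalg.norm(cand - target)
            if score < best:
                best, best_pos = score, (r, c)
    return best_pos

# A textured blob moves from (3, 3) to (5, 4) between timesteps; tracking follows it.
rng = np.random.default_rng(1)
blob = rng.normal(1.0, 0.2, (5, 5))
f0 = np.zeros((20, 20)); f0[3:8, 3:8] = blob
f1 = np.zeros((20, 20)); f1[5:10, 4:9] = blob
pos = track_feature(f0, f1, (3, 3), 5)
```

Because matching is on texture rather than intensity alone, the same machinery extends to features that deform or drift at different speeds, which is the limitation the paper targets.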
4.
The kinetics of the precipitation of Co from a supersaturated solid solution of Cu-0.95 at. pct Co was investigated by isochronal annealing applying differential scanning calorimetry (DSC) with heating rates in the range 5 to 20 K·min⁻¹. The corresponding microstructural evolution was investigated by high-resolution transmission electron microscopy (HRTEM) in combination with electron energy loss spectroscopy (EELS). Upon isochronal annealing, spherical Co precipitates of fcc crystal structure form. Kinetic analysis by fitting of a modular phase transformation model to, simultaneously, all DSC curves of variable heating rate measured for Cu-0.95 at. pct Co showed that the precipitation-process mechanism can be described within the framework of this general phase transformation model by continuous nucleation and diffusion-controlled growth. By introducing additional microstructural information (here, the precipitate-particle density), for the first time, values for the separate activation energies of nucleation and growth could be deduced from the transformation kinetics.
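The abstract fits a modular transformation model to DSC curves at several heating rates; as a simpler, hedged illustration of the same variable-heating-rate principle (the classic Kissinger analysis, not the authors' modular model, with entirely synthetic numbers), an effective activation energy can be extracted like this:

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

def kissinger_activation_energy(heating_rates, peak_temps):
    """Kissinger method: the slope of ln(beta / Tp^2) versus 1/Tp equals -Ea/R,
    where beta is the heating rate and Tp the DSC peak temperature."""
    beta = np.asarray(heating_rates, dtype=float)  # K/s
    tp = np.asarray(peak_temps, dtype=float)       # K
    slope, _ = np.polyfit(1.0 / tp, np.log(beta / tp**2), 1)
    return -slope * R  # J/mol

# Synthetic check: peak temperatures generated from a known (hypothetical) Ea
# are recovered by the fit; these are not measured values from the paper.
Ea_true = 180e3  # J/mol, hypothetical
tp = np.array([720.0, 730.0, 740.0, 750.0])
beta = tp**2 * np.exp(20.0 - Ea_true / (R * tp))
Ea_est = kissinger_activation_energy(beta, tp)
```

The paper's contribution goes further: by adding the measured precipitate-particle density it separates the single effective barrier into distinct nucleation and growth activation energies.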
5.
Illustration-inspired techniques have provided alternative ways to visualize time-varying data. Techniques such as speedlines, flow ribbons, strobe silhouettes and opacity-based techniques provide temporal context to the current timestep being visualized. We evaluated the effectiveness of these illustrative techniques by conducting a user study. We compared the ability of subjects to visually track features using snapshots, snapshots augmented by illustration techniques, animations, and animations augmented by illustration techniques. User accuracy, time required to perform a task, and user confidence were used as measures to evaluate the techniques. The results indicate that the use of illustration-inspired techniques provides a significant improvement in user accuracy and the time required to complete the task. Subjects performed significantly better on each metric when using augmented animations as compared to augmented snapshots.
6.
The devastating power of hurricanes was evident during the 2005 hurricane season, the most active season on record. This has prompted increased efforts by researchers to understand the physical processes that underlie the genesis, intensification, and tracks of hurricanes. This research aims to facilitate an improved understanding of the structure of hurricanes with the aid of visualization techniques. Our approach was developed by a mixed team of visualization and domain experts. To better understand these systems, and to explore their representation in numerical weather prediction (NWP) models, we use a variety of illustration-inspired techniques to visualize their structure and time evolution. Illustration-inspired techniques aid in the identification of the amount of vertical wind shear in a hurricane, which can help meteorologists predict dissipation. Illustration-style visualization, in combination with standard visualization techniques, helped explore the vortex rollup phenomenon and the mesovortices contained within. We evaluated the effectiveness of our visualization with the help of six hurricane experts. The expert evaluation showed that the illustration-inspired techniques were preferred over existing tools. Visualization of the evolution of structural features is a prelude to a deeper visual analysis of the underlying dynamics.
7.
Interactive graphics practitioners have long understood that viewing a virtual object by controlling the viewpoint dynamically is more illuminating than viewing a still image or even a precomputed animation. Dynamic manipulation engages a viewer's kinesthetic sense in addition to their visual sense, adding an immediacy to the exploration experience. Finding the right way to represent data has been an active topic of much thought and discussion since the beginnings of visualization. To explore dynamic visualization's power, the author constructed a tool (Calico) for creating and manipulating bivariate color mappings using several different color models. Using Calico, she conducted two experimental studies of the effects of control over the color mapping on accuracy, confidence, and preference.
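A bivariate color mapping assigns a single color to a pair of data values. As a minimal sketch of the idea, not Calico's actual color models, one variable can drive hue while the other drives lightness:

```python
import colorsys

def bivariate_color(u, v):
    """Map two normalized variables in [0, 1] to one RGB color:
    u drives hue (blue -> red), v drives lightness.
    One of many possible bivariate mappings, for illustration only."""
    assert 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0
    hue = 0.66 * (1.0 - u)       # 0.66 ~ blue for low u, 0.0 = red for high u
    lightness = 0.25 + 0.5 * v   # keep away from unreadable black/white extremes
    return colorsys.hls_to_rgb(hue, lightness, 1.0)
```

Interactively dragging the hue range or lightness span of such a mapping, as Calico lets users do across several color models, is exactly the kind of control the two studies measured.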
8.
In many domains, the user is interested not only in including objects with particular desired values, but also in the distribution of values in the set. Our approach for visualizing a set of objects uses glyphs overlaid on a composite representation of the entire set to convey objects' depth and the set's diversity. We test and apply this technique to three application domains: analyzing student applicant pools of a particular school or department, building an effective fantasy football team, and analyzing traffic activity on a network.
9.
10.
Photographic volumes present a unique, interesting challenge for volume rendering. In photographic volumes, the voxel color is pre-determined, making color selection through transfer functions unnecessary. However, photographic data does not contain a clear mapping from the multi-valued color values to a scalar density or opacity, making projection and compositing much more difficult than with traditional volumes. Moreover, because of the nonlinear nature of color spaces, there is no meaningful norm for the multi-valued voxels. Thus, the individual color channels of photographic data must be treated as incomparable data tuples rather than as vector values. Traditional differential geometric tools, such as intensity gradients, density, and Laplacians, are distorted by the nonlinear, non-orthonormal color spaces that are the domain of the voxel values. We have developed different techniques for managing these issues while directly rendering volumes from photographic data. We present and justify the normalization of color values by mapping RGB values to the CIE L*u*v* color space. We explore and compare different opacity transfer functions that map three-channel color values to opacity. We apply these many-to-one mappings to the original RGB values as well as to the voxels after conversion to L*u*v* space. Direct rendering using transfer functions allows us to explore photographic volumes without having to commit to an a priori segmentation that might mask fine variations of interest. We empirically compare the combined effects of each of the two color spaces with our opacity transfer functions using source data from the Visible Human project.
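The RGB-to-L*u*v* normalization the abstract describes follows standard colorimetry. A hedged sketch using the usual sRGB/D65 formulas (not necessarily the authors' exact pipeline), together with one illustrative many-to-one opacity map based on chroma (the `scale` constant is a made-up knob):

```python
import math

# D65 white point chromaticities in the u'v' diagram.
UN, VN = 0.19784, 0.46832

def srgb_to_luv(r, g, b):
    """Convert an sRGB triple in [0, 1] to CIE L*u*v* (D65 white point)."""
    def lin(c):  # undo the sRGB gamma (transfer function)
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = lin(r), lin(g), lin(b)
    # Linear RGB -> CIE XYZ (sRGB primaries, D65 white, Y normalized to 1).
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    L = 116.0 * y ** (1 / 3) - 16.0 if y > (6 / 29) ** 3 else (29 / 3) ** 3 * y
    d = x + 15.0 * y + 3.0 * z
    if d == 0.0:  # pure black
        return 0.0, 0.0, 0.0
    up, vp = 4.0 * x / d, 9.0 * y / d
    return L, 13.0 * L * (up - UN), 13.0 * L * (vp - VN)

def chroma_opacity(r, g, b, scale=1.0 / 120.0):
    """One possible many-to-one opacity map: saturated colors become more opaque,
    so near-neutral background voxels fade away."""
    L, u, v = srgb_to_luv(r, g, b)
    return min(1.0, scale * math.hypot(u, v))
```

In a perceptually more uniform space like L*u*v*, distances between voxel colors are more meaningful than in raw RGB, which is the justification the paper gives for normalizing before applying such opacity transfer functions.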
Copyright © Beijing Qinyun Technology Development Co., Ltd.  京ICP备09084417号