Similar Documents
20 similar documents found.
1.
Compression of Completely Regular Remeshed 3D Models   (cited by 1: 0 self-citations, 1 by others)
Through shape-preserving adaptive resampling, a 3D mesh model can be converted into a regularly arranged 2D geometry image, so that mature image compression techniques can be applied to it. A shape-preserving adaptive sampling algorithm is proposed that adjusts the sampling grid according to the density of effective vertices on the mesh surface and samples through original mesh vertices as much as possible. Without increasing the sampling rate, the decompressed models produced by this compression method exhibit lower distortion. The method is validated on a large number of examples and compared against similar methods; experimental results show that it is practical and achieves better compression quality.

2.
In many geometry processing applications, it is required to improve an initial mesh in terms of multiple quality objectives. Despite the availability of several mesh generation algorithms with provable guarantees, such generated meshes may only satisfy a subset of the objectives. The conflicting nature of such objectives makes it challenging to establish similar guarantees for each combination, e.g., angle bounds and vertex count. In this paper, we describe a versatile strategy for mesh improvement by interpreting quality objectives as spatial constraints on resampling and develop a toolbox of local operators to improve the mesh while preserving desirable properties. Our strategy judiciously combines smoothing and transformation techniques allowing increased flexibility to practically achieve multiple objectives simultaneously. We apply our strategy to both planar and surface meshes demonstrating how to simplify Delaunay meshes while preserving element quality, eliminate all obtuse angles in a complex mesh, and maximize the shortest edge length in a Voronoi tessellation far better than the state‐of‐the‐art.

3.
To address the shrinkage of dual meshes, a dual mesh construction method based on global energy optimization is proposed. The method builds an energy optimization model from three aspects: reconstructing the original mesh, preserving the original mesh's shape, and correcting the quality of the dual mesh; the vertex positions of the dual mesh are then obtained by solving a sparse linear system. Thanks to the reconstruction energy constraint, the original mesh can be quickly reconstructed from the reconstruction constraint matrix and the dual mesh vertex positions. Experimental results show that the method avoids mesh shrinkage and applies to models of arbitrary topology; a mesh editing algorithm based on the method preserves the geometric shape features of the original mesh well.

4.
We propose a new method for constructing all-hexahedral finite element meshes. The core of our method is to build up a compatible combinatorial cell complex of hexahedra for a solid body which is topologically a ball, and for which a quadrilateral surface mesh of a certain structure is prescribed. The step-wise creation of the hex complex is guided by the cycle structure of the combinatorial dual of the surface mesh. Our method transforms the graph of the surface mesh iteratively by changing the dual cycle structure until we get the surface mesh of a single hexahedron. Starting with a single hexahedron and reversing the order of the graph transformations, each transformation step can be interpreted as adding one or more hexahedra to the so far created hex complex. Given an arbitrary solid body, we first decompose it into simpler subdomains equivalent to topological balls by adding virtual 2-manifolds. Secondly, we determine a compatible quadrilateral surface mesh for all subdomains created. Then, in the main part we can use the core routine to build up a hex complex for each subdomain independently. The embedding and smoothing of the combinatorial mesh(es) finishes the mesh generation process. First results obtained for complex geometries are encouraging.

5.
A Dual Blind Digital Watermarking Algorithm for 3D Mesh Models   (cited by 1: 0 self-citations, 1 by others)
To improve the security of single watermarks for 3D models, a dual blind digital watermarking algorithm based on 3D mesh models is proposed. Two watermarks are embedded by changing each triangle-mesh vertex's position within the local geometric space determined by its one-ring neighbors and by permuting the vertex order of triangle faces, enabling the model to withstand severe cropping attacks and a degree of noise attack. The original model is not needed for watermark extraction. Simulation results show that the algorithm effectively resists translation, rotation, uniform scaling, vertex reordering, polygon reordering, and cropping attacks, and that the embedded readable watermark remains imperceptible.

6.
A common way to render cell-centric adaptive mesh refinement (AMR) data is to compute the dual mesh and visualize that with a standard unstructured element renderer. While the dual mesh provides a high-quality interpolator, the memory requirements of the dual mesh data structure are significantly higher than those of the original grid, which prevents rendering very large data sets. We introduce a GPU-friendly data structure and a clustering algorithm that allow for efficient AMR dual mesh rendering with a competitive memory footprint. Fundamentally, any off-the-shelf unstructured element renderer running on GPUs could be extended to support our data structure just by adding a gridlet element type in addition to the standard tetrahedra, pyramids, wedges, and hexahedra supported by default. We integrated the data structure into a volumetric path tracer to compare it to various state-of-the-art unstructured element sampling methods. We show that our data structure easily competes with these methods in terms of rendering performance, but is much more memory-efficient.

7.
Genetic Resampling Particle Filter   (cited by 14: 1 self-citation, 13 by others)
叶龙, 王京玲, 张勤. 《自动化学报》 (Acta Automatica Sinica), 2007, 33(8): 885-887
Particle degeneracy is a major factor limiting the performance of particle filters. To address it, this paper applies a genetic mechanism to particle resampling, using evolutionary design to solve the degeneracy problem. Means of balancing the effectiveness and diversity of the particle set are analyzed, and a method is given for obtaining a genetic particle filter structure with the best performance.
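The idea of using genetic operators in place of plain resampling can be sketched as follows: roulette-wheel selection plays the role of importance resampling, and a small mutation keeps the particle set diverse. This is a minimal illustrative sketch, not the paper's exact operators; all names are ours.

```python
import numpy as np

def genetic_resample(particles, weights, rng, mutation_std=0.05):
    """Genetic-style resampling sketch: selection + mutation."""
    n = len(particles)
    # Selection: roulette-wheel draw proportional to importance weights
    idx = rng.choice(n, size=n, p=weights)
    # Mutation: small Gaussian jitter preserves particle diversity
    children = particles[idx] + rng.normal(0.0, mutation_std, size=n)
    # Weights are reset to uniform after resampling
    return children, np.full(n, 1.0 / n)

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, 500)
weights = np.exp(-0.5 * particles**2)   # toy likelihood weights
weights /= weights.sum()
new_particles, new_weights = genetic_resample(particles, weights, rng)
```

The mutation step is what distinguishes this from plain multinomial resampling: duplicated particles are perturbed apart instead of remaining identical copies.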

8.
Resampling Methods and Machine Learning   (cited by 5: 0 self-citations, 5 by others)
Boosting algorithms approximate complex natural models with linear combinations of weak learners, and their interpretability and predictive power have drawn great attention in the computer science community. However, Boosting has been viewed only as an optimization problem under a particular loss, and its statistical nature has not received sufficient attention. Tracing it to its roots, the authors propose viewing Boosting from a statistical standpoint: within a statistical framework, Boosting is merely an interesting special case of resampling methods. The authors hope to change the current situation in which computer scientists emphasize algorithmic performance while neglecting the nature of the data, so as to find methods better suited to "high-dimensional, massive, uncontrolled data" problems.
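The "Boosting as resampling" view can be illustrated by one AdaBoost round in which the weak learner is trained on a weighted bootstrap resample instead of receiving the weights directly. A hedged sketch with a toy decision stump; the `Stump` class and data are our illustrative assumptions.

```python
import numpy as np

class Stump:
    """A 1-D decision stump weak learner (illustrative)."""
    def fit(self, x, y):
        best = (np.inf, 0.0, 1)
        for t in np.unique(x):
            for s in (1, -1):
                err = np.mean(s * np.where(x > t, 1, -1) != y)
                if err < best[0]:
                    best = (err, t, s)
        self.t, self.s = best[1], best[2]
    def predict(self, x):
        return self.s * np.where(x > self.t, 1, -1)

def boost_round(x, y, weights, rng):
    # Resampling view of Boosting: draw the training set according to the
    # current weights rather than passing the weights to the weak learner.
    idx = rng.choice(len(y), size=len(y), p=weights)
    stump = Stump()
    stump.fit(x[idx], y[idx])
    pred = stump.predict(x)
    err = np.clip(np.sum(weights * (pred != y)), 1e-12, 1 - 1e-12)
    alpha = 0.5 * np.log((1 - err) / err)
    weights = weights * np.exp(-alpha * y * pred)   # y, pred in {-1, +1}
    return weights / weights.sum(), alpha

rng = np.random.default_rng(0)
x = np.array([-2.0, -1.0, 1.0, 2.0, 3.0, -3.0])
y = np.array([-1, -1, 1, 1, 1, -1])
w = np.full(6, 1 / 6)
w, alpha = boost_round(x, y, w, rng)
```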

9.
This paper investigates resampling techniques on a pseudohexagonal grid. Hexagonal grids are known to be advantageous in many respects for sampling and representing digital images in various computer vision and graphics applications. Currently, a real hexagonal grid device is still difficult to find. A good alternative for obtaining the advantages of a hexagonal grid is to construct a pseudohexagonal grid on a regular rectangular grid device. In this paper we first describe the options and procedures for constructing such a pseudohexagonal grid and then demonstrate techniques of resampling digital images on the pseudohexagonal grid. Four distinct resampling kernels are tested, and their results are illustrated and compared.
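One common construction of a pseudohexagonal grid on square-raster hardware shifts alternate rows by half a pixel and compresses the row spacing to sqrt(3)/2, so every sample has six equidistant neighbours. A sketch of this one option (the paper surveys several); the function name is ours.

```python
import numpy as np

def pseudo_hex_centers(rows, cols):
    """Sample centers of a pseudohexagonal grid laid over a square raster."""
    ys, xs = np.mgrid[0:rows, 0:cols].astype(float)
    xs += 0.5 * (ys % 2)        # half-pixel horizontal shift on odd rows
    ys *= np.sqrt(3) / 2        # hexagonal row spacing
    return np.stack([xs, ys], axis=-1)

centers = pseudo_hex_centers(4, 4)
```

With this layout the distance from a sample to its diagonal neighbour in the next row equals the in-row spacing, which is the defining property of hexagonal packing.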

10.
To address class imbalance in multi-label learning, the mutual interference between label sample sets during conventional resampling, and the heavy duplication of minority-class information and loss of majority-class information, a multi-label random balanced sampling algorithm is proposed. The algorithm introduces random balanced sampling under the multi-label setting, making full use of both majority- and minority-class information to balance data redundancy and loss; it optimizes the sample copying and deletion strategies to keep the resampling of different labels independent; and it introduces the mean sample count to preserve the original data distribution. Experiments compare the performance of three multi-label resampling algorithms on three data sets. The results show that 0.2 and 0.25 are the best resampling rates for the proposed algorithm, that it is especially suited to highly imbalanced data sets, and that it achieves the best performance among the compared methods.
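The core balancing idea can be sketched label by label: each label's sample set is independently shrunk (random deletion) or grown (random copying) toward the mean label size, so resampling one label never disturbs another. An illustrative sketch under our own naming, not the paper's exact algorithm.

```python
import numpy as np

def label_balanced_resample(label_to_indices, rng):
    """Per-label random balanced sampling toward the mean sample count."""
    sizes = [len(v) for v in label_to_indices.values()]
    mean_n = int(round(np.mean(sizes)))          # the "mean sample count"
    out = {}
    for label, idx in label_to_indices.items():
        idx = np.asarray(idx)
        if len(idx) >= mean_n:                   # majority label: delete extras
            out[label] = rng.choice(idx, size=mean_n, replace=False)
        else:                                    # minority label: copy samples
            extra = rng.choice(idx, size=mean_n - len(idx), replace=True)
            out[label] = np.concatenate([idx, extra])
    return out

rng = np.random.default_rng(0)
balanced = label_balanced_resample({"a": range(100), "b": range(100, 110)}, rng)
```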

11.
The use of digital image resampling can be highly suitable for diagnostic radiology. It can be applied not only to obtain zoom effects but also to solve various problems related to the comparison of images of different origin (NMR, CT, NM). The mathematical approach to this kind of algorithm, based on rational assumptions, can be very wide and diversified. In this work the principles of the stochastic and of the analytical methods have been analyzed. In particular, a complete mathematical treatment of the integral analytical methods, which are both the most theoretically interesting and analytically complex, has been performed. The practice of these techniques has to be determined each time from the analysis of the specific problem to which they are to be applied in the routine, and after a retrospective analysis of the results obtained.

12.
When measuring units are expensive or time consuming, while ranking them can be done easily, it is known that ranked set sampling (RSS) is preferred to simple random sampling (SRS). Available results for RSS are developed under specific parametric assumptions or are asymptotic in nature, with few results available for finite size samples when the underlying distribution of the observed data is unknown. We investigate the use of resampling techniques to draw inferences on population characteristics. To obtain standard error and confidence interval estimates we discuss and compare three methods of resampling a given ranked set sample. Chen et al. (2004. Ranked Set Sampling: Theory and Applications. Springer, New York) suggest a natural method to obtain bootstrap samples from each row of a RSS. We prove that this method is consistent for a location estimator. We propose two other methods that are designed to obtain more stratified resamples from the given sample. Algorithms are provided for these methods. We recommend a method that obtains a bootstrap RSS from the observations. We prove several properties of this method, including consistency for a location parameter. We define two types of L-estimators for RSS and obtain expressions for their exact moments. We discuss an application to obtain confidence intervals for the Winsorized mean of a RSS.
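The row-wise bootstrap the abstract attributes to Chen et al. can be sketched directly: each row of the RSS holds the observations of one rank, and is resampled with replacement within itself so the rank strata are preserved. Toy data and function names are ours.

```python
import numpy as np

def bootstrap_rss_rows(rss, rng):
    """Row-wise bootstrap of a ranked set sample: resample each rank row
    with replacement, preserving the rank strata."""
    return np.stack([rng.choice(row, size=len(row), replace=True) for row in rss])

def bootstrap_se_of_mean(rss, rng, B=300):
    # Bootstrap standard-error estimate for the RSS mean estimator
    means = [bootstrap_rss_rows(rss, rng).mean() for _ in range(B)]
    return float(np.std(means, ddof=1))

rng = np.random.default_rng(0)
rss = rng.normal(size=(3, 25))    # toy "3 ranks x 25 cycles" sample
se = bootstrap_se_of_mean(rss, rng)
```

A percentile confidence interval follows the same pattern: collect the bootstrap means and take their empirical quantiles.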

13.
In cluster analysis, selecting the number of clusters is an "ill-posed" problem of crucial importance. In this paper we propose a re-sampling method for assessing cluster stability. Our model suggests that samples' occurrences in clusters can be considered as realizations of the same random variable in the case of the "true" number of clusters. Thus, similarity between different cluster solutions is measured by means of compound and simple probability metrics. Compound criteria result in validation rules employing the stability content of clusters. Simple probability metrics, in particular those based on kernels, provide more flexible geometrical criteria. We analyze several applications of probability metrics combined with methods intended to simulate cluster occurrences. Numerical experiments are provided to demonstrate and compare the different metrics and simulation approaches.
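The resampling-for-stability idea can be sketched with a simple pairwise agreement score standing in for the paper's probability metrics: cluster random subsamples repeatedly and measure how consistently point pairs are co-clustered. Everything here, including the toy thresholding clusterer, is an illustrative assumption.

```python
import numpy as np

def pair_agreement(la, lb):
    """Fraction of point pairs two clusterings treat identically
    (co-clustered in both, or separated in both)."""
    a = la[:, None] == la[None, :]
    b = lb[:, None] == lb[None, :]
    iu = np.triu_indices(len(la), k=1)
    return float(np.mean(a[iu] == b[iu]))

def stability_score(X, cluster_fn, rng, B=20, frac=0.8):
    # Cluster random subsamples twice and average the agreement; scores
    # near 1 suggest the clustering (and its number of clusters) is stable.
    n = len(X)
    scores = []
    for _ in range(B):
        idx = rng.choice(n, size=int(frac * n), replace=False)
        scores.append(pair_agreement(cluster_fn(X[idx]), cluster_fn(X[idx])))
    return float(np.mean(scores))

rng = np.random.default_rng(0)
X = np.concatenate([rng.normal(-5, 1, 50), rng.normal(5, 1, 50)])[:, None]
threshold_clusterer = lambda Z: (Z[:, 0] > 0).astype(int)  # toy clusterer
score = stability_score(X, threshold_clusterer, rng)
```

Note that pair agreement is invariant to label permutation, which is why it is preferred over comparing raw labels.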

14.
In this paper, a gene-handling method for evolutionary algorithms (EAs) is proposed. Such algorithms are characterized by a nonanalytic optimization process when dealing with complex systems as multiple behavioral responses occur in the realization of intelligent tasks. In generic EAs which optimize internal parameters of a given system, evaluation and selection are performed at the chromosome level. When a survived chromosome includes noneffective genes, the solution can be trapped in a local optimum during evolution, which causes an increase in the uncertainty of the results and reduces the quality of the overall system. This phenomenon also results in an unbalanced performance of partial behaviors. To alleviate this problem, a score-based resampling method is proposed, where a score function of a gene is introduced as a criterion of handling genes in each allele.  The proposed method was empirically evaluated with various test functions, and the results show its effectiveness.
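The allele-level handling described above can be sketched as follows: instead of selecting whole chromosomes, genes whose score falls below a threshold are replaced by the same allele drawn from a high-scoring chromosome. All names, the score rule, and the threshold are our illustrative assumptions, not the paper's exact operators.

```python
import numpy as np

def score_based_gene_resample(pop, gene_scores, rng, threshold=0.0):
    """Resample low-score genes allele by allele from high-score donors."""
    pop = pop.copy()
    n_chrom, n_genes = pop.shape
    for j in range(n_genes):                          # handle each allele position
        bad = np.flatnonzero(gene_scores[:, j] < threshold)
        good = np.flatnonzero(gene_scores[:, j] >= threshold)
        if len(bad) and len(good):
            donors = rng.choice(good, size=len(bad))
            pop[bad, j] = pop[donors, j]              # replace noneffective genes
    return pop

rng = np.random.default_rng(0)
population = np.arange(12, dtype=float).reshape(4, 3)   # 4 chromosomes x 3 genes
scores = np.ones((4, 3)); scores[0, 0] = -1.0           # one noneffective gene
evolved = score_based_gene_resample(population, scores, rng)
```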

15.
Combining semi-supervised learning with ensemble learning, a SemiBoost-CR classification model based on confidence resampling is proposed. Confidence formulas based on labeled and unlabeled neighbors are given; resampling by confidence selects not only a proportion of high-confidence unlabeled samples but also a proportion of low-confidence ones, adding them to the labeled training set under different strategies. High-confidence unlabeled samples are introduced to improve the base classif…

16.
A Particle Filter Based on Adaptive Partial Resampling   (cited by 8: 4 self-citations, 4 by others)
To address the particle impoverishment caused by traditional resampling algorithms, a particle filter based on adaptive partial resampling (APRPF) is proposed. APRPF resamples only part of the particles in a stepwise fashion, recursively computing a measurement of particle degeneracy (MPD) until a given condition is satisfied. The post-resampling particle set consists of newly generated particles and particles that did not participate in resampling; the former help mitigate degeneracy, while the latter keep the particle set diverse. Experimental results show that, compared with sampling importance resampling (SIR), the auxiliary particle filter (APF), and the regularized particle filter (RPF), APRPF achieves higher estimation accuracy and, owing to fewer average resampling operations, lower computational cost.
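The partial-resampling loop can be sketched with effective sample size (ESS) standing in for the paper's recursively computed MPD: resample blocks of the heaviest particles, one step at a time, until the degeneracy measure clears its target, leaving the remaining particles untouched. An illustrative sketch under our own naming.

```python
import numpy as np

def adaptive_partial_resample(particles, weights, rng, ess_target=0.5):
    """Stepwise partial resampling driven by a degeneracy measure (ESS)."""
    n = len(particles)
    p, w = particles.copy(), weights.copy()
    order = np.argsort(weights)[::-1]            # heaviest particles first
    step = max(1, n // 10)
    touched = 0
    while 1.0 / np.sum(w ** 2) < ess_target * n and touched < n:
        sel = order[touched:touched + step]      # next block to resample
        p[sel] = rng.choice(particles, size=len(sel), p=weights)
        w[sel] = 1.0 / n                         # resampled particles reset
        w /= w.sum()
        touched += step
    return p, w

rng = np.random.default_rng(0)
particles = rng.normal(size=100)
weights = np.full(100, 1e-6); weights[0] = 1.0; weights /= weights.sum()
ess_before = 1.0 / np.sum(weights ** 2)
new_p, new_w = adaptive_partial_resample(particles, weights, rng)
ess_after = 1.0 / np.sum(new_w ** 2)
```

Particles outside the resampled blocks keep their original values, which is what preserves diversity relative to full resampling.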

17.
The correlated statistics that lossy compression and resampling operations introduce between image pixels make resampling in lossily compressed images hard to detect. To solve this problem, a resampling detection algorithm for lossless images is proposed that analyzes the image's frequency-domain features using the periodicity of the interpolation signal, and detects resampling by estimating the interpolation coefficients. Experimental results show that the algorithm is robust and widely applicable, and achieves high accuracy when detecting resampling in JPEG lossily compressed images.
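The frequency-domain cue behind such detectors can be shown in one dimension: interpolated samples are (near-)perfectly predicted by their neighbours, so the magnitude of a local linear-prediction residual oscillates with the period of the resampling grid, producing a spectral peak. An illustrative sketch, not the paper's detector.

```python
import numpy as np

def residual_spectrum(signal):
    """Spectrum of the local linear-predictor residual of a 1-D signal."""
    residual = np.abs(np.diff(signal, n=2))   # second difference ~ prediction error
    return np.abs(np.fft.rfft(residual - residual.mean()))

rng = np.random.default_rng(1)
x = rng.normal(size=400)
# 2x linear upsampling: every other sample is the average of its neighbours,
# so its second difference vanishes there, giving a period-2 residual pattern
up = np.interp(np.arange(0, 399, 0.5), np.arange(400), x)
spec = residual_spectrum(up)
```

For this 2x upsampled signal the peak sits at the Nyquist bin, i.e. at the period-2 artifact frequency; other resampling factors move the peak accordingly, which is what lets the interpolation rate be estimated.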

18.
Hiding Traces of Resampling in Digital Images   (cited by 1: 0 self-citations, 1 by others)
Resampling detection has become a standard tool for forensic analyses of digital images. This paper presents new variants of image transformation operations which are undetectable by resampling detectors based on periodic variations in the residual signal of local linear predictors in the spatial domain. The effectiveness of the proposed method is supported with evidence from experiments on a large image database for various parameter settings. We benchmark detectability as well as the resulting image quality against conventional linear and bicubic interpolation and interpolation with a sinc kernel. These early findings on “counter-forensic” techniques put into question the reliability of known forensic tools against smart counterfeiters in general, and might serve as benchmarks and motivation for the development of much improved forensic techniques.

19.
To improve the detection efficiency of network intrusion detection systems and reduce data imbalance, and building on an analysis of existing resampling methods and the characteristics of network intrusion detection data sets, a fast hierarchical nearest-neighbor (FHNN) resampling method is proposed and validated experimentally on the KDD'99 data set. The results show that the method removes noisy and redundant data well, reduces both the imbalance and the total sample count, runs fast, and is suitable for detecting various kinds of attacks in massive data.

20.
Medical images are used ever more widely in clinical diagnosis and treatment, so how to exploit the large number of images held in an image management system, and how to assist doctors in analysis and diagnosis, are very important issues. This paper studies medical image retrieval based on a multi-layer resampling template, following the idea of wavelet decomposition. The retrieval method consists of two stages, coarse and fine: the coarse stage retrieves medical images by their contour features, while the fine stage retrieves them against the multi-layer resampling template. A multi-layer sampling operator extracts a resampled image at each layer, and these resampled images are then matched step by step to complete the coarse-to-fine retrieval process.


Copyright © 北京勤云科技发展有限公司 (Beijing Qinyun Technology Development Co., Ltd.)  京ICP备09084417号