Similar Literature
20 similar documents found (search took 0 ms).
1.
A Note on Point Location in Delaunay Triangulations of Random Points (cited 1 time: 0 self, 1 other)
This short note considers the problem of point location in a Delaunay triangulation of n random points, using no additional preprocessing or storage other than a standard data structure representing the triangulation. A simple and easy-to-implement (but, of course, worst-case suboptimal) heuristic is shown to take expected time O(n^{1/3}). Received November 27, 1997; revised February 15, 1998.
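Such heuristics walk across the triangulation toward the query point. Below is a minimal sketch of a straight (visibility) walk, assuming a scipy.spatial.Delaunay triangulation as the standard structure; it illustrates the general technique rather than the paper's exact procedure, and the names walk_locate and orient are ours.

```python
import numpy as np
from scipy.spatial import Delaunay

def orient(a, b, c):
    """Twice the signed area of triangle (a, b, c)."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def walk_locate(tri, q, start=0):
    """Walk from simplex `start`, crossing any edge that separates the
    current triangle from q, until the containing triangle is reached."""
    t = start
    while True:
        p = tri.points[tri.simplices[t]]
        for i in range(3):
            # Edge opposite vertex i, with c the opposite vertex itself.
            a, b, c = p[(i + 1) % 3], p[(i + 2) % 3], p[i]
            if orient(a, b, q) * orient(a, b, c) < 0:  # q across edge (a,b)
                t = tri.neighbors[t][i]
                if t == -1:            # stepped off the convex hull
                    return -1
                break
        else:
            return t                   # no separating edge: t contains q

rng = np.random.default_rng(0)
pts = rng.random((1000, 2))
tri = Delaunay(pts)
q = np.array([0.5, 0.5])
print(walk_locate(tri, q), tri.find_simplex(q))  # the two should agree
```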

2.
We propose a mathematical formalism to define a system hierarchy and choose a way to reconfigure it based on multi-valued logic. We consider an application of this formalism to controlling a group surveying the danger zone after a natural disaster.

3.
Computer, 2003, 36(9): 9-11.
Cellular phone manufacturers are on the horns of a dilemma. On one hand, demand for their products continues to grow. On the other hand, cellular-phone technology is changing rapidly. Vendors are dealing with this problem by using adaptive (also called reconfigurable) chips in a new way. With this approach, software can redraw a chip's physical circuitry on the fly, letting a single processor perform multiple functions. In addition, adaptive computing could increase performance while reducing energy consumption.

4.
Binhai Zhu. GeoInformatica, 2000, 4(3): 317-334.
This paper studies the idea of answering range-searching queries using simple data structures. The only data structure needed is the Delaunay triangulation of the input points. The idea is to first locate a vertex of the (arbitrary) query polygon Q, then walk along the boundary of the polygon in the Delaunay triangulation and report all the points enclosed by it. For a set of uniformly distributed random points in 2-D and a query polygon Q, the expected query time of this algorithm is O(n^{1/3} + |Q| + E[K] + L_r n^{1/2}), where |Q| is the size of the query polygon, E[K] = O(n · area(Q)) is the expected number of output points, and L_r is a parameter related to the shape of the query polygon and to n; L_r is always bounded by the sum of the edge lengths of Q. Theoretically, when L_r = O(1/n^{1/6}) the expected query time is O(n^{1/3} + |Q| + E[K]), which improves the best known average query time for general range searching. Beyond its theoretical interest, a good property of this algorithm is that once the Delaunay triangulation is given, no additional preprocessing is needed. To obtain empirical results, we design a new algorithm for generating random simple polygons within a given domain. Our empirical results show that the constant coefficient of the algorithm is small, at least for the special (practical) cases where the query polygon is either a triangle (simplex range searching) or an axis-parallel box (orthogonal range searching), and for the general case where the query polygons are generated by our new polygon-generating algorithm and their sizes are relatively small.
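A rough, self-contained sketch of the locate-then-report query is below, assuming a scipy.spatial.Delaunay triangulation. The triangle/polygon intersection test is deliberately approximate (vertex-in-polygon tests only), so very thin query polygons may be mishandled; range_query and all parameters are our illustrative names, not the paper's.

```python
import numpy as np
from collections import deque
from scipy.spatial import Delaunay
from matplotlib.path import Path

def range_query(tri, polygon):
    path = Path(polygon)
    # Step 1: locate the triangles containing the polygon's vertices.
    poly_tris = {int(i) for i in tri.find_simplex(polygon) if i != -1}
    if not poly_tris:
        return np.empty(0, dtype=int)
    # Step 2: flood outward over the dual graph, keeping triangles that
    # (approximately) meet the polygon.
    seen, queue, hit = set(poly_tris), deque(poly_tris), []
    while queue:
        t = queue.popleft()
        verts = tri.points[tri.simplices[t]]
        if t not in poly_tris and not path.contains_points(verts).any():
            continue                    # triangle does not touch the polygon
        hit.append(t)
        for nb in tri.neighbors[t]:
            if nb != -1 and nb not in seen:
                seen.add(int(nb))
                queue.append(int(nb))
    # Step 3: report the enclosed input points.
    cand = np.unique(tri.simplices[hit])
    return cand[path.contains_points(tri.points[cand])]

rng = np.random.default_rng(1)
pts = rng.random((5000, 2))
tri = Delaunay(pts)
poly = np.array([[0.2, 0.2], [0.8, 0.25], [0.7, 0.8], [0.3, 0.7]])
print(len(range_query(tri, poly)), "points reported")
```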

5.
The weighted minimum letter flips (WMLF) model of haplotype assembly takes the weighted aligned DNA fragment data of an individual and flips the set of SNP sites of minimum total weight so as to infer the individual's pair of haplotypes. The problem is NP-hard, and no practical optimization-search algorithm had been available. Exploiting the characteristics of DNA sequencing fragment data, a genetic algorithm is proposed. On real biological data the algorithm obtains satisfactory solutions of the WMLF problem in a short time even when the data are large, showing good scalability and high practical value.
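A generic GA skeleton for this kind of flip-minimization problem is sketched below on synthetic data; it is not the paper's algorithm. An individual is a candidate haplotype h (its partner is the complement), and the fitness is the total weight of flips needed to reconcile every fragment with h or its complement.

```python
import numpy as np

rng = np.random.default_rng(3)
n_snps, n_frags = 40, 120

# Synthetic weighted fragment matrix: each fragment reads one of two
# complementary haplotypes over ~60% of the SNP sites, with 5% errors.
truth = rng.integers(0, 2, n_snps)
which = rng.integers(0, 2, n_frags)
frags = np.where(which[:, None] == 0, truth, 1 - truth)
frags = np.where(rng.random((n_frags, n_snps)) < 0.05, 1 - frags, frags)
cover = rng.random((n_frags, n_snps)) < 0.6
weights = rng.random((n_frags, n_snps))      # per-call confidence weights

def cost(h):
    """Total weight of flips needed to fit each fragment to h or 1-h."""
    mis_h = ((frags != h) & cover) * weights
    mis_c = ((frags == h) & cover) * weights  # mismatch to the complement
    return np.minimum(mis_h.sum(1), mis_c.sum(1)).sum()

def ga(pop_size=60, gens=150, pmut=0.02):
    pop = rng.integers(0, 2, (pop_size, n_snps))
    for _ in range(gens):
        fit = np.array([cost(h) for h in pop])
        pop = pop[np.argsort(fit)]            # elitist: best half breeds
        half = pop_size // 2
        kids = []
        for _ in range(pop_size - half):
            a, b = pop[rng.integers(0, half, 2)]
            cut = rng.integers(1, n_snps)     # one-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child = child ^ (rng.random(n_snps) < pmut)  # bit-flip mutation
            kids.append(child)
        pop[half:] = kids
    fit = np.array([cost(h) for h in pop])
    return pop[fit.argmin()], fit.min()

best, c = ga()
print("best flip weight:", round(float(c), 2))
```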

6.
An Edge-Point-Feature Image Registration Algorithm (cited 1 time: 0 self, 1 other)
To achieve accurate image registration, a feature-point extraction method built on the Laplacian-of-Gaussian (LoG) operator is proposed, and the scale-invariant feature transform (SIFT) is applied to describe the features. First, edge points are computed with the LoG operator and sorted by gradient magnitude, and the points with larger gradients are selected as feature points. Then SIFT descriptors are computed for the feature points, and matching pairs between the two images are found with a minimum-distance criterion. Finally, incorrect pairs are rejected using the ratio of the nearest to the second-nearest match. Experiments show that the algorithm accurately registers image pairs differing in illumination and viewing angle, with an accuracy above 90%.
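The pipeline maps naturally onto OpenCV. The sketch below approximates the edge points by strong LoG response, computes SIFT descriptors at them, and applies the nearest/second-nearest ratio test; the thresholds, the ratio 0.75, and the synthetic test images are our illustrative choices.

```python
import cv2
import numpy as np

def log_keypoints(gray, n=500, sigma=2.0):
    """Approximate edge points: strong LoG response, ranked by gradient."""
    blur = cv2.GaussianBlur(gray, (0, 0), sigma)
    log = np.abs(cv2.Laplacian(blur, cv2.CV_64F))
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1)
    grad = np.hypot(gx, gy)
    grad[log < 0.2 * log.max()] = 0           # keep LoG edge response only
    ys, xs = np.unravel_index(np.argsort(grad, axis=None)[-n:], grad.shape)
    return [cv2.KeyPoint(float(x), float(y), 8) for x, y in zip(xs, ys)]

def match(img1, img2, ratio=0.75):
    sift = cv2.SIFT_create()
    k1, d1 = sift.compute(img1, log_keypoints(img1))
    k2, d2 = sift.compute(img2, log_keypoints(img2))
    pairs = cv2.BFMatcher().knnMatch(d1, d2, k=2)
    # Ratio test: the nearest match must beat the second-nearest.
    good = [m for m, s in pairs if m.distance < ratio * s.distance]
    return k1, k2, good

rng = np.random.default_rng(2)
img1 = (rng.random((256, 256)) * 255).astype(np.uint8)
img1 = cv2.GaussianBlur(img1, (0, 0), 2)      # give the noise structure
img2 = np.roll(img1, (5, 9), axis=(0, 1))     # shifted copy to re-find
print(len(match(img1, img2)[2]), "matches after ratio test")
```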

7.
We present a bijection between the set of plane triangulations (aka maximal planar graphs) and a simple subset of the set of plane trees with two leaves adjacent to each node. The construction takes advantage of Schnyder tree decompositions of plane triangulations. This bijection yields an interpretation of the formula for the number of plane triangulations with n vertices. Moreover, the construction is simple enough to induce a linear random sampling algorithm and an explicit information-theory-optimal encoding. Finally, we extend our bijection approach to triangulations of a polygon with k sides and m inner vertices, and develop in passing new results about Schnyder tree decompositions for these objects.

8.
The Tower of Hanoi with Forbidden Moves (cited 2 times: 0 self, 2 others)
Sapir, Amir. The Computer Journal, 2004, 47(1): 20-24.

9.
To address the low efficiency of extracting edge contours of large parts, an edge-contour extraction technique based on scanned point-cloud data is proposed. Starting from how edge contours are formed, it analyzes the distribution of the cross-section lines perpendicular to the edge contour and defines pattern vectors that characterize ridge-type and polyline (line-line intersection) cross-sections. Extraction consists of three steps: acquiring the cross-section data, identifying the cross-section type, and extracting the edge-contour data. To obtain the edge contour…

10.
In this paper we present a topologically correct and efficient version of the algorithm by Guibas and Stolfi (Algorithmica 7 (1992), pp. 381-413) for the exact computation of Delaunay and power triangulations in two dimensions. The algorithm avoids numerical errors and degeneracies caused by the accumulation of rounding errors in fixed-length floating point arithmetic when constructing these triangulations. Most methods for computing Delaunay and power triangulations involve the calculation of two basic primitives: the INCIRCLE test and the CCW orientation test. Both primitives require the computation of the sign of a determinant. The key to our method is the exact computation of this sign, based on an algorithm for exactly determining the sign of the sum of a finite set of normalized floating point numbers of fixed mantissa length (machine numbers). The exact computation of the primitives allows the construction of the correct Delaunay and power triangulations. The method has been implemented and tested for the incremental construction of Delaunay and power triangulations. Tests have been conducted for distributions of points on which non-exact algorithms may encounter difficulties, for example, slightly perturbed points on a grid or on a circle. Experimental results show that the performance of our implementation is comparable with that of a simple implementation of the incremental algorithm in single-precision floating point arithmetic. For random distributions of points the exact algorithm is only 4 times slower than the inexact implementation. The algorithm is easy to implement, robust, and portable as long as the input data to the algorithm remain exact.
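The flavor of the exact primitives can be sketched in a few lines: both tests reduce to the sign of a determinant, and converting machine floats to exact rationals (here with Python's Fraction, rather than the paper's exact floating-point summation) makes that sign exact.

```python
from fractions import Fraction

def sign(x):
    return (x > 0) - (x < 0)

def ccw(ax, ay, bx, by, cx, cy):
    """Exact sign of the orientation determinant:
    +1 left turn, -1 right turn, 0 collinear."""
    ax, ay, bx, by, cx, cy = map(Fraction, (ax, ay, bx, by, cx, cy))
    return sign((bx - ax) * (cy - ay) - (by - ay) * (cx - ax))

def incircle(ax, ay, bx, by, cx, cy, dx, dy):
    """Exact sign of the INCIRCLE determinant: +1 if d lies inside the
    circle through a, b, c (taken counter-clockwise)."""
    rows = []
    for px, py in ((ax, ay), (bx, by), (cx, cy)):
        qx, qy = Fraction(px) - Fraction(dx), Fraction(py) - Fraction(dy)
        rows.append((qx, qy, qx * qx + qy * qy))
    (a1, a2, a3), (b1, b2, b3), (c1, c2, c3) = rows
    return sign(a1 * (b2 * c3 - b3 * c2)
                - a2 * (b1 * c3 - b3 * c1)
                + a3 * (b1 * c2 - b2 * c1))

# d = (1, 1) lies exactly on the circle through (0,0), (1,0), (0,1):
# floating-point rounding could report either side; the exact test
# returns 0.
print(incircle(0.0, 0.0, 1.0, 0.0, 0.0, 1.0, 1.0, 1.0))
```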

11.
This paper analyzes how scientists working in a multidisciplinary team produce scientific evidence through building and manipulating scientific visualizations. The research is based on ethnographic observations of scientists’ weekly work meetings and the observation of videotapes of these meetings. The scientists observed work with advanced imaging technologies to produce a 4D computer model of heat transfer in human prostate tissues. The idea of ‘digital objects’ is proposed in order to conceptually locate their ‘materiality’, observed in the practices of producing evidence through the handling of three-dimensional renderings of data. The manipulation of digital objects seeks to establish meaningful differences between parameters of interest, both when building and when analyzing them. These digital objects are dealt with as part of the empirical evidence used in the course of practices of visualizing and modeling natural phenomena. This process, which can be contextualized historically in terms of the development of imaging technologies, becomes crucial in understanding what counts as empirical evidence in current scientific work.

12.
Optimization with graph cuts has become very popular in recent years. While exact optimization is possible in a few cases, many useful energy functions are NP-hard to optimize. One approach to approximate optimization is the so-called move-making algorithms. At each iteration, a move-making algorithm makes a proposal (move) for a pixel p to either keep its old label or switch to a new label. Two move-making algorithms based on graph cuts are in wide use, namely the swap and the expansion. Both of these moves are binary in nature, that is, they give each pixel a choice of only two labels. An evaluation of optimization techniques shows that the expansion and swap algorithms perform very well for energies where the underlying MRF has the Potts prior. For more general priors, however, they do not perform as well. The main contribution of this paper is to develop multi-label moves. A multi-label move, unlike expansion and swap, gives each pixel a choice of more than two labels to switch to. In particular, we develop several multi-label moves for truncated convex priors. We evaluate our moves on image restoration, inpainting, and stereo correspondence, and obtain better results than the expansion and swap algorithms, both in terms of energy value and accuracy.
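For intuition about the energy class, here is a minimal illustration (ours, not the paper's move algorithm): a 1D chain MRF with a truncated linear prior can be minimized exactly by dynamic programming; on 2D grids such exact minimization is intractable, which is what expansion, swap, and multi-label moves approximate.

```python
import numpy as np

def chain_mrf_min(unary, lam=1.0, T=3):
    """Exact minimizer of sum_p unary[p][l_p] + lam * min(|l_p - l_{p+1}|, T)
    on a chain, via Viterbi-style dynamic programming.
    unary: (n_pixels, n_labels) data costs. Returns optimal labels."""
    n, k = unary.shape
    labels = np.arange(k)
    # Truncated linear pairwise table, indexed [prev_label, cur_label].
    pair = lam * np.minimum(np.abs(labels[:, None] - labels[None, :]), T)
    cost = unary[0].copy()
    back = np.zeros((n, k), dtype=int)
    for p in range(1, n):
        total = cost[:, None] + pair
        back[p] = total.argmin(axis=0)
        cost = total.min(axis=0) + unary[p]
    out = np.zeros(n, dtype=int)
    out[-1] = cost.argmin()
    for p in range(n - 1, 0, -1):          # backtrack the optimal path
        out[p - 1] = back[p, out[p]]
    return out

# Noisy staircase signal restored with 16 labels.
rng = np.random.default_rng(4)
truth = np.repeat([2, 9, 5], 20)
obs = truth + rng.normal(0, 1.5, truth.size)
unary = (obs[:, None] - np.arange(16)[None, :]) ** 2
print(chain_mrf_min(unary, lam=2.0, T=4)[:10])
```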

13.
The irregularity and lack of ordering of 3D point clouds make their classification challenging. To address this, a 3D point-cloud classification algorithm based on residual edge convolution is designed that learns discriminative shape descriptors directly from point clouds for object classification. First, an edge-convolution module with residual learning is designed for point-cloud feature extraction: using the k-nearest-neighbors algorithm, the module builds a local graph over the input points and extracts and aggregates local features with convolution and max pooling. Then a multilayer perceptron extracts global features from the raw point features and combines them with the local features in a residual fashion. Finally, with this convolution block as the basic unit, a deep convolutional network is built for 3D point-cloud classification. The method systematically combines local and global point-cloud features, the network has a deeper structure, and the resulting shape descriptors are more abstract and more discriminative. Experiments on the challenging ModelNet40 and ScanObjectNN datasets confirm its strong classification performance.
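A minimal sketch of an edge-convolution block with a residual connection, in the spirit of the description, is below (PyTorch; the layer widths, k, and the projection skip are our illustrative choices, not the paper's exact architecture).

```python
import torch
import torch.nn as nn

def knn_graph(x, k):
    """x: (B, N, C) points; returns indices (B, N, k) of the k nearest."""
    d = torch.cdist(x, x)                              # pairwise distances
    return d.topk(k + 1, largest=False).indices[..., 1:]  # drop self

class ResidualEdgeConv(nn.Module):
    def __init__(self, c_in, c_out, k=20):
        super().__init__()
        self.k = k
        self.mlp = nn.Sequential(
            nn.Conv2d(2 * c_in, c_out, 1), nn.BatchNorm2d(c_out),
            nn.ReLU(inplace=True))
        # 1x1 projection so the residual matches the output width.
        self.skip = (nn.Conv1d(c_in, c_out, 1)
                     if c_in != c_out else nn.Identity())

    def forward(self, x):                              # x: (B, N, C)
        B, N, C = x.shape
        idx = knn_graph(x, self.k)                     # (B, N, k)
        nbr = torch.gather(
            x.unsqueeze(1).expand(B, N, N, C), 2,
            idx.unsqueeze(-1).expand(B, N, self.k, C))  # (B, N, k, C)
        center = x.unsqueeze(2).expand_as(nbr)
        feat = torch.cat([center, nbr - center], -1)   # edge features
        feat = feat.permute(0, 3, 1, 2)                # (B, 2C, N, k)
        out = self.mlp(feat).max(dim=-1).values        # max-pool over nbrs
        return out + self.skip(x.transpose(1, 2))      # residual, (B, C', N)

pts = torch.randn(2, 1024, 3)
print(ResidualEdgeConv(3, 64)(pts).shape)              # (2, 64, 1024)
```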

14.
A High-Performance Edge-Point Feature Matching Method for SAR Images (cited 3 times: 0 self, 3 others)
陈天泽, 李燕. 《自动化学报》 (Acta Automatica Sinica), 2013, 39(12): 2051-2063.
To address the instability of feature extraction and the complexity of similarity-based optimization search in synthetic aperture radar (SAR) image feature matching, an accurate, efficient, and robust SAR edge-point-set matching method is proposed. First, the suitability of the affine transformation model for remote-sensing image matching is analyzed, and the model's parameters are decomposed. Second, a direction-template edge detector for SAR images is proposed, and, using the gradient and direction features of SAR edges, a pixel-migration similarity criterion for multi-source SAR edge point sets is established, together with a joint similarity measure for image matching, the square summation of joint features (SSJF). Then an improved genetic algorithm (GA) searches for the global optimum of the similarity, yielding the transformation parameters and the correspondence between the edge point sets. Finally, the method's performance is analyzed theoretically and validated by matching experiments on several SAR images and comparisons with existing methods.
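The global search over decomposed affine parameters can be pictured with a toy example: below, scipy's differential evolution (a GA-like global optimizer, standing in for the paper's improved GA) fits rotation, scale, and translation by minimizing a mean nearest-edge-point distance (standing in for SSJF). All names, parameters, and the synthetic data are ours.

```python
import numpy as np
from scipy.optimize import differential_evolution
from scipy.spatial import cKDTree

rng = np.random.default_rng(5)
ref = rng.random((300, 2)) * 100                 # reference edge points
theta, s, t = 0.3, 1.1, np.array([5.0, -3.0])    # hidden ground truth
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
mov = (s * ref @ R.T) + t                        # transformed edge points

tree = cKDTree(ref)

def score(p):
    """Mean nearest-neighbor distance after undoing the affine guess."""
    th, sc, tx, ty = p
    Rg = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    back = ((mov - [tx, ty]) @ Rg) / sc          # invert rotation+scale+shift
    return tree.query(back)[0].mean()

res = differential_evolution(
    score, [(-1, 1), (0.5, 2), (-20, 20), (-20, 20)], seed=0, tol=1e-8)
print(np.round(res.x, 3))   # ≈ [0.3, 1.1, 5.0, -3.0] if the search converges
```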

15.
A Point-in-Polygon Test Based on the Concept of Mapped Relevant Edges (cited 15 times: 1 self, 14 others)
The concepts of mapped relevant edges and osculating edges are introduced, turning the inside/outside test of a point against a polygon into a test of the point's relation to its osculating edge. A first mapping in the X direction quickly finds the relevant edges of the test point, and a second mapping over those relevant edges yields its osculating edge. It is proved that the direction of the osculating edge's vector alone determines whether the point lies inside or outside the polygon. The method improves on the ray-crossing method in computational efficiency, and its advantage grows as the number of polygon edges increases.
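For reference, the classic ray-crossing test that the paper accelerates looks like this (a plain sketch; the paper's mapping scheme avoids scanning every edge):

```python
def point_in_polygon(q, poly):
    """Cast a horizontal ray rightward from q and count edge crossings:
    an odd count means q is inside."""
    x, y = q
    inside = False
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                 # edge spans the ray's y
            # x-coordinate where the edge crosses the horizontal ray
            cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if cross > x:
                inside = not inside
    return inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(point_in_polygon((2, 2), square), point_in_polygon((5, 2), square))
```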

16.
A Piecewise Linear Representation of Time Series Based on Slope-Extracted Edge Points (cited 7 times: 0 self, 7 others)
Borrowing the notion of slope from analytic geometry, this paper proposes SEEP, a novel piecewise linear representation of time series that extracts edge points by slope. For time series whose slopes vary within a narrow range, SEEP works very well: its fitting error against the original series is much smaller than that of previous piecewise linear representations. For time series whose slopes vary widely, its fitting error is still comparable to that of previous methods. SEEP is also computationally simple and easy to implement, with a time complexity of only O(n).
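A minimal sketch in the spirit of SEEP (our illustrative thresholding rule, not the paper's exact definition of an edge point): mark a point as an edge point whenever the local slope jumps by more than a tolerance, then rebuild the series by linear interpolation between the kept points. One pass, O(n).

```python
import numpy as np

def slope_edge_points(y, tol=0.5):
    """Indices of slope-based edge points, always keeping the endpoints."""
    keep = [0]
    prev_slope = y[1] - y[0]
    for i in range(1, len(y) - 1):
        slope = y[i + 1] - y[i]
        if abs(slope - prev_slope) > tol:   # slope jump: mark an edge point
            keep.append(i)
            prev_slope = slope
    keep.append(len(y) - 1)
    return np.array(keep)

t = np.linspace(0, 6, 200)
y = np.sin(t)
idx = slope_edge_points(y, tol=0.01)
approx = np.interp(t, t[idx], y[idx])       # piecewise linear rebuild
print(len(idx), "edge points, max error",
      float(np.abs(approx - y).max().round(4)))
```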

17.
Morphology-Based Segmentation and Edge Extraction of LIDAR Data (cited 4 times: 0 self, 4 others)
For data acquired by LIDAR, the point data are first rasterized from vector form into an image, which is then processed with sequential dilation and erosion operations from mathematical morphology. Edges are extracted from the resulting image and vectorized, yielding the edge and the data points corresponding to each ground object. Two worked examples demonstrate the feasibility of the method.
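The rasterize-then-morphology pipeline can be sketched on synthetic points with scipy.ndimage (the structuring elements and grid size are illustrative):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(6)
pts = rng.random((4000, 2)) * 100
pts = pts[(pts[:, 0] > 30) & (pts[:, 0] < 70) & (pts[:, 1] > 20)]  # one object

# Vector-to-raster: bin the points onto a 100x100 occupancy grid.
grid, _, _ = np.histogram2d(pts[:, 0], pts[:, 1],
                            bins=100, range=[[0, 100], [0, 100]])
mask = grid > 0

# Sequential closing then opening removes holes and speckle.
mask = ndimage.binary_closing(mask, structure=np.ones((3, 3)))
mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))

# Morphological gradient (dilation minus erosion) marks the edges.
edges = ndimage.binary_dilation(mask) & ~ndimage.binary_erosion(mask)
print(edges.sum(), "edge pixels")
```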

18.
In existing image-segmentation methods, edge-point extraction and edge preservation in contrast images are highly sensitive to noise. The Degeneration Threshold Image Detection (DTID) framework is proposed to improve the contrast of edge-filtered images. DTID first applies a Rapid Bilateral Filtering process to filter the edges of contrast images, decomposing the input into base layers. With minimal filtering time, Rapid Bilateral Filtering smooths high-dynamic-contrast images while preserving edges. Paired with a Shift-Invariant Base Pass Domain Filter (SIDF), the filtering in DTID is insensitive to noise: the shift-invariant stage estimates values across edges to remove outliers, preserving the base layers of the contrast image, and intensity values computed in the base layer allow nearby spatial locations to be detected accurately. Finally, an Affine Planar Transformation, which normalizes image translation and rotation, is applied to the edge-filtered contrast images to attain high image quality. Experiments evaluating detection accuracy, detection rate, and filtering time show that DTID reduces the filtering time on contrast images by 54% and improves average contrast-enhancement quality by 27% compared with GUMA, HMRF, SWT, and EHS, and improves average contrast-enhancement quality by 28% and detection accuracy rate by 26% while reducing filtering time by 30% compared with state-of-the-art methods.
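The first DTID stage can be pictured with OpenCV's stock bilateral filter standing in for the Rapid Bilateral Filtering variant (a sketch on a synthetic image; all parameter values are illustrative): the filtered image is the base layer, and the residual is the detail layer.

```python
import cv2
import numpy as np

rng = np.random.default_rng(7)
# Synthetic test card: a bright square on a dark field, plus noise.
img = np.zeros((128, 128), np.uint8)
img[32:96, 32:96] = 200
img = cv2.add(img, rng.normal(0, 20, img.shape).clip(0, 60).astype(np.uint8))

base = cv2.bilateralFilter(img, d=9, sigmaColor=75, sigmaSpace=75)
detail = cv2.subtract(img, base)          # residual "detail" layer
edges = cv2.Canny(base, 50, 150)          # edges survive in the base layer
print(int(edges.sum() / 255), "edge pixels detected")
```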

19.
《计算机工程》 (Computer Engineering), 2018, (3): 259-263.
Traditional circle-detection methods suffer from low accuracy, high false-detection and miss rates, and poor reliability against complex backgrounds. A circle-detection method suited to images with complex backgrounds is therefore proposed. Candidate circles are first selected with a three-stage screening procedure over the radius search range; for every candidate circle the number of edge points is counted and divided by the radius, and candidates are screened on this value. Candidates are then ranked by the statistic, and spurious circles caused by peak spreading are removed using preset minimum center-distance and minimum radius-difference thresholds. Experimental results show that, compared with the traditional gradient Hough transform, the method lowers the false-detection and miss rates by 24% and 8% respectively, detects circles more accurately, and is more reliable on complex backgrounds.
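The gradient Hough transform baseline that the paper compares against is available as OpenCV's cv2.HoughCircles; a sketch on a synthetic image (parameter values illustrative):

```python
import cv2
import numpy as np

img = np.full((200, 200), 30, np.uint8)
cv2.circle(img, (60, 60), 25, 200, -1)     # two synthetic discs
cv2.circle(img, (140, 130), 40, 220, -1)
img = cv2.medianBlur(img, 5)

circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1,
                           minDist=20,      # reject near-duplicate centers
                           param1=100,      # Canny high threshold
                           param2=30,       # accumulator vote threshold
                           minRadius=10, maxRadius=80)
if circles is not None:
    for x, y, r in np.round(circles[0]).astype(int):
        print(f"circle at ({x}, {y}), radius {r}")
```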

20.
This paper studies whether, on an n×n chessboard, a knight making generalized (non-standard) moves (r, s), r ≥ 1, s > 2, can visit every square exactly once and return to its starting square, i.e., the generalized knight's-move Hamiltonian cycle problem. Existing results are first surveyed; it is then proved that no generalized-move Hamiltonian cycle exists on an n×n board with n ≤ r + s + 1. Finally, based on computational evidence, the conjecture is proposed that a generalized-move Hamiltonian cycle exists whenever n ≥ 2(r + s), and, using computation together with a linking construction, it is proved that for the generalized move (r = 1, s = 4) a Hamiltonian cycle exists for all n ≥ 10.
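A generalized (r, s) cycle can be searched for by backtracking on small boards; the sketch below (our code, with Warnsdorff ordering to keep the search tractable, not the paper's construction) finds a closed tour for the ordinary knight (r, s) = (1, 2) on a 6×6 board.

```python
def closed_tour(n, r, s):
    """Backtracking search for a closed (r, s)-knight tour on an n×n board."""
    deltas = {(a * dr, b * ds) for dr, ds in ((r, s), (s, r))
              for a in (-1, 1) for b in (-1, 1)}
    def nbrs(c):
        x, y = c
        return [(x + dx, y + dy) for dx, dy in deltas
                if 0 <= x + dx < n and 0 <= y + dy < n]
    start = (0, 0)
    path, seen = [start], {start}
    def go():
        if len(path) == n * n:
            return start in nbrs(path[-1])        # close the cycle
        cand = [c for c in nbrs(path[-1]) if c not in seen]
        # Warnsdorff ordering: try squares with fewest onward moves first.
        cand.sort(key=lambda c: sum(d not in seen for d in nbrs(c)))
        for c in cand:
            path.append(c); seen.add(c)
            if go():
                return True
            path.pop(); seen.remove(c)
        return False
    return path if go() else None

tour = closed_tour(6, 1, 2)
print("found cycle" if tour else "no cycle")
```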
