Similar Documents
Found 20 similar documents (search time: 15 ms)
1.
2.
SVM face recognition based on conditionally positive definite kernels
To improve the discriminative power of face recognition classifiers, an improved kernel function for kernel learning methods is adopted: the conditionally positive definite (CPD) kernel. CPD kernels generally do not satisfy Mercer's condition, yet they can still be used to compute distances between samples in the kernel space, accentuating the feature differences between samples. Simulation experiments on the ORL, YALE, and ESSEX standard face databases show that, with comparable training time, the SVM face recognition algorithm based on CPD kernels achieves a markedly higher recognition rate than other kernel methods and remains robust as the number of classes grows.
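As a rough sketch of the idea (not the paper's implementation), a conditionally positive definite kernel such as k(x, y) = -||x - y||^beta with 0 < beta <= 2 can be plugged into an off-the-shelf SVM; the data below are hypothetical stand-ins for face-feature vectors:

```python
import numpy as np
from sklearn.svm import SVC

def cpd_kernel(X, Y, beta=1.0):
    # Conditionally positive definite kernel k(x, y) = -||x - y||^beta,
    # valid for 0 < beta <= 2; it need not satisfy Mercer's condition.
    d = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=2)
    return -d ** beta

# Toy two-class data standing in for face-feature vectors (hypothetical).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (40, 5)), rng.normal(3, 1, (40, 5))])
y = np.array([0] * 40 + [1] * 40)

clf = SVC(kernel=cpd_kernel).fit(X, y)
```

Note that the kernel matrix has zero diagonal and nonzero off-diagonal entries, so it is indefinite as a matrix; CPD kernels of this form are nevertheless usable with SVMs, which only require positive definiteness on a constrained subspace.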

3.
4.
Image warping with scattered data interpolation
This paper discusses a new approach to image warping based on scattered data interpolation methods, which provides smooth deformations with easily controllable behavior. A new, efficient deformation algorithm underlies the method.

5.
We construct iterative processes to compute the weighted normal pseudosolution with positive definite weights (weighted least squares solutions with weighted minimum Euclidean norm) for systems of linear algebraic equations (SLAE) with an arbitrary rectangular real matrix. We examine two iterative processes based on the expansion of the weighted pseudoinverse matrix into matrix power series. The iterative processes are applied to solve constrained least squares problems that arise in mathematical programming and to find L-pseudosolutions. Translated from Kibernetika i Sistemnyi Analiz, No. 2, pp. 116–124, March–April, 1998.
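A minimal numerical sketch of the target quantity (not the paper's iterative expansions): for positive definite weights M and N, the weighted normal pseudosolution can be computed directly through Cholesky factors of the weights, here with NumPy and hypothetical small matrices:

```python
import numpy as np

def weighted_normal_pseudosolution(A, b, M, N):
    # Among all x minimizing the M-weighted residual (b - A x)^T M (b - A x),
    # return the one with minimum N-weighted norm x^T N x (M, N positive
    # definite).  Substituting z = R^T x with N = R R^T reduces the task to
    # an ordinary minimum-norm least-squares problem.
    L = np.linalg.cholesky(M)              # M = L L^T
    R = np.linalg.cholesky(N)              # N = R R^T
    Rinv_T = np.linalg.inv(R).T
    z = np.linalg.pinv(L.T @ A @ Rinv_T) @ (L.T @ b)
    return Rinv_T @ z

# With identity weights this reduces to the ordinary pseudosolution.
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 4))
b = rng.standard_normal(6)
x = weighted_normal_pseudosolution(A, b, np.eye(6), np.eye(4))
```

For a full-column-rank A this agrees with the classical weighted least squares formula x = (A^T M A)^{-1} A^T M b.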

6.
In this paper, a robust algorithm is proposed for reconstructing a 2D curve from unorganized point data with a high level of noise and outliers. By constructing the quadtree of the input point data, we extract the “grid-like” boundaries of the quadtree and smooth them using a modified Laplacian method. The skeleton of the smoothed boundaries is computed, and the initial curve is thereby generated by circular neighboring projection. Subsequently, a normal-based processing method is applied to the initial curve to smooth jagged features in low-curvature areas and recover sharp features in high-curvature areas. As a result, the curve is reconstructed accurately, with small details and sharp features well preserved. A variety of experimental results demonstrate the effectiveness and robustness of our method.

7.
8.
Many multivariate interpolation schemes require, as input data, derivative values that are not available in practical applications and therefore have to be generated suitably. A specific approach to this problem is described that is modeled after univariate spline interpolation. Derivative values are defined by the requirement that a certain functional be minimized over a suitable space, subject to interpolation of the given positional data. In principle, the technique can be applied in arbitrarily many variables. The theory is described in general, and a particular application is given in two variables. A major tool in the implementation of the technique is the Bézier-Bernstein form of a multivariate polynomial. The technique yields visually pleasing surfaces and is therefore suitable for design applications. It is less suitable for the approximation of derivatives of a given function.

9.
M. Rossini, Computing, 1998, 61(3):215–234
We describe a numerical approach for the detection of discontinuities of a two-dimensional function distorted by noise. This problem arises in many applications, such as computer vision, geology, and signal processing. The method we propose is based on the two-dimensional continuous wavelet transform and partially follows the ideas developed in [2], [6], and [8]. It is well known that the wavelet transform modulus maxima locate the discontinuity points as well as the sharp variation points. Here we propose a statistical test which, for a suitable scale value, allows us to decide whether a wavelet transform modulus maximum corresponds to a discontinuity in the function value. We then provide an algorithm to detect the discontinuity curves from scattered and noisy data.

10.
The authors develop and analyze iterative methods with different rates of convergence (linear, quadratic, or of order p, p ≥ 2). The methods are used to calculate weighted pseudoinverse matrices with positive definite weights. To find weighted normal pseudosolutions with positive definite weights, iterative methods with a quadratic rate of convergence are developed and analyzed. The iterative methods for calculating weighted normal pseudosolutions are used to solve least-squares problems with constraints. Translated from Kibernetika i Sistemnyi Analiz, No. 5, pp. 20–44, September–October 2004.

11.
Recent developments in computing and technology, along with the availability of large amounts of raw data, have contributed to the creation of many effective techniques and algorithms in the fields of pattern recognition and machine learning. The main objectives for developing these algorithms include identifying patterns within the available data or making predictions, or both. Great success has been achieved with many classification techniques in real-life applications. With regard to binary data classification in particular, analysis of data containing rare events or disproportionate class distributions poses a great challenge to industry and to the machine learning community. This study examines rare events (REs) with binary dependent variables containing many more non-events (zeros) than events (ones). These variables are difficult to predict and to explain as has been evidenced in the literature. This research combines rare events corrections to Logistic Regression (LR) with truncated Newton methods and applies these techniques to Kernel Logistic Regression (KLR). The resulting model, Rare Event Weighted Kernel Logistic Regression (RE-WKLR), is a combination of weighting, regularization, approximate numerical methods, kernelization, bias correction, and efficient implementation, all of which are critical to enabling RE-WKLR to be an effective and powerful method for predicting rare events. Comparing RE-WKLR to SVM and TR-KLR, using non-linearly separable, small and large binary rare event datasets, we find that RE-WKLR is as fast as TR-KLR and much faster than SVM. In addition, according to the statistical significance test, RE-WKLR is more accurate than both SVM and TR-KLR.
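The weighting idea can be sketched, greatly simplified, as kernel logistic regression with per-class sample weights (the paper's bias correction and truncated-Newton solver are omitted; all names, parameters, and data here are illustrative):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    # Gaussian RBF kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def weighted_klr_fit(K, y, w, lam=1e-3, lr=0.1, iters=2000):
    # Gradient descent on the weighted, regularized logistic loss with
    # f = K @ alpha; gradient is K (w * (p - y) + lam * alpha).
    alpha = np.zeros(len(y))
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-K @ alpha))
        grad = K @ (w * (p - y) + lam * alpha)
        alpha -= lr * grad / len(y)
    return alpha

# Imbalanced toy data: 30 non-events near the origin, 6 rare events offset.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (30, 2)), rng.normal(4, 1, (6, 2))])
y = np.array([0] * 30 + [1] * 6)
w = np.where(y == 1, len(y) / (2 * 6), len(y) / (2 * 30))  # upweight events
K = rbf_kernel(X, X)
alpha = weighted_klr_fit(K, y, w)
p = 1.0 / (1.0 + np.exp(-K @ alpha))
```

Upweighting the rare class counteracts the tendency of unweighted logistic loss to sacrifice the minority class.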

12.
杨军, 邢琪, 诸昌钤, 彭强. 计算机应用, 2007, 27(10):2522–2524
A surface reconstruction algorithm based on Bayesian statistical theory is proposed for noisy point cloud data. The main idea is to search the space of possible reconstructions for the one of maximum posterior probability. First, a mathematical model of the measurement process and a prior probability model of the surface are computed. Next, the maximum a posteriori reconstruction position of each point is determined by a conjugate gradient optimization algorithm. Finally, the point model is rendered with the Surface Splatting algorithm. Experimental results show that the prior model not only removes noise from scanned point cloud data but also enhances the surface's detail features. Compared with existing work, the algorithm obtains better reconstruction results.

13.
Weighted pseudoinverse matrices with positive definite weights are expanded into matrix power products with negative exponents and arbitrary positive parameters. These expansions are used to develop and analyze iterative methods for evaluating weighted pseudoinverse matrices and weighted normal pseudosolutions and solving constrained least-squares problems. __________ Translated from Kibernetika i Sistemnyi Analiz, No. 1, pp. 45–64, January–February 2007.  相似文献   

14.
吴成东, 卢紫微, 于晓升. 控制与决策, 2019, 34(10):2243–2248
To address the limited quality of current image super-resolution reconstruction, an algorithm based on weighted random forests is proposed. A random forest is used to cluster image-patch features, and a ridge regression model is introduced at each leaf node to model the mapping between low- and high-resolution image patches within that class. At reconstruction time, a high-resolution patch is obtained by weighted prediction, according to the class of the test low-resolution patch and the approximate K-nearest-neighbor fitting error within each leaf node. The predicted high-resolution image is then post-processed by combining the image's non-local self-similarity with an iterative back-projection algorithm to improve reconstruction quality. Experimental results show that the proposed algorithm effectively improves the peak signal-to-noise ratio and yields good visual quality.
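The per-leaf mapping can be sketched as a ridge regression from low-resolution patch features to high-resolution patches (a generic illustration with hypothetical names and sizes; the forest clustering and K-nearest-neighbor weighting are omitted):

```python
import numpy as np

def fit_leaf_ridge(L, H, lam=1e-2):
    # Ridge regression for one leaf node: W = (L^T L + lam I)^{-1} L^T H,
    # mapping low-resolution patch features L (n x d_l) to high-resolution
    # patches H (n x d_h); predict a high-resolution patch with l @ W.
    d = L.shape[1]
    return np.linalg.solve(L.T @ L + lam * np.eye(d), L.T @ H)

# Synthetic check: recover a known linear map from noisy training pairs.
rng = np.random.default_rng(3)
W_true = rng.standard_normal((8, 16))
Ltr = rng.standard_normal((200, 8))
Htr = Ltr @ W_true + 0.01 * rng.standard_normal((200, 16))
W = fit_leaf_ridge(Ltr, Htr)
```

The ridge term lam keeps the per-leaf solve well conditioned even when a leaf contains few patches.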

15.
Traditional image resizing methods usually work in pixel space and use various saliency measures. The challenge is to adjust the image shape while trying to pres...

16.
G. Allasia, Calcolo, 1992, 29(1-2):97–109
A class of inverse distance weighted formulas for scattered data interpolation, such as the well-known Shepard one, is reconsidered from a physical viewpoint, in particular with respect to electrostatic or gravitational fields. In this way, simple properties are explained that permit the construction of either parallel or recursive algorithms to calculate these formulas. The formulas are then interpreted as means in accordance with the definitions of Cauchy, Chisini and others. As a consequence, some significant properties of the means can be used. Finally, a few comments are made on the numerical evaluation of the formulas considered. This work has been supported by the Italian Ministry of Scientific and Technological Research and the National Research Council.
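Shepard's classical formula, the prototype of the inverse distance weighted family discussed here, can be sketched in a few lines (a generic illustration, not the paper's parallel or recursive algorithms):

```python
import numpy as np

def shepard(x, nodes, values, p=2.0):
    # Shepard interpolant F(x) = sum_i w_i(x) f_i / sum_i w_i(x) with
    # weights w_i(x) = ||x - x_i||^{-p}.  Being a weighted mean of the
    # data, F is always bounded by min(values) and max(values).
    d = np.linalg.norm(nodes - x, axis=1)
    if np.any(d == 0.0):                  # exact interpolation at a node
        return float(values[np.argmin(d)])
    w = d ** -p
    return float(w @ values) / w.sum()

nodes = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([0.0, 1.0, 1.0, 2.0])
```

The boundedness by the data range is exactly the "mean" interpretation the abstract refers to.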

17.
Combining globally positive definite radial basis functions with the evolution equation of the PDE-based level set method for image segmentation, an image segmentation algorithm based on globally positive definite radial basis functions is proposed. The level set function in the evolution equation is interpolated with globally positive definite radial basis functions; the resulting interpolant has high accuracy and smoothness, which overcomes the complex and time-consuming reinitialization procedure of traditional level set methods and their sensitivity to the position of the initial contour. The nonlinear evolution equation is finally converted into a system of ordinary differential equations and solved with Euler's method. Experimental results show that the algorithm requires no reinitialization and can segment images quickly and correctly even without an initial contour.

18.
Extracting skeletal curves from 3D scattered data

19.
For the problem of reconstructing surfaces from scattered point cloud data acquired by 3D scanning or 3D reconstruction, a high-order smoothing algorithm based on Laplacian regularization is proposed. First, the bounding box of the point cloud is computed and discretized to obtain a voxel space. Next, an objective function is built in voxel space from the gradient of the implicit surface together with the point positions and normals, and Laplacian regularization of this objective controls the smoothness of the reconstructed surface. The reconstruction problem is then converted, via optimality conditions, into solving a sparse linear system. Finally, a triangle mesh representation of the reconstructed surface is obtained with the marching cubes algorithm. Qualitative and quantitative experimental results show that the rendering quality and accuracy of the surfaces reconstructed by this method are superior to the commonly used Poisson method.
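The "Laplacian regularization leads to a sparse linear system" step can be illustrated with a 1D analogue (an illustrative sketch, not the paper's voxel-space objective): fitting values f to noisy samples y while penalizing the discrete Laplacian of f.

```python
import numpy as np

def laplacian_regularized_fit(y, lam=10.0):
    # Minimize ||f - y||^2 + lam * ||D f||^2, where D is the second
    # difference operator (discrete Laplacian).  The optimality condition
    # is the (sparse, banded) linear system (I + lam D^T D) f = y.
    n = len(y)
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = (1.0, -2.0, 1.0)
    return np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)

rng = np.random.default_rng(4)
y = np.linspace(0.0, 1.0, 50) + 0.1 * rng.standard_normal(50)
f = laplacian_regularized_fit(y)
```

Increasing lam trades fidelity to the samples for smoothness, which is the same role the Laplacian regularizer plays in the voxel-space objective.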

20.
Gregory M. Nielson, Computing, 2009, 86(2-3):199–212
We describe some new methods for obtaining a mathematical representation of a surface that approximates a scattered point cloud {(x_i, y_i, z_i) : i = 1, …, N} without the use or need of normal vector data. The fitting surface is defined implicitly as the level set of a field function that is a linear combination of trivariate radial basis functions. Optimal approximations are based upon normalized least squares criteria, which lead to eigenvalue/eigenvector characterizations. The normalized aspect removes the need for normal vector estimates, which is one of the unique features of this new method. Localizing techniques are introduced to allow the efficient application of these new methods to large data sets. A variety of radial basis functions is introduced through examples that illustrate the performance and efficiency of the new methods.
