Similar documents
20 similar documents found (search time: 515 ms)
1.
Tackling data with gap-interval time is an important issue faced by the temporal database community. While a number of interval logics have been developed, less work has been reported on gap-interval time. To represent and handle data with time, a ‘when’ clause is generally added to each conventional operator so as to incorporate the time dimension in temporal databases; this ‘when’ clause is essentially a temporal logic sentence. Unfortunately, although several temporal database models have dealt with data with gap-interval time, they still apply interval-calculus methods to gap-intervals. Clearly, it is inadequate to tackle data with gap-interval time using interval-calculus methods in historical databases. Which temporal expressions, then, are valid in the ‘when’ clause for tackling data with gap-interval time? Further, which temporal operations and relations can be used in the ‘when’ clause? To answer these questions, a formal tool for supporting data with gap-interval time must be explored. For this reason, a gap-interval-based logic for historical databases is established in this paper. In particular, we discuss how to determine the temporal relationships after an event splits, which can be used to describe the temporal forms of tuple splitting in historical databases. Received 2 February 1999 / Revised 9 May 1999 / Accepted in revised form 20 November 1999
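The distinction the abstract draws can be made concrete with a small sketch: a gap-interval (temporal element) can be modelled as a sorted set of disjoint intervals, so that set operations may yield values with gaps, which single-interval calculus cannot express. The representation and function names below are illustrative, not taken from the paper.

```python
# A gap-interval time value as a sorted list of disjoint (start, end) pairs.

def normalize(intervals):
    """Sort intervals and merge any that overlap or touch."""
    merged = []
    for s, e in sorted(intervals):
        if merged and s <= merged[-1][1]:
            merged[-1] = (merged[-1][0], max(merged[-1][1], e))
        else:
            merged.append((s, e))
    return merged

def union(a, b):
    # The union of two disjoint intervals produces a gap-interval.
    return normalize(a + b)

def intersection(a, b):
    out = []
    for s1, e1 in a:
        for s2, e2 in b:
            s, e = max(s1, s2), min(e1, e2)
            if s < e:
                out.append((s, e))
    return normalize(out)
```

For example, `union([(1, 3)], [(5, 8)])` yields the gap-interval `[(1, 3), (5, 8)]`, a single temporal value no single interval can represent.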

2.
Research on optimal model selection for support vector machines
By studying the kernel matrix and exploiting its symmetric positive definiteness, a kernel-alignment-based algorithm for optimal SVM model selection, called OMSA, is proposed. It searches for the optimal kernel parameter and the corresponding optimal learning model from the training samples alone, without going through the standard SVM training and testing procedure, thereby overcoming the strong empiricism and heavy computational cost of traditional SVM model selection. Experiments on UCI benchmark data sets and the FERET face database show that the kernel parameter and the corresponding kernel matrix found by the algorithm are optimal, and that the resulting SVM classifier attains the lowest error rate. The algorithm offers a feasible approach to optimal SVM model selection and is also of reference value for other kernel-based learning methods.
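The kernel-alignment idea behind OMSA can be sketched as follows: the alignment between a kernel matrix and the label matrix is computable from the training samples alone, so candidate kernel parameters can be scored without any SVM training. This is a generic illustration of kernel-target alignment, not the OMSA algorithm itself; the data and parameter grid are made up.

```python
import numpy as np

def rbf_gram(X, gamma):
    # pairwise squared distances -> RBF kernel (Gram) matrix
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def alignment(K, y):
    # kernel-target alignment  <K, yy^T>_F / (||K||_F * ||yy^T||_F)
    Y = np.outer(y, y).astype(float)
    return (K * Y).sum() / (np.linalg.norm(K) * np.linalg.norm(Y))

# two tight clusters with opposite labels
X = np.array([[0.0], [0.1], [5.0], [5.1]])
y = np.array([1.0, 1.0, -1.0, -1.0])
scores = {g: alignment(rbf_gram(X, g), y) for g in (0.01, 1.0)}
```

A well-scaled kernel width separates the classes in the Gram matrix and scores a higher alignment than a width that blurs them together.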

3.
Computing moments on images is very important in the fields of image processing and pattern recognition. The non-symmetry and anti-packing model (NAM) is a general pattern representation model that has been developed to help design efficient image representation methods. In this paper, inspired by the idea of computing moments from the S-Tree coding (STC) representation, and using the NAM and extended shading (NAMES) approach, we propose a fast algorithm for computing lower-order moments from the NAMES representation, which takes O(N) time, where N is the number of NAM blocks. Taking the three standard grayscale test images ‘Lena’, ‘F16’, and ‘Peppers’ as typical test objects, and comparing the proposed algorithm with the conventional algorithm and the popular STC-based algorithm for computing lower-order moments, the theoretical and experimental results show that the average execution-time improvement ratios of the proposed NAMES approach over the STC approach and the conventional approach are 26.63% and 82.57% respectively, while maintaining image quality.
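The speed-up reported above comes from evaluating a block's moment contribution in closed form rather than pixel by pixel. The sketch below contrasts a conventional per-pixel moment with an O(1) formula for a constant-gray rectangular block, for orders p, q ≤ 1; it illustrates the principle only and is not the NAMES algorithm.

```python
import numpy as np

def moment(img, p, q):
    """Conventional moment m_pq = sum over pixels of x^p * y^q * f(x, y)."""
    ys, xs = np.indices(img.shape)   # ys: row index, xs: column index
    return float((xs ** p * ys ** q * img).sum())

def block_moment(x0, y0, w, h, g, p, q):
    """Moment contribution of a w-by-h block of constant gray g at (x0, y0),
    computed in O(1) from closed-form power sums (orders p, q in {0, 1})."""
    def power_sum(a, n, r):          # sum_{i=a}^{a+n-1} i^r  for r in {0, 1}
        if r == 0:
            return float(n)
        return n * (2 * a + n - 1) / 2
    return g * power_sum(x0, w, p) * power_sum(y0, h, q)
```

Summing `block_moment` over the homogeneous blocks of a representation reproduces the per-pixel result in time proportional to the number of blocks, not the number of pixels.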

4.
Supervised parameter-less kernel locality preserving projection for face recognition
龚劬, 许凯强. 计算机科学 (Computer Science), 2016, 43(9): 301-304, 309
To uncover the high-dimensional nonlinear structure of face images, the ideas of kernelization and of a parameter-free neighbourhood graph are both introduced into the locality preserving projection algorithm in a supervised setting, yielding a new Parameter-less Supervised Kernel Locality Preserving Projection (PSKLPP) algorithm, whose derivation is given. The algorithm replaces the Euclidean distance with the cosine distance, which is more robust to outliers, to construct a parameter-free neighbourhood graph; it uses the kernel method to extract nonlinear information from face images, projects them into a high-dimensional nonlinear space, and obtains a linear mapping by locality preserving projection, thereby avoiding the complex parameter selection otherwise faced when computing the similarity matrix. Simulations on the ORL and Yale face databases verify the effectiveness of the proposed algorithm.
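One way to read the ‘parameter-free neighbourhood graph’ idea is the sketch below: with cosine similarity, a global data-driven threshold (here, the mean pairwise similarity) decides adjacency, so no neighbour count k or radius ε needs to be chosen. The thresholding rule is an assumption made for illustration; the paper's exact construction may differ.

```python
import numpy as np

def parameterless_graph(X):
    """Adjacency matrix without a k or epsilon parameter: connect two samples
    when their cosine similarity exceeds the mean pairwise similarity.
    (Illustrative rule only; not necessarily the PSKLPP criterion.)"""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    S = Xn @ Xn.T                    # cosine similarity matrix
    np.fill_diagonal(S, 0.0)         # ignore self-similarity
    thresh = S.sum() / (len(X) * (len(X) - 1))
    return (S > thresh).astype(int)
```

On data with two directional clusters, only within-cluster pairs end up connected, with no tuning parameter in sight.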

5.
Kernel functions are used in support vector machines (SVMs) to compute the inner product in a higher-dimensional feature space. SVM classification performance depends on the chosen kernel. The radial basis function (RBF) kernel is a distance-based kernel that has been successfully applied in many tasks. This paper focuses on improving the accuracy of SVMs by proposing a non-linear combination of multiple RBF kernels to obtain more flexible kernel functions: multi-scale RBF kernels are weighted and combined. The proposed kernel allows better discrimination in the feature space and is proved to be a Mercer kernel. Furthermore, evolution strategies (ESs) are used for adjusting the hyperparameters of the SVM. Training accuracy, a bound on the generalization error, and subset cross-validation on training accuracy are used as objective functions in the evolutionary process. The experimental results show that the accuracy of multi-scale RBF kernels is better than that of a single RBF kernel. Moreover, subset cross-validation on training accuracy is the most suitable objective and yields good results on benchmark datasets.
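A minimal sketch of the combined kernel: a non-negatively weighted sum of RBF kernels at several scales, which remains a Mercer kernel because the class of Mercer kernels is closed under non-negative linear combination. The weights and widths below are illustrative, not values from the paper.

```python
import numpy as np

def multi_rbf(x, z, weights, gammas):
    """Weighted sum of RBF kernels at several scales:
    K(x, z) = sum_i  w_i * exp(-gamma_i * ||x - z||^2),  w_i >= 0."""
    d2 = np.sum((x - z) ** 2)
    return sum(w * np.exp(-g * d2) for w, g in zip(weights, gammas))
```

The wide-gamma terms react to fine-scale structure while the narrow-gamma terms keep distant points weakly similar, which is the extra flexibility the abstract refers to.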

6.
Kernel Projection Algorithm for Large-Scale SVM Problems
The Support Vector Machine (SVM) has become a very effective method in statistical machine learning, and it has been proved that training an SVM amounts to solving the Nearest Point pair Problem (NPP) between two disjoint closed convex sets. Keerthi later pointed out that classical geometric algorithms are difficult to apply directly to SVMs, and therefore designed a new geometric algorithm for the SVM. In this article, a new algorithm for solving SVMs geometrically, the Kernel Projection Algorithm, is presented, based on a fixed-point theorem for projection mappings. The new algorithm makes it easy to apply classical geometric algorithms to solving SVMs and is more understandable than Keerthi's. Experiments show that the new algorithm can also handle large-scale SVM problems. Geometric algorithms for SVMs, such as Keerthi's, require the two closed convex sets to be disjoint and are otherwise meaningless; in this article, this requirement is guaranteed in theory by using a theoretical result on universal kernel functions.

7.
As an effective learning technique based on structural risk minimization, the SVM has been confirmed to be a useful tool in many machine learning fields. With growing demand from real-time applications such as fast prediction and pattern recognition, online learning based on SVMs has gradually become a focus. However, the common SVM suffers from the classifier's bias and from the computational complexity of online modeling, which reduce the classifier's generality and slow down learning. Therefore, a non-biased least squares support vector classifier (LSSVC) model is proposed in this paper by improving the form of the structural risk. A fast online learning algorithm using Cholesky factorization is also designed for this model, exploiting the structure of the non-biased kernel extended matrix as it changes dynamically. In this way, the calculation of the Lagrange multipliers is simplified, and the online learning time is greatly reduced. Simulation results show that the non-biased LSSVC has good universal applicability and better generalization capability, and that the algorithm greatly improves learning speed.
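The role of the Cholesky factorization can be sketched as follows: in a bias-free least-squares SVM classifier the KKT conditions reduce to a single symmetric positive definite linear system, (K + I/C)·α = y, which Cholesky solves by two triangular substitutions. This is a batch sketch under that formulation; the paper's incremental Cholesky update for online learning is not reproduced.

```python
import numpy as np

def lssvc_train(K, y, C=10.0):
    """Solve (K + I/C) alpha = y via Cholesky: the system is SPD because
    K is a Gram matrix and I/C adds a positive ridge."""
    A = K + np.eye(len(y)) / C
    L = np.linalg.cholesky(A)
    # forward then back substitution instead of an explicit inverse
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return alpha

def lssvc_predict(K_test, alpha):
    # decision value is a kernel expansion with no bias term
    return np.sign(K_test @ alpha)
```

With a linear kernel K = X·Xᵀ on a small separable set, the trained multipliers reproduce the training labels.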

8.
In this paper, an interactive learning algorithm for context-free languages is presented. The algorithm is designed especially for SAQ, a system for formal specification acquisition and verification. As the kernel of the concept acquisition subsystem (SAQ/CL) of SAQ, the algorithm has been implemented on a SUN SPARC workstation. The grammars it obtains represent sentence structure naturally.

9.
The requirement for anthropocentric, or human-centred decision support is outlined, and the IDIOMS management information tool, which implements several human-centred principles, is described. IDIOMS provides a flexible decision support environment in which applications can be modelled using both ‘objective’ database information, and user-centred ‘subjective’ and contextual information. The system has been tested on several real applications, demonstrating its power and flexibility. IDIOMS (Intelligent Decision-making In On-line Management Systems) is a collaboration between the National Transputer Support Centre, Sheffield University, Strand Software Technologies Ltd., Bristol Transputer Centre and a high street bank, partially funded by the DTI under the Information Engineering Advanced Technology Programme. The project has demonstrated several technical features which are not detailed in this paper, including a multi-user interface allowing dynamic shared access to data; machine learning strategies for three banking applications; a scalable, modular database engine; and realistic transactions being handled while on-line management information queries are made.

10.
POTENTIAL: A highly adaptive core of parallel database system
POTENTIAL is a virtual database machine based on general computing platforms, especially parallel computing platforms. It provides a complete solution to high-performance database systems through a ‘virtual processor / virtual data bus / virtual memory’ architecture. Virtual processors manage all CPU resources in the system, and the various operations run on them. The virtual data bus is responsible for managing data transmission between associated operations, and forms the hinge of the entire system. Virtual memory provides efficient data storage and buffering mechanisms that conform to data reference behaviors in database systems. The architecture of POTENTIAL is very clear and has many good features, including high efficiency, scalability, extensibility, and portability.

11.
Application of the constructive kernel covering algorithm to image recognition
The main feature of constructive neural networks is that, while processing the given data, they produce both the network structure and its parameters. A support vector machine, by contrast, first applies the nonlinear transformation induced by a kernel function and then finds the optimal linear separating surface in the kernel space; the classification function it obtains is formally similar to a neural network. The constructive kernel covering algorithm (CKCA) combines the constructive learning methods of neural networks (such as the covering algorithm) with the kernel-function method of SVMs. CKCA is computationally light, constructive, and intuitive, and is suitable for large-scale classification and image recognition problems. To verify its effectiveness, recognition experiments were carried out on low-quality license-plate characters, with good results.

12.
盛明明, 黄海燕, 赵玉. 计算机科学 (Computer Science), 2015, 42(Z11): 19-21, 48
The parameters of a support vector machine strongly affect its performance, yet no mature theory exists for selecting SVM kernel parameters, which has severely limited the SVM's wider application. This paper introduces clonal selection into the differential evolution algorithm, improves the strategies of both the basic clonal selection algorithm and differential evolution, and fuses the two improved algorithms into a clonal-selection-based differential evolution algorithm, which is then applied to optimizing SVM kernel parameters. Tests show that the algorithm not only effectively avoids the premature convergence to which differential evolution is prone, but also markedly improves search ability. Application to the wine data set from the UCI repository shows that optimizing SVM kernel parameters with the clonal selection differential evolution algorithm speeds up the parameter search and improves the SVM's prediction accuracy and generalization, achieving high classification accuracy and good generality.
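For reference, the differential evolution component on its own can be sketched in a few lines: a plain DE/rand/1 scheme searching a scalar parameter such as an RBF kernel width. The clonal-selection hybridization and strategy improvements described above are not reproduced, and the objective here is a stand-in for an SVM validation score.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           gens=60, seed=0):
    """Minimize f over [lo, hi] with a plain DE/rand/1 scheme (scalar case)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # mutation: combine three distinct members other than i
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            trial = a + F * (b - c) if rng.random() < CR else pop[i]
            trial = min(max(trial, lo), hi)   # clip to bounds
            if f(trial) <= f(pop[i]):         # greedy selection
                pop[i] = trial
    return min(pop, key=f)
```

On a toy quadratic objective the population converges to the minimizer, which is the behaviour the hybrid algorithm above improves upon for multimodal kernel-parameter landscapes.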

13.
Research on Gaussian wavelet support vector machines
It is proved that even-order Gaussian wavelet functions satisfy the translation-invariant kernel condition of support vector machines. A Gaussian wavelet SVM is built with the corresponding wavelet kernel, and a cloud genetic algorithm is used to optimize the parameters of the SVM and its kernel. The algorithm is compared experimentally with SVMs using the common Gaussian kernel and the Morlet wavelet kernel. Approximation of nonlinear functions and short-term power-system load forecasting verify the effectiveness and superiority of the algorithm and show that it has practical value.
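The kernel construction can be sketched as follows: an even-order Gaussian wavelet (here the second-order ‘Mexican hat’) serves as the mother wavelet of a translation-invariant product kernel. This is a generic illustration; the paper's normalization and the cloud genetic algorithm for parameter optimization are not reproduced.

```python
import numpy as np

def mexican_hat(t):
    """Second-order Gaussian wavelet (Mexican hat), an even function of t."""
    return (1 - t ** 2) * np.exp(-t ** 2 / 2)

def wavelet_kernel(x, z, a=1.0):
    """Translation-invariant product kernel built from the mother wavelet:
    K(x, z) = prod_i psi((x_i - z_i) / a)."""
    return float(np.prod(mexican_hat((x - z) / a)))
```

Because the mother wavelet is even, the kernel depends only on x − z through its square, so it is symmetric, and K(x, x) = ψ(0)ⁿ = 1.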

14.
This paper formalizes and analyzes cognitive transitions between artificial perceptions that consist of an analogical or metaphorical transference of perception. The formalization is performed within a mathematical framework that has been used before to formalize other aspects of artificial perception and cognition. The mathematical infrastructure consists of a basic category of ‘artificial perceptions’. Each ‘perception’ consists of a set of ‘world elements’, a set of ‘connotations’, and a three valued (true, false, undefined) predicative connection between the two sets. ‘Perception morphisms’ describe structure preserving paths between perceptions. Quite a few artificial cognitive processes can be viewed and formalized as perception morphisms or as other categorical constructs. We show here how analogical transitions can be formalized in a similar way. A factorization of every analogical transition is shown to formalize metaphorical perceptions that are inspired by the analogy. It is further shown how structural aspects of ‘better’ analogies and metaphors can be captured and evaluated by the same categorical setting, as well as generalizations that emerge from analogies. The results of this study are then embedded in the existing mathematical formalization of other artificial cognitive processes within the same premises. A fallout of the rigorous unified mathematical theory is that structured analogies and metaphors share common formal aspects with other perceptually acute cognitive processes. This revised version was published online in June 2006 with corrections to the Cover Date.

15.
In this paper we present a system to enhance the performance of feature correspondence based alignment algorithms for laser scan data. We show how this system can be utilized as a new approach for evaluation of mapping algorithms. Assuming a certain a priori knowledge, our system augments the sensor data with hypotheses (‘Virtual Scans’) about ideal models of objects in the robot’s environment. These hypotheses are generated by analysis of the current aligned map estimated by an underlying iterative alignment algorithm. The augmented data is used to improve the alignment process. Feedback between data alignment and data analysis confirms, modifies, or discards the Virtual Scans in each iteration. Experiments with a simulated scenario and real world data from a rescue robot scenario show the applicability and advantages of the approach. By replacing the estimated ‘Virtual Scans’ with ground truth maps our system can provide a flexible way for evaluating different mapping algorithms in different settings.

16.
A. Sgarro. Calcolo, 1978, 15(1): 41-49
Summary: The informational divergence between stochastic matrices is not a metric. In this paper we show that consistent definitions can nevertheless be given of ‘spheres’, ‘segments’ and ‘straight lines’ using the divergence as a sort of ‘distance’ between stochastic matrices. The geometric nature of many ‘reliability functions’ of Information Theory and Mathematical Statistics is thus clarified. This work was done within the GNIM-CNR research activity.

17.
In this paper, we demonstrate how craft practice in contemporary jewellery opens up conceptions of ‘digital jewellery’ to possibilities beyond merely embedding pre-existing behaviours of digital systems in objects, which follow shallow interpretations of jewellery. We argue that a design approach that understands jewellery only in terms of location on the body is likely to lead to a world of ‘gadgets’, rather than anything that deserves the moniker ‘jewellery’. In contrast, by adopting a craft approach, we demonstrate that the space of digital jewellery can include objects where the digital functionality is integrated as one facet of an object that can be personally meaningful for the holder or wearer.

18.
Generalized Core Vector Machines
Kernel methods, such as the support vector machine (SVM), are often formulated as quadratic programming (QP) problems. However, given $m$ training patterns, a naive implementation of the QP solver takes $O(m^3)$ training time and at least $O(m^2)$ space. Hence, scaling up these QPs is a major stumbling block in applying kernel methods on very large data sets, and a replacement of the naive method for finding the QP solutions is highly desirable. Recently, by using approximation algorithms for the minimum enclosing ball (MEB) problem, we proposed the core vector machine (CVM) algorithm, which is much faster and can handle much larger data sets than existing SVM implementations. However, the CVM can only be used with certain kernel functions and kernel methods. For example, the very popular support vector regression (SVR) cannot be used with the CVM. In this paper, we introduce the center-constrained MEB problem and subsequently extend the CVM algorithm. The generalized CVM algorithm can now be used with any linear/nonlinear kernel and can also be applied to kernel methods such as SVR and the ranking SVM. Moreover, like the original CVM, its asymptotic time complexity is again linear in $m$ and its space complexity is independent of $m$. Experiments show that the generalized CVM has comparable performance with state-of-the-art SVM and SVR implementations, but is faster and produces fewer support vectors on very large data sets.
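The MEB approximation underlying the CVM can be sketched with a simple Bădoiu–Clarkson style iteration: repeatedly step the current center toward the farthest point with a shrinking step size. The sketch below shows only this geometric core, in plain input space; the CVM's core sets and kernel-induced feature space are not reproduced.

```python
import numpy as np

def approx_meb(points, iters=200):
    """(1+eps)-style approximation of the minimum enclosing ball:
    move the center toward the farthest point with step 1/(t+1)."""
    c = points[0].astype(float)
    for t in range(1, iters + 1):
        far = points[np.argmax(np.linalg.norm(points - c, axis=1))]
        c = c + (far - c) / (t + 1)
    r = np.linalg.norm(points - c, axis=1).max()
    return c, r
```

For the corners of the unit square the iteration settles near center (0.5, 0.5) with radius about √2/2, the exact MEB.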

19.
The problem of the ‘information content’ of an information system appears elusive. In the field of databases, the information content of a database has been taken to be the instance of that database. We argue that this view misses two fundamental points. One is a convincing conception of the phenomenon concerning information in databases, especially a properly defined notion of ‘information content’. The other is a framework for reasoning about information content. In this paper, we suggest a modification of the well-known definition of ‘information content’ given by Dretske (Knowledge and the Flow of Information, 1981). We then define what we call the ‘information content inclusion’ relation (IIR for short) between two random events. We present a set of inference rules for reasoning about information content, which we call the IIR rules. We then explore how these ideas and rules may be used in a database setting to look at databases and to derive otherwise hidden information, by deriving new relations from a given set of IIRs. A prototype is presented, which shows how IIR reasoning might be exploited in a database setting, including the relationship between real-world events and database values.
Malcolm Crowe

20.
An algorithm for solving optimal active vibration control problems by the finite element method (FEM) is presented. The optimality equations for the problem are derived from Pontryagin’s principle in the form of a set of the fourth order ordinary differential equations that, together with the initial and final boundary conditions, constitute the boundary value problem in the time domain, which in control is referred to as a two-point-boundary-value problem. These equations decouple in the modal space and can be solved by the FEM technique. An analogy between the optimality equations and the governing equations for a set of certain static beams permits obtaining numerical solutions to the optimal control problem with the help of standard ‘structural’ FEM software. The optimal action of actuators is automatically calculated by applying the independent modal space control concept. The structure’s response to actuation forces is also determined and can independently be verified for spillover effects. As an illustration, the algorithm is used for the analysis of optimal action of actuators to attenuate vibrations of an elastic fin.


Copyright©北京勤云科技发展有限公司  京ICP备09084417号