Similar Documents
20 similar documents found.
1.
2.
The Morse-Smale complex is an efficient representation of the gradient behavior of a scalar function, and critical points paired by the complex identify topological features and their importance. We present an algorithm that constructs the Morse-Smale complex in a series of sweeps through the data, identifying various components of the complex in a consistent manner. All components of the complex, both geometric and topological, are computed, providing a complete decomposition of the domain. Efficiency is maintained by representing the geometry of the complex in terms of point sets.
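As a small, hedged illustration of the topological side of such a decomposition (this is not the paper's sweep algorithm), the Python sketch below classifies the vertices of a 2-D scalar grid as minima, maxima, regular points, or saddles by counting connected runs of lower neighbors on the 8-neighbor ring; the grid size, field, and function names are invented for the example.

import numpy as np

def classify_critical_points(f):
    """Classify interior grid vertices of a scalar field f (2-D array).

    A vertex is a minimum if no neighbor is lower, a maximum if all
    neighbors are lower, regular if the lower neighbors form one
    connected arc of the 8-neighbor ring, and a saddle otherwise.
    Assumes pairwise-distinct values around each vertex.
    """
    # 8-neighbor offsets in cyclic order around the center vertex
    ring = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    labels = {}
    rows, cols = f.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            lower = [f[i + di, j + dj] < f[i, j] for di, dj in ring]
            n_lower = sum(lower)
            # count connected runs of "lower" neighbors on the cyclic ring
            runs = sum(1 for k in range(8) if lower[k] and not lower[k - 1])
            if n_lower == 0:
                labels[(i, j)] = "minimum"
            elif n_lower == 8:
                labels[(i, j)] = "maximum"
            elif runs == 1:
                labels[(i, j)] = "regular"
            else:
                labels[(i, j)] = "saddle"
    return labels

if __name__ == "__main__":
    x, y = np.meshgrid(np.linspace(-2, 2, 40), np.linspace(-2, 2, 40))
    field = np.sin(x) * np.cos(y)            # toy scalar function
    crit = {p: c for p, c in classify_critical_points(field).items()
            if c != "regular"}
    print(len(crit), "critical points found")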

3.
This article focuses on evolutionary computation techniques for generating players that perform tasks cooperatively. In applying evolutionary computation to this problem, one faces fundamental and difficult design decisions, including the so-called credit assignment problem. We believe these design decisions are correlated, and therefore a comprehensive evaluation of them is essential. We first list three fundamental decisions and the possible options for each when designing methods for evolving a cooperative team, which yields 18 typical combinations. We then describe a maximally simplified soccer game played on a one-dimensional field as a testbed for a comprehensive evaluation of these 18 candidate methods. The results show that some methods perform well and that the design decisions interact in complex ways. Further analysis shows that cooperative behavior can be evolved and is a necessary requirement for teams to perform well even in such a simple game. This work was presented in part at the 10th International Symposium on Artificial Life and Robotics, Oita, Japan, February 4–6, 2005.

4.
An algorithm is presented to compute the Taylor expansion of a polynomial B-spline function from its de Boor points. It is shown to be more efficient than existing methods and has the additional advantage of being reversible.
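The paper's own algorithm is not reproduced here; the following Python sketch only illustrates, under stated assumptions, the simplest instance of the relationship it works with: converting one cubic Bézier segment (a B-spline segment whose de Boor points coincide with its Bézier points) to its power-basis (Taylor) coefficients on [0, 1].

def cubic_bezier_to_taylor(p0, p1, p2, p3):
    """Power-basis (Taylor) coefficients c0..c3 of a cubic Bezier segment
    B(t) = sum_k c_k * t**k on t in [0, 1].

    This is only the degree-3, single-segment special case; a general
    B-spline first needs its de Boor points converted to Bezier form.
    """
    c0 = p0
    c1 = 3.0 * (p1 - p0)
    c2 = 3.0 * (p2 - 2.0 * p1 + p0)
    c3 = p3 - 3.0 * p2 + 3.0 * p1 - p0
    return c0, c1, c2, c3

def eval_taylor(coeffs, t):
    """Horner evaluation of the power-basis polynomial."""
    result = 0.0
    for c in reversed(coeffs):
        result = result * t + c
    return result

if __name__ == "__main__":
    coeffs = cubic_bezier_to_taylor(0.0, 1.0, 2.0, 0.5)
    # spot-check against the Bernstein form at t = 0.3; the values should agree
    t = 0.3
    b = ((1 - t) ** 3 * 0.0 + 3 * (1 - t) ** 2 * t * 1.0
         + 3 * (1 - t) * t ** 2 * 2.0 + t ** 3 * 0.5)
    print(eval_taylor(coeffs, t), b)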

5.
In our previous work, we have provided tools for an efficient characterization of biomedical images using Legendre and Zernike moments, showing their relevance as biomarkers for classifying image tiles coming from bone tissue regeneration studies (Ujaldón, 2009) [24]. As part of our research quest for efficiency, we developed methods for accelerating those computations on GPUs (Martín-Requena and Ujaldón, 2011). This new stage of our work focuses on efficient data partitioning to optimize execution on many-core GPUs and GPU clusters, attaining gains of up to three orders of magnitude on 1 Mpixel images compared to execution on multi-core CPUs of similar age and cost. We deploy a chain of optimizations that exploit symmetries in the trigonometric functions and in the pixel access patterns; combined with massive data parallelism on GPUs, these enable (1) real-time processing of our set of input biomedical images, and (2) the use of high-resolution images in clinical practice.

6.
The resilience and survivability of transport backbone networks are vital for the economy and security. Modern backbone networks use a mesh of fiber optic cables, which are, due to their ubiquitous deployment, prone to failures. The goal of this paper is to develop efficient computational methods for assessing the expected traffic loss in such networks. We present both analytical and simulation approaches for this problem. Our analytical approach is based on the cut set enumeration technique, while our simulation approach is based on Monte Carlo sampling techniques. To facilitate the computational process, we employ artificial intelligence methods based on genetic algorithms.
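As an illustration of the simulation side only (the cut-set enumeration and genetic-algorithm components are not sketched), the following Python snippet estimates expected traffic loss by Monte Carlo sampling of independent link failures on an invented toy mesh; using networkx for the connectivity tests is an assumption, not something the paper prescribes.

import random
import networkx as nx

def expected_traffic_loss(links, demands, n_samples=10_000, seed=0):
    """Estimate expected lost traffic by sampling independent link failures.

    links   : dict {(u, v): failure_probability}
    demands : dict {(src, dst): traffic_volume}
    """
    rng = random.Random(seed)
    total_loss = 0.0
    for _ in range(n_samples):
        g = nx.Graph()
        g.add_nodes_from({n for edge in links for n in edge})
        for (u, v), p_fail in links.items():
            if rng.random() >= p_fail:        # link survives in this sample
                g.add_edge(u, v)
        # a demand is lost if its endpoints end up disconnected
        loss = sum(vol for (s, d), vol in demands.items()
                   if not nx.has_path(g, s, d))
        total_loss += loss
    return total_loss / n_samples

if __name__ == "__main__":
    toy_links = {("A", "B"): 0.01, ("B", "C"): 0.02,
                 ("A", "C"): 0.05, ("C", "D"): 0.01}
    toy_demands = {("A", "D"): 10.0, ("B", "D"): 5.0}
    print("estimated expected loss:", expected_traffic_loss(toy_links, toy_demands))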

7.
We consider the efficient evaluation of recursive queries in logic databases where the queries are expressed using a Datalog program (function-free Horn-clause program) that contains only regularly or linearly recursive predicates. Using well-known results on graph traversal, we develop an efficient algorithm for evaluating relations defined by a binary-chain program. We also present a transformation by which the evaluation of a subset of queries involving nonbinary relations can be reduced to the evaluation of binary-chain queries. This transformation is guided by the choice of bound arguments in the query, and the bindings are propagated through the program so that in the evaluation of the transformed program the bindings will be used to restrict the set of database facts consulted.
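A minimal Python sketch of the graph-traversal idea, not the authors' algorithm: a linearly recursive binary-chain query such as path(a, Y) is answered by propagating the bound first argument and traversing the base relation as a graph, so that only facts reachable from the binding are consulted (the program and data below are invented).

from collections import defaultdict, deque

def evaluate_chain_query(base_facts, bound_constant):
    """Evaluate the binary-chain program
        path(X, Y) :- edge(X, Y).
        path(X, Y) :- edge(X, Z), path(Z, Y).
    for the query path(bound_constant, Y) by graph traversal.
    """
    succ = defaultdict(set)
    for x, y in base_facts:
        succ[x].add(y)

    answers, frontier, seen = set(), deque([bound_constant]), {bound_constant}
    while frontier:
        node = frontier.popleft()
        for nxt in succ[node]:
            answers.add(nxt)              # path(bound_constant, nxt) holds
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return answers

if __name__ == "__main__":
    edges = [("john", "mary"), ("mary", "sue"), ("sue", "ann"), ("bob", "tim")]
    print(evaluate_chain_query(edges, "john"))   # {'mary', 'sue', 'ann'}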

8.
This paper proposes an evolving ant direction hybrid differential evolution (EADHDE) algorithm for solving the optimal power flow problem with non-smooth and non-convex generator fuel cost characteristics. EADHDE employs ant colony search to find a suitable mutation operator for hybrid differential evolution (HDE), while the ant colony parameters are evolved using a genetic algorithm. The power flow problem itself is solved by the Newton–Raphson method. The feasibility of the proposed approach was tested on the IEEE 30-bus system with three different cost characteristics, and several cases were investigated to validate the robustness of the method in finding the optimal solution. Simulation results demonstrate that EADHDE gives remarkable results compared to classical HDE and other methods recently reported in the literature. An innovative statistical analysis based on measures of central tendency and dispersion was carried out on the bus voltage profiles and voltage stability indices.
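The EADHDE algorithm itself is not reconstructed here; the Python sketch below only suggests, under our own assumptions, the kind of decision its ant-colony layer makes: choosing among classical differential-evolution mutation strategies with pheromone-weighted roulette selection. Strategy names, pheromone values, and problem dimensions are illustrative.

import random

# three classical DE mutation strategies the "ants" can choose between
def rand_1(pop, best, f_weight):
    r1, r2, r3 = random.sample(pop, 3)
    return [a + f_weight * (b - c) for a, b, c in zip(r1, r2, r3)]

def best_1(pop, best, f_weight):
    r1, r2 = random.sample(pop, 2)
    return [a + f_weight * (b - c) for a, b, c in zip(best, r1, r2)]

def rand_2(pop, best, f_weight):
    r1, r2, r3, r4, r5 = random.sample(pop, 5)
    return [a + f_weight * (b - c + d - e)
            for a, b, c, d, e in zip(r1, r2, r3, r4, r5)]

STRATEGIES = [rand_1, best_1, rand_2]

def pick_strategy(pheromone):
    """Roulette-wheel choice of a mutation operator from pheromone weights."""
    total = sum(pheromone)
    r, acc = random.uniform(0, total), 0.0
    for strategy, tau in zip(STRATEGIES, pheromone):
        acc += tau
        if r <= acc:
            return strategy
    return STRATEGIES[-1]

if __name__ == "__main__":
    population = [[random.uniform(0, 1) for _ in range(4)] for _ in range(10)]
    best_member = population[0]                      # placeholder "best" vector
    pheromone_trail = [1.0, 1.5, 0.8]                # illustrative weights
    mutate = pick_strategy(pheromone_trail)
    print(mutate.__name__, mutate(population, best_member, f_weight=0.6))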

9.
In this paper we demonstrate how genetic algorithms can be used to reverse engineer an evaluation function’s parameters for computer chess. Our results show that using an appropriate expert (or mentor), we can evolve a program that is on par with top tournament-playing chess programs, outperforming a two-time World Computer Chess Champion. This performance gain is achieved by evolving a program that mimics the behavior of a superior expert. The resulting evaluation function of the evolved program consists of a much smaller number of parameters than the expert’s. The extended experimental results provided in this paper include a report on our successful participation in the 2008 World Computer Chess Championship. In principle, our expert-driven approach could be used in a wide range of problems for which appropriate experts are available.
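The evolved chess program cannot be reproduced from the abstract; the Python sketch below only illustrates the expert-driven fitness idea with invented names and data: candidate parameter vectors of a linear evaluation function are scored by their disagreement with an expert's scores on sample positions, and a plain genetic algorithm minimizes that disagreement.

import random

def evaluate(weights, features):
    """Linear evaluation: dot product of parameter vector and position features."""
    return sum(w * f for w, f in zip(weights, features))

def fitness(weights, positions, expert_scores):
    """Negative mean squared disagreement with the expert (higher is better)."""
    err = sum((evaluate(weights, p) - s) ** 2
              for p, s in zip(positions, expert_scores)) / len(positions)
    return -err

def evolve(positions, expert_scores, n_params, pop_size=40, generations=200):
    pop = [[random.uniform(-1, 1) for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: fitness(w, positions, expert_scores), reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_params)          # one-point crossover
            child = a[:cut] + b[cut:]
            k = random.randrange(n_params)               # point mutation
            child[k] += random.gauss(0.0, 0.1)
            children.append(child)
        pop = parents + children
    pop.sort(key=lambda w: fitness(w, positions, expert_scores), reverse=True)
    return pop[0]

if __name__ == "__main__":
    # hypothetical data: 3 features per position, one expert score per position
    positions = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(50)]
    hidden_expert = [1.0, 5.0, 3.0]
    scores = [evaluate(hidden_expert, p) for p in positions]
    print(evolve(positions, scores, n_params=3))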

10.
11.
12.
In large-scale distributed simulation, thousands of objects keep moving and interacting in a virtual environment, producing a mass of messages. High level architecture (HLA), the prevailing standard for modeling and simulation, specifies two publish-subscribe mechanisms for message filtering: class-based and value-based. However, these mechanisms can only judge whether a message is relevant to a subscriber or not. Lacking the ability to grade relevance, they deliver all relevant messages with the same priority even when congestion occurs, which significantly limits the scalability and performance of distributed simulation. To solve the relevance evaluation problem, speed up message filtering, and filter out more unnecessary messages, this paper proposes a new relevance evaluation mechanism, Layer of Interest (LoI). LoI defines a relevance classifier based on the impact of spatial distance on the received attributes and attribute values, and an adaptive publish-subscribe scheme is built on top of it. This scheme can discard most irrelevant messages directly, and the run-time infrastructure (RTI) can apply congestion control by reducing the frequency of sending or receiving object messages based on each object's LoI. The experimental results verify the efficiency of message filtering and RTI congestion control. Supported by the National Basic Research Program of China (Grant No. 2009CB320805), the National Natural Science Foundation of China (Grant No. 60603084), and the National High-Tech Research & Development Program of China (Grant No. 2006AA01Z331).
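The exact LoI classifier is not given in this abstract, so the following Python sketch is only a guess at its flavor: distance bands between publisher and subscriber map to interest layers, and each layer thins out or drops attribute updates accordingly. All thresholds and rates are invented.

import math

# invented distance bands (virtual-environment units) -> interest layer
LAYER_BANDS = [(50.0, 0), (200.0, 1), (800.0, 2)]   # beyond the last band: layer 3
# per-layer fraction of object updates actually delivered
LAYER_SEND_RATE = {0: 1.0, 1: 0.5, 2: 0.1, 3: 0.0}

def layer_of_interest(publisher_pos, subscriber_pos):
    """Classify relevance by Euclidean distance between two entities."""
    dist = math.dist(publisher_pos, subscriber_pos)
    for bound, layer in LAYER_BANDS:
        if dist <= bound:
            return layer
    return 3

def should_deliver(publisher_pos, subscriber_pos, update_index):
    """Drop or thin out messages for low-relevance layers (simple striding)."""
    rate = LAYER_SEND_RATE[layer_of_interest(publisher_pos, subscriber_pos)]
    if rate == 0.0:
        return False
    return update_index % round(1.0 / rate) == 0

if __name__ == "__main__":
    tank, observer = (0.0, 0.0), (120.0, 35.0)
    delivered = [i for i in range(10) if should_deliver(tank, observer, i)]
    print("updates delivered:", delivered)    # every 2nd update at layer 1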

13.
We describe a framework for supporting arbitrarily complex SQL queries with “uncertain” predicates. The query semantics is based on a probabilistic model and the results are ranked, much like in Information Retrieval. Our main focus is query evaluation. We describe an optimization algorithm that can efficiently compute most queries. We show, however, that the data complexity of some queries is #P-complete, which implies that these queries do not admit any efficient evaluation methods. For these queries we describe both an approximation algorithm and a Monte-Carlo simulation algorithm.
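A hedged sketch of the generic Monte-Carlo idea for the #P-hard cases (not the paper's specific algorithm): on a tuple-independent probabilistic table with an invented schema, sample possible worlds, run the deterministic query in each, and report the fraction of worlds in which the answer holds.

import random

def monte_carlo_answer_prob(tuples, query, n_samples=20_000, seed=1):
    """Estimate P(query is true) over a tuple-independent probabilistic table.

    tuples : list of (tuple_value, probability) pairs
    query  : function taking the set of present tuples, returning True/False
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        world = {t for t, p in tuples if rng.random() < p}   # sample a possible world
        if query(world):
            hits += 1
    return hits / n_samples

if __name__ == "__main__":
    # uncertain predicate: which products a customer "likes" (probabilities invented)
    likes = [(("alice", "laptop"), 0.9), (("alice", "phone"), 0.4),
             (("bob", "laptop"), 0.2)]
    # SQL-style boolean query: EXISTS (SELECT * FROM likes WHERE product = 'laptop')
    q = lambda world: any(prod == "laptop" for (_, prod) in world)
    print(monte_carlo_answer_prob(likes, q))   # close to 1 - (1-0.9)*(1-0.2) = 0.92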

14.
Efficient evaluation of triangular B-spline surfaces (total citations: 1; self-citations: 0; cited by others: 1)
Evaluation routines are essential for any application that uses triangular B-spline surfaces. This paper describes an algorithm to efficiently evaluate triangular B-spline surfaces with arbitrarily many variables. The novelty of the algorithm is its generality: there is no restriction on the degree of the B-spline surfaces or on the dimension of the domain. Constructing an evaluation graph allows us to reuse partial results and hence decrease computation time. Computation time is reduced further by choosing how to unfold the recurrence relation of the simplex splines so that the evaluation graph becomes smaller. The complexity of the algorithm is measured by the number of leaves of the graph.

15.
Let F be a disjunction of boolean functions of pairwise disjoint variables, and suppose an evaluation tree is given for each such function. We consider the problem of finding an optimal sequential order in which to scan the trees when evaluating F, and give a necessary and sufficient condition. A more general, tree-structured arrangement of the trees is also considered and is shown to give no improvement.

16.
A method is proposed for computing network transfer functions by direct evaluation of the determinant and the appropriate cofactors of the node-admittance matrix, using rules governed by topological formulae but without actually generating the trees of the corresponding graph. The method allows quick analysis (and eventually design) of networks while avoiding the matrix manipulation and inversion techniques of usual practice.
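As a numerical illustration of the cofactor relation being exploited (the paper's tree-free topological rules are not reproduced), the Python sketch below computes the voltage transfer ratio of a simple resistive divider from its node-admittance matrix, with the datum node removed, as the cofactor ratio Delta(in,out) / Delta(in,in) using numpy determinants.

import numpy as np

def cofactor(matrix, row, col):
    """Signed cofactor of matrix with the given row and column removed."""
    minor = np.delete(np.delete(matrix, row, axis=0), col, axis=1)
    sign = (-1) ** (row + col)
    return sign * np.linalg.det(minor) if minor.size else float(sign)

def voltage_transfer(y_matrix, node_in, node_out):
    """Voltage gain V_out / V_in from the node-admittance matrix (datum removed),
    using the cofactor relation V_out / V_in = Delta(in,out) / Delta(in,in)."""
    return (cofactor(y_matrix, node_in, node_out)
            / cofactor(y_matrix, node_in, node_in))

if __name__ == "__main__":
    # simple resistive divider: R1 from node 0 to node 1, R2 from node 1 to ground
    R1, R2 = 1e3, 2e3
    Y = np.array([[1/R1,        -1/R1],
                  [-1/R1, 1/R1 + 1/R2]])
    print(voltage_transfer(Y, node_in=0, node_out=1))   # expect R2/(R1+R2) = 2/3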

17.

In this paper, we investigate the use of multiple kernel functions for assisting single-objective Kriging-based efficient global optimization (EGO). The primary objective is to improve the robustness of EGO with respect to the choice of kernel function when solving a variety of black-box optimization problems in engineering design. Specifically, three widely used kernel functions are studied: the Gaussian, Matérn-3/2, and Matérn-5/2 functions. We investigate both model selection and ensemble techniques based on the Akaike information criterion (AIC) and cross-validation error on a set of synthetic (noiseless and noisy) and non-algebraic (aerodynamic and parameter-tuning) optimization problems; the use of a cross-validation-based local (i.e., pointwise) ensemble is also studied. Because all the constituent surrogate models in the ensemble scheme are Kriging models, the Kriging uncertainty structure is preserved and EGO can still be performed. Analysis of the empirical experiments reveals that the ensemble techniques improve the robustness and performance of EGO, and that Matérn kernels yield better results than the Gaussian kernel when EGO with a single kernel is considered. Furthermore, we observe that model selection methods do not yield any substantial improvement over single-kernel EGO. When averaged across all problem types (i.e., noise level, dimensionality, and synthetic/non-algebraic), the local ensemble technique achieves the best performance.
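A minimal sketch of kernel selection for Kriging, assuming scikit-learn's Gaussian-process implementation as a stand-in for the authors' own Kriging code: the three kernels are fitted to the same toy samples and compared with an AIC-style score built from the fitted log marginal likelihood. Data, sample sizes, and the scoring detail are illustrative.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, Matern

def fit_and_score(kernel, x, y):
    """Fit a Kriging (GP) model and return an AIC-style score (lower is better)."""
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(x, y)
    log_likelihood = gp.log_marginal_likelihood_value_
    n_params = len(gp.kernel_.theta)          # number of fitted hyperparameters
    return 2 * n_params - 2 * log_likelihood, gp

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, size=(25, 1))
    y = np.sin(x).ravel() + 0.05 * rng.standard_normal(25)   # toy black-box samples

    candidates = {"Gaussian": RBF(length_scale=1.0),
                  "Matern-3/2": Matern(length_scale=1.0, nu=1.5),
                  "Matern-5/2": Matern(length_scale=1.0, nu=2.5)}
    scores = {name: fit_and_score(k, x, y)[0] for name, k in candidates.items()}
    best = min(scores, key=scores.get)
    print(scores, "-> selected kernel:", best)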


18.
Real-world application of radial moment functions such as orthogonal Zernike and pseudo-Zernike moments has been limited by the computational complexity of their radial polynomials. The common approach to reducing this complexity is to apply recurrence relations between successive radial polynomials and coefficients. In this paper, a novel approach is proposed that further reduces the computational complexity of Zernike and pseudo-Zernike polynomials by exploiting the symmetry of the radial polynomials. Using this symmetry, the computation of the real-valued radial polynomials is reduced to about one-eighth of the full polynomial set, while the computation of the exponential angle values is halved. The technique can be integrated with existing fast computation methods to further improve speed. Besides a significant reduction in computational complexity, it also provides a large reduction in memory storage.
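A hedged sketch of the symmetry argument (not the authors' fast algorithm): since the radial polynomial depends only on the radius, it can be evaluated for pixels in one octant of the centred grid and copied to the seven symmetric positions, which is roughly where the one-eighth reduction comes from. Orders and image size below are illustrative.

import math
import numpy as np

def radial_poly(n, m, rho):
    """Direct-sum Zernike radial polynomial R_n^m(rho), n - |m| even."""
    m = abs(m)
    value = 0.0
    for s in range((n - m) // 2 + 1):
        value += ((-1) ** s * math.factorial(n - s)
                  / (math.factorial(s)
                     * math.factorial((n + m) // 2 - s)
                     * math.factorial((n - m) // 2 - s))) * rho ** (n - 2 * s)
    return value

def radial_map_with_symmetry(n, m, size):
    """Fill an image-sized map of R_n^m using eight-fold pixel symmetry:
    values are computed only for the octant x >= y >= 0 of the centred grid
    and mirrored to the other seven octants (same radius, same value)."""
    out = np.zeros((size, size))
    c = (size - 1) / 2.0
    for y in range(int(math.ceil(c)), size):        # rows with non-negative y-offset
        for x in range(y, size):                    # octant: x-offset >= y-offset
            dx, dy = x - c, y - c
            rho = math.hypot(dx, dy) / c
            if rho > 1.0:
                continue                            # outside the unit disk
            val = radial_poly(n, m, rho)
            for sx, sy in ((dx, dy), (-dx, dy), (dx, -dy), (-dx, -dy),
                           (dy, dx), (-dy, dx), (dy, -dx), (-dy, -dx)):
                out[int(round(c + sy)), int(round(c + sx))] = val
    return out

if __name__ == "__main__":
    r_map = radial_map_with_symmetry(n=4, m=2, size=65)
    print(r_map.shape, r_map[32, 32], r_map[32, 48])   # centre value of R_4^2 is 0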

19.
LVCSR systems are usually based on continuous-density HMMs, typically implemented with Gaussian mixture distributions. Such systems tend to operate slower than real time, largely because of the heavy computational overhead of likelihood evaluation. The objective of our research is to investigate approximate methods that can substantially reduce the computational cost of likelihood evaluation without noticeably degrading recognition accuracy. In this paper, the most common techniques for speeding up likelihood computation are classified into three categories: machine optimization, model optimization, and algorithm optimization. Each category is surveyed and summarized by describing and analyzing the basic ideas of the corresponding techniques. The distribution of the numerical values of the Gaussian mixtures within a GMM is evaluated and analyzed to show that the computation of some Gaussians is unnecessary and can be eliminated. Two commonly used techniques for likelihood approximation, VQ-based Gaussian selection and partial distance elimination, are analyzed in detail. Based on these analyses, a fast likelihood computation approach called dynamic Gaussian selection (DGS) is proposed. DGS is a one-pass search technique that generates a dynamic shortlist of Gaussians for each state during likelihood computation. In principle, DGS extends both partial distance elimination and best mixture prediction, and it does not require additional memory for storing Gaussian shortlists. The DGS algorithm has been implemented by modifying the likelihood computation procedure in the HTK 3.4 system. Experimental results on the TIMIT and WSJ0 corpora indicate that this approach speeds up likelihood computation significantly without introducing appreciable additional recognition error.
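A minimal Python sketch of the partial distance elimination idea that DGS builds on (not the full DGS algorithm or the HTK implementation): while accumulating the per-dimension terms of a diagonal-covariance Gaussian's log-likelihood, a component is abandoned as soon as it can no longer beat the best component found so far. Mixture weights are ignored for brevity and the data are invented.

import math

def best_gaussian_log_likelihood(x, gaussians):
    """Approximate max-component log-likelihood of a diagonal-covariance GMM
    using partial distance elimination.

    gaussians: list of (means, variances) pairs, each a list per dimension.
    """
    best = -math.inf
    for means, variances in gaussians:
        # constant term: -0.5 * sum(log(2*pi*var))
        log_like = -0.5 * sum(math.log(2 * math.pi * v) for v in variances)
        for xi, mu, var in zip(x, means, variances):
            log_like -= 0.5 * (xi - mu) ** 2 / var
            if log_like <= best:          # partial distance elimination:
                break                     # cannot beat the current best, stop early
        else:
            best = log_like               # only fully evaluated components set a new best
    return best

if __name__ == "__main__":
    frame = [0.2, -0.1, 0.4]
    mixture = [([0.0, 0.0, 0.0], [1.0, 1.0, 1.0]),
               ([5.0, 5.0, 5.0], [1.0, 1.0, 1.0]),     # will be pruned early
               ([0.3, -0.2, 0.5], [0.5, 0.5, 0.5])]
    print(best_gaussian_log_likelihood(frame, mixture))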

20.
In this article we describe a new approach to evolutionary robotics in which human breeders are involved in the evolutionary process. While robots are traditionally selected for reproduction automatically according to a fitness formula, a quantitative and strictly defined measure, human breeders can select on qualitative criteria, rewarding behaviors that slip through the meshes woven by the fitness formula. In the authors' opinion this may benefit the evolutionary robotics methodology, allowing the production of robots that display more numerous and more varied behaviors. To illustrate this approach, the software Breedbot was developed, in which human breeders can intervene in the evolution of robots, complementing the automatic evaluation. After describing the software, results on sample evolutionary processes are reported, showing that on an exploration task the joint use of human and artificial selection generates robots with higher performance, and in a shorter time, than either breeding method alone. Future work will explore this hypothesis further. This work was presented in part at the First European Workshop on Artificial Life and Robotics, Vienna, Austria, July 12–13, 2007.
